Abstract
The invention provides systems and methods for an Intelligent Road Infrastructure System (IRIS), which facilitates vehicle operations and control for connected automated vehicle highway (CAVH) systems. IRIS systems and methods provide vehicles with individually customized information and real-time control instructions for vehicle driving tasks such as car following, lane changing, and route guidance. IRIS systems and methods also manage transportation operations and management services for both freeways and urban arterials. The IRIS manages one or more of the following function categories: sensing, transportation behavior prediction and management, planning and decision making, and vehicle control. IRIS is supported by real-time wired and/or wireless communication, power supply networks, and cyber safety and security services.
Claims
1. A system comprising a road side unit (RSU) network that comprises a plurality of networked communication devices spaced along a roadway, wherein the RSU network is configured to communicate with: a) a traffic control unit (TCU) that communicates with and manages information from a plurality of RSU networks and communicates with and is managed by a traffic control center (TCC); and b) on board units (OBUs) of a plurality of vehicles traveling on said roadway, wherein said RSU network provides sensing functions, transportation behavior prediction and management functions, planning and decision making functions, and vehicle control functions that generate vehicle control instructions for said system; and wherein said vehicle control instructions comprise instructions for vehicle longitudinal acceleration and speed, vehicle lateral acceleration and speed, and vehicle orientation and direction.
2. The system of claim 1 wherein said RSU network is configured to generate and send vehicle-specific control instructions to vehicle OBUs; or to receive vehicle-specific control instructions from a TCU and/or a TCC and send said vehicle-specific control instructions to vehicle OBUs, wherein said vehicle-specific control instructions comprise vehicle-specific instructions for longitudinal acceleration and speed, vehicle-specific lateral acceleration and speed, and vehicle-specific orientation and direction.
3. The system of claim 1 wherein said RSU network is configured to sense vehicles on a road.
4. The system of claim 1 wherein each RSU of the RSU network comprises a sensing module, a communication module, a data processing module, an interface module, and/or an adaptive power supply module.
5. The system of claim 1 wherein each RSU of the RSU network comprises a radar based sensor, a vision based sensor, a satellite based navigation system component, and/or a vehicle identification component.
6. The system of claim 5 wherein said satellite based navigation system component is configured to communicate with OBUs and locate vehicles.
7. The system of claim 1 wherein the RSUs of the RSU network are deployed at spacing intervals within the range of 50 to 500 meters.
8. The system of claim 1 wherein said RSU network is configured to provide high-resolution maps comprising lane width, lane approach, grade, and road geometry information to vehicles.
9. The system of claim 1 wherein said RSU network is configured to collect information comprising weather information, road condition information, lane traffic information, vehicle information, and/or incident information; generate vehicle-specific control instructions; and broadcast said information and vehicle-specific control instructions to vehicles and/or to the TCU network.
10. The system of claim 1 wherein said RSU network is configured to communicate with a cloud database.
11. The system of claim 1 wherein said RSU network is configured to provide data to OBUs, said data comprising vehicle control instructions, travel route and traffic information, and services data.
12. The system of claim 1 wherein said RSU network comprises RSUs installed at one or more fixed locations selected from the group consisting of a freeway roadside, freeway on/off ramp, intersection, roadside building, bridge, tunnel, roundabout, transit station, parking lot, railroad crossing, and/or school zone.
13. The system of claim 1 wherein said RSU network comprises RSUs installed at one or more mobile platforms selected from the group consisting of vehicles and unmanned aerial drones.
14. The system of claim 1 wherein said RSU network is configured to communicate with said TCU network in real-time over wired and/or wireless channels.
15. The system of claim 1 wherein said RSU network is configured to communicate with said OBUs in real-time over wireless channels.
Description
DRAWINGS
(1) FIG. 1 shows exemplary OBU components. 101: Communication module that transfers data between the RSU and the OBU. 102: Data collection module that collects data describing the dynamic and static state of the vehicle and data generated by the human driver. 103: Vehicle control module that executes control commands from the RSU; when the control system of the vehicle is damaged, it can take over control and stop the vehicle safely. 104: Data of vehicle and human. 105: Data of RSU.
(2) FIG. 2 shows an exemplary IRIS sensing framework. 201: Vehicles send data collected within their sensing range to RSUs. 202: RSUs collect lane traffic information based on vehicle data on the lane; RSUs share/broadcast their collected traffic information to the vehicles within their range. 203: An RSU collects road incident information from reports of vehicles within its coverage range. 204: The RSU of the incident segment sends incident information to the vehicles within its coverage range. 205: RSUs share/broadcast their collected lane information to the Segment TCUs. 206: RSUs collect weather information, road information, and incident information from the Segment TCUs. 207/208: RSUs in different segments share information with each other. 209: RSUs send incident information to the Segment TCUs. 210/211: Different Segment TCUs share information with each other. 212: Information sharing between RSUs and the CAVH Cloud. 213: Information sharing between Segment TCUs and the CAVH Cloud.
(3) FIG. 3 shows an exemplary IRIS prediction framework. 301: data sources comprising vehicle sensors, roadside sensors, and cloud. 302: data fusion module. 303: prediction module based on learning, statistical and empirical algorithms. 304: data output at microscopic, mesoscopic and macroscopic levels.
(4) FIG. 4 shows an exemplary Planning and Decision Making function. 401: Raw data and processed data for three level planning. 402: Planning Module for macroscopic, mesoscopic, and microscopic level planning. 403: Decision Making Module for vehicle control instructions. 404 Macroscopic Level Planning. 405 Mesoscopic Level Planning. 406 Microscopic Level Planning. 407 Data Input for Macroscopic Level Planning: raw data and processed data for macroscopic level planning. 408 Data Input for Mesoscopic Level Planning: raw data and processed data for mesoscopic level planning. 409 Data Input for Microscopic Level Planning: raw data and processed data for microscopic level planning.
(5) FIG. 5 shows an exemplary vehicle control flow component. 501: The planning and prediction modules send information to the control method computation module. 502: The data fusion module receives the calculated results from different sensing devices. 503: Integrated data is sent to the communication module of the RSUs. 504: RSUs send the control command to the OBUs.
(6) FIG. 6 shows an exemplary flow chart of longitudinal control.
(7) FIG. 7 shows an exemplary flow chart of lateral control.
(8) FIG. 8 shows an exemplary flow chart of fail-safe control.
(9) FIG. 9 shows exemplary RSU Physical Components. 901 Communication Module. 902 Sensing Module. 903 Power Supply Unit. 904 Interface Module: a module that communicates between the data processing module and the communication module. 905 Data Processing Module: a module that processes the data. 909: Physical connection of Communication Module to Data Processing Module. 910: Physical connection of Sensing Module to Data Processing Module. 911: Physical connection of Data Processing Module to Interface Module. 912: Physical connection of Interface Module to Communication Module.
(10) FIG. 10 shows exemplary RSU internal data flows. 1001 Communication Module. 1002 Sensing Module. 1004 Interface Module: a module that communicates between the data processing module and the communication module. 1005 Data Processing Module. 1006 TCU. 1007 Cloud. 1008 OBU. 1013: Data flow from Communication Module to Data Processing Module. 1014: Data flow from Data Processing Module to Interface Module. 1015: Data flow from Interface Module to Communication Module. 1016: Data flow from Sensing Module to Data Processing Module.
(11) FIG. 11 shows an exemplary TCC/TCU Network Structure. 1101: control targets and overall system information provided by macroscopic TCC to regional TCC. 1102: regional system and traffic information provided by regional TCC to macroscopic TCC. 1103: control targets and regional information provided by regional TCC to corridor TCC. 1104: corridor system and traffic information provided by corridor TCC to regional TCC. 1105: control targets and corridor system information provided by corridor TCC to segment TCU. 1106: segment system and traffic information provided by segment TCU to corridor TCC. 1107: control targets and segment system information provided by segment TCU to point TCU. 1108: point system and traffic information provided by point TCU to segment TCU. 1109: control targets and local traffic information provided by point TCU to RSU. 1110: RSU status and traffic information provided by RSU to point TCU. 1111: customized traffic information and control instructions from RSU to vehicles. 1112: information provided by vehicles to RSU. 1113: the services provided by the cloud to the RSU/TCC-TCU network.
(12) FIG. 12 shows an exemplary architecture of a cloud system.
(13) FIG. 13 shows an exemplary IRIS Computation Flowchart. 1301: Data Collected From RSU, including but not limited to image data, video data, radar data, On-board unit data. 1302: Data Allocation Module, allocating computation resources for various data processing. 1303 Computation Resources Module for actual data processing. 1304 GPU, graphic processing unit, mainly for large parallel data. 1305 CPU, central processing unit, mainly for advanced control data. 1306 Prediction module for IRIS prediction functionality. 1307 Planning module for IRIS planning functionality. 1308 Decision Making for IRIS decision-making functionality. 1309 data for processing with computation resource assignment. 1310 processed data for prediction module, planning module, decision making module. 1311 results from prediction module to planning module. 1312 results from planning module to decision making module.
(14) FIG. 14 shows an exemplary Traffic and Lane Management Flowchart. 1401 Lane management related data collected by RSU and OBU. 1402 Control target and traffic information from upper level IRIS TCU/TCC network. 1403 Lane management and control instructions.
(15) FIG. 15 shows an exemplary Vehicle Control in Adverse Weather component. 1501: vehicle status, location and sensor data. 1502: comprehensive weather and pavement condition data and vehicle control instructions. 1503: wide area weather and traffic information obtained by the TCU/TCC network.
(16) FIG. 16 shows an exemplary IRIS System Security Design. 1601: Network firewall. 1602: Internet and outside services. 1603: Data center for data services, such as data storage and processing. 1604: Local server. 1605: Data transmission flow.
(17) FIG. 17 shows an exemplary IRIS System Backup and Recovery component. 1701: Cloud for data services and other services. 1702: Intranet. 1703: Local Storage for backup. 1704: any IRIS devices, i.e. RSU, TCU, or TCC.
(18) FIG. 18 shows an exemplary System Failure Management component.
(19) FIG. 19 shows a sectional view of an exemplary RSU deployment.
(20) FIG. 20 shows a top view of an exemplary RSU deployment.
(21) FIG. 21 shows exemplary RSU lane management on a freeway segment.
(22) FIG. 22 shows exemplary RSU lane management on a typical urban intersection.
DETAILED DESCRIPTION
(23) Exemplary embodiments of the technology are described below. It should be understood that these are illustrative embodiments and that the invention is not limited to these particular embodiments.
(24) FIG. 1 shows an exemplary OBU containing a communication module 101, a data collection module 102, and a vehicle control module 103. The data collection module 102 collects data related to the vehicle and the human 104 and sends it to an RSU through communication module 101. The OBU can also receive data of the RSU 105 through communication module 101. Based on the data of the RSU 105, the vehicle control module 103 helps control the vehicle.
(25) FIG. 2 illustrates an exemplary framework of a lane management sensing system and its data flow.
(26) The RSU exchanges information between the vehicles and the road and communicates with TCUs; the exchanged information includes weather information, road condition information, lane traffic information, vehicle information, and incident information.
(27) FIG. 3 illustrates an exemplary workflow of a basic prediction process of a lane management sensing system and its data flow. In some embodiments, fused multi-source data collected from vehicle sensors, roadside sensors, and the cloud is processed through models including, but not limited to, learning-based models, statistical models, and empirical models. Predictions are then made at different levels, including the microscopic, mesoscopic, and macroscopic levels.
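The fuse-then-predict flow described above can be sketched as follows. The function names and the simple averaging and constant-speed logic are illustrative assumptions for this sketch, not the disclosed algorithms.

```python
# Illustrative sketch of the fusion-then-predict flow (FIG. 3).
# Assumed: each source reports per-vehicle speed estimates in m/s.

def fuse(vehicle_obs, roadside_obs, cloud_obs):
    """302: naive data fusion -- average the speed estimates per vehicle ID."""
    fused = {}
    for source in (vehicle_obs, roadside_obs, cloud_obs):
        for vid, speed in source.items():
            fused.setdefault(vid, []).append(speed)
    return {vid: sum(v) / len(v) for vid, v in fused.items()}

def predict(fused, horizon_s=5.0):
    """303/304: toy outputs at three levels.
    - microscopic: per-vehicle distance advanced (constant-speed model)
    - mesoscopic:  lane mean speed
    - macroscopic: segment occupancy proxy (vehicle count)
    """
    micro = {vid: speed * horizon_s for vid, speed in fused.items()}
    meso = sum(fused.values()) / len(fused) if fused else 0.0
    macro = len(fused)
    return {"micro": micro, "meso": meso, "macro": macro}

fused = fuse({"v1": 20.0}, {"v1": 22.0, "v2": 30.0}, {"v2": 30.0})
out = predict(fused)
```

In a real deployment the fusion step would weight sources by sensor accuracy and the prediction step would use the learning-based, statistical, or empirical models named in the text.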
(28) FIG. 4 shows exemplary planning and decision making processes in an IRIS. Data 401 is fed into planning module 402 according to the three planning levels 407, 408, and 409, respectively. The three planning submodules retrieve the corresponding data and process it for their own planning tasks. At the macroscopic level 404, route planning and guidance optimization are performed. At the mesoscopic level 405, special events, work zones, reduced speed zones, incidents, buffer spaces, and extreme weather are handled. At the microscopic level 406, longitudinal control and lateral control instructions are generated based on internal algorithms. After computation and optimization, the planning outputs from the three levels are transmitted to decision making module 403 for further processing, including steering, throttle control, and braking.
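The three-level planning-to-decision pipeline (402 to 403) can be sketched with toy logic at each level. The network layout, event format, and decision rule below are illustrative assumptions, not the disclosed optimization methods.

```python
# Minimal sketch of three-level planning feeding decision making (FIG. 4).

def plan_macroscopic(origin, destination, network):
    """404/407: route guidance -- pick the shortest listed route."""
    routes = network[(origin, destination)]
    return min(routes, key=lambda r: r["length_km"])

def plan_mesoscopic(route, events):
    """405/408: apply special-event speed reductions along the route."""
    limit = route["speed_limit_kph"]
    for event in events:
        if event["segment"] in route["segments"]:
            limit = min(limit, event["reduced_speed_kph"])
    return limit

def plan_microscopic(current_speed_kph, target_speed_kph):
    """406/409 -> 403: longitudinal decision toward the target speed."""
    return {"throttle": current_speed_kph < target_speed_kph,
            "brake": current_speed_kph > target_speed_kph}

network = {("A", "B"): [
    {"segments": ["s1", "s2"], "length_km": 12.0, "speed_limit_kph": 100},
    {"segments": ["s3"], "length_km": 15.0, "speed_limit_kph": 100},
]}
route = plan_macroscopic("A", "B", network)
target = plan_mesoscopic(route, [{"segment": "s2", "reduced_speed_kph": 60}])
decision = plan_microscopic(80, target)
```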
(29) FIG. 5 shows exemplary data flow of an infrastructure automation based control system. The control system calculates the results from all sensing detectors, conducts data fusion, and exchanges information between RSUs and Vehicles. The control system comprises: a) Control Method Computation Module 501; b) Data Fusion Module 502; c) Communication Module (RSU) 503; and d) Communication Module (OBU) 504.
(30) FIG. 6 illustrates an exemplary process of vehicle longitudinal control. As shown in the figure, vehicles are monitored by the RSUs. If related control thresholds (e.g., minimum headway, maximum speed, etc.) are reached, the necessary control algorithms are triggered. The vehicles then drive according to the new control instructions. If the instructions are not confirmed, new instructions are sent to the vehicles.
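The monitor, threshold-check, and resend-until-confirmed loop of FIG. 6 can be sketched as below. The threshold values, the deceleration step, and the ack mechanism are illustrative assumptions standing in for the RSU-OBU link.

```python
# Sketch of the longitudinal-control loop (FIG. 6): monitor vehicles,
# check thresholds, and resend instructions until the OBU confirms.

MIN_HEADWAY_S = 2.0   # assumed minimum-headway threshold
MAX_SPEED_MPS = 30.0  # assumed maximum-speed threshold

def check_thresholds(vehicle):
    """Return a control instruction if a threshold is violated, else None."""
    if vehicle["headway_s"] < MIN_HEADWAY_S:
        return {"action": "decelerate",
                "target_speed_mps": vehicle["speed_mps"] - 2.0}
    if vehicle["speed_mps"] > MAX_SPEED_MPS:
        return {"action": "decelerate", "target_speed_mps": MAX_SPEED_MPS}
    return None

def send_until_confirmed(instruction, confirmations, max_retries=3):
    """Resend until confirmed (FIG. 6 retry branch); `confirmations`
    is an iterator of acks standing in for the OBU link."""
    for attempt in range(1, max_retries + 1):
        if next(confirmations, False):
            return attempt
    return None  # unconfirmed: hand off to fail-safe handling

cmd = check_thresholds({"headway_s": 1.2, "speed_mps": 28.0})
attempts = send_until_confirmed(cmd, iter([False, True]))
```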
(31) FIG. 7 illustrates an exemplary process of vehicle lateral control. As shown in the figure, vehicles are monitored by the RSUs. If related control thresholds (e.g., lane keeping, lane changing, etc.) are reached, the necessary control algorithms are triggered. The vehicles then drive according to the new control instructions. If the instructions are not confirmed, new instructions are sent to the vehicles.
(32) FIG. 8 illustrates an exemplary process of vehicle fail-safe control. As shown in the figure, vehicles are monitored by the RSUs. If an error occurs, the system sends a warning message to alert the driver to take control of the vehicle. If the driver does not respond, or the available response time is too short for the driver to act, the system sends the control thresholds to the vehicle. If related control thresholds (e.g., stopping, striking safety equipment, etc.) are reached, the necessary control algorithms are triggered. The vehicles then drive according to the new control instructions. If the instructions are not confirmed, new instructions are sent to the vehicles.
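The fail-safe branching in FIG. 8, warning the driver first and falling back to system control when the driver cannot respond in time, can be sketched as a small decision function. The timeout value and return labels are illustrative assumptions.

```python
# Sketch of the fail-safe sequence (FIG. 8): warn the driver, then fall
# back to a system-commanded safe stop if no timely response arrives.

DRIVER_RESPONSE_TIMEOUT_S = 4.0  # assumed response window

def fail_safe(error_detected, driver_response_s):
    """Return which party ends up controlling the vehicle."""
    if not error_detected:
        return "normal_operation"
    # A warning message is sent to the driver first.
    if (driver_response_s is not None
            and driver_response_s <= DRIVER_RESPONSE_TIMEOUT_S):
        return "driver_control"
    # No response, or the response came too late: the system sends the
    # control thresholds and the control algorithms stop the vehicle safely.
    return "system_safe_stop"
```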
(33) FIG. 9 shows exemplary physical components of a typical RSU, comprising a Communication Module, a Sensing Module, a Power Supply Unit, an Interface Module, and a Data Processing Module. The RSU may have any of a variety of module configurations. For example, for the sensing module, a low-cost RSU may include only a vehicle ID recognition unit for vehicle tracking, while a typical RSU includes various sensors such as LiDAR, cameras, and microwave radar.
(34) FIG. 10 shows an exemplary internal data flow within an RSU. The RSU exchanges data with the vehicle OBUs, the upper level TCU, and the cloud. The data processing module includes two processors: an external object calculating module (EOCM) and an AI processing unit. The EOCM performs traffic object detection based on inputs from the sensing module, and the AI processing unit focuses on decision-making processes.
(35) FIG. 11 shows an exemplary structure of a TCC/TCU network. A macroscopic TCC, which may or may not collaborate with an external TOC, manages a certain number of regional TCCs in its coverage area. Similarly, a regional TCC manages a certain number of corridor TCCs, a corridor TCC manages a certain number of segment TCUs, a segment TCU manages a certain number of point TCUs, and a point TCU manages a certain number of RSUs. An RSU sends customized traffic information and control instructions to vehicles and receives information provided by vehicles. The network is supported by the services provided by the cloud.
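The hierarchical fan-out just described, control targets flowing down from the macroscopic TCC through regional and corridor TCCs and segment and point TCUs to the RSUs, can be sketched as a tree walk. The node names and tree shape below are illustrative assumptions.

```python
# Sketch of the TCC/TCU hierarchy (FIG. 11): control targets propagate
# down the tree (1101/1103/1105/1107/1109) to every RSU.

HIERARCHY = {  # parent -> children; an assumed example topology
    "macro_tcc": ["regional_tcc_1"],
    "regional_tcc_1": ["corridor_tcc_1", "corridor_tcc_2"],
    "corridor_tcc_1": ["segment_tcu_1"],
    "corridor_tcc_2": [],
    "segment_tcu_1": ["point_tcu_1"],
    "point_tcu_1": ["rsu_1", "rsu_2"],
    "rsu_1": [],
    "rsu_2": [],
}

def push_control_targets(root, targets, delivered=None):
    """Deliver a control target to `root` and every node below it."""
    if delivered is None:
        delivered = {}
    delivered[root] = targets
    for child in HIERARCHY.get(root, []):
        push_control_targets(child, targets, delivered)
    return delivered

delivered = push_control_targets("macro_tcc", {"target_speed_kph": 90})
```

Status information (1102/1104/1106/1108/1110) would flow back up the same tree in the reverse direction.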
(36) FIG. 12 shows how an exemplary cloud system communicates with sensors of RSU, TCC/TCU (1201) and TOC through communication layers (1202). The cloud system contains cloud infrastructure (1204), platform (1205), and application service (1206). The application services also support the applications (1203).
(37) FIG. 13 shows exemplary data collected from sensing module 1301 such as image data, video data, and vehicle status data. The data is divided into two groups by the data allocation module 1302: large parallel data and advanced control data. The data allocation module 1302 decides how to assign the data 1309 with the computation resources 1303, which are graphic processing units (GPUs) 1304 and central processing units (CPUs) 1305. Processed data 1310 is sent to prediction 1306, planning 1307, and decision making modules 1308. The prediction module provides results to the planning module 1311, and the planning module provides results 1312 to the decision making module.
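The allocation step in FIG. 13 (module 1302) can be sketched as a simple classifier that routes large parallel data to GPUs (1304) and advanced control data to CPUs (1305). The type tags and classification rule are illustrative assumptions.

```python
# Sketch of the data allocation module (1302): route each incoming data
# batch to the GPU pool (large parallel data) or the CPU pool (control data).

PARALLEL_TYPES = {"image", "video", "radar_pointcloud"}  # assumed tags

def allocate(batches):
    """1309: assign each data batch to a computation resource pool."""
    assignment = {"gpu": [], "cpu": []}
    for batch in batches:
        pool = "gpu" if batch["type"] in PARALLEL_TYPES else "cpu"
        assignment[pool].append(batch["id"])
    return assignment

batches = [
    {"id": "b1", "type": "video"},
    {"id": "b2", "type": "obu_status"},
    {"id": "b3", "type": "image"},
]
assignment = allocate(batches)
```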
(38) FIG. 14 shows how exemplary data collected from OBUs and RSUs, together with control targets and traffic information from the upper level IRIS TCU/TCC network 1402, are provided to a TCU. The lane management module of a TCU produces lane management and vehicle control instructions 1403 for a vehicle control module and lane control module.
(39) FIG. 15 shows exemplary data flow for vehicle control in adverse weather. Table 1, below, shows approaches for measurement of adverse weather scenarios.
(40) TABLE 1. IRIS Measures for Adverse Weather Scenarios. A normal autonomous vehicle relies on its sensors only (camera, radar, and LIDAR); IRIS (HDMap+TOC+RSU/OBU) can greatly mitigate the impact of adverse weather.

Impact in adverse weather -- Camera (visibility of lines/signs/objects degraded) / Radar (detecting distance degraded) / Lidar (detecting distance degraded):
Rain: ** / ** / **
Snow: *** / ** / **
Fog: **** / **** / ****
Sandstorm: **** / **** / ****
(The number of "*" indicates the degree of degradation.)

IRIS measures: the HDMap provides information on lanes, lines, signs, and geometry, which enhances the RSU's vision (a solution for degraded visibility); the RSU has a complete view of all vehicles on the road, so the chance of a crash with other vehicles is eliminated (a solution for degraded detection distance); and the RSU can control vehicles according to the weather (e.g., lowering the speed on icy roads), enhancing vehicle control.
(41) FIG. 16 shows exemplary IRIS security measures, including network security and physical equipment security. Network security is enforced by firewalls 1601 and periodic complete system scans at various levels. These firewalls protect data transmission 1605 either between the system and the Internet 1602 or between data centers 1603 and local servers 1604. For physical equipment security, the hardware is safely installed, secured by an identification tracker, and possibly isolated.
(42) In FIG. 17, IRIS system components 1704 periodically back up data to local storage 1703 on the same Intranet 1702 through firewall 1601. In some embodiments, a backup copy is also uploaded through firewall 1601 to the Cloud 1701, which is logically located on the Internet.
(43) FIG. 18 shows an exemplary periodic IRIS system check for system failure. When a failure happens, the system failure handover mechanism is activated. First, the failure is detected and the failed node is identified. The functions of the failed node are handed over to a shadow system, and success feedback is sent back to an upper level system if nothing goes wrong. Meanwhile, the failed system/subsystem is restarted and/or recovered from the most recent backup. If successful, feedback is reported to an upper level system. When the failure is addressed, the functions are migrated back to the original system.
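The handover cycle above, detect, hand over to the shadow system, recover from backup, migrate back, can be sketched as a small state sequence. The state labels and node fields are illustrative assumptions.

```python
# Sketch of the failure-handover flow (FIG. 18): detect the failed node,
# hand its functions to a shadow system, recover from backup, migrate back.

def handle_failure(node):
    """Run one handover cycle and return the sequence of system states."""
    states = ["failure_detected:" + node["id"]]
    states.append("handover_to_shadow")
    node["active"] = "shadow"
    if node["backup_ok"]:  # restart and/or recover from most recent backup
        states.append("recovered_from_backup")
        states.append("migrate_back_to_original")
        node["active"] = "original"
    else:
        states.append("report_failure_upstream")
    return states

node = {"id": "rsu_7", "active": "original", "backup_ok": True}
states = handle_failure(node)
```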
(44) Exemplary hardware and parameters that find use in embodiments of the present technology include, but are not limited to the following:
(45) OBU:
(46) a) Communication module technical specifications: Standard conformance: IEEE 802.11p-2010; Bandwidth: 10 MHz; Data rates: 10 Mbps; Antenna diversity: CDD transmit diversity; Environmental operating range: −40° C. to +55° C.; Frequency band: 5 GHz; Doppler spread: 800 km/h; Delay spread: 1500 ns; Power supply: 12/24 V. b) Data collection module hardware technical specifications: intuitive PC user interface for functions such as configuration, trace, transmit, filter, and logging; high data transfer rate. c) Software technical specifications: tachograph; driver alerts and remote analysis; real-time CAN bus statistics; CO2 emissions reporting. d) Vehicle control module technical specifications: low power consumption; reliable longitudinal and lateral vehicle control.
RSU Design: a) A communication module that includes three communication channels: communication with vehicles, including DSRC/4G/5G (e.g., MK5 V2X from Cohda Wireless); communication with point TCUs, including wired/wireless communication (e.g., optical fiber from Cablesys); and communication with the cloud, including wired/wireless communication with at least 20M total bandwidth. b) A data processing module that includes two processors: an External Object Calculating Module (EOCM) that performs object detection using data from the sensing module and other necessary regular calculations (e.g., a low power fully custom ARM/X86 based processor), and an AI processing unit for machine learning, decision making/planning, and prediction processing. c) An interface module: an FPGA based interface unit.
(47) The FPGA processor acts as a bridge between the AI processors and the External Object Calculating Module processors and sends instructions to the communication modules.
(48) The RSU deployment
(49) a. Deployment location. RSU deployment is based on functional requirements and road type. An RSU is used for sensing, communicating with, and controlling vehicles on the roadway to provide automation. Since LIDAR and other sensors (like loop detectors) need different specific locations, some of them can be installed separately from the core processor of the RSU. Two exemplary types of RSU location deployment are: i. Fixed location deployment. The location of this type of RSU is fixed; it is used for serving regular roadways with fixed traffic demand on a daily basis. ii. Mobile deployment. A mobile RSU can be moved and set up in a new place and situation swiftly; it is used to serve stochastic and unstable demand, special events, crashes, and the like. When an event happens, these mobile RSUs can be moved to the location and perform their functions. b. Method for coverage. The RSUs may be connected (e.g., wired) underground. RSUs are mounted on poles facing down so that they can work properly. The wings of the poles are T-shaped. The roadway lanes that need CAVH functions are covered by the sensing and communication devices of the RSUs. There are overlaps between the coverage areas of RSUs to ensure performance. c. Deployment density. The density of deployment depends on the RSU type and requirements. Usually, the minimum distance between two RSUs depends on the RSU sensor with the minimum covering range. d. Blind spot handling. There may be blind sensing spots caused by vehicles blocking each other. The issue is common and especially serious when the spacing between vehicles is close. A solution is the collaboration of different sensing technologies from both RSUs deployed on infrastructure and OBUs deployed on vehicles. This type of deployment is meant to improve traffic conditions and control performance under certain special conditions. A mobile RSU can be brought by agents to the deployment spot. In most cases, due to the temporary use of special RSUs, poles for mounting are not always available, so those RSUs may be installed on temporary frames, buildings along the roads, or even location-appropriate overpasses.
(50) Certain exemplary RSU configurations are shown in FIGS. 19-22. FIG. 19 shows a sectional view of an exemplary RSU deployment. FIG. 20 shows an exemplary top view of an RSU deployment. In this road segment, sensing is covered by two types of RSU: 901 RSU A: camera groups, the most commonly used sensors for object detection; and 902 RSU B: LIDAR groups, which produce 3D representations of targets, providing higher accuracy. The camera sensor groups employ a range that is lower than LIDAR, e.g., in this particular case, below 150 m, so a spacing of 150 m along the road is used for the camera groups. Other types of RSUs have less stringent density requirements (e.g., some, such as LIDAR or ultrasonic sensors, involve distances that can be greater).
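The spacing rule above, RSUs spaced so that adjacent sensor footprints meet or overlap, can be checked with a back-of-envelope calculation. The 150 m camera range comes from the text; the overlap fraction and function names are assumed for illustration.

```python
# Back-of-envelope RSU spacing check (FIGS. 19-20): spacing must not
# exceed the sensor range, optionally reduced to guarantee overlap.
import math

def max_spacing(sensor_range_m, overlap_fraction=0.0):
    """Largest spacing that still leaves the requested coverage overlap."""
    return sensor_range_m * (1.0 - overlap_fraction)

def units_needed(segment_length_m, spacing_m):
    """Number of RSUs to cover a segment at the given spacing
    (fencepost count: one unit at each end of every interval)."""
    return math.ceil(segment_length_m / spacing_m) + 1

spacing = max_spacing(150.0)       # camera groups: 150 m range, no overlap margin
n = units_needed(1500.0, spacing)  # an assumed 1.5 km segment
```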
(51) FIG. 21 shows an exemplary RSU lane management configuration for a freeway segment. The RSU sensing and communication coverage spans each lane of the road segment to fulfill lane management functions (shown with red arrows in the figure) including, but not limited to: 1) lane changing from one lane to another; 2) merging maneuvers from an on-ramp; 3) diverging maneuvers from the highway to an off-ramp; 4) weaving zone management to ensure safety; and 5) reversible lane management.
(52) FIG. 22 shows an exemplary lane management configuration for a typical urban intersection. The RSU sensing and communication coverage spans each corner of the intersection to fulfill lane management functions (shown in red in the figure) including: 1) lane changing from one lane to another; 2) movement management (e.g., exclusive left turns in this lane); 3) lane closure management at this leg; and 4) exclusive bicycle lane management.