Patent classifications
B60W2556/35
SYSTEMS AND METHODS FOR BAYESIAN LIKELIHOOD ESTIMATION OF FUSED OBJECTS
A sensor fusion system and method are disclosed. One or more processors are operable to receive a plurality of object detection measurements from a plurality of sensors. Each of the plurality of object detection measurements is associated with a potential object detection track. A plurality of sensor confidence values associated with each of the plurality of sensors are received. A track confidence value is determined for each of the potential object detection tracks based on the received plurality of object detection measurements and the received plurality of sensor confidence values. An object detection for a potential object detection track whose determined track confidence value meets a predetermined detection threshold is then determined, or confirmed, and stored in a memory for subsequent use, and it remains relatively unaffected by a measurement from a sensor whose field of view omits, or is occluded with respect to, the given object detection track.
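As an illustration only (the patent discloses no code), a minimal Python sketch of one way to realize the described update: a Bayesian log-odds accumulation of per-sensor evidence, in which a sensor whose field of view omits or is occluded with respect to the track contributes nothing. All names and the threshold value are hypothetical.

```python
import math

def update_track_confidence(prior, measurements, sensor_conf, in_fov):
    """Log-odds Bayesian update of a track's confidence value.

    prior:        prior probability that the track is a real object
    measurements: {sensor_id: True/False detection for this track}
    sensor_conf:  {sensor_id: probability the sensor's report is correct}
    in_fov:       {sensor_id: True if the track lies inside the sensor's
                   unoccluded field of view}
    """
    logit = math.log(prior / (1.0 - prior))
    for sid, detected in measurements.items():
        # A sensor that cannot see the track contributes no evidence, so
        # its (missed) measurement leaves the confidence unchanged.
        if not in_fov.get(sid, False):
            continue
        p = sensor_conf[sid]
        # Likelihood ratio of the observation given a real object.
        lr = p / (1.0 - p) if detected else (1.0 - p) / p
        logit += math.log(lr)
    return 1.0 / (1.0 + math.exp(-logit))

DETECTION_THRESHOLD = 0.9  # hypothetical predetermined detection threshold

conf = update_track_confidence(
    prior=0.5,
    measurements={"radar": True, "camera": True, "lidar": False},
    sensor_conf={"radar": 0.8, "camera": 0.9, "lidar": 0.95},
    in_fov={"radar": True, "camera": True, "lidar": False},  # lidar occluded
)
confirmed = conf >= DETECTION_THRESHOLD
```

Here the lidar miss is skipped entirely because the track is occluded from the lidar, so the confirmed detection depends only on the radar and camera evidence.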
Techniques for detecting acknowledgment from a driver of a vehicle
Disclosed embodiments include techniques for providing alerts to a driver of a vehicle. The techniques include detecting a condition that exceeds a threshold hazard potential; detecting a predetermined gesture of a driver, the predetermined gesture indicating that the driver has acknowledged the detected condition; in response to detecting the predetermined gesture, reducing an urgency level for alerting the driver of the detected condition; and determining whether to issue an alert to the driver based on the reduced urgency level.
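A minimal sketch of the described flow, assuming a scalar urgency model; the threshold, the reduction factor, and all names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class HazardCondition:
    hazard_potential: float     # estimated severity, 0..1
    acknowledged: bool = False  # set when the predetermined gesture is seen

HAZARD_THRESHOLD = 0.6   # hypothetical threshold for issuing an alert
URGENCY_REDUCTION = 0.5  # hypothetical scaling applied after acknowledgment

def urgency_level(cond: HazardCondition) -> float:
    """Urgency for alerting the driver of a detected condition."""
    urgency = cond.hazard_potential
    if cond.acknowledged:
        # The driver has signalled awareness (e.g. a predetermined hand
        # gesture), so the urgency for alerting is reduced.
        urgency *= URGENCY_REDUCTION
    return urgency

def should_alert(cond: HazardCondition) -> bool:
    """Decide whether to issue an alert based on the (possibly reduced)
    urgency level."""
    return urgency_level(cond) > HAZARD_THRESHOLD

cond = HazardCondition(hazard_potential=0.8)
alert_before = should_alert(cond)   # hazard exceeds the threshold
cond.acknowledged = True            # predetermined gesture detected
alert_after = should_alert(cond)    # reduced urgency falls below it
```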
Methods and apparatus for estimating and compensating for wind disturbance force at a tractor trailer of an autonomous vehicle
A method includes receiving, iteratively over time, sets of data including vehicle dynamics data, image data, sound data, third-party data, and wind speed sensor data, each detected at an autonomous vehicle and associated with a time period. The method also includes estimating a first wind speed and a first wind direction for each time period, in response to receiving the sets of data and based on the sets of data, via a processor of the autonomous vehicle. The method also includes iteratively modifying a lateral control and/or a longitudinal control of the autonomous vehicle based on the estimated first wind speed and the estimated first wind direction, via the processor of the autonomous vehicle and during operation of the autonomous vehicle.
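The fuse-then-correct loop described above can be sketched as follows, under assumptions not stated in the abstract: each data source yields its own wind speed/direction estimate with a weight, estimates are averaged as vectors so directions wrap correctly, and the lateral control is offset against the crosswind component. All names, weights, and the gain are hypothetical.

```python
import math

def estimate_wind(sources):
    """Fuse per-source wind estimates into one (speed, direction).

    sources: list of (speed_mps, direction_rad, weight) tuples, one per
    data source (vehicle dynamics, image, sound, third-party, anemometer).
    Averaging is done on the wind vector, not on the raw angles.
    """
    wx = sum(w * s * math.cos(d) for s, d, w in sources)
    wy = sum(w * s * math.sin(d) for s, d, w in sources)
    total = sum(w for _, _, w in sources)
    wx, wy = wx / total, wy / total
    return math.hypot(wx, wy), math.atan2(wy, wx)

def lateral_correction(speed_mps, direction_rad, heading_rad, gain=0.02):
    """Steering offset (rad) opposing the estimated crosswind component.
    gain is a hypothetical tuning constant."""
    crosswind = speed_mps * math.sin(direction_rad - heading_rad)
    return -gain * crosswind

speed, direction = estimate_wind([
    (10.0, 0.00, 0.5),   # wind speed sensor reading
    (12.0, 0.10, 0.3),   # estimate derived from vehicle dynamics data
    (9.0, -0.05, 0.2),   # third-party weather feed
])
# Ego vehicle heading due "north" (pi/2), wind blowing roughly eastward.
steer = lateral_correction(speed, direction, heading_rad=math.pi / 2)
```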
VEHICLE AND VEHICLE CONTROL METHOD
A vehicle control method according to an embodiment includes selecting a target vehicle with which to share low-level sensor information, based on low-level sensor information obtained by a host vehicle and on information about a structure included in high-definition map information; receiving low-level sensor information of the structure from the target vehicle to update a sensor data map of the host vehicle; and performing map matching against the high-definition map information using feature points of a road extracted from the updated sensor data map.
VEHICLE INTELLIGENT UNIT
Provided herein is technology relating to automated driving and particularly, but not exclusively, to a Vehicle Intelligent Unit (VIU) configured to provide vehicle operations and control for Connected Automated Vehicles (CAVs). More particularly, the VIU is configured to connect with a Collaborative Automated Driving System (CADS), to manage and/or control information exchange between the CAV and the CADS, and to manage and/or control CAV lateral and longitudinal movements, including vehicle following, lane changing, and route guidance.
METHOD FOR PERCEIVING AN AUGMENTED REALITY FOR AN AGRICULTURAL UTILITY VEHICLE
A method for perceiving an augmented reality during a work assignment of an agricultural utility vehicle includes processing data of a data communication system of the agricultural utility vehicle via a control unit, sending the data processed by the control unit to a perception device, projecting the data received from the control unit as an item of optical information via the perception device, and jointly perceiving the real surroundings of the agricultural utility vehicle and the projected optical information via the perception device.
DRIVING ASSISTANCE SYSTEM, DRIVING ASSISTANCE METHOD, AND STORAGE MEDIUM
Provided is a driving assistance system including a storage device having a program stored therein and a hardware processor, wherein the hardware processor executes the program stored in the storage device to: recognize an object which is present outside of a vehicle on the basis of a detection result of at least one of a radar device and an imaging device mounted in the vehicle; perform driving assistance for the vehicle on the basis of the recognition result; and determine a degree of matching between a portion of a contour line of the object and a road partition line, suppressing an operation of the driving assistance in a case where the degree of matching is equal to or greater than a threshold.
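The matching-degree test above can be sketched in a few lines, under the simplifying (and entirely hypothetical) assumption that the partition line is straight and that the degree of matching is the fraction of contour points lying within a tolerance of that line:

```python
def matching_degree(contour, line_y, tol=0.1):
    """Fraction of contour points lying on the road partition line.

    contour: list of (x, y) points on the recognized object's contour
    line_y:  lateral position of a straight, hypothetical partition line
    tol:     distance below which a point counts as matching
    """
    hits = sum(1 for _, y in contour if abs(y - line_y) <= tol)
    return hits / len(contour)

MATCH_THRESHOLD = 0.8  # hypothetical suppression threshold

def suppress_assistance(contour, line_y):
    # A high match suggests the "object" is really the partition line
    # itself (a likely false positive), so assistance is suppressed.
    return matching_degree(contour, line_y) >= MATCH_THRESHOLD

# A contour running along the line: suppress the assistance operation.
paint = [(x * 0.5, 3.0) for x in range(10)]
# A compact contour away from the line: keep assisting.
car = [(5.0 + dx, 1.0 + dy) for dx in (0, 1) for dy in (0, 1)]
s1 = suppress_assistance(paint, line_y=3.0)
s2 = suppress_assistance(car, line_y=3.0)
```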
Method and device for operating a vehicle
A method for operating a vehicle, in particular, a vehicle for highly automated driving. The method includes a step of reading in input data. The input data include sensor data and sensor state data of a multitude of sensor units of the vehicle. The method also includes a step of generating a potential field using the input data. The input data are used as attractive potentials and repulsive potentials of the potential field. The method furthermore includes a step of determining a trajectory through the potential field and using the trajectory to generate a fusion signal for fusing the input data in a sensor data fusion for a highly automated driving operation of the vehicle.
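A toy version of the potential-field step can be sketched with a quadratic attractive well, an inverse-square repulsive peak, and numerical gradient descent to trace the trajectory. The potential shapes, gains, and step sizes are hypothetical, chosen only to make the mechanism concrete.

```python
def potential(pos, attractors, repulsors):
    """Scalar potential at pos: attractive wells pull the trajectory
    toward goal-like input data, repulsive peaks push it away from
    obstacle-like input data."""
    x, y = pos
    u = 0.0
    for (ax, ay), w in attractors:
        u += w * ((x - ax) ** 2 + (y - ay) ** 2)          # quadratic well
    for (rx, ry), w in repulsors:
        u += w / (1e-6 + (x - rx) ** 2 + (y - ry) ** 2)   # inverse peak
    return u

def descend(start, attractors, repulsors, step=0.05, iters=200, eps=1e-4):
    """Trajectory through the field via numerical gradient descent."""
    x, y = start
    path = [(x, y)]
    for _ in range(iters):
        gx = (potential((x + eps, y), attractors, repulsors) -
              potential((x - eps, y), attractors, repulsors)) / (2 * eps)
        gy = (potential((x, y + eps), attractors, repulsors) -
              potential((x, y - eps), attractors, repulsors)) / (2 * eps)
        x, y = x - step * gx, y - step * gy
        path.append((x, y))
    return path

# One attractive goal at (5, 0), one repulsive obstacle at (2, 0.5);
# the trajectory bends around the obstacle on its way to the goal.
GOALS = [((5.0, 0.0), 1.0)]
OBSTACLES = [((2.0, 0.5), 0.5)]
path = descend((0.0, 0.0), GOALS, OBSTACLES)
```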
Navigation system with traffic state detection mechanism and method of operation thereof
A navigation system includes: a control circuit configured to: generate a video clip by parsing an interval of a sensor data stream for a region of travel; analyze the video clip with a previously trained deep learning model, including identifying a traffic flow estimate; access a position coordinate for calculating a distance to intersection; generate a traffic flow state by fusing a corrected speed, the traffic flow estimate, and the distance to intersection; and merge a vehicle maneuvering instruction into the traffic flow state for maneuvering through the region of travel; and a communication circuit, coupled to the control circuit, configured to communicate the traffic flow state for displaying on a device.
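Purely as an illustration of the fusion step, a rule-based sketch of how a corrected speed, a model-derived flow estimate, and a distance to intersection might be fused into a traffic flow state with a merged maneuvering instruction; every threshold, label, and name here is hypothetical.

```python
def traffic_flow_state(corrected_speed, flow_estimate, dist_to_intersection):
    """Fuse three signals into a coarse traffic flow state.

    corrected_speed:      ego speed (m/s), corrected for sensor bias
    flow_estimate:        0..1 congestion score from the trained model
    dist_to_intersection: metres to the next intersection
    """
    if flow_estimate > 0.7 or corrected_speed < 2.0:
        state = "congested"
    elif flow_estimate > 0.4:
        state = "slow"
    else:
        state = "free_flow"
    # Merge a maneuvering instruction for the region of travel.
    if state == "congested" and dist_to_intersection < 50.0:
        instruction = "hold_position"
    elif state == "slow":
        instruction = "reduce_speed"
    else:
        instruction = "proceed"
    return {"state": state, "instruction": instruction}

s = traffic_flow_state(corrected_speed=1.5, flow_estimate=0.8,
                       dist_to_intersection=30.0)
```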
Sensor fusion for precipitation detection and control of vehicles
An apparatus includes a processor configured to be disposed with a vehicle and a memory coupled to the processor. The memory stores instructions to cause the processor to receive at least two of: radar data, camera data, lidar data, or sonar data. The sensor data are associated with a predefined region of a vicinity of the vehicle while the vehicle is traveling during a first time period. At least a portion of the vehicle is positioned within the predefined region during the first time period. The instructions also cause the processor to detect that no other vehicle is present within the predefined region. An environment of the vehicle during the first time period is classified as one state from a set of states that includes at least one of dry, light rain, heavy rain, light snow, or heavy snow, based on the at least two of the sensor data, to produce an environment classification. An operational parameter of the vehicle is then modified based on the environment classification.
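A hypothetical sketch of the classification step: with the predefined region known to be empty of other vehicles, degradation in at least two modalities can be attributed to precipitation. The features, thresholds, and the snow/rain discriminator below are invented for illustration and are not taken from the patent.

```python
def classify_environment(radar_noise, camera_contrast, lidar_returns=None):
    """Classify the environment from at least two sensor modalities.

    Assumes the predefined region ahead of the vehicle contains no other
    vehicle, so signal degradation indicates precipitation.

    radar_noise:     0..1 clutter level observed in the empty region
    camera_contrast: 0..1 image contrast over the same region
    lidar_returns:   optional 0..1 rate of spurious near-range returns
    """
    # Combine the available modalities into one degradation score.
    scores = [radar_noise, 1.0 - camera_contrast]
    if lidar_returns is not None:
        scores.append(lidar_returns)
    degradation = sum(scores) / len(scores)

    # Hypothetical discriminator: snow scatters lidar far more than rain.
    snowy = lidar_returns is not None and lidar_returns > 0.6

    if degradation < 0.2:
        return "dry"
    if degradation < 0.5:
        return "light_snow" if snowy else "light_rain"
    return "heavy_snow" if snowy else "heavy_rain"

state = classify_environment(radar_noise=0.1, camera_contrast=0.9)
state2 = classify_environment(radar_noise=0.6, camera_contrast=0.3,
                              lidar_returns=0.8)
```

The resulting classification would then drive the modification of an operational parameter, e.g. lowering a speed cap or lengthening following distance in the heavy-precipitation states.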