B60W2556/35

DYNAMIC SENSOR DATA AUGMENTATION VIA DEEP LEARNING LOOP

Systems and methods for dynamic sensor data adaptation using a deep learning loop are provided. A method includes classifying, using a discriminator model, a first object from first sensor data associated with a first sensing condition, wherein the discriminator model is trained for a second sensing condition different from the first sensing condition; generating, using a generator model in response to the discriminator model failing to classify the first object, second sensor data representing a second object comprising at least a modified element of the first object; classifying, using the discriminator model, the second object from the second sensor data; and adapting, based at least in part on a difference between the first object and the second object in response to the discriminator model successfully classifying the second object, a machine learning model associated with object classification for the first sensing condition.
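The classify-generate-reclassify loop in this abstract can be sketched with toy stand-ins. Everything here is hypothetical: the publication specifies no API, so `Discriminator`, `Generator`, the `brightness` feature, and the threshold are illustrative assumptions only.

```python
class Discriminator:
    """Toy discriminator trained for one sensing condition: it only
    recognizes objects whose brightness is above a threshold (assumption)."""
    def classify(self, data):
        return "vehicle" if data["brightness"] >= 0.5 else None

class Generator:
    """Toy generator: produces second sensor data with a modified
    element (here, increased brightness)."""
    def generate(self, data):
        return dict(data, brightness=min(1.0, data["brightness"] + 0.4))

def adaptation_loop(disc, gen, first_data):
    """Sketch of the claimed loop: classify; on failure, generate a
    modified object, reclassify, and derive the adaptation delta from
    the difference between the first and second objects."""
    label = disc.classify(first_data)
    if label is not None:
        return label, None  # no adaptation needed for this condition
    second_data = gen.generate(first_data)
    label = disc.classify(second_data)
    if label is None:
        return None, None  # generation did not help
    # The per-element difference drives adaptation of the ML model.
    delta = {k: second_data[k] - first_data[k] for k in first_data}
    return label, delta
```

The returned `delta` stands in for the "difference between the first object and the second object" that the claim uses to adapt the condition-specific model.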

Sensor fusion for autonomous machine applications using machine learning

In various examples, a multi-sensor fusion machine learning model—such as a deep neural network (DNN)—may be deployed to fuse data from a plurality of individual machine learning models. As such, the multi-sensor fusion network may use outputs from a plurality of machine learning models as input to generate a fused output that represents data from fields of view or sensory fields of each of the sensors supplying the machine learning models, while accounting for learned associations between boundary or overlap regions of the various fields of view of the source sensors. In this way, the fused output may be less likely to include duplicate, inaccurate, or noisy data with respect to objects or features in the environment, as the fusion network may be trained to account for multiple instances of a same object appearing in different input representations.
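The deduplication behavior attributed to the fusion network (suppressing multiple instances of the same object seen by different sensors) can be illustrated with a naive rule-based stand-in. This is not the learned DNN the abstract describes; the detection tuple layout and merge distance are assumptions for illustration.

```python
def fuse_detections(per_sensor_outputs, merge_dist=1.0):
    """Naive late-fusion stand-in for the learned fusion network: merges
    detections from different sensors that refer to the same object
    (closer than merge_dist in a shared world frame), keeping the most
    confident instance. Each detection is (x, y, confidence)."""
    fused = []
    for detections in per_sensor_outputs:
        for det in detections:
            for i, (fx, fy, fc) in enumerate(fused):
                if ((det[0] - fx) ** 2 + (det[1] - fy) ** 2) ** 0.5 < merge_dist:
                    # Same object seen by another sensor: keep best confidence.
                    if det[2] > fc:
                        fused[i] = det
                    break
            else:
                fused.append(det)  # new object, no overlap with fused set
    return fused
```

A trained fusion network would instead learn these boundary/overlap associations from data rather than relying on a fixed distance threshold.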

VEHICLE POSITIONING METHOD VIA DATA FUSION AND SYSTEM USING THE SAME

A vehicle positioning method via data fusion and a system using the same are disclosed. The method is performed in a processor electrically connected to a self-driving-vehicle controller and multiple electronic systems. The method performs a delay correction according to a first real-time coordinate, a second real-time coordinate, real-time lane recognition data, multiple vehicle dynamic parameters, and multiple vehicle information received from the multiple electronic systems with their weight values, to generate a fusion positioning coordinate and to determine confidence indexes. The method then outputs the first real-time coordinate, the second real-time coordinate, and the real-time lane recognition data that are processed by the delay correction, together with the fusion positioning coordinate and the confidence indexes, to the self-driving-vehicle controller for a self-driving operation.
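A minimal numeric sketch of delay-corrected weighted fusion, under stated assumptions: each source carries a position, a velocity (standing in for the vehicle dynamic parameters), a timestamp, and a weight. The publication does not specify this data layout or combination rule; both are illustrative.

```python
def fuse_position(sources, now):
    """Delay-correct each source by extrapolating its coordinate to
    'now' with its velocity, then take the weighted average to form
    the fusion positioning coordinate (illustrative scheme)."""
    wx = wy = wsum = 0.0
    for s in sources:
        dt = now - s["t"]             # sensor latency to compensate
        x = s["x"] + s["vx"] * dt     # delay-corrected coordinates
        y = s["y"] + s["vy"] * dt
        wx += s["w"] * x
        wy += s["w"] * y
        wsum += s["w"]
    return wx / wsum, wy / wsum
```

A confidence index per source could then be derived from, e.g., the spread of the delay-corrected coordinates around the fused result.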

Navigation based on free space determination

Systems and methods navigate a vehicle by determining a free space region in which the vehicle can travel. In one implementation, a system may include at least one processor programmed to receive, from an image capture device, a plurality of images associated with the environment of a vehicle, analyze at least one of the plurality of images to identify a first free space boundary on a driver side of the vehicle and extending forward of the vehicle, a second free space boundary on a passenger side of the vehicle and extending forward of the vehicle, and a forward free space boundary forward of the vehicle and extending between the first free space boundary and the second free space boundary. The first free space boundary, the second free space boundary, and the forward free space boundary may define a free space region forward of the vehicle. The at least one processor of the system may be further programmed to determine a navigational path for the vehicle through the free space region and cause the vehicle to travel on at least a portion of the determined navigational path within the free space region forward of the vehicle.
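The three-boundary construction above can be sketched geometrically. The sampled-boundary representation and the centerline path rule are assumptions, not taken from the publication.

```python
def centerline_path(left_boundary, right_boundary, forward_y):
    """Given the driver-side and passenger-side free space boundaries as
    lists of (y, x) samples extending forward of the vehicle, and the
    forward boundary at distance forward_y, return a navigational path
    through the free space region (here: the midpoint between the two
    lateral boundaries at each forward distance)."""
    path = []
    for (y, xl), (_, xr) in zip(left_boundary, right_boundary):
        if y > forward_y:
            break  # beyond the forward free space boundary
        path.append((y, (xl + xr) / 2.0))
    return path
```

A real system would derive the boundary samples from image analysis and smooth the path for vehicle dynamics; this only shows how the three boundaries jointly bound the region and yield a path.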

SYSTEM AND METHOD FOR DETECTING RAINFALL FOR AN AUTONOMOUS VEHICLE
20230182742 · 2023-06-15

A system includes an autonomous vehicle and a control device associated with the autonomous vehicle. The control device obtains a plurality of sensor data captured by sensors of the autonomous vehicle. The control device determines a plurality of rainfall levels based on the sensor data. Each rainfall level is captured by a different sensor. The control device determines an aggregated rainfall level in a particular time period by combining the plurality of rainfall levels determined during the particular time period. The control device selects a particular object detection algorithm for detecting objects by at least one sensor. The particular object detection algorithm is configured to filter at least a portion of interference caused by the aggregated rainfall level in the sensor data. The control device causes the particular object detection algorithm to be implemented for the at least one sensor.
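The aggregate-then-select logic can be sketched as follows. The abstract does not specify the combination rule or thresholds; the weighted mean, level scale, and algorithm names are all illustrative assumptions.

```python
def aggregate_rainfall(levels_by_sensor, weights=None):
    """Combine per-sensor rainfall levels from one time window into a
    single aggregated level (here a weighted mean; the publication does
    not specify the rule)."""
    sensors = list(levels_by_sensor)
    weights = weights or {s: 1.0 for s in sensors}
    total = sum(weights[s] * levels_by_sensor[s] for s in sensors)
    return total / sum(weights[s] for s in sensors)

def select_detection_algorithm(aggregated_level):
    """Pick an object-detection variant whose interference filtering
    matches the aggregated rain level (illustrative thresholds)."""
    if aggregated_level < 0.2:
        return "standard"
    if aggregated_level < 0.6:
        return "light-rain-filter"
    return "heavy-rain-filter"
```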

Autonomous Driving Control Apparatus and Method Thereof
20230182768 · 2023-06-15

An autonomous driving control apparatus includes a sensor device obtaining information around an autonomous vehicle and including a plurality of sensors, a memory storing information about a high definition map around the autonomous vehicle, and a controller. The controller classifies the sensors into at least one sensor set based on the information around the autonomous vehicle and the information about the high definition map, using a sensor set classification table; monitors a computational resource utilization rate and a resource occupancy rate of the memory; calculates a determiner input drop rate and determines whether there is an available resource, using the monitored computational resource utilization rate and the monitored resource occupancy rate; determines whether to additionally allocate at least one determiner, using the determiner input drop rate and whether there is the available resource; and changes an autonomous driving determination period.
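The allocation decision described above reduces to a guarded condition: add a determiner only when inputs are being dropped and spare resources exist. The thresholds and signature below are illustrative assumptions, not values from the publication.

```python
def should_add_determiner(cpu_util, mem_occupancy, input_drop_rate,
                          drop_threshold=0.05, cpu_limit=0.8, mem_limit=0.8):
    """Decide whether to allocate an additional determiner: allocate
    only when the determiner input drop rate exceeds a threshold AND
    the monitored compute/memory rates indicate an available resource
    (all limits here are illustrative)."""
    has_resource = cpu_util < cpu_limit and mem_occupancy < mem_limit
    return input_drop_rate > drop_threshold and has_resource
```

When no resource is available despite drops, the apparatus instead changes the autonomous driving determination period, effectively reducing load rather than adding capacity.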

AUTONOMOUS DRIVING ASSISTANCE SYSTEM
20230174097 · 2023-06-08

Provided is an autonomous driving assistance system capable of monitoring the whole surrounding area around a vehicle using both a roadside sensor and a sensor mounted to the vehicle. This autonomous driving assistance system includes a roadside sensor device and an autonomous driving control device mounted to a vehicle in a predetermined area, and an obstacle information processing device which communicates with these. The obstacle information processing device outputs, to the autonomous driving control device of each vehicle in the area, obstacle information around the vehicle, using a first obstacle detection result in an absolute coordinate system which is obstacle information detected by the roadside sensor device and a second obstacle detection result in an absolute coordinate system which is obstacle information detected by a sensor mounted to the vehicle.
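Combining roadside and onboard detections hinges on expressing both in the same absolute coordinate system; the onboard detection must be transformed out of the vehicle frame first. A standard 2D rigid transform sketch (the pose/detection layout is an assumption):

```python
import math

def to_absolute(vehicle_pose, local_detection):
    """Transform a detection from the vehicle frame into the absolute
    coordinate system shared with the roadside sensor device.
    vehicle_pose = (x, y, heading in radians); local_detection = (x, y)
    relative to the vehicle."""
    px, py, heading = vehicle_pose
    lx, ly = local_detection
    # Rotate by the vehicle heading, then translate by its position.
    ax = px + lx * math.cos(heading) - ly * math.sin(heading)
    ay = py + lx * math.sin(heading) + ly * math.cos(heading)
    return ax, ay
```

Once both detection results are in absolute coordinates, the obstacle information processing device can merge them directly (e.g., by proximity) before broadcasting to each vehicle in the area.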

METHOD AND ARRANGEMENT FOR MONITORING AND ADAPTING THE PERFORMANCE OF A FUSION SYSTEM OF AN AUTONOMOUS VEHICLE
20170297571 · 2017-10-19

Disclosed herein is a method and arrangement for monitoring and adapting the performance of a fusion system of an autonomous road vehicle. A drivable area is determined by combining a localization function and high density map data. Information on surrounding objects is determined, comprising determining the localization and classifying the surrounding objects, estimating their physical property states, and assigning them extension descriptions. Information on the drivable area and surrounding objects is condensed into observed areas, monitored by sensors of the environmental perception function with a predetermined degree of certainty, and prioritized objects, represented by classes, state estimates, and extension descriptions. The fusion system monitors itself retrospectively by evaluating its current determinations of drivable area, prioritized objects, and observed areas against its previous determinations thereof; if a previous determination differs by more than a predetermined amount from a current determination, the fusion system is adapted to account for that discrepancy.
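The retrospective self-check reduces to comparing successive determinations against a tolerance and flagging what exceeds it. The scalar-metric representation below is an assumption; the publication compares richer structures (areas, object states, extension descriptions).

```python
def monitor_fusion(prev, curr, tolerance):
    """Compare previous and current determinations (e.g. drivable-area
    size, object counts or state estimates, each reduced to a scalar
    for illustration) and return the discrepancies that exceed the
    predetermined tolerance, i.e. those that should trigger adaptation."""
    return {k: curr[k] - prev[k] for k in prev
            if abs(curr[k] - prev[k]) > tolerance}
```

An empty result means the fusion system's current output is consistent with its history and no adaptation is needed this cycle.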

SYSTEM, METHOD AND COMPUTER PROGRAM TO SUPPRESS VIBRATIONS IN A VEHICLE
20220306130 · 2022-09-29

An electronic system for controlling vibrations and/or inertial forces occurring at a plurality of areas of interest within an operating vehicle, the system comprising circuitry configured to: receive input data comprising sensor data from one or more environment sensors (12) and/or one or more internal sensors (14); convert, by means of a machine learning system (18), the input data into actuator settings; and transmit the actuator settings to one or more actuators (20) to control vibrations and/or inertial forces occurring at each of the plurality of areas of interest within the vehicle.
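One control cycle of the claimed pipeline, with a linear map standing in for the machine learning system (18). The model form and the list-of-readings interface are assumptions; only the receive → convert → transmit flow is from the abstract.

```python
class LinearVibrationModel:
    """Stand-in for the trained machine learning system (18): a fixed
    linear map from sensor features to per-actuator settings."""
    def __init__(self, weights):
        self.weights = weights  # weights[actuator][feature]

    def __call__(self, features):
        return [sum(w * f for w, f in zip(row, features))
                for row in self.weights]

def control_step(model, env_readings, internal_readings):
    """One cycle: fuse environment (12) and internal (14) sensor data,
    convert them to settings for the actuators (20) via the model (18)."""
    features = list(env_readings) + list(internal_readings)
    return model(features)
```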

TRAVEL ROUTE GENERATION SYSTEM, TRAVEL ROUTE GENERATION PROGRAM, AND TRAVEL ROUTE GENERATION METHOD

A travel route generation system includes: a first collector that collects running location information indicating running locations through which a plurality of vehicles have run; a second collector that collects vehicle-related information related to the plurality of vehicles; a memory that stores the vehicle-related information in association with the running location information; a criteria inputter that receives inputs of screening criteria including a criterion regarding the vehicle-related information; a processor that generates a recommended running route according to the screening criteria and based on the running location information and the vehicle-related information which are stored in the memory; and an outputter that outputs the recommended running route generated by the processor.
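The screening step above is essentially a filter of stored location records against user-supplied criteria. The predicate-based representation below is an illustrative assumption; the publication leaves the criteria format open.

```python
def recommend_route(records, criteria):
    """Generate a recommended running route: keep the running locations
    whose associated vehicle-related information satisfies every
    screening criterion (here, criteria are predicates over that
    information)."""
    return [loc for loc, info in records
            if all(pred(info) for pred in criteria)]
```

In the claimed system, `records` corresponds to the memory pairing running-location information with vehicle-related information, and the result is what the outputter emits.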