B60W50/06

Multi-sensor sequential calibration system

Techniques for performing a sensor calibration using sequential data are disclosed. An example method includes receiving, from a first camera located on a vehicle, a first image comprising at least a portion of a road comprising lane markers, where the first image is obtained by the first camera at a first time; obtaining a calculated value of a position of an inertial measurement (IM) device at the first time; obtaining an optimized first extrinsic matrix of the first camera by adjusting a function of a first actual pixel location of a lane marker in the first image and an expected pixel location of the lane marker; and performing autonomous operation of the vehicle using the optimized first extrinsic matrix of the first camera when the vehicle is operated on another road or at another time.
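The core idea — adjust the camera extrinsic until the expected pixel location of a lane marker matches the observed one — can be sketched as a reprojection-error minimization. Everything below (the pinhole intrinsics, reducing the extrinsic to a single lateral offset, the grid search) is an illustrative assumption, not the patented method.

```python
# Hedged sketch: minimize the distance between an observed lane-marker pixel
# and the pixel predicted from the current extrinsic guess. The intrinsics
# (fx, fy, cx, cy) and the 1-D extrinsic parameter are invented for brevity.

def project(point_cam, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Pinhole projection of a camera-frame 3-D point to a pixel."""
    x, y, z = point_cam
    return (fx * x / z + cx, fy * y / z + cy)

def reprojection_error(extrinsic_tx, marker_world, actual_pixel):
    """Distance between the observed pixel and the expected pixel under the
    current extrinsic guess (here just a lateral translation offset)."""
    x, y, z = marker_world
    ex, ey = project((x + extrinsic_tx, y, z))
    return ((ex - actual_pixel[0]) ** 2 + (ey - actual_pixel[1]) ** 2) ** 0.5

def calibrate(marker_world, actual_pixel, lo=-1.0, hi=1.0, steps=2000):
    """Grid search for the extrinsic offset minimizing reprojection error."""
    best_tx, best_err = lo, float("inf")
    for i in range(steps + 1):
        tx = lo + (hi - lo) * i / steps
        err = reprojection_error(tx, marker_world, actual_pixel)
        if err < best_err:
            best_tx, best_err = tx, err
    return best_tx

# A lane marker 10 m ahead, 0.5 m left; the "observed" pixel is generated
# with a known 0.2 m offset, so the search should recover roughly 0.2.
marker = (-0.5, 0.0, 10.0)
observed = project((-0.5 + 0.2, 0.0, 10.0))
print(round(calibrate(marker, observed), 3))
```

A real implementation would optimize the full 6-DoF extrinsic jointly over many markers and timestamps, typically with a nonlinear least-squares solver rather than a grid search.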
De-Aliased Imaging for a Synthetic Aperture Radar
20230018183 · 2023-01-19

This document describes techniques for enabling de-aliased imaging for a synthetic aperture radar. Radar signals processed by a synthetic aperture radar (SAR) system may include false detections in the form of aliasing induced by grating lobes. The techniques described herein can reduce the adverse effects of grating lobes by obtaining an initial SAR image using a back-projection algorithm. Aliasing effects (e.g., false detections) in this initial image may be common due to the limitations of a SAR system moving at non-uniform speeds. A refined image is produced from the initial image by applying a de-aliasing filter to the initial image. The refined image may have reduced or eliminated false detections attributable to aliasing effects, resulting in a better representation of the environment of the vehicle.
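The initial-image-then-filter pipeline can be illustrated in one dimension. The "SAR image" data and the relative-threshold filter below are purely illustrative stand-ins; the actual back-projection algorithm and de-aliasing filter design are not specified by the abstract.

```python
# Illustrative sketch only: a 1-D response profile with one true target and
# a weaker grating-lobe ghost, cleaned by a simple relative-threshold filter
# that treats weak repeated lobes as aliasing artifacts.

def dealias(image, rel_threshold=0.5):
    """Zero out responses below rel_threshold * peak amplitude."""
    peak = max(image)
    return [v if v >= rel_threshold * peak else 0.0 for v in image]

# One true target (amplitude 1.0) plus a grating-lobe ghost (0.3) at index 5.
initial = [0.0, 0.1, 1.0, 0.1, 0.0, 0.3, 0.0]
refined = dealias(initial)
print(refined)
```

The filter removes the ghost at index 5 while keeping the true peak; a practical de-aliasing filter would instead exploit knowledge of the array geometry and platform motion to predict where grating lobes fall.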
SENSOR INFORMATION FUSION METHOD AND DEVICE, AND RECORDING MEDIUM RECORDING PROGRAM FOR EXECUTING THE METHOD
20230020920 · 2023-01-19

A sensor information fusion method of an embodiment includes obtaining N sensor tracks from each of a plurality of sensors with respect to a target located around a vehicle, calculating association costs of the N sensor tracks with respect to M reference tracks and storing the association costs in matrix form, calculating an arrangement of reference tracks and sensor tracks that minimizes the association costs over the matrix, and outputting a sensing information result with respect to the target according to the calculated arrangement of the reference tracks and the sensor tracks.
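The cost-matrix-plus-assignment step is a classic track-association problem. In the sketch below, the squared-distance cost and the demo track positions are assumptions, and a brute-force search over permutations stands in for a real solver such as the Hungarian algorithm.

```python
# Sketch of track association: build an M x N cost matrix between reference
# tracks and sensor tracks, then pick the assignment minimizing total cost.

from itertools import permutations

def association_cost(ref, sensor):
    """Toy cost: squared 2-D distance between track positions."""
    return (ref[0] - sensor[0]) ** 2 + (ref[1] - sensor[1]) ** 2

def best_assignment(refs, sensors):
    """Return (assignment, total cost); assignment[i] is the sensor-track
    index paired with reference track i."""
    cost = [[association_cost(r, s) for s in sensors] for r in refs]
    best, best_total = None, float("inf")
    for perm in permutations(range(len(sensors)), len(refs)):
        total = sum(cost[i][j] for i, j in enumerate(perm))
        if total < best_total:
            best, best_total = perm, total
    return best, best_total

refs = [(0.0, 0.0), (10.0, 0.0)]
sensors = [(10.2, 0.1), (0.1, -0.1)]
assignment, total = best_assignment(refs, sensors)
print(assignment)  # (1, 0): ref 0 pairs with sensor 1, ref 1 with sensor 0
```

Brute force is O(N!) and only viable for tiny track counts; production fusion stacks use an O(N^3) assignment solver (e.g. `scipy.optimize.linear_sum_assignment`) on the same cost matrix.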
Method for operating a driver assistance system of an ego vehicle having at least one surroundings sensor for detecting the surroundings of the ego vehicle, computer readable medium, system and vehicle

A driver assistance system of an ego vehicle is operated. The ego vehicle has at least one surroundings sensor for detecting the surroundings of the ego vehicle. Movements of multiple vehicles in the surroundings of the ego vehicle are detected with the at least one surroundings sensor. A movement model is generated based on the detected movements of the respective vehicles. A traffic situation and a probability of correct classification of the traffic situation are ascertained by a machine learning method on the basis of the generated movement model, using the learned characteristic features of the movement model. The driver assistance system of the ego vehicle is adapted to the ascertained traffic situation.
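A minimal sketch of "classify the traffic situation and report a classification probability" follows. The nearest-centroid model, the two movement features (mean speed, mean lateral offset), and the situation labels are all invented for illustration; the abstract does not specify the machine learning method.

```python
# Hedged sketch: classify a traffic situation from aggregated movement
# features and return a softmax-style confidence alongside the label.

import math

# Invented demo centroids standing in for learned characteristic features:
# (mean speed in m/s, mean lateral offset in m) per situation.
CENTROIDS = {
    "free_flow": (30.0, 0.0),
    "traffic_jam": (2.0, 0.0),
    "lane_change_wave": (15.0, 1.5),
}

def classify(features):
    """Return (situation, probability) via softmax over negative distances."""
    scores = {k: -math.dist(features, c) for k, c in CENTROIDS.items()}
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    z = sum(exps.values())
    label = max(exps, key=exps.get)
    return label, exps[label] / z

label, prob = classify((3.0, 0.1))  # slow, centered -> traffic jam
print(label, round(prob, 3))
```

The reported probability is what lets the assistance system decide whether the classification is trustworthy enough to adapt its behavior.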
TOOLS FOR PERFORMANCE TESTING AND/OR TRAINING AUTONOMOUS VEHICLE PLANNERS

A computer-implemented method of evaluating the performance of a target planner for an ego robot in a real or simulated scenario, the method comprising: receiving evaluation data for evaluating the performance of the target planner in the scenario, the evaluation data generated by applying the target planner at incrementing planning steps, in order to compute a series of ego plans that respond to changes in the scenario, the series of ego plans being implemented in the scenario to cause changes in an ego state, the evaluation data comprising: the ego plan computed by the target planner at one of the planning steps, and a scenario state at a time instant of the scenario, wherein the evaluation data is used to evaluate the target planner by: computing a reference plan for said time instant based on the scenario state, the scenario state including the ego state at that time instant as caused by implementing one or more preceding ego plans of the series of ego plans computed by the target planner, and computing at least one evaluation score for comparing the ego plan with the reference plan.
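The final step — an evaluation score comparing the target planner's ego plan against a reference plan for the same scenario state — can be sketched as below. The plan format (a list of (x, y) waypoints) and the RMSE-style metric are illustrative assumptions, not the tool's actual scoring.

```python
# Sketch of plan scoring: root-mean-square deviation between corresponding
# waypoints of the target planner's ego plan and the reference plan; a lower
# score means the target planner tracks the reference more closely.

def evaluation_score(ego_plan, reference_plan):
    """RMS distance between waypoint pairs of two equal-length plans."""
    assert len(ego_plan) == len(reference_plan)
    sq = [(ex - rx) ** 2 + (ey - ry) ** 2
          for (ex, ey), (rx, ry) in zip(ego_plan, reference_plan)]
    return (sum(sq) / len(sq)) ** 0.5

# Demo: the ego plan drifts slightly off a straight reference plan.
ego = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.3)]
ref = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(round(evaluation_score(ego, ref), 3))
```

Scoring the plan at each planning step against a reference computed from the *actual* scenario state (including the ego state the planner itself caused) is what makes the evaluation sensitive to compounding errors, which a one-shot comparison would miss.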