
Parking assistant and method for adaptive parking of a vehicle to optimize overall sensing coverage of a traffic environment

A method can be used for adaptive parking of a vehicle. A parking area is determined around a programmed destination of the vehicle. The parking area has more than one available parking spot for the vehicle. Parking data is acquired via a wireless communication network. The parking data for each parked vehicle includes a parking position and an individual sensing coverage of an environment sensor system of the respective parked vehicle scanning the traffic environment within the parking area. The available parking spots are ranked by the overall sensing coverage of the traffic environment that would result in the parking area, and a recommended parking spot is determined from that ranking.
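The ranking step can be sketched by modeling each vehicle's sensing coverage as a set of discretized grid cells; the overall coverage for a candidate spot is then the union of the parked vehicles' cells with the ego vehicle's cells from that spot. All names and data below are illustrative, not the patented algorithm:

```python
def rank_spots(parked_coverages, spot_coverages):
    """Rank spot ids by the overall coverage achieved if the ego vehicle parks there."""
    base = set().union(*parked_coverages) if parked_coverages else set()
    scores = {spot: len(base | cov) for spot, cov in spot_coverages.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Coverage of already-parked vehicles, as sets of covered grid cells
parked = [{(0, 0), (0, 1)}, {(1, 1)}]
# Coverage the ego vehicle's sensors would add from each free spot
spots = {"A": {(2, 2)}, "B": {(0, 0)}}  # spot B only duplicates existing coverage
print(rank_spots(parked, spots))  # ['A', 'B']
```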

Radar system control to perform cross-traffic management in a vehicle with a trailer

Systems and methods to control a radar system to perform cross-traffic management in a vehicle with a trailer involve determining a location of the trailer behind the vehicle, and adjusting an initial field of view of the radar system to a modified field of view that excludes the location of the trailer. One or more other vehicles are detected with the radar system having the modified field of view. A cross-traffic alert is implemented based on detecting the one or more other vehicles.
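The field-of-view adjustment can be illustrated as subtracting the trailer's angular sector from the radar's initial azimuth interval, leaving one or two remaining sectors to scan. The angles and function names are assumptions for illustration:

```python
def modified_fov(initial_fov, trailer_sector):
    """Split an angular field of view (degrees) so it excludes the trailer sector."""
    lo, hi = initial_fov
    t_lo, t_hi = trailer_sector
    segments = []
    if lo < t_lo:
        segments.append((lo, min(hi, t_lo)))   # sector left of the trailer
    if hi > t_hi:
        segments.append((max(lo, t_hi), hi))   # sector right of the trailer
    return segments

# Rear-facing radar covering -75..+75 degrees; trailer occupies -20..+20 degrees
print(modified_fov((-75, 75), (-20, 20)))  # [(-75, -20), (20, 75)]
```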

Comprehensive and efficient method to incorporate map features for object detection with LiDAR

According to various embodiments, systems and methods described in the disclosure combine map features with point cloud features to improve object detection precision of an autonomous driving vehicle (ADV). The map features and the point cloud features can be extracted from a perception area of the ADV within a particular angle view at each driving cycle based on a position of the ADV. The map features and the point cloud features can be concatenated and provided to a neural network for object detection.
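The concatenation step can be sketched with NumPy, assuming per-cell features on a bird's-eye-view grid; the channel counts and feature semantics below are illustrative assumptions, not the disclosed pipeline:

```python
import numpy as np

def fuse_features(point_cloud_feats, map_feats):
    """Concatenate per-cell point-cloud and map features along the channel axis."""
    return np.concatenate([point_cloud_feats, map_feats], axis=-1)

pc = np.random.rand(64, 64, 8)   # e.g. BEV grid with 8 LiDAR-derived channels
mp = np.random.rand(64, 64, 3)   # e.g. lane / crosswalk / junction mask channels
fused = fuse_features(pc, mp)    # this tensor would feed the neural network
print(fused.shape)  # (64, 64, 11)
```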

Automatic autonomous vehicle and robot LiDAR-camera extrinsic calibration

Extrinsic calibration of a Light Detection and Ranging (LiDAR) sensor and a camera can comprise constructing a first plurality of reconstructed calibration targets in a three-dimensional space based on physical calibration targets detected from input from the LiDAR and a second plurality of reconstructed calibration targets in the three-dimensional space based on physical calibration targets detected from input from the camera. Reconstructed calibration targets in the first and second plurality of reconstructed calibration targets can be matched and a six-degree-of-freedom rigid body transformation of the LiDAR and camera can be computed based on the matched reconstructed calibration targets. A projection of the LiDAR to the camera can be computed based on the computed six-degree-of-freedom rigid body transformation.
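Computing a six-degree-of-freedom rigid body transformation from matched 3-D targets is commonly done with the Kabsch/Umeyama algorithm (SVD of the cross-covariance of the matched point sets). The sketch below assumes perfectly matched target centers and is one standard approach, not necessarily the method claimed:

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch/Umeyama: least-squares 6-DoF rigid transform mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic matched target centers in the LiDAR (src) and camera (dst) frames
R0 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t0 = np.array([0.5, -0.2, 1.0])
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
dst = src @ R0.T + t0
R, t = rigid_transform(src, dst)
print(np.allclose(R, R0), np.allclose(t, t0))  # True True
```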

Radar system for internal and external environmental detection

Examples disclosed herein relate to radar systems to coordinate detection of objects external to the vehicle and distractions within the vehicle. A method of environmental detection with a radar system includes detecting an object in an external environment of a vehicle with the radar system positioned on the vehicle. The method includes determining a distraction metric from measurements of user activity obtained within the vehicle with the radar system. The method includes adjusting one or more detection parameters of the radar system based at least on the detected object and the distraction metric. Other examples disclosed herein relate to a radar sensing unit for a vehicle that includes an internal distraction sensor, an external object detection sensor, a coordination sensor and a central controller for internal and external environmental detection.
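One hypothetical way to couple the distraction metric to the detection parameters is a simple scaling rule: the more distracted the driver, the wider the alert range and the faster the scan rate. Every name and threshold below is an assumption for illustration:

```python
def adjust_detection_params(object_range_m, distraction_metric, base_alert_range=30.0):
    """Widen the alert range and raise the scan rate as distraction grows.

    distraction_metric is assumed to lie in [0, 1]."""
    alert_range = base_alert_range * (1.0 + distraction_metric)  # up to 2x when distracted
    scan_rate_hz = 10.0 + 10.0 * distraction_metric
    alert = object_range_m <= alert_range
    return {"alert_range_m": alert_range, "scan_rate_hz": scan_rate_hz, "alert": alert}

# Object at 45 m would not alert an attentive driver (30 m range),
# but does alert a distracted one (widened to 54 m)
print(adjust_detection_params(45.0, 0.8))
```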

Vehicle and method of controlling the same
11511731 · 2022-11-29

A method of controlling a vehicle includes: recognizing a forward vehicle in response to the processing of image data captured by an image sensor disposed at the vehicle so as to have a field of view outside the vehicle; obtaining a distance from the forward vehicle in response to the processing of detection data captured by a radar disposed at the vehicle so as to have a detection area outside the vehicle; obtaining a change amount of vertical movement of the forward vehicle in the image data in response to the distance from the forward vehicle being equal to or less than a reference distance; obtaining a height of an obstacle on a road surface corresponding to the change amount; obtaining the height of the obstacle on the road surface in the image data in response to the distance from the forward vehicle exceeding the reference distance; identifying a driving speed of the vehicle; identifying a reference height corresponding to the driving speed of the vehicle; and outputting deceleration guide information in response to the height of the obstacle on the road surface being greater than or equal to the reference height.
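The branching between the near case (infer obstacle height from the forward vehicle's vertical motion) and the far case (measure the height directly in the image) can be sketched as follows; the pixel-to-meter and speed-to-threshold mappings are invented purely for illustration:

```python
def deceleration_guide(distance_m, ref_distance_m, vertical_change_px,
                       height_in_image_m, speed_kph):
    """Decide whether to output deceleration guide information."""
    if distance_m <= ref_distance_m:
        # Near: infer height from the forward vehicle's vertical motion
        obstacle_height_m = 0.01 * vertical_change_px   # assumed px-to-m mapping
    else:
        # Far: the obstacle height is measured directly in the image
        obstacle_height_m = height_in_image_m
    ref_height_m = 0.002 * speed_kph                    # assumed speed-based threshold
    return obstacle_height_m >= ref_height_m            # True -> guide the driver to slow

# 20 m away (within the 50 m reference): 30 px of bounce implies a 0.30 m bump,
# above the 0.12 m threshold at 60 km/h
print(deceleration_guide(20.0, 50.0, 30, 0.0, 60.0))  # True
```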

Simulating degraded sensor data

Simulated degraded sensor data may be generated for use in training a model. For instance, first sensor data collected by a sensor of a perception system of an autonomous vehicle may be received and converted into the simulated degraded sensor data for a particular degrading condition, such as a weather-related degrading condition. Then, the simulated degraded sensor data may be used to train a model for evaluating performance of the perception system to detect objects external to the autonomous vehicle under one or more conditions.
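A minimal sketch of a weather-related degrading condition, assuming the sensor data is a 1-D list of LiDAR range returns; the dropout and noise parameters are illustrative assumptions:

```python
import random

def degrade_scan(ranges, dropout_prob=0.2, noise_std=0.05, seed=0):
    """Simulate weather degradation: random point dropout plus Gaussian range noise."""
    rng = random.Random(seed)
    degraded = []
    for r in ranges:
        if rng.random() < dropout_prob:
            continue                          # return lost to e.g. rain absorption
        degraded.append(r + rng.gauss(0.0, noise_std))
    return degraded

clean = [10.0] * 100
noisy = degrade_scan(clean)  # this output would feed model training
print(len(noisy) < len(clean))  # True: some returns were dropped
```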

Driver assistance system and method
11505183 · 2022-11-22

A driver assistance system and method are disclosed. The driver assistance system includes a first sensor installed at a vehicle and configured to have a field of view directed forward from the vehicle to acquire front image data, a second sensor selected from a group of radar and LIDAR sensors, installed at the vehicle, and configured to have a field of view directed forward from the vehicle to acquire front detection data, and a controller having a processor configured to process the front image data and the front detection data. The controller is configured to detect a lane in which the vehicle is traveling, or detect a front object located in front of the vehicle, in response to the processing of the front image data and the front detection data; output a braking signal to a braking system of the vehicle when a collision between the vehicle and the front object is expected; and output a steering signal to a steering system of the vehicle when a collision between the vehicle and the front object is expected even with braking control.
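The two-stage escalation (brake when a collision is expected, additionally steer when braking alone cannot avoid it) can be sketched with time-to-collision thresholds; the threshold value and function names are assumptions:

```python
def assist_decision(ttc_s, ttc_with_braking_s, threshold_s=2.0):
    """Return the control signals to issue, based on time-to-collision estimates."""
    signals = []
    if ttc_s <= threshold_s:                    # collision expected
        signals.append("brake")
        if ttc_with_braking_s <= threshold_s:   # still expected even with braking
            signals.append("steer")
    return signals

print(assist_decision(1.5, 1.8))  # ['brake', 'steer']
print(assist_decision(1.5, 3.0))  # ['brake']
```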

BRAKING CONTROL APPARATUS
20230059051 · 2023-02-23

A braking control apparatus for controlling a braking operation for an own vehicle is configured to: acquire information on an object detected around the own vehicle; calculate, when a collision between the own vehicle and the object is predicted based on both an estimated route of the object estimated based on the acquired information on the object and an estimated route of the own vehicle, a collision range in the own vehicle at a collision timing between the own vehicle and the object or a collision range in the object at the collision timing; and control, according to a positional relationship between a predetermined braking-unrequired range in the own vehicle and the calculated collision range in the own vehicle or a positional relationship between a predetermined braking-unrequired range in the object and the calculated collision range in the object, whether to perform the braking operation for the own vehicle.
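The positional-relationship check can be illustrated as an interval-containment test: if the predicted collision range (e.g. a lateral extent across the bumper, in meters) lies entirely inside the braking-unrequired range, braking is skipped. The 1-D geometry and the example numbers are simplifications:

```python
def braking_required(collision_range, braking_unrequired_range):
    """Brake only if the collision range extends beyond the braking-unrequired range."""
    c_lo, c_hi = collision_range
    u_lo, u_hi = braking_unrequired_range
    contained = u_lo <= c_lo and c_hi <= u_hi
    return not contained

# Contact confined to the braking-unrequired zone: no braking needed
print(braking_required((-0.4, 0.2), (-0.5, 0.5)))  # False
# Contact extends outside the zone: perform the braking operation
print(braking_required((-0.8, 0.2), (-0.5, 0.5)))  # True
```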

VELOCITY REGRESSION SAFETY SYSTEM

Techniques for accurately predicting and avoiding collisions with objects detected in an environment of a vehicle are discussed herein. A vehicle safety system can implement a model to output data indicating an intersection probability between the object and a portion of the vehicle in the future. The model may employ a rear collision filter, a distance filter, and a time to stop filter to determine whether a predicted collision may be a false positive, in which case the techniques may include refraining from reporting such a predicted collision to a vehicle computing device that controls the vehicle.
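The three filters can be chained as a simple false-positive suppressor: if any filter fires, the predicted collision is not reported. The filter logic and thresholds below are illustrative assumptions, not the disclosed model:

```python
def is_false_positive(rear_approach, distance_m, time_to_stop_s, time_to_collision_s):
    """Return True when a predicted collision should be suppressed, not reported."""
    if rear_approach:                         # rear collision filter
        return True
    if distance_m > 50.0:                     # distance filter: object too far to matter yet
        return True
    if time_to_stop_s < time_to_collision_s:  # time-to-stop filter: vehicle halts first
        return True
    return False

print(is_false_positive(False, 12.0, 2.5, 1.2))  # False -> report the collision
print(is_false_positive(False, 12.0, 0.6, 1.2))  # True  -> suppress (stops in time)
```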