
Motion Classification Using Low-Level Detections
20230013221 · 2023-01-19

Techniques and apparatuses are described that implement motion classification using low-level detections. In particular, a radar system identifies fused detections associated with an object and determines whether the fused detections indicate that the object is moving. If the object is determined to be moving, or to be moving perpendicular to the host vehicle, a current motion counter or a perpendicular motion counter is incremented, respectively. The current motion flag and/or the perpendicular motion flag is set as true if the corresponding counter has a value greater than a threshold value. In response to either flag being set as true, the radar system increments a historical motion counter. The host vehicle is then operated based on the current motion flag, the perpendicular motion flag, and the historical motion counter. In this way, the radar system introduces hysteresis to improve the reliability and stability of motion classification.
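The counter/flag scheme above can be sketched as follows. This is a minimal illustration, not the patented implementation: the threshold value, the per-frame boolean inputs, and the absence of counter decay are all assumptions for demonstration.

```python
from dataclasses import dataclass

@dataclass
class MotionClassifier:
    """Hysteresis-style motion classification via counters and flags.
    Threshold and update cadence are illustrative assumptions."""
    threshold: int = 3
    current_counter: int = 0
    perpendicular_counter: int = 0
    historical_counter: int = 0
    current_flag: bool = False
    perpendicular_flag: bool = False

    def update(self, is_moving: bool, is_perpendicular: bool) -> None:
        # Increment the relevant counters for this set of fused detections.
        if is_moving:
            self.current_counter += 1
        if is_perpendicular:
            self.perpendicular_counter += 1
        # Flags go true only once a counter exceeds the threshold,
        # so a single noisy frame cannot flip the classification.
        self.current_flag = self.current_counter > self.threshold
        self.perpendicular_flag = self.perpendicular_counter > self.threshold
        # Either flag being true feeds the longer-term historical counter.
        if self.current_flag or self.perpendicular_flag:
            self.historical_counter += 1
```

The counters act as hysteresis: classification changes only after sustained evidence, which stabilizes the downstream vehicle-operation decision.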

SENSOR UNIT
20230221407 · 2023-07-13

A sensor unit for a vehicle includes an external sensor, a cleaning nozzle, and a housing. The external sensor is configured to obtain information about the external environment and has a sensing area set forward in the travel direction of the vehicle through an exposed surface that is exposed to the external environment. The cleaning nozzle has an injection port located in front of the exposed surface to inject a cleaning fluid onto the exposed surface from above the exposed surface in a yaw axis direction of the vehicle, thereby cleaning the exposed surface. The housing holds the external sensor therein and defines a recess, below the exposed surface in the yaw axis direction, that is recessed rearward in the travel direction from the exposed surface.

Extrinsic calibration of multiple vehicle sensors using combined target detectable by multiple vehicle sensors

Sensors coupled to a vehicle are calibrated, optionally using a dynamic scene with sensor targets around a motorized turntable that rotates the vehicle to different orientations. One vehicle sensor captures a representation of one feature of a sensor target, while another vehicle sensor captures a representation of a different feature of the sensor target, the two features of the sensor target having known relative positioning on the target. The vehicle generates a transformation that maps the captured representations of the two features to positions around the vehicle based on the known relative positioning of the two features on the target.
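A toy version of the mapping step above, under a translation-only assumption: sensor 1 observes feature A, sensor 2 observes feature B, and the A-to-B offset on the target is known. The function name and 2-D vectors are hypothetical; a real extrinsic calibration also solves for rotation (e.g., via a Kabsch-style fit over many targets).

```python
import numpy as np

def extrinsic_translation(feat_a_in_sensor1: np.ndarray,
                          feat_b_in_sensor2: np.ndarray,
                          a_to_b_on_target: np.ndarray) -> np.ndarray:
    """Translation-only sketch of relating two sensor frames through a
    combined target with two features at known relative positions."""
    # Express feature B in sensor 1's frame using the known on-target offset.
    b_in_sensor1 = feat_a_in_sensor1 + a_to_b_on_target
    # The discrepancy between the two observations of B is the
    # sensor-2-to-sensor-1 translation (rotation assumed identity here).
    return b_in_sensor1 - feat_b_in_sensor2
```

Repeating this across turntable orientations and targets would overconstrain the full rigid transformation rather than a single offset.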

SENSOR EVALUATION DEVICE
20230010354 · 2023-01-12

A sensor evaluation device evaluates a first sensor mounted on a sensor-mounting object. The sensor evaluation device is provided with a specific event detecting unit and a recording unit. The specific event detecting unit detects a specific event, which is at least one of (a) an unrecognized event, where a second sensor mounted on the sensor-mounting object recognizes a first target whereas the first sensor does not recognize the first target, and (b) a misrecognized event, where the first sensor recognizes a second target whereas the second sensor does not recognize the second target. The recording unit records information on the specific event when the specific event is detected.
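The two event types reduce to set differences between what each sensor recognizes. A minimal sketch, assuming each sensor's recognitions are represented as a set of target identifiers (a representation of our choosing, not the patent's):

```python
def detect_specific_events(first_sensor_targets: set, second_sensor_targets: set):
    """Return (unrecognized, misrecognized) events between two sensors.
    unrecognized: targets the second sensor sees but the first misses.
    misrecognized: targets the first sensor reports but the second does not."""
    unrecognized = second_sensor_targets - first_sensor_targets
    misrecognized = first_sensor_targets - second_sensor_targets
    return unrecognized, misrecognized
```

A recording unit would then log the returned events, e.g., for offline evaluation of the first sensor.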

Segmentation and classification of point cloud data

A system can include a computer including a processor and a memory, the memory storing instructions executable by the processor to receive point cloud data. The instructions further include instructions to generate a plurality of feature maps based on the point cloud data, each feature map of the plurality of feature maps corresponding to a parameter of the point cloud data. The instructions further include instructions to aggregate the plurality of feature maps into an aggregated feature map. The instructions further include instructions to generate, via a feedforward neural network, at least one of a segmentation output or a classification output based on the aggregated feature map.
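The feature-map generation and aggregation steps might be sketched as below. The binning scheme, grid size, and the assumption that point-cloud columns hold x, y, z, and intensity are all illustrative; the abstract does not specify how a "feature map per parameter" is built.

```python
import numpy as np

def aggregate_feature_maps(point_cloud: np.ndarray, grid: int = 8) -> np.ndarray:
    """Build one 2-D feature map per point-cloud parameter (column),
    then stack them into an aggregated feature map.
    Assumes x and y coordinates are normalized to [0, 1)."""
    maps = []
    # Bin every point into a grid cell by its (x, y) position.
    xs = np.clip((point_cloud[:, 0] * grid).astype(int), 0, grid - 1)
    ys = np.clip((point_cloud[:, 1] * grid).astype(int), 0, grid - 1)
    for col in range(point_cloud.shape[1]):
        fmap = np.zeros((grid, grid))
        # Accumulate this parameter's values into the occupied cells.
        np.add.at(fmap, (xs, ys), point_cloud[:, col])
        maps.append(fmap)
    # Aggregate: stack the per-parameter maps along a channel axis.
    return np.stack(maps, axis=-1)
```

The stacked output would then feed a feedforward network that emits the segmentation and/or classification heads.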

Model for excluding vehicle from sensor field of view

The technology relates to developing a highly accurate understanding of a vehicle's sensor fields of view in relation to the vehicle itself. A training phase is employed to gather sensor data in various situations and scenarios, and a modeling phase takes such information and identifies self-returns and other signals that should either be excluded from analysis during real-time driving or accounted for to avoid false positives. The result is a sensor field of view model for a particular vehicle, which can be extended to other similar makes and models of that vehicle. This approach enables a vehicle to determine whether sensor data corresponds to the vehicle itself or to something else. As a result, the detailed modeling allows the on-board computing system to make driving decisions and take other actions based on accurate sensor information.

Multi-domain neighborhood embedding and weighting of sampled data
11693090 · 2023-07-04

This document describes “Multi-domain Neighborhood Embedding and Weighting” (MNEW) for use in processing point cloud data, including sparsely populated data obtained from a lidar, a camera, a radar, or combination thereof. MNEW is a process based on a dilation architecture that captures pointwise and global features of the point cloud data involving multi-scale local semantics adopted from a hierarchical encoder-decoder structure. Neighborhood information is embedded in both static geometric and dynamic feature domains. A geometric distance, feature similarity, and local sparsity can be computed and transformed into adaptive weighting factors that are reapplied to the point cloud data. This enables an automotive system to obtain outstanding performance with sparse and dense point cloud data. Processing point cloud data via the MNEW techniques promotes greater adoption of sensor-based autonomous driving and perception-based systems.
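The weighting step above can be illustrated as follows. This is a hedged sketch only: the exponential combination and the alpha/beta/gamma scales are assumptions of ours, since the abstract names the three inputs (geometric distance, feature similarity, local sparsity) but not the exact transform.

```python
import numpy as np

def mnew_weights(geom_dist: np.ndarray, feat_sim: np.ndarray,
                 sparsity: np.ndarray, alpha: float = 1.0,
                 beta: float = 1.0, gamma: float = 1.0) -> np.ndarray:
    """Turn per-neighbor geometric distance, feature similarity, and local
    sparsity into normalized adaptive weighting factors (illustrative)."""
    # Closer, more similar, less sparse neighbors receive larger weights.
    w = (np.exp(-alpha * geom_dist)
         * np.exp(beta * feat_sim)
         * np.exp(-gamma * sparsity))
    return w / w.sum()  # normalize so the factors form a weighting
```

The resulting factors would be reapplied to the embedded neighborhood features before the encoder-decoder's next stage.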

METHOD FOR DETECTING AN OBSTACLE ON A ROUTE

A computer-implemented method for detecting an obstacle on a route ahead of a first vehicle. In the method, information on a second vehicle driving ahead on the route is recorded in the first vehicle by at least one sensor of the first vehicle. In the first vehicle, depending on the recorded information, a computer detects an avoidance maneuver of the second vehicle due to an obstacle, or detects that the second vehicle has driven over an obstacle. An obstacle is detected on the route depending on the detected avoidance maneuver or the detection that the second vehicle has driven over an obstacle. A measure for protecting the first vehicle and/or the obstacle is initiated depending on the detected obstacle.
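The inference from the lead vehicle's behavior might be sketched like this. The inputs (per-frame lateral offsets and vertical motion of the lead vehicle) and the thresholds are our illustrative assumptions, not the patent's criteria.

```python
def obstacle_on_route(lateral_offsets, vertical_motion,
                      swerve_threshold: float = 1.0,
                      bounce_threshold: float = 0.1) -> bool:
    """Infer an obstacle from a lead vehicle's observed behavior.
    lateral_offsets: lead vehicle's lateral deviation per frame (meters).
    vertical_motion: lead vehicle's vertical displacement per frame (meters)."""
    # A large lateral excursion suggests an avoidance maneuver.
    swerved = max(abs(x) for x in lateral_offsets) > swerve_threshold
    # A pronounced vertical bounce suggests the vehicle drove over something.
    drove_over = max(abs(z) for z in vertical_motion) > bounce_threshold
    return swerved or drove_over
```

A positive result would trigger the protective measure, e.g., a warning or a lane change by the first vehicle.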

Driving assist device and driving assist method
11541888 · 2023-01-03

A driving assist device includes a first sensor, a second sensor, and a control device. The control device does not execute an inter-vehicle distance control under a predetermined first condition: upon determination that at least one preceding object is detected based on the output of one of the first and second sensors without being detected based on the output of the other, and that the environment of the non-detection sensor (the other of the first and second sensors) satisfies a first requirement for determining the reliability of its output. The control device executes the inter-vehicle distance control under a predetermined second condition: upon determination that the environment of the non-detection sensor satisfies a second requirement for determining the reliability of its output.
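The gating logic reduces to a small boolean function. A minimal sketch with assumed inputs; the default behavior when neither condition applies is our assumption, since the abstract only specifies the two conditions:

```python
def should_execute_acc(detected_by_first: bool, detected_by_second: bool,
                       env_meets_first_req: bool,
                       env_meets_second_req: bool) -> bool:
    """Decide whether to run the inter-vehicle distance control.
    Suppresses control when only one sensor detects the preceding object
    and the non-detecting sensor's environment meets the first reliability
    requirement; the second requirement re-enables control."""
    one_sensor_only = detected_by_first != detected_by_second
    if one_sensor_only and env_meets_first_req:
        return False  # first condition: withhold the control
    if env_meets_second_req:
        return True   # second condition: execute the control
    return True       # default when neither condition applies (assumed)
```

Splitting the decision into two reliability requirements lets the device distinguish "sensor disagrees because it is degraded" from "sensor disagrees but is trustworthy."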