G01S13/865

Sensor fusion for precipitation detection and control of vehicles

An apparatus includes a processor configured to be disposed with a vehicle and a memory coupled to the processor. The memory stores instructions to cause the processor to receive at least two of: radar data, camera data, lidar data, or sonar data. The sensor data is associated with a predefined region of a vicinity of the vehicle while the vehicle is traveling during a first time period. At least a portion of the vehicle is positioned within the predefined region during the first time period. The instructions also cause the processor to detect that no other vehicle is present within the predefined region. An environment of the vehicle during the first time period is classified as one state from a set of states that includes at least one of dry, light rain, heavy rain, light snow, or heavy snow, based on at least two of the sensor data types, to produce an environment classification. An operational parameter of the vehicle is then modified based on the environment classification.
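
The classification-then-adaptation flow described above can be sketched roughly as follows. The feature names (radar clutter power, camera contrast), the thresholds, and the speed-scaling factors are illustrative assumptions, not the patented method:

```python
# Hypothetical sketch of multi-sensor environment classification and the
# resulting change to an operational parameter. Features and thresholds
# are made up for illustration.

STATES = ["dry", "light_rain", "heavy_rain", "light_snow", "heavy_snow"]

def classify_environment(radar_clutter: float, camera_contrast: float,
                         temperature_c: float) -> str:
    """Classify the driving environment from two sensor-derived features.

    radar_clutter: normalized clutter power in the predefined region (0..1)
    camera_contrast: normalized image contrast in the same region (0..1)
    temperature_c: used only to separate rain from snow
    """
    precip = radar_clutter > 0.2 or camera_contrast < 0.6   # either sensor votes
    if not precip:
        return "dry"
    heavy = radar_clutter > 0.5 and camera_contrast < 0.3   # both must agree
    kind = "rain" if temperature_c > 2.0 else "snow"
    return ("heavy_" if heavy else "light_") + kind

def adjust_speed_limit(base_kph: float, state: str) -> float:
    """Modify an operational parameter (max speed) from the classification."""
    factors = {"dry": 1.0, "light_rain": 0.9, "heavy_rain": 0.7,
               "light_snow": 0.8, "heavy_snow": 0.5}
    return base_kph * factors[state]
```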

Switching between object detection and data transfer with a vehicle radar

In one embodiment, a method includes determining an operational status of a vehicle including a radar antenna. The operational status is related to autonomous-driving operations of the vehicle in an environment. The method includes determining an expected amount of signaling resources associated with the radar antenna to be utilized by the vehicle while the vehicle performs the autonomous-driving operations, based at least on the operational status of the vehicle and the environment. The method includes determining to switch one or more of the signaling resources associated with the radar antenna from a first mode to a second mode based on the expected amount of signaling resources to be utilized by the radar antenna while the vehicle performs the autonomous-driving operations. The method includes causing the one or more of the signaling resources associated with the radar antenna to switch from the first mode to the second mode.
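
The mode-switching logic above can be illustrated with a small sketch. The operational statuses, the demand/weather factors, and the two mode names are assumptions chosen for the example:

```python
# Illustrative sketch of switching radar signaling resources between an
# object-detection mode and a data-transfer mode, based on an estimate of
# how much detection capacity the current driving context needs.

def expected_detection_channels(operational_status: str, environment: str,
                                total_channels: int) -> int:
    """Estimate how many radar channels autonomous driving will need."""
    demand = {"parked": 0.0, "highway_cruise": 0.5, "urban_driving": 1.0}
    scale = 1.0 if environment == "clear" else 1.2   # degraded weather needs more
    need = round(total_channels * demand[operational_status] * scale)
    return min(total_channels, need)

def assign_modes(operational_status: str, environment: str,
                 total_channels: int) -> list[str]:
    """Channels not needed for detection are switched to data transfer."""
    n_detect = expected_detection_channels(operational_status, environment,
                                           total_channels)
    return (["detection"] * n_detect
            + ["data_transfer"] * (total_channels - n_detect))
```

A parked vehicle frees all channels for data transfer, while dense urban driving claims them all for detection.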

PREDICTIVE SENSOR ARRAY CONFIGURATION SYSTEM FOR AN AUTONOMOUS VEHICLE
20180009441 · 2018-01-11

An autonomous vehicle (AV) can include a set of sensors generating sensor data corresponding to a surrounding environment of the AV. The AV can further include a control system that determines imminent lighting conditions for one or more cameras of the set of sensors, and executes a set of configurations for the one or more cameras to preemptively compensate for the imminent lighting conditions.

Estimating three-dimensional target heading using a single snapshot
11709250 · 2023-07-25

Provided herein is a system and method to determine a three-dimensional heading of a target. The system includes a radar sensor that obtains a three-dimensional snapshot of radar data comprising Doppler velocities and spatial positions of a plurality of detection points of a target, one or more processors, and a memory storing instructions that, when executed by the one or more processors, causes the system to perform conducting a first estimation of a three-dimensional heading of the target based on the spatial positions; conducting a second estimation of the three-dimensional heading of the target based on the Doppler velocities; and obtaining a combined estimation of the three-dimensional heading of the target based on a weighted sum of the first estimation and the second estimation.
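
The two-estimate combination can be sketched under simplifying assumptions: the detection points are spread along the target's heading (first estimate), and each point's Doppler velocity is the projection of one common target velocity onto the sensor's line of sight (second estimate). The weighting and the sign-disambiguation step are illustrative choices:

```python
import numpy as np

def heading_from_positions(points: np.ndarray) -> np.ndarray:
    """First estimation: principal axis of the 3-D spread of detection points."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    h = vt[0]
    return h / np.linalg.norm(h)

def heading_from_doppler(points: np.ndarray, doppler: np.ndarray) -> np.ndarray:
    """Second estimation: solve doppler_i = los_i . u for a common velocity u."""
    los = points / np.linalg.norm(points, axis=1, keepdims=True)
    u, *_ = np.linalg.lstsq(los, doppler, rcond=None)
    return u / np.linalg.norm(u)

def combined_heading(points: np.ndarray, doppler: np.ndarray,
                     w1: float = 0.5, w2: float = 0.5) -> np.ndarray:
    """Weighted sum of the two estimations, renormalized to a unit heading."""
    h1 = heading_from_positions(points)
    h2 = heading_from_doppler(points, doppler)
    if np.dot(h1, h2) < 0:        # the axis fit has a sign ambiguity
        h1 = -h1
    h = w1 * h1 + w2 * h2
    return h / np.linalg.norm(h)
```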

Data driven resolution function derivation

Techniques for determining a probability of a false negative associated with a location of an environment are discussed herein. Data from a sensor, such as a radar sensor, can be received that includes point cloud data, which includes first and second data points. The first data point has a first attribute and the second data point has a second attribute. A difference between the first and second attributes is determined such that a frequency distribution may be determined. The frequency distribution may then be used to determine a distribution function, which allows for the determination of a resolution function that is associated with the sensor. The resolution function may then be used to determine a probability of a false negative at a location in an environment. The probability can be used to control a vehicle in a safe and reliable manner.
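
As a rough illustration of the pipeline from attribute differences to a false-negative probability: build the frequency distribution of pairwise differences, turn it into an empirical distribution function, and read the resolution behavior off that function. The choice to model P(false negative) as one minus the distribution function at a given separation is an assumption for the sketch:

```python
import numpy as np

def empirical_cdf(differences: np.ndarray):
    """Frequency distribution of attribute differences -> distribution function."""
    diffs = np.sort(np.abs(differences))
    def cdf(x: float) -> float:
        # Fraction of observed differences no larger than x.
        return float(np.searchsorted(diffs, x, side="right")) / len(diffs)
    return cdf

def false_negative_probability(cdf, separation: float) -> float:
    """Resolution function: two returns separated by less than the sensor
    ever resolves in the data are likely merged into one (a false negative)."""
    return 1.0 - cdf(separation)
```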

Autonomy first route optimization for autonomous vehicles

Embodiments herein can determine an optimal route for an autonomous electric vehicle. The system may score viable routes between the start and end locations of a trip using a numeric or other scale that denotes how viable the route is for autonomy. The score is adjusted using a variety of factors where a learning process leverages both offline and online data. The scored routes are not based simply on the shortest distance between the start and end points but determine the best route based on the driving context for the vehicle and the user.
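
A minimal sketch of such autonomy-first scoring, with hypothetical factors and weights (the actual factors and learned adjustments are not specified in the abstract):

```python
# Hypothetical route scoring on a 0-100 autonomy-viability scale.
# Factor names and weights are illustrative assumptions.

def autonomy_score(route: dict) -> float:
    """Score a viable route for autonomous drivability, not distance."""
    score = 100.0
    score -= 2.0 * route["unprotected_left_turns"]
    score -= 5.0 * route["km_without_hd_map"]
    score -= 1.0 * route["construction_zones"]
    score += 0.5 * route["km_on_highway"]        # highways favor autonomy
    return max(0.0, min(100.0, score))

def best_route(routes: list[dict]) -> dict:
    """Pick the most autonomy-viable route among the scored candidates."""
    return max(routes, key=autonomy_score)
```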

SENSOR FOR DEGRADED VISUAL ENVIRONMENT
20230006348 · 2023-01-05

A sensing system. In some embodiments, the system includes a first imaging radio frequency receiver, a second imaging radio frequency receiver, a first optical beam combiner, a first imaging optical receiver, a second optical beam combiner, and an optical detector array. The first optical beam combiner may be configured to combine optical signals of the imaging radio frequency receivers. The second optical beam combiner may be configured to combine the optical signals of the imaging radio frequency receivers, and the optical signal of the first imaging optical receiver.

ACTIVE ALIGNMENT OF AN OPTICAL ASSEMBLY WITH INTRINSIC CALIBRATION
20230237701 · 2023-07-27

Provided are methods for active alignment of an optical assembly with intrinsic calibration. Some methods described include performing a first active alignment using a multi-collimator assembly, determining a principal point of the camera assembly using a diffractive optical element (DOE) intrinsic calibration module, and adjusting the relative position of one or more of the lens and the image sensor to align the principal point of the camera assembly with an image center of the image sensor and to perform a second active alignment. Systems and computer program products are also provided.

AUTOMOTIVE SENSOR INTEGRATION MODULE
20230004764 · 2023-01-05

An automotive sensor integration module includes a plurality of sensors that differ in at least one of a sensing period or an output data format, and a signal processing unit. The signal processing unit simultaneously outputs, as sensing data, pieces of detection data respectively output from the plurality of sensors on the basis of the sensing period of any one of the plurality of sensors. The signal processing unit also determines, on the basis of the pieces of detection data, whether each region of an outer cover corresponding to a location of each of the plurality of sensors is contaminated, and outputs a determination result as contamination data.
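
The module's two outputs can be sketched simply: frames of detection data synchronized to one sensor's period, plus per-region contamination flags. The contamination heuristic (a persistently low detection level) is an assumption for the example:

```python
# Illustrative sketch: sensors with different periods are resampled onto one
# master sensor's sensing period, and each sensor's cover region is flagged
# as contaminated if its detection level stays low across all frames.

def synchronized_frames(sensor_logs: dict[str, list[tuple[float, float]]],
                        master: str) -> list[dict[str, float]]:
    """At each (timestamp, value) of the master sensor, emit the latest value
    from every sensor simultaneously as one sensing-data frame."""
    frames = []
    for t, _ in sensor_logs[master]:
        frame = {}
        for name, samples in sensor_logs.items():
            latest = [v for ts, v in samples if ts <= t]
            frame[name] = latest[-1] if latest else float("nan")
        frames.append(frame)
    return frames

def contamination(frames: list[dict[str, float]],
                  threshold: float = 0.2) -> dict[str, bool]:
    """Flag the outer-cover region in front of a sensor as contaminated when
    its detection level is below the threshold in every frame."""
    names = frames[0].keys()
    return {n: all(f[n] < threshold for f in frames) for n in names}
```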

SENSOR FUSION

A plurality of images can be acquired from a plurality of sensors and a plurality of flattened patches can be extracted from the plurality of images. An image location in the plurality of images and a sensor type token identifying a type of sensor used to acquire an image in the plurality of images from which the respective flattened patch was acquired can be added to each of the plurality of flattened patches. The flattened patches can be concatenated into a flat tensor and add a task token indicating a processing task to the flat tensor, wherein the flat tensor is a one-dimensional array that includes two or more types of data. The flat tensor can be input to a first deep neural network that includes a plurality of encoder layers and a plurality of decoder layers and outputs transformer output. The transformer output can be input to a second deep neural network that determines an object prediction indicated by the token and the object predictions can be output.