G01S7/497

Random hardware fault and degradation protection apparatus for time-of-flight receiver

A time-of-flight light detection system includes: a plurality of circuits arranged sequentially along a signal path that comprises a plurality of signal channels, the plurality of circuits including a first circuit and a second circuit arranged downstream from the first circuit; a reference signal source configured to generate a plurality of reference signals, where each of the plurality of signal channels at the first circuit receives at least one of the plurality of reference signals; and an evaluation circuit coupled to the plurality of signal channels to receive a processed reference signal from the signal path, the evaluation circuit further configured to compare the processed reference signal to a first expected result to generate a first comparison result.
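The evaluation step described in this abstract — comparing a processed reference signal against an expected result to detect a random hardware fault — can be sketched as follows. All function names, the tolerance value, and the per-channel structure are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch: a known reference signal is injected into each signal
# channel, and the processed output is compared to an expected result
# to flag a deviating (possibly faulty or degraded) channel.

def evaluate_channel(processed_reference, expected_result, tolerance=0.05):
    """Return True if the channel passes (deviation within tolerance)."""
    deviation = abs(processed_reference - expected_result)
    return deviation <= tolerance

def evaluate_signal_path(processed_references, expected_results, tolerance=0.05):
    """Compare each channel's processed reference against its expected result."""
    return [
        evaluate_channel(p, e, tolerance)
        for p, e in zip(processed_references, expected_results)
    ]

# Example: channel 2 drifts beyond tolerance and is flagged as failing.
results = evaluate_signal_path([1.00, 1.02, 1.30], [1.0, 1.0, 1.0])
```

A real receiver would run this comparison per circuit stage along the signal path, so a fault can be localized between the first and second circuits.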

Vehicle sensor assembly

A sensor assembly includes a first sensor including a first cylindrical sensor window defining an axis; an annular member substantially centered around the axis, fixed relative to the first sensor, and supporting the first sensor; a second sensor fixed relative to the annular member and suspended from the annular member, the second sensor including a second cylindrical sensor window defining the axis; a first tubular ring fixed relative to the annular member and substantially centered around the axis, the first tubular ring including a plurality of first nozzles aimed at the first cylindrical sensor window; a second tubular ring fixed relative to the annular member and substantially centered around the axis, the second tubular ring including a plurality of second nozzles aimed at the second cylindrical sensor window; and two legs extending downward from the annular member and supporting the annular member.

Methods for controlling an apparatus adapted to clean a sensor assembly
11554755 · 2023-01-17

A method for controlling a sensor assembly cleaning apparatus includes receiving sensor data from various vehicle sensors, determining a level of obscurement of a transparent surface of the sensor assembly, and determining whether the level of obscurement exceeds a threshold level. If the transparent surface is obscured beyond the threshold level, a control signal may be sent to the apparatus to initiate the ejection of pressurized air onto the transparent surface. Optionally, the method may further evaluate other parameters, such as the vehicle velocity in relation to a threshold vehicle velocity, prior to sending the control signal, to ensure that the cleaning operation using pressurized air would not be superfluous in light of the vehicle velocity. In addition, a method for selectively activating the sensor assembly cleaning apparatus includes determining an activation schedule for the apparatus based on an arrangement of transparent surfaces and controlling the apparatus to operate based on the activation schedule.
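The decision logic described above can be sketched as a small conditional: trigger the pressurized-air cleaner only when obscurement exceeds a threshold and the vehicle is slow enough that cleaning would not be superfluous. The threshold values and names are assumptions for illustration.

```python
# Illustrative sketch of the control decision, not the patent's implementation.

OBSCUREMENT_THRESHOLD = 0.3   # assumed fraction of the surface obscured
VELOCITY_THRESHOLD = 20.0     # assumed m/s; above this, airflow may suffice

def should_clean(obscurement_level, vehicle_velocity):
    """Decide whether to send the control signal to the cleaning apparatus."""
    if obscurement_level <= OBSCUREMENT_THRESHOLD:
        return False  # surface is clear enough; no cleaning needed
    if vehicle_velocity >= VELOCITY_THRESHOLD:
        return False  # cleaning would be superfluous at this speed
    return True       # obscured and slow: eject pressurized air
```

For example, a heavily obscured window on a slow-moving vehicle triggers cleaning, while the same obscurement at highway speed does not.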

ADAPTIVE MOTION COMPENSATION OF PERCEPTION CHANNELS
20230009736 · 2023-01-12

A method may include obtaining sensor data describing a total measurable world around a motion sensor. The method may include processing the sensor data to generate a pre-compensation scan of the total measurable world around the motion sensor based on the sensor data. The method may include determining a delay between the obtaining the sensor data and the generation of the pre-compensation scan. The method may include obtaining motion data corresponding to motion of the motion sensor and generating a motion model of the motion sensor based on the motion data. The method may include generating an after-compensation scan of the motion sensor using the delay and the motion model to compensate for continued motion during the delay.


TIMING COMPENSATION DEVICE FOR OPTICAL OUTPUT SIGNAL OF LIDAR AND METHOD THEREOF

The present invention relates to a timing compensation device for the optical output signal of a LiDAR, and a method thereof. The device includes an encoder that detects the rotation period of a motor provided in a scanner; a LiDAR controller that detects a jitter time from the rotation period detected by the encoder, creates a histogram to obtain a mode of the jitter time, and performs optical output control either at a time point of the rotation period of the motor or at a time point at which the rotation period is compensated by the mode of the jitter time; and a light transmitter that outputs laser light to the scanner according to the optical output control of the LiDAR controller.
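The histogram-and-mode step above can be sketched as follows: measured jitter times are binned, their mode is taken, and the optical output trigger is advanced by that mode within the rotation period. Bin size, units, and names are assumptions for illustration.

```python
# Hedged sketch of modal-jitter compensation for the laser trigger time.

from collections import Counter

def jitter_mode(jitter_times_us, bin_us=1):
    """Return the mode of the jitter-time histogram (microseconds)."""
    bins = Counter(round(t / bin_us) * bin_us for t in jitter_times_us)
    return bins.most_common(1)[0][0]

def compensated_trigger_time(rotation_period_us, jitter_times_us):
    """Trigger the optical output at the rotation period minus the modal jitter."""
    return rotation_period_us - jitter_mode(jitter_times_us)

# Example: a modal jitter of 3 µs shifts a 10 000 µs trigger to 9 997 µs.
t = compensated_trigger_time(10_000, [2, 3, 3, 4, 3, 2])
```

Using the mode rather than the mean makes the compensation robust to occasional jitter outliers, which matches the abstract's emphasis on the histogram's mode.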

Optical acquisition device for a motor vehicle, wherein the operation of a light source unit is carried out in dependence on a functional state of the housing, method, and motor vehicle

The invention relates to an optical acquisition device (3) for a motor vehicle (1), having a housing (8) in which a light source unit (10) of the optical acquisition device (3) is arranged, wherein light beams (6) can be emitted by means of the light source unit (10) through a housing part (9) of the housing (8) into the surroundings (4) of the motor vehicle (1). The optical acquisition device (3) comprises a checking unit (16) by means of which a functional state of the housing (8) can be checked; if an actual functional state of the housing (8) deviating from a reference functional state of the housing (8) is detected, a control signal can be generated. The invention furthermore relates to a motor vehicle (1) and a method.
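The checking unit's behavior reduces to a comparison of the actual functional state against a reference state, with a control signal generated on deviation. The state labels and the specific response are illustrative assumptions, since the abstract only says a control signal "can be generated".

```python
# Minimal sketch of the checking unit: compare actual vs. reference housing
# state and emit a control signal on deviation (names are hypothetical).

def check_housing(actual_state, reference_state="intact"):
    """Return a control signal when the housing deviates from its reference state."""
    if actual_state != reference_state:
        # Operate the light source unit in dependence on the housing's state,
        # e.g. by reducing or disabling the optical output.
        return "reduce_light_output"
    return None
```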

Sensor alignment
11592539 · 2023-02-28

Described herein are systems, methods, and non-transitory computer readable media for performing an alignment between a first vehicle sensor and a second vehicle sensor. Two-dimensional (2D) data indicative of a scene within an environment being traversed by a vehicle is captured by the first vehicle sensor such as a camera or a collection of multiple cameras within a sensor assembly. A three-dimensional (3D) representation of the scene is constructed using the 2D data. 3D point cloud data also indicative of the scene is captured by the second vehicle sensor, which may be a LiDAR. A 3D point cloud representation of the scene is constructed based on the 3D point cloud data. A rigid transformation is determined between the 3D representation of the scene and the 3D point cloud representation of the scene and the alignment between the sensors is performed based at least in part on the determined rigid transformation.
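The rigid-transformation step above, given corresponding points from the camera-derived 3D reconstruction and the LiDAR point cloud, can be sketched with the standard Kabsch/SVD solution. The patent abstract does not name this particular algorithm; it is one common way to determine a least-squares rigid transform between two point sets.

```python
# Hedged sketch: Kabsch/SVD estimate of the rigid transform (R, t) aligning
# a camera-derived 3D point set to a LiDAR point set with known correspondences.

import numpy as np

def rigid_transform(src, dst):
    """Return (R, t) such that R @ src[i] + t ≈ dst[i] in least squares."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Example: recover a pure translation of (1, 2, 3) between the two scenes.
src = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
dst = [(x + 1, y + 2, z + 3) for x, y, z in src]
R, t = rigid_transform(src, dst)
```

In practice the correspondences would come from matching features of the 3D representation against the 3D point cloud representation; once (R, t) is known, it directly gives the extrinsic alignment between the camera and the LiDAR.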