G01S7/497

Extrinsic calibration of multiple vehicle sensors using combined target detectable by multiple vehicle sensors

Sensors coupled to a vehicle are calibrated, optionally using a dynamic scene with sensor targets around a motorized turntable that rotates the vehicle to different orientations. One vehicle sensor captures a representation of one feature of a sensor target, while another vehicle sensor captures a representation of a different feature of the sensor target, the two features of the sensor target having known relative positioning on the target. The vehicle generates a transformation that maps the captured representations of the two features to positions around the vehicle based on the known relative positioning of the two features on the target.
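
The transformation described above amounts to solving for a rigid transform from known point correspondences. A minimal sketch of one common way such a fit is computed (a Kabsch/SVD least-squares solver; the function name and setup are illustrative assumptions, not the patent's actual method):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. target-feature
    positions expressed in two different sensor frames.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)   # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Because the two features have known relative positioning on the target, one sensor's observations can be offset into predicted positions of the other feature, yielding the correspondences such a solver needs.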

Systems and methods to enhance early detection of performance induced risks for an autonomous driving vehicle
11554783 · 2023-01-17

Systems and methods of adjusting zone-associated risks of a coverage zone covered by one or more sensors of an autonomous driving vehicle (ADV) operating in real time are disclosed. As an example, the method includes defining a performance limit detection window associated with a first sensor based on a mean time between failure (MTBF) lower limit of the first sensor and an MTBF upper limit of the first sensor. The method further includes determining whether an operating time of the ADV operating in autonomous driving (AD) mode is within the performance limit detection window associated with the first sensor. The method further includes, in response to determining that the operating time of the ADV operating in AD mode is within the performance limit detection window of the first sensor, adjusting a zone-associated risk of the coverage zone to a performance risk of a second sensor.
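
The window check described above reduces to an interval test on operating time followed by a risk substitution. A hedged sketch (the function names and the scalar risk representation are assumptions for illustration, not the patent's implementation):

```python
def in_performance_window(operating_hours, mtbf_lower, mtbf_upper):
    """True if the ADV's AD-mode operating time falls inside the
    performance-limit detection window [MTBF lower, MTBF upper]."""
    return mtbf_lower <= operating_hours <= mtbf_upper

def adjust_zone_risk(zone_risk, operating_hours, sensor1_window, sensor2_risk):
    """If the first sensor may be nearing its performance limit, fall back
    to the second sensor's performance risk for the coverage zone."""
    lo, hi = sensor1_window
    if in_performance_window(operating_hours, lo, hi):
        return sensor2_risk
    return zone_risk
```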

Infrared Beacon for Location Sharing

An electronic device may include an infrared light source and an infrared image sensor to enable infrared beacon functionality. In a location sharing scenario, a first electronic device may use the infrared light source to emit infrared light and serve as an infrared beacon. A second electronic device may use the infrared image sensor to detect the infrared beacon and identify the location of the first electronic device. The infrared image sensor that is used to detect the infrared beacon may also serve as a time-of-flight sensor for a light detection and ranging (LiDAR) module. The second electronic device (that detects the infrared beacon) may provide output such as visual, audio, and/or haptic output to inform a user of the location of the infrared beacon.

Sensor calibration using dense depth maps
11555903 · 2023-01-17

This disclosure is directed to calibrating sensors mounted on an autonomous vehicle. A dense depth map can be generated in a two-dimensional camera space using point cloud data generated by one of the sensors. Image data from another of the sensors can be compared to the dense depth map in the two-dimensional camera space. Differences determined by the comparison can indicate alignment errors between the sensors. Calibration data associated with the errors can be determined and used to calibrate the sensors without the need for calibration infrastructure.
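
The core of the comparison is projecting point cloud data into the two-dimensional camera space and measuring per-pixel depth disagreement. A simplified sketch assuming a pinhole camera model (the function names, nearest-return splatting, and error metric are illustrative assumptions, not the patent's exact procedure):

```python
import numpy as np

def project_to_depth_map(points_cam, K, shape):
    """Splat 3-D points (already in the camera frame) into a 2-D depth map.

    K: 3x3 pinhole intrinsic matrix; shape: (rows, cols) of the image.
    """
    depth = np.full(shape, np.inf)
    z = points_cam[:, 2]
    valid = z > 0                                  # keep points in front of camera
    uv = (K @ points_cam[valid].T).T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    inside = (u >= 0) & (u < shape[1]) & (v >= 0) & (v < shape[0])
    for ui, vi, zi in zip(u[inside], v[inside], z[valid][inside]):
        depth[vi, ui] = min(depth[vi, ui], zi)     # keep nearest return per pixel
    return depth

def alignment_error(depth_a, depth_b):
    """Mean absolute depth difference over pixels both maps cover;
    larger values suggest miscalibration between the two sensors."""
    mask = np.isfinite(depth_a) & np.isfinite(depth_b)
    return np.abs(depth_a[mask] - depth_b[mask]).mean()
```

In the patent's scheme the sparse projection would be densified before comparison; this sketch keeps the sparse map for brevity.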

DATA ACQUISITION DEVICE, DATA CORRECTION METHOD AND APPARATUS, AND ELECTRONIC DEVICE
20230012240 · 2023-01-12

Embodiments of the present disclosure disclose a data acquisition device, a data correction method and apparatus, and an electronic device. The data acquisition device includes: a rotation component, a first ranging component, and an image acquisition component. The rotation component is configured to drive the data acquisition device to rotate in a first direction. The first ranging component is configured to rotate in the first direction along with the data acquisition device, to rotate in a second direction, and to measure first ranging data. The first direction is different from the second direction. The image acquisition component is configured to rotate in the first direction along with the data acquisition device, and to acquire image data in a three-dimensional scene.

IMAGING APPARATUS

The illumination device has a plurality of light-emitting pixels that are individually on/off controllable, and emits reference light having a random intensity distribution. The photodetector detects light reflected from an object. The processing device reconstructs an image of the object by calculating a correlation between a detection intensity b, based on an output of the photodetector, and the intensity distribution I of the reference light. The plurality of light-emitting pixels is divided into m (m≥2) areas, each containing n (n≥2) adjoining light-emitting pixels. By selecting one light-emitting pixel from each of the m areas without overlap, n light-emitting pixel groups are determined. The imaging apparatus carries out sensing for every light-emitting pixel group.
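
The correlation described here is the one used in computational ghost imaging, G = ⟨b·I⟩ − ⟨b⟩⟨I⟩, combined with an area-wise pixel grouping. A sketch assuming binary random patterns (the function names and data layout are illustrative, not the patent's implementation):

```python
import numpy as np

def reconstruct(patterns, intensities):
    """Correlation reconstruction G = <b·I> - <b><I>.

    patterns: (N, H, W) illumination intensity distributions I,
    intensities: (N,) detector readings b, one per pattern.
    """
    b = np.asarray(intensities, dtype=float)
    I = np.asarray(patterns, dtype=float)
    return (b[:, None, None] * I).mean(axis=0) - b.mean() * I.mean(axis=0)

def pixel_groups(m, n, rng):
    """Split m*n pixels into m areas of n adjoining pixels (area i = row i
    here), then pick one pixel per area without overlap to form n groups."""
    areas = np.arange(m * n).reshape(m, n)
    perm = np.stack([rng.permutation(n) for _ in range(m)])
    return [areas[np.arange(m), perm[:, g]] for g in range(n)]
```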

METHOD FOR CALIBRATING LIDAR AND POSITIONING DEVICE, DEVICE, AND STORAGE MEDIUM
20230008398 · 2023-01-12

A method for calibrating a Lidar and a positioning device is provided, together with a device and a storage medium. The method includes: acquiring a point-cloud data sequence of the Lidar and a pose data sequence of the positioning device, where the Lidar and the positioning device are mounted on the same traveling device; determining first trajectory information of the Lidar and second trajectory information of the positioning device according to the point-cloud data sequence and the pose data sequence; and determining a calibration offset between the Lidar and the positioning device according to the first trajectory information and the second trajectory information, where a matching degree between the first trajectory information and the second trajectory information satisfies a preset matching-degree condition under a trajectory-information correspondence determined based on the calibration offset.
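
Treating the calibration offset as a single rigid transform between the two time-synchronized position traces, the trajectory-matching step can be sketched as follows (the SVD alignment, mean-residual matching degree, and threshold are simplifying assumptions, not the patent's exact formulation):

```python
import numpy as np

def calibration_offset(lidar_traj, ins_traj, threshold=0.05):
    """Fit a rigid offset (R, t) aligning the Lidar trajectory to the
    positioning-device trajectory, and check the matching-degree condition.

    lidar_traj, ins_traj: (N, 3) time-synchronized position sequences.
    """
    lc = lidar_traj - lidar_traj.mean(axis=0)
    ic = ins_traj - ins_traj.mean(axis=0)
    U, _, Vt = np.linalg.svd(lc.T @ ic)            # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ins_traj.mean(axis=0) - R @ lidar_traj.mean(axis=0)
    residual = np.linalg.norm(lidar_traj @ R.T + t - ins_traj, axis=1)
    matching_degree = residual.mean()              # lower = better match
    return (R, t), matching_degree <= threshold
```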