G01S17/06

Infrared Beacon for Location Sharing

An electronic device may include an infrared light source and an infrared image sensor to enable infrared beacon functionality. In a location sharing scenario, a first electronic device may use the infrared light source to emit infrared light and serve as an infrared beacon. A second electronic device may use the infrared image sensor to detect the infrared beacon and identify the location of the first electronic device. The infrared image sensor that is used to detect the infrared beacon may also serve as a time-of-flight sensor for a light detection and ranging (LiDAR) module. The second electronic device (that detects the infrared beacon) may provide output such as visual, audio, and/or haptic output to inform a user of the location of the infrared beacon.
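The beacon-detection step described above can be sketched as locating a bright hotspot in an infrared frame. This is a minimal illustration, not the patented method: the `locate_beacon` helper, the 8-bit frame, and the fixed threshold are all assumptions for demonstration.

```python
import numpy as np

def locate_beacon(ir_frame, threshold=200):
    """Return the (row, col) centroid of pixels at or above threshold,
    or None if no beacon-like hotspot is present."""
    mask = ir_frame >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    # Intensity-weighted centroid of the hotspot
    w = ir_frame[rows, cols].astype(float)
    return (float(np.average(rows, weights=w)),
            float(np.average(cols, weights=w)))

# Synthetic 8-bit IR frame with a bright 3x3 beacon centered at (10, 20)
frame = np.zeros((32, 32), dtype=np.uint8)
frame[9:12, 19:22] = 255
print(locate_beacon(frame))  # → (10.0, 20.0)
```

The returned pixel location could then drive the visual, audio, or haptic output mentioned in the abstract.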

Multiple pulse, lidar based 3-D imaging

Methods and systems for performing multiple pulse LIDAR measurements are presented herein. In one aspect, each LIDAR measurement beam illuminates a location in a three-dimensional environment with a sequence of multiple pulses of illumination light. Light reflected from the location is detected by a photosensitive detector of the LIDAR system during a measurement window having a duration that is greater than or equal to the time of flight of light from the LIDAR system out to the programmed range of the LIDAR system, and back. The pulses in a measurement pulse sequence can vary in magnitude and duration. Furthermore, the delay between pulses and the number of pulses in each measurement pulse sequence can also be varied. In some embodiments, the multi-pulse illumination beam is encoded and the return measurement pulse sequence is decoded to distinguish the measurement pulse sequence from exogenous signals.
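The encode/decode idea in the last sentence can be sketched with cross-correlation: the receiver correlates the return waveform against the transmitted pulse code, so an encoded echo produces a correlation peak at the round-trip delay while an exogenous spike does not. The 8-chip code, amplitudes, and noise level below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def decode_return(received, code):
    """Cross-correlate a received waveform with the transmitted pulse
    code; the lag of the correlation peak estimates round-trip delay."""
    corr = np.correlate(received, code, mode="valid")
    return int(np.argmax(corr))

# Hypothetical 8-chip on/off pulse code and a 128-sample receive window
code = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
delay = 37                                      # true round-trip delay (samples)
rx = np.zeros(128)
rx[delay:delay + len(code)] += 0.4 * code       # attenuated encoded echo
rx[5] += 1.0                                    # bright exogenous spike
rx += 0.01 * np.random.default_rng(0).standard_normal(128)

print(decode_return(rx, code))  # → 37, despite the stronger spike
```

The spike, lacking the code's structure, never outscores the full 4-chip match of the genuine echo.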

Beam steering aware pixel clustering of segmented sensor area and implementing averaging algorithms for pixel processing

A scanning system includes a scanning structure configured to rotate about at least one first scanning axis; a driver configured to drive the scanning structure about the at least one first scanning axis and detect a position of the scanning structure with respect to the at least one first scanning axis during movement of the scanning structure; a segmented pixel sensor including a plurality of sub-pixel elements arranged in a pixel area; and a controller configured to selectively activate and deactivate the plurality of sub-pixel elements into at least one active cluster and at least one deactivated cluster to form at least one active pixel from the at least one active cluster, receive first position information from the driver indicating the detected position of the scanning structure, and dynamically change a clustering of activated sub-pixel elements and a clustering of deactivated sub-pixel elements based on the first position information.
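The controller's position-dependent clustering can be sketched as a mapping from the detected mirror angle to the sub-pixel columns that should be active. The column count, field of view, and cluster width below are illustrative assumptions, not parameters from the abstract.

```python
def active_cluster(angle_deg, n_columns=16, fov_deg=60.0, cluster_width=2):
    """Map a detected mirror angle to the sub-pixel columns to activate;
    all remaining columns form the deactivated cluster."""
    # Normalize angle in [-fov/2, +fov/2] to a column index
    frac = (angle_deg + fov_deg / 2) / fov_deg
    center = min(n_columns - 1, max(0, int(frac * n_columns)))
    half = cluster_width // 2
    active = set(range(max(0, center - half),
                       min(n_columns, center + half + 1)))
    return active, set(range(n_columns)) - active

# Mirror at mid-scan: the cluster tracks the beam to the sensor center
on, off = active_cluster(0.0)
print(sorted(on))  # → [7, 8, 9]
```

Re-evaluating this mapping as position feedback arrives from the driver gives the dynamic re-clustering the abstract describes.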

LIDAR WITH THERMAL PHASE SHIFTER
20230003858 · 2023-01-05

A light detection and ranging system can have an array of solid-state optical energy emitters coupled to a controller and at least one antenna. Each emitter may be coupled to a phase shifter that has a first waveguide and a second waveguide, with a heating element extending continuously between the respective waveguides.

SENSOR PERTURBATION

Perception sensors of a vehicle can be used for various operating functions of the vehicle. A computing device may receive sensor data from the perception sensors, and may calibrate the perception sensors using the sensor data, to enable effective operation of the vehicle. To calibrate the sensors, the computing device may project the sensor data into a voxel space, and determine a voxel score comprising an occupancy score and a residual value for each voxel. The computing device may then adjust an estimated position and/or orientation of the sensors, and associated sensor data, from at least one perception sensor to minimize the voxel score. The computing device may calibrate the sensors using the adjustments corresponding to the minimized voxel score. Additionally, the computing device may be configured to calculate an error in a position associated with the vehicle by calibrating data corresponding to the same point captured at different times.
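The score-then-perturb loop can be sketched as follows: project merged points into voxels, score each occupied voxel by how tightly its points cluster, and sweep a candidate extrinsic correction to find the score-minimizing value. The flat-plane scene, the single z-offset perturbation, and the specific residual formula are all simplifying assumptions for illustration.

```python
import numpy as np

def voxel_score(points, voxel_size=0.5):
    """Per-voxel residual: mean distance of points to the voxel centroid,
    summed over occupied voxels. Crisp, well-aligned surfaces score low."""
    keys = np.floor(points / voxel_size).astype(int)
    score = 0.0
    for key in {tuple(k) for k in keys}:
        in_voxel = points[(keys == key).all(axis=1)]
        score += np.linalg.norm(in_voxel - in_voxel.mean(axis=0),
                                axis=1).mean()
    return score

rng = np.random.default_rng(1)
def plane(n):
    """Points sampled from a flat ground plane (z = 0)."""
    return np.column_stack([rng.uniform(0, 5, n),
                            rng.uniform(0, 5, n),
                            np.zeros(n)])

sensor_a = plane(150)
sensor_b = plane(150) + [0.0, 0.0, 0.2]   # miscalibrated vertical offset

# Perturb sensor_b's estimated z offset; keep the score-minimizing value
candidates = np.linspace(-0.4, 0.0, 9)
best = min(candidates,
           key=lambda dz: voxel_score(
               np.vstack([sensor_a, sensor_b + [0.0, 0.0, dz]])))
print(best)  # recovers a correction of about -0.2
```

A misaligned sensor "thickens" the plane inside each voxel, inflating the residuals; the correction that collapses both clouds back onto one surface minimizes the score, which is the calibration principle the abstract describes.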

Indoor SLAM method based on 3D lidar and UWB

Disclosed is an indoor SLAM method based on three-dimensional (3D) lidar and ultra-wideband (UWB). First, a UWB positioning system is deployed in an indoor area; a robot carrying a 3D lidar sensor then explores the area; finally, a SLAM algorithm fusing the lidar data and UWB data generates a map of the explored area. The method includes the following steps: determining the relative pose transformation between the 3D laser SLAM coordinate system and the UWB positioning coordinate system; using UWB data to provide initial values for inter-frame matching of the laser odometry; using UWB data to add constraints to the SLAM back-end pose graph; and performing loop detection based on curvature feature coding. This approach removes the dependence on indoor lighting conditions, eliminates the accumulated errors of SLAM, and constructs a high-quality indoor map.
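The back-end constraint step can be sketched as a toy 1-D pose graph: relative odometry measurements chain consecutive poses, while UWB-derived absolute positions anchor each pose and suppress accumulated drift. The weights, drift bias, and UWB noise below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

true_x = np.arange(5, dtype=float)             # robot moves 1 m per step
odom = np.diff(true_x) + 0.05                  # relative odometry with bias
uwb = true_x + np.array([0.02, -0.03, 0.01, 0.0, -0.02])  # absolute fixes

n, w_odom, w_uwb = len(true_x), 1.0, 0.5
rows, rhs = [], []
for i, d in enumerate(odom):                   # constraint: x[i+1] - x[i] = d
    r = np.zeros(n); r[i + 1], r[i] = 1.0, -1.0
    rows.append(w_odom * r); rhs.append(w_odom * d)
for i, z in enumerate(uwb):                    # constraint: x[i] = z
    r = np.zeros(n); r[i] = 1.0
    rows.append(w_uwb * r); rhs.append(w_uwb * z)

# Weighted least-squares solve of the pose graph
x_hat, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
print(np.round(x_hat, 2))  # drift-corrected trajectory close to 0..4
```

Without the UWB rows, the 0.05 m/step odometry bias would accumulate unchecked; the absolute constraints bound the error at every pose, mirroring how the method eliminates SLAM drift.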

CAMERA SYSTEMS AND DEVICES FOR BALLISTIC PARAMETER MEASUREMENTS IN AN OUTDOOR ENVIRONMENT
20220413119 · 2022-12-29

A ballistic detection system includes a first camera; a second camera; a solar block device associated with at least one camera of the first and second cameras, wherein the solar block device is configured and arranged to block a solar disc in a field of view of the at least one camera; and a ballistics analysis computer configured to obtain image data captured by the first and second cameras, determine at least two points in three-dimensional space, which correspond to image artifacts of a projectile, using intrinsic and extrinsic parameters of the first and second cameras, define a trajectory of the projectile within a target volume using the at least two points in three-dimensional space, and find a point of intersection of the trajectory of the projectile with an object associated with the target volume.
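The final two steps — defining the trajectory from two triangulated points and finding its intersection with an object — can be sketched with a line-plane intersection, assuming the target object is modeled as a plane. The helper name and the sample coordinates are illustrative, not from the patent.

```python
import numpy as np

def trajectory_plane_intersection(p1, p2, plane_point, plane_normal):
    """Define the projectile trajectory as the line through two
    triangulated 3-D points and return where it pierces a target plane."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1                                   # direction of travel
    n = np.asarray(plane_normal, float)
    denom = d @ n
    if abs(denom) < 1e-12:
        return None                               # trajectory parallel to plane
    t = ((np.asarray(plane_point, float) - p1) @ n) / denom
    return p1 + t * d

# Two triangulated projectile positions; the target is the plane z = 0
hit = trajectory_plane_intersection([0, 0, 10], [1, 0, 8], [0, 0, 0], [0, 0, 1])
print(hit)  # → [5. 0. 0.]
```

In the full system the two input points would come from stereo triangulation of the projectile's image artifacts using the cameras' intrinsic and extrinsic parameters.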