G01S7/483

Non-mechanical beam steering for depth sensing

A depth camera assembly (DCA) for depth sensing of a local area. The DCA includes a transmitter, a receiver, and a controller. The transmitter illuminates a local area with outgoing light in accordance with emission instructions. The transmitter includes a fine steering element and a coarse steering element. The fine steering element deflects one or more optical beams at a first deflection angle to generate one or more first order deflected scanning beams. The coarse steering element deflects the one or more first order deflected scanning beams at a second deflection angle to generate the outgoing light projected into the local area. The receiver captures one or more images of the local area including portions of the outgoing light reflected from the local area. The controller determines depth information for one or more objects in the local area based in part on the captured one or more images.
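The two-stage steering described above can be sketched as a simple angle model. This is an illustrative assumption, not the patent's optics: for small angles, the coarse element redirecting the already fine-steered beam is approximated by adding the two deflection angles, and the full scan repeats a dense fine sweep at each coarse position.

```python
def total_deflection_deg(fine_deg, coarse_deg):
    """Hypothetical small-angle model: the coarse element redirects the
    fine-steered beam, so the two deflection angles add."""
    return fine_deg + coarse_deg


def scan_pattern(fine_range_deg=2.0, fine_steps=5,
                 coarse_angles_deg=(-20.0, 0.0, 20.0)):
    """Enumerate outgoing beam angles: a dense fine scan repeated at each
    coarse steering position, covering a wide field of view."""
    step = 2 * fine_range_deg / (fine_steps - 1)
    fine = [-fine_range_deg + i * step for i in range(fine_steps)]
    return [total_deflection_deg(f, c) for c in coarse_angles_deg for f in fine]


angles = scan_pattern()  # 15 beam directions spanning -22 to +22 degrees
```

The parameter values (a ±2° fine sweep and three coarse positions) are arbitrary placeholders chosen only to show the wide/narrow division of labor between the two elements.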

RANGE DETECTOR DEVICE WITH ESTIMATED SPREAD VALUE AND RELATED METHODS
20170315217 · 2017-11-02 ·

A range detector device may include a pulsed light source configured to emit pulsed light to an object, a detector configured to receive reflected pulsed light from the object, and a processor cooperating with the pulsed light source and the detector. The processor may be configured to generate a measured range value to the object, and generate an estimated statistical value for a spread of possible range values based upon a characteristic of the pulsed light source.
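One way to picture an "estimated statistical value for a spread of possible range values based upon a characteristic of the pulsed light source" is the standard rule of thumb relating range precision to pulse width and signal-to-noise ratio. This formula is an assumption for illustration, not the patent's disclosed computation:

```python
import math

C = 299_792_458.0  # speed of light, m/s


def range_spread_estimate(pulse_width_s, snr):
    """Estimate a 1-sigma spread of possible range values from a
    characteristic of the pulsed source. Uses the common approximation
    sigma_R ~ c * tau / (2 * sqrt(SNR)) -- an assumed model, not the
    patent's formula."""
    return C * pulse_width_s / (2.0 * math.sqrt(snr))


# a 5 ns pulse at SNR 100 gives a spread of roughly 7.5 cm
sigma = range_spread_estimate(5e-9, 100.0)
```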

Modular LADAR sensor

A mobile ladar platform having a ladar sensor, a positioning system, and a digital processor adapted to perform analysis of a scene in a field of view. The ladar sensor includes a ladar transmitter, a zero range reference circuit, a two-dimensional array of light sensitive detectors positioned at a focal plane of a light collecting and focusing system, and a readout integrated circuit with a plurality of unit cell electrical circuits. The ladar sensor also includes a detector bias circuit and a communications port.

RANGE ESTIMATION FOR LIDAR SYSTEMS

Embodiments of the disclosure provide an optical sensing system, a range estimation system for the optical sensing system, and a method for the optical sensing system. The exemplary optical sensing system includes a transmitter configured to emit a laser pulse towards an object. The optical sensing system further includes a range estimation system configured to estimate a range between the object and the optical sensing system. The range estimation system includes an analog to digital converter (ADC) configured to generate a plurality of pulse samples based on the laser pulse returned from the object. The returned laser pulse has a substantially triangular waveform including a rising edge and a falling edge. The range estimation system further includes a processor. The processor is configured to generate synthesized pulse samples on the substantially triangular waveform based on the pulse samples. The processor is further configured to determine an arrival time of the returned laser pulse based on the ADC generated pulse samples and the synthesized pulse samples. The processor is also configured to estimate a range between the object and the optical sensing system based on the arrival time of the returned laser pulse.
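The edge-based arrival-time idea above can be sketched as follows. This is a minimal illustration of interpolating on a triangular waveform, not the patent's exact algorithm: fit straight lines to samples on the rising and falling edges and take their intersection as the pulse apex. Samples immediately adjacent to the apex are skipped, because the segment straddling the apex has a blended slope.

```python
def arrival_time_triangular(times, samples):
    """Estimate the arrival (apex) time of a roughly triangular return
    pulse by intersecting line fits to its rising and falling edges."""
    peak = max(range(len(samples)), key=lambda i: samples[i])

    def edge_line(i0, i1):
        m = (samples[i1] - samples[i0]) / (times[i1] - times[i0])
        return m, samples[i0] - m * times[i0]  # slope, intercept

    m_r, b_r = edge_line(peak - 2, peak - 1)  # rising edge
    m_f, b_f = edge_line(peak + 1, peak + 2)  # falling edge
    return (b_f - b_r) / (m_r - m_f)          # line intersection = apex


# ideal triangle with apex at t = 2.5, coarsely sampled at integer times
t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
s = [2.5 - abs(ti - 2.5) for ti in t]
t_hat = arrival_time_triangular(t, s)  # recovers 2.5 despite the coarse sampling
```

The sub-sample result is the point of the technique: the apex falls between ADC samples, and the synthesized (interpolated) points on the two edges recover it.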

Adjusting vehicle sensor field of view volume
11671564 · 2023-06-06 ·

An example method includes receiving, from one or more sensors associated with an autonomous vehicle, sensor data associated with a target object in the vehicle's environment during a first environmental condition, where at least one of the sensors is configurable to operate with one of a plurality of operating field of view volumes. The method also includes determining, based on the sensor data, at least one parameter associated with the target object. The method also includes determining a degradation in the at least one parameter between the sensor data and past sensor data, where the past sensor data is associated with the target object in the environment during a second environmental condition different from the first, and, based on the degradation, adjusting the operating field of view volume of the at least one sensor to a different one of the operating field of view volumes.
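The degradation-driven adjustment can be sketched in a few lines. The function name, the choice of detection confidence as the tracked parameter, and the fractional-threshold policy are all illustrative assumptions; the patent claims only that a degradation between current and past sensor data triggers a switch to a different operating field-of-view volume.

```python
def adjust_fov(current_index, param_now, param_past, fov_volumes, threshold=0.2):
    """If a target-object parameter (e.g. detection confidence) has
    degraded by more than `threshold` relative to past sensor data,
    step the sensor to the next operating field-of-view volume."""
    degradation = (param_past - param_now) / param_past
    if degradation > threshold and current_index + 1 < len(fov_volumes):
        return current_index + 1  # switch to a different volume
    return current_index


# three configurable volumes, widest first
volumes = ["wide", "medium", "narrow"]
idx = adjust_fov(0, param_now=0.6, param_past=0.9, fov_volumes=volumes)
unchanged = adjust_fov(0, param_now=0.85, param_past=0.9, fov_volumes=volumes)
```

Here a 33% drop in the parameter exceeds the threshold and moves the sensor to the "medium" volume, while a 6% drop leaves it unchanged.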

DISTANCE DETECTION METHOD AND DISTANCE DETECTION DEVICE USING THE SAME
20170248697 · 2017-08-31 ·

A distance detection method and a distance detection device are provided. The distance detection method comprises: providing a directional signal emitting module; providing a directional signal receiving module having a constant bandwidth; providing a distance detection signal to the directional signal emitting module; changing a frequency of the distance detection signal provided to the directional signal emitting module, and judging whether the directional signal receiving module can decode a reflected directional signal into a received signal; and judging a distance between an external object and the directional signal receiving module according to whether the received signal corresponding to the frequency of the distance detection signal can be decoded.
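The sweep-and-judge structure of the method can be sketched generically. Both `can_decode` and `freq_to_distance` stand in for the receiver's decode check and the device's calibration curve, and the inverse frequency-to-distance mapping in the toy model is purely an assumption; the abstract does not disclose the actual mapping.

```python
def estimate_distance(frequencies, can_decode, freq_to_distance):
    """Step through distance-detection signal frequencies, judge at each
    one whether the fixed-bandwidth receiver can decode the reflection,
    and convert the highest decodable frequency to a distance."""
    decodable = [f for f in frequencies if can_decode(f)]
    if not decodable:
        return None  # nothing decoded at any frequency
    return freq_to_distance(max(decodable))


# toy model: reflections above 40 kHz cannot be decoded, and the
# calibration maps the cutoff frequency inversely to distance
freqs = [10e3, 20e3, 30e3, 40e3, 50e3]
dist = estimate_distance(freqs, lambda f: f <= 40e3, lambda f: 1e6 / f)
```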

High Dynamic Range Imaging of Environment with a High Intensity Reflecting/Transmitting Source

An active-gated imaging system and method for imaging an environment with at least one high-intensity source. A light source emits light pulses toward the environment, and an image sensor with a pixelated sensor array receives reflected pulses from a selected depth of field and generates a main image. The image sensor exposure mechanism includes a pixelated transfer gate synchronized with the emitted pulses. An image processor identifies oversaturated image portions of the main image resulting from a respective high-intensity source, and interprets the oversaturated image portions using supplementary image information acquired by the image sensor. The supplementary information may be obtained from a low-illumination secondary image having substantially fewer gating cycles than the main image; from reflected pulses accumulated from the high-intensity source after they undergo internal reflections between optical elements of the camera; or from a low-illumination secondary image acquired by residual photon accumulation during a non-exposure state of the image sensor.
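The core fusion step can be pictured per pixel: wherever the main gated image is saturated, substitute the low-illumination secondary image scaled up by the ratio of gating cycles. This minimal model, including the single `gain` factor, is an assumption for illustration rather than the patent's processing chain.

```python
def fuse_hdr(main, secondary, gain, saturation=255):
    """Replace saturated pixels of the main gated image with the
    secondary low-illumination image scaled by `gain` (the assumed
    gating-cycle ratio), yielding a high-dynamic-range result."""
    return [[secondary[r][c] * gain if main[r][c] >= saturation else main[r][c]
             for c in range(len(main[0]))]
            for r in range(len(main))]


# 2x2 toy images: two pixels of `main` are clipped at 255
main = [[10, 255], [255, 40]]
secondary = [[1, 30], [50, 4]]
fused = fuse_hdr(main, secondary, gain=10)  # saturated pixels recovered as 300 and 500
```

The recovered values exceed the 8-bit saturation limit, which is exactly the point: the secondary image carries the intensity information the main exposure clipped.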

MEMS PACKAGE WITH SHOCK AND VIBRATION PROTECTION
20220033253 · 2022-02-03 ·

An optical micro-electromechanical system (MEMS) system is disclosed. The optical MEMS system includes a printed circuit board (PCB), and a MEMS optical integrated circuit (IC) package mounted to the PCB. The IC package includes a MEMS optical die, and a plurality of leads electrically and mechanically connected to the MEMS optical die and to the PCB. The optical MEMS system also includes one or more elastomeric grommets contacting one or more of the leads, where the grommets are configured to absorb mechanical vibration energy from the contacted leads.