G01S7/483

Electronic circuit and time-of-flight sensor comprising such an electronic circuit

An electronic circuit comprises at least one radiation-emitting element (2), a current regulator (12) with a current-producing terminal (O), and a measurement element (14) generating a signal (V_mes) that is representative of the current flowing therethrough. A switch (6) is controlled by a modulation signal (M) so as to open and close, successively, an electrical path passing through the current-producing terminal (O), the radiation-emitting element (2) and the measurement element (14). A conversion circuit (16) is further interposed between the measurement element (14) and the current regulator (12) so as to transform the representative signal (V_mes) into a smoothed signal (S) that is intended for a regulation terminal (Reg). A time-of-flight sensor comprising such an electronic circuit is also provided.
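The feedback path in this abstract can be sketched in a few lines: the switch chops the emitter current under the modulation signal M, the measurement element sees a pulsed signal V_mes, and the conversion circuit low-pass filters it into a smoothed signal S for the regulation terminal. The first-order IIR filter and all names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the described feedback path: a modulated (chopped)
# measurement signal V_mes is smoothed by a first-order low-pass stage into
# the regulation signal S.

def smooth_measurement(v_mes_samples, alpha=0.1):
    """First-order IIR low-pass: S[n] = (1 - alpha) * S[n-1] + alpha * V_mes[n]."""
    s = 0.0
    out = []
    for v in v_mes_samples:
        s = (1 - alpha) * s + alpha * v
        out.append(s)
    return out

# A 50% duty-cycle modulation: V_mes alternates between 1.0 (switch closed)
# and 0.0 (switch open); the smoothed signal settles near the 0.5 average.
v_mes = [1.0, 0.0] * 200
smoothed = smooth_measurement(v_mes)
print(round(smoothed[-1], 2))
```

The ripple around 0.5 shrinks as `alpha` decreases, at the cost of a slower response of S to changes in the average current.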

Adjusting Vehicle Sensor Field Of View Volume
20230262202 · 2023-08-17 ·

An example method includes receiving, from one or more sensors associated with an autonomous vehicle, sensor data associated with a target object in an environment of the vehicle during a first environmental condition, where at least one sensor of the sensor(s) is configurable to be associated with one of a plurality of operating field of view volumes. The method also includes, based on the sensor data, determining at least one parameter associated with the target object. The method further includes determining a degradation in the parameter(s) between the sensor data and past sensor data, where the past sensor data is associated with the target object in the environment during a second environmental condition different from the first, and, based on the degradation, adjusting the operating field of view volume of the at least one sensor to a different one of the operating field of view volumes.
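The claimed decision flow can be sketched as: compare a target-object parameter (for example, detection range) across two environmental conditions and, if it has degraded past some threshold, switch the sensor to a different operating field-of-view volume. The discrete volume names and the threshold below are illustrative assumptions.

```python
# Illustrative sketch of the degradation-driven FOV adjustment.

FOV_VOLUMES = ["wide", "medium", "narrow"]  # hypothetical discrete volumes

def adjust_fov(current_fov, param_now, param_past, degradation_threshold=0.2):
    """Return a narrower FOV volume when the parameter has degraded enough."""
    if param_past <= 0:
        return current_fov
    degradation = (param_past - param_now) / param_past
    idx = FOV_VOLUMES.index(current_fov)
    if degradation > degradation_threshold and idx < len(FOV_VOLUMES) - 1:
        return FOV_VOLUMES[idx + 1]  # e.g. concentrate emitted energy
    return current_fov

# Detection range dropped from 200 m (clear) to 120 m (fog): 40% degradation.
print(adjust_fov("wide", 120.0, 200.0))
```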

NON-MECHANICAL BEAM STEERING ASSEMBLY
20220141447 · 2022-05-05 ·

A depth camera assembly (DCA) for depth sensing of a local area. The DCA includes a transmitter, a receiver, and a controller. The transmitter illuminates a local area with outgoing light in accordance with emission instructions. The transmitter includes a fine steering element and a coarse steering element. The fine steering element deflects one or more optical beams at a first deflection angle to generate one or more first order deflected scanning beams. The coarse steering element deflects the one or more first order deflected scanning beams at a second deflection angle to generate the outgoing light projected into the local area. The receiver captures one or more images of the local area including portions of the outgoing light reflected from the local area. The controller determines depth information for one or more objects in the local area based in part on the captured one or more images.
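The two-stage steering above composes a small fine deflection with a larger coarse deflection to point the outgoing beam. Treating the two deflections as additive angles in a single plane is a simplifying assumption; the sketch only shows how the combined direction follows from the two stages.

```python
# Minimal 2-D geometric sketch: fine + coarse deflection angles give the
# outgoing beam direction as a unit vector.

import math

def outgoing_direction(fine_deg, coarse_deg):
    """Unit vector of the outgoing beam after both deflections (planar model)."""
    total = math.radians(fine_deg + coarse_deg)
    return (math.sin(total), math.cos(total))

# A 2-degree fine scan superimposed on a 30-degree coarse step.
x, y = outgoing_direction(2.0, 30.0)
print(round(x, 3), round(y, 3))
```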

Combined radar and lighting unit and laser radar apparatus
11724636 · 2023-08-15 ·

A combined radar and lighting unit includes a laser radar apparatus, a lighting device, and a controller. The laser radar apparatus is mounted to a vehicle, emits laser light toward an outside of the vehicle, and detects reflected light to measure distance. The lighting device is mounted to the vehicle and emits visible light toward the outside of the vehicle. The controller controls the lighting device to cause the lighting device to alternately operate in a first emission mode to perform emission of the visible light, and in a second emission mode to stop emission of the visible light or reduce a quantity of the visible light. The controller controls the laser radar apparatus to interrupt the measurement of the distance while the lighting device is operating in the first emission mode, and to execute the measurement of the distance while the lighting device is operating in the second emission mode.
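The alternating control described above interleaves lighting and ranging in time: visible light is emitted in the first mode, and lidar distance measurement runs only during the second (light off or reduced) mode so the visible light cannot interfere. The fixed equal-length slots in this sketch are an illustrative assumption.

```python
# Sketch of the alternating controller: even slots run the first emission
# mode (visible light on, lidar interrupted), odd slots run the second mode
# (light reduced, lidar measuring).

def schedule(num_slots):
    """Yield (slot, lighting_mode, lidar_active) for alternating time slots."""
    for slot in range(num_slots):
        first_mode = (slot % 2 == 0)   # even slots: visible light emitted
        lidar_active = not first_mode  # lidar measures only while light is off
        yield slot, "emit" if first_mode else "reduced", lidar_active

for slot, mode, lidar in schedule(4):
    print(slot, mode, lidar)
```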

Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system

Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system. A method includes actuating an emitter to emit pulses of electromagnetic radiation and sensing reflected electromagnetic radiation with a pixel array of an image sensor. The method includes detecting motion across two or more sequential exposure frames, compensating for the detected motion, and combining the two or more sequential exposure frames to generate an image frame. The method is such that at least a portion of the pulses of electromagnetic radiation emitted by the emitter comprises one or more of: electromagnetic radiation having a wavelength from about 513 nm to about 545 nm, from about 565 nm to about 585 nm, from about 900 nm to about 1000 nm, an excitation wavelength of electromagnetic radiation that causes a reagent to fluoresce, or a laser mapping pattern.
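The motion-compensation-and-combination step above can be illustrated with a deliberately simplified model: estimate a purely translational shift between two sequential exposure frames, undo it, and average the aligned frames into one image frame. Real motion estimation is far richer; the integer-shift model, 1-D frames, and helper names here are all assumptions.

```python
# Hypothetical sketch: brute-force shift estimation between two exposure
# frames, then motion-compensated averaging into a combined frame.

def estimate_shift(frame_a, frame_b, max_shift=3):
    """Find the integer shift of frame_b that best matches frame_a (min SAD)."""
    best_shift, best_err = 0, float("inf")
    n = len(frame_a)
    for s in range(-max_shift, max_shift + 1):
        err = sum(abs(frame_a[i] - frame_b[(i + s) % n]) for i in range(n))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def combine(frame_a, frame_b):
    """Motion-compensate frame_b onto frame_a, then average the two."""
    s = estimate_shift(frame_a, frame_b)
    n = len(frame_a)
    aligned_b = [frame_b[(i + s) % n] for i in range(n)]
    return [(a + b) / 2 for a, b in zip(frame_a, aligned_b)]

frame1 = [0, 0, 10, 0, 0, 0]
frame2 = [0, 0, 0, 10, 0, 0]  # same scene shifted right by one pixel
print(combine(frame1, frame2))
```

With the shift undone, the bright pixel lines up in both frames, so averaging sharpens rather than smears it; without compensation the average would show two half-intensity ghosts.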

LiDAR ADAPTIVE SCANNING SYSTEM AND METHOD USING IMAGE INFORMATION CONVERGENCE
20220137190 · 2022-05-05 ·

The present invention relates to a light detection and ranging (LiDAR) adaptive scanning system and method using image information convergence, and more particularly, to a LiDAR adaptive scanning system and method that derive LiDAR ranging without restriction from spatial constraints or obstacles, by adaptively scanning regions selected through the convergence of multiple pieces of image-derived information.

Scalable Depth Sensor
20220130060 · 2022-04-28 ·

A system and method for a scalable depth sensor. The scalable depth sensor has an emitter, a receiver, and a processor. The emitter is configured to uniformly illuminate a scene within a field-of-view of the emitter. The receiver includes a plurality of detectors, each detector configured to capture depth and intensity information corresponding to a subset of the field-of-view. A processor connected to the detectors is configured to selectively sample a subset of the plurality of the detectors in accordance with compressive sensing techniques, and provide an image in accordance with an output from the subset of the plurality of the detectors, the image providing a depth and intensity image corresponding to the field-of-view of the emitter.
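The selective-sampling idea above can be sketched as: read out only a random subset of detectors, then reconstruct a full field-of-view image from those measurements. A genuine compressive-sensing reconstruction (e.g. l1 minimization over a sparse basis) is replaced here by a trivial nearest-neighbour fill, purely to show the data flow; all names and the sampling fraction are assumptions.

```python
# Illustrative sketch: sample a subset of detector readings, then fill the
# unsampled positions from the nearest sampled detector.

import random

def sample_subset(detectors, fraction=0.5, seed=0):
    """Pick a deterministic random subset of detector indices to read out."""
    rng = random.Random(seed)
    k = max(1, int(len(detectors) * fraction))
    indices = sorted(rng.sample(range(len(detectors)), k))
    return {i: detectors[i] for i in indices}

def reconstruct(samples, size):
    """Stand-in for CS reconstruction: nearest sampled value per position."""
    sampled = sorted(samples)
    return [samples[min(sampled, key=lambda j: abs(j - i))] for i in range(size)]

depths = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2, 5.3, 9.0]  # per-detector depth readings
subset = sample_subset(depths, fraction=0.5)
image = reconstruct(subset, len(depths))
print(len(subset), image)
```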
