G01S7/491

Methods for calculating distance using depth sensor and apparatuses performing the same

A method of calculating, using a depth sensor, a distance that excludes an ambiguity distance may be provided. The method includes outputting a modulated light signal from a light source to an object, receiving the modulated light signal reflected by the object, and calculating a distance between the light source and the object using the reflected modulated light signal input to photo gates in conjunction with demodulation signals supplied to the photo gates. The calculating includes calculating, using the modulated light signal, at least one distance farther than a maximum measurable distance, and setting the at least one distance equal to the maximum measurable distance. A range of the distance farther than the maximum measurable distance can be determined according to a duty ratio of the modulated light signal.
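The abstract gives no formulas, but the standard continuous-wave ToF relation d_max = c/(2f) for the maximum unambiguous range, together with the clamping step described above, can be sketched as follows (the duty-ratio dependence of the extended range is not modeled here, and all names are illustrative):

```python
# Speed of light, m/s.
C = 299_792_458.0

def max_measurable_distance(mod_freq_hz):
    """Unambiguous range of a continuous-wave ToF sensor: c / (2 f)."""
    return C / (2.0 * mod_freq_hz)

def clamp_distances(distances, mod_freq_hz):
    """Set any distance farther than the maximum measurable distance
    equal to that maximum, as the abstract describes."""
    d_max = max_measurable_distance(mod_freq_hz)
    return [min(d, d_max) for d in distances]
```

At a 20 MHz modulation frequency, for example, the unambiguous range is about 7.5 m, and any farther recovered distance is set to that value.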

Variable resolution pixel

A photosensor has a plurality of light-sensitive pixels, each of which comprises a light-sensitive region; a plurality of storage regions for accumulating photocharge generated in the light-sensitive region; a transfer gate for each storage region that is selectively electrifiable to transfer photocharge from the light-sensitive region to the storage region; and an array of microlenses that, for each storage region, directs a different portion of light incident on the pixel to a part of the light-sensitive region closer to that storage region than to the other storage regions.

Time of flight sensor binning

A time-of-flight sensor device generates and analyzes a high-resolution depth map frame from a high-resolution image to determine a mode of operation for the time-of-flight sensor and an illuminator and to control the time-of-flight sensor and illuminator according to the mode of operation. A binned depth map frame can be created from a binned image from the time-of-flight sensor and combined with the high-resolution depth map frame to create a compensated depth map frame.
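The binned image mentioned above can be illustrated with a simple 2×2 averaging of a frame; real ToF sensors may instead bin by summing charge in hardware, which this abstract does not detail:

```python
def bin_2x2(frame):
    """Average 2x2 blocks of a row-major 2D list, halving resolution
    in each dimension. Assumes even height and width; sketch only."""
    h, w = len(frame), len(frame[0])
    return [
        [(frame[y][x] + frame[y][x + 1]
          + frame[y + 1][x] + frame[y + 1][x + 1]) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]
```

The binned frame trades spatial resolution for per-pixel signal, which is why combining it with the high-resolution depth map can yield a compensated result.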

Depth sensing method, 3D image generation method, 3D image sensor, and apparatus including the same

A three-dimensional (3D) image sensor module including: an oscillator configured to output a distortion-compensated oscillation frequency as a driving voltage of a sine wave biased with a bias voltage; an optical shutter configured to vary transmittance of reflective light reflected from a subject, according to the driving voltage, and to modulate the reflective light into at least two optical modulation signals having different phases; and an image generator configured to generate image data about the subject, the image data including depth information that is calculated based on a difference between the phases of the at least two optical modulation signals.
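The abstract does not spell out how depth follows from the phase difference; a minimal sketch using the standard CW-ToF relation d = c·Δφ/(4πf) would be:

```python
import math

# Speed of light, m/s.
C = 299_792_458.0

def depth_from_phase(delta_phi_rad, mod_freq_hz):
    """Depth from the measured phase shift of a modulated signal:
    d = c * delta_phi / (4 * pi * f). Textbook relation; the abstract
    itself does not give the formula."""
    return C * delta_phi_rad / (4.0 * math.pi * mod_freq_hz)
```

A phase shift of π at 20 MHz, for instance, corresponds to half the unambiguous range.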

Relative speed measuring doppler LiDAR

The general field of the invention is that of Doppler lidars intended to measure the speed of a target. The lidar according to the invention comprises: first means for modulating the optical frequency of the transmission signal, said frequency being the sum of a constant frequency and of a variable frequency of determined amplitude, modulated by a periodic temporal function; second means for computing the spectrum of the measured heterodyne signal and for creating two measurement spectra, obtained by shifting the spectrum of the heterodyne signal by a positive and a negative frequency value, said realignment frequency being equal to the difference between the instantaneous frequency of the transmission signal and the frequency of a signal transmitted at a time shifted by the round-trip travel time between the lidar and the target; and third means for comparing the two measurement spectra, the difference in amplitude between the two spectra at the Doppler frequency determining the direction of the speed of the target.
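As a rough sketch of the second and third means, the two measurement spectra can be formed by circularly shifting a discretized heterodyne spectrum by plus and minus a realignment offset (expressed in bins), and the direction of the speed read from the amplitude difference at the Doppler bin. Bin indexing and the circular-shift model are simplifying assumptions, not the patent's method:

```python
def shifted_spectra(spectrum, shift_bins):
    """Create the two measurement spectra by circularly shifting the
    heterodyne spectrum by +shift_bins and -shift_bins."""
    n = len(spectrum)
    plus = [spectrum[(i - shift_bins) % n] for i in range(n)]
    minus = [spectrum[(i + shift_bins) % n] for i in range(n)]
    return plus, minus

def speed_sign(spectrum, shift_bins, doppler_bin):
    """Sign of the target speed from the amplitude difference between
    the two shifted spectra at the Doppler bin (+1, -1, or 0)."""
    plus, minus = shifted_spectra(spectrum, shift_bins)
    diff = plus[doppler_bin] - minus[doppler_bin]
    return (diff > 0) - (diff < 0)
```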

Methods and apparatus for coded time-of-flight camera

In illustrative implementations, a time-of-flight camera robustly measures scene depths despite multipath interference. The camera emits amplitude-modulated light. An FPGA sends at least two electrical signals: the first controls modulation of the radiant power of a light source, and the second is a reference signal that controls modulation of pixel gain in a light sensor. These signals are identical except for time delays, and comprise binary codes that are m-sequences or other broadband codes. The correlation waveform is not sinusoidal. During measurements, only one fundamental modulation frequency is used. One or more computer processors solve a linear system by deconvolution in order to recover an environmental function. Sparse deconvolution is used if the scene has only a few objects at a finite depth; another algorithm, such as Wiener deconvolution, is used if the scene has global illumination or scattering media.
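A full Wiener or sparse deconvolution is beyond a short sketch, but for the sparse single-return case the flat autocorrelation of an m-sequence already lets a matched filter recover the delay bin, which stands in for the sparse-deconvolution step here; the function names are hypothetical:

```python
def circular_correlate(measured, code):
    """Circular cross-correlation of the measurement with the emitted
    binary code, evaluated at every delay bin."""
    n = len(measured)
    return [sum(measured[(i + k) % n] * code[i] for i in range(n))
            for k in range(n)]

def recover_single_depth_bin(measured, code):
    """For a sparse scene with one dominant return, the m-sequence's
    sharp autocorrelation peak identifies the delay (depth) bin."""
    corr = circular_correlate(measured, code)
    return max(range(len(corr)), key=corr.__getitem__)
```

With the length-7 m-sequence 1110100, every nonzero circular shift correlates equally poorly with itself, so the true delay bin gives a unique maximum.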

System and Method for Tracking Objects Using Lidar and Video Measurements
20170248698 · 2017-08-31 ·

A system uses range and Doppler velocity measurements from a lidar system and images from a video system to estimate a six degree-of-freedom trajectory of a target. The system estimates this trajectory in two stages: a first stage in which the range and Doppler measurements from the lidar system along with various feature measurements obtained from the images from the video system are used to estimate first stage motion aspects of the target (i.e., the trajectory of the target); and a second stage in which the images from the video system and the first stage motion aspects of the target are used to estimate second stage motion aspects of the target. Once the second stage motion aspects of the target are estimated, a three-dimensional image of the target may be generated.

Hybrid Transmitter Receiver Optical Imaging System

An image capture device includes, in part, N optical transmit antennas forming a first array, N phase modulators each associated with and adapted to control the phase of a different one of the transmit antennas, M optical receive antennas forming a second array, M phase modulators each associated with and adapted to control the phase of a different one of the receive antennas, and a controller adapted to control the phases of the first and second pluralities of phase modulators to capture an image of an object. The first and second arrays may be one-dimensional arrays positioned substantially orthogonal to one another. Optionally, the first array is a circular array of transmitters, and the second array is a one-dimensional array of receivers positioned in the same plane as that in which the circular array of transmitters is disposed.
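The abstract does not state the phase law the controller applies; for a uniform linear array, the textbook steering profile φ_n = −2πnd·sin(θ)/λ could look like this (all names are illustrative):

```python
import math

def steering_phases(num_antennas, spacing_m, wavelength_m, angle_rad):
    """Per-antenna phase settings that steer a uniform linear array
    toward angle_rad off broadside: phi_n = -2*pi*n*d*sin(theta)/lambda,
    reduced modulo 2*pi. Textbook relation, not the patent's controller."""
    k = 2.0 * math.pi / wavelength_m
    return [(-k * n * spacing_m * math.sin(angle_rad)) % (2.0 * math.pi)
            for n in range(num_antennas)]
```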

Apparatus and methods for dimensioning an object carried by a vehicle moving in a field of measurement
20170227629 · 2017-08-10 ·

The dimensions of an object are measured as it is transported by a forklift within an area of measurement. A first scanner is on a first side of the area of measurement; a second scanner is on an opposite second side and across the first scanner. The first and second scanners provide a dual-head scanner arrangement to capture dimensions of the object. A third scanner is on the first side of the area of measurement, parallel to the first scanner. The first and third scanners are configured to capture speed and direction of the object. Each scanner has a processor to operate it. The first and second scanners are synchronized, and operation of the first and third scanners is correlated. Placement of the first and second scanners establishes a width of the area of measurement and the first and third scanners establish a length thereof.
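The speed-and-direction measurement by the first and third scanners reduces to a known separation divided by the time between their detections; a minimal sketch, with hypothetical names and a sign convention of my choosing (positive = moving from the first toward the third scanner):

```python
def speed_and_direction(separation_m, t_first_s, t_third_s):
    """Speed and travel direction of the object from the times at which
    the first and third scanners, a known distance apart, detect it."""
    dt = t_third_s - t_first_s
    if dt == 0:
        raise ValueError("simultaneous detection; cannot infer motion")
    return abs(separation_m / dt), (1 if dt > 0 else -1)
```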

Time of flight camera system
09726762 · 2017-08-08 ·

A light transit time camera system, and a method for operating such a system with at least three modulation frequencies, are provided. The method has the steps:
a) determining a phase shift (φ_i) between an emitted signal (Sp1) and a received signal (Sp2) for a modulation frequency (f_1, f_2, f_3) in a phase-measuring cycle (PM_1, PM_2, ...);
b) carrying out a plurality of phase-measuring cycles (PM_1, PM_2, ...);
c) determining a distance value (d_n,n+1) on the basis of the phase shifts (φ_n, φ_n+1) determined in two successive phase-measuring cycles (PM_n, PM_n+1), in a distance-measuring cycle (M_1, M_2, ...);
d) carrying out a plurality of distance-measuring cycles (M_1, M_2, ...);
e) determining a distance deviation (Δd) between the distance values of successive distance-measuring cycles; and
f) outputting a distance value (d_n,n+1) as a valid distance value if the distance deviation (Δd) is within a tolerance limit (Δd_tol).
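Steps c), e) and f) can be sketched in a few lines. Combining the two phases via their beat frequency is one common choice for step c) that the abstract does not mandate, and the tolerance value is an assumption:

```python
import math

# Speed of light, m/s.
C = 299_792_458.0

def distance_from_phase_pair(phi_n, phi_n1, f_n, f_n1):
    """Distance from the phase shifts of two successive phase-measuring
    cycles: the phase difference at the beat frequency f_n1 - f_n gives
    an extended unambiguous range. Illustrative combination rule only."""
    dphi = (phi_n1 - phi_n) % (2.0 * math.pi)
    return C * dphi / (4.0 * math.pi * (f_n1 - f_n))

def is_valid_distance(d_prev, d_cur, tol_m=0.05):
    """Steps e) and f): output a distance as valid only if successive
    distance-measuring cycles agree within a tolerance."""
    return abs(d_cur - d_prev) <= tol_m
```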