Patent classifications
G01S17/32
SENSOR ASSEMBLY WITH LIDAR FOR AUTONOMOUS VEHICLES
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
DETECTION METHOD AND DETECTION APPARATUS
The present invention relates to the field of radar detection. Provided are a detection method and a detection apparatus. The method comprises: emitting a first waveform signal to a target to be detected, and receiving a second waveform signal reflected by the target on the basis of the first waveform signal, the second waveform signal carrying spatial modulation information; generating, on the basis of the second waveform signal, a detection signal corresponding to the spatial modulation information, and obtaining a signal flight time carried by the detection signal; and determining distance data of the target on the basis of multiple pieces of spatial modulation information and the signal flight times corresponding to them.
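Setting the spatial-modulation processing aside, the final step of the method (turning signal flight times into distance data) reduces to the standard time-of-flight relation. The sketch below is illustrative only; the function name and the simple averaging over measurements are assumptions, not taken from the patent:

```python
# Illustrative sketch only -- the patent's spatial-modulation processing
# is not reproduced; this shows just the flight-time-to-distance step.
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_flight_times(flight_times_s):
    """Convert round-trip flight times to one-way distances and average
    over the multiple spatially modulated measurements (assumed here to
    be a plain mean)."""
    distances = [C * t / 2.0 for t in flight_times_s]
    return sum(distances) / len(distances)
```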
OPTICAL MEASUREMENT DEVICE AND MEASUREMENT METHOD
An optical measurement device includes at least a multi-frequency laser configured to simultaneously generate a frequency-fixed carrier and at least one frequency-modulated subcarrier, an optical branching element, a dual frequency beat signal generator, a difference signal generator, and an arithmetic processing unit. Either the carrier or the subcarrier within the output light of the multi-frequency laser is used as first measurement light and either the carrier or the subcarrier having a frequency different from that of the first measurement light is used as second measurement light. The dual frequency beat signal generator separates and outputs a first complex beat signal derived from the first measurement light and a second complex beat signal derived from the second measurement light. The difference signal generator outputs a difference signal between the first complex beat signal and the second complex beat signal.
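As a rough numerical illustration of why such a difference signal is useful (the abstract does not spell this out, so the mechanism and every parameter below are assumptions): a phase disturbance common to both complex beat signals cancels when one signal is multiplied by the conjugate of the other, which is one common way to form a difference in phase.

```python
import numpy as np

# Assumed illustration: two complex beat signals sharing a common phase
# disturbance; the conjugate product (one way to form a "difference"
# signal in phase) removes the shared term, leaving only the frequency
# difference between the two beats.
t = np.linspace(0.0, 1e-3, 1000)
disturbance = 0.5 * np.sin(2 * np.pi * 50.0 * t)        # common phase noise
beat1 = np.exp(1j * (2 * np.pi * 1.0e4 * t + disturbance))
beat2 = np.exp(1j * (2 * np.pi * 3.0e4 * t + disturbance))

diff = beat1 * np.conj(beat2)                           # disturbance cancels
residual_hz = np.angle(diff[1] / diff[0]) / (2 * np.pi * (t[1] - t[0]))
# residual_hz recovers only the 10 kHz - 30 kHz beat difference.
```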
PHASE DIFFERENCE CALCULATION DEVICE, PHASE DIFFERENCE CALCULATION METHOD, AND PROGRAM
Provided is a phase difference calculation device including a first light amount acquisition unit that acquires a first light amount of reflected light of light applied in a first time window and received in the first time window and a second light amount of the reflected light received in a second time window, a time window shift control unit that shifts the first and second time windows and a third time window in the negative direction of the time axis to set fourth, fifth, and sixth time windows, and shifts the fourth, fifth, and sixth time windows in the negative direction of the time axis until no reflected light is received in the fourth time window, a second light amount acquisition unit that acquires a third light amount of the reflected light received in the sixth time window, and a phase difference calculation unit that calculates a phase difference between the light and the reflected light on the basis of a first corrected light amount obtained by adding the third light amount to the first light amount and a second corrected light amount obtained by subtracting the third light amount from the second light amount.
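Stripped of the time-window shifting machinery, the correction arithmetic in the abstract can be sketched minimally. The correction signs (add the third light amount to the first, subtract it from the second) follow the abstract; the final phase formula is one common two-window estimate, assumed here rather than taken from the patent:

```python
import math

def phase_difference(q1, q2, q3):
    """q1, q2: light amounts from the first and second time windows;
    q3: light amount from the shifted (sixth) window, per the abstract."""
    corrected1 = q1 + q3   # first corrected light amount (per the abstract)
    corrected2 = q2 - q3   # second corrected light amount (per the abstract)
    # Assumed estimator: phase proportional to the second window's share
    # of the total corrected light amount.
    return math.pi * corrected2 / (corrected1 + corrected2)
```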
Chip-scale Lidar with enhanced range performance
A vehicle, a Lidar system, and a method of detecting an object are disclosed. The Lidar system includes a photonic chip and a laser integrated into the photonic chip. The laser has a front facet located at a first aperture of the photonic chip to direct a transmitted light beam into free space. A reflected light beam, which is a reflection of the transmitted light beam, is received at the photonic chip, and a parameter of the object is determined from a comparison of the transmitted light beam and the reflected light beam. A navigation system operates the vehicle with respect to the object based on the parameter of the object.
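The abstract does not say how the transmitted and reflected beams are compared; a common approach on coherent photonic chips is FMCW ranging, where the beat frequency between the two beams encodes range. The sketch below assumes that technique and illustrative chirp parameters, neither of which is confirmed by the patent:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_beat(beat_hz, chirp_bandwidth_hz, chirp_period_s):
    """FMCW range sketch (assumed technique): dividing the measured beat
    frequency by the chirp slope gives the round-trip delay, which maps
    to one-way range."""
    slope_hz_per_s = chirp_bandwidth_hz / chirp_period_s
    round_trip_s = beat_hz / slope_hz_per_s
    return C * round_trip_s / 2.0
```

For example, a 1 MHz beat under a 1 GHz chirp swept in 1 ms corresponds to roughly 150 m of range.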
3D range imaging method using optical phased array and photo sensor array
A 3D range imaging method using a LiDAR system includes sequentially generating multiple far field patterns to illuminate a target scene, each far field pattern including a plurality of light spots where each spot illuminates only a segment of a scene region unit that corresponds to a sensor pixel of the LiDAR receiver. Within each scene region unit, the multiple segments illuminated in different rounds are non-overlapping with each other, and they collectively cover the entire scene region unit or a part thereof. With each round of illumination, the signal light reflected from the scene is detected by the sensor pixels, and processed to calculate the depth of the illuminated segments. The calculation may take into consideration optical aberration which causes reflected light from an edge segment to be received by two sensor pixels. The depth data calculated from the sequential illuminations are combined to form a ranged image.
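The merge step at the end of the method (combining depth data from sequential, non-overlapping illuminations into one ranged image) can be sketched as a plain dictionary merge; the data layout below is an assumption for illustration, not specified by the patent:

```python
def combine_rounds(rounds):
    """rounds: one dict per illumination round, each mapping a
    (pixel, segment) key to a measured depth. Because segments do not
    overlap across rounds, a plain merge reconstructs the ranged image."""
    image = {}
    for depth_map in rounds:
        image.update(depth_map)
    return image
```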
Determining the distance of an object
An optoelectronic sensor for determining the distance of an object in a monitoring area has a light transmitter for transmitting transmitted light, a light receiver for generating a received signal from remitted light remitted by the object, and a control and evaluation unit configured to modulate the transmitted light with at least a first frequency and a second frequency, to determine a phase offset between transmitted light and remitted light for the first frequency and the second frequency, and to determine a light time of flight. The control and evaluation unit is configured to determine a first amplitude and a second amplitude for the first frequency and the second frequency from the received signal and to detect whether the transmitted light impinges on an edge in the monitoring area on the basis of an evaluation of the first amplitude and the second amplitude.
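Beyond the edge detection described here, a standard reason for modulating with two frequencies is to extend the unambiguous range via the synthetic difference frequency. The sketch below shows that textbook technique, which may differ from the patented evaluation; all frequencies are assumed values:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_for_distance(d_m, f_hz):
    """Phase offset a modulation at f_hz accrues over the round trip."""
    return (4 * math.pi * f_hz * d_m / C) % (2 * math.pi)

def distance_from_phases(phi1, phi2, f1_hz, f2_hz):
    """Resolve the per-frequency phase ambiguity with the synthetic
    frequency f1 - f2; valid up to c / (2 * (f1 - f2))."""
    dphi = (phi1 - phi2) % (2 * math.pi)
    return C * dphi / (4 * math.pi * (f1_hz - f2_hz))
```

With f1 = 20 MHz and f2 = 18 MHz, the synthetic frequency is 2 MHz, giving roughly 75 m of unambiguous range instead of about 7.5 m at 20 MHz alone.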