Patent classifications
G01S17/46
3D range imaging method using optical phased array and photo sensor array
A 3D range imaging method using a LiDAR system includes sequentially generating multiple far-field patterns to illuminate a target scene, each far-field pattern including a plurality of light spots, where each spot illuminates only a segment of a scene region unit that corresponds to a sensor pixel of the LiDAR receiver. Within each scene region unit, the segments illuminated in different rounds do not overlap with each other, and they collectively cover the entire scene region unit or a part thereof. In each round of illumination, the signal light reflected from the scene is detected by the sensor pixels and processed to calculate the depth of the illuminated segments. The calculation may take into consideration optical aberration, which causes reflected light from an edge segment to be received by two sensor pixels. The depth data calculated from the sequential illuminations are combined to form a range image.
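The final combination step described above, merging depth values from sequential, non-overlapping illumination rounds into one range image, can be sketched as follows. The array layout, the NaN convention for unlit segments, and the function name are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def combine_rounds(depth_rounds, masks):
    """Merge per-round depth maps into a single range image.

    depth_rounds: list of 2D arrays, each holding depths for the
    segments illuminated in that round (NaN elsewhere).
    masks: matching boolean arrays marking which segments were lit.
    """
    out = np.full(depth_rounds[0].shape, np.nan)
    for depth, mask in zip(depth_rounds, masks):
        # Segments are non-overlapping across rounds, so each
        # location is written at most once.
        out[mask] = depth[mask]
    return out
```

Because the rounds are non-overlapping by construction, the merge is a simple scatter; no averaging or conflict resolution is needed.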
Alternating light distributions for active depth sensing
Aspects of the present disclosure relate to systems and methods for active depth sensing. An example apparatus configured to perform active depth sensing includes a projector. The projector is configured to emit a first distribution of light during a first time and emit a second distribution of light, different from the first distribution of light, during a second time. A set of final depth values of one or more objects in a scene is based on one or more reflections of the first distribution of light and one or more reflections of the second distribution of light. The projector may include a laser array, and the apparatus may be configured to switch between a first plurality of lasers of the laser array to emit light during the first time and a second plurality of lasers to emit light during the second time.
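Deriving final depth values from reflections of two alternating light distributions might be sketched as below. The fusion rule (average where both patterns yielded a value, otherwise take whichever is available) and the NaN convention are illustrative assumptions; the disclosure does not specify this particular combination:

```python
import numpy as np

def fuse_alternating(depth_a, depth_b):
    """Fuse depth maps measured under the first and second light
    distributions. NaN marks locations where a pattern produced
    no usable return."""
    both = ~np.isnan(depth_a) & ~np.isnan(depth_b)
    # Fall back to whichever measurement exists at each location.
    out = np.where(np.isnan(depth_a), depth_b, depth_a)
    # Where both distributions produced a value, average them.
    out[both] = 0.5 * (depth_a[both] + depth_b[both])
    return out
```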
System and method of registering point cloud data using subsample data
A system for generating a three-dimensional (3D) scan of an environment includes multiple 3D scanners, including a first 3D scanner and a second 3D scanner at respective first and second positions. The system further includes a controller coupled to the 3D scanners via a common communications network. The first scanner and the second scanner each transmit a subset of data to the controller while acquiring a set of 3D coordinates. The controller registers the subsets of data to each other while the sets of 3D coordinates are being acquired.
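Registering the transmitted subsets while the full scans are still in flight amounts to estimating a rigid transform between subsampled point sets. A minimal sketch using the well-known Kabsch algorithm follows; it assumes point correspondences are already established, which the patent's registration pipeline would have to provide by other means:

```python
import numpy as np

def register_subsets(src, dst):
    """Estimate the rigid transform (R, t) aligning a subsampled
    source point set to a destination subset, with dst[i] assumed to
    correspond to src[i]. Both inputs are (N, 3) arrays."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Because the registration runs on small subsets, it can keep pace with acquisition and be refined once the complete sets of 3D coordinates arrive.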
OPTICAL DETECTOR
An optical detector (110) is disclosed, comprising: at least one optical sensor (122) adapted to detect a light beam (116) and to generate at least one sensor signal, wherein the optical sensor (122) has at least one sensor region (126), wherein the sensor signal of the optical sensor (122) is dependent on an illumination of the sensor region (126) by the light beam (116), wherein the sensor signal, given the same total power of the illumination, is dependent on a width of the light beam (116) in the sensor region (126); at least one focus-tunable lens (130) located in at least one beam path (132) of the light beam (116), the focus-tunable lens (130) being adapted to modify a focal position of the light beam (116) in a controlled fashion; at least one focus-modulation device (136) adapted to provide at least one focus-modulating signal (138) to the focus-tunable lens (130), thereby modulating the focal position; and at least one evaluation device (140), the evaluation device (140) being adapted to evaluate the sensor signal.
VISUAL POSITIONING DEVICE AND THREE-DIMENSIONAL SURVEYING AND MAPPING SYSTEM AND METHOD BASED ON SAME
Disclosed are a visual positioning device (101) and a three-dimensional surveying and mapping system (100) including at least one visual positioning device (101). The visual positioning device (101) includes an infrared light source (101b), an infrared camera (101a), a signal transceiver module (101d) and a visible light camera (101c). The three-dimensional surveying and mapping system (100) further includes a plurality of position identification points (102), a plurality of active signal points (103) and an image processing server (104). The image processing server (104) is configured to cache infrared images and real scene images shot by the infrared camera (101a) and the visible light camera (101c), together with their positioning information, and to store a three-dimensional model obtained through reconstruction. The present invention has the advantages of a simple structure, no need for a power supply, convenience in use, and high precision.
VISUAL POSITIONING SYSTEM AND METHOD BASED ON HIGH REFLECTIVE INFRARED IDENTIFICATION
A visual positioning system based on highly infrared-reflective identification, including a plurality of identification points (102), an infrared photographing device (101) and an image processing unit (103). The plurality of identification points (102) are passive identification points made of a highly infrared-reflective material and are arranged at equal intervals in a plane to be positioned; the infrared photographing device (101) is used for shooting a reflective image of the identification points (102); and the image processing unit (103) obtains a relative position and relative attitude variation by acquiring and analyzing information about an image shot by an infrared camera (101a). Also provided is a visual positioning method based on highly infrared-reflective identification. The visual positioning system and method have the advantages of a simple structure, no need for a power supply, low cost, no delay, and high positioning precision.
Remote distance estimation system and method
Provided is a tangible, non-transitory, machine-readable medium storing instructions that, when executed by an image processor, effectuate operations including: capturing, with a first image sensor, a first image of at least two light points projected on a surface by at least one laser light emitter; extracting, with at least one image processor, a first distance between the at least two light points in the first image in a first direction; and estimating, with the at least one image processor, a first distance to the surface on which the at least two light points are projected based on at least the first distance between the at least two light points and a predetermined relationship relating a distance between at least two light points in the first direction and a distance to the surface on which the at least two light points are projected.
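One common form of the "predetermined relationship" described above is inverse proportionality between the imaged spacing of two parallel laser points and the distance to the surface, under a pinhole camera model. The sketch below illustrates that relationship only; the function name and parameters are assumptions, and the patent's calibrated relationship may differ:

```python
def estimate_distance(pixel_separation, emitter_spacing_m, focal_length_px):
    """Estimate distance (meters) to a surface from the imaged spacing
    of two parallel laser points.

    pixel_separation: measured spacing of the two points in the image (px).
    emitter_spacing_m: physical spacing of the parallel emitters (m).
    focal_length_px: camera focal length expressed in pixels.

    Pinhole model: pixel_separation = focal_length_px * emitter_spacing_m / Z,
    so Z = focal_length_px * emitter_spacing_m / pixel_separation.
    """
    if pixel_separation <= 0:
        raise ValueError("pixel separation must be positive")
    return focal_length_px * emitter_spacing_m / pixel_separation
```

For example, with emitters 5 cm apart and a 1000 px focal length, a 100 px imaged separation corresponds to a surface 0.5 m away; as the surface recedes, the imaged separation shrinks.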
DISTANCE IMAGE ACQUISITION APPARATUS AND DISTANCE IMAGE ACQUISITION METHOD
A distance image acquisition apparatus includes a projection unit which projects a first pattern of structured light in a plurality of wavelength bandwidths; an imaging unit which is provided in parallel with, and apart from, the projection unit by a baseline length, performs imaging with sensitivities to the plurality of wavelength bandwidths, and generates a plurality of captured images corresponding to the plurality of wavelength bandwidths; a determination unit which determines whether or not a second pattern of structured light projected from another distance image acquisition apparatus is included in the captured images; a pattern extraction unit which extracts the first pattern from a captured image for which the determination unit has determined that the second pattern is not included; and a distance image acquisition unit which acquires a distance image indicating a distance of a subject within a distance measurement region based on the first pattern.
DISTANCE IMAGE ACQUISITION APPARATUS AND DISTANCE IMAGE ACQUISITION METHOD
The distance image acquisition apparatus (10) includes a projection unit (12) which projects a first pattern of structured light distributed in a two-dimensional manner with respect to a subject within a distance measurement region; a light modulation unit (22) which spatially modulates the first pattern projected from the projection unit (12); an imaging unit (14) which is provided in parallel with, and apart from, the projection unit (12) by a baseline length, and captures an image including the first pattern reflected from the subject within the distance measurement region; a pattern extraction unit (20A) which extracts the first pattern spatially modulated by the light modulation unit (22) from the image captured by the imaging unit (14); and a distance image acquisition unit (20B) which acquires a distance image indicating a distance of the subject within the distance measurement region based on the first pattern.