Patent classifications
G01C3/14
TRIANGULATION DEVICE, TRIANGULATION METHOD, AND RECORDING MEDIUM RECORDING PROGRAM THEREFOR
A triangulation device for computing a three-dimensional position of a measurement target point using a stereo method. The triangulation device includes: an optimum image coordinate estimation unit configured to, based on coordinates of corresponding points that correspond to the measurement target point in two images, each of which includes an image of the measurement target point, and on intrinsic parameters and extrinsic parameters of the optical instruments generating the two images, calculate correction vectors by which the coordinates of the corrected corresponding points satisfy an epipolar equation composed of the intrinsic and extrinsic parameters, using a characteristic polynomial whose variable is a correction amount of the coordinates of the corresponding points or the reciprocal of that correction amount, and compute the coordinates of the corrected corresponding points from the calculated correction vectors; and a three-dimensional coordinate calculation unit configured to calculate the three-dimensional coordinates of the measurement target point based on the coordinates of the corrected corresponding points and the intrinsic and extrinsic parameters.
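The final step of this abstract, recovering the 3-D point from the corrected corresponding points and the camera parameters, can be sketched with standard linear (DLT) triangulation. The function name and the simple two-camera setup below are illustrative assumptions, not the patent's own formulation:

```python
import numpy as np

def triangulate(p1, p2, K1, K2, R, t):
    """Linear (DLT) triangulation of one point from two views.

    p1, p2 : (x, y) pixel coordinates of the corrected corresponding points.
    K1, K2 : 3x3 intrinsic matrices of the two optical instruments.
    R, t   : rotation and translation of camera 2 relative to camera 1
             (the extrinsic parameters).
    """
    # Projection matrices: camera 1 at the origin, camera 2 at (R, t).
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, t.reshape(3, 1)])
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        p1[0] * P1[2] - P1[0],
        p1[1] * P1[2] - P1[1],
        p2[0] * P2[2] - P2[0],
        p2[1] * P2[2] - P2[1],
    ])
    # Solve A X = 0 by SVD; the solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

This is why the correction step matters: DLT assumes the corresponding points satisfy the epipolar constraint exactly, which raw matches generally do not.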
Depth measuring method and system
A depth measuring method and system applicable to a first binocular camera having a zoom lens is provided. The method includes: obtaining a current depth of a target object (S101); determining the focus with which the current depth was measured as the current focus (S102); determining, according to a preset correspondence between depth ranges and focuses, a current reference focus corresponding to a current reference depth range, where the current reference depth range is the depth range in which the current depth falls (S103); determining whether the current focus is the same as the current reference focus (S104); if the current focus is the same as the current reference focus, determining the current depth as the target depth of the target object (S105); or, if the current focus is not the same as the current reference focus, adjusting the current focus to the current reference focus, measuring a current depth of the target object with the adjusted current focus (S106), and returning to the operation (S103) of determining a current reference focus corresponding to a current reference depth range. An object is thus measured with a focus that varies with its depth range, which improves the accuracy of the depth measurement of the target object.
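The control flow S101–S106 is essentially a fixed-point loop over a depth-range-to-focus lookup table; a minimal sketch, in which the `measure_depth` callback, the table layout, and the iteration cap are illustrative assumptions:

```python
def measure_to_convergence(measure_depth, focus_table, current_focus, max_iters=10):
    """Iterate S101-S106: re-measure with the table's focus for the
    measured depth range until the focus used matches the table entry.

    measure_depth(focus) -> depth measured by the binocular camera.
    focus_table          -> list of ((depth_min, depth_max), focus) entries.
    """
    depth = measure_depth(current_focus)            # S101-S102
    for _ in range(max_iters):
        # S103: reference focus for the range containing the measured depth.
        ref_focus = next(f for (lo, hi), f in focus_table if lo <= depth < hi)
        if ref_focus == current_focus:              # S104: focuses match
            break                                   # S105: accept the depth
        current_focus = ref_focus                   # S106: adjust the focus
        depth = measure_depth(current_focus)        #       and re-measure
    return depth
```

The loop terminates as soon as a measurement lands in the depth range whose assigned focus was actually used to take it.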
METHOD AND APPARATUS FOR BINOCULAR RANGING
The present disclosure provides a method and an apparatus for binocular ranging, capable of achieving an improved accuracy of binocular ranging. The method includes: extracting features from a left image and a right image to obtain a left feature image and a right feature image; selecting a standard feature image and obtaining a cost volume of the standard feature image by applying a correlation calculation to the left feature image and the right feature image using a block matching algorithm; obtaining a confidence volume by normalizing the computational costs of all disparity values in the disparity dimension for each pixel point in the cost volume; obtaining a confidence map by selecting the maximum value from the confidence levels of all the disparity values in the disparity dimension for each pixel point in the confidence volume; obtaining a mask map by mapping each pixel point having a confidence level higher than a predetermined threshold in the confidence map to 1 and mapping each pixel point having a confidence level lower than or equal to the threshold in the confidence map to 0; obtaining a disparity map by calculating an argmax value over the confidence levels of all disparity values in the disparity dimension for each pixel point in the confidence volume; obtaining a target disparity map by multiplying the mask map by the disparity map; and estimating a distance based on the target disparity map.
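The post-matching steps of this abstract, normalizing the cost volume, thresholding confidences, and masking the argmax disparity map, translate directly into array operations. A NumPy sketch follows; the softmax normalization and the threshold value are illustrative choices, since the abstract only states that the costs are normalized:

```python
import numpy as np

def disparity_from_cost_volume(cost_volume, threshold=0.5):
    """cost_volume: (H, W, D) matching costs, lower cost = better match.

    Returns the target disparity map (H, W) with low-confidence
    pixels zeroed out, as described in the abstract.
    """
    # Confidence volume: normalize over the disparity dimension.
    # A softmax over negated costs, so low cost -> high confidence.
    logits = -cost_volume
    logits -= logits.max(axis=2, keepdims=True)          # numerical stability
    conf_volume = np.exp(logits)
    conf_volume /= conf_volume.sum(axis=2, keepdims=True)
    # Confidence map: best confidence per pixel.
    conf_map = conf_volume.max(axis=2)
    # Mask map: 1 where confidence exceeds the threshold, else 0.
    mask = (conf_map > threshold).astype(np.float32)
    # Disparity map: argmax over the disparity dimension.
    disparity = conf_volume.argmax(axis=2).astype(np.float32)
    # Target disparity map: mask out unreliable pixels.
    return mask * disparity
```

Pixels where the cost curve is flat across disparities end up near uniform confidence `1/D` and are suppressed by the mask.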
Stereo camera
Provided is a stereo camera capable of reducing the distance error caused by movement of the entrance pupil center between different angles of incidence of the principal ray. In the present invention, an imaging system unit 100a captures a standard image of an object, and an imaging system unit 100b captures a reference image of the object. A geometric correction information storage unit 114 stores geometric correction information for the standard image and the reference image. Each image has an error that depends on the difference between the positions of the object in the standard and reference images when the entrance pupil center (the point of intersection between the principal ray and the optical axis) moves according to the angle of incidence, and those positions when the entrance pupil center is assumed not to move with the angle of incidence. A geometric correction unit 119 geometrically corrects the standard image and the reference image using the geometric correction information.
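Geometric correction of this kind is typically applied as a per-pixel remap: the stored correction information tells us, for each output pixel, where to sample in the uncorrected input image. A sketch using precomputed coordinate maps; the bilinear sampling and map layout are generic assumptions, not the patent's storage format:

```python
import numpy as np

def remap_bilinear(image, map_x, map_y):
    """Apply a geometric correction stored as per-pixel source coordinates:
    output[i, j] = image sampled at (map_y[i, j], map_x[i, j]).
    """
    h, w = image.shape
    # Integer corner of the sampling cell, clamped so the 2x2 neighborhood
    # stays inside the image.
    x0 = np.clip(np.floor(map_x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(map_y).astype(int), 0, h - 2)
    fx = np.clip(map_x - x0, 0.0, 1.0)
    fy = np.clip(map_y - y0, 0.0, 1.0)
    # Bilinear blend of the four neighboring input pixels.
    top = image[y0, x0] * (1 - fx) + image[y0, x0 + 1] * fx
    bot = image[y0 + 1, x0] * (1 - fx) + image[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

In this scheme the entrance-pupil error described in the abstract would be baked into `map_x`/`map_y` once at calibration time, so correction at run time is a single resampling pass per image.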
Signal processing apparatus, signal processing method, program, mobile object, and signal processing system
The present disclosure relates to a signal processing apparatus and system. During calibration, a sensor section acquires the time of day at which a stereo camera image capturing a target is obtained and the time of day at which a radar image capturing the same target is obtained. The target has a radar reflector and a marker. These acquisition times are recorded while the distance between the sensor section and the target is varied, so as to find the amount of time-of-day discrepancy, i.e., the difference in acquisition time between the stereo camera image and the radar image at the same distance. At the time of object detection, the radar image, which is acquired first, is buffered and then output together with the stereo camera image acquired the time-of-day discrepancy later, thus combining the stereo camera image and the radar image.
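At detection time, the scheme reduces to delaying the earlier-arriving radar stream by the calibrated time-of-day discrepancy and pairing it with the matching stereo frame. A sketch with a simple FIFO buffer; the class name, frame format, and matching tolerance are illustrative assumptions:

```python
from collections import deque

class RadarStereoAligner:
    """Buffer radar frames and pair each stereo frame with the radar
    frame whose timestamp, shifted by the calibrated discrepancy,
    is closest to it."""

    def __init__(self, discrepancy_s, tolerance_s=0.01):
        self.discrepancy = discrepancy_s   # calibrated time-of-day discrepancy
        self.tolerance = tolerance_s
        self.radar_buffer = deque()

    def push_radar(self, t, frame):
        self.radar_buffer.append((t, frame))

    def push_stereo(self, t, frame):
        """Return (stereo_frame, radar_frame) if an aligned pair exists."""
        # Radar is acquired first, so the matching radar timestamp is
        # the stereo timestamp minus the discrepancy.
        target = t - self.discrepancy
        # Drop radar frames too old to ever match a future stereo frame.
        while self.radar_buffer and self.radar_buffer[0][0] < target - self.tolerance:
            self.radar_buffer.popleft()
        if self.radar_buffer and abs(self.radar_buffer[0][0] - target) <= self.tolerance:
            return frame, self.radar_buffer.popleft()[1]
        return None
```

A deque keeps both the buffering and the eviction of stale radar frames O(1) per frame.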
Imaging system configured to use time-of-flight imaging and stereo imaging
An imaging system is configured to use an array of time-of-flight (ToF) pixels to determine depth information using the ToF imaging method and/or the stereo imaging method. A light emitting component emits light to illuminate a scene and a light detecting component detects reflected light via the array of ToF pixels. A ToF pixel is configured to determine phase shift data based on a phase shift between the emitted light and the reflected light, as well as intensity data based on an amplitude of the reflected light. Multiple ToF pixels share a single micro-lens. This enables multiple offset images to be generated using the intensity data measured by each ToF pixel. Accordingly, via a configuration in which multiple ToF pixels share a single micro-lens, depth information can be determined using both the ToF imaging method and the stereo imaging method.
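The phase-shift data such a ToF pixel produces converts to depth through the modulation frequency: d = c·Δφ / (4π·f_mod). A minimal sketch of that conversion; the four-phase sampling scheme shown is the common continuous-wave formulation, assumed here rather than stated in the abstract:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(q0, q90, q180, q270, f_mod):
    """Depth from four phase samples of a continuous-wave ToF pixel.

    q0..q270 : correlation samples at 0/90/180/270 degree offsets.
    f_mod    : modulation frequency in Hz.
    """
    # Phase shift between the emitted and the reflected light.
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    # Light travels to the scene and back, hence the factor of 2
    # folded into the 4*pi denominator.
    return C * phase / (4 * math.pi * f_mod)

def tof_amplitude(q0, q90, q180, q270):
    """Intensity (amplitude) data from the same four samples -- this is
    what the stereo path would consume as per-pixel image brightness."""
    return 0.5 * math.sqrt((q90 - q270) ** 2 + (q0 - q180) ** 2)
```

Note the phase wraps every c / (2·f_mod) meters (about 7.5 m at 20 MHz), which is exactly where a second cue such as the micro-lens stereo offset can disambiguate.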
Signal processing apparatus, signal processing method, and program
A signal processing apparatus including a first position calculation unit that calculates a three-dimensional position of a target on a first coordinate system from a stereo image captured by a stereo camera, a second position calculation unit that calculates a three-dimensional position of the target on a second coordinate system from a sensor signal of a sensor capable of obtaining position information of at least one of a lateral direction and a longitudinal direction and position information of a depth direction, a correspondence detection unit that detects a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system, and a positional relationship information estimating unit that estimates positional relationship information of the first coordinate system and the second coordinate system on the basis of the detected correspondence relationship.
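Given matched target positions in the two coordinate systems, the positional relationship (a rotation and translation) can be estimated in closed form. A sketch using the standard Kabsch/SVD alignment, which is one common way to implement such an estimation; the abstract itself does not name an algorithm:

```python
import numpy as np

def estimate_rigid_transform(pts_a, pts_b):
    """Find R, t minimizing || (R @ pts_a.T).T + t - pts_b ||.

    pts_a, pts_b : (N, 3) matched 3-D positions of the same targets
    in the first and second coordinate systems.
    """
    ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (pts_a - ca).T @ (pts_b - cb)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Three or more non-collinear correspondences suffice; with noisy detections, more correspondences simply tighten the least-squares fit.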