Patent classifications
H04N13/214
Apparatus for generating depth image
An apparatus for generating a depth image is provided. According to an exemplary embodiment of the present disclosure, the apparatus is configured to perform accurate stereo matching even at low light levels by obtaining RGB images and/or IR images and using the obtained RGB images and/or IR images to extract a depth image.
Depth Measurement Techniques for a Multi-Aperture Imaging System
A multi-aperture imaging system determines depth map information. A series of image frames of a scene is captured. The frames include a normal image frame and at least one structured image frame. The multi-aperture imaging system determines edge depth information of an object in the scene using a deblur technique and the normal image frame. The multi-aperture imaging system determines fill depth information for the object based in part on the at least one structured image frame. The multi-aperture imaging system generates a depth map of the scene using the edge depth information and the fill depth information.
IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD
A depth map generation unit 22 generates a depth map from images obtained by picking up a subject at a plurality of viewpoint positions with an image pickup unit 21. On the basis of the depth map generated by the depth map generation unit 22, an alignment unit 23 aligns polarized images obtained by the image pickup unit 21 picking up the subject at the plurality of viewpoint positions through polarizing filters in different polarization directions at the different viewpoint positions. A polarization characteristic acquisition unit 24 acquires a polarization characteristic of the subject from a desired viewpoint position by using the polarized images aligned by the alignment unit 23, thereby obtaining a high-precision polarization characteristic with little degradation in temporal or spatial resolution. It becomes possible to acquire the polarization characteristic of the subject at the desired viewpoint position.
Image processing apparatus, image-capturing apparatus and image processing method
An image processing apparatus includes: a receiving unit configured to receive at least two parallax images that are obtained from a subject image captured via a single optical system, where the at least two parallax images include an image in a first viewpoint direction and an image in a second viewpoint direction; an average calculating unit configured to calculate, for each pixel, an arithmetic average and a geometric average between the image in the first viewpoint direction and the image in the second viewpoint direction; a ratio calculating unit configured to calculate, for each pixel, a ratio of the arithmetic average to the geometric average; and a disparity calculating unit configured to calculate, on a pixel-by-pixel basis, a disparity between the image in the first viewpoint direction and the image in the second viewpoint direction based on the ratio.
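The per-pixel computation described in this abstract can be sketched in NumPy. By the AM-GM inequality the ratio of arithmetic to geometric average is always at least 1 and grows as the two viewpoint images diverge; the abstract does not give the exact mapping from this ratio to a signed disparity, so the sketch below (function name and interface are assumptions) returns the raw ratio map only.

```python
import numpy as np

def disparity_ratio(left, right, eps=1e-12):
    """Per-pixel ratio of arithmetic to geometric average of two
    parallax images. The ratio is >= 1 (AM-GM) and increases with
    the per-pixel mismatch between the two viewpoints, so it can
    serve as a monotone disparity cue.

    `left`/`right` are non-negative intensity arrays of equal shape.
    """
    left = np.asarray(left, dtype=np.float64)
    right = np.asarray(right, dtype=np.float64)
    arith = 0.5 * (left + right)                       # arithmetic average
    geom = np.sqrt(np.clip(left * right, 0.0, None))   # geometric average
    return arith / np.maximum(geom, eps)

# identical views -> ratio of exactly 1 everywhere (zero-disparity cue)
a = np.full((2, 2), 100.0)
print(disparity_ratio(a, a))
```

Where the two views agree the ratio is 1; pixels with strong parallax-induced intensity differences yield ratios above 1.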
Image processing apparatus and image processing method for aligning polarized images based on a depth map and acquiring a polarization characteristic using the aligned polarized images
A depth map generation unit generates a depth map from images obtained by picking up a subject at a plurality of viewpoint positions with an image pickup unit. On the basis of the depth map generated by the depth map generation unit, an alignment unit aligns polarized images obtained by the image pickup unit picking up the subject at the plurality of viewpoint positions through polarizing filters in different polarization directions at the different viewpoint positions. A polarization characteristic acquisition unit acquires a polarization characteristic of the subject from a desired viewpoint position by using the polarized images aligned by the alignment unit, thereby obtaining a high-precision polarization characteristic with little degradation in temporal or spatial resolution. It becomes possible to acquire the polarization characteristic of the subject at the desired viewpoint position.
Camera module and depth information extraction method therefor
A camera module according to one embodiment of the present invention comprises: a lighting unit for outputting an output light signal emitted toward an object; a lens unit including an infrared (IR) filter and at least one lens arranged on the IR filter, and condensing an input light signal reflected from the object; a tilting unit for shifting the optical path of the input light signal by controlling the tilt of the IR filter; an image sensor unit for generating an electric signal from the input light signal condensed by the lens unit and shifted by the tilting unit; an image control unit for extracting depth information of the object by using a phase difference between the output light signal and the input light signal received by the image sensor unit; and a detection unit for detecting tilt information of the IR filter and providing the tilt information of the IR filter to the image control unit.
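The image control unit above converts the phase difference between the output and input light signals into distance. The abstract names no formula, but the standard indirect time-of-flight relation is d = c·φ / (4π·f_mod), with an unambiguous range of c / (2·f_mod); a minimal sketch under that assumption:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_depth(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect time-of-flight: distance from the measured phase shift
    between the emitted and received modulated light,
    d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum distance before the phase wraps around: c / (2 * f_mod)."""
    return C / (2.0 * mod_freq_hz)

# at 20 MHz modulation, a phase shift of pi is half the
# unambiguous range: ~3.747 m out of ~7.495 m
print(round(tof_depth(math.pi, 20e6), 3))   # ~3.747
print(round(unambiguous_range(20e6), 3))    # ~7.495
```

A higher modulation frequency improves depth precision but shrinks the unambiguous range, which is why multi-frequency measurement is common in such modules.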
System and method for concurrent odometry and mapping
An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose. The localized pose corrects drift in the estimated pose generated by the motion tracking module.