H04N13/211

PASSIVE THREE-DIMENSIONAL IMAGE SENSING BASED ON REFERENTIAL IMAGE BLURRING
20210352263 · 2021-11-11

Techniques are described for passive three-dimensional image sensing based on referential image blurring. For example, a filter mask is integrated with a lens assembly to provide one or more normal imaging bandpass (NIB) regions and one or more reference imaging bandpass (RIB) regions, the regions being optically distinguishable and corresponding to different focal lengths and/or different focal paths. As light rays from a scene object pass through the different regions of the filter mask, a sensor can detect first and second images responsive to those light rays focused through the NIB region and the RIB region, respectively (according to their respective focal lengths and/or respective focal paths). An amount of blurring between the images can be measured and correlated to an object distance for the scene object. Some embodiments project additional reference illumination to enhance blurring detection in the form of reference illumination flooding and/or spotted illumination.
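The core step described above — measuring relative blur between the NIB and RIB images and correlating it with object distance — can be sketched as follows. This is a minimal illustration, not the patented method: the variance-of-Laplacian sharpness proxy and the monotonic calibration table are assumptions standing in for whatever blur metric and correlation the actual system uses.

```python
import numpy as np

def blur_metric(image):
    """Sharpness proxy: variance of a discrete 5-point Laplacian
    (higher value = sharper image, i.e. less blur)."""
    lap = (np.roll(image, 1, 0) + np.roll(image, -1, 0) +
           np.roll(image, 1, 1) + np.roll(image, -1, 1) - 4 * image)
    return lap.var()

def distance_from_blur(nib_image, rib_image, calibration):
    """Correlate the relative blur between the image formed through the
    NIB region and the image formed through the RIB region with object
    distance, using a precomputed monotonic calibration table
    (blur ratio -> distance) for the particular lens assembly."""
    ratio = blur_metric(rib_image) / blur_metric(nib_image)
    ratios, distances = calibration  # sorted calibration arrays
    return np.interp(ratio, ratios, distances)
```

Because the two regions have different focal lengths and/or focal paths, the blur ratio varies monotonically with distance over the working range, which is what makes a single lookup table sufficient in this sketch.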

Image pickup apparatus, image correction method, and medium storing program
11317014 · 2022-04-26

An image pickup apparatus has a first optical system transmitting light through a first optical path and a second optical system transmitting light through a second optical path, the second optical path having parallax with respect to the first optical path; a switcher switching between the first and second optical paths in time series; an image sensor forming a first image and a second image by capturing subject images from the light transmitted through the first and second optical paths, respectively; and a processor processing a signal output from the image sensor. The processor calculates a motion vector in each divided region of a first base image from the first base image and a first reference image, interpolates an interpolation motion vector for each pixel of the first base image from the motion vectors, and corrects the first base image or the first reference image to form a prediction image of the first image.
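The interpolation step above — going from one motion vector per divided region to one per pixel — can be sketched with bilinear upsampling. The function name and the choice of bilinear (rather than some other) interpolation are illustrative assumptions:

```python
import numpy as np

def interpolate_motion_vectors(region_mv, image_shape):
    """Upsample per-region motion vectors (H_r x W_r x 2) to a per-pixel
    field (H x W x 2) by bilinear interpolation between region centers."""
    hr, wr, _ = region_mv.shape
    h, w = image_shape
    # map each pixel to fractional coordinates in the region grid,
    # treating each region's motion vector as sampled at its center
    ys = np.clip((np.arange(h) + 0.5) * hr / h - 0.5, 0, hr - 1)
    xs = np.clip((np.arange(w) + 0.5) * wr / w - 0.5, 0, wr - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, hr - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, wr - 1)
    fy = (ys - y0)[:, None, None]   # vertical interpolation weights
    fx = (xs - x0)[None, :, None]   # horizontal interpolation weights
    top = region_mv[y0][:, x0] * (1 - fx) + region_mv[y0][:, x1] * fx
    bot = region_mv[y1][:, x0] * (1 - fx) + region_mv[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy
```

The resulting dense field can then be used to warp the base or reference image toward the prediction image, one motion vector per pixel.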

Camera module and depth information obtaining method therefor

A camera module including a lighting unit configured to output an incident light signal to be emitted to an object, a lens unit configured to concentrate a reflected light signal reflected from the object, an image sensor unit configured to generate electric signals from the reflected light signal concentrated by the lens unit, a tilting unit configured to shift an optical path of at least one of the incident light signal and the reflected light signal for each image frame in units of subpixels of the image sensor unit, and an image control unit configured to extract depth information of the object using a phase difference between the incident light signal and the reflected light signal. The image control unit includes an image controller configured to extract, on the basis of a plurality of subframes generated using the electric signals, depth information having a higher resolution than that of the subframes.
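The phase-difference depth extraction above follows the standard indirect time-of-flight relation, depth = c·φ / (4π·f_mod), where the 4π (rather than 2π) accounts for the round trip. A minimal sketch; the four-bucket sample ordering in `phase_from_samples` is a common convention but an assumption here, not taken from the abstract:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_from_samples(a0, a1, a2, a3):
    """Four-bucket demodulation: correlation samples taken at
    0, 90, 180, and 270 degrees of the modulation period."""
    return math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)

def depth_from_phase(phase_rad, mod_freq_hz):
    """Indirect ToF: depth = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)
```

The subpixel optical-path shifting then supplies slightly offset subframes of such depth values, which the controller combines into a higher-resolution depth map.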

Dynamic aperture positioning for stereo endoscopic cameras

A stereoscopic endoscope comprises at least one image sensor for sensing a first image and a second image of a pair of stereo images. The first image is sensed based on light passing through a first aperture within the stereoscopic endoscope, and the second image is sensed based on light passing through a second aperture within the stereoscopic endoscope. The stereoscopic endoscope comprises a liquid crystal layer disposed between two layers of glass that comprise a first arrangement of electrodes, such that each of the first aperture and the second aperture is created in the liquid crystal layer using a portion of the first arrangement of electrodes, wherein a spacing between the first aperture and the second aperture, and a polarization state associated with each of the first and second apertures, are controlled using corresponding control signals provided through the first arrangement of electrodes.
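The controllable spacing between the two apertures is the stereo baseline, which ties disparity to depth through the standard pinhole stereo relation Z = f·B/d. A hypothetical helper, not taken from the patent, showing why a dynamic baseline is useful (larger spacing gives larger disparity, hence finer depth resolution at a given distance):

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Pinhole stereo relation Z = f * B / d, with B the aperture
    spacing (stereo baseline) in meters and f the focal length
    expressed in pixels. Returns depth in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```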

Passive three-dimensional image sensing based on referential image blurring
11831858 · 2023-11-28

Techniques are described for passive three-dimensional image sensing based on referential image blurring. For example, a filter mask is integrated with a lens assembly to provide one or more normal imaging bandpass (NIB) regions and one or more reference imaging bandpass (RIB) regions, the regions being optically distinguishable and corresponding to different focal lengths and/or different focal paths. As light rays from a scene object pass through the different regions of the filter mask, a sensor can detect first and second images responsive to those light rays focused through the NIB region and the RIB region, respectively (according to their respective focal lengths and/or respective focal paths). An amount of blurring between the images can be measured and correlated to an object distance for the scene object. Some embodiments project additional reference illumination to enhance blurring detection in the form of reference illumination flooding and/or spotted illumination.

Passive three-dimensional image sensing based on referential image blurring with spotted reference illumination
11831859 · 2023-11-28

Techniques are described for passive three-dimensional image sensing based on referential image blurring. For example, a filter mask is integrated with a lens assembly to provide one or more normal imaging bandpass (NIB) regions and one or more reference imaging bandpass (RIB) regions, the regions being optically distinguishable and corresponding to different focal lengths and/or different focal paths. As light rays from a scene object pass through the different regions of the filter mask, a sensor can detect first and second images responsive to those light rays focused through the NIB region and the RIB region, respectively (according to their respective focal lengths and/or respective focal paths). An amount of blurring between the images can be measured and correlated to an object distance for the scene object. Some embodiments project additional reference illumination to enhance blurring detection in the form of reference illumination flooding and/or spotted illumination.

Method for epipolar time of flight imaging

Energy-efficient epipolar imaging is applied to the ToF domain to significantly expand the versatility of ToF sensors. The described system exhibits 15+ m range outdoors in bright sunlight; robustness to global transport effects such as specular and diffuse inter-reflections; interference-free 3D imaging in the presence of many ToF sensors, even when they are all operating at the same optical wavelength and modulation frequency; and blur- and distortion-free 3D video in the presence of severe camera shake. The described embodiments are broadly applicable in consumer and robotics domains.