Patent classifications
H04N5/2256
Systems, methods, and media for encoding and decoding signals used in time of flight imaging
In accordance with some embodiments, systems, methods, and media for encoding and decoding signals used in time-of-flight imaging are provided. In some embodiments, a method for estimating the depth of a scene is provided, comprising: causing a light source to emit modulated light toward the scene based on a modulation function; causing an image sensor to generate a first value based on the modulated light and a first demodulation function of K demodulation functions; causing the image sensor to generate a second value; causing the image sensor to generate a third value; and determining a depth estimate for a portion of the scene based on the first value, the second value, the third value, and three correlation functions each including at least one half of a trapezoid wave.
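The three-sample decoding step described above can be sketched with a conventional continuous-wave time-of-flight model. This is only an illustrative assumption: it uses sinusoidal correlation functions sampled at phase offsets of 0°, 120°, and 240°, whereas the patent's correlation functions are built from trapezoid-wave halves, and the function name is hypothetical.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def tof_depth(v1, v2, v3, mod_freq_hz):
    """Estimate depth from three correlation samples taken at
    demodulation phase offsets of 0, 120, and 240 degrees.
    Illustrative sinusoidal-correlation model only; the patent's
    decoding uses trapezoid-wave correlation functions instead."""
    # Recover the phase delay of the returned modulation envelope.
    phase = math.atan2(math.sqrt(3.0) * (v2 - v3), 2.0 * v1 - v2 - v3)
    phase %= 2.0 * math.pi
    # Phase delay encodes the round trip, so halve for one-way depth.
    return C * phase / (4.0 * math.pi * mod_freq_hz)
```

With this model, the ambient offset common to all three samples cancels in both atan2 arguments, which is why three samples suffice to recover the phase.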
Image processing apparatus, image processing system and non-transitory computer-readable storage medium
An image processing apparatus includes an image acquisition part that acquires a plurality of different measured images, a modeling part that identifies, for each pixel, a modeled parameter of an approximation function that approximates a data sequence in which the pixel values of the corresponding pixel in the respective measured images are arranged in capturing order, a reconstructed image generation part that generates reconstructed images, which are images corresponding to the respective measured images reconstructed from the approximation value of each pixel identified based on the modeled parameter of that pixel, and an image changing part that changes the pixel values of the measured images based on statistics of the pixel values of the measured images and those of the corresponding reconstructed images.
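The modeling and reconstruction steps above can be sketched for a single pixel. The abstract does not specify the approximation function, so this sketch assumes a simple straight-line fit over the capture order; the function names are illustrative.

```python
def model_pixel_series(values):
    """Fit a line a + b*t to one pixel's value sequence (frames in
    capture order) -- a stand-in for the patent's unspecified
    'approximation function'. Returns the modeled parameter (a, b)."""
    n = len(values)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_v = sum(values) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, values))
    den = sum((t - mean_t) ** 2 for t in ts)
    b = num / den if den else 0.0
    a = mean_v - b * mean_t
    return a, b

def reconstruct(params, n_frames):
    """Regenerate the pixel's value in each frame from its model."""
    a, b = params
    return [a + b * t for t in range(n_frames)]
```

Running the model over every pixel yields the reconstructed images; comparing statistics of measured versus reconstructed values then drives the pixel-value changes described in the abstract.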
DISTANCE MEASUREMENT DEVICE, DISTANCE MEASUREMENT METHOD, AND DISTANCE MEASUREMENT PROGRAM
A distance measurement device includes an imaging unit which captures a subject image formed by an imaging optical system, an emission unit which emits directional light, that is, light having directivity, along an optical axis direction of the imaging optical system, a light receiving unit which receives reflected light of the directional light from the subject, a derivation unit which derives a distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received, a display unit which displays the subject image, and a control unit which performs control such that, when a distance measurement is performed, the display unit displays the subject image as a motion image and a transition is made, at the end of the distance measurement, to a state in which actual exposure by the imaging unit is possible.
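The derivation unit's timing-based computation reduces to the round-trip time-of-flight relation d = c·Δt/2: the directional light travels to the subject and back, so the one-way distance is half the speed of light times the elapsed time. A minimal sketch (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light (m/s)

def distance_from_timing(t_emit_s, t_receive_s):
    """Derive subject distance from emission and reception timing.
    The measured interval covers the round trip, so halve it."""
    return C * (t_receive_s - t_emit_s) / 2.0
```

For example, a subject 10 m away returns the reflected light roughly 66.7 ns after emission.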
CONTROL APPARATUS, ACCESSORY, IMAGING APPARATUS, AND IMAGING SYSTEM
A control apparatus includes a controller configured to switch between a simultaneous light emission mode, which causes all of at least three light source units to emit light, and a sequential light emission mode, which causes the at least three light source units to emit light sequentially in association with imaging. In the sequential light emission mode, the controller causes the at least three light source units to emit light sequentially in synchronization with each of at least three imaging signals transmitted from the imaging apparatus. The controller is configured to switch between the simultaneous light emission mode and the sequential light emission mode based on information from a selector that is provided in at least one of the imaging apparatus and the accessory and is configured to select between the two modes.
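The two emission modes and their synchronization with imaging signals can be sketched as a small controller. All names, and the idea that the controller returns the source indices to fire per imaging signal, are assumptions for illustration; the patent describes hardware behavior, not this interface.

```python
from enum import Enum

class EmissionMode(Enum):
    SIMULTANEOUS = "simultaneous"
    SEQUENTIAL = "sequential"

class LightController:
    """Sketch of the described controller for three light sources."""

    def __init__(self, n_sources=3):
        self.n_sources = n_sources
        self.mode = EmissionMode.SIMULTANEOUS
        self._next = 0  # next source to fire in sequential mode

    def select_mode(self, mode):
        # Mode selection comes from a selector on the imaging
        # apparatus or the accessory.
        self.mode = mode
        self._next = 0

    def on_imaging_signal(self):
        """Called once per imaging signal; returns indices of the
        source units to fire for this frame."""
        if self.mode is EmissionMode.SIMULTANEOUS:
            return list(range(self.n_sources))
        idx = self._next
        self._next = (self._next + 1) % self.n_sources
        return [idx]
```

In sequential mode, three consecutive imaging signals fire sources 0, 1, and 2 in turn, matching the one-source-per-imaging-signal synchronization in the abstract.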
Information processing apparatus and information processing method
To effectively utilize a speckle pattern, an information processing apparatus includes a light output unit and an acquisition unit. The light output unit outputs a plurality of light beams, which generate speckle patterns, toward a plurality of locations on objects within an imaging range. The acquisition unit acquires, on a location-by-location basis, the speckle patterns formed by scattering of the plurality of light beams striking the plurality of locations.
Imaging apparatus and image sensor
To increase the space efficiency of an imaging apparatus that handles a speckle pattern, an imaging apparatus includes a first image sensor and a second image sensor. The first image sensor images light entering it from an object through an optical system to generate image data of the object. The second image sensor images a speckle pattern, formed by scattering of light striking the object and entering the second image sensor through the optical system, to generate image data of the speckle pattern. In the imaging apparatus, the first image sensor and the second image sensor are placed side by side in an optical axis direction of the optical system.
Photographing apparatus and method
Provided are a photographing apparatus and a photographing method. The photographing apparatus includes a fisheye lens; a plurality of light emitters disposed around the fisheye lens and configured to emit light at different angles with respect to an optical axis of the fisheye lens; and an image sensor configured to receive the light emitted from the plurality of light emitters and reflected by at least one object, and to convert the light into an electric signal including depth information about the object.
IMAGE SENSOR CAPABLE OF ENHANCING IMAGE RECOGNITION AND APPLICATION OF THE SAME
An image sensor capable of enhancing image recognition, and an application of the same. The image sensor includes: a photosensitive pixel array that captures full-color RGB visible light and infrared (IR) invisible light and performs photoelectric conversion; a packaging circuit, electrically connected to the photosensitive pixel array, that drives the photosensitive pixel array to capture outside light and convert it into a combined image signal; and an image-enhancement processing unit, embedded in the packaging circuit, that controls and regulates the image captured by the photosensitive pixel array. The captured image includes a full-color RGB visible-light wide-range image signal and at least two infrared (IR) invisible-light narrow-range image signals. The two kinds of image signals are superimposed and combined into a clear output image having a stereoscopic sense of layers.