H04N23/957

Multi-processor support for array imagers

Using the techniques discussed herein, a set of images is captured by one or more array imagers (106). Each array imager includes multiple imagers configured in various manners, and captures multiple images of substantially the same scene at substantially the same time. The images captured by each array imager are encoded by multiple processors (112, 114): each processor can encode sets of images captured by a different array imager, or each processor can encode different sets of images captured by the same array imager. The encoding is performed using various image-compression techniques so that the information resulting from the encoding is smaller, in terms of storage size, than the uncompressed images.
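The division of encoding work across processors can be sketched as follows. This is a minimal illustration, using a thread pool as a stand-in for the multiple processors and zlib as a stand-in image codec; the function names and data are illustrative, not from the disclosure.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def encode_image(raw: bytes) -> bytes:
    """Stand-in encoder: compress raw pixel data so the result is
    smaller, in storage size, than the uncompressed image."""
    return zlib.compress(raw, level=6)

def encode_image_sets(image_sets, n_processors=2):
    """Distribute image sets across workers; each worker encodes a
    different set (mirroring one processor per set of images)."""
    with ThreadPoolExecutor(max_workers=n_processors) as pool:
        return list(pool.map(
            lambda imgs: [encode_image(img) for img in imgs],
            image_sets))

# Two sets of synthetic images of "substantially the same scene".
sets = [[bytes(512) for _ in range(3)] for _ in range(2)]
encoded = encode_image_sets(sets)
assert all(len(e) < 512 for s in encoded for e in s)  # smaller than raw
```

A production system would substitute a real image or video codec for zlib; the work-partitioning pattern is the same either way.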

IMAGING DEVICE INCLUDING SHAKE CORRECTION MECHANISM, AND OPERATION METHOD AND OPERATION PROGRAM THEREOF

A lens-interchangeable digital camera includes an image sensor on whose imaging surface a subject image is formed through an imaging optical system including a zoom lens, and a sensor-movement-type shake correction mechanism that performs a sensor movement operation of moving the image sensor in a direction that cancels a shake. A zoom operation determination unit determines whether or not a zoom operation, in which the zoom lens moves, is being performed. In a case where the zoom operation determination unit determines that the zoom operation is being performed, an operation deciding unit prohibits a shift operation, which is at least a part of the sensor movement operation that is allowed while the zoom operation is stopped.
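The gating logic described above reduces to a simple state check: the shift operation is permitted only while the zoom operation is stopped. A minimal sketch, with illustrative class and attribute names:

```python
class ShakeCorrectionController:
    """Sketch of the operation-deciding logic: the shift part of the
    sensor-movement shake correction is prohibited while a zoom
    operation is in progress (names are illustrative)."""

    def __init__(self):
        self.zoom_in_progress = False  # set by the zoom determination unit

    def shift_allowed(self) -> bool:
        # Shift is allowed only while the zoom operation is stopped.
        return not self.zoom_in_progress

ctrl = ShakeCorrectionController()
assert ctrl.shift_allowed()         # zoom stopped -> shift permitted
ctrl.zoom_in_progress = True
assert not ctrl.shift_allowed()     # zoom running -> shift prohibited
```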

Holographic superimposition of real world plenoptic opacity modulation through transparent waveguide arrays for light field, virtual and augmented reality

Disclosed are transparent energy relay waveguide systems for the superimposition of holographic opacity modulation states for holographic, light field, virtual, augmented and mixed reality applications. The light field system may comprise one or more energy waveguide relay systems with one or more energy modulation elements, each energy modulation element configured to modulate energy passing therethrough, whereby the energy passing therethrough may be directed according to 4D plenoptic functions or inverses thereof.
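A 4D plenoptic function parameterizes radiance by spatial position and angular direction. The following toy sketch represents it as a discrete 4D array and looks up the energy directed along one plenoptic coordinate; the shapes and sampling are illustrative, not taken from the disclosure.

```python
import numpy as np

# Discrete 4D plenoptic function L(x, y, u, v): radiance indexed by
# spatial position (x, y) and angular direction (u, v).
rng = np.random.default_rng(42)
L = rng.random((8, 8, 4, 4))

def sample_ray(x: int, y: int, u: int, v: int) -> float:
    """Radiance an energy-modulating waveguide element would direct
    along the 4D plenoptic coordinate (x, y, u, v)."""
    return float(L[x, y, u, v])

r = sample_ray(2, 3, 1, 0)
assert r == L[2, 3, 1, 0]
```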

HIGH DYNAMIC RANGE LIGHT-FIELD IMAGING
20170332000 · 2017-11-16 ·

A high dynamic range light-field image may be captured through the use of a light-field imaging system. In a first sensor of the light-field imaging system, first image data may be captured at a first exposure level. In the first sensor or in a second sensor of the light-field imaging system, second image data may be captured at a second exposure level greater than the first exposure level. In a data store, the first image data and the second image data may be received. In a processor, the first image data and the second image data may be combined to generate a light-field image with high dynamic range.
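The combination step can be sketched as a standard two-exposure HDR merge, assuming the two captures are registered: use the longer exposure where it is unsaturated (better signal-to-noise in shadows) and fall back to the gain-corrected short exposure in highlights. The saturation threshold and gain are illustrative assumptions.

```python
import numpy as np

def merge_hdr(short_img, long_img, gain, saturation=0.95):
    """Combine a low-exposure capture (first sensor) with a higher-
    exposure capture (second sensor) into one high-dynamic-range image."""
    short_lin = short_img * gain          # scale short exposure to the
                                          # long exposure's radiance level
    use_long = long_img < saturation      # long exposure valid (unclipped)
    return np.where(use_long, long_img, short_lin)

short = np.array([0.1, 0.2, 0.25])        # first exposure level
long_ = np.array([0.4, 0.8, 1.0])         # second level, 4x exposure
hdr = merge_hdr(short, long_, gain=4.0)
assert np.allclose(hdr, [0.4, 0.8, 1.0])  # clipped pixel recovered from short
```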

THIN-FILM OPTICAL SYSTEM
20230168564 · 2023-06-01 ·

A planar optical element (e.g., a camera) is provided comprising a diverter for diverting light from an object into an imaging plane; a planar lens waveguide in the imaging plane, receiving the diverted light and focusing it onto a line; and a sensor line located on the focus line, for forming a one-dimensional image of the object. Many such elements can be applied to a planar substrate at different angles, and the one-dimensional inputs Fourier-analysed to reconstruct the desired two-dimensional image. The elements may be transparent, so that the substrate may be a display screen; eliminating the need to locate a camera to the side of the screen. The elements can cover all or most of the screen, and a subset chosen at any given time to constitute the camera. The system can also be run backwards as a projector, with light-emitting elements instead of sensors.
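The reconstruction of a two-dimensional image from one-dimensional line images at different angles rests on the projection-slice theorem: the 1D Fourier transform of a projection equals a slice through the object's 2D Fourier transform. A minimal numerical check at angle 0 (synthetic scene; not the element geometry from the disclosure):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((16, 16))                # the 2D object to recover

projection = scene.sum(axis=0)              # 1D image from one sensor line
slice_1d = np.fft.fft(projection)           # 1D FFT of that projection
slice_2d = np.fft.fft2(scene)[0, :]         # corresponding slice of 2D FFT

assert np.allclose(slice_1d, slice_2d)      # projection-slice theorem holds
```

With enough such slices at different element angles, the 2D spectrum is filled in and the scene recovered by an inverse 2D transform, which is the Fourier analysis the abstract refers to.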

LOW LATENCY NETWORKING OF PLENOPTIC DATA
20230171396 · 2023-06-01 ·

One embodiment provides a method, including: obtaining, using at least one image-obtaining sensor of an image capture device, image data; tracking, using a gaze detection sensor, a user’s gaze on a display of the image capture device, wherein a plurality of lens tiles are disposed on the display and each of the plurality of lens tiles covers a portion of the display; predicting, using a processor and based on the tracking, a subsequent gaze location on the display toward which the user’s gaze is expected to be directed; transmitting, using a processor, additional image data for the portion of the display associated with the lens tiles the user is currently viewing and for another portion of the display associated with a new set of the lens tiles; and updating, responsive to identifying that the user’s gaze has transitioned to the subsequent gaze location and using the additional image data, the image data associated with the other portion of the display corresponding to the new set of lens tiles. Other embodiments are described herein.
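The predict-then-transmit step can be sketched as follows. The tile grid, the linear-extrapolation gaze predictor, and the function names are illustrative assumptions; the claim does not specify a prediction model.

```python
def predict_next_tile(prev_tile: int, current_tile: int, grid_w: int) -> int:
    """Linearly extrapolate gaze motion across the lens-tile grid to
    guess the subsequent gaze location."""
    pr, pc = divmod(prev_tile, grid_w)
    cr, cc = divmod(current_tile, grid_w)
    return (cr + (cr - pr)) * grid_w + (cc + (cc - pc))

def tiles_to_transmit(prev_tile: int, current_tile: int, grid_w: int) -> set:
    """Send additional image data only for the currently viewed tile
    and the predicted next tile, keeping latency low."""
    return {current_tile, predict_next_tile(prev_tile, current_tile, grid_w)}

# Gaze moved from tile 0 to tile 1 on a 4-wide grid: predict tile 2.
assert tiles_to_transmit(0, 1, grid_w=4) == {1, 2}
```

Pre-sending data for the predicted tile means the display can update immediately when the gaze transition is detected, rather than waiting a network round trip.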

Stereoscopic camera

Stereoscopic cameras include two wide-angle lenses, such as panoramic lenses, stacked one above the other to create 3D images and maps with very wide fields of view of the environment. The cameras may include panoramic annular lenses (PALs) that capture a 360-degree view of the environment. Image processing is used, on a frame-by-frame basis, to map the apparent distance to all features within the scene. The camera may be operated to produce a video or map output in which each pixel has not only red (R), green (G), and blue (B) values but also a depth (D) value.
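The per-pixel depth follows from standard stereo triangulation over the vertical baseline between the stacked lenses: D = f·b / disparity. A minimal sketch, with illustrative focal length, baseline, and RGBD packing:

```python
def depth_from_disparity(disparity_px: float,
                         focal_px: float = 800.0,
                         baseline_m: float = 0.1) -> float:
    """Triangulate apparent distance from the vertical disparity
    between the upper and lower images (illustrative parameters)."""
    return focal_px * baseline_m / disparity_px

def rgbd_pixel(r: int, g: int, b: int, disparity_px: float) -> tuple:
    """Each output pixel carries R, G, B plus a depth (D) value."""
    return (r, g, b, depth_from_disparity(disparity_px))

px = rgbd_pixel(128, 64, 32, disparity_px=16.0)
assert px == (128, 64, 32, 5.0)   # 800 * 0.1 / 16 = 5 m
```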

Light Field Imaging Device and Method for Depth Acquisition and Three-Dimensional Imaging
20220057550 · 2022-02-24 ·

A light field imaging device and method are provided. The device can include a diffraction grating assembly receiving a wavefront from a scene and including one or more diffraction gratings, each having a grating period along a grating axis and diffracting the wavefront to generate a diffracted wavefront. The device can also include a pixel array disposed under the diffraction grating assembly and detecting the diffracted wavefront in a near-field diffraction regime to provide light field image data about the scene. The pixel array has a pixel pitch along the grating axis that is smaller than the grating period. The device can further include a color filter array disposed over the pixel array to spatio-chromatically sample the diffracted wavefront prior to detection by the pixel array. The device and method can be implemented in backside-illuminated sensor architectures. Diffraction grating assemblies for use in the device and method are also disclosed.
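The stated sampling constraint — the pixel pitch along the grating axis smaller than the grating period — means each period of the near-field diffraction pattern is sampled by more than one pixel. A trivial check with illustrative dimensions:

```python
def pitch_ok(pixel_pitch_um: float, grating_period_um: float) -> bool:
    """True when the pixel array can resolve the near-field diffraction
    pattern: pitch along the grating axis below the grating period."""
    return pixel_pitch_um < grating_period_um

assert pitch_ok(pixel_pitch_um=1.0, grating_period_um=2.0)       # valid design
assert not pitch_ok(pixel_pitch_um=2.0, grating_period_um=1.5)   # undersampled
```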

Multi-aperture ranging devices and methods

Embodiments of systems and methods for multi-aperture ranging are disclosed. An embodiment of a device includes a main lens configured to receive an image from its field of view; a multi-aperture optical component having optical elements optically coupled to the main lens and configured to create a multi-aperture image set that includes a plurality of subaperture images, wherein at least one point in the field of view is captured by at least two of the subaperture images; an array of sensing elements that generates signals; a readout integrated circuit (ROIC) configured to receive the signals, convert them to digital data, and output the digital data; and an image processing system, responsive to the digital data output from the ROIC, configured to generate disparity values corresponding to at least one point in common between the at least two subaperture images.
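Generating a disparity value for a point seen by two subaperture images can be sketched as a 1D block-matching search: slide a small window from one subaperture view along the other and keep the shift with the lowest sum of absolute differences. The window size, search range, and synthetic data are illustrative assumptions.

```python
import numpy as np

def disparity_1d(left_row, right_row, x, win=2, max_d=4):
    """Best horizontal shift of a small window around column x,
    by sum-of-absolute-differences over candidate disparities."""
    ref = left_row[x - win: x + win + 1]
    costs = [np.abs(ref - right_row[x - d - win: x - d + win + 1]).sum()
             for d in range(max_d + 1)]
    return int(np.argmin(costs))

row = np.arange(20, dtype=float)
left = row
right = np.roll(row, -3)   # same point lands 3 px earlier in this view
assert disparity_1d(left, right, x=10) == 3
```

Real subaperture images need rectification and sub-pixel refinement, but the disparity values produced this way feed directly into range estimation via triangulation.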