H04N23/16

Systems and Methods for Estimating Depth and Visibility from a Reference Viewpoint for Pixels in a Set of Images Captured from Different Viewpoints

Systems in accordance with embodiments of the invention can perform parallax detection and correction in images captured using array cameras. Due to the different viewpoints of the cameras, parallax results in variations in the position of objects within the captured images of the scene. Methods in accordance with embodiments of the invention provide an accurate account of the pixel disparity due to parallax between the different cameras in the array, so that appropriate scene-dependent geometric shifts can be applied to the pixels of the captured images when performing super-resolution processing. In a number of embodiments, generating depth estimates considers the similarity of pixels in multiple spectral channels. In certain embodiments, generating depth estimates involves generating a confidence map indicating the reliability of depth estimates.
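The cost-volume idea sketched in the abstract (per-pixel matching over candidate disparities, with a confidence map from the cost margin) can be illustrated for the simplest two-camera case. This is a hypothetical simplification: the patented method aggregates costs over many cameras and spectral channels, and `estimate_depth_map` and its cost-margin confidence are assumptions for illustration.

```python
import numpy as np

def estimate_depth_map(ref, alt, max_disp):
    """Per-pixel disparity between a reference view and one alternate
    view, plus a simple confidence map (minimal two-camera sketch)."""
    h, w = ref.shape
    costs = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # cost of the hypothesis "this pixel shifted by d between views"
        costs[d, :, : w - d] = np.abs(ref[:, d:] - alt[:, : w - d])
    disparity = costs.argmin(axis=0)
    two_best = np.partition(costs, 1, axis=0)[:2]
    # margin between best and second-best cost: larger = more reliable
    confidence = two_best[1] - two_best[0]
    return disparity, confidence
```

Disparities found this way give the scene-dependent geometric shifts that super-resolution processing must undo before fusing the images.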

VEHICLE ASSISTANCE SYSTEMS

The disclosure describes an example vehicle assistance system including a light sensor, a pixelated filter array adjacent the light sensor, and a full-field optically-selective element adjacent the pixelated filter array. The optically-selective element is configured to selectively direct an optical component of light incident on the optically-selective element across the pixelated filter array to the light sensor.

Configurable platform

A fluorescence imaging system for imaging an object includes: a white light provider that emits white light; an excitation light provider that emits excitation light in a plurality of excitation wavebands for causing the object to emit fluorescent light; a component that directs the white light and the excitation light to the object and collects reflected white light and emitted fluorescent light from the object; a filter that blocks light in the excitation wavebands and transmits at least a portion of the reflected white light and the fluorescent light; and an image sensor assembly that receives the transmitted reflected white light and fluorescent light.

IMAGING APPARATUS, SIGNAL PROCESSING APPARATUS, SIGNAL PROCESSING METHOD, AND PROGRAM
20210144345 · 2021-05-13

Provided is an imaging apparatus including: a splitter that splits incident light into pieces of light of two or more wavelength bands; and two or more detectors that respectively detect the pieces of light of the two or more wavelength bands and output signals from which wavelength information can be tunably extracted in post-processing.

3 MOS CAMERA

A 3 MOS camera includes a first prism that has a first reflection film which reflects IR light so that a first image sensor receives the IR light, a second prism that has a second reflection film which reflects A % (A: a predetermined real number) of visible light so that a second image sensor receives the A % of visible light, a third prism that causes a third image sensor to receive the remaining (100−A) % of visible light, and a video signal processor that combines a first video signal, a second video signal, and a third video signal of an observation part. The video signal processor performs pixel shifting on one of the second and third video signals, which have substantially the same brightness, to generate a fourth video signal, and outputs a video signal obtained by combining the fourth video signal and the first video signal.
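The pixel-shifting step can be sketched for the simplest geometry: two equally bright visible-light frames whose sensors are offset by half a pixel horizontally, interleaved into one double-width frame. Column interleaving is only one plausible reading of the abstract, which does not specify the sub-pixel geometry; `pixel_shift_combine` is a hypothetical helper.

```python
import numpy as np

def pixel_shift_combine(vis_a, vis_b):
    """Interleave two equally bright images from sensors offset by half
    a pixel horizontally, doubling the horizontal sampling rate."""
    h, w = vis_a.shape
    combined = np.empty((h, 2 * w), dtype=vis_a.dtype)
    combined[:, 0::2] = vis_a   # samples on the second sensor's grid
    combined[:, 1::2] = vis_b   # samples half a pixel to the right
    return combined
```

The combined (fourth) visible signal would then be fused with the IR channel from the first sensor.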

Method for Obtaining Light Source Spectrum and Device

A method for obtaining a light source spectrum includes: obtaining first information in a current photographing scene, where the first information includes at least one of a first image generated by a red, green, and blue (RGB) sensor or a light intensity of light received by each pixel on a first multispectral sensor; inputting the first information into a first model to obtain a probability that a light source in the current photographing scene belongs to each type of light source; and determining a spectrum of the light source in the current photographing scene based on the probability that the light source in the current photographing scene belongs to each type of light source and a spectrum of each type of light source.
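Once the model has produced per-class probabilities, the final step can be read as a probability-weighted blend of known reference spectra. The combination rule below is an assumption for illustration; the abstract only says the spectrum is determined "based on" the probabilities and the per-class spectra.

```python
import numpy as np

def estimate_light_source_spectrum(class_probs, reference_spectra):
    """Blend per-class reference spectra by the model's class
    probabilities (one plausible reading of the final step)."""
    p = np.asarray(class_probs, dtype=float)
    p = p / p.sum()                      # renormalize defensively
    spectra = np.asarray(reference_spectra, dtype=float)
    return p @ spectra                   # (n_classes,) @ (n_classes, n_bins)
```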

High resolution thin multi-aperture imaging systems

A multi-aperture imaging system comprising a first camera with a first sensor that captures a first image and a second camera with a second sensor that captures a second image, the two cameras having either identical or different FOVs. The first sensor may have a standard color filter array (CFA) covering one sensor section and a non-standard CFA covering another. The second sensor may have sections covered by either a clear (CFA-less) filter or a standard CFA. Either image may be chosen as the primary or the auxiliary image based on a zoom factor. An output image with a point of view determined by the primary image is obtained by registering the auxiliary image to the primary image.
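The registration step is left unspecified in the abstract; a minimal stand-in is translation-only phase correlation between the primary and auxiliary frames. `register_translation` below is an assumed sketch (integer shifts only, single channel), not the patented registration.

```python
import numpy as np

def register_translation(primary, auxiliary):
    """Estimate the integer (dy, dx) shift that aligns the auxiliary
    image to the primary image via phase correlation."""
    cross = np.fft.fft2(primary) * np.conj(np.fft.fft2(auxiliary))
    cross /= np.abs(cross) + 1e-12       # keep phase, drop magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    h, w = primary.shape
    # peaks past the midpoint correspond to negative shifts (wrap-around)
    return (dy - h if dy > h // 2 else dy,
            dx - w if dx > w // 2 else dx)
```

Applying the estimated shift to the auxiliary image brings it into the primary image's point of view before the two are fused.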

Miniature telephoto lens assembly

An optical lens assembly includes five lens elements and provides a total track length to effective focal length ratio TTL/EFL < 1.0. In an embodiment, the focal length f1 of the first lens element satisfies f1 < TTL/2, an air gap between the first and second lens elements is smaller than half the second lens element thickness, an air gap between the third and fourth lens elements is greater than TTL/5, and an air gap between the fourth and fifth lens elements is smaller than about 1.5 times the fifth lens element thickness. All lens elements may be aspheric.
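The abstract's dimensional relations amount to a set of inequalities a candidate prescription must satisfy, which can be checked mechanically. The function and the sample values in the usage note are illustrative assumptions, not a real design; all lengths must share one unit (e.g. millimetres).

```python
def telephoto_constraints_ok(ttl, efl, f1, gap12, t2, gap34, gap45, t5):
    """Check a candidate five-element design against the dimensional
    relations stated in the abstract."""
    return (ttl / efl < 1.0          # compact telephoto: TTL/EFL < 1.0
            and f1 < ttl / 2         # short first-element focal length
            and gap12 < t2 / 2       # small first air gap
            and gap34 > ttl / 5      # large third air gap
            and gap45 < 1.5 * t5)    # moderate fourth air gap
```

For example, a made-up candidate with TTL = 5.9, EFL = 6.5, f1 = 2.5, a 0.1 first gap against a 0.3-thick second element, a 1.3 third gap, and a 0.4 fourth gap against a 0.3-thick fifth element passes all five checks.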

DIGITAL CAMERAS WITH DIRECT LUMINANCE AND CHROMINANCE DETECTION

Digital camera systems and methods are described that provide a color digital camera with direct luminance detection. The luminance signals are obtained directly from a broadband image sensor channel without interpolation of RGB data. The chrominance signals are obtained from one or more additional image sensor channels comprising red and/or blue color band detection capability. The red and blue signals are directly combined with the luminance image sensor channel signals. The digital camera generates and outputs an image in YCrCb color space by directly combining outputs of the broadband, red and blue sensors.
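The direct-combination idea can be sketched as forming YCrCb straight from the three sensor channels, with no demosaicing of RGB data. The BT.601-style scale factors below are assumptions for illustration; the patent does not specify the coefficients used to combine the channels.

```python
import numpy as np

def ycrcb_from_channels(y_broadband, red, blue):
    """Build a YCrCb image directly from a broadband luminance sensor
    and red/blue sensor channels, skipping RGB interpolation."""
    y = np.asarray(y_broadband, dtype=float)
    cr = 0.713 * (np.asarray(red, dtype=float) - y)   # Cr proportional to R - Y
    cb = 0.564 * (np.asarray(blue, dtype=float) - y)  # Cb proportional to B - Y
    return np.stack([y, cr, cb], axis=-1)
```

Because luminance comes from its own broadband channel, full-resolution Y is available without the interpolation losses of a mosaiced RGB sensor.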
