Patent classifications
H04N9/01
IMAGE SENSOR AND ELECTRONIC APPARATUS INCLUDING THE IMAGE SENSOR
An image sensor includes a sensor substrate including first, second, third, and fourth pixels, and a color separating lens array, wherein each of the first pixels includes a first focusing signal region and a second focusing signal region that independently generate focusing signals, and the first focusing signal region and the second focusing signal region are arranged to be adjacent to each other in the first pixel in a first direction, and each of the fourth pixels includes a third focusing signal region and a fourth focusing signal region that independently generate focusing signals, and the third focusing signal region and the fourth focusing signal region are arranged to be adjacent to each other in the fourth pixel in a second direction that is different from the first direction.
CONTROL METHOD, CAMERA ASSEMBLY, AND MOBILE TERMINAL
Provided are a control method, a camera assembly, and a mobile terminal. The control method is implemented by an image sensor including a two-dimensional pixel array and a lens array. The two-dimensional pixel array includes multiple color pixels and multiple panchromatic pixels. The color pixels have narrower spectral responses than the panchromatic pixels. The two-dimensional pixel array includes minimum repeating units, each of which includes multiple sub-units, and each of the multiple sub-units includes at least two color pixels of the multiple color pixels and at least two panchromatic pixels of the multiple panchromatic pixels. The lens array includes multiple lenses, each of which covers a corresponding one sub-unit of the sub-units. The control method includes: obtaining phase information of different pixels of the corresponding one sub-unit covered by the lens; and calculating a phase difference according to the phase information of the different pixels, to perform focusing.
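The phase-difference step of the control method can be sketched as follows. This is a minimal illustration that assumes a 2x2 sub-unit under one shared lens and a simple left/right column split; the function name and the pixel layout are assumptions for illustration, not the patent's actual procedure.

```python
import numpy as np

def phase_difference(subunit: np.ndarray) -> float:
    """Approximate horizontal phase disparity for a 2x2 sub-unit
    covered by a single lens.

    Light arriving through the left half of the lens falls mostly on the
    left pixel column and vice versa, so the difference between the two
    column averages approximates the phase difference used for focusing.
    """
    left = subunit[:, 0].mean()   # pixels under the left half of the lens
    right = subunit[:, 1].mean()  # pixels under the right half of the lens
    return float(right - left)

# In focus: both lens halves see similar intensity, difference near zero.
in_focus = np.array([[10.0, 10.0], [12.0, 12.0]])
out_of_focus = np.array([[10.0, 18.0], [12.0, 20.0]])
print(phase_difference(in_focus))      # 0.0
print(phase_difference(out_of_focus))  # 8.0
```

A real sensor would aggregate such differences over many sub-units and map the result to a lens displacement.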
Image sensor and electronic device including the same
An image sensor includes: a light detector including a plurality of photosensitive cells configured to sense light; a color separation lens array provided above the light detector and including a plurality of pattern structures, the color separation lens array being configured to collect light having different wavelength spectra respectively on at least two photosensitive cells of the plurality of photosensitive cells; and a variable interlayer element configured to adjust an optical distance between the light detector and the color separation lens array.
IMAGE SENSOR AND IMAGING DEVICE
An image sensor includes: light receiving units disposed two-dimensionally on a substrate; color filters disposed on the light receiving units and including at least one of: a blue color filter for passing both of blue light and blue-violet light; a cyan color filter for passing both of green light and the blue-violet light; and a magenta color filter for passing both of red light and the blue-violet light; a first film arranged on a light receiving unit on which the cyan color filter is disposed, among the light receiving units, the first film having a peak of reflectivity near 450 nm; and a second film arranged on a light receiving unit on which the magenta color filter is disposed, among the light receiving units, the second film having a peak of reflectivity between 450 nm and 500 nm.
VEHICLE-MOUNTED CAMERA SYSTEM
In a vehicle-mounted camera system, appropriate white balance correction processing is applied, according to the situation, to a long-exposure image and a short-exposure image shot in varying illumination environments. The vehicle-mounted camera system includes a vehicle-mounted camera that performs relatively long-exposure shooting and short-exposure shooting, a signal processing device that executes white balance correction processing for each of the long-exposure image and the short-exposure image, combines the corrected images, and generates a high dynamic range image, and a system control unit (processing switch device) that determines the illumination environment in which the vehicle is placed and switches the white balance correction processing for the long-exposure image and for the short-exposure image according to that illumination environment.
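The switching idea can be sketched as below, with per-environment gain tables for the long- and short-exposure frames and a trivial blend standing in for the actual HDR composition. The environment labels, gain values, and function names are invented for illustration.

```python
import numpy as np

# environment -> (R,G,B gains for long-exposure frame, gains for short-exposure frame)
# Values are illustrative assumptions, not calibrated data.
WB_GAINS = {
    "daylight": ((1.0, 1.0, 1.2), (1.0, 1.0, 1.2)),
    # e.g. a tunnel lit by sodium lamps, where the short exposure is
    # dominated by headlights and needs different gains
    "tunnel_sodium": ((0.8, 1.0, 1.6), (1.1, 1.0, 1.1)),
}

def white_balance(frame: np.ndarray, gains) -> np.ndarray:
    """Multiply the R, G, B channels of an HxWx3 frame by per-channel gains."""
    return frame * np.asarray(gains)

def compose_hdr(long_frame: np.ndarray, short_frame: np.ndarray,
                environment: str) -> np.ndarray:
    """Correct each exposure with environment-specific gains, then combine.
    The 50/50 blend is a stand-in for a real HDR composition."""
    g_long, g_short = WB_GAINS[environment]
    long_wb = white_balance(long_frame, g_long)
    short_wb = white_balance(short_frame, g_short)
    return 0.5 * (long_wb + short_wb)
```

The key point from the abstract is only that the two exposures get independently switched corrections before composition; everything else here is scaffolding.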
METHODS, SYSTEMS, AND COMPUTER READABLE MEDIA FOR HARDWARE-IN-THE-LOOP PHASE RETRIEVAL FOR HOLOGRAPHIC NEAR EYE DISPLAYS
A method for learned hardware-in-the-loop phase retrieval for holographic near-eye displays includes generating simulated ideal output images of a holographic display. The method further includes capturing real output images of the holographic display. The method further includes learning a mapping between the simulated ideal output images and the real output images. The method further includes using the learned mapping to solve for an aberration compensating hologram phase and using the aberration compensating hologram phase to adjust a phase pattern of a spatial light modulator of the holographic display.
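The loop described above can be sketched as follows, with a fixed affine function standing in for the learned simulated-to-real mapping (in practice a trained network) and a Gerchberg-Saxton-style solver standing in for the patent's phase optimization. The propagation model, solver, and all parameters are illustrative assumptions.

```python
import numpy as np

def camera_model(ideal: np.ndarray) -> np.ndarray:
    """Stand-in for the learned mapping from simulated ideal display
    output to the real captured output (mild gain and offset aberration)."""
    return 0.9 * ideal + 0.01 * ideal.mean()

def compensate_target(target: np.ndarray) -> np.ndarray:
    """Invert the affine stand-in model, so driving the display toward
    the compensated 'ideal' image makes its real appearance match target."""
    m_ideal = target.mean() / 0.91           # implied mean of the ideal image
    return (target - 0.01 * m_ideal) / 0.9

def retrieve_phase(target_intensity: np.ndarray, iterations: int = 50) -> np.ndarray:
    """Gerchberg-Saxton-style retrieval of an SLM phase pattern whose
    far-field intensity approximates the compensated target."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_intensity.shape)
    target_amp = np.sqrt(np.maximum(compensate_target(target_intensity), 0.0))
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))            # propagate to image plane
        far = target_amp * np.exp(1j * np.angle(far))    # impose target amplitude
        phase = np.angle(np.fft.ifft2(far))              # back to the SLM plane
    return phase
```

In the actual hardware-in-the-loop scheme the camera model is learned from pairs of simulated and captured images and refined as new captures arrive, rather than fixed in advance.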
Imaging device and focusing control method
The present invention provides an imaging device and a focusing control method capable of improving the accuracy of focusing control, regardless of the subject, even when the levels of the detection signals of the phase difference detection pixels are low. A phase difference AF processing unit (19) generates a defocus amount (Df1) from the result of a correlation operation performed on detection signals obtained by adding up, for each of the plural phase difference detection pixels (52A, 52B) in a pair line (PL1) present in a selected AF area (53), the detection signals of those pixels and the detection signals of the phase difference detection pixels (52A, 52B) in pair lines (PL2, PL3) present in a direction selected from among plural directions crossing an X direction. A system control unit (11) performs focusing control based on the defocus amount (Df1).
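The correlation step can be sketched as below: paired line signals are summed to raise the signal-to-noise ratio at low light, and a shift search over a sum-of-absolute-differences cost yields the disparity. The SAD search and the shift-to-defocus conversion factor `k` are illustrative assumptions, not the patent's procedure.

```python
import numpy as np

def correlation_shift(sig_a: np.ndarray, sig_b: np.ndarray,
                      max_shift: int = 8) -> int:
    """Return the integer shift (in pixels) minimizing the mean absolute
    difference between two phase-detection line signals."""
    best_shift, best_cost = 0, np.inf
    n = len(sig_a)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)           # overlapping window
        cost = np.abs(sig_a[lo:hi] - sig_b[lo - s:hi - s]).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def defocus_amount(lines_a, lines_b, k: float = 0.5) -> float:
    """Add up the detection signals of several paired lines before the
    correlation operation (boosting SNR when signal levels are low), then
    convert the resulting shift to a defocus amount via an assumed
    sensor-specific factor k."""
    sum_a = np.sum(lines_a, axis=0)
    sum_b = np.sum(lines_b, axis=0)
    return k * correlation_shift(sum_a, sum_b)
```

Summing before correlating is the abstract's low-light measure: individual lines may be too noisy for a reliable minimum, while the summed pair still localizes it.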