H04N25/131

PHOTOSENSITIVE IMAGING DEVICES AND ASSOCIATED METHODS
20230335544 · 2023-10-19

A monolithic sensor for detecting infrared and visible light according to an example includes a semiconductor substrate and a semiconductor layer coupled to the semiconductor substrate. The semiconductor layer includes a device surface opposite the semiconductor substrate. A visible light photodiode is formed at the device surface. An infrared photodiode is also formed at the device surface and in proximity to the visible light photodiode. A textured region is coupled to the infrared photodiode and positioned to interact with electromagnetic radiation.

DUAL MODE CAMERA AND QUASI-BANDPASS FILTER
20230336847 · 2023-10-19

A dual mode camera may have a camera module and a light source. The camera module may include a quasi-bandpass filter for passing visible light and passing an attenuated portion of near-infrared light to an image sensor for detection. A processor may determine an ambient lighting condition corresponding with an amount of ambient visible light detected by a photodetector. In response to a first ambient lighting condition, the processor may send a first control signal to an encoder to encode image data in monochrome, and another signal to activate the light source. In response to a second ambient lighting condition, the processor may send a second control signal to the encoder to encode image data in color. The light source may emit a band of near-infrared light corresponding with an atmospheric absorption band. The quasi-bandpass filter may attenuate a portion of near-infrared light corresponding with the same atmospheric absorption band.
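The control flow described in this abstract amounts to a threshold test on ambient light. A minimal sketch follows; the lux threshold, the signal names, and the dictionary interface are illustrative assumptions and do not come from the patent.

```python
# Hypothetical sketch of the dual-mode control logic described above.
# LUX_THRESHOLD and the returned signal names are assumptions for illustration.

LUX_THRESHOLD = 10.0  # assumed lux level separating the two ambient conditions

def select_mode(ambient_lux: float) -> dict:
    """Return control signals for the encoder and the NIR light source."""
    if ambient_lux < LUX_THRESHOLD:
        # First ambient condition: low visible light -> monochrome encoding,
        # near-infrared light source activated.
        return {"encoding": "monochrome", "nir_light_source": "on"}
    # Second ambient condition: sufficient visible light -> color encoding,
    # near-infrared light source off.
    return {"encoding": "color", "nir_light_source": "off"}
```

For example, `select_mode(1.0)` selects monochrome encoding with NIR illumination, while `select_mode(100.0)` selects color encoding.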

Sensor array, method for calculating a color image and a hyperspectral image, method for carrying out a white balance and use of the sensor array in medical imaging

A sensor array (1) for recording a color image in the visible spectrum (8) together with hyperspectral information that is spatially linked with the color image, wherein the sensor array (1) includes an image sensor (2) composed of a plurality of photocells (3), wherein a color filter (4) is fixedly assigned to at least a portion of the photocells (3), wherein each photocell (3) is assigned to a subcell (5) and each subcell (5) is assigned to a supercell (6). Each subcell (5) has at least one additional filter of a channel, all the channels together cover at least the entire visible spectrum (8), and the characteristic wavelengths (9) of the filters belonging to a given channel differ between the subcells (5) of a supercell (6).
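The photocell → subcell → supercell hierarchy can be sketched as an index mapping. In the sketch below, the 2×2 subcell and supercell sizes and the wavelength values are illustrative assumptions only; the abstract requires merely that subcells of a supercell carry filters of the same channels with differing characteristic wavelengths.

```python
# Hypothetical sketch of the subcell/supercell hierarchy described above.
# Sizes and wavelengths are assumptions for illustration, not from the patent.

SUBCELL = 2    # assumed: 2x2 photocells per subcell
SUPERCELL = 2  # assumed: 2x2 subcells per supercell

# Each subcell carries the same channels, but with shifted characteristic
# wavelengths, so one supercell samples the visible spectrum densely.
CHANNEL_WAVELENGTHS_NM = {
    (0, 0): [450, 550, 650],
    (0, 1): [470, 570, 670],
    (1, 0): [490, 590, 690],
    (1, 1): [510, 610, 710],
}

def subcell_of(row: int, col: int) -> tuple:
    """Map a photocell coordinate to its subcell index within a supercell."""
    return ((row // SUBCELL) % SUPERCELL, (col // SUBCELL) % SUPERCELL)

def wavelengths_at(row: int, col: int) -> list:
    """Characteristic wavelengths of the channel filters covering this photocell."""
    return CHANNEL_WAVELENGTHS_NM[subcell_of(row, col)]
```

For example, photocell (0, 2) falls in subcell (0, 1) of its supercell and is therefore covered by the 470/570/670 nm filter set in this assumed layout.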

High-resolution automotive lens and sensor

A camera for use in automotive applications includes a lens system having a modulation transfer function (MTF) tuned to process light in a spectral range from red to green with greater resolution than light in a spectral range from blue to violet. The camera also includes an imager having pixel sensors arranged in a matrix and a color filter matrix including multiple color filter elements, each corresponding to a pixel sensor of the imager. The color filter matrix includes red filter elements and yellow filter elements and the number of yellow filter elements is greater than the number of red filter elements.
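The claimed filter-matrix constraint (more yellow than red elements) can be illustrated with a minimal tile; the 2×2 layout below is an assumption for illustration, not the patent's actual mosaic.

```python
# Illustrative 2x2 color-filter tile satisfying the claimed constraint that
# yellow filter elements outnumber red ones. The layout is an assumption.
TILE = [["R", "Y"],
        ["Y", "Y"]]

def count_elements(element: str, tile=TILE) -> int:
    """Count occurrences of a filter element in the tile."""
    return sum(row.count(element) for row in tile)
```

Tiling the sensor with this pattern yields three yellow elements for every red one, consistent with the claim.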

Depth acquisition device and depth acquisition method

A depth acquisition device includes a memory and a processor that performs: acquiring, from the memory, intensities of infrared light that was emitted from a light source, reflected by a subject, and measured by pixels of an imaging element; generating a depth image by calculating, for each pixel, the distance to the subject as a depth based on the intensity received by that pixel; acquiring, from the memory, a visible light image captured of substantially the same scene, from substantially the same viewpoint, and at substantially the same timing as the infrared light image; detecting, in the visible light image, an edge region containing an edge perpendicular to the direction of movement of the visible light image; and correcting, in the depth image, the depth of a target region corresponding to the edge region.
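The edge-based correction step can be sketched roughly as follows, assuming NumPy. The gradient threshold and the median replacement are placeholders for the correction the abstract leaves unspecified; only the overall flow (edge detection along the motion direction, then correction of the flagged depth regions) follows the description.

```python
import numpy as np

def correct_depth(depth: np.ndarray, visible: np.ndarray,
                  motion_axis: int = 1, edge_thresh: float = 0.2) -> np.ndarray:
    """Sketch of the described method: flag edges in the visible image that lie
    perpendicular to the motion direction, then correct the depth there.
    The median replacement is a stand-in for the unspecified correction."""
    # A gradient taken along the motion axis responds to edges that are
    # perpendicular to the direction of movement.
    grad = np.abs(np.gradient(visible.astype(float), axis=motion_axis))
    edge_mask = grad > edge_thresh
    corrected = depth.copy()
    # Placeholder correction: replace flagged depths with the global median
    # (a real implementation would use a local neighborhood).
    corrected[edge_mask] = np.median(depth)
    return corrected
```

Non-edge pixels pass through unchanged; only the depths in the flagged edge regions are rewritten.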

IMAGING DEVICE
20230283866 · 2023-09-07

An imaging device includes a first optical sensor, and a second optical sensor disposed on the side opposite to the light incidence side with respect to the first optical sensor and bonded to the first optical sensor. The first optical sensor includes a plurality of first pixels disposed two-dimensionally. The second optical sensor includes a plurality of second pixels disposed two-dimensionally. Each of the plurality of first pixels includes an embedded photodiode that generates charge in response to incidence of light in a first wavelength band. Each of the plurality of second pixels includes a charge generation region that generates charge in response to incidence of the light in a second wavelength band, a charge collection region to which the charge is transferred, a photogate electrode that attracts the charge, and a transfer gate electrode that transfers the charge to the charge collection region.
