Patent classifications
H04N9/04
IMAGING DEVICE
An imaging device includes a first substrate including a sensor pixel, a second substrate including a pixel circuit that outputs a pixel signal on the basis of electric charges output from the sensor pixel, and a third substrate including a processing circuit that performs signal processing on the pixel signal. The first substrate, the second substrate, and the third substrate are stacked in this order. A semiconductor layer including the pixel circuit is divided by an insulating layer. The insulating layer divides the semiconductor layer such that, in at least one direction in a plane of the sensor pixel perpendicular to the optical axis direction, a center position of a continuous region of the semiconductor layer, or a center position of a region that divides the semiconductor layer, corresponds to a position of an optical center of the sensor pixel.
Image Sensor and Image Apparatus
An image capturing element according to the present disclosure includes a pixel array formed by a plurality of pixels arranged in an array on a substrate, each of the plurality of pixels including a photoelectric conversion element; a transparent layer formed on the pixel array; and a spectroscopic element array formed by a plurality of spectroscopic elements arranged in an array, each of the plurality of spectroscopic elements being disposed, inside or on the transparent layer, at a position corresponding to one of the plurality of pixels. Each of the plurality of spectroscopic elements includes a plurality of microstructures formed from a material having a refractive index higher than that of the transparent layer, the plurality of microstructures forming a microstructure pattern. Each of the plurality of spectroscopic elements separates incident light into deflected light beams having different propagation directions according to wavelength and emits the deflected light beams.
IMAGE SENSOR, IMAGING DEVICE, ELECTRONIC DEVICE, IMAGE PROCESSING SYSTEM AND SIGNAL PROCESSING METHOD
Embodiments of the present disclosure are directed to an image sensor. The image sensor includes a color filter array, a pixel array, and a plurality of analog-to-digital converters (ADCs). The ADCs are configured to convert analog pixel signals obtained by pixels corresponding to first color filters into digital pixel signals at a first bit precision, and to convert analog pixel signals obtained by pixels corresponding to second color filters and third color filters into digital pixel signals at a second bit precision. The second bit precision is lower than the first bit precision.
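The per-color bit-precision scheme above can be sketched as a simple uniform quantizer that picks its resolution from the pixel's filter color. This is a minimal illustration; the 12-bit/10-bit split, the "G" label for the first color, and the full-scale voltage are assumptions, not details from the patent.

```python
def quantize(voltage, full_scale, bits):
    """Uniformly quantize an analog voltage to a digital code at the given bit precision."""
    levels = 2 ** bits
    code = int(voltage / full_scale * (levels - 1))
    return max(0, min(levels - 1, code))  # clamp to the valid code range


# Hypothetical assignment: first color filters ("G") at a higher precision,
# second/third color filters ("R", "B") at a lower precision.
HIGH_BITS, LOW_BITS = 12, 10


def convert_pixel(voltage, color, full_scale=1.0):
    """ADC model: choose bit precision by color filter, then quantize."""
    bits = HIGH_BITS if color == "G" else LOW_BITS
    return quantize(voltage, full_scale, bits)
```

Because the second precision is lower, the same analog level maps to fewer codes for the second and third colors, which is the data-reduction trade-off the abstract describes.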
PIXELATED PROGRAMMABLE NEUTRAL DENSITY FILTER ARRAY
In some aspects, a device may receive, from a pixel array of a camera, a first image. The device may configure, based at least in part on the first image, a setting of a filter. The filter may be included within a filter array that is arranged within the camera in association with the pixel array. The device may cause the pixel array to capture a second image. Numerous other aspects are described.
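One way to read the configure-then-recapture loop above: use the first image to choose a per-tile attenuation setting for the programmable ND filter array before the second capture. The sketch below is a hypothetical control policy (the target level, the number of discrete settings, the halving-per-step filter response, and the one-tile-per-pixel mapping are all assumptions for illustration).

```python
import numpy as np


def configure_nd_filters(first_image, target=0.5, steps=4):
    """Choose a per-tile ND attenuation setting from a preview frame.

    first_image: 2-D array of normalized intensities in [0, 1], one entry per
    filter tile (a simplifying assumption). Returns an integer setting in
    0..steps-1 per tile, where each step halves the transmitted light
    (hypothetical filter response).
    """
    img = np.asarray(first_image, dtype=float)
    # How many halvings bring each tile down to the target level?
    ratio = np.maximum(img / target, 1.0)       # never brighten: at least ratio 1
    setting = np.clip(np.round(np.log2(ratio)), 0, steps - 1)
    return setting.astype(int)
```

Tiles already at or below the target keep the filter fully open (setting 0); brighter tiles are attenuated more strongly, so the second image captured through the configured array has more uniform exposure.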
IMAGE SENSOR AND METHOD OF OPERATING THE SAME
An image sensor and a method of operating the same are provided. The image sensor includes a semiconductor substrate of a first conductivity type; a photoelectric conversion region provided in the semiconductor substrate and doped to have a second conductivity type; a first floating diffusion region provided to receive photocharges accumulated in the photoelectric conversion region; a transfer gate electrode disposed between and connected to the first floating diffusion region and the photoelectric conversion region; a dual conversion gain transistor disposed between and connected to the first floating diffusion region and a second floating diffusion region; and a reset transistor disposed between and connected to the second floating diffusion region and a pixel power voltage region, wherein a channel region of the reset transistor has a potential gradient increasing in a direction from the second floating diffusion region toward the pixel power voltage region.
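The dual conversion gain arrangement above can be illustrated numerically: when the dual conversion gain transistor connects the second floating diffusion region, its capacitance adds to that of the first, lowering the conversion gain (volts per electron). A minimal sketch; the capacitance values are assumptions chosen for illustration, not figures from the patent.

```python
Q_E = 1.602e-19  # elementary charge in coulombs


def pixel_output_voltage(n_electrons, c_fd1, c_fd2=None):
    """Floating-diffusion voltage V = n*q/C for high or low conversion gain.

    High conversion gain: charge is read on FD1 alone.
    Low conversion gain: the dual-conversion-gain transistor connects FD2,
    adding its capacitance and lowering volts per electron.
    Capacitances in farads (hypothetical example values).
    """
    c_total = c_fd1 + (c_fd2 or 0.0)
    return n_electrons * Q_E / c_total


# Conversion gain in microvolts per electron for 1 fF alone vs 1 fF + 4 fF:
hcg = pixel_output_voltage(1, 1e-15) * 1e6          # ~160 uV/e-
lcg = pixel_output_voltage(1, 1e-15, 4e-15) * 1e6   # ~32 uV/e-
```

The high-gain path favors low-light sensitivity, while the low-gain path extends full-well capacity for bright scenes, which is the usual motivation for a dual conversion gain pixel.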
Solid state imaging device and electronic apparatus
A solid state imaging device includes a pixel array unit in which color filters of a plurality of colors are arrayed in same-color units of four pixels (2 vertical pixels × 2 horizontal pixels) that receive light of the same color. Shared pixel transistors that are commonly used by a plurality of pixels are concentrated in one predetermined pixel of each sharing unit, and the color of the color filter of the pixel where the shared pixel transistors are concentrated is a predetermined color among the plurality of colors. The present technology can be applied, for example, to a solid state imaging device such as a back-illuminated CMOS image sensor.
Imaging element, stacked-type imaging element, and solid-state imaging apparatus
There is provided an imaging element including a photoelectric conversion unit formed by stacking a first electrode 21, a photoelectric conversion layer, and a second electrode. The photoelectric conversion unit further includes a charge storage electrode 24 that has a region 24a facing the first electrode 21 via an insulating layer 82, and a transfer control electrode 25 that faces the first electrode 21 and the charge storage electrode 24 via the insulating layer 82. The photoelectric conversion layer is disposed above at least the charge storage electrode 24 via the insulating layer 82.
Image sensor and operating method thereof
An image sensor includes: a pixel array including a plurality of pixels divided into a plurality of binning areas; a readout circuit configured to, from the plurality of binning areas, receive a plurality of pixel signals including a first sensing signal of first pixels and a second sensing signal of second pixels during a single frame period and output a first pixel value corresponding to the first pixels and a second pixel value corresponding to the second pixels based on the plurality of pixel signals; and an image signal processor configured to generate first image data based on a plurality of first pixel values corresponding to the plurality of binning areas, generate second image data based on a plurality of second pixel values corresponding to the plurality of binning areas, and generate output image data by merging the first image data with the second image data.
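The readout above produces, for each binning area, one value from the first pixels and one from the second pixels within a single frame. A minimal sketch of that per-group binning step; the mask-based pixel grouping, square binning areas, and plain averaging are assumptions for illustration, not the patent's actual readout.

```python
import numpy as np


def bin_pixels(frame, bin_size, mask):
    """Average the masked pixels inside each bin_size x bin_size binning area.

    frame: full-resolution pixel array; mask: boolean array marking which
    pixels belong to this pixel group (e.g. the "first pixels").
    Returns one value per binning area.
    """
    h, w = frame.shape
    bh, bw = h // bin_size, w // bin_size
    out = np.zeros((bh, bw))
    for i in range(bh):
        for j in range(bw):
            ys = slice(i * bin_size, (i + 1) * bin_size)
            xs = slice(j * bin_size, (j + 1) * bin_size)
            out[i, j] = frame[ys, xs][mask[ys, xs]].mean()
    return out
```

Running this once with the first-pixel mask and once with the second-pixel mask yields the two per-binning-area images that the image signal processor then merges into the output image (for example, when the two groups use different exposures, by a weighted combination).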
Depth and vision sensors for challenging agricultural environments
Provided is a method for three-dimensional imaging of a plant in an indoor agricultural environment having an ambient light power spectrum that differs from the power spectrum of natural outdoor light. The method comprises directing a spatially separated stereo pair of cameras at a scene including the plant, illuminating the scene with a non-uniform pattern provided by a light projector utilizing light in a frequency band having a lower than average ambient intensity in the indoor agricultural environment, filtering light entering the image sensors of each of the cameras with filters that selectively pass light in the frequency band utilized by the light projector, capturing an image of the scene with each of the cameras to obtain first and second camera images, and generating a depth map including a depth value corresponding to each pixel in the first camera image.
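Once the two filtered camera images are matched, each depth value in the map follows from the standard stereo triangulation relation Z = f·B/d. A minimal sketch of that last step; the focal length and baseline values in the test are illustrative, and the disparity is assumed to come from matching the projected pattern between the two images.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth in meters from stereo disparity via Z = f * B / d.

    disparity_px: pixel disparity between the first and second camera images;
    focal_px: focal length expressed in pixels; baseline_m: camera separation
    in meters.
    """
    if disparity_px <= 0:
        return float("inf")  # no match, or a point at infinity
    return focal_px * baseline_m / disparity_px
```

The projected non-uniform pattern matters precisely here: plant foliage is often low-texture, and the pattern gives the stereo matcher reliable correspondences from which to compute the disparity.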
Enhanced fluorescence imaging for imaging system
A fluorescence imaging system is configured to generate a video image onto a display. The system includes a light source for emitting infrared light and white light, an infrared image sensor for capturing infrared image data, and a white light image sensor for capturing white light image data. Data processing hardware performs operations that include filtering the infrared image data with a first digital finite impulse response (FIR) filter configured to produce a magnitude response of zero at a horizontal Nyquist frequency and a vertical Nyquist frequency. The operations also include filtering the infrared image data with a second digital FIR filter configured with a phase response to spatially align the white light image data with the infrared image data. The operations also include combining the white light image data and the infrared image data into combined image data and transmitting the combined image data to the display.
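The first filter's requirement, zero magnitude response at both the horizontal and vertical Nyquist frequencies, is satisfied by any separable kernel whose 1-D prototype has a zero at ω = π. A minimal sketch using the classic [0.25, 0.5, 0.25] taps; this particular tap set is an assumption chosen to meet the stated constraint, not the patent's actual coefficients.

```python
import numpy as np


def nyquist_zero_kernel():
    """Separable 3x3 FIR kernel with zero response at both Nyquist frequencies.

    The 1-D taps [0.25, 0.5, 0.25] give H(pi) = 0.25 - 0.5 + 0.25 = 0, so the
    2-D outer product nulls the horizontal and vertical Nyquist frequencies.
    """
    h = np.array([0.25, 0.5, 0.25])
    return np.outer(h, h)


def freq_response(h, w):
    """Magnitude of the 1-D frequency response H(w) = sum_n h[n] * exp(-j*w*n)."""
    n = np.arange(len(h))
    return abs(np.sum(h * np.exp(-1j * w * n)))
```

Nulling the Nyquist frequencies suppresses the highest-frequency sampling artifacts in the infrared channel; the second FIR filter described in the abstract would additionally carry a designed phase response (e.g. a fractional-sample delay) so the infrared data lands in spatial registration with the white light data before the two are combined.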