H10F39/8027

SOLID-STATE IMAGING DEVICE

An imaging device includes a substrate, a photoelectric conversion section disposed in the substrate, an element isolation region disposed adjacent to the photoelectric conversion section, a floating diffusion electrically connected to the photoelectric conversion section, an amplification transistor having a gate electrode and an active region, and a contact section disposed on the gate electrode of the amplification transistor. The contact section overlaps the active region of the amplification transistor. The floating diffusion is electrically connected to the gate electrode of the amplification transistor via the contact section. The width of the gate electrode of the amplification transistor is larger than the width of the active region of the amplification transistor. The photoelectric conversion section includes a first type impurity, and the element isolation region includes a second type impurity having a conductivity opposite to that of the first type impurity.

SEMICONDUCTOR IMAGE SENSOR MODULE AND METHOD OF MANUFACTURING THE SAME
20170195602 · 2017-07-06

A CMOS type semiconductor image sensor module with an improved pixel aperture ratio and improved chip use efficiency, in which simultaneous shutter operation by all the pixels is also made possible, and a method for manufacturing such a semiconductor image sensor module, are provided. The semiconductor image sensor module is provided by stacking a first semiconductor chip, which has an image sensor in which a plurality of pixels each composed of a photoelectric conversion element and a transistor are arranged, and a second semiconductor chip, which has an A/D converter array. Preferably, a third semiconductor chip having a memory element array is also stacked. Furthermore, the semiconductor image sensor module is provided by stacking the first semiconductor chip having the image sensor and a fourth semiconductor chip having an analog nonvolatile memory array.

TIME-OF-FLIGHT DETECTION PIXEL

A time-of-flight detection pixel includes a photosensitive area including a first doped layer and a charge collection area extending in the first doped layer. At least two charge storage areas extend from the charge collection area, each including a first well more heavily doped than the charge collection area and separated from the charge collection area by a first portion of the first doped layer, which is coated with a gate. Each charge storage area is laterally delimited by two insulated conductive electrodes extending parallel to and facing each other. A second, heavily doped layer of opposite conductivity coats the pixel except at each portion of the first doped layer coated with a gate.
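Pixels with two alternately gated charge storage areas like this are commonly used for indirect time-of-flight ranging: the reflected light pulse is integrated into two bins, and the distance is recovered from the charge ratio. The following sketch shows that standard two-tap calculation, not this patent's specific circuit; the pulse width and bin charges are illustrative values.

```python
C = 299_792_458.0  # speed of light, m/s

def two_tap_distance(q1: float, q2: float, pulse_width_s: float) -> float:
    """Estimate distance from the charges collected in two storage bins.

    q1: charge gated in phase with the emitted pulse
    q2: charge gated in the immediately following window
    The echo delay is proportional to the fraction of the pulse
    that spills into the second bin.
    """
    if q1 + q2 == 0:
        raise ValueError("no signal collected")
    delay = pulse_width_s * q2 / (q1 + q2)   # round-trip time of flight
    return C * delay / 2                     # halved: light travels out and back

# Example: a 30 ns pulse whose charge splits 3:1 between the two bins
d = two_tap_distance(3.0, 1.0, 30e-9)  # roughly 1.12 m
```

In practice the bin charges would come from correlated readout of the two wells, with background light subtracted before the ratio is taken.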

TIME-OF-FLIGHT DETECTION PIXEL

A pixel is formed on a semiconductor substrate that includes a photosensitive area having a first doped layer and a charge collection area of a first conductivity type extending through at least part of the first doped layer. At least two charge storage areas, each including a well of the first conductivity type, are separated from the charge collection area at least by a first portion of the first layer. The first portion is covered by a first gate. Each charge storage area is laterally delimited by two insulated conductive electrodes. A second doped layer, of a second conductivity type opposite to the first, covers the charge collection area and the charge storage areas.

Array type light-receiving device and hyperspectral spectrometer including array type light-receiving device

An array type light-receiving device includes a plurality of pixels two-dimensionally arranged in a first direction and a second direction perpendicular to the first direction, each of the pixels including a light-receiving layer having a responsivity to a wavelength of light. The pixels arranged in the second direction constitute a plurality of pixel lines extending in the second direction, the plurality of pixel lines being arranged in the first direction to form an array. The pixels in each of the pixel lines have different pixel areas from each other. In addition, the pixel area of each of the pixels included in at least one of the pixel lines is determined in accordance with the responsivity to a wavelength of light received by each of the pixels.
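Sizing each pixel's area in inverse proportion to its spectral responsivity, so that every pixel delivers a comparable signal level, can be sketched as a simple equal-signal rule. This is an illustrative sizing heuristic, not the patent's design procedure; the responsivity values are made up.

```python
def equalizing_areas(responsivities, reference_area=1.0):
    """Scale pixel areas inversely to responsivity so that
    signal = responsivity * area is the same for every pixel.

    Areas are normalized so the most responsive pixel gets
    `reference_area`.
    """
    r_max = max(responsivities)
    return [reference_area * r_max / r for r in responsivities]

# Example: three pixel lines whose responsivity falls with wavelength
areas = equalizing_areas([1.0, 0.5, 0.25])
# The product responsivity * area is 1.0 for every pixel,
# so each line sees the same signal from equal irradiance.
```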

Image Sensors Including Non-Aligned Grid Patterns
20170186805 · 2017-06-29

An image sensor includes a substrate having a first surface and a second surface, and a first device isolation layer disposed in the substrate that defines a plurality of pixels and has a lower surface adjacent the first surface of the substrate and an upper surface adjacent the second surface of the substrate. Each of the pixels includes a photoelectric conversion element, a floating diffusion region adjacent the first surface of the substrate, and a grid pattern on the second surface of the substrate. At least one of the grid patterns is not vertically aligned with the first device isolation layer.

SEMICONDUCTOR IMAGE SENSOR MODULE AND METHOD OF MANUFACTURING THE SAME
20170187977 · 2017-06-29

A CMOS type semiconductor image sensor module with an improved pixel aperture ratio and improved chip use efficiency, in which simultaneous shutter operation by all the pixels is also made possible, and a method for manufacturing such a semiconductor image sensor module, are provided. The semiconductor image sensor module is provided by stacking a first semiconductor chip, which has an image sensor in which a plurality of pixels each composed of a photoelectric conversion element and a transistor are arranged, and a second semiconductor chip, which has an A/D converter array. Preferably, a third semiconductor chip having a memory element array is also stacked. Furthermore, the semiconductor image sensor module is provided by stacking the first semiconductor chip having the image sensor and a fourth semiconductor chip having an analog nonvolatile memory array.

IMAGE SENSOR USING NANOWIRE AND METHOD OF MANUFACTURING THE SAME

Disclosed is an image sensor using a nanowire. The sensor includes a substrate; a photodetector that senses incident light and produces a photocurrent whose magnitude varies with the intensity of the incident light; a signal processing module that outputs a photodetection current carrying information about the presence or absence of incident light and its intensity, based on the presence and magnitude of the photocurrent; and an electrode formed on the photodetector and the signal processing module that electrically connects the two. The photodetector and the signal processing module are formed on the substrate, and the photodetector is formed of at least one silicon nanowire.

Multi-mode power-efficient light and gesture sensing in image sensors
09692968 · 2017-06-27

Various embodiments provide apparatuses and methods that include an image sensor. In one example, the image sensor includes a read-out integrated circuit, a plurality of pixel electrodes, an optically sensitive layer, and a top electrical contact. In a first, low-power mode, the electrical current passing through the top electrical contact is configured to be sensed, while the currents passing through the plurality of pixel electrodes are configured not to be sensed independently. In a second, high-resolution mode, the currents passing through the plurality of pixel electrodes are configured to be sensed independently. Additional methods and apparatuses are described.
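The two read-out modes described, a low-power mode that senses only the aggregate current through the shared top contact and a high-resolution mode that senses each pixel electrode independently, can be sketched as a toy model. The class, signal values, and mode names below are illustrative assumptions, not the patent's circuit.

```python
class MultiModeSensor:
    """Toy model of a sensor film with a common top contact.

    In 'low_power' mode only the summed current through the top
    contact is reported (one conversion for the whole array); in
    'high_res' mode each pixel electrode current is reported
    independently (one conversion per pixel).
    """
    def __init__(self, pixel_currents):
        self.pixel_currents = list(pixel_currents)
        self.mode = "low_power"

    def read(self):
        if self.mode == "low_power":
            # aggregate sensing: cheap, no spatial resolution
            return sum(self.pixel_currents)
        # independent per-pixel sensing: full spatial resolution
        return list(self.pixel_currents)

sensor = MultiModeSensor([0.1, 0.3, 0.2])
total = sensor.read()        # low-power: a single aggregate value
sensor.mode = "high_res"
per_pixel = sensor.read()    # high-resolution: one value per pixel
```

A gesture or ambient-light wake-up path would poll `read()` in the cheap mode and switch to the per-pixel mode only when the aggregate signal indicates activity.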

Image sensor and electronic device having the same
09691800 · 2017-06-27

An image sensor includes a substrate including photoelectric conversion elements for a plurality of unit pixels that are two-dimensionally arranged in a pixel array; a light transmission member on the substrate; a grid structure having multiple layers in the light transmission member; and a light collection member on the light transmission member. The grid structure is tilted to match the respective chief ray angles of the plurality of unit pixels according to their locations in the pixel array.
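Tilting the grid per chief ray angle means matching each pixel's grid to the angle at which the lens delivers light at that location; the chief ray angle (CRA) grows roughly with radial distance from the optical center. The sketch below uses a simple linear CRA model; the linearity, the maximum angle, and the array dimensions are illustrative assumptions, not values from the patent.

```python
import math

def chief_ray_angle_deg(x, y, cx, cy, r_max, cra_max_deg=30.0):
    """Approximate chief ray angle at pixel (x, y), assuming the CRA
    increases linearly with radial distance from the optical center
    (cx, cy) and reaches cra_max_deg at the array corner r_max."""
    r = math.hypot(x - cx, y - cy)
    return cra_max_deg * min(r / r_max, 1.0)

# A grid matched to each pixel would lean by this angle toward
# the optical center: upright in the middle, tilted at the edges.
center_tilt = chief_ray_angle_deg(500, 500, 500, 500, r_max=707.1)
corner_tilt = chief_ray_angle_deg(1000, 1000, 500, 500, r_max=707.1)
```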