Patent classifications
H04N25/611
Imaging device and imaging system outputting signals of light in different wavelength bands
The disclosed imaging device includes pixels each including a photoelectric converter, a focus controller that controls a focal position of light, and a pixel controller that controls charge accumulation in the photoelectric converters and readout of signals from the pixels. The pixels include a first pixel that outputs a signal corresponding to light in a first wavelength band and a second pixel that outputs a signal corresponding to light in a second wavelength band. During one frame, the pixel controller executes a first period in which charge is accumulated in the photoelectric converter of the first pixel while the light in the first wavelength band is in focus, a second period in which charge is accumulated in the photoelectric converter of the second pixel while the light in the second wavelength band is in focus, and a third period in which signals corresponding to the amount of charge generated in the photoelectric converters are read out.
COLOR FRINGING PROCESSING INDEPENDENT OF TONE MAPPING
Systems and methods are disclosed for image signal processing. For example, methods may include receiving an image from an image sensor; detecting, in a linear domain, color fringing areas in the image; correcting the detected color fringing areas to obtain a corrected image; applying tone mapping to the corrected image to obtain a tone-mapped image; and storing, displaying, or transmitting an output image based on at least the tone-mapped image.
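The key point of this abstract is that fringing is detected and corrected in the linear domain, before tone mapping. The sketch below illustrates one plausible linear-domain detector (not the patented method): it flags pixels where a strong luminance gradient coincides with blue and red dominating green, a common signature of purple fringing, and desaturates them toward green. All thresholds and the correction strategy are illustrative assumptions.

```python
import numpy as np

def detect_fringing_linear(img, edge_thresh=0.5, ratio_thresh=1.5):
    """Flag likely color-fringing pixels in a linear-domain RGB image.

    Hypothetical sketch: fringing is assumed where a strong luminance
    gradient coincides with blue/red well above green (purple fringing).
    `img` is an HxWx3 float array with values in [0, 1].
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Luminance gradient magnitude via simple finite differences.
    gy, gx = np.gradient(luma)
    edges = np.hypot(gx, gy) > edge_thresh
    # Chroma test: blue and red both well above green.
    eps = 1e-6
    fringe_color = (b > ratio_thresh * (g + eps)) & (r > ratio_thresh * (g + eps))
    return edges & fringe_color

def correct_fringing(img, mask):
    """Desaturate flagged pixels toward their green channel."""
    out = img.copy()
    g = img[..., 1]
    for c in (0, 2):
        out[..., c] = np.where(mask, g, img[..., c])
    return out
```

Because the data are still linear at this stage, the ratio test is not distorted by the tone curve; the same thresholds applied after tone mapping would behave differently in shadows and highlights.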
Image processing device, image processing method, program, and imaging device
A signal processing unit performs preprocessing, demosaic processing, and color reproduction processing on an image signal. A control unit detects an image area that has undergone a characteristic change exceeding a predetermined change amount, using the image signal before the color reproduction processing and the image signal after the color reproduction processing. The control unit associates area information indicating the detection result with the image signal after the color reproduction processing, and records the area information on a recording medium or outputs it to an external device. This makes it possible to detect an image area where a color change caused by color reproduction processing is unnatural.
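The detection step above compares the signal before and after color reproduction and marks areas whose change exceeds a threshold. A minimal sketch of such a block-wise comparison, with the tile size and threshold as illustrative assumptions (the patent does not specify either):

```python
import numpy as np

def changed_area_map(before, after, change_thresh=0.25, block=8):
    """Mark tiles whose mean absolute change between the signal before
    and after color reproduction exceeds `change_thresh`.

    Hypothetical sketch: `before`/`after` are HxWx3 float arrays; the
    returned map holds one boolean per `block` x `block` tile, which
    could serve as the recorded area information.
    """
    diff = np.abs(after.astype(np.float64) - before.astype(np.float64)).mean(axis=2)
    h, w = diff.shape
    area = np.zeros((h // block, w // block), dtype=bool)
    for ty in range(area.shape[0]):
        for tx in range(area.shape[1]):
            tile = diff[ty * block:(ty + 1) * block, tx * block:(tx + 1) * block]
            area[ty, tx] = tile.mean() > change_thresh
    return area
```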
CIRCUIT FOR CORRECTING CHROMATIC ABERRATION THROUGH SHARPENING
Embodiments relate to axial chromatic aberration (ACA) reduction in raw image data generated by image sensors. A chromatic aberration reduction circuit performs chromatic aberration reduction on the raw image data, correcting the ACA in the resulting full-color images through sharpening that is clamped to reduce sharpening overshoot.
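The abstract's core idea, sharpening whose result is clamped to suppress overshoot, can be sketched as an unsharp mask limited to the local neighborhood range. This is a minimal single-channel illustration under assumed parameters, not the patented circuit:

```python
import numpy as np

def clamped_sharpen(channel, strength=1.0):
    """Sharpen one channel with an unsharp mask, clamping the result to
    the local 3x3 min/max to suppress overshoot and undershoot.

    Hypothetical sketch of sharpening-with-clamping; kernel size and
    `strength` are illustrative assumptions.
    """
    h, w = channel.shape
    padded = np.pad(channel, 1, mode='edge')
    # All nine 3x3-shifted views, for the box blur and local min/max.
    stack = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    blur = stack.mean(axis=0)
    lo, hi = stack.min(axis=0), stack.max(axis=0)
    sharpened = channel + strength * (channel - blur)
    # Clamp: never exceed the neighborhood's original value range.
    return np.clip(sharpened, lo, hi)
```

On a step edge, the unclamped mask would undershoot below the dark side and overshoot above the bright side; the clamp pins both excursions to the local extrema, which is what keeps the sharpening from reintroducing halo-like false color.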
DIFFRACTIVE OPTICAL ELEMENT WITH UNDIFFRACTED LIGHT EXPANSION FOR EYE SAFE OPERATION
Aspects of the subject disclosure are directed towards safely projecting a diffracted light pattern, such as in an infrared laser-based projection/illumination system. Non-diffracted (zero-order) light is refracted once to diffuse (defocus) the non-diffracted light to an eye safe level. Diffracted (non-zero-order) light is aberrated twice, e.g., once as part of diffraction by a diffracting optical element encoded with a Fresnel lens (which does not aberrate the non-diffracted light), and another time to cancel out the other aberration; the two aberrations may occur in either order. Various alternatives include upstream and downstream positioning of the diffracting optical element relative to a refractive optical element, and/or refraction via positive and negative lenses.
IN-LINE CHROMATIC ABERRATION CORRECTION IN WIDE DYNAMIC RANGE (WDR) IMAGE PROCESSING PIPELINE
In an advanced driver-assistance system (ADAS), RAW sensor image processing is important for machine vision (MV) applications. Because different color components, e.g., red/green/blue (RGB), are focused by the lens at different locations in the image plane, lateral chromatic aberration may sometimes be observed, causing false color around edges in the final image output, especially high-contrast edges, which can impede MV applications. Disclosed herein are low-latency, efficient, optimized designs for chromatic aberration correction (CAC) components. An in-pipeline CAC design may be used to perform on-the-fly CAC without any out-of-pipeline memory traffic; enable use of wide dynamic range (WDR) sensors; use bicubic interpolation; support vertical and horizontal chromatic aberration color channel offsets; reduce CAC line memory requirements; and support flexible look-up table (LUT) down-sampling factors to improve the spatial precision of correction and accommodate popular image sensor resolutions.
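The correction primitive this abstract relies on is resampling a color plane at a small vertical/horizontal offset with bicubic interpolation, so that the red and blue planes land back on the green plane's grid. The sketch below shows that primitive with a Keys cubic-convolution kernel; the global constant offsets and sub-pixel range (|offset| < 1) are simplifying assumptions, whereas the patented design derives per-position offsets from a down-sampled LUT.

```python
import numpy as np

def cubic_kernel(t, a=-0.5):
    """Keys cubic-convolution kernel (a = -0.5: Catmull-Rom-like bicubic)."""
    t = np.abs(t)
    return np.where(t <= 1,
                    (a + 2) * t**3 - (a + 3) * t**2 + 1,
                    np.where(t < 2,
                             a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a,
                             0.0))

def shift_channel_bicubic(ch, dx, dy):
    """Resample one color plane at (x + dx, y + dy), bicubic, |dx|,|dy| < 1.

    Hypothetical CAC sketch: warping the red/blue plane by a small
    per-channel offset onto the green plane's sampling grid.
    """
    h, w = ch.shape
    padded = np.pad(ch, 2, mode='edge')
    ix, iy = int(np.floor(dx)), int(np.floor(dy))
    fx, fy = dx - ix, dy - iy
    # Weights for the four taps at integer offsets -1, 0, +1, +2.
    wx = cubic_kernel(np.array([fx + 1, fx, fx - 1, fx - 2]))
    wy = cubic_kernel(np.array([fy + 1, fy, fy - 1, fy - 2]))
    out = np.zeros_like(ch, dtype=np.float64)
    for j in range(4):
        for i in range(4):
            out += wy[j] * wx[i] * padded[2 + iy + j - 1: 2 + iy + j - 1 + h,
                                          2 + ix + i - 1: 2 + ix + i - 1 + w]
    return out
```

Because each output row needs only four input rows per channel, a hardware version of this resampler needs only a few line memories, which is the line-memory-reduction angle the abstract emphasizes.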
Method and device for correcting varying lateral chromatic aberration, storage medium, and computer equipment
Provided are a method and a device for correcting lateral chromatic aberration (LCA), a storage medium, and computer equipment. In the method, a relationship model between lens position and LCA magnitude is constructed from preset lens-position parameters, and the model is stored as calibration data; system parameters of a camera to be corrected and the pre-stored calibration data are obtained; the LCA of the camera to be corrected is computed from the system parameters; and the LCA is corrected using the calibration data. With this method, the LCA of the lens can be removed even when the focus distance changes, making the method suitable for mass production.
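One simple way to realize a "relationship model between lens position and LCA magnitude" is a low-order polynomial fit over the preset calibration positions, evaluated at the current lens position at correction time. The functions, the polynomial form, and the radial-scale conversion below are all illustrative assumptions, not the patent's specified model:

```python
import numpy as np

def fit_lca_model(lens_positions, lca_magnitudes, degree=2):
    """Fit a polynomial relating lens position to LCA magnitude.

    Hypothetical calibration step: `lca_magnitudes` would come from
    measuring red/blue edge displacement at each preset lens position.
    Returns coefficients to store as calibration data.
    """
    return np.polyfit(lens_positions, lca_magnitudes, degree)

def lca_at(coeffs, lens_position):
    """Evaluate the calibrated model at an arbitrary lens position."""
    return np.polyval(coeffs, lens_position)

def radial_scale_for_channel(coeffs, lens_position, sign=+1):
    """Turn the predicted LCA magnitude into a radial scale factor for
    warping the red (+1) or blue (-1) plane about the optical center."""
    return 1.0 + sign * lca_at(coeffs, lens_position)
```

Storing only the polynomial coefficients per lens design, rather than a full correction map per focus position, is what makes this kind of scheme attractive for mass production: the correction tracks focus changes by re-evaluating the model instead of re-calibrating.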