Patent classifications
H04N23/84
Image sensor, control method, camera component and mobile terminal with raised event adaptability and phase detection auto focus
An image sensor includes a two-dimensional pixel array and a lens array. The two-dimensional pixel array comprises a plurality of pixels, some of which include two sub-pixels. A rectangular coordinate system is established by taking the pixel as the origin, a length direction of the two-dimensional pixel array as the x-axis, and a width direction of the two-dimensional pixel array as the y-axis. The two sub-pixels lie in both the positive and negative half axes of the x-axis and in both the positive and negative half axes of the y-axis. The lens array comprises a plurality of lenses, each covering one of the pixels.
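Because each sub-pixel pair spans both axes of the pixel, phase detection can be run along either the x- or the y-direction. As a minimal sketch of the phase-detection step (the abstract does not specify the estimator; a sum-of-absolute-differences search over candidate shifts is assumed here):

```python
import numpy as np

def phase_shift(sub_a, sub_b, max_shift=4):
    """Estimate the phase difference between the two sub-pixel images by
    minimizing the mean absolute difference over candidate shifts along
    one axis. The same routine can be run along the other axis because
    each sub-pixel pair spans both the x- and y-directions."""
    return min(range(-max_shift, max_shift + 1),
               key=lambda d: np.abs(np.roll(sub_b, d, axis=1) - sub_a).mean())
```

The returned shift is proportional to defocus and can drive the auto-focus actuator.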
Imaging apparatus
An imaging apparatus includes an image sensor, a filter array disposed on an optical path from a target object to the image sensor and including two-dimensionally-arranged optical filters, and a processing circuit that generates at least four pieces of spectral image data based on an image acquired by the image sensor. The optical filters include various types of optical filters with different spectral transmittance. Each of the at least four pieces of spectral image data indicates an image corresponding to one of at least four wavelength bands. The filter array includes at least one characteristic section. The processing circuit detects a relative position between the filter array and the image sensor based on the at least one characteristic section in the image acquired by the image sensor, and compensates for deviation between the relative position and a preliminarily-set relative position when the processing circuit detects the deviation.
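The alignment step described above can be sketched as follows. The abstract does not specify how the characteristic section is located; a brute-force correlation search against a template of the marker is assumed here, and all function and variable names are illustrative:

```python
import numpy as np

def detect_marker_offset(image, template, expected_pos):
    """Locate a characteristic section of the filter array in the sensor
    image by exhaustive correlation (a production implementation would
    use normalized cross-correlation) and return its offset from the
    preliminarily-set position."""
    th, tw = template.shape
    h, w = image.shape
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(h - th + 1):
        for x in range(w - tw + 1):
            score = float(np.sum(image[y:y + th, x:x + tw] * template))
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return (best_pos[0] - expected_pos[0], best_pos[1] - expected_pos[1])

def compensate(filter_layout, offset):
    """Shift the assumed filter-array layout so that per-pixel spectral
    assignments line up with the sensor pixels again."""
    dy, dx = offset
    return np.roll(filter_layout, (-dy, -dx), axis=(0, 1))
```

Detecting the offset once per session (or per frame) lets the spectral reconstruction stay consistent even if the filter array shifts relative to the sensor.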
High-resolution image capture by luminance-driven upsampling of pixel-binned image sensor array output
Techniques are described for efficient high-resolution output of an image captured using a high-pixel-count image sensor based on pixel binning followed by luminance-guided upsampling. For example, an image sensor array is configured according to a red-green-blue-luminance (RGBL) CFA pattern, such that at least 50 percent of the imaging pixels of the array are luminance (L) pixels. Pixel binning is used during readout of the array to concurrently generate a downsampled RGB capture frame and a downsampled L capture frame. Following the readout, the L capture frame is upsampled (e.g., by upscaling and interpolation) to generate an L guide frame with 100-percent luminance density. An upsampled RGB frame can then be generated by interpolating the RGB capture frame based on known neighboring RGB information (e.g., from the RGB capture frame and previously interpolated values), adjusted by local luminance information from the L guide frame.
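A minimal sketch of the luminance-guided step: the binned RGB frame is upscaled to the guide resolution and each channel is modulated by the ratio of the full-resolution luminance guide to the (blocky) upscaled luminance. The abstract describes a more elaborate neighbor-based interpolation; nearest-neighbor upscaling and a simple luminance ratio are assumptions made for brevity:

```python
import numpy as np

def luminance_guided_upsample(rgb_binned, l_binned, l_guide):
    """Upscale the binned RGB frame to the resolution of the luminance
    guide, then restore high-frequency detail by scaling each channel
    with the ratio of the full-resolution guide to the blocky upscaled
    luminance."""
    factor = l_guide.shape[0] // l_binned.shape[0]
    l_up = np.repeat(np.repeat(l_binned, factor, axis=0), factor, axis=1)
    ratio = l_guide / np.maximum(l_up, 1e-6)          # local luminance correction
    rgb_up = np.repeat(np.repeat(rgb_binned, factor, axis=0), factor, axis=1)
    return rgb_up * ratio[..., None]
```

When the guide carries no extra detail (ratio of 1 everywhere), the output reduces to plain upscaling; real scene detail in the L guide sharpens the RGB result.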
Image capturing apparatus, control method, and storage medium
An image capturing apparatus includes an image sensor configured to change an exposure condition for each of a plurality of exposure areas, each of the exposure areas including a single pixel or a plurality of pixels, and an image processing unit configured to perform steps of digital signal processing on a signal of a captured image. The digital signal processing includes generating a plurality of coupled areas in which the exposure areas are coupled based on a first threshold of the exposure condition, calculating a development parameter for each of the coupled areas, and applying the development parameter in the image processing unit for each of the exposure areas.
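The coupling and per-area parameter steps can be sketched as below. The abstract does not define the coupling rule or the development parameter; a 4-connected flood fill over a grid of exposure values and a per-group gain are assumptions for illustration:

```python
import numpy as np
from collections import deque

def couple_areas(exposure, threshold):
    """Group adjacent exposure areas whose exposure values differ by less
    than `threshold` into coupled areas (4-connected flood fill).
    Returns an integer label per area."""
    h, w = exposure.shape
    labels = -np.ones((h, w), dtype=int)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            labels[sy, sx] = next_label
            q = deque([(sy, sx)])
            while q:
                y, x = q.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(exposure[ny, nx] - exposure[y, x]) < threshold):
                        labels[ny, nx] = next_label
                        q.append((ny, nx))
            next_label += 1
    return labels

def development_gains(exposure, labels, target=1.0):
    """One development parameter per coupled area: here, a gain bringing
    the area's mean exposure to a common target (illustrative)."""
    return {k: target / exposure[labels == k].mean() for k in np.unique(labels)}
```

Computing one parameter per coupled area rather than per exposure area keeps the developed image free of visible seams between areas with near-identical exposure.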
Banknote imaging
A method of obtaining a plurality of infrared images of a banknote involves simultaneously illuminating the banknote with infrared light at a first wavelength and infrared light at a second wavelength, capturing an image of the banknote with an RGB camera, obtaining the intensity distribution of the infrared light at the first wavelength and the intensity distribution of the infrared light at the second wavelength from a first output channel signal and a second output channel signal of the RGB camera sensor using a first calibration coefficient and a second calibration coefficient of the sensor, and producing separate infrared images of the banknote at the first wavelength and at the second wavelength from the respective intensity distributions.
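The unmixing step amounts to solving a 2x2 linear system per pixel: each output channel responds to both IR wavelengths with known calibration coefficients, so inverting the calibration matrix recovers the two intensity distributions. A sketch under that assumption (the matrix layout and names are not from the patent):

```python
import numpy as np

def separate_ir(channel1, channel2, calib):
    """Per-pixel unmixing of two IR wavelengths from two sensor channels.
    `calib[i][j]` is the (assumed known) response of channel i to
    wavelength j; the intensities follow by inverting the 2x2 system
    s = A @ i at every pixel."""
    inv = np.linalg.inv(np.asarray(calib, dtype=float))
    s = np.stack([channel1, channel2], axis=-1)   # (..., 2) channel signals
    i = s @ inv.T                                 # (..., 2) wavelength intensities
    return i[..., 0], i[..., 1]
```

The two returned arrays are the separate infrared images at the first and second wavelengths.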
IMAGING SYSTEM AND MONITORING SYSTEM
Color filters are used to obtain color images with imaging devices such as conventional image sensors. Imaging elements with color filters are commercially available, and an appropriate combination of the imaging element and a lens or the like is incorporated in an electronic device. However, providing a color filter over the light-receiving region of an image sensor reduces the amount of light reaching that region. An imaging system of the present invention includes a solid-state imaging element without a color filter, a storage device, and a learning device. Since no color filter is included, the obtained monochrome image data (analog data) are colorized using an AI system.
SIGNAL PROCESSING METHOD, SIGNAL PROCESSING DEVICE, AND IMAGING SYSTEM
A signal processing method according to one aspect of the present disclosure includes obtaining compressed image data including two-dimensional image information that is obtained by compressing hyperspectral information corresponding to wavelength bands included in a target wavelength range, obtaining setting data including information designating one or more sub-wavelength ranges that are parts of the target wavelength range, and generating, based on the compressed image data, two-dimensional images corresponding to wavelength bands included in the one or more sub-wavelength ranges.
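Once the compressed image data have been decoded into a hyperspectral cube (the decoding itself is outside this sketch), generating the requested images reduces to selecting the bands whose centre wavelength falls inside a designated sub-range. A minimal illustration, with all names assumed:

```python
import numpy as np

def select_subrange_bands(cube, band_centers, sub_ranges):
    """Keep only the bands of a hyperspectral cube (shape (..., bands))
    whose centre wavelength lies inside one of the designated
    sub-wavelength ranges, yielding the requested two-dimensional images
    and their wavelengths."""
    keep = [i for i, w in enumerate(band_centers)
            if any(lo <= w <= hi for lo, hi in sub_ranges)]
    return cube[..., keep], [band_centers[i] for i in keep]
```

Restricting reconstruction to the designated sub-ranges avoids generating and storing images for wavelength bands the user did not request.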