Patent classifications
H04N23/88
Information processing apparatus with a light that illuminates more actively while capturing a printed image when embedded additional information is to be processed, information processing system, information processing method, and storage medium
An image capturing unit captures an image including an embedded image that is printed on the basis of image data in which at least a color component has been modulated according to additional information. An adjustment unit adjusts a white balance of the captured image on the basis of an adjustment value associated with the embedded image. A processing unit processes image data of the captured image whose white balance has been adjusted by the adjustment unit, in order to read the additional information embedded in the captured image.
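The adjustment step above amounts to scaling each color channel by a gain tied to the embedded image before decoding. A minimal sketch, assuming simple per-channel multiplicative gains; the names `apply_white_balance` and `ADJUSTMENT_GAINS` and the gain values are illustrative, not from the patent:

```python
def apply_white_balance(pixels, gains):
    """Scale each (R, G, B) pixel by per-channel gains, clamped to [0, 255]."""
    r_gain, g_gain, b_gain = gains
    return [
        (min(255, round(r * r_gain)),
         min(255, round(g * g_gain)),
         min(255, round(b * b_gain)))
        for (r, g, b) in pixels
    ]

# Hypothetical adjustment value associated with the embedded image,
# tuned so the modulated color component becomes readable.
ADJUSTMENT_GAINS = (1.2, 1.0, 0.8)

corrected = apply_white_balance([(100, 100, 100)], ADJUSTMENT_GAINS)
```

The decoder for the additional information would then run on `corrected` rather than on the raw capture.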
WARM WHITE LIGHT ILLUMINATION AND DIGITAL IMAGE PROCESSING OF DIGITAL IMAGES DURING MICROSURGERY
A method for enhancing digital images during a microsurgery, e.g., an eye surgery, includes collecting digital images of target anatomy using a digital camera as the target anatomy is illuminated by warm white light. The method includes identifying, via a processor in communication with the digital camera, a predetermined stage of the microsurgery. Within the images, the processor digitally isolates a first pixel region, e.g., a pupil pixel region, from a second pixel region, e.g., an iris pixel region, and adjusts a characteristic of constituent pixels thereof. The method, which may be recorded as instructions on a computer-readable medium, may be used to enhance a red reflex at predetermined stages of an eye surgery. A system includes a lighting source for emitting warm white light having a color temperature of less than about 4000 K, the camera, and the processor.
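The region-isolation step can be pictured as masking pupil pixels apart from iris pixels and adjusting only the former. A minimal sketch using a circular mask; the center, radius, brightness factor, and the dict-based image layout are all illustrative assumptions:

```python
def enhance_pupil(image, center, radius, factor=1.5):
    """Brighten pixels inside a circular pupil region; leave iris pixels as-is.

    image: dict mapping (x, y) -> intensity in [0, 255].
    """
    cx, cy = center
    out = {}
    for (x, y), value in image.items():
        inside = (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
        # Scale pupil pixels to enhance the red reflex; clamp to valid range.
        out[(x, y)] = min(255, value * factor) if inside else value
    return out
```

In practice the pupil boundary would come from segmentation rather than a fixed circle, and the adjusted characteristic could be contrast or hue rather than brightness.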
Compensating for Optical Change in Image Capture Device Components Over Time
Devices, methods, and non-transitory program storage devices (NPSDs) are disclosed to compensate for the predicted color changes experienced by camera modules after certain amounts of time of real world use. Such color changes may be caused by prolonged exposure of optical components of the camera module to one or more of: solar radiation, high temperature conditions, or high humidity conditions, each of which may, over time, induce deviation in the color response of optical components of the camera module. The techniques disclosed herein may first characterize such predicted optical change to components over time based on particular environmental conditions, and then implement one or more time-varying color models to compensate for predicted changes to the camera module's color calibration values due to the characterized optical change. In some embodiments, optical changes in other types of components, e.g., display devices, caused by prolonged environmental stresses may also be modeled and compensated.
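One simple way to realize such a time-varying color model is to predict the component's remaining color response as a decaying function of cumulative exposure and divide that prediction out of the factory calibration gain. A sketch under the assumption of exponential decay; the function names and the decay-rate constant are illustrative, not from the disclosure:

```python
import math

def predicted_transmission(days_of_exposure, decay_rate=1e-4):
    """Predicted fraction of the original color response remaining
    after the given cumulative exposure (exponential-decay assumption)."""
    return math.exp(-decay_rate * days_of_exposure)

def compensated_wb_gain(factory_gain, days_of_exposure, decay_rate=1e-4):
    """Scale the factory calibration gain up to offset the predicted loss."""
    return factory_gain / predicted_transmission(days_of_exposure, decay_rate)
```

A real implementation would characterize the decay per channel and per stressor (solar radiation, temperature, humidity) rather than using a single rate.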
NEURAL NETWORK BASED AUTO-WHITE-BALANCING
A method of auto white balancing includes: receiving an original image; determining an RG logarithmic ratio of a set of red-to-green channel values of the original image; determining a BG logarithmic ratio of a set of blue-to-green channel values of the original image; determining an original two-dimensional histogram utilizing the RG logarithmic ratio and the BG logarithmic ratio; determining a Gaussian-blur two-dimensional histogram utilizing the RG logarithmic ratio and the BG logarithmic ratio; determining a sharpened two-dimensional histogram of a sharpened image utilizing the RG logarithmic ratio and the BG logarithmic ratio; determining a Laplacian-edge two-dimensional histogram of a Laplacian-edge image utilizing the RG logarithmic ratio and the BG logarithmic ratio; and determining a white balancing gain utilizing a neural network based on the original 2D histogram, the Gaussian-blur 2D histogram, the sharpened 2D histogram, and the Laplacian-edge 2D histogram.
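The core data structure here is a 2D histogram over the per-pixel log-chrominance coordinates log(R/G) and log(B/G); each of the four histograms (original, blurred, sharpened, Laplacian-edge) is built the same way from a differently filtered image. A minimal sketch of that histogram step; the bin count and value range are illustrative choices, not from the patent:

```python
import math

def log_chroma_histogram(pixels, bins=8, lo=-2.0, hi=2.0):
    """2D histogram over (log(R/G), log(B/G)), skipping non-positive channels."""
    hist = [[0] * bins for _ in range(bins)]
    width = (hi - lo) / bins
    for r, g, b in pixels:
        if r <= 0 or g <= 0 or b <= 0:
            continue  # log ratios are undefined for zero/negative channels
        u = math.log(r / g)  # RG logarithmic ratio
        v = math.log(b / g)  # BG logarithmic ratio
        i = min(bins - 1, max(0, int((u - lo) / width)))
        j = min(bins - 1, max(0, int((v - lo) / width)))
        hist[i][j] += 1
    return hist
```

The neural network would then take the four stacked histograms as input channels and regress the white-balance gain.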
IMAGE DEVICE, IMAGE SENSOR, AND OPERATION METHOD OF IMAGE SENSOR
An image sensor includes a pixel array including a plurality of pixels; a row driver configured to control the plurality of pixels; and an analog-to-digital converter configured to digitize a result sensed by the pixel array to generate a first image, wherein the pixel array includes: first pixel groups, wherein each first pixel group of the first pixel groups includes first white pixels and first color pixels among the plurality of pixels; and second pixel groups, wherein each second pixel group of the second pixel groups includes second white pixels and second color pixels among the plurality of pixels, and wherein first pixel data of the first image are generated based on the first white pixels and the first color pixels, and second pixel data of the first image are generated based on the second color pixels.
APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM
An apparatus includes a capturing unit configured to capture an image of an object; an exposure control unit configured to control an exposure condition, including an exposure time or an analog gain, for each of a plurality of pixels or pixel groups on a surface of the capturing unit; a determination unit configured to determine one or more evaluation areas including an achromatic color area from the captured image; a calculation unit configured to calculate a first evaluation value for each of the plurality of pixels or pixel groups in the evaluation area, and to calculate a second evaluation value based on the first evaluation value weighted based on the exposure condition for each of the plurality of pixels or pixel groups; and a correction unit configured to correct the image based on the second evaluation value.
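The second evaluation value can be read as an exposure-weighted combination of the per-group first evaluation values. A minimal sketch, assuming exposure time serves as the weight; the weighting scheme and function name are assumptions, since the abstract does not specify the formula:

```python
def second_evaluation_value(first_values, exposure_times):
    """Exposure-weighted mean of per-group first evaluation values.

    first_values: one evaluation value (e.g., a color ratio) per pixel group.
    exposure_times: the exposure condition used as the weight per group.
    """
    total_weight = sum(exposure_times)
    if total_weight == 0:
        return 0.0  # no usable groups
    return sum(v * w for v, w in zip(first_values, exposure_times)) / total_weight
```

Groups exposed longer (and thus less noisy) contribute more to the correction under this choice of weights.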
INFRARED-BASED PROCESSING OF AN IMAGE
A multi-point measurement of a scene captured in an image frame may be used to process the image frame, such as by applying white balance corrections to the image frame. In some examples, the image frame may be segmented into portions that are illuminated by different illumination sources. Different portions of the image frame may be white balanced differently based on the color temperature of the illumination source for the corresponding portion. Infrared measurements of multiple points in the scene may be used to determine a characteristic of the illumination source of different portions of the scene. For example, a picture that includes indoor and outdoor portions may be illuminated by at least two illumination sources that produce different infrared measurement values. White balancing may be applied differently to these two portions to correct for color temperature of the different sources.
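The indoor/outdoor example above boils down to picking a different set of gains per segment based on the IR-derived color-temperature estimate. A minimal sketch, assuming each region already carries an estimated color temperature; the threshold, gain values, and function names are illustrative assumptions:

```python
def gains_for_color_temp(kelvin):
    """Pick per-channel (R, G, B) gains for a coarse color-temperature bucket."""
    if kelvin < 4000:           # warm indoor lighting: tame red, boost blue
        return (0.8, 1.0, 1.25)
    return (1.2, 1.0, 0.75)     # cooler daylight: boost red, tame blue

def white_balance_regions(regions):
    """regions: list of ((r, g, b), estimated_kelvin) pairs, one per segment.

    Returns the per-region white-balanced pixel values.
    """
    out = []
    for (r, g, b), kelvin in regions:
        rg, gg, bg = gains_for_color_temp(kelvin)
        out.append((r * rg, g * gg, b * bg))
    return out
```

A full implementation would apply the selected gains to every pixel in the segment and feather the boundary between differently balanced portions.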
GLARE REMOVAL USING DUAL CAMERAS
Dual cameras that simultaneously capture RGB and IR images of a scene can be used to remove glare from the RGB image: the RGB image is transformed to a YUV image, and a glare region in the luminance component of the YUV image is substituted with the pixel values in the corresponding region of the IR image. Further, color information in the glare region may be adjusted by averaging over, or extrapolating from, the color information in the surrounding region.
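The luminance-substitution step can be sketched as a masked copy from the IR plane into the Y plane. A minimal sketch, assuming glare detection and RGB/IR registration are done elsewhere; the flat-list plane layout and function name are illustrative:

```python
def remove_glare_luma(y_plane, ir_plane, glare_mask):
    """Substitute co-registered IR values for Y wherever the glare mask is set.

    y_plane, ir_plane: luminance/IR values at corresponding pixel positions.
    glare_mask: True where glare was detected in the RGB image.
    """
    return [
        ir if in_glare else y
        for y, ir, in_glare in zip(y_plane, ir_plane, glare_mask)
    ]
```

The U and V chrominance planes would then be repaired separately, e.g., by averaging the surrounding region's color into the glare area as the abstract describes.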
METHOD AND APPARATUS BASED ON SCENE DEPENDENT LENS SHADING CORRECTION
A method of performing scene-dependent lens shading correction (SD-LSC) is provided. The method includes collecting scene information from a Bayer thumbnail of an input image; generating a standard red green blue (sRGB) thumbnail by processing the Bayer thumbnail of the input image to simulate white balance (WB) and pre-gamma blocks; determining a representative color channel ratio of the input image based on the scene information and the sRGB thumbnail; determining an ideal grid gain of the input image based on the representative color channel ratio and a grid gain of the input image; merging the ideal grid gain and the grid gain of the input image to generate a new grid gain; and applying the new grid gain to the input image.
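The final merging step combines the ideal grid gain with the input image's current grid gain per cell. A minimal sketch, assuming a linear blend; the blend weight and function name are assumptions, since the abstract does not specify the merge formula:

```python
def merge_grid_gains(ideal, current, weight=0.5):
    """Per-cell linear blend of two lens-shading gain grids.

    ideal, current: 2D lists of per-cell gains with identical shape.
    weight: fraction of the ideal (scene-dependent) gain to keep.
    """
    return [
        [weight * i + (1.0 - weight) * c for i, c in zip(row_ideal, row_current)]
        for row_ideal, row_current in zip(ideal, current)
    ]
```

The resulting new grid gain would then be applied to the input image in place of the original shading-correction grid.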
METHOD FOR CAMERA CONTROL, IMAGE SIGNAL PROCESSOR AND DEVICE
A method and device are provided for camera control to acquire an image. The method includes: acquiring, by an image sensor, a stream of image frames comprising at least one frame; acquiring a target frame by the image sensor; determining scene information in the target frame; selecting a reference frame from the stream of image frames by identifying the scene information of the target frame in the reference frame; determining at least one acquisition parameter of the reference frame; and acquiring a final image from the target frame using the at least one acquisition parameter.
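The reference-frame selection step can be pictured as a nearest-neighbor search over per-frame scene descriptors. A minimal sketch, assuming each frame is summarized by a feature vector and using L1 distance; the descriptor and metric are illustrative assumptions:

```python
def select_reference_frame(target_descriptor, frame_descriptors):
    """Return the index of the frame whose scene descriptor is closest
    to the target frame's descriptor (L1 distance)."""
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(
        range(len(frame_descriptors)),
        key=lambda i: l1(target_descriptor, frame_descriptors[i]),
    )
```

The acquisition parameters (e.g., exposure or white-balance settings) of the selected frame would then be reused when producing the final image from the target frame.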