Patent classifications
H04N23/71
Image capture method and systems to preserve apparent contrast of an image
Methods and systems are described for processing an image captured with an image sensor, such as a camera. In one embodiment, an estimated ambient light level of the captured image is determined and used to compute an optical-optical transfer function (OOTF) that is used to correct the image to preserve an apparent contrast of the image under the estimated ambient light level in a viewing environment. The estimated ambient light level is determined by scaling pixel values from the image sensor using a function that includes exposure parameters and a camera-specific parameter derived from a camera calibration.
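One way to sketch the idea in this abstract: the standard reflected-light exposure equation L = K·N²/(t·S) relates scene luminance to f-number, shutter time, and ISO, with K the camera-specific calibration constant, and the ambient estimate then drives a surround-compensation gamma (a BT.2100-style heuristic). The exact function and constants below are illustrative assumptions, not the patented formula:

```python
import math

def estimate_ambient_light(mean_pixel, f_number, exposure_time, iso, k_cal=12.5):
    """Estimate ambient/scene luminance (cd/m^2) from exposure settings.

    Uses the reflected-light meter equation L = K * N^2 / (t * S),
    scaled by the normalized mean pixel value relative to mid-grey.
    k_cal is the camera-specific calibration constant (assumed value).
    """
    mid_grey = 0.18
    return (mean_pixel / mid_grey) * k_cal * f_number ** 2 / (exposure_time * iso)

def ootf_gamma(ambient_nits):
    """Pick an end-to-end system gamma that preserves apparent contrast.

    Surround-compensation heuristic: dimmer ambient -> higher gamma.
    The interpolation endpoints (5 nits -> 1.2, 200 nits -> 1.0) are
    hypothetical placeholders, not values from the patent.
    """
    t = (math.log10(max(ambient_nits, 5.0)) - math.log10(5.0)) / (
        math.log10(200.0) - math.log10(5.0))
    return max(1.0, min(1.2, 1.2 - 0.2 * t))
```

A mid-grey frame shot at f/2, 1/100 s, ISO 100 yields an estimate of about 50 cd/m², which maps to a gamma between the dim and bright endpoints.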
ELECTRONIC APPARATUS HAVING FINGER AUTHENTICATING FUNCTION
An electronic apparatus includes an authenticator configured to identify registered finger information that coincides with detected finger information by matching the detected finger information against a plurality of pieces of registered finger information in a predetermined order, an executor configured to execute a function corresponding to the registered finger information identified by the authenticator, a user identifier configured to identify the actual user among a plurality of registered users by acquiring user identification information representing the actual user or by performing a determination process configured to determine the actual user, and a controller configured to change the predetermined order according to the actual user identified by the user identifier.
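The controller's role can be sketched as reordering the template list per identified user so that that user's templates are matched first, shortening the ordered search. The dict keys and `match_fn` interface are illustrative assumptions:

```python
def order_templates(templates, user_id=None):
    """Return registered finger templates in matching order.

    templates: list of dicts like {"user": ..., "finger": ..., "data": ...}.
    If the actual user has been identified, their templates are tried
    first (the 'predetermined order' is changed per user); otherwise
    the stored order is kept unchanged.
    """
    if user_id is None:
        return list(templates)
    mine = [t for t in templates if t["user"] == user_id]
    others = [t for t in templates if t["user"] != user_id]
    return mine + others

def authenticate(detected, templates, match_fn, user_id=None):
    """Match detected finger info against templates in order and return
    the first coinciding template (whose function would be executed),
    or None if nothing matches."""
    for t in order_templates(templates, user_id):
        if match_fn(detected, t["data"]):
            return t
    return None
```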
Controlling exposure based on inverse gamma characteristic
An image capturing control apparatus includes a detection unit configured to detect a specific object area in an image captured by an image capturing apparatus, an acquisition unit configured to acquire a first input/output characteristic of the image capturing apparatus, a conversion unit configured to convert the image by acquiring a second input/output characteristic that is an inverse input/output characteristic to the first input/output characteristic, and by applying the second input/output characteristic to the image, and a control unit configured to control exposure of the image capturing apparatus based on a luminance value of the specific object area in the converted image.
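A minimal sketch of the conversion-then-metering flow, assuming the first input/output characteristic is a simple power-law (gamma) encoding; the real characteristic would come from the acquisition unit, and the target value here is an assumption:

```python
import math

def inverse_characteristic(pixel, gamma=2.2):
    """Second input/output characteristic: the inverse of the camera's
    (assumed) gamma encoding, mapping normalized output code values
    back to linear scene-referred values."""
    return pixel ** gamma

def exposure_correction_ev(object_pixels, target_linear=0.18, gamma=2.2):
    """Convert the detected object area (e.g. a face) to linear values
    and return the EV adjustment that moves its mean luminance to the
    target; positive means increase exposure."""
    linear = [inverse_characteristic(p, gamma) for p in object_pixels]
    mean = sum(linear) / len(linear)
    return math.log2(target_linear / mean)
```

Metering on the linearized values avoids the bias a gamma-encoded luminance reading would introduce.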
Systems and Methods for Sampling Images
An example method includes determining, by a controller of an image capture system, a plurality of sets of exposure parameter values for one or more exposure parameters. The plurality of sets of exposure parameter values are determined at an exposure determination rate. The method further includes capturing, by an image capture device of the image capture system, a plurality of images. Each image of the plurality of images is captured according to a set of exposure parameter values of the plurality of sets of exposure parameter values. The method also includes sending, by the controller of the image capture system to an image processing unit, a subset of the plurality of images. Each subset of images is sent at a sampling rate, and the sampling rate is less than the exposure determination rate.
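The rate relationship can be sketched as a stride over the captured frames: exposure parameters update every frame, but only every stride-th frame is forwarded downstream. The integer-ratio assumption is mine:

```python
def select_samples(num_images, exposure_rate_hz, sampling_rate_hz):
    """Indices of captured frames forwarded to the image processing unit.

    Exposure parameter sets are determined for every frame at
    exposure_rate_hz, but frames are sent at the lower sampling_rate_hz,
    so only every stride-th frame goes downstream (the stride is
    assumed to be an integer ratio of the two rates).
    """
    assert sampling_rate_hz < exposure_rate_hz
    stride = round(exposure_rate_hz / sampling_rate_hz)
    return list(range(0, num_images, stride))
```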
SENSITIVITY-BIASED PIXELS
Systems, methods, and non-transitory media are provided for automatic exposure control. An example method can include receiving pixel values captured by a photosensor pixel array (PPA) of an image sensor, wherein one or more pixel values from the pixel values correspond to one or more sensitivity-biased photosensor (SBP) pixels in the PPA and one or more saturated pixel values from the pixel values correspond to one or more other photosensor pixels in the PPA; determining, based on the one or more pixel values, an actual pixel value for the one or more pixel values and/or the one or more saturated pixel values; determining an adjustment factor for at least one of the pixel values based on the actual pixel value and a target exposure value; and based on the adjustment factor, correcting an exposure setting associated with the image sensor and/or at least one of the pixel values.
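The recovery step can be sketched as follows: an SBP pixel's attenuated response stays below saturation, so dividing it by its known sensitivity ratio estimates the true value, and the ratio of target to actual value gives the exposure adjustment factor. The specific ratio and target are illustrative assumptions:

```python
def recover_actual_value(sbp_value, sensitivity_ratio):
    """Estimate the true (unsaturated) pixel value from a
    sensitivity-biased photosensor pixel whose response is attenuated
    by sensitivity_ratio (e.g. 0.25 for quarter sensitivity)."""
    return sbp_value / sensitivity_ratio

def exposure_adjustment(sbp_value, sensitivity_ratio, target_value):
    """Adjustment factor that moves the recovered actual value to the
    target exposure value; multiply the current exposure setting (or
    the saturated pixel values) by this factor to correct them."""
    actual = recover_actual_value(sbp_value, sensitivity_ratio)
    return target_value / actual
```

For example, a quarter-sensitivity pixel reading 0.5 implies an actual value of 2.0, so exposure would be halved to reach a target of 1.0.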
EVENT-BASED COMPUTATIONAL PIXEL IMAGERS
A computational pixel imaging device that includes an array of pixel integrated circuits for event-based detection and imaging. Each pixel may include a digital counter that accumulates a digital number, which indicates whether a change is detected by the pixel. The counter may count in one direction for a portion of an exposure and count in an opposite direction for another portion of the exposure. The imaging device may be configured to collect and transmit key frames at a lower rate, and collect and transmit delta or event frames at a higher rate. The key frames may include a full image of a scene, captured by the pixel array. The delta frames may include sparse data, captured by pixels that have detected meaningful changes in received light intensity. High speed, low transmission bandwidth motion image video can be reconstructed using the key frames and the delta frames.
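Two pieces of the abstract can be sketched directly: the per-pixel up/down counter (a static scene nets roughly zero, a change leaves a signed residue) and the key-plus-delta reconstruction. The sparse-dict frame representation is an illustrative assumption:

```python
class EventPixel:
    """Per-pixel digital counter: counts up during one portion of the
    exposure and down during another, so an unchanged scene cancels to
    ~0 and a changed scene leaves a signed residue (an event)."""
    def __init__(self):
        self.count = 0

    def integrate(self, intensity, up):
        self.count += intensity if up else -intensity

    def event(self, threshold=1):
        return abs(self.count) >= threshold

def reconstruct(key_frame, delta_frames):
    """Rebuild a high-rate frame by applying sparse delta (event)
    frames, transmitted at the higher rate, on top of the most recent
    full key frame, transmitted at the lower rate. Frames are modeled
    as dicts mapping pixel index -> value."""
    frame = dict(key_frame)
    for delta in delta_frames:          # sparse: only changed pixels
        for idx, dv in delta.items():
            frame[idx] = frame.get(idx, 0) + dv
    return frame
```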
ACTIVATING LIGHT SOURCES FOR OUTPUT IMAGE
In some examples, a computing device can include a processor resource and a non-transitory memory resource storing machine-readable instructions thereon that, when executed, cause the processor resource to: instruct an imaging device to capture an input image, determine image properties of the input image, activate a portion of a plurality of light sources based on a physical location of the plurality of light sources and the determined image properties of the input image, and instruct the imaging device to capture an output image when the portion of the plurality of light sources is activated.
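The selection step can be sketched as mapping dark regions of the input image to the light sources that physically illuminate them. The region/illumination mapping and threshold below are illustrative assumptions:

```python
def select_light_sources(region_brightness, light_positions, threshold=0.4):
    """Choose which light sources to activate before the output capture.

    region_brightness: mapping of image region -> mean brightness of
    that region in the input image (a determined image property).
    light_positions: mapping of light-source id -> the region its
    physical location illuminates (assumed known from calibration).
    Lights covering regions darker than threshold are activated.
    """
    dark = {r for r, b in region_brightness.items() if b < threshold}
    return sorted(l for l, region in light_positions.items() if region in dark)
```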
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
A same region detector detects a same region of a processing target for each of a plurality of different viewpoints from polarization images in a plurality of polarization directions acquired for each of the viewpoints. The polarization images in the plurality of polarization directions acquired for each of the plurality of different viewpoints are, for example, polarization images acquired by imaging over a period of a plurality of frames in which a positional relationship between the processing target and a polarization image acquisition unit that acquires the polarization images changes. A polarization degree calculation unit calculates a polarization degree of the same region for each of the viewpoints on the basis of the polarization images in the plurality of polarization directions. A reflection removal unit performs reflection removal processing on the same region of the processing target by using the polarization images in the plurality of polarization directions of the viewpoint at which the polarization degree calculated by the polarization degree calculation unit is maximized. A reflection component can be removed even when an angle between a plane direction of a reflecting surface and an imaging direction is not known.
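The core computation can be sketched with the standard definition of the degree of polarization, (Imax − Imin)/(Imax + Imin), over intensities measured through different polarizer angles; at the viewpoint where this degree peaks, the minimum intensity approximates the unpolarized diffuse term with the specular reflection suppressed. The list-of-samples representation is a simplifying assumption:

```python
def polarization_degree(samples):
    """Degree of polarization of one region from its intensities in
    several polarization directions: (Imax - Imin) / (Imax + Imin)."""
    i_max, i_min = max(samples), min(samples)
    total = i_max + i_min
    return (i_max - i_min) / total if total else 0.0

def remove_reflection(per_viewpoint_samples):
    """Pick the viewpoint where the region's polarization degree is
    maximal and keep the minimum intensity over the polarization
    directions, which approximates the diffuse (reflection-free)
    component in this simplified model."""
    best = max(per_viewpoint_samples, key=polarization_degree)
    return min(best)
```

For instance, a viewpoint whose samples are flat (degree 0) loses out to one with strong angular modulation, and the latter's minimum is returned as the diffuse value.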