
IMAGE PROCESSING APPARATUS, IMAGE PICKUP APPARATUS, IMAGE PROCESSING METHOD, IMAGE PROCESSING PROGRAM, AND STORAGE MEDIUM
20170358058 · 2017-12-14

An image processing apparatus includes an estimator (804b) which estimates a moire component included in an image based on optical characteristic information, a determiner (804c) which determines a correction amount based on the estimated moire component, and a corrector (804d) which corrects the image so as to reduce the moire component included in the image based on the correction amount.
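As a rough illustration of the estimator/determiner/corrector chain, the sketch below models the moire component as a single spatial frequency predicted by the optical characteristic information — an assumption made purely for illustration; the function names are hypothetical, not from the patent.

```python
import numpy as np

def estimate_moire(image, optical_freq):
    # Estimator: isolate the component at the spatial frequency predicted by
    # the optical characteristic information (assumed here to be a single
    # row-direction frequency bin, purely for illustration).
    n = image.shape[0]
    spectrum = np.fft.fft(image, axis=0)
    keep = np.zeros(n, dtype=bool)
    keep[optical_freq] = keep[-optical_freq] = True
    spectrum[~keep, :] = 0
    return np.fft.ifft(spectrum, axis=0).real

def correct_moire(image, optical_freq, strength=1.0):
    # Determiner + corrector: scale the estimated moire component by a
    # correction amount and subtract it from the image.
    return image - strength * estimate_moire(image, optical_freq)
```

With `strength=1.0` a purely sinusoidal interference component is removed exactly; a smaller correction amount would attenuate it partially.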

IMAGE PICKUP APPARATUS WITH FLASH BAND COMPENSATION TECHNIQUE, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM
20170359533 · 2017-12-14

An image pickup apparatus capable of properly compensating for a flash band, even when the flash band appears during a zooming operation, to generate a post-compensation image with no uneven luminance levels. A lens unit has a zoom function. Image information generated from image signals output by an image pickup device, which sequentially starts exposure and sequentially reads out signals row by row, is held in a memory unit. Based on the image information, a flash band caused by an external flash and extending across a plurality of frames is detected, and the frames in which the flash band was detected are corrected to obtain an image with no uneven luminance levels. The frames are corrected according to a calculated zoom change ratio of the lens unit when the zoom change ratio does not fall within a first predetermined range.
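Because a rolling-shutter sensor exposes rows sequentially, an external flash brightens only some rows of a frame; that row-wise structure can be sketched as below. The threshold-based detection and row replacement are illustrative simplifications, and the zoom-change-ratio scaling described above is omitted.

```python
import numpy as np

def detect_flash_band(frame, reference, thresh=30.0):
    # Per-row mean luminance difference against a reference frame; rows whose
    # mean exceeds the threshold are flagged as part of the flash band.
    diff = frame.mean(axis=1) - reference.mean(axis=1)
    return diff > thresh

def compensate_flash_band(frame, reference, thresh=30.0):
    # Replace flagged rows with the corresponding rows of the reference frame
    # so the output has uniform luminance. (The patent additionally scales the
    # correction by the zoom change ratio, which this sketch leaves out.)
    band = detect_flash_band(frame, reference, thresh)
    out = frame.copy()
    out[band] = reference[band]
    return out
```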

Nine cell pixel image sensor with phase detection autofocus

An imaging device includes a pixel array of 1×3 pixel circuits that each include 3 photodiodes in a column. Bitlines are coupled to the 1×3 pixel circuits and are divided into groupings of 3 bitlines per column of the 1×3 pixel circuits. Each column of the 1×3 pixel circuits includes a plurality of first banks coupled to a first bitline, a plurality of second banks coupled to a second bitline, and a plurality of third banks coupled to a third bitline of a respective grouping of the 3 bitlines. The 1×3 pixel circuits are arranged into groupings of three 1×3 pixel circuits per nine-cell pixel structure, forming a plurality of 3×3 pixel structures of the pixel array.
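Purely as an illustration of the three-bitlines-per-column readout grouping, a hypothetical index mapping from a pixel's row and pixel-circuit column to its bitline might look like this (the round-robin bank assignment is an assumption, not the patented routing):

```python
def bitline_for_pixel(row, col):
    # One 3-bitline grouping per column of 1x3 pixel circuits; within a
    # column, the first/second/third banks connect to the first/second/third
    # bitline. The row % 3 assignment is illustrative only.
    grouping = col
    bitline = row % 3
    return grouping, bitline
```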

METHOD FOR TRACKING PLACEMENT OF PRODUCTS ON SHELVES IN A STORE

One variation of a method for tracking placement of products in a store includes: accessing an image recorded by a mobile robotic system within a store; detecting a shelf in a region of the image; based on an address of the shelf, retrieving a list of products assigned to the shelf by a planogram of the store; retrieving a set of template images—from a database of template images—defining visual features of products specified in the list of products; extracting a set of features from the region of the image; determining that a unit of the product is mis-stocked on the shelf in response to deviation between the set of features and features in a template image, in the set of template images, representing the product; and in response to determining that the unit of the product is mis-stocked on the shelf, generating a restocking prompt for the product.
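A minimal sketch of the deviation test between features extracted from the shelf image region and a template's features, assuming fixed-length feature vectors and using cosine distance as an illustrative deviation measure (the abstract does not specify one); the function names are hypothetical.

```python
import numpy as np

def is_mis_stocked(shelf_features, template_features, max_deviation=0.2):
    # Flag the product unit as mis-stocked when the extracted features deviate
    # from the template's features by more than the threshold.
    a = np.asarray(shelf_features, dtype=float)
    b = np.asarray(template_features, dtype=float)
    cosine = a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return (1.0 - cosine) > max_deviation

def restocking_prompt(product_id, shelf_address):
    # Generated in response to a mis-stocked determination.
    return f"Restock product {product_id} at shelf {shelf_address}"
```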

IMAGE PROCESSING APPARATUS AND METHOD FOR ENHANCING A PHASE DISTRIBUTION
20220357558 · 2022-11-10

An apparatus for enhancing an input phase distribution (I(x_i)) is configured to retrieve the input phase distribution (I(x_i)) and compute a baseline estimate (f(x_i)) as an estimate of a baseline (I_2(x_i)) in the input phase distribution (I(x_i)). The apparatus is further configured to obtain an output phase distribution (O(x_i)) based on the baseline estimate (f(x_i)) and the input phase distribution (I(x_i)).
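A compact sketch of the baseline-removal idea, using a wide moving average as a stand-in for the baseline estimate f(x_i) (the actual estimator is not specified in this abstract) and taking the output as the input minus the baseline:

```python
import numpy as np

def baseline_estimate(phase, window=15):
    # f(x_i): a smooth estimate of the baseline in the input phase
    # distribution; a wide moving average is an illustrative choice.
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(phase, pad, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

def enhance_phase(phase, window=15):
    # O(x_i) obtained from I(x_i) and f(x_i): here, simple subtraction,
    # which leaves the fine phase features on a flat background.
    return phase - baseline_estimate(phase, window)
```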

ELECTRONIC DEVICE INCLUDING A CAMERA DISPOSED BEHIND A DISPLAY

In one implementation, an apparatus includes a display having a front surface and a back surface. The display includes a plurality of pixel regions that emit light from the front surface to display a displayed image and a plurality of apertures that transmit light from the front surface to the back surface. The apparatus includes a camera disposed on a side of the back surface of the display. The camera is configured to capture an image. The apparatus includes a processor coupled to the display and the camera. The processor is configured to receive the captured image and apply a first digital filter to a first portion of the captured image and a second digital filter, different from the first digital filter, to a second portion of the captured image to reduce image distortion caused by the display.
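The split-filter scheme — one digital filter for one portion of the captured image, a different filter for the rest — can be sketched as below, with per-pixel gain functions standing in for the actual deblurring filters (an illustrative simplification; the mask and filter choices are assumptions):

```python
import numpy as np

def filter_captured_image(image, first_region_mask, first_filter, second_filter):
    # Apply the first digital filter to the first portion of the image and
    # the second, different filter to the remaining portion.
    out = image.astype(float).copy()
    out[first_region_mask] = first_filter(out[first_region_mask])
    out[~first_region_mask] = second_filter(out[~first_region_mask])
    return out
```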

Method and apparatus for capturing digital video
11490015 · 2022-11-01

A method and apparatus for capturing digital video includes displaying a preview of a field of view of the imaging device in a user interface of the imaging device. A sequence of images is captured. A main subject and a background in the sequence of images are determined, wherein the main subject is different from the background. A sequence of modified images for use in a final video is obtained, wherein each modified image is obtained by combining two or more images of the sequence of images such that the main subject in the modified image is blur free and the background is blurred. The sequence of modified images is combined to obtain the final video, which is stored in a memory of the imaging device, and displayed in the user interface.
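A minimal sketch of combining frames so the main subject stays blur-free while the background blurs, assuming a precomputed subject mask and using a temporal average as the background blur (subject detection and frame alignment are omitted):

```python
import numpy as np

def blend_background(frames, subject_mask, sharp_index=0):
    # Subject pixels come from a single (blur-free) frame; background pixels
    # are the temporal average of all frames, which blurs anything that moved
    # between exposures.
    stack = np.stack([f.astype(float) for f in frames])
    out = stack.mean(axis=0)
    out[subject_mask] = stack[sharp_index][subject_mask]
    return out
```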

Infrared imaging system and method of operating

A lens unit (120) exhibits longitudinal chromatic aberration and focuses an imaged scene into a first image for the infrared range in a first focal plane and into a second image for the visible range in a second focal plane. An optical element (150) manipulates the modulation transfer function assigned to the first and second images to extend the depth of field. An image processing unit (200) may amplify a modulation transfer function contrast in the first and second images. A focal shift between the focal planes may be compensated for. In conventional approaches, RGBIR sensors that contemporaneously provide both a conventional and an infrared image of the same scene deliver a severely out-of-focus infrared image; the present approach provides extended-depth-of-field imaging to rectify this out-of-focus blur for infrared radiation. An imaging system can be realized without any apochromatic lens.
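Amplifying modulation-transfer-function contrast can be roughly illustrated with unsharp masking, which boosts the difference between an image and its local average; this is a stand-in chosen for illustration, not the patent's specific processing.

```python
import numpy as np

def amplify_mtf_contrast(image, strength=1.0, window=3):
    # Unsharp masking along rows: subtract a local moving average and add the
    # scaled difference back, which sharpens edges softened by defocus.
    kernel = np.ones(window) / window
    pad = window // 2
    blurred = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode="edge"), kernel, "valid"),
        1, image.astype(float))
    return image + strength * (image - blurred)
```

At a step edge the output overshoots on both sides, i.e. the local contrast (and thus the effective MTF response at those frequencies) is amplified.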

Image capture device with restoration processing and image restoration processing method

A first restoration processing section 110 and a second restoration processing section 120 perform restoration processing on images (luminance data Y), which are successively captured by an image capture section, using a first filter 102, which is a restoration filter generated corresponding to a point spread function of an optical system, and a second filter 104 whose restoration strength is weaker than that of the first filter 102. Depending on a result of determination, which is input from an in-focus determination section 150 and indicates whether or not the image at the current time point is in a target in-focus state, a selection section 122 selects and outputs either luminance data Y_A, which is processed using the first filter 102, or luminance data Y_B, which is processed using the second filter 104.
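The two-path restoration with in-focus-driven selection can be sketched as follows; the filters are passed in as plain callables, since the abstract does not give their coefficients:

```python
import numpy as np

def restore(luminance, first_filter, second_filter, in_focus):
    # Both restoration paths run: Y_A through the strong, PSF-derived filter
    # and Y_B through the weaker one. The selector outputs Y_A when the frame
    # is in the target in-focus state, else Y_B.
    y_a = first_filter(luminance)
    y_b = second_filter(luminance)
    return y_a if in_focus else y_b
```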

Color reconstruction

In one embodiment, coloring artifacts of a color image output by a camera are minimized by taking into account a distortion introduced by the lens. Based on the distortion, the color reconstruction determines which pixels in the grayscale image to include in the reconstruction process. Additionally, the color reconstruction can take into account edges depicted in the grayscale image to determine which pixels to include in the reconstruction process. In another embodiment, coloring artifacts in a 360 degree color image are minimized by performing the color reconstruction process on a three-dimensional surface. Before the color reconstruction takes place, the two-dimensional grayscale image is projected onto a three-dimensional surface, and the color reconstruction is performed on the three-dimensional surface. The color reconstruction on the three-dimensional surface can take into account the distortion produced by the lens and/or can take into account the edges depicted in the two-dimensional and three-dimensional grayscale images.
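The idea of choosing which grayscale pixels participate in the reconstruction — excluding, say, pixels across an edge or outside the distortion-corrected neighbourhood — can be sketched as an inclusion-mask average; building the mask from the distortion model and edge map is left to the caller in this illustration, and the function name is hypothetical.

```python
import numpy as np

def reconstruct_pixel(gray, y, x, include_mask):
    # Average only the 4-neighbours that the inclusion mask admits; the mask
    # encodes which pixels the distortion model and edge map allow into the
    # reconstruction (supplied by the caller in this sketch).
    vals = []
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < gray.shape[0] and 0 <= nx < gray.shape[1] and include_mask[ny, nx]:
            vals.append(float(gray[ny, nx]))
    return sum(vals) / len(vals) if vals else float(gray[y, x])
```

Excluding a neighbour that lies across an edge keeps its (unrelated) intensity from bleeding into the reconstructed value, which is exactly the coloring artifact being minimized.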