H04N23/74

IMAGING APPARATUS

The illumination device has a plurality of light-emitting pixels that are individually on/off controllable, and emits reference light having a random intensity distribution. The photodetector detects light reflected from an object. The processing device reconstructs an image of the object by calculating a correlation between a detection intensity b, based on an output of the photodetector, and the intensity distribution I of the reference light. The plurality of light-emitting pixels are divided into m (m≥2) areas, each containing n (n≥2) adjoining light-emitting pixels. By selecting one light-emitting pixel from each of the m areas without overlap, n light-emitting pixel groups are determined. The imaging apparatus carries out sensing for every light-emitting pixel group.
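The correlation step described above is the standard ghost-imaging reconstruction: each pixel of the image is the covariance between the bucket signal b and that pixel's illumination intensity across many random patterns. A minimal sketch under assumed toy data (the grid size, pattern count, and object are illustrative, not from the patent):

```python
# Correlation-based (ghost imaging) reconstruction sketch with random
# binary illumination patterns; all sizes and names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
H = W = 16                                       # light-emitting pixel grid
obj = np.zeros((H, W))
obj[4:12, 6:10] = 1.0                            # toy object reflectivity

R = 4000                                         # number of sensing shots
patterns = rng.integers(0, 2, size=(R, H, W)).astype(float)   # I_r(x)
b = (patterns * obj).sum(axis=(1, 2))            # detection intensity b_r

# G(x) = <b_r I_r(x)> - <b_r><I_r(x)>  (covariance per pixel)
G = np.tensordot(b - b.mean(),
                 patterns - patterns.mean(axis=0), axes=1) / R
```

Pixels inside the object accumulate positive covariance with b, so G recovers the object's shape as the shot count R grows.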

Position detection method, position detection device, and display device
11700365 · 2023-07-11

Position detection methods and systems are disclosed herein. The position detection method of detecting a position on an operation surface pointed at by a pointing element includes obtaining a first taken image with a first infrared camera, obtaining a second taken image with a second infrared camera, removing a noise component from the first and second taken images to convert them into converted images without the noise component, forming a difference image between the first converted taken image and the second converted taken image, extracting a candidate area in which a disparity amount between the first converted taken image and the second converted taken image is within a predetermined range, detecting a tip position of the pointing element from the candidate area, and determining, based on the detecting, a pointing position of the pointing element and whether or not the pointing element is in contact with the operation surface.
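The candidate-area step can be pictured as testing each horizontal shift in the allowed disparity range and keeping foreground pixels where the two converted images agree. A rough sketch, assuming single-channel images and illustrative threshold values (the function name and parameters are hypothetical, not from the patent):

```python
# Sketch of disparity-range candidate extraction between two converted
# infrared images; thresholds fg and tol are assumed values.
import numpy as np

def candidate_area(img1, img2, d_min, d_max, fg=50, tol=10):
    """Mark foreground pixels whose disparity falls within [d_min, d_max]."""
    mask = np.zeros(img1.shape, dtype=bool)
    for d in range(d_min, d_max + 1):
        shifted = np.roll(img2, d, axis=1)       # trial disparity d
        agree = np.abs(img1.astype(int) - shifted.astype(int)) <= tol
        mask |= agree & (img1 >= fg)             # keep foreground matches only
    return mask
```

A production version would pad rather than wrap at the image border (`np.roll` wraps) and would then search the candidate area for the pointing element's tip.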

Methods for collecting and processing image information to produce digital assets
11699243 · 2023-07-11

Paired images of substantially the same scene are captured with the same freestanding sensor. The paired images include reflected light illuminated with controlled polarization states that are different between the paired images. Information from the images is applied to a convolutional neural network (CNN) configured to derive a spatially varying bi-directional reflectance distribution function (SVBRDF) for objects in the paired images. Alternatively, the sensor is fixed and oriented to capture images of an object of interest in the scene while a light source traverses a path that intersects the sensor's field of view. Information from the paired images of the scene and from the images captured of the object of interest when the light source traverses the field of view is applied to a CNN to derive an SVBRDF for the object of interest. The image information and the SVBRDF are used to render a representation under artificial lighting conditions.

Systems and methods for ground truth generation using single photon avalanche diodes

A system for single photon avalanche diode (SPAD) image capture is configurable to, over a frame capture time period, selectively activate an illuminator to alternately emit light from the illuminator and refrain from emitting light from the illuminator. The system is configurable to, over the frame capture time period, perform a plurality of sequential shutter operations to configure each SPAD pixel of the SPAD array to enable photon detection. The plurality of sequential shutter operations generates, for each SPAD pixel of the SPAD array, a plurality of binary counts indicating whether a photon was detected during each of the plurality of sequential shutter operations. The system is configurable to, based on a first set of binary counts of the plurality of binary counts, generate an ambient light image. The system is configurable to, based on a second set of binary counts of the plurality of binary counts, generate an illuminated image.
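The split into ambient and illuminated images follows from the interleaving: binary counts captured while the illuminator is off accumulate into one image, those captured while it is on into the other. A toy sketch, assuming the illuminator toggles every shutter operation and using made-up detection probabilities:

```python
# Sketch of splitting interleaved SPAD binary counts into ambient and
# illuminated images; even-indexed shutter operations are assumed to be
# illuminator-off, odd-indexed illuminator-on. All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
H = W = 8                             # SPAD array size
N = 200                               # sequential shutter operations
p_ambient, p_active = 0.05, 0.30      # per-operation detection probabilities

counts = np.empty((N, H, W), dtype=np.uint8)
for k in range(N):
    p = p_ambient if k % 2 == 0 else p_ambient + p_active
    counts[k] = rng.random((H, W)) < p          # 1 = photon detected

ambient_img = counts[0::2].sum(axis=0)          # illuminator-off subset
illum_img   = counts[1::2].sum(axis=0)          # illuminator-on subset
signal_img  = illum_img.astype(int) - ambient_img.astype(int)  # active light
```

Subtracting the ambient image from the illuminated one isolates the illuminator's contribution, which is the basis for ground-truth pairs of ambient-only and actively lit frames.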

Medical control device and medical observation system
11700456 · 2023-07-11

A medical control device includes: a driving mode switch configured to switch a driving mode of a rolling shutter type image sensor in which a plurality of pixels are two-dimensionally arrayed in units of horizontal lines; and a dimming controller configured to control a light-emitting element that emits light according to a supplied current, and to switch control of the light-emitting element according to the driving mode.

Tire sensing and analysis system

The tire sensing and analysis system may comprise a measurement device and local application software. The measurement device may make contact with a tire of a vehicle such that the measurement device is positioned at a specific distance and orientation relative to the tire. The measurement device may capture multiple images of the tire using an RGB camera and a pair of infrared cameras. The local application software may analyze the images and may construct a 3D mesh describing the 3-dimensional contours of the tread. The local application software may determine a tread depth and may display status and warning messages on a display unit that is coupled to the measurement device. The measurements may be communicated to remote application software for additional analysis. As non-limiting examples, the remote application software may detect specific tire wear patterns and may transmit a report to share results of the analysis.
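One way to picture the tread-depth measurement is as the spread between rib tops and groove bottoms in the reconstructed surface. A hedged sketch over a height map sampled from the 3D mesh; the percentile thresholds and function name are assumptions for illustration, not the patent's method:

```python
# Illustrative tread-depth estimate from a height map (in mm) sampled
# from the reconstructed 3D mesh; percentile cutoffs are assumed values.
import numpy as np

def tread_depth_mm(height_map_mm):
    """Estimate tread depth as rib-top height minus groove-floor height."""
    top = np.percentile(height_map_mm, 90)       # tread block (rib) surface
    bottom = np.percentile(height_map_mm, 5)     # groove floor
    return top - bottom
```

Percentiles rather than min/max make the estimate robust to mesh noise and embedded debris; a real system would also measure depth per groove to detect uneven wear patterns.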

DEPTH ACQUISITION DEVICE AND DEPTH ACQUISITION METHOD

A depth acquisition device includes a memory and a processor. The processor performs: acquiring timing information indicating a timing at which a light source irradiates a subject with infrared light; acquiring, from the memory, an infrared light image generated by imaging a scene including the subject with the infrared light according to the timing indicated by the timing information; acquiring, from the memory, a visible light image generated by imaging a substantially same scene as the scene of the infrared light image, with visible light from a substantially same viewpoint as a viewpoint of imaging the infrared light image at a substantially same time as a time of imaging the infrared light image; detecting a flare region from the infrared light image; and estimating a depth of the flare region based on the infrared light image, the visible light image, and the flare region.

EYE-TRACKING FUNDUS IMAGING SYSTEM
20230210367 · 2023-07-06

A head mounted display includes a display layer, an array of light sources, a first optical combiner, and a second optical combiner. The array of light sources are configured to be selectively enabled to emit non-visible light to illuminate a fundus of an eye. The first optical combiner is configured to receive reflected non-visible light that is reflected by the eye, direct a first component of the reflected non-visible light to a first camera to generate an image of the eye, and pass a second component of the reflected non-visible light. The second optical combiner is configured to receive a fundus imaging light responsive to the second component of the reflected non-visible light, and to direct the fundus imaging light to a second camera to generate an image of the fundus.

Systems and Methods to Optimize Imaging Settings for a Machine Vision Job

Methods and systems for optimizing one or more imaging settings for a machine vision job are disclosed herein. An example method includes detecting, by one or more processors, an initiation trigger that initiates the machine vision job. The example method further includes, responsive to detecting the initiation trigger, capturing, by an imaging device, a first image of a target object in accordance with a first configuration of the one or more imaging settings. The example method further includes, responsive to capturing the first image of the target object, automatically adjusting, by the one or more processors, the one or more imaging settings to a second configuration that includes at least one different imaging setting from the first configuration; and capturing, by the imaging device, a second image of the target object in accordance with the second configuration of the one or more imaging settings.
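The trigger → capture → adjust → capture flow above can be sketched with a stand-in device object; the class names, setting fields, and the doubling adjustment below are all hypothetical, chosen only to show two configurations differing in at least one imaging setting:

```python
# Minimal sketch of the two-configuration capture flow; ImagingDevice and
# ImagingSettings are hypothetical stand-ins, not a real camera API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ImagingSettings:
    exposure_us: int = 1000
    gain_db: float = 0.0

class ImagingDevice:
    def capture(self, settings: ImagingSettings) -> dict:
        # Stand-in for a real frame grab; records the settings used.
        return {"settings": settings}

def run_job(device: ImagingDevice, first: ImagingSettings):
    # Second configuration differs in at least one imaging setting.
    second = replace(first, exposure_us=first.exposure_us * 2)
    img1 = device.capture(first)     # capture after the initiation trigger
    img2 = device.capture(second)    # capture after automatic adjustment
    return img1, img2
```

A real optimizer would score each captured image (e.g. barcode decodability or contrast) and iterate until the best-scoring configuration is found.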

SYSTEMS AND METHODS FOR GENERATING METAMERICALLY MATCHED ILLUMINATION
20230217118 · 2023-07-06

Methods and systems are provided to generate metamerically matched illumination on a surgical site. The metamerically matched illumination may reduce or remove strobing effects associated with illumination devices during a surgery or surgical procedure. By metamerically matching illuminants in the illumination in combination with estimates of the human visual system, the illumination of the surgical site can visually appear to a user as a continuous white light, while maintaining distinct underlying spectra in the illumination. The light reflected by the surgical object may also be captured and metamerically matched, such that the reflected light appears as a single continuous color.
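The core idea, distinct spectra that the visual system cannot tell apart, can be demonstrated with linear algebra: any spectrum shifted by a vector in the null space of the observer's sensitivity matrix yields an identical tristimulus response. A sketch under stand-in data (the 3×N observer matrix below is random, not real CIE color-matching functions):

```python
# Sketch of constructing a metameric pair: two different spectra with
# identical responses under a 3-channel observer. M is stand-in data.
import numpy as np

rng = np.random.default_rng(2)
n_bands = 31                          # e.g. 400-700 nm in 10 nm steps
M = rng.random((3, n_bands))          # stand-in observer sensitivities
s1 = rng.random(n_bands)              # first illuminant spectrum

# Vectors in the null space of M change the spectrum without changing
# the observer response; take one from the SVD's trailing right vectors.
_, _, Vt = np.linalg.svd(M)
null_vec = Vt[-1]                     # lies in M's null space (rank 3 < 31)
s2 = s1 + 0.1 * null_vec              # metamer of s1: M @ s1 == M @ s2
```

A practical illuminant design would additionally constrain both spectra to be nonnegative and physically realizable by the light engine, while alternating them fast enough that the matched white appears continuous.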