H04N23/56

Multi-Baseline Camera Array System Architectures for Depth Augmentation in VR/AR Applications

Embodiments of the invention provide a camera array imaging architecture that computes depth maps for objects within a scene captured by the cameras, using a near-field sub-array of cameras to compute depth to near-field objects and a far-field sub-array of cameras to compute depth to far-field objects. In particular, a baseline distance between cameras in the near-field sub-array is less than a baseline distance between cameras in the far-field sub-array in order to increase the accuracy of the depth map. Some embodiments provide a near-IR illumination light source for use in computing depth maps.
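
The baseline trade-off described above follows from pinhole stereo geometry: depth is z = f·B/d, so for a fixed one-pixel disparity error the depth uncertainty grows roughly as z²/(f·B), which is why a wider baseline suits far-field objects. A minimal sketch (function names and the example focal length/baselines are illustrative, not from the patent):

```python
def disparity_to_depth(disparity_px, baseline_m, focal_px):
    """Pinhole stereo model: depth z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(depth_m, baseline_m, focal_px, disp_err_px=1.0):
    """Approximate depth uncertainty for a given disparity error:
    dz ~ z^2 * dd / (f * B)."""
    return depth_m ** 2 * disp_err_px / (focal_px * baseline_m)

# With f = 1000 px, a 2 cm (near-field) vs 10 cm (far-field) baseline:
near = depth_error(5.0, 0.02, 1000)   # ~1.25 m error at 5 m range
far = depth_error(5.0, 0.10, 1000)    # ~0.25 m error at 5 m range
```

The wider baseline cuts the far-range error fivefold in this example, at the cost of a larger minimum overlap distance, which motivates the dual sub-array design.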

EVENT-BASED COMPUTATIONAL PIXEL IMAGERS

A computational pixel imaging device that includes an array of pixel integrated circuits for event-based detection and imaging. Each pixel may include a digital counter that accumulates a digital number, which indicates whether a change is detected by the pixel. The counter may count in one direction for a portion of an exposure and count in an opposite direction for another portion of the exposure. The imaging device may be configured to collect and transmit key frames at a lower rate, and collect and transmit delta or event frames at a higher rate. The key frames may include a full image of a scene, captured by the pixel array. The delta frames may include sparse data, captured by pixels that have detected meaningful changes in received light intensity. High-speed, low-transmission-bandwidth motion video can then be reconstructed from the key frames and the delta frames.
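
The key-frame-plus-delta reconstruction in the last sentence can be sketched as follows; the sparse-delta representation (a dict of changed pixel coordinates to new values) is an assumption for illustration, not the patent's wire format:

```python
import numpy as np

def reconstruct(key_frame, delta_frames):
    """Rebuild a high-rate video sequence from one low-rate key frame
    plus a series of sparse delta frames. Each delta frame is a dict
    mapping (row, col) -> new pixel value for pixels that changed."""
    current = key_frame.copy()
    frames = [current.copy()]
    for delta in delta_frames:
        for (r, c), value in delta.items():
            current[r, c] = value  # apply only the changed pixels
        frames.append(current.copy())
    return frames

# Two delta frames touching one pixel each reconstruct three frames
# while transmitting far less data than three full images.
key = np.zeros((2, 2), dtype=int)
video = reconstruct(key, [{(0, 0): 5}, {(1, 1): 3}])
```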

ACTIVATING LIGHT SOURCES FOR OUTPUT IMAGE

In some examples, a computing device can include a processor resource and a non-transitory memory resource storing machine-readable instructions that, when executed, cause the processor resource to: instruct an imaging device to capture an input image, determine image properties of the input image, activate a portion of a plurality of light sources based on a physical location of the plurality of light sources and the determined image properties of the input image, and instruct the imaging device to capture an output image when the portion of the plurality of light sources is activated.
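
One plausible reading of "activate a portion of the light sources based on physical location and image properties" is to switch on only the lights nearest to underexposed regions of the input image. A minimal sketch, assuming light positions are already mapped into image coordinates (that mapping, the function name, and the brightness threshold are illustrative assumptions):

```python
import numpy as np

def select_light_sources(image, light_positions, threshold=80):
    """Return indices of lights whose mapped image neighbourhood is
    underexposed (mean intensity below threshold).
    image: 2-D grayscale array; light_positions: (row, col) per light."""
    selected = []
    for i, (r, c) in enumerate(light_positions):
        # Sample a small neighbourhood around the light's mapped position.
        r0, r1 = max(r - 1, 0), r + 2
        c0, c1 = max(c - 1, 0), c + 2
        if image[r0:r1, c0:c1].mean() < threshold:
            selected.append(i)
    return selected

# A bright scene with one dark corner activates only the light near it.
scene = np.full((4, 4), 200)
scene[0:2, 0:2] = 10
active = select_light_sources(scene, [(0, 0), (3, 3)])
```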

Monitoring Device
20230236482 · 2023-07-27

A monitoring device may comprise: a base; a body rotatably connected to the base; a camera installed on the body; a lighting assembly rotatably connected to the body and having a first limit position and a second limit position on a path of rotation relative to the body. A light-emitting surface of the lighting assembly in the first limit position faces a shooting direction of the camera, and the light-emitting surface of the lighting assembly in the second limit position faces away from the shooting direction of the camera.

HANDHELD ELECTRONIC DEVICE

A portable electronic device may include an enclosure including a front cover defining a front exterior surface of the portable electronic device and a rear cover defining a rear exterior surface of the portable electronic device. The portable electronic device may further include a rear-facing camera and a rear-facing flash including a light emitting component defining a plurality of illuminable regions. The light emitting component may be configured to illuminate a first subset of the plurality of illuminable regions to illuminate a first field of view and illuminate a second subset of the plurality of illuminable regions, the second subset different from the first subset, to illuminate a second field of view different from the first field of view.
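
The subset-of-regions idea above can be sketched by selecting only the illuminable regions whose beam directions fall inside the camera's current field of view; the per-region beam angles and the function name here are hypothetical, not from the patent:

```python
def regions_for_fov(region_angles_deg, fov_deg):
    """Pick the subset of illuminable regions whose (hypothetical) beam
    angles, measured from the optical axis, lie within the requested
    field of view. A narrow FOV (telephoto) lights fewer regions than
    a wide FOV (wide-angle)."""
    return [i for i, a in enumerate(region_angles_deg)
            if abs(a) <= fov_deg / 2]

# Five regions fanned from -30 to +30 degrees: a 25-degree FOV uses
# only the central regions, a 70-degree FOV uses all of them.
narrow = regions_for_fov([-30, -10, 0, 10, 30], 25)
wide = regions_for_fov([-30, -10, 0, 10, 30], 70)
```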

MEDICAL OBSERVATION SYSTEM, MEDICAL IMAGING DEVICE AND IMAGING METHOD
20230000330 · 2023-01-05

A medical observation system includes: a light source configured to emit, to body tissue, at least one of first narrow band light and second narrow band light; an imaging element that includes: a pixel portion including plural pixels arranged in a two-dimensional matrix; and a color filter including red filters, green filters, and blue filters provided on light receiving surfaces of the plural pixels, each of the light receiving surfaces including any one of the red, green, and blue filters; and a cut filter that is provided on a light receiving surface side of at least the pixels provided with the green filters, the cut filter being configured to shield light of a shorter wavelength band including the wavelength band of the second narrow band light, and to transmit therethrough the first narrow band light.

MEDICAL IMAGE PROCESSING DEVICE, MEDICAL IMAGING DEVICE, MEDICAL OBSERVATION SYSTEM, IMAGE PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM

A medical image processing device includes a processor configured to: obtain image data; generate, based on the obtained image data, a captured image including color component signals including a red component signal representing a red component, a green component signal representing a green component, and a blue component signal representing a blue component; calculate an intensity ratio between a fluorescent component signal and a reflected light component signal in a pixel of the captured image; determine, based on the calculated intensity ratio in the pixel of the captured image, a fluorescence region and a background region in the captured image; and generate a fluorescence image by performing, based on a result of the determination, image processing with different parameters for the color component signals of pixels positioned in the fluorescence region and the color component signals of pixels positioned in the background region.
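
The ratio-based segmentation and per-region processing described above can be sketched as follows; the threshold value, the simple per-region gains, and the function names are illustrative assumptions standing in for the patent's unspecified parameters:

```python
import numpy as np

def segment_fluorescence(fluor, reflect, ratio_thresh=1.5):
    """Classify each pixel as fluorescence (True) or background (False)
    from the intensity ratio between the fluorescent component signal
    and the reflected-light component signal."""
    ratio = fluor / np.maximum(reflect, 1e-6)  # avoid divide-by-zero
    return ratio > ratio_thresh

def apply_region_params(image, fluor_mask, fluor_gain=2.0, bg_gain=0.5):
    """Apply different processing parameters (here, simple gains) to
    the fluorescence region and the background region."""
    out = image.astype(float).copy()
    out[fluor_mask] *= fluor_gain
    out[~fluor_mask] *= bg_gain
    return out

# One strongly fluorescent pixel and one background pixel.
fluor = np.array([[10.0, 1.0]])
reflect = np.array([[2.0, 2.0]])
mask = segment_fluorescence(fluor, reflect)
result = apply_region_params(np.array([[4.0, 4.0]]), mask)
```

Boosting the fluorescence region while attenuating the background is one common way such per-region parameters improve the visibility of a fluorescent marker against tissue.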