H04N23/10

Image sensor with photoelectric conversion units arranged in different directions
11546536 · 2023-01-03

An imaging device includes: a first image sensor comprising first pixels that receive incident light and include first and second photoelectric conversion units arranged in a first direction; and a second image sensor comprising second pixels that receive light that has passed through the first image sensor and include third and fourth photoelectric conversion units arranged in a second direction that is different from the first direction.

Camera assembly, image acquisition method, and mobile terminal
11546531 · 2023-01-03

A camera assembly, an image acquisition method, and a mobile terminal are provided. The image acquisition method is applied to a mobile terminal including the camera assembly provided in this disclosure, and includes: acquiring first image data derived from an exposure of a filtering and photosensitive module in the camera assembly; acquiring second image data derived from an exposure of a photosensitive module in the camera assembly; and generating a target image from the first image data and the second image data.
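The fusion step above can be illustrated with a minimal sketch. This is not the patented method; the function name, the equal weights, and the use of grayscale 2-D lists are assumptions for illustration only.

```python
# Hedged sketch: blend two exposures of the same scene into one target image.
# fuse_exposures and the 50/50 weights are illustrative assumptions.

def fuse_exposures(first, second, w1=0.5, w2=0.5):
    """Blend two same-sized grayscale frames pixel by pixel.

    first/second: 2-D lists of pixel intensities (0-255).
    Returns the weighted combination, clamped to the valid range.
    """
    fused = []
    for row_a, row_b in zip(first, second):
        fused.append([min(255, max(0, round(w1 * a + w2 * b)))
                      for a, b in zip(row_a, row_b)])
    return fused

# Example: a bright exposure and a dark exposure of the same 2x2 scene.
bright = [[200, 220], [240, 255]]
dark = [[20, 40], [60, 80]]
print(fuse_exposures(bright, dark))
```

A real pipeline would weight per pixel (e.g., favoring the well-exposed frame in each region), but the per-pixel combination shown is the core of the "generate a target image from the first and second image data" step.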

Imaging Method for Non-Line-of-Sight Object and Electronic Device
20220417447 · 2022-12-29

Certain embodiments provide an imaging method for a non-line-of-sight object and an electronic device. In certain embodiments, the method includes: detecting a first input operation; and generating first image data in response to the first input operation. The first image data is imaging data of the non-line-of-sight object obtained by fusing second image data and third image data, and includes position information between the non-line-of-sight object and a line-of-sight object. The second image data is imaging data of the line-of-sight object captured by an optical camera. The third image data is imaging data of the non-line-of-sight object captured by an electromagnetic sensor.
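One simple way to read the fusion-with-position step is as an overlay: the electromagnetic-sensor data is written into the optical frame at the position where the hidden object sits relative to the visible scene. The sketch below is an illustrative assumption, not the method claimed in the application; all names and the overlay scheme are hypothetical.

```python
# Hedged sketch: fuse a line-of-sight optical frame with a non-line-of-sight
# electromagnetic patch by overlaying the patch at its estimated position.

def fuse_with_position(optical, em_patch, position):
    """Return a copy of `optical` (2-D list) with `em_patch` written in at
    (row, col) = position, so the fused image records where the hidden
    object sits relative to the visible scene."""
    fused = [row[:] for row in optical]  # do not mutate the input frame
    r0, c0 = position
    for dr, row in enumerate(em_patch):
        for dc, v in enumerate(row):
            fused[r0 + dr][c0 + dc] = v
    return fused

scene = [[0] * 4 for _ in range(3)]
print(fuse_with_position(scene, [[9, 9]], (1, 1)))
```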

Computational High-Speed Hyperspectral Infrared Camera System

A hyperspectral infrared imaging system includes optical components, one or more multi-color focal plane arrays, readout electronics, control electronics, and a computing system. The system measures a limited number of spatial and spectral points during image capture and computationally reconstructs the full dataset.

Stereoscopic visualization camera and platform

A stereoscopic imaging apparatus and platform are disclosed. An example stereoscopic imaging apparatus includes a main objective assembly and left and right lens sets defining respective parallel left and right optical paths for light received from a target surgical site through the main objective assembly. Each of the left and right lens sets includes a front lens, first and second zoom lenses configured to be movable along the optical path, and a lens barrel configured to receive the light from the second zoom lens. The example stereoscopic imaging apparatus also includes left and right image sensors configured to convert the light, after it passes through the lens barrel, into image data indicative of the received light. The example stereoscopic imaging apparatus further includes a processor configured to convert the image data into stereoscopic video signals or video data for display on a display monitor.

Compensating for Optical Change in Image Capture Device Components Over Time
20220398779 · 2022-12-15

Devices, methods, and non-transitory program storage devices (NPSDs) are disclosed to compensate for the predicted color changes experienced by camera modules after certain amounts of real-world use. Such color changes may be caused by prolonged exposure of optical components of the camera module to one or more of: solar radiation, high-temperature conditions, or high-humidity conditions, each of which may, over time, induce deviation in the color response of optical components of the camera module. The techniques disclosed herein may first characterize such predicted optical change to components over time based on particular environmental conditions, and then implement one or more time-varying color models to compensate for predicted changes to the camera module's color calibration values due to the characterized optical change. In some embodiments, optical changes in other types of components, e.g., display devices, caused by prolonged environmental stresses may also be modeled and compensated.
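A time-varying color model of the kind described might, at its simplest, scale factory white-balance gains by a drift term that grows with accumulated environmental stress. The sketch below is a toy assumption, not the disclosed model: the exponential form, the decay constant, and all names are hypothetical.

```python
# Hedged sketch: a toy time-varying color model. Assumes prolonged solar
# exposure yellows the optics (blue transmission drops exponentially), and
# boosts the blue white-balance gain to compensate.
import math

def compensated_gains(base_gains, hours_of_sun, decay_per_hour=1e-5):
    """base_gains: factory (r, g, b) white-balance gains.
    Returns gains adjusted for the predicted blue-transmission loss."""
    drift = math.exp(-decay_per_hour * hours_of_sun)  # predicted blue transmission
    r, g, b = base_gains
    # Raise the blue gain to offset the loss; green stays the reference.
    return (r, g, b / drift)

print(compensated_gains((1.9, 1.0, 1.6), hours_of_sun=5000))
```

The key property is that the correction is a function of elapsed stress, so calibration values can be updated over the device's life without re-measuring the module.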

Multi-spectrum-based image fusion apparatus and method, and image sensor

A multi-spectrum-based image fusion apparatus is disclosed, which includes a light acquisition device, an image processor, and an image sensor having five types of photosensitive channels: red, green, and blue (RGB) channels, an infrared (IR) channel, and a full-band (W) channel. The light acquisition device acquires target light corresponding to incident light. The image sensor converts the target light into an image signal through the RGB channels, the IR channel, and the W channel. The image processor resolves the image signal into RGB color signals and a brightness signal, and fuses the RGB color signals and the brightness signal to obtain a fused image. The collection of channels from which the RGB color signals and the brightness signal are obtained spans all five types of photosensitive channels.
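The core fusion step, combining color from the RGB channels with a separately sensed brightness signal, can be sketched per pixel: rescale the color triple so its luminance matches the brightness reading while keeping its chromaticity. This is an illustrative assumption, not the patented algorithm; the BT.601 luma weights and all names are choices made for the sketch.

```python
# Hedged sketch: fuse an (r, g, b) color signal with a W-channel brightness
# reading by rescaling the triple to the sensed luminance.

def fuse_pixel(rgb, w):
    """Rescale (r, g, b) so its luma matches the W-channel value `w`,
    preserving the hue/saturation carried by the color channels."""
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 luma weights
    if luma == 0:
        return (w, w, w)  # no color information: fall back to gray
    scale = w / luma
    return tuple(min(255, round(c * scale)) for c in rgb)

# A dim but colorful pixel brightened to the W-channel reading.
print(fuse_pixel((100, 50, 25), w=120))
```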

Imaging device, imaging system, vehicle running control system, and image processing device

An imaging device has an imager that includes first pixels having sensitivity to a first light and second pixels having sensitivity to a second light, a wavelength of the first light being different from a wavelength of the second light. The imager is configured to acquire first image data from the first pixels and second image data from the second pixels. Each of the first image data and the second image data includes an image of a code, the code being configured to output the second light. The imaging device further includes an image processor configured to extract an image of the code based on the first image data and the second image data.
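Since the code emits the second light, one natural reading of "extract based on the first and second image data" is a per-pixel difference: pixels much brighter in the second-light capture than in the first-light capture belong to the code. The sketch below is an assumption for illustration; the threshold and names are hypothetical.

```python
# Hedged sketch: isolate a code that emits only the second light by
# differencing the two captures and thresholding.

def extract_code(first_img, second_img, threshold=50):
    """Mark pixels much brighter in the second-light capture than in the
    first-light capture as 1 (part of the code), else 0."""
    return [[1 if b - a > threshold else 0
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(first_img, second_img)]

visible = [[10, 10, 10]]
infrared = [[10, 200, 12]]
print(extract_code(visible, infrared))  # only the middle pixel is flagged
```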

DEPTH SENSING VIA DEVICE CASE

Examples are disclosed that relate to displaying a hologram via an HMD. One disclosed example provides a method comprising obtaining depth data from a direct-measurement depth sensor included in a case for the HMD, the depth data comprising a depth map of a real-world environment. The method further comprises determining a distance from the HMD to an object in the real-world environment using the depth map, obtaining holographic imagery for display based at least upon the distance, and outputting the holographic imagery for display on the HMD.
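The "determine a distance using the depth map" step can be sketched by taking a robust statistic over the depth readings covering the object. This is an illustrative assumption, not the disclosed method; the median choice, bounding-box input, and names are hypothetical.

```python
# Hedged sketch: estimate the distance from the HMD to an object by taking
# the median depth inside the object's bounding box in the depth map.
from statistics import median

def object_distance(depth_map, box):
    """depth_map: 2-D list of depths in meters.
    box: (top, left, bottom, right) half-open bounds of the object.
    Returns the median depth, which resists stray background readings."""
    top, left, bottom, right = box
    samples = [depth_map[y][x]
               for y in range(top, bottom)
               for x in range(left, right)]
    return median(samples)

depth = [[2.0, 2.1, 5.0],
         [2.0, 2.2, 5.1],
         [5.2, 5.0, 5.1]]
print(object_distance(depth, (0, 0, 2, 2)))  # median of the 2x2 patch
```

The resulting distance would then drive hologram placement or scaling before output to the HMD.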

Virtual reality head-mounted apparatus

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for a virtual reality head-mounted apparatus are provided. One such apparatus includes an apparatus body provided with a convex lens and a camera, the camera being located on the user side of the convex lens with its lens facing an eye of the user to acquire eye pattern features of the user. The virtual reality head-mounted apparatus may acquire the eye pattern features without interrupting the display of virtual reality content, and may quickly and accurately verify the user's identity based on those features.