H04N23/13

Systems and methods for array camera focal plane control

Systems and methods for controlling the parameters of groups of focal planes, referred to as focal plane groups, in an array camera are described. One embodiment includes a plurality of focal planes, and control circuitry configured to control the capture of image data by the pixels within the focal planes. In addition, the control circuitry includes: a plurality of parameter registers, where a given parameter register is associated with one of the focal planes and contains configuration data for the associated focal plane; and a focal plane group register that contains data identifying the focal planes that belong to a focal plane group. Furthermore, the control circuitry is configured to control the imaging parameters of the focal planes in each focal plane group by mapping instructions that address virtual register addresses to the addresses of the parameter registers associated with the focal planes within that group.
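The virtual-register fan-out described above can be sketched as follows. This is a minimal illustration, assuming a software model of the registers; the class and attribute names (FocalPlaneController, param_regs, group_reg) are hypothetical and not taken from the patent.

```python
class FocalPlaneController:
    """Toy model of the focal plane group control scheme described above."""

    def __init__(self, num_focal_planes):
        # One parameter register (modeled as a dict of settings) per focal plane.
        self.param_regs = [{} for _ in range(num_focal_planes)]
        # Focal plane group register: group id -> member focal plane indices.
        self.group_reg = {}

    def define_group(self, group_id, members):
        self.group_reg[group_id] = list(members)

    def write_virtual(self, group_id, param, value):
        # A write addressed to a group's virtual register is fanned out to the
        # parameter registers of every focal plane in that group.
        for fp in self.group_reg[group_id]:
            self.param_regs[fp][param] = value


ctrl = FocalPlaneController(num_focal_planes=4)
ctrl.define_group("green", [1, 2])
ctrl.write_virtual("green", "exposure_us", 8000)
# Focal planes 1 and 2 now share the exposure setting; 0 and 3 are untouched.
```

A single instruction thus reconfigures every member of a group without addressing each parameter register individually.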

Array camera design with dedicated Bayer camera

The invention is directed to systems, methods and computer program products for capturing an image using an array camera. A method comprises determining an application associated with capturing an image using an array camera, wherein the array camera comprises a first sensor and at least one second sensor, wherein the first sensor comprises a red filter, a green filter, and a blue filter, and wherein each second sensor comprises a red filter, a green filter, or a blue filter; determining whether the application requires the image to have a first resolution equal to or greater than a predetermined resolution; determining whether the application requires depth information associated with the image; and in response to determining the application does not require the image to have the first resolution and does not require depth information, activating the first sensor, and capturing the image using the first sensor.
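The claimed decision flow can be summarized in a short sketch. The threshold resolution and the function and sensor names below are assumptions for illustration only.

```python
PREDETERMINED_RES = (1920, 1080)  # assumed threshold resolution

def select_sensors(app_resolution, needs_depth):
    """Return which sensors to activate for an application's capture request."""
    high_res = (app_resolution[0] >= PREDETERMINED_RES[0]
                and app_resolution[1] >= PREDETERMINED_RES[1])
    if not high_res and not needs_depth:
        # Low-resolution, no-depth use case: the dedicated Bayer
        # (RGB-filtered) first sensor alone suffices.
        return ["bayer"]
    # Otherwise engage the single-color second sensors as well.
    return ["bayer", "red", "green", "blue"]

print(select_sensors((640, 480), needs_depth=False))  # ['bayer']
```

The point of the scheme is power and bandwidth savings: the array's extra sensors stay off unless the application actually needs resolution or depth.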

COMBINED HDR/LDR VIDEO STREAMING
20220311907 · 2022-09-29

The invention provides methods for broadcasting video in a dual HDR/LDR format such that the video can be displayed in real time by both LDR and HDR display devices. Methods and devices of the invention process streams of pixels from multiple sensors in a frame-independent manner to produce an HDR video signal in real time. That HDR video signal is then tone-mapped to produce an LDR video signal, the LDR signal is subtracted from the HDR signal to calculate a residual signal, and the LDR signal and the residual signal are merged into a combined signal that is broadcast via a communications network.
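The per-frame residual scheme above can be sketched in a few lines. The tone-mapping operator (a simple clip-and-gamma here) is an assumption; the patent does not specify one.

```python
import numpy as np

def encode_frame(hdr):
    """Split one HDR frame into an LDR frame plus a residual, per the scheme above."""
    ldr = np.clip(hdr, 0.0, 1.0) ** (1 / 2.2)  # tone-map HDR -> LDR (assumed operator)
    residual = hdr - ldr                        # residual = HDR - LDR
    return ldr, residual                        # merged into one broadcast signal

def decode_hdr(ldr, residual):
    # HDR displays recover the full signal; LDR displays use `ldr` alone.
    return ldr + residual


hdr = np.random.rand(4, 4)
ldr, res = encode_frame(hdr)
assert np.allclose(decode_hdr(ldr, res), hdr)
```

Because the residual is defined as an exact difference, HDR reconstruction is lossless at this level of abstraction, while legacy LDR devices simply ignore the residual channel.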

Lens module array, image sensing device and fusing method for digital zoomed images

A lens module array to be assembled to a portable device is provided. The lens module array includes a wide-angle mono lens module, a narrow-angle mono lens module, and two color lens modules. When the wide-angle mono lens module, the narrow-angle mono lens module, and the two color lens modules are assembled onto the portable device, they are located at the four vertices of a quadrangle, respectively. In addition, the two color lens modules are located at two opposite vertices.

VEHICLE VISION SYSTEM

Vehicle vision systems are disclosed that produce seamless and uniform surround-view images of a vehicle using a number of Ultra Wide-Angle (UWA) lens cameras and, optionally, HUD systems. A distributive system architecture, in which individual cameras are capable of performing various image transformations, allows a flexible and resource-efficient image processing scheme.

Combined HDR/LDR Video Streaming
20170238029 · 2017-08-17

The invention provides methods for broadcasting video in a dual HDR/LDR format such that the video can be displayed in real time by both LDR and HDR display devices. Methods and devices of the invention process streams of pixels from multiple sensors in a frame-independent manner to produce an HDR video signal in real time. That HDR video signal is then tone-mapped to produce an LDR video signal, the LDR signal is subtracted from the HDR signal to calculate a residual signal, and the LDR signal and the residual signal are merged into a combined signal that is broadcast via a communications network.

SYSTEMS AND METHODS FOR HDR VIDEO CAPTURE WITH A MOBILE DEVICE
20170237913 · 2017-08-17

The invention relates to systems and methods for high dynamic range (HDR) image capture and video processing in mobile devices. Aspects of the invention include a mobile device, such as a smartphone or digital mobile camera, including at least two image sensors fixed in a co-planar arrangement to a substrate and an optical splitting system configured to reflect at least about 90% of incident light received through an aperture of the mobile device onto the co-planar image sensors, to thereby capture a HDR image. In some embodiments, greater than about 95% of the incident light received through the aperture of the device is reflected onto the image sensors.

OPEN VIEW, MULTI-MODAL, CALIBRATED DIGITAL LOUPE WITH DEPTH SENSING
20220038675 · 2022-02-03

A digital loupe system is provided which can include a number of features. In one embodiment, the digital loupe system can include a stereo camera pair and a distance sensor. The system can further include a processor configured to perform a transformation to image signals from the stereo camera pair based on a distance measurement from the distance sensor and from camera calibration information. In some examples, the system can use the depth information and the calibration information to correct for parallax between the cameras to provide a multi-channel image. Ergonomic head mounting systems are also provided. In some implementations, the head mounting systems can be configurable to support the weight of a digital loupe system, including placing one or two oculars in a line of sight with an eye of a user, while improving overall ergonomics, including peripheral vision, comfort, stability, and adjustability. Methods of use are also provided.
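The distance-based parallax correction described above follows from standard stereo geometry: with focal length f (in pixels), baseline b, and measured working distance Z, the expected disparity between the two camera views is d = f·b/Z. The sketch below illustrates this relation; all numbers and names are illustrative, not from the patent.

```python
def parallax_shift_px(focal_px, baseline_m, distance_m):
    """Expected disparity (pixels) between a stereo pair at a given distance."""
    return focal_px * baseline_m / distance_m


# Illustrative values: a 1400 px focal length, 6 cm baseline, 40 cm working distance.
d = parallax_shift_px(focal_px=1400.0, baseline_m=0.06, distance_m=0.4)
print(round(d, 1))  # 210.0
```

Each channel of the multi-channel image would then be translated by roughly half this disparity toward the midline before the views are combined, using the calibration data to map pixels accurately.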

IMAGE SENSOR AND THERMAL CAMERA DEVICE, SYSTEM AND METHOD

The present disclosure is directed to devices and methods for synchronizing the capturing of spectral images with the capturing of thermal images. A thermal imaging device of an aerial vehicle captures a sequence of thermal images. Capturing of spectral images by a spectral imaging device of the aerial vehicle is synchronized with the capturing of the thermal images. Irradiance data indicative of a background temperature is sensed. A digital surface model of an area of interest is generated based on the sequence of spectral images. An emissivity of a target is estimated, and a temperature of a pixel of the digital surface model of the target is estimated based on the sequence of thermal images, the irradiance data indicative of the background temperature, and the estimated emissivity of the target.
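The emissivity correction implied above can be illustrated with a grey-body radiometric model: the sensor sees a mix of target emission and reflected background, so the apparent temperature must be inverted for emissivity. The T⁴ (Stefan-Boltzmann) simplification below is an assumption for illustration; real thermal cameras use band-limited calibration curves.

```python
def estimate_temperature_k(measured_k, background_k, emissivity):
    """Invert the grey-body mix: measured^4 = e * target^4 + (1 - e) * background^4."""
    t4 = (measured_k**4 - (1.0 - emissivity) * background_k**4) / emissivity
    return t4 ** 0.25


# Illustrative values: apparent 300 K, 280 K background irradiance, emissivity 0.95.
t = estimate_temperature_k(measured_k=300.0, background_k=280.0, emissivity=0.95)
```

When the background is cooler than the target, the corrected temperature is slightly above the apparent one, because part of the measured signal was reflected background rather than target emission.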

OPTICAL SYSTEM WITH DYNAMIC DISTORTION USING FREEFORM ELEMENTS

A method for designing an optical system that provides reliable, robust and successful realization of a distortion variation function is presented. In a preferred embodiment, the proposed distortion variation optical system includes at least two non-symmetrical elements that move in the transverse direction. Each proposed freeform lens contains two transmissive refractive surfaces. The freeform elements designed with this method preferably have a flat surface and a non-symmetrical freeform surface. The two plano surfaces are preferably made to face each other, so that a miniature camera can be offered. The non-symmetrical freeform surfaces produce variable optical power when the two freeform elements undergo a relative transverse movement. Using this method, an optical system with active distortion, a smaller form factor, and better imaging quality can be obtained.
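The variable-power principle behind laterally moving freeform pairs can be illustrated with the classical Alvarez arrangement (assumed here purely for illustration; the patent's surfaces are more general). One element carries the cubic sag s(x, y) = A·(x·y² + x³/3) and its partner carries −s; shifting them by ±d along x leaves a net sag that is quadratic in x and y, i.e. lens-like, with power proportional to the shift d.

```python
A = 1e-3   # illustrative cubic surface coefficient
d = 0.5    # illustrative lateral shift

def sag(x, y, a):
    """Alvarez-type cubic sag of one freeform element."""
    return a * (x * y**2 + x**3 / 3.0)

def net_sag(x, y, a, shift):
    # Element 1 shifted by +shift, inverted element 2 shifted by -shift.
    return sag(x - shift, y, a) - sag(x + shift, y, a)

# Algebraically, net_sag = -2*A*d*(x**2 + y**2) - 2*A*d**3/3: a paraboloid
# (plus a constant piston term), whose curvature -- and hence optical power --
# scales linearly with the lateral displacement d.
```

Moving the pair thus tunes optical power continuously without any axial motion, which is what enables the compact form factor claimed above.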