Patent classifications
H04N23/16
Systems and methods for enhanced infrared imaging
An infrared imaging system combines a low-resolution infrared camera with a high-resolution visible-light camera. Information extracted from images taken using the visible-light camera is combined with the low-resolution infrared images to produce an infrared image with enhanced spatial details. The process of extracting the information from the visible image adjusts the quantization level of the visible-light image to scale visible objects to match objects identified in the infrared image.
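A minimal sketch of this kind of guided enhancement (the `box_blur` detail-extraction step, the `scale` factor, and the `alpha` blend weight are illustrative assumptions, not the patented method): upsample the low-resolution IR frame, then inject high-frequency detail extracted from an aligned high-resolution visible-light frame.

```python
import numpy as np

def box_blur(img, k=3):
    """Simple box blur (k must be odd); used here to isolate low frequencies."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def enhance_ir(ir_lowres, visible, scale=4, alpha=0.5):
    """Upsample a low-res IR image and inject high-frequency spatial detail
    from an aligned high-res visible-light image (illustrative sketch)."""
    ir_up = np.repeat(np.repeat(ir_lowres, scale, axis=0), scale, axis=1).astype(float)
    vis = visible.astype(float)
    detail = vis - box_blur(vis, k=2 * scale + 1)  # high-frequency component
    return ir_up + alpha * detail
```

A flat visible image contributes no detail, so the output is just the upsampled IR frame.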
LIGHT SOURCE APPARATUS AND CONTROL METHOD THEREOF
A light source apparatus includes a light source unit and a control unit configured to perform lighting control. In accordance with the lighting control, the light source unit emits first color light at a first timing and emits second color light at a second timing delayed from the first timing by a predetermined period. The control unit performs lighting control a plurality of times based on the predetermined period, so that at least part of the period in which the light source unit emits the second color light in accordance with a first lighting control overlaps with at least part of the period in which the light source unit emits the first color light in accordance with a second lighting control performed after the first lighting control.
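The timing relationship can be sketched numerically (the interval representation and parameter names are illustrative assumptions): when `delay + duration` exceeds the repetition `period`, the second-color interval of one lighting control overlaps the first-color interval of the next.

```python
def lighting_schedule(n_cycles, period, delay, duration):
    """Return (start, end) emission intervals for the first and second color
    light over n_cycles lighting controls; the second color starts `delay`
    time units after the first within each control (toy model)."""
    first, second = [], []
    for i in range(n_cycles):
        t0 = i * period
        first.append((t0, t0 + duration))
        second.append((t0 + delay, t0 + delay + duration))
    return first, second

def overlaps(a, b):
    """True when half-open intervals a and b share any time."""
    return a[0] < b[1] and b[0] < a[1]
```

With `period=10`, `delay=6`, `duration=5`, the second-color interval of cycle 0 is (6, 11) and the first-color interval of cycle 1 is (10, 15): they overlap as the abstract describes.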
Systems and Methods for Estimating Depth and Visibility from a Reference Viewpoint for Pixels in a Set of Images Captured from Different Viewpoints
Systems in accordance with embodiments of the invention can perform parallax detection and correction in images captured using array cameras. Due to the different viewpoints of the cameras, parallax results in variations in the position of objects within the captured images of the scene. Methods in accordance with embodiments of the invention provide an accurate account of the pixel disparity due to parallax between the different cameras in the array, so that appropriate scene-dependent geometric shifts can be applied to the pixels of the captured images when performing super-resolution processing. In a number of embodiments, generating depth estimates considers the similarity of pixels in multiple spectral channels. In certain embodiments, generating depth estimates involves generating a confidence map indicating the reliability of depth estimates.
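As a toy illustration of the disparity search and the confidence map (brute-force SAD matching over a single horizontal baseline; the real system handles a full camera array and multiple spectral channels, so every name and parameter here is an assumption):

```python
import numpy as np

def estimate_disparity(ref, other, max_disp, patch=3):
    """For each pixel of the reference view, find the horizontal shift that
    minimizes sum-of-absolute-differences against the other view, and report
    a confidence value from the margin between the two lowest costs."""
    h, w = ref.shape
    pad = patch // 2
    disp = np.zeros((h, w), dtype=int)
    conf = np.zeros((h, w))
    rpad = np.pad(ref.astype(float), pad, mode="edge")
    opad = np.pad(other.astype(float), pad, mode="edge")
    for y in range(h):
        for x in range(w):
            rblk = rpad[y:y + patch, x:x + patch]
            costs = []
            for d in range(max_disp + 1):
                xs = max(x - d, 0)  # clamp at the image border
                oblk = opad[y:y + patch, xs:xs + patch]
                costs.append(np.abs(rblk - oblk).sum())
            costs = np.array(costs)
            disp[y, x] = costs.argmin()
            srt = np.sort(costs)
            conf[y, x] = srt[1] - srt[0]  # cost margin as reliability
    return disp, conf
```

On a textured image shifted by a known amount, interior pixels recover that shift with positive confidence; low-texture regions would yield near-zero margins, which is exactly what a confidence map flags.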
LENS DEVICE, CAMERA SYSTEM, AND ABERRATION CORRECTION UNIT
First and second optical elements are held so as to be rotatable relative to each other about an optical axis. When the second optical element is positioned at a first position with respect to the first optical element, the aberrations generated by the two elements combine to produce an aberration that cancels the aberration caused by a color separation prism. When the second optical element is positioned at a second position with respect to the first optical element, the aberration generated by the first optical element is cancelled by the aberration generated by the second optical element. The second optical element is set to the first position when the lens device is to be used in a 3-CCD type first camera device, and to the second position when the lens device is to be used in a single-CCD type second camera device.
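A toy model of the cancellation (representing each element's aberration contribution as a complex coefficient whose phase tracks the element's rotation is an illustrative assumption, not the patent's formulation):

```python
import cmath

def combined_aberration(a1, a2, theta):
    """Sum of two elements' aberration contributions when the second element
    is rotated by theta about the optical axis (toy model: complex magnitude
    = aberration amount, phase = orientation)."""
    return a1 + a2 * cmath.exp(1j * theta)
```

At one relative rotation the two contributions add, and can be sized so their sum cancels the prism's aberration (3-CCD case); at another rotation they cancel each other, leaving no residual aberration for a single-CCD camera.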
Imaging system having dual image sensors
An imaging system for capturing an image of an object comprises a first lens; a dichroic beam splitter, which transmits light of a color band and reflects light of all colors outside the color band; a first image sensor for capturing an image formed by the transmitted light in the color band; and a second image sensor for capturing an image formed by the reflected light outside the color band. The first image sensor is a monochrome image sensor and the second image sensor is a color image sensor having a color filter array disposed on pixels of the second image sensor. The image captured by the first image sensor and the image captured by the second image sensor are combined to form a single color image.
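One plausible fusion rule (an illustrative sketch, not the patented combination): treat the monochrome sensor's signal as the luminance reference and rescale the color image's channels to match it, preserving the color image's chromatic ratios.

```python
import numpy as np

def fuse_dual_sensor(mono, color):
    """Fuse a monochrome image with a demosaicked color image: keep the
    chromatic ratios of the color image but replace its luminance with
    the monochrome signal (illustrative sketch)."""
    eps = 1e-6
    luma = color.mean(axis=2)          # crude luminance of the color image
    ratio = mono / (luma + eps)        # per-pixel luminance correction
    return np.clip(color * ratio[..., None], 0.0, 1.0)
```

When the monochrome signal already equals the color image's luminance, the fusion is an identity, which is a quick sanity check on the rule.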
DEVICE AND METHOD FOR OBSERVING A BIOLOGICAL PROBE
The invention relates to a device for observing a biological probe. The device comprises an optical microscope, a beam splitting device and a plurality of cameras. The optical microscope comprises a support structure for supporting the biological probe in a beam path of the optical microscope. The beam splitting device is arranged in the beam path downstream from the biological probe, wherein the beam splitting device is configured to split the beam path into a plurality of beam paths. Each camera is arranged in one beam path of the plurality of beam paths and is configured to generate camera images of the biological probe. For at least some of the cameras, focal lengths of the cameras differ from one another and/or wavelength ranges captured by the cameras for generating the camera images of the biological probe differ from one another and/or sensor types of the cameras differ from one another.
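The configuration constraint in the last sentence can be stated as a small check (the `Camera` record and its field names are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Camera:
    """One camera on one split beam path (illustrative fields)."""
    focal_length_mm: float
    wavelength_range_nm: tuple
    sensor_type: str

def valid_camera_set(cameras):
    """Check the claimed constraint: for at least some of the cameras, focal
    length and/or captured wavelength range and/or sensor type differ."""
    return (len({c.focal_length_mm for c in cameras}) > 1
            or len({c.wavelength_range_nm for c in cameras}) > 1
            or len({c.sensor_type for c in cameras}) > 1)
```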
Capturing and Processing of Images Including Occlusions Focused on an Image Sensor by a Lens Stack Array
Systems and methods for implementing array cameras configured to perform super-resolution processing to generate higher resolution super-resolved images using a plurality of captured images and lens stack arrays that can be utilized in array cameras are disclosed. An imaging device in accordance with one embodiment of the invention includes at least one imager array, and each imager in the array comprises a plurality of light sensing elements and a lens stack including at least one lens surface, where the lens stack is configured to form an image on the light sensing elements, control circuitry configured to capture images formed on the light sensing elements of each of the imagers, and a super-resolution processing module configured to generate at least one higher resolution super-resolved image using a plurality of the captured images.
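A naive shift-and-add sketch of the super-resolution step (this assumes the inter-image shifts are known exactly and fall on the fine grid; the patented processing also handles parallax, occlusions, and sub-pixel registration):

```python
import numpy as np

def shift_and_add(lowres_images, shifts, scale):
    """Place each low-res capture onto a finer grid at its known shift
    (dy, dx in fine-grid pixels, 0 <= dy, dx < scale) and average
    overlapping samples to form a super-resolved image."""
    h, w = lowres_images[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for img, (dy, dx) in zip(lowres_images, shifts):
        acc[dy::scale, dx::scale][:h, :w] += img
        cnt[dy::scale, dx::scale][:h, :w] += 1
    cnt[cnt == 0] = 1  # leave unsampled fine-grid pixels at zero
    return acc / cnt
```

If the captures are exact decimations of one high-resolution scene at all phase offsets, shift-and-add reconstructs that scene exactly, which illustrates why multiple shifted low-resolution views carry high-resolution information.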
Large dynamic range cameras
A digital camera includes a plurality of channels and a processing component operatively coupled to the plurality of channels. Each channel of the plurality of channels includes an optics component and a sensor that includes an array of photo-detectors. The processing component is configured to separately control an integration time of each channel, where a first integration time of a first channel is less than a second integration time of a second channel. The processing component is also configured to combine data from the plurality of channels to generate an image.
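A sketch of the per-channel integration control and the merge (the linear sensor model, the saturation threshold, and the weighting scheme are illustrative assumptions):

```python
import numpy as np

def capture_channel(radiance, integration_time, full_well=1.0):
    """Simulate one channel: signal scales linearly with integration time
    and saturates at the sensor's full-well capacity."""
    return np.clip(radiance * integration_time, 0.0, full_well)

def combine_channels(captures, times):
    """Merge channels into a radiance estimate: divide each capture by its
    integration time and average, weighting out saturated pixels."""
    num = np.zeros_like(captures[0])
    den = np.zeros_like(captures[0])
    for img, t in zip(captures, times):
        w = (img < 0.99).astype(float)  # ignore saturated pixels
        num += w * img / t
        den += w
    return num / np.maximum(den, 1e-9)
```

A long-integration channel saturates on bright pixels while a short-integration channel still measures them, so the merged estimate recovers the full dynamic range.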