Patent classifications
H04N9/76
DUAL PATH ENDOSCOPE
A novel dual-path endoscope in which a multi-function light source produces a first light and a second light directed toward an object. The first light exhibits first light characteristics, and the second light exhibits second light characteristics different from the first light characteristics. The endoscope includes two light paths, the disparity therebetween being larger than zero. Each light path includes a respective pupil and a respective light separator coupled with the pupil, transmitting therethrough one of the first light and the second light, thereby associating each of the first light and the second light with a respective light path. A dual-channel imager includes two imaging sensors, each associated with a respective light path and optically coupled with a respective light separator. Each imaging sensor exhibits sensitivity to the characteristics of the respective one of the first light and the second light. A first imaging sensor acquires a first image of the first light reflected off the object, and a second imaging sensor acquires a second image of the second light reflected off the object. A processor processes the acquired images.
DISPLAY CONTROL APPARATUS AND METHOD FOR CONTROLLING THE SAME
A display control apparatus includes one or more processors, a conversion unit configured to convert a color of an image based on a predetermined condition, and a display control unit configured to perform control such that, in a case where a non-color-converted image (an image whose color has not been converted) is displayed, a display item is displayed at a superimposed position at which the display item is superimposed on the image, and in a case where a color-converted image (an image whose color has been converted) is displayed, the display item is displayed at a non-superimposed position at which the display item is not superimposed on the color-converted image.
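The control rule above reduces to a simple conditional. The following is a minimal sketch, assuming illustrative screen coordinates for the two positions; the function name and parameters are hypothetical, not taken from the patent.

```python
# Sketch of the display-control rule: a display item is overlaid on the
# image only when the image has NOT been color-converted; otherwise it
# is moved to a position outside the image area so no converted pixels
# are occluded. Coordinates are illustrative assumptions.

def display_item_position(color_converted: bool,
                          superimposed_pos=(10, 10),
                          non_superimposed_pos=(0, -40)):
    """Return where the display item should be drawn."""
    if color_converted:
        # Converted colors may carry meaning, so the item must not
        # cover any pixels of the color-converted image.
        return non_superimposed_pos
    return superimposed_pos
```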
Image processing method and apparatus
An image processing method and apparatus, where the method includes determining m boundary points of a target image; acquiring a color component of a non-boundary point in a j-th area, where the j-th area is a neighborhood of the j-th boundary point among the m boundary points, 1 ≤ j ≤ m, and m is a positive integer; and performing synthesis processing according to the color component of the non-boundary point in the j-th area to obtain a color component of the j-th boundary point. By processing the boundary points of a target image, the precision of image matting and synthesis processing may be improved, and the method may be applied to a real-time image matting and synthesis process.
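The abstract does not fix the exact "synthesis processing" formula; a minimal sketch, assuming it is a plain average of the color components of non-boundary pixels in the j-th boundary point's neighborhood, could look like this (pure Python, names illustrative):

```python
# Estimate the color of the j-th boundary point (1 <= j <= m) from the
# non-boundary pixels in its square neighborhood. The averaging step is
# an assumption standing in for the patent's "synthesis processing".

def synthesize_boundary_color(image, boundary, j, radius=1):
    """image: 2-D grid of (r, g, b) tuples; boundary: list of (y, x)
    boundary points; j: 1-based index into that list."""
    y0, x0 = boundary[j - 1]
    h, w = len(image), len(image[0])
    bset = set(boundary)           # skip other boundary points too
    acc, n = [0.0, 0.0, 0.0], 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y < h and 0 <= x < w and (y, x) not in bset:
                for c in range(3):
                    acc[c] += image[y][x][c]
                n += 1
    # fall back to the original pixel if no non-boundary neighbor exists
    return tuple(a / n for a in acc) if n else image[y0][x0]
```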
REAL-TIME HDR VIDEO FOR VEHICLE CONTROL
The invention provides an autonomous vehicle with a video camera that merges images taken at different light levels by replacing saturated parts of an image with corresponding parts of a lower-light image, to stream a video with a dynamic range that extends to include very low-light and very intensely lit parts of a scene. The high dynamic range (HDR) camera streams the HDR video to an HDR system in real time, as the vehicle operates. As pixel values are provided by the camera's image sensors, those values are streamed directly through a pipeline processing operation and on to the HDR system without any requirement to wait and collect entire images, or frames, before using the video information.
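The per-pixel merge rule described above can be sketched as follows. The saturation threshold and exposure ratio are illustrative assumptions, not values from the patent; the point is that the decision needs only one pixel pair at a time, which is what allows streaming without buffering whole frames.

```python
# Wherever the higher-exposure frame is saturated (clipped), substitute
# the corresponding pixel from the lower-exposure frame, scaled into the
# bright frame's units. Operates per pixel, so it fits a streaming
# pipeline with no whole-frame buffering.

SATURATION = 250       # near the 8-bit ceiling (assumed threshold)
EXPOSURE_RATIO = 4.0   # bright frame exposed 4x longer (assumed)

def merge_hdr_pixel(bright: int, dark: int) -> float:
    """Merge one pixel from a bright and a dark exposure into an
    extended-range value."""
    if bright >= SATURATION:
        # Bright frame clipped: recover radiance from the dark frame.
        return dark * EXPOSURE_RATIO
    return float(bright)
```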
IMAGING SYSTEM INCLUDING LENS WITH LONGITUDINAL CHROMATIC ABERRATION, ENDOSCOPE AND IMAGING METHOD
An imaging system (500) includes an optical unit (100) that captures, from a scene (900), first images in different wavelength ranges when the scene (900) is illuminated with non-structured light and second images in different wavelength ranges when the scene (900) is illuminated with structured light. An imaging lens unit (112) with longitudinal chromatic aberration is arranged between the scene (900) and an imaging sensor unit (118). A depth processing unit (200) may generate depth information (DI) on the basis of the second images by using optical triangulation. A sharpness processing unit (300) uses the depth information (DI) to generate an output image (OImg) by combining the first images. The optical unit (100) of the imaging system (500) may be implemented in an endoscope, by way of example.
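With longitudinal chromatic aberration, each color channel is in focus at a different distance, so the depth map can tell the sharpness processing which channel carries the sharp detail at each pixel. A minimal sketch of that selection step, with wholly illustrative focal depths (the patent abstract does not state any):

```python
# Pick, per pixel, the color channel whose focal plane (set by the
# lens's longitudinal chromatic aberration) lies closest to the depth
# reported by the structured-light depth map. Depth values are
# illustrative assumptions.

FOCUS_DEPTH = {"b": 30.0, "g": 60.0, "r": 90.0}  # assumed, in mm

def sharpest_channel(depth_mm: float) -> str:
    """Return the channel key in best focus at the given scene depth."""
    return min(FOCUS_DEPTH, key=lambda c: abs(FOCUS_DEPTH[c] - depth_mm))
```

A full sharpness processor would then transfer high-frequency detail from the selected channel into the other channels to form the output image.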
SYSTEMS AND METHODS FOR HDR VIDEO CAPTURE WITH A MOBILE DEVICE
The invention relates to systems and methods for high dynamic range (HDR) image capture and video processing in mobile devices. Aspects of the invention include a mobile device, such as a smartphone or digital mobile camera, including at least two image sensors fixed in a co-planar arrangement to a substrate and an optical splitting system configured to reflect at least about 90% of incident light received through an aperture of the mobile device onto the co-planar image sensors, to thereby capture an HDR image. In some embodiments, greater than about 95% of the incident light received through the aperture of the device is reflected onto the image sensors.
Apparatus and method for combining images
Provided are an image composition apparatus for composing color images with black-and-white images including infrared components, and an image composition method thereof. The image composition method includes generating a first image signal with color information and a second image signal including infrared components without color information, dividing the first image signal into a brightness signal and a color signal, composing the brightness signal of the first image signal with a brightness signal of the second image signal to generate a composed brightness signal, and composing the composed brightness signal with the color signal of the first image signal to generate a color image.
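The composition pipeline above, applied to a single pixel, can be sketched as follows. The RGB-to-luma weights are the standard BT.601 coefficients; the blend weight `alpha` and all function names are illustrative assumptions, not taken from the patent.

```python
# Split the color pixel into brightness (Y) and chroma, blend its Y
# with the infrared frame's Y, then recombine with the original chroma
# by adding the brightness shift back onto each channel.

def rgb_to_y(r: float, g: float, b: float) -> float:
    """BT.601 luma from linear RGB."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def compose_pixel(rgb, ir_y, alpha=0.5):
    """Blend the color pixel's brightness with the IR brightness
    (alpha = IR weight), preserving the color pixel's chroma."""
    r, g, b = rgb
    y = rgb_to_y(r, g, b)
    y_mix = (1 - alpha) * y + alpha * ir_y
    d = y_mix - y          # brightness shift to apply
    return (r + d, g + d, b + d)
```

Applying the shift additively keeps the channel differences (chroma) of the color frame intact while taking low-light detail from the IR brightness.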