H04N13/257

Methods, systems, and media for generating and rendering immersive video content
11589027 · 2023-02-21

Methods, systems, and media for generating and rendering immersive video content are provided. In some embodiments, the method comprises: receiving information indicating positions of cameras in a plurality of cameras; generating a mesh on which video content is to be projected based on the positions of the cameras in the plurality of cameras, wherein the mesh comprises a portion of a faceted cylinder, and wherein the faceted cylinder has a plurality of facets each corresponding to a projection from a camera in the plurality of cameras; receiving video content corresponding to the plurality of cameras; and transmitting the video content and the generated mesh to a user device in response to receiving a request for the video content from the user device.
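
The mesh-generation step lends itself to a short sketch. The following is a minimal illustration, not the patent's implementation: the function name, the one-facet-per-camera layout, and the facet-width parameter are all assumptions made for the example.

```python
# Hypothetical sketch: build a partial faceted cylinder with one flat
# facet per camera, so video from camera i can be projected onto facet i.
# All names and geometric choices here are assumptions for illustration.
import math

def faceted_cylinder_mesh(camera_angles_deg, facet_width_deg, radius=1.0, height=1.0):
    """Return (vertices, quads); each quad is the facet for one camera."""
    vertices, quads = [], []
    half = math.radians(facet_width_deg) / 2.0
    for angle_deg in camera_angles_deg:
        a = math.radians(angle_deg)
        base = len(vertices)
        for edge in (a - half, a + half):  # left and right edge of the facet
            x, z = radius * math.cos(edge), radius * math.sin(edge)
            vertices.append((x, -height / 2.0, z))  # bottom vertex
            vertices.append((x, +height / 2.0, z))  # top vertex
        quads.append((base, base + 1, base + 3, base + 2))
    return vertices, quads

# Four cameras spaced 30 degrees apart give four facets spanning a 90-degree arc.
verts, facets = faceted_cylinder_mesh([0, 30, 60, 90], facet_width_deg=30)
```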

Non-same camera based image processing apparatus

The present invention provides an image processing apparatus comprising: a first camera obtaining a true-color image by capturing a subject; a second camera spaced apart from the first camera and obtaining an infrared image by capturing the subject; and a control unit connected to the first camera and the second camera, wherein the control unit matches the true-color image with the infrared image and obtains three-dimensional information of the subject by using valid pixels of the matched infrared image in a region corresponding to the matched true-color image.
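
As a rough sketch of the matching step, the snippet below registers the infrared image onto the true-color frame and keeps only valid pixels. The homography-based registration and the nonzero validity test are assumptions for illustration; the abstract does not specify either.

```python
# Assumed approach, for illustration only: warp the IR image onto the RGB
# frame with a pre-calibrated homography, then mask out invalid pixels.
import cv2

def register_and_mask(rgb, ir, H):
    """H: 3x3 homography mapping IR pixel coordinates to RGB coordinates,
    assumed to come from a one-time calibration of the two cameras."""
    h, w = rgb.shape[:2]
    ir_matched = cv2.warpPerspective(ir, H, (w, h))
    valid = ir_matched > 0  # hypothetical validity test: nonzero IR response
    return ir_matched, valid

# Three-dimensional information would then be computed only where `valid`
# is True, e.g. by triangulating against the known camera baseline.
```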

Augmented Reality with Motion Sensing

The technology disclosed relates to a motion sensory and imaging device capable of acquiring imaging information of a scene and providing at least a near-real-time pass-through of that imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting the imaging information to the wearer, augmented with virtualized or created presentations of information.

Stereo viewing

The invention relates to creating and viewing stereo images, for example stereo video images, also called 3D video. At least three camera sources with overlapping fields of view are used to capture a scene so that an area of the scene is covered by at least three cameras. At the viewer, a camera pair is chosen from the multiple cameras to create a stereo pair that best matches the locations the user's eyes would have if they were at the place of the camera sources. That is, a camera pair is chosen so that the disparity created by the camera sources resembles the disparity that the user's eyes would have at that location. If the user tilts their head, or the view orientation is otherwise altered, a new pair can be formed, for example by switching one camera of the pair. The viewer device then forms the images of the video frames for the left and right eyes by picking the best source for each area of each image, preserving realistic stereo disparity.
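
The pair-selection logic can be sketched in a few lines. The nearest-position cost below is one plausible scoring rule, not the one the invention prescribes; the abstract only requires that the chosen pair's disparity resemble the disparity the eyes would have at that location.

```python
# Illustrative viewer-side selection: choose the two cameras whose
# positions best match where the user's eyes currently are. The cost
# function is an assumption made for this sketch.
from itertools import combinations
import numpy as np

def pick_stereo_pair(camera_positions, left_eye, right_eye):
    """camera_positions: (N, 3) array; eye positions: 3-vectors, same frame."""
    best_pair, best_cost = None, float("inf")
    for i, j in combinations(range(len(camera_positions)), 2):
        cost = (np.linalg.norm(camera_positions[i] - left_eye) +
                np.linalg.norm(camera_positions[j] - right_eye))
        if cost < best_cost:
            best_pair, best_cost = (i, j), cost
    return best_pair  # re-evaluated whenever the head orientation changes
```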

Stereoscopic camera with fluorescence visualization

A stereoscopic camera with fluorescence visualization is disclosed. An example stereoscopic camera includes a visible light source, a near-infrared light source, and a near-ultraviolet light source. The stereoscopic camera also includes a light filter assembly having left and right filter magazines positioned respectively along left and right optical paths and configured to selectively enable certain wavelengths of light to pass through. Each of the left and right filter magazines includes an infrared cut filter, a near-ultraviolet cut filter, and a near-infrared bandpass filter. A controller of the camera is configured to provide a visible light mode, an indocyanine green (“ICG”) fluorescence mode, and a 5-aminolevulinic acid (“ALA”) fluorescence mode by synchronizing the activation of the light sources with the selection of the filters. A processor of the camera combines image data from the different modes to enable fluorescence emission light to be superimposed on visible light stereoscopic images.
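
The mode synchronization reduces to a small lookup from mode to light source and filter. The table below is an illustrative pairing consistent with the abstract (ICG is excited in the near-infrared, ALA-induced fluorescence in the near-ultraviolet); the device API calls are hypothetical.

```python
# Illustrative mode table; the pairings are assumptions consistent with the
# abstract, and set_light_source/select_filter are hypothetical device calls.
MODES = {
    "visible": {"light": "visible",          "filter": "infrared_cut"},
    "icg":     {"light": "near_infrared",    "filter": "near_infrared_bandpass"},
    "ala":     {"light": "near_ultraviolet", "filter": "near_ultraviolet_cut"},
}

def configure(camera, mode):
    cfg = MODES[mode]
    camera.set_light_source(cfg["light"])    # hypothetical device call
    for magazine in ("left", "right"):       # both optical paths
        camera.select_filter(magazine, cfg["filter"])
```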

Dark flash photography with a stereo camera
11490070 · 2022-11-01

Scenes can be imaged under low-light conditions using flash photography. However, the flash can be irritating to individuals being photographed, especially when those individuals' eyes have adapted to the dark. Additionally, portions of images generated using a flash can appear washed out or otherwise negatively affected by the flash. These issues can be addressed by using a flash at an invisible wavelength, e.g., an infrared and/or ultraviolet flash. While the scene is being imaged at the invisible wavelength of the flash, it can also be imaged at visible wavelengths. This can include simultaneously using both a standard RGB camera and a modified visible-plus-invisible-wavelengths camera (e.g., an “IR-G-UV” camera). The visible and invisible image data can then be combined to generate an improved visible-light image of the scene, e.g., one that approximates how the scene would look under daytime light conditions.
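
As a toy illustration of combining the two captures, the snippet below transfers per-pixel color ratios from the dim visible image onto the clean invisible-flash luminance. The abstract does not specify how the combination is actually performed, so treat this purely as a sketch.

```python
# Naive fusion sketch, an assumption for illustration only: keep color
# from the dim visible capture and detail from the invisible-flash capture.
import numpy as np

def naive_dark_flash_fusion(rgb_dim, flash_mono):
    """rgb_dim: noisy HxWx3 float image in [0, 1]; flash_mono: clean HxW float."""
    luma = rgb_dim.mean(axis=2, keepdims=True) + 1e-6  # avoid divide-by-zero
    chroma = rgb_dim / luma                            # per-pixel color ratios
    return np.clip(chroma * flash_mono[..., None], 0.0, 1.0)
```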

Depth and vision sensors for challenging agricultural environments

Provided is a method for three-dimensional imaging of a plant in an indoor agricultural environment whose ambient light power spectrum differs from that of natural outdoor light. The method comprises directing a spatially separated stereo pair of cameras at a scene including the plant, illuminating the scene with a non-uniform pattern provided by a light projector utilizing light in a frequency band having a lower-than-average ambient intensity in the indoor agricultural environment, filtering light entering the image sensors of each of the cameras with filters which selectively pass light in the frequency band utilized by the light projector, capturing an image of the scene with each of the cameras to obtain first and second camera images, and generating a depth map including a depth value corresponding to each pixel in the first camera image.
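
The final step is standard rectified stereo matching. As a sketch, assuming the projector pattern and band-pass filters have already produced well-textured, rectified grayscale inputs, and using OpenCV's SGBM matcher as a stand-in for whatever matcher the method actually employs:

```python
# Sketch of the depth-map step. Inputs are assumed rectified and
# band-pass filtered; StereoSGBM is a stand-in matcher, not necessarily
# the one the method uses.
import cv2

def depth_map(left_gray, right_gray, focal_px, baseline_m):
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype("float32") / 16.0
    disparity[disparity <= 0] = float("nan")   # pixels with no reliable match
    return focal_px * baseline_m / disparity   # per-pixel depth in metres
```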