H04N13/133

System and method for three-dimensional scanning and for capturing a bidirectional reflectance distribution function

A method for generating a three-dimensional (3D) model of an object includes: capturing images of the object from a plurality of viewpoints, the images including color images; generating a 3D model of the object from the images, the 3D model including a plurality of planar patches; for each patch of the planar patches: mapping image regions of the images to the patch, each image region including at least one color vector; and computing, for each patch, at least one minimal color vector among the color vectors of the image regions mapped to the patch; generating a diffuse component of a bidirectional reflectance distribution function (BRDF) for each patch of the planar patches of the 3D model in accordance with the at least one minimal color vector computed for that patch; and outputting the 3D model with the BRDF for each patch.
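The per-patch diffuse estimation described above can be sketched roughly as follows. This is an illustrative reading of the abstract, not the patented implementation: since specular reflection only ever adds light on top of the diffuse term, the darkest color vector observed for a patch across viewpoints is a reasonable candidate for its diffuse BRDF component. The function name and the use of the Euclidean norm to select the minimal vector are assumptions.

```python
import numpy as np

def diffuse_component(observations):
    """Estimate the diffuse color of a planar patch from the color
    vectors of image regions mapped to it across viewpoints.

    Specular highlights add light on top of the diffuse term, so the
    observation with the smallest magnitude is taken as the minimal
    color vector approximating the pure diffuse reflectance."""
    obs = np.asarray(observations, dtype=float)   # shape (N, 3), RGB
    norms = np.linalg.norm(obs, axis=1)           # brightness per view
    return obs[np.argmin(norms)]                  # minimal color vector
```

In practice one would likely keep several of the smallest observations and average them to reduce sensor noise, but the single-minimum form matches the "at least one minimal color vector" wording most directly.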

Method for image processing of image data for image and visual effects on a two-dimensional display wall

A scene captured of a live action scene, while a display wall is positioned to be part of the live action scene, may be processed. To perform the processing, image data of the live action scene having a live actor and the display wall displaying a first rendering of a precursor image is received. Further, precursor metadata for the precursor image displayed on the display wall and display wall metadata for the display wall are determined. An image matte is accessed, where the image matte indicates a first portion associated with the live actor and a second portion associated with the precursor image on the display wall in the live action scene. Pixel display values to add or modify an image effect or a visual effect are determined, and the image data is adjusted using the pixel display values and the image matte.
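The final adjustment step can be sketched as a matte-guided blend. This is a simplified illustration, not the claimed method: the function name, the 0/1 matte convention (0 for the live-actor portion, 1 for the display-wall portion), and the linear blend are all assumptions.

```python
import numpy as np

def apply_effect_with_matte(image, effect_pixels, matte):
    """Blend per-pixel effect values into the captured frame only
    where the matte marks the display-wall portion.

    image         -- captured frame, shape (H, W, 3)
    effect_pixels -- pixel display values for the effect, same shape
    matte         -- (H, W) array, 0 for the live actor, 1 for the wall
    """
    m = matte[..., None].astype(float)        # broadcast over channels
    return image * (1.0 - m) + effect_pixels * m
```

A production pipeline would use a soft (fractional) matte at edges; the linear blend above handles that case unchanged.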

HEAD-UP DISPLAY SYSTEM AND MOVING BODY
20230026137 · 2023-01-26

A head-up display system includes a first projection module, a second projection module, and a first reflective optical element. The first projection module projects a first image. The second projection module projects a second image. The first reflective optical element reflects at least a part of the first image and at least a part of the second image. The first projection module includes a first display panel that displays the first image and projects the first image toward the first reflective optical element. The second projection module includes a second display panel that displays the second image, and an optical system that directs the second image toward the first reflective optical element.

Method and apparatus for overlay processing in 360 video system
11706397 · 2023-07-18

Provided is a 360-degree image data processing method performed by a 360-degree video reception apparatus. The method includes receiving 360-degree image data, obtaining information on an encoded picture and metadata from the 360-degree image data, decoding a picture based on the information on the encoded picture, and rendering the decoded picture and an overlay based on the metadata, in which the metadata includes overlay-related metadata, the overlay is rendered based on the overlay-related metadata, the overlay-related metadata includes information on an alpha plane of the overlay, and the information on the alpha plane of the overlay is included in an image item or a video track.
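Rendering an overlay with a per-pixel alpha plane typically amounts to standard "over" compositing onto the decoded picture. The sketch below assumes an alpha plane already normalized to [0, 1]; the function name and the omission of projection/region mapping (the overlay would first be placed into the 360 picture's coordinate space) are simplifications.

```python
import numpy as np

def render_overlay(background, overlay_rgb, alpha_plane):
    """Composite an overlay onto a decoded picture using the overlay's
    alpha plane ('over' operator, premultiplication omitted).

    background  -- decoded picture region, shape (H, W, 3)
    overlay_rgb -- overlay pixels, same shape
    alpha_plane -- (H, W) array of opacities in [0, 1]
    """
    a = alpha_plane[..., None]                # broadcast over channels
    return overlay_rgb * a + background * (1.0 - a)
```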

Method controlling image sensor parameters
11699218 · 2023-07-11

A method of controlling parameters for image sensors includes: receiving a first image and a second image; calculating first feature values related to the first image and second feature values related to the second image; generating comparison results by comparing the first feature values of fixed regions and first variable regions of the first image with the second feature values of fixed regions and first variable regions of the second image; and controlling at least one parameter on the basis of the comparison results.
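The region-wise comparison step can be sketched as follows. This is an illustrative reading of the abstract, not the patented scheme: the feature values are assumed to be scalar statistics (e.g. mean intensity) per region, and the relative-change threshold used to flag regions is an assumption.

```python
import numpy as np

def compare_features(feat1, feat2, threshold=0.1):
    """Compare feature values of corresponding regions (fixed and
    variable) between two frames.

    Returns a boolean mask flagging regions whose relative change
    exceeds the threshold; a controller could use such a mask to
    adjust sensor parameters such as exposure or gain."""
    f1 = np.asarray(feat1, dtype=float)
    f2 = np.asarray(feat2, dtype=float)
    rel = np.abs(f2 - f1) / np.maximum(np.abs(f1), 1e-9)
    return rel > threshold
```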

MULTI-VIEW IMAGE FUSION BY IMAGE SPACE EQUALIZATION AND STEREO-BASED RECTIFICATION FROM TWO DIFFERENT CAMERAS

Methods to solve the problem of performing fusion of images acquired with two cameras having different sensor types, for example a visible (VIS) digital camera and a short-wave infrared (SWIR) camera, include performing image space equalization on images acquired with the different sensor types before performing rectification and registration of such images in a fusion process.
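One common way to equalize the intensity spaces of two differently-sensed images is histogram matching, sketched below. Whether the patented method uses histogram matching specifically is an assumption; the sketch only illustrates the general idea of bringing a SWIR frame into the grey-level distribution of a VIS reference before rectification and registration.

```python
import numpy as np

def equalize_to_reference(src, ref):
    """Match the grey-level distribution of one image (e.g. SWIR) to a
    reference image (e.g. VIS) via histogram matching, so both share a
    common intensity space before rectification/registration."""
    src_vals, src_idx, src_counts = np.unique(
        src.ravel(), return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(ref.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / src.size   # source CDF
    ref_cdf = np.cumsum(ref_counts) / ref.size   # reference CDF
    # Map each source CDF value to the reference grey level with the
    # same cumulative probability.
    matched = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched[src_idx].reshape(src.shape)
```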