H04N13/254

Systems and methods for reconstruction and rendering of viewpoint-adaptive three-dimensional (3D) personas

An exemplary method includes maintaining a receiver-side mesh-vertices list, receiving duplicative-vertex information from a sender, responsively reducing the receiver-side mesh-vertices list in accordance with the received duplicative-vertex information, and rendering, using the reduced receiver-side mesh-vertices list, viewpoint-adaptive three-dimensional (3D) personas of a subject. The rendering weights video pixel colors from the different vantage points of the video cameras that capture video streams of the subject, the weighting being performed according to the respective geometric relationship of each video-camera vantage point to a user-selected viewpoint.
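
The view-dependent color weighting described above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes (hypothetically) that each camera's weight is the cosine similarity between its vantage direction and the user-selected view direction, with back-facing cameras excluded.

```python
import numpy as np

def blend_camera_colors(colors, camera_dirs, view_dir):
    """Blend per-camera pixel colors by the angular proximity of each
    camera's vantage point to the user-selected viewpoint.

    colors      -- (N, 3) array, one RGB sample per camera
    camera_dirs -- (N, 3) unit vectors from the subject toward each camera
    view_dir    -- (3,) unit vector from the subject toward the viewpoint
    """
    colors = np.asarray(colors, dtype=float)
    camera_dirs = np.asarray(camera_dirs, dtype=float)
    view_dir = np.asarray(view_dir, dtype=float)
    # Cosine similarity: cameras aligned with the viewpoint dominate.
    cos = camera_dirs @ view_dir
    weights = np.clip(cos, 0.0, None)      # ignore back-facing cameras
    if weights.sum() == 0.0:
        weights = np.ones(len(colors))     # degenerate fallback: equal blend
    weights /= weights.sum()
    return weights @ colors
```

A camera looking along the selected viewpoint thus contributes fully, while one at 90° contributes nothing.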

System and method for three-dimensional scanning and for capturing a bidirectional reflectance distribution function

A method for generating a three-dimensional (3D) model of an object includes: capturing images of the object from a plurality of viewpoints, the images including color images; generating a 3D model of the object from the images, the 3D model including a plurality of planar patches; for each patch of the planar patches: mapping image regions of the images to the patch, each image region including at least one color vector; and computing, for each patch, at least one minimal color vector among the color vectors of the image regions mapped to the patch; generating a diffuse component of a bidirectional reflectance distribution function (BRDF) for each patch of planar patches of the 3D model in accordance with the at least one minimal color vector computed for each patch; and outputting the 3D model with the BRDF for each patch.
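
The minimal-color-vector step can be sketched as below. The rationale (an assumption stated here, not taken from the abstract) is the usual one for separating reflectance components: specular highlights only add light on top of the diffuse term, so the darkest observation of a patch is the least specular-contaminated and approximates its diffuse color.

```python
import numpy as np

def diffuse_component_per_patch(patch_observations):
    """Estimate the diffuse BRDF component of each planar patch as the
    minimal color vector among the image regions mapped to that patch.

    patch_observations -- list with one (M_i, 3) array of RGB color
                          vectors per patch (one row per mapped region)
    """
    diffuse = []
    for obs in patch_observations:
        obs = np.asarray(obs, dtype=float)
        # The color vector with the smallest norm is the darkest view of
        # the patch, i.e. the one least inflated by specular reflection.
        idx = np.argmin(np.linalg.norm(obs, axis=1))
        diffuse.append(obs[idx])
    return diffuse
```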

Passive three-dimensional image sensing based on chromatic focal differentiation
11582436 · 2023-02-14

Techniques are described for passive three-dimensional (3D) image sensing based on chromatic differentiation. For example, an object can be imaged by using a photodetector array to detect light reflected off of the object and focused through a lens onto the array. Light components of different wavelengths tend to be focused through the lens to different focal lengths, which affects the detected brightness of each wavelength. For example, if the detector array is closer to a shorter-wavelength focal plane, a white spot will tend to be detected with a higher magnitude of blue light components than of red light components. Ratios of brightness magnitudes for different wavelengths vary in a manner that strongly correlates with object distance from the lens. Embodiments exploit this correlation to passively detect object distance. Some embodiments further provide various types of distance and/or chromatic calibration to further facilitate such detection.
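
The ratio-to-distance mapping can be sketched as a calibration lookup. This is a hypothetical illustration of the correlation the abstract describes: `calib_ratios` and `calib_distances` stand in for a monotonic calibration table measured at known distances, and linear interpolation recovers the distance for an observed blue/red ratio.

```python
import numpy as np

def distance_from_color_ratio(blue, red, calib_ratios, calib_distances):
    """Estimate object distance from the blue/red brightness ratio.

    Chromatic aberration focuses shorter wavelengths at shorter focal
    lengths, so the blue/red magnitude ratio at the detector varies with
    object distance; a calibration table maps observed ratios back to
    known distances.
    """
    ratio = blue / red
    # np.interp needs increasing x values; sort the calibration by ratio.
    order = np.argsort(calib_ratios)
    return np.interp(ratio,
                     np.asarray(calib_ratios, dtype=float)[order],
                     np.asarray(calib_distances, dtype=float)[order])
```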

Image processing device
11582402 · 2023-02-14

An image processing device includes a rotation processor and an image processor. The rotation processor receives an input image and generates a temporary image according to the input image. The image processor is coupled to the rotation processor and outputs a processed image according to the temporary image, wherein the image processor has a predetermined image processing width, a width of the input image is larger than the predetermined image processing width, and a width of the temporary image is less than the predetermined image processing width.
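
One plausible reading of the rotation processor, sketched below as an assumption rather than the claimed hardware: when the input is wider than the processor's fixed line width, rotating it 90° swaps width and height so the (smaller) height fits the processing width, and the result is rotated back afterward. `MAX_PROCESSING_WIDTH` and `line_filter` are hypothetical names.

```python
import numpy as np

MAX_PROCESSING_WIDTH = 1024  # hypothetical line-buffer limit of the processor

def process_with_rotation(image, line_filter):
    """Rotate the input 90 degrees when it is too wide for the processor,
    apply the row-wise filter to the (narrower) temporary image, then
    rotate the result back to the original orientation."""
    h, w = image.shape[:2]
    rotated = w > MAX_PROCESSING_WIDTH
    temp = np.rot90(image) if rotated else image
    if temp.shape[1] > MAX_PROCESSING_WIDTH:
        raise ValueError("image exceeds processing width in both orientations")
    out = line_filter(temp)
    # Undo the rotation so the processed image matches the input layout.
    return np.rot90(out, k=-1) if rotated else out
```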

A SURVEILLANCE SENSOR SYSTEM
20230045319 · 2023-02-09

A surveillance sensor system for a surveillance network configured to monitor an environment surrounding a device, the system including a processing unit, a tridimensional sensor, and a camera. The surveillance sensor system provides to the surveillance network a global tridimensional map and a plurality of features, including associations of the features to the corresponding tridimensional data points in the global tridimensional map and properties determined for each feature. The surveillance sensor system does not provide the images from the camera to the surveillance network.
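
The data actually shared with the network can be sketched as a report structure. This is a hypothetical illustration of the protocol the abstract describes: the report carries the global 3D map and the features with their point associations and properties, while the camera images are deliberately kept out of it.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A feature, its associated 3D points, and its determined properties."""
    point_indices: list          # indices into the global tridimensional map
    properties: dict = field(default_factory=dict)

@dataclass
class SensorReport:
    """What the sensor shares with the surveillance network: the global
    tridimensional map and the features, never the raw camera images."""
    global_map: list             # list of (x, y, z) data points
    features: list               # list of Feature

def make_report(global_map, features, images):
    # The images argument stays local to the sensor; it is intentionally
    # excluded from the report sent to the network.
    return SensorReport(global_map=global_map, features=features)
```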

DEVICE AND METHOD FOR GENERATING PHOTOMETRIC STEREO IMAGES AND A COLOR IMAGE
20230038127 · 2023-02-09

A device for generating a photometric stereo image and a color image of an object which is moved at a predetermined relative speed relative to the device. The device includes a lighting area and a lighting arrangement which is oriented to the lighting area. The lighting arrangement cyclically emits a lighting sequence in the lighting area. The lighting sequence has at least four lighting pulses each of which have a wavelength range and a lighting direction. At least three of the at least four lighting pulses have a different wavelength range, and at least two of the at least four lighting pulses have a different lighting direction and a same wavelength range.
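
A lighting sequence satisfying these constraints can be sketched as configuration data. The concrete wavelengths and directions below are hypothetical; the checker simply verifies the properties the abstract states: at least four pulses, at least three distinct wavelength ranges, and at least two pulses sharing a wavelength range while differing in lighting direction.

```python
from itertools import cycle

# Hypothetical sequence: four pulses, three wavelength ranges (R, G, B),
# and two red pulses from different directions.
LIGHTING_SEQUENCE = [
    {"wavelength": "red",   "direction": "north"},
    {"wavelength": "green", "direction": "east"},
    {"wavelength": "blue",  "direction": "south"},
    {"wavelength": "red",   "direction": "west"},
]

def check_sequence(seq):
    """Verify the constraints stated in the abstract."""
    assert len(seq) >= 4
    assert len({p["wavelength"] for p in seq}) >= 3
    # Two pulses with the same wavelength range but different directions
    # give photometric-stereo shading cues within a single color channel.
    assert any(a["wavelength"] == b["wavelength"]
               and a["direction"] != b["direction"]
               for i, a in enumerate(seq) for b in seq[i + 1:])
    return True

def pulse_stream(seq):
    """Cyclically emit the lighting sequence, as the arrangement does."""
    return cycle(seq)
```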