H04N13/398

MULTI-VIEWPOINT 3D DISPLAY DEVICE AND 3D IMAGE DISPLAY METHOD
20230007226 · 2023-01-05 ·

A multi-viewpoint 3D display method is provided, comprising: obtaining a distance between a user and a multi-viewpoint 3D display screen; and dynamically rendering subpixels in composite subpixels in the multi-viewpoint 3D display screen based on 3D signals in response to a change in the distance. The method can implement flexible projection from multiple viewpoints. A multi-viewpoint 3D display device, a computer-readable storage medium, and a computer program product are also provided.
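The distance-driven rendering the abstract describes can be sketched as a small selection step: pick a viewpoint from the tracked distance, then pull the matching subpixel out of each composite subpixel. This is a hypothetical illustration only; the names (`viewpoint_for_distance`, `render_composite_subpixel`) and the linear distance-to-viewpoint model are assumptions, not taken from the patent.

```python
# Hedged sketch of distance-adaptive subpixel rendering for a
# multi-viewpoint 3D display. All constants and the mapping model are
# illustrative assumptions.

NUM_VIEWPOINTS = 6          # viewpoints supported by the display
REFERENCE_DISTANCE = 600.0  # mm, distance at which the middle viewpoint is nominal

def viewpoint_for_distance(distance_mm: float) -> int:
    """Pick the viewpoint whose projection geometry best matches the
    user's current distance (toy linear model for illustration)."""
    ratio = distance_mm / REFERENCE_DISTANCE
    index = round((ratio - 1.0) * NUM_VIEWPOINTS / 2) + NUM_VIEWPOINTS // 2
    return max(0, min(NUM_VIEWPOINTS - 1, index))

def render_composite_subpixel(frame_3d: list, viewpoint: int) -> list:
    """Select, for each composite subpixel, the subpixel serving the
    active viewpoint. frame_3d is a list of composite subpixels, each a
    list of per-viewpoint subpixel values."""
    return [composite[viewpoint] for composite in frame_3d]

# Re-render dynamically when the tracked distance changes.
frame = [[10, 11, 12, 13, 14, 15], [20, 21, 22, 23, 24, 25]]
vp_near = viewpoint_for_distance(450.0)
vp_far = viewpoint_for_distance(750.0)
assert vp_near != vp_far  # a distance change selects a different viewpoint
```

A real device would derive the viewpoint from calibrated optics rather than this linear ratio, but the re-selection in response to a distance change is the core of the claimed dynamic rendering.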

3D DISPLAY DEVICE AND 3D IMAGE DISPLAY METHOD
20230007228 · 2023-01-05 ·

The present disclosure relates to the technical field of 3D display, and discloses a 3D display device, comprising: a multi-viewpoint 3D display screen, which comprises a plurality of composite pixels, wherein each composite pixel of the plurality of composite pixels comprises a plurality of composite subpixels, and each composite subpixel of the plurality of composite subpixels comprises a plurality of subpixels corresponding to a plurality of viewpoints of the 3D display device; a viewing angle determining apparatus, configured to determine a user viewing angle of a user; and a 3D processing apparatus, configured to render, based on the user viewing angle, corresponding subpixels of the plurality of composite subpixels according to depth-of-field (DOF) information of a 3D model. The device may solve the problem of 3D display distortion. The present disclosure further discloses a 3D image display method, a computer-readable storage medium, and a computer program product.
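One way to picture viewing-angle-aware rendering from DOF information is as a disparity computation that is foreshortened for oblique viewers, which is one plausible reading of how the device counters distortion. The disparity model below and the names `disparity_px` and `render_views` are illustrative assumptions, not the patent's method.

```python
import math

# Hedged sketch: per-pixel disparity derived from depth-of-field
# information, scaled by the cosine of the user's viewing angle so an
# oblique viewer does not see exaggerated parallax. Toy model only.

def disparity_px(depth_norm: float, viewing_angle_deg: float,
                 max_disparity_px: int = 8) -> int:
    """Subpixel disparity for a point at normalized depth depth_norm
    (0 = screen plane, 1 = nearest allowed depth)."""
    return round(max_disparity_px * depth_norm
                 * math.cos(math.radians(viewing_angle_deg)))

def render_views(row, depth_row, viewing_angle_deg):
    """Produce (left, right) subpixel rows by shifting each source pixel
    half its disparity in opposite directions (clamped at the edges)."""
    width = len(row)
    left, right = row[:], row[:]
    for x, (value, depth) in enumerate(zip(row, depth_row)):
        d = disparity_px(depth, viewing_angle_deg)
        left[min(width - 1, max(0, x + d // 2))] = value
        right[min(width - 1, max(0, x - d // 2))] = value
    return left, right
```

At a 60-degree viewing angle the disparity of a nearest-depth point halves (`disparity_px(1.0, 60.0)` returns 4 versus 8 head-on), which is the kind of angle-dependent adjustment the abstract's per-subpixel rendering could perform.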

STEREOSCOPIC-IMAGE PLAYBACK DEVICE AND METHOD FOR GENERATING STEREOSCOPIC IMAGES

A method for generating stereoscopic images is provided. The method includes: creating a three-dimensional mesh to obtain a stereoscopic scene and capturing a two-dimensional image of the stereoscopic scene; performing image preprocessing to obtain a first image in response to the two-dimensional image not being a side-by-side image; utilizing a graphics processing pipeline to perform depth estimation on the first image to obtain a depth image, to update the three-dimensional mesh according to a depth setting of the depth image, and to map the three-dimensional mesh to a corresponding coordinate system; utilizing the graphics processing pipeline to project the first image onto the mapped three-dimensional mesh to obtain an output three-dimensional mesh, and to capture an output side-by-side image from the output three-dimensional mesh; and utilizing the graphics processing pipeline to weave left-eye and right-eye images into an output image, and to display the output image.
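The final steps of the pipeline above, splitting a side-by-side frame into eye views and "weaving" them into one output image, can be sketched concisely. The abstract does not specify the weave pattern; the column-interleave below, and the helper names, are assumptions for illustration.

```python
# Hedged sketch of the side-by-side split and weave steps of a
# stereoscopic pipeline. Rows are lists of pixel values; the
# column-interleave pattern is an assumption.

def split_side_by_side(sbs_rows):
    """Split a side-by-side frame into its left and right halves."""
    half = len(sbs_rows[0]) // 2
    left = [row[:half] for row in sbs_rows]
    right = [row[half:] for row in sbs_rows]
    return left, right

def weave(left_rows, right_rows):
    """Interleave the two eye views column by column: even columns from
    the left-eye image, odd columns from the right-eye image."""
    return [[lrow[x] if x % 2 == 0 else rrow[x] for x in range(len(lrow))]
            for lrow, rrow in zip(left_rows, right_rows)]

sbs = [[1, 2, 3, 4, 11, 12, 13, 14]]   # left half 1-4, right half 11-14
left, right = split_side_by_side(sbs)
woven = weave(left, right)
# woven[0] == [1, 12, 3, 14]
```

Row-interleaved or checkerboard weaves would follow the same shape with a different index test; the display's subpixel layout dictates the choice.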

CALIBRATION OF STEREOSCOPIC DISPLAY USING WAVEGUIDE COMBINER

Examples are disclosed that relate to calibration of a stereoscopic display system of an HMD via an optical calibration system comprising a waveguide combiner. One example provides an HMD device comprising a first image projector and a second image projector configured to project a stereoscopic image pair, and an optical calibration system. The optical calibration system comprises a first optical path indicative of an alignment of the first image projector, a second optical path indicative of an alignment of the second image projector, a waveguide combiner in which the first and second optical paths combine into a shared optical path, and one or more boresight sensors configured to detect calibration image light traveling along one or both of the first optical path and the second optical path.
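The optical paths and boresight sensors are hardware, but the calibration they enable reduces to a small computation: compare where each projector's calibration pattern lands on the shared path with where it should land, then derive a correction for one projector relative to the other. The functions below are a hypothetical sketch of that computation, assuming the sensor reports pattern positions in sensor pixels; none of the names come from the disclosure.

```python
# Hedged sketch of stereo alignment correction from boresight readings.
# Positions are (x, y) in sensor pixels; the pixel-offset model is an
# illustrative assumption.

def boresight_error(detected_xy, expected_xy):
    """Misalignment of one projector as seen on the shared optical path."""
    return (detected_xy[0] - expected_xy[0],
            detected_xy[1] - expected_xy[1])

def stereo_correction(err_first, err_second):
    """Offset to apply to the second projector's image so both halves of
    the stereoscopic pair align, taking the first projector as reference."""
    return (err_first[0] - err_second[0], err_first[1] - err_second[1])

expected = (100.0, 64.0)
e1 = boresight_error((101.5, 64.0), expected)  # first projector: +1.5 px in x
e2 = boresight_error((99.0, 66.0), expected)   # second: -1.0 px in x, +2.0 px in y
assert stereo_correction(e1, e2) == (2.5, -2.0)
```

A production system would convert the pixel offsets to angular corrections through the sensor and waveguide optics, but the relative-error subtraction is the essence of binocular alignment.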

MEMORY STRUCTURES TO SUPPORT CHANGING VIEW DIRECTION
20230237730 · 2023-07-27 ·

In one embodiment, a computing system may store, in a memory unit, a first array of pixel values to represent a scene as viewed along a first viewing direction. The first array of pixel values may correspond to a number of positions uniformly distributed in an angle space. The system may determine an angular displacement from the first viewing direction to a second viewing direction. The system may determine a second array of pixel values to represent the scene as viewed along the second viewing direction by: (1) shifting a portion of the first array of pixel values in the memory unit based on the angular displacement, or (2) reading a portion of the first array of pixel values from the memory unit using an address offset determined based on the angular displacement. The system may output the second array of pixel values to a display.
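The two access strategies in the abstract, physically shifting the stored array versus reading through an address offset, can be sketched over a one-dimensional ring of pixel values uniformly spaced in angle. The angle-to-element mapping and function names below are illustrative assumptions.

```python
# Hedged sketch of the two view-direction update strategies: (1) shift
# the array in memory, (2) read with an address offset. `step` is the
# angular spacing (degrees) between adjacent stored positions.

def shift_in_place(pixels, angular_displacement, step):
    """Strategy (1): rotate the stored array by the number of angle-space
    positions the displacement covers, returning the new array."""
    n = round(angular_displacement / step) % len(pixels)
    return pixels[n:] + pixels[:n]

def read_with_offset(pixels, angular_displacement, step, index):
    """Strategy (2): leave memory untouched and fold the displacement
    into the read address for element `index`."""
    n = round(angular_displacement / step)
    return pixels[(index + n) % len(pixels)]

view = [10, 20, 30, 40, 50, 60]            # one value per 2 degrees
shifted = shift_in_place(view, 4.0, 2.0)   # rotate by 2 positions
assert shifted[0] == read_with_offset(view, 4.0, 2.0, 0)
```

The equivalence asserted at the end is the point of the memory structure: the offset read yields the same second array of pixel values as the shift, without moving data, so the cheaper option can be chosen per frame.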
