Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene

11076142 · 2021-07-27

Abstract

Provided is a real-time aliasing rendering method for a 3D VR video and a virtual three-dimensional scene, including: capturing 3D camera video signals in real time and processing them to generate texture data; creating a virtual three-dimensional scene according to the proportions of a real scene; generating virtual camera rendering parameters according to the physical position and shooting angle of the 3D camera; aliasing the texture data onto a virtual three-dimensional object in the virtual scene, and adjusting the position of the virtual three-dimensional object according to the physical positional relationship between the virtual three-dimensional scene and the real scene, so as to form a complete virtual-reality combined three-dimensional scene; and rendering the virtual-reality combined three-dimensional scene by using the virtual camera rendering parameters to obtain a simulated rendering picture.

Claims

1. A real-time aliasing rendering method for three-dimensional virtual reality (3D VR) video and a virtual three-dimensional scene, the method comprising: acquiring in real time a video signal of a 3D camera and processing the video signal to generate texture data; creating a virtual three-dimensional scene according to a real scene ratio; adjusting rendering parameters of a virtual camera according to a relationship between a physical position of the 3D camera and a shooting angle; according to a picture effect captured by the 3D camera, aliasing the texture data in a form of texture maps to specific virtual three-dimensional objects in the virtual three-dimensional scene; adjusting a position of a virtual three-dimensional object of the specific virtual three-dimensional objects by a physical position relationship between the virtual three-dimensional scene and the real scene to form a virtual and reality combined three-dimensional scene; and adjusting a rendering process corresponding to the virtual camera rendering parameters, and using the virtual camera rendering parameters to render the virtual and reality combined three-dimensional scene to obtain a simulated rendering picture; wherein the video signal processing includes deinterlacing interlaced data and performing keying on video source data having a blue-green background.

2. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein the texture data includes left-eye scene texture data and right-eye scene texture data, the left-eye scene texture data and the right-eye scene texture data corresponding to left and right lenses of the 3D camera respectively, and serving as a left-eye scene and a right-eye scene of simulated human eyes.

3. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 2, wherein in the process of aliasing the texture data onto the specific virtual three-dimensional objects in the virtual three-dimensional scene, the left-eye scene texture data is used when the left-eye scene is aliased, and the right-eye scene texture data is used when the right-eye scene is aliased, so as to form a virtual and reality combined three-dimensional scene data of the left and right eyes.

4. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 3, wherein a process of rendering the 3D scene data of the left and right eyes of the virtual and reality combination by using virtual camera rendering parameters is further based on: when a left camera parameter of the virtual camera is used to render the left-eye 3D scene data, the texture maps use data of the left lens of the virtual camera; and when a right camera parameter of the virtual camera is used to render the right-eye 3D scene data, the texture maps use data of the right lens of the virtual camera.

5. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein the virtual three-dimensional scene is created by using a fixed conversion ratio.

6. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein generating virtual camera-related parameters includes: generating a virtual camera position from the 3D camera position to simulate the real camera position, a virtual camera rotation to simulate the real camera shooting angle, a virtual camera opening angle to simulate the real camera zoom, and a virtual lens distance to simulate the real camera lens distance.

7. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein simulated 3D rendering picture data is stored in a group of left and right eyes.

8. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein the rendering comprises 360-degree panoramic rendering and VR glasses rendering.

9. The real-time aliasing rendering method for 3D VR video and a virtual three-dimensional scene according to claim 1, wherein the 360-degree panoramic rendering typesets the left-eye and right-eye rendering pictures of each group in a top-to-bottom 1:1 layout and combines them into a complete picture; and the VR glasses rendering typesets the left-eye and right-eye rendering pictures of each group in a left-to-right 1:1 layout and combines them into a complete picture.

10. A real-time aliasing rendering method for three-dimensional virtual reality (3D VR) video and a virtual three-dimensional scene, the method comprising: acquiring in real time a video signal of a 3D camera and processing the video signal to generate texture data; creating a virtual three-dimensional scene according to a real scene ratio; adjusting rendering parameters of a virtual camera according to a relationship between a physical position of the 3D camera and a shooting angle; according to a picture effect captured by the 3D camera, aliasing the texture data in a form of texture maps to specific virtual three-dimensional objects in the virtual three-dimensional scene; adjusting a position of a virtual three-dimensional object of the specific virtual three-dimensional objects by a physical position relationship between the virtual three-dimensional scene and the real scene to form a virtual and reality combined three-dimensional scene; and adjusting a rendering process corresponding to the virtual camera rendering parameters, and using the virtual camera rendering parameters to render the virtual and reality combined three-dimensional scene to obtain a simulated rendering picture; wherein a 360-degree panoramic rendering typesets the left-eye and right-eye rendering pictures of each group in a top-to-bottom 1:1 layout and combines them into a complete picture; and a VR glasses rendering typesets the left-eye and right-eye rendering pictures of each group in a left-to-right 1:1 layout and combines them into a complete picture.

Description

BRIEF DESCRIPTION

(1) Some of the embodiments will be described in detail, with references to the following Figures, wherein like designations denote like members, wherein:

(2) FIG. 1 is a flowchart of a method for real-time aliasing rendering of a 3D VR video and a virtual 3D scene according to embodiments of the present invention;

(3) FIG. 2 is a flowchart of a method for real-time aliasing rendering of a 3D VR video and a virtual 3D scene according to an embodiment of the present invention. In this embodiment, left and right eye simulations are performed through left and right lenses of a 3D camera;

(4) FIG. 3 is a schematic layout diagram of the VR glasses typesetting in FIG. 2, in which the left and right eyes are simulated through the left and right lenses of the 3D camera, according to an embodiment of the present invention; and

(5) FIG. 4 is a schematic diagram of the 360-degree panoramic typesetting, in which the left and right eyes are simulated through the left and right lenses of the 3D camera, according to an embodiment of the present invention.

DETAILED DESCRIPTION

(6) In order to make the objectives, technical solutions, and advantages of embodiments of the present invention clearer, embodiments of the present invention will be described in further detail with reference to the accompanying drawings. The embodiments described below are only some embodiments of the present invention; they are provided for explanation and description, and are not intended to limit the protection scope of embodiments of the present invention.

(7) Referring to FIG. 1, FIG. 1 is a flowchart of a method for real-time aliasing rendering of a 3D VR video and a virtual 3D scene.

(8) As shown in FIG. 1, the method includes:

(9) S101. Collect and process a 3D camera signal.

(10) In this step, a 3D camera video signal is collected and processed in real time to generate texture data;

(11) The processing of the video signal may include deinterlacing of interlaced data and chroma keying of video source data having a blue-green background.
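As a concrete illustration of these two processing steps, the following Python sketch shows line-averaging deinterlacing and a simple green-dominance key. The frame representation, threshold, and function names are assumptions chosen for illustration, not the patent's implementation.

```python
def deinterlace(frame):
    """Replace each odd scanline with the average of its neighbours.

    `frame` is a list of scanlines (lists of grayscale values); a sketch of
    one common deinterlacing strategy for interlaced source data.
    """
    out = [row[:] for row in frame]
    for y in range(1, len(frame) - 1, 2):
        out[y] = [(a + b) // 2 for a, b in zip(frame[y - 1], frame[y + 1])]
    return out


def chroma_key(pixels, threshold=100):
    """Mark green-dominant pixels as transparent (alpha 0).

    `pixels` is a flat list of (r, g, b) tuples; the dominance test and
    threshold are illustrative assumptions.
    """
    keyed = []
    for (r, g, b) in pixels:
        alpha = 0 if (g > threshold and g > r and g > b) else 255
        keyed.append((r, g, b, alpha))
    return keyed
```

In practice the keyed-out alpha channel is what lets the captured picture blend onto the virtual three-dimensional objects in the later aliasing step.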

(12) S102. Create a virtual three-dimensional scene.

(13) Create a virtual 3D scene according to the real scene scale; optionally, create a virtual 3D scene with a fixed conversion scale;
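The fixed conversion scale can be pictured as a single scalar applied to real-world measurements when placing objects in the virtual scene. The ratio value and helper name below are illustrative assumptions, not values given in the text.

```python
# Assumed fixed conversion ratio: 1 cm in the real scene maps to
# 0.01 units in the virtual scene (illustrative value only).
REAL_TO_VIRTUAL = 0.01


def to_virtual(position_cm):
    """Convert a real-world (x, y, z) position in centimetres to virtual-scene units."""
    return tuple(c * REAL_TO_VIRTUAL for c in position_cm)
```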

(14) S103. Adjust parameters related to the virtual camera.

(15) Generate virtual camera rendering parameters according to the real 3D camera physical position and shooting angle relationship;

(16) Optionally, generating the virtual camera related parameters includes: generating a virtual camera position to simulate the real camera position, a virtual camera rotation to simulate the real camera shooting angle, a virtual camera opening angle to simulate the real camera zoom, and a virtual lens distance to simulate the real camera lens distance;
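A minimal sketch of this parameter mapping follows, assuming a simple data structure for the virtual camera; the field names and scale factor are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass


@dataclass
class VirtualCamera:
    position: tuple       # simulates the real camera position
    rotation: tuple       # simulates the real shooting angle (e.g. yaw, pitch, roll)
    fov_deg: float        # opening angle, simulates the real camera zoom
    lens_distance: float  # simulates the real inter-lens distance


def make_virtual_camera(real_pos, real_rot, real_fov_deg, real_lens_distance,
                        scale=0.01):
    """Map measured real-camera parameters onto a virtual camera.

    Positions and lens distance are scaled by the assumed fixed conversion
    ratio; angles and field of view carry over unchanged.
    """
    return VirtualCamera(
        position=tuple(c * scale for c in real_pos),
        rotation=real_rot,
        fov_deg=real_fov_deg,
        lens_distance=real_lens_distance * scale,
    )
```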

(17) S104. Aliasing to form a three-dimensional scene combining virtual and reality.

(18) Overlay the texture data, in the form of a texture map, on a specific virtual three-dimensional object in the virtual scene according to the picture effect captured by the 3D camera, and adjust the position of the virtual three-dimensional object according to the physical position relationship between the virtual three-dimensional scene and the real scene, so as to form a three-dimensional scene combining virtual and reality.

(19) S105. Render a three-dimensional scene combined with virtual reality.

(20) The rendering process is adjusted corresponding to the rendering parameters of the virtual camera, and the virtual camera rendering parameters are used to render the three-dimensional scene combining virtual and reality to obtain a simulated rendering picture.

(21) Referring to FIG. 2, FIG. 2 is a flowchart of another embodiment of the method of FIG. 1 for real-time aliasing rendering of a 3D VR video and a virtual 3D scene.

(22) In this embodiment, the human eyes are simulated through the left and right lenses of the 3D camera. The specific steps are:

(23) S201: Collect and process the left and right lens scene signals of the 3D camera in real time to generate texture data for the left eye scene and texture data for the right eye scene.

(24) The scene signals captured by the left and right lenses of the 3D camera are used as the left-eye scene signal and the right-eye scene signal, respectively, for simulating the left-eye scene and the right-eye scene of the human eye.

(25) The processing of the video signal may include deinterlacing of interlaced data and chroma keying of video source data having a blue-green background.

(26) S202. Create a virtual three-dimensional scene.

(27) Create a virtual three-dimensional scene according to the real scene scale; optionally, create a virtual three-dimensional scene with a fixed conversion scale.

(28) S203: Adjust parameters related to the virtual camera.

(29) Generate virtual camera rendering parameters based on the real 3D camera physical position and shooting angle relationship.

(30) Optionally, generating the virtual camera related parameters includes: generating a virtual camera position to simulate the real camera position, a virtual camera rotation to simulate the real camera shooting angle, a virtual camera opening angle to simulate the real camera zoom, and a virtual lens distance to simulate the real camera lens distance.

(31) S204. Aliasing to form a three-dimensional scene combining virtual and reality. According to the picture effect captured by the 3D camera, the texture data is aliased, in the form of a texture map, onto a specific virtual three-dimensional object in the virtual scene, and the position of the virtual three-dimensional object is adjusted according to the physical position relationship between the virtual three-dimensional scene and the real scene, so as to form a three-dimensional scene combining virtual and reality.

(32) Here, the left-eye scene texture data is used when the left-eye scene is aliased, and the right-eye scene texture data is used when the right-eye scene is aliased, so as to form three-dimensional scene data of the left and right eyes combining virtual and reality.

(33) S205. Use virtual camera rendering parameters to render the left and right eye three-dimensional scene data of the virtual and reality combination.

(34) When the left-eye 3D scene data is rendered according to the left lens parameters of the virtual camera, the texture map uses the data of the left lens of the virtual camera; when the right-eye 3D scene data is rendered according to the right lens parameters of the virtual camera, the texture map uses the data of the right lens of the virtual camera.
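The per-eye pairing rule described in S205 can be sketched as follows; the function names and dictionary layout are assumptions chosen for illustration.

```python
def render_eye(eye, textures, camera_params):
    """Select the texture and camera parameters that belong to the same eye.

    The key rule from S205: a left-eye render pass must use the left-eye
    texture and left lens parameters, and likewise for the right eye.
    """
    assert eye in ("left", "right")
    return {"eye": eye, "texture": textures[eye], "camera": camera_params[eye]}


def render_stereo(textures, camera_params):
    """Render both eyes as a (left, right) group, as the text describes."""
    return (render_eye("left", textures, camera_params),
            render_eye("right", textures, camera_params))
```

Keeping the pairing explicit in one place prevents the classic stereo bug of rendering one eye's viewpoint with the other eye's texture.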

(35) In addition, the simulated 3D rendered picture data can be stored in a group of left and right eyes.

(36) Referring to FIG. 3, in the VR glasses typesetting mode, in which the left and right eyes are simulated through the left and right lenses of the 3D camera, the scenes captured by the left and right lenses of the 3D camera serve as the left-eye scene and the right-eye scene respectively, and are collected and processed to generate multiple groups of left-eye scene texture data and right-eye scene texture data for simulating the left and right eyes.

(37) In this mode, rendering may include 360-degree panoramic rendering and VR glasses rendering.

(38) Here, 360-degree panoramic rendering typesets the left-eye and right-eye rendered pictures of each group in a top-to-bottom 1:1 layout and combines them into a complete picture.

(39) Referring to FIG. 4, in the 360-degree panorama typesetting mode, in which the left and right eyes are simulated through the left and right lenses of the 3D camera, the scenes captured by the left and right lenses of the 3D camera serve as the left-eye scene and the right-eye scene respectively, and are collected and processed to generate multiple groups of left-eye scene texture data and right-eye scene texture data for simulating the left and right eyes.

(40) In this mode, VR glasses rendering typesets the left-eye and right-eye rendered pictures of each group in a left-to-right 1:1 layout and finally combines them into a complete picture.
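The two typesetting modes reduce to stacking the per-eye pictures vertically (360-degree panorama, top-to-bottom) or horizontally (VR glasses, left-to-right). A sketch follows, with pictures represented as row-lists of pixels, an illustrative simplification of the actual image buffers.

```python
def combine_panorama(left, right):
    """Top-to-bottom 1:1 layout for 360-degree panoramic rendering:
    the left-eye picture on top, the right-eye picture below."""
    return left + right


def combine_vr_glasses(left, right):
    """Left-to-right 1:1 layout for VR glasses rendering:
    each output row is the left-eye row followed by the right-eye row."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]
```

Because both layouts are 1:1, the combined picture is exactly twice the height (panorama) or twice the width (VR glasses) of a single eye's picture.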

(41) It can be seen from the above embodiments that the present invention performs two independent 360-degree panorama renderings, combining the left-eye and right-eye panoramic pictures with the virtual graphics and images of the virtual three-dimensional scene so as to simulate the position and perspective relationship of the human eyes. When the user views the result, the left and right eyes see rendered pictures with a perspective difference (parallax), which is more realistic and more immersive.

(42) Although the invention has been illustrated and described in greater detail with reference to the preferred exemplary embodiment, the invention is not limited to the examples disclosed, and further variations can be inferred by a person skilled in the art, without departing from the scope of protection of the invention.

(43) For the sake of clarity, it is to be understood that the use of “a” or “an” throughout this application does not exclude a plurality, and “comprising” does not exclude other steps or elements.