Transform method for rendering post-rotation panoramic images
11210840 · 2021-12-28
Assignee
Inventors
CPC classification
G06T17/20
PHYSICS
International classification
Abstract
A transform method applied in an image processing system is disclosed, comprising: when the image capture module is rotated, respectively performing inverse rotation operations over post-rotation space coordinates of three first vertices from an integral vertex stream according to rotation angles of the image capture module to obtain their pre-rotation space coordinates; calculating pre-rotation longitudes and latitudes of the three first vertices according to their pre-rotation space coordinates; selecting one from a pre-rotation panoramic image, a south polar image and a north polar image as a texture image to determine a texture ID for the three first vertices according to their pre-rotation latitudes; and, calculating pre-rotation texture coordinates according to the texture ID and the pre-rotation longitudes and latitudes to form a first complete data structure for each of the three first vertices.
Claims
1. A transform method applied in an image processing system having an image capture module and a render engine, the image capture module capturing a view with a 360-degree horizontal field of view and 180-degree vertical field of view to generate a plurality of camera images, the method comprising: when the image capture module is rotated, respectively performing inverse rotation operations over post-rotation space coordinates of three first vertices from an integral vertex stream according to rotation angles of the image capture module to obtain pre-rotation space coordinates of the three first vertices; calculating pre-rotation longitudes and latitudes of the three first vertices according to the pre-rotation space coordinates of the three first vertices; selecting one from a pre-rotation panoramic image, a south polar image and a north polar image as a texture image to determine a texture ID for the three first vertices according to the pre-rotation latitudes of the three first vertices; and calculating pre-rotation texture coordinates according to the texture ID and the pre-rotation longitudes and latitudes to form a first complete data structure for each of the three first vertices; wherein the three first complete data structures of the three first vertices are inputted to the render engine and cause the render engine to render a triangle in a post-rotation panoramic image.
2. The method according to claim 1, further comprising: repeating all the steps until all the first vertices from the integral vertex stream are processed so that the post-rotation panoramic image is completed by the render engine.
3. The method according to claim 1, wherein the first complete data structure for each of the first vertices comprises the pre-rotation texture coordinates, the texture ID and post-rotation destination coordinates.
4. The method according to claim 3, wherein the integral vertex stream comprises a plurality of groups of three first vertices and each first vertex comprises attached coordinates, and wherein each group of three first vertices form a triangle in a first triangle mesh modeling the post-rotation panoramic image.
5. The method according to claim 4, wherein the attached coordinates of the three first vertices from the first vertex stream are respectively specified by the post-rotation space coordinates and the post-rotation image coordinates of the three first vertices in the post-rotation panoramic image.
6. The method according to claim 4, further comprising: when the image capture module is rotated, calculating post-rotation image coordinates and the post-rotation space coordinates of the three first vertices according to first pseudo coordinates of the three first vertices prior to the step of respectively performing inverse rotation operations; wherein the attached coordinates of the three first vertices are specified by the first pseudo coordinates of the three first vertices.
7. The method according to claim 6, wherein the first pseudo coordinates refer to a first pseudo coordinate system used in the first triangle mesh modeling the post-rotation panoramic image and are measured in units of horizontal spacing between first vertices along a horizontal axis and in units of vertical spacing between first vertices along a vertical axis in the first pseudo coordinate system.
8. The method according to claim 6, wherein the step of calculating the post-rotation image coordinates and the post-rotation space coordinates further comprises: calculating the post-rotation image coordinates according to the first pseudo coordinates for each of the three first vertices; calculating the post-rotation longitude and latitude according to the first pseudo coordinates for each of the three first vertices; and calculating the post-rotation space coordinates according to the post-rotation longitude and latitude for each of the three first vertices; wherein the post-rotation image coordinates for each of the first vertices are assigned to the post-rotation destination coordinates in the corresponding first complete data structure.
9. The method according to claim 1, further comprising: before the image capture module is rotated, generating the pre-rotation panoramic image according to the camera images.
10. The method according to claim 1, further comprising: when the image capture module is rotated, prior to the step of respectively performing the inverse rotation operations, generating a north polar image according to a north polar vertex stream and the pre-rotation panoramic image and generating a south polar image according to a south polar vertex stream and the pre-rotation panoramic image.
11. The method according to claim 10, wherein the north polar vertex stream comprises a plurality of groups of three second vertices and each second vertex comprises second texture coordinates, wherein each group of three second vertices form a triangle in a second triangle mesh modeling a north portion of the pre-rotation panoramic image, wherein the south polar vertex stream comprises a plurality of groups of three third vertices and each third vertex comprises third texture coordinates, and wherein each group of three third vertices form a triangle in a third triangle mesh modeling a south portion of the pre-rotation panoramic image.
12. The method according to claim 11, wherein the north portion of the pre-rotation panoramic image has negative latitudes that range from −90° to a first degree, and the south portion of the pre-rotation panoramic image has positive latitudes that range from +90° to a second degree.
13. The method according to claim 11, wherein the step of generating the north polar image comprises: (n1) for a group of three second vertices from the north polar vertex stream, calculating polar destination coordinates in the north polar image according to a longitude and a latitude in the pre-rotation panoramic image to form a second complete data structure with respect to each of the three second vertices; (n2) performing triangle rasterization operations and texture mapping based on the three second complete data structures of the three second vertices by the render engine to render a triangle in the north polar image; and (n3) repeating the steps of (n1) and (n2) until all the second vertices from the north polar vertex stream are processed to form the north polar image.
14. The method according to claim 13, wherein the step of (n1) further comprises: for each of the three second vertices, calculating its polar destination coordinates (x, y) in the north polar image according to its longitude and latitude (θ,φ) in the pre-rotation panoramic image and three equations to form the second complete data structure comprising the polar destination coordinates (x, y) in the north polar image and initial texture coordinates in the pre-rotation panoramic image; wherein the three equations are given by:
r=|φ−φ.sub.pole|/Ω;
x=(W.sub.p/2)*(1+r*cos(θ*π/180)); and
y=(W.sub.p/2)*(1+r*sin(θ*π/180)); wherein φ.sub.pole denotes a latitude of −90° and W.sub.p denotes a side length of the north polar image; and wherein Ω denotes a latitude difference between −90° and the first degree.
15. The method according to claim 14, wherein the second texture coordinates of the three second vertices from the north polar vertex stream are specified by the longitudes and latitudes and the initial texture coordinates in the pre-rotation panoramic image.
16. The method according to claim 13, wherein the step of generating the north polar image further comprises: (n01) calculating the initial texture coordinates in the pre-rotation panoramic image according to second pseudo coordinates for each of the three second vertices prior to the steps (n1)˜(n3); and (n02) calculating the longitude and the latitude in the pre-rotation panoramic image according to the second pseudo coordinates for each of the three second vertices; wherein the second texture coordinates of the three second vertices from the north polar vertex stream are specified by the second pseudo coordinates in the pre-rotation panoramic image; wherein the second pseudo coordinates refer to a second pseudo coordinate system used in the second triangle mesh modeling the pre-rotation panoramic image and are measured in units of horizontal spacing between second vertices along a horizontal axis and in units of vertical spacing between second vertices along a vertical axis in the second pseudo coordinate system.
17. The method according to claim 11, wherein the step of generating the south polar image comprises: (s1) for a group of three third vertices from the south polar vertex stream, calculating polar destination coordinates in the south polar image according to a longitude and a latitude in the pre-rotation panoramic image to form a third complete data structure with respect to each of the three third vertices; (s2) performing triangle rasterization operations and texture mapping based on the three third complete data structures of the three third vertices by the render engine to render a triangle in the south polar image; and (s3) repeating the steps of (s1) and (s2) until all the third vertices from the south polar vertex stream are processed to form the south polar image.
18. The method according to claim 17, wherein the step of (s1) further comprises: for each of the three third vertices, calculating its polar destination coordinates (x, y) in the south polar image according to its longitude and latitude (θ,φ) in the pre-rotation panoramic image and three equations to form a third complete data structure comprising the polar destination coordinates (x, y) in the south polar image and initial texture coordinates in the pre-rotation panoramic image; wherein the three equations are given by:
r=|φ−φ.sub.pole|/Ω;
x=(W.sub.p/2)*(1+r*cos(θ*π/180));
y=(W.sub.p/2)*(1+r*sin(θ*π/180)); wherein φ.sub.pole denotes a latitude of +90° and W.sub.p denotes a side length of the south polar image; wherein Ω denotes a latitude difference between +90° and the second degree.
19. The method according to claim 18, wherein the third texture coordinates of the three third vertices from the south polar vertex stream are specified by the longitudes and latitudes and the initial texture coordinates in the pre-rotation panoramic image.
20. The method according to claim 17, wherein the step of generating the south polar image further comprises: (s01) calculating the initial texture coordinates in the pre-rotation panoramic image according to third pseudo coordinates for each of the three third vertices prior to the steps (s1)˜(s3); and (s02) calculating the longitude and the latitude in the pre-rotation panoramic image according to the third pseudo coordinates for each of the three third vertices; wherein the third pseudo coordinates in the pre-rotation panoramic image are assigned to the third texture coordinates of the three third vertices from the south polar vertex stream; wherein the third pseudo coordinates refer to a third pseudo coordinate system used in the third triangle mesh modeling the pre-rotation panoramic image and are measured in units of horizontal spacing between third vertices along a horizontal axis and in units of vertical spacing between third vertices along a vertical axis in the third pseudo coordinate system.
21. The method according to claim 1, wherein the step of calculating the pre-rotation texture coordinates further comprises: if the texture ID denotes one of the north and the south polar images, calculating the pre-rotation texture coordinates (u,v) for each of the three first vertices according to its pre-rotation longitude and latitude (θ′, φ′), the texture ID and the following three equations:
r=|φ′−φ.sub.pole|/Ω;
u=(W.sub.p/2)*(1+r*cos(θ′*π/180)); and
v=(W.sub.p/2)*(1+r*sin(θ′*π/180)); wherein W.sub.p denotes a side length of the north and the south polar images; and wherein if the texture ID denotes the north polar image, φ.sub.pole denotes a latitude of −90°, otherwise, φ.sub.pole denotes a latitude of +90°.
22. The method according to claim 1, wherein the step of calculating the pre-rotation texture coordinates further comprises: if the texture ID denotes the pre-rotation panoramic image, calculating the pre-rotation texture coordinates (u,v) for each of the three first vertices according to its pre-rotation longitude and latitude (θ′.sub.k, φ′.sub.k) and the following equations:
23. The method according to claim 22, wherein the step of calculating the pre-rotation texture coordinates further comprises: if the texture ID denotes the pre-rotation panoramic image, measuring a distance along u-axis between the leftmost vertex and the rightmost vertex among the three first vertices in the pre-rotation panoramic image; if the distance along u-axis is greater than a threshold, adding Wo and a u coordinate of each left vertex among the three first vertices to obtain its modified u coordinate, wherein each left vertex is located on the left side of the pre-rotation panoramic image; and if the distance along u-axis is greater than the threshold, forming the first complete data structure for each left vertex according to its modified u coordinate and forming the first complete data structure for each of the other vertices among the three vertices according to the pre-rotation texture coordinates (u,v).
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The present invention will become more fully understood from the detailed description given hereinbelow and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present invention, and wherein:
DETAILED DESCRIPTION OF THE INVENTION
(19) As used herein and in the claims, the term “and/or” includes any and all combinations of one or more of the associated listed items. The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. Throughout the specification, the same components and/or components with the same function are designated with the same reference numerals.
(20) Through the specification and claims, the following notations/terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The term “rasterization” refers to a process of computing the mapping from scene geometry (or a panoramic image) to texture coordinates. The term “texture coordinates (u, v)” refers to a 2D Cartesian coordinate system used in a source/texture image. The term “destination coordinates (x, y)” refers to a 2D Cartesian coordinate system used in a destination image. The term “space coordinates (X, Y, Z)” refers to a 3D Cartesian coordinate system used in a panoramic image. The term “image coordinates” refers to a 2D Cartesian coordinate system used in either a panoramic image or a south/north polar image.
(24) Thus, the 3D rotation matrix R.sub.3D derived from three basic rotation matrices is given by: R.sub.3D=Rx Ry Rz. According to the rotation angles α, β, γ, the processing unit 205 generates a rotation matrix R.sub.3D and an inverse rotation matrix R.sub.3D.sup.−1 by using equation (1).
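The composition R.sub.3D=Rx Ry Rz described in paragraph (24) can be sketched in Python as follows. The axis conventions of the basic rotations Rx, Ry, Rz are assumptions, since equation (1) is not reproduced in this text; the property used later, that the inverse of a rotation matrix is its transpose, holds regardless of the convention.

```python
# Sketch of composing R3D = Rx * Ry * Rz from rotation angles (degrees).
# Assumption: standard right-handed basic rotations about the X, Y, Z axes.
import math

def matmul(A, B):
    """3x3 matrix product on nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_matrix_3d(alpha, beta, gamma):
    """Compose R3D from rotation angles alpha, beta, gamma (degrees)."""
    a, b, g = (math.radians(v) for v in (alpha, beta, gamma))
    Rx = [[1, 0, 0],
          [0, math.cos(a), -math.sin(a)],
          [0, math.sin(a),  math.cos(a)]]
    Ry = [[ math.cos(b), 0, math.sin(b)],
          [0, 1, 0],
          [-math.sin(b), 0, math.cos(b)]]
    Rz = [[math.cos(g), -math.sin(g), 0],
          [math.sin(g),  math.cos(g), 0],
          [0, 0, 1]]
    return matmul(Rx, matmul(Ry, Rz))

def transpose(M):
    """For a rotation matrix, the inverse is simply the transpose."""
    return [[M[j][i] for j in range(3)] for i in range(3)]
```

Multiplying R.sub.3D by its transpose recovers the identity, which is why the processing unit can obtain R.sub.3D.sup.−1 cheaply.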
(25) Examples of the panoramic image include, without limitation, a 360-degree panoramic image and an equirectangular panoramic image. For purposes of clarity and ease of description, hereinafter, the following examples and embodiments will be described with the equirectangular panoramic image.
(27) As set forth above, the integral vertex stream is a list of multiple vertices forming a plurality of triangles of the triangle mesh on an equirectangular panoramic image. Besides, each vertex in the integral vertex stream is defined by its corresponding data structure. There are three types of integral vertex stream as follows: (1) a pseudo integral vertex stream, (2) a combined integral vertex stream and (3) a complete integral vertex stream.
(34) Step S602: Sequentially fetch three vertices from the pseudo/combined south polar vertex stream by the primitive setup unit 222. The primitive flag of the last vertex among the three vertices (k=0˜2) in sequence from the pseudo/combined south polar vertex stream is set to 1 and the last vertex together with its immediately-previous two vertices forms a triangle.
(35) Step S603: Determine what type the south polar vertex stream is by the primitive setup unit 222. If the south polar vertex stream is a pseudo type, the flow goes to step S604. If the south polar vertex stream is a combined type, the “texture coordinates (u,v)” field in the complete data structure for each of the three vertices (k=0˜2) in a complete south polar vertex stream is directly specified by the texture coordinates of the three vertices from the combined south polar vertex stream, and the flow goes to step S606.
(36) Step S604: Calculate the image coordinates and the longitude and latitude of the three vertices according to the pseudo coordinates of the three vertices by the primitive setup unit 222. For each of the three vertices (k=0˜2), calculate the image coordinates (u.sub.k, v.sub.k) in the pre-rotation equirectangular panoramic image 230 (texture space) using their pseudo coordinates (i.sub.k, j.sub.k) according to the following equations (2)˜(3):
u.sub.k=j.sub.k*W/Nx; /*equation (2)*/
v.sub.k=i.sub.k*H/Ny, /*equation (3)*/
given (W, H)=(Wo, Ho)=(4096, 2048) and (Nx, Ny)=(144, 72). The “texture coordinates (u,v)” field in the complete data structure for each of the three vertices (k=0˜2) in a complete south polar vertex stream is filled in with the image coordinates (u.sub.k, v.sub.k).
(37) For each of the three vertices (k=0˜2), calculate the longitude and latitude (θ.sub.k and φ.sub.k) in the pre-rotation equirectangular panoramic image 230 (texture space) using their pseudo coordinates (i.sub.k, j.sub.k) according to the following equations (4)˜(5):
θ.sub.k=θ.sub.start+θ.sub.step*j.sub.k; /*equation (4)*/
φ.sub.k=φ.sub.start+φ.sub.step*i.sub.k; /*equation (5)*/
given (θ.sub.start, φ.sub.start)=(−180°,−90°) and (θ.sub.step and φ.sub.step)=(2.5°, 2.5°).
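Equations (2)˜(5) above can be sketched in Python as follows, using the stated constants; the function and variable names are illustrative, not taken from the patent.

```python
# Sketch of equations (2)-(5): pseudo coordinates (i, j) of a mesh vertex
# to texture coordinates (u, v) and longitude/latitude (theta, phi).
W, H = 4096, 2048            # (Wo, Ho): pre-rotation panoramic image size
Nx, Ny = 144, 72             # number of mesh columns/rows
THETA_START, PHI_START = -180.0, -90.0
THETA_STEP, PHI_STEP = 2.5, 2.5

def vertex_from_pseudo(i, j):
    """Map pseudo coordinates (i, j) per equations (2)-(5)."""
    u = j * W / Nx                        # equation (2)
    v = i * H / Ny                        # equation (3)
    theta = THETA_START + THETA_STEP * j  # equation (4)
    phi = PHI_START + PHI_STEP * i        # equation (5)
    return u, v, theta, phi
```

With these constants, pseudo coordinate (0, 0) maps to the top-left corner of the texture space and the south-west extreme of the longitude/latitude range.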
(38) Step S606: Map the three vertices from the pre-rotation equirectangular panoramic image to the south polar image so as to obtain the image coordinates in the south polar image by the primitive setup unit 222. For each of the three vertices (k=0˜2), calculate the image coordinates (x.sub.k, y.sub.k) in the south polar image (destination space) using the longitude and latitude (θ.sub.k and φ.sub.k) according to the following equations (6)˜(8):
r=|φ.sub.k−φ.sub.pole|/Ω; /*equation (6)*/
x.sub.k=(W.sub.p/2)*(1+r*cos(θ.sub.k*π/180)); /*equation (7)*/
y.sub.k=(W.sub.p/2)*(1+r*sin(θ.sub.k*π/180)); /*equation (8)*/
given Ω=10°, W.sub.P=256, φ.sub.pole=+90° (corresponding to the south pole). Since the pre-rotation equirectangular panoramic image 230 is always treated as the texture space, the “texID” field in the complete data structure of each vertex in the complete south polar vertex stream is always set to 0. Please note that the complete data structures of the three vertices (k=0˜2) are finished in the complete south polar vertex stream after the “destination coordinates” field of each of the three vertices is filled in with the image coordinates (x.sub.k, y.sub.k).
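Equations (6)˜(8) can be sketched as follows, with the stated values Ω=10°, W.sub.p=256 and φ.sub.pole=+90° for the south polar image (names are illustrative):

```python
# Sketch of equations (6)-(8): map longitude/latitude (degrees) in the
# pre-rotation panorama to (x, y) in the south polar image.
import math

OMEGA = 10.0      # latitude span of the polar cap, in degrees
W_P = 256         # side length of the polar image
PHI_POLE = 90.0   # +90 for the south polar image (-90 for the north)

def polar_coords(theta, phi):
    r = abs(phi - PHI_POLE) / OMEGA                            # equation (6)
    x = (W_P / 2) * (1 + r * math.cos(theta * math.pi / 180))  # equation (7)
    y = (W_P / 2) * (1 + r * math.sin(theta * math.pi / 180))  # equation (8)
    return x, y
```

The pole itself (φ=φ.sub.pole) maps to the center of the polar image, and latitudes Ω degrees away map to the inscribed circle of the square image.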
(39) Step S608: Render a primitive/triangle in the south polar image 250 according to the complete data structures of the three vertices in the complete south polar vertex stream by the render engine 226. The primitive/triangle is formed by the three vertices (k=0˜2). According to the complete data structures of the three vertices in the complete south polar vertex stream from the primitive setup unit 222, the render engine 226 performs triangle rasterization operations and texture mapping to render the primitive/triangle in the south polar image 250. Specifically, the render engine 226 performs triangle rasterization operations for a point Q having the image/destination coordinates (x.sub.Q, y.sub.Q) within the triangle formed by the three vertices (k=0˜2) in the south polar image to generate the texture coordinates (u.sub.Q, v.sub.Q) of the point Q; afterward, the render engine 226 texture-maps the texture data from the pre-rotation equirectangular panoramic image using any appropriate method (such as nearest-neighbour interpolation, bilinear interpolation or trilinear interpolation) according to the texture coordinates of point Q and the three vertices to generate a destination value of the point Q. In this manner, the render engine 226 sequentially generates a destination value for each point/pixel until all the points within the triangle in the south polar image are processed/completed.
(40) Step S610: Determine whether all the vertices within the pseudo/combined south polar vertex stream are processed by the primitive setup unit 222. If YES, the flow is terminated and the south polar image is rendered/produced; otherwise, return to the step S602 for the following three vertices in the pseudo/combined south polar vertex stream.
(41) Analogous to the method of generating the south polar image 250, the north polar image 240 is generated according to the pseudo/combined north polar vertex stream and the pre-rotation equirectangular panoramic image 230, with φ.sub.pole=−90° (corresponding to the north pole).
(43) Step S702: Sequentially fetch three vertices from the pseudo/combined integral vertex stream.
(44) Step S703: Determine what type the integral vertex stream is. If the integral vertex stream is a pseudo type, the flow goes to step S704. If the integral vertex stream is a combined type, the “post-rotation destination coordinates” field of the complete data structure of each of the three vertices (for k=0˜2) in the complete integral vertex stream is directly specified by the attached coordinates of the three vertices from the combined integral vertex stream, and the flow goes to step S706.
(45) Step S704: Calculate post-rotation space coordinates, post-rotation image coordinates and post-rotation longitude/latitude of the three vertices according to their pseudo coordinates. For each of the three vertices (k=0˜2), calculate the post-rotation image coordinates (x.sub.k, y.sub.k) in the post-rotation equirectangular panoramic image 280 using their pseudo coordinates (i.sub.k, j.sub.k) according to the following equations (2)˜(3):
x.sub.k=j.sub.k*W/Nx, /*equation (2)*/
y.sub.k=i.sub.k*H/Ny; /*equation (3)*/
given (W, H)=(Wr, Hr)=(4096, 2048) and (Nx, Ny)=(144, 72). In the complete integral vertex stream, the “post-rotation destination coordinates” field of the complete data structure of each of the three vertices is filled in with the post-rotation image coordinates (x.sub.k, y.sub.k).
(46) For each of the three vertices (k=0˜2), calculate the post-rotation longitude and latitude (θ.sub.k and φ.sub.k) in the post-rotation equirectangular panoramic image 280 using their pseudo coordinates (i.sub.k, j.sub.k) according to the following equations (4)˜(5):
θ.sub.k=θ.sub.start+θ.sub.step*j.sub.k; /*equation (4)*/
φ.sub.k=φ.sub.start+φ.sub.step*i.sub.k; /*equation (5)*/
given (θ.sub.start, φ.sub.start)=(−180°,−90°) and (θ.sub.step and φ.sub.step)=(2.5°,2.5°).
(47) For each of the three vertices (k=0˜2), calculate the post-rotation space coordinates (X.sub.k, Y.sub.k and Z.sub.k) using its post-rotation longitude and latitude (θ.sub.k and φ.sub.k) according to the following equations (9)˜(11):
X.sub.k=cos(φ.sub.k*π/180)*cos(θ.sub.k*π/180); /*equation (9)*/
Y.sub.k=cos(φ.sub.k*π/180)*sin(θ.sub.k*π/180); /*equation (10)*/
Z.sub.k=sin(φ.sub.k*π/180). /*equation (11)*/
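Equations (9)˜(11) place a longitude/latitude pair on the unit sphere; a minimal Python sketch (names illustrative):

```python
# Sketch of equations (9)-(11): longitude/latitude (degrees) to
# space coordinates (X, Y, Z) on the unit sphere.
import math

def space_coords(theta, phi):
    t, p = math.radians(theta), math.radians(phi)
    X = math.cos(p) * math.cos(t)   # equation (9)
    Y = math.cos(p) * math.sin(t)   # equation (10)
    Z = math.sin(p)                 # equation (11)
    return X, Y, Z
```

Note that under this convention Z=sin(φ), so φ=+90° (the south pole in this document's sign convention) maps to Z=+1.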
(48) Step S706: Perform inverse rotation operations over the post-rotation space coordinates of each of the three vertices (k=0˜2) to obtain pre-rotation space coordinates according to the inverse rotation matrix R.sub.3D.sup.−1. The primitive setup unit 222 performs inverse rotation operations over the post-rotation space coordinates (X.sub.k, Y.sub.k and Z.sub.k) of each of the three vertices (k=0˜2) to obtain its pre-rotation space coordinates (X′.sub.k, Y′.sub.k and Z′.sub.k) according to the inverse rotation matrix R.sub.3D.sup.−1 (having the following nine elements R.sub.11˜R.sub.33) and the following equations (12)˜(14):
X′.sub.k=R.sub.11*X.sub.k+R.sub.12*Y.sub.k+R.sub.13*Z.sub.k; /*equation (12)*/
Y′.sub.k=R.sub.21*X.sub.k+R.sub.22*Y.sub.k+R.sub.23*Z.sub.k; /*equation (13)*/
Z′.sub.k=R.sub.31*X.sub.k+R.sub.32*Y.sub.k+R.sub.33*Z.sub.k. /*equation (14)*/
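Equations (12)˜(14) are a plain matrix-vector product; a sketch with the inverse rotation matrix given as a nested list of rows R.sub.11˜R.sub.33 (in practice this matrix would be R.sub.3D.sup.−1 from the processing unit):

```python
# Sketch of equations (12)-(14): apply the inverse rotation matrix to
# post-rotation space coordinates to recover pre-rotation ones.
def inverse_rotate(R_inv, X, Y, Z):
    Xp = R_inv[0][0]*X + R_inv[0][1]*Y + R_inv[0][2]*Z  # equation (12)
    Yp = R_inv[1][0]*X + R_inv[1][1]*Y + R_inv[1][2]*Z  # equation (13)
    Zp = R_inv[2][0]*X + R_inv[2][1]*Y + R_inv[2][2]*Z  # equation (14)
    return Xp, Yp, Zp
```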
(49) Step S708: Calculate the pre-rotation longitude and latitude of each of the three vertices according to their pre-rotation space coordinates. In one embodiment, the following program codes are provided in the primitive setup unit 222 to obtain the pre-rotation longitude and latitude of each of the three vertices.

r′=sqrt(X′.sub.k*X′.sub.k+Y′.sub.k*Y′.sub.k);
if (r′==0) {
 φ′.sub.k=(Z′.sub.k<0) ? −90 : 90;
 θ′.sub.k=0;
} else {
 x1=fabs(X′.sub.k); /* x1=the absolute value of X′.sub.k */
 y1=fabs(Y′.sub.k); /* y1=the absolute value of Y′.sub.k */
 z1=fabs(Z′.sub.k); /* z1=the absolute value of Z′.sub.k */
 φ′.sub.k=atan2(z1, r′)*180/PI; /* φ′.sub.k=arctan(z1/r′)*180/π */
 if (Z′.sub.k<0) φ′.sub.k=−φ′.sub.k;
 θ′.sub.k=atan2(y1, x1)*180/PI; /* θ′.sub.k=arctan(y1/x1)*180/π */
 if (X′.sub.k>=0 && Y′.sub.k>=0) θ′.sub.k=θ′.sub.k;
 else if (X′.sub.k<0 && Y′.sub.k>=0) θ′.sub.k=180−θ′.sub.k;
 else if (X′.sub.k<0 && Y′.sub.k<0) θ′.sub.k=180+θ′.sub.k;
 else θ′.sub.k=360−θ′.sub.k;
}
(50) Finally, we obtain the pre-rotation longitude and latitude (θ′.sub.k, φ′.sub.k) of each of the three vertices (k=0˜2).
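The program codes of step S708 can be sketched in Python as follows. This sketch assumes the convention of equations (9)˜(11), i.e. Z′=sin φ′, so the latitude is recovered from atan2 of z1 over r′, and the longitude is folded into [0°, 360°) by quadrant:

```python
# Sketch of step S708: recover the pre-rotation longitude/latitude
# (theta', phi') from pre-rotation space coordinates (X', Y', Z').
import math

def pre_rotation_lonlat(Xp, Yp, Zp):
    r = math.sqrt(Xp * Xp + Yp * Yp)
    if r == 0:
        # On the rotation axis: longitude is arbitrary, latitude is a pole.
        return 0.0, (-90.0 if Zp < 0 else 90.0)
    x1, y1, z1 = abs(Xp), abs(Yp), abs(Zp)
    phi = math.degrees(math.atan2(z1, r))
    if Zp < 0:
        phi = -phi
    theta = math.degrees(math.atan2(y1, x1))
    if Xp >= 0 and Yp >= 0:
        pass                      # first quadrant: theta unchanged
    elif Xp < 0 and Yp >= 0:
        theta = 180 - theta       # second quadrant
    elif Xp < 0 and Yp < 0:
        theta = 180 + theta       # third quadrant
    else:
        theta = 360 - theta       # fourth quadrant
    return theta, phi
```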
(51) Step S710: Determine which image is the texture image according to the pre-rotation latitudes of the three vertices and the latitudes of the north pole (Zenith) and the south pole (Nadir). Given φ.sub.north=−90° and φ.sub.south=90°, the primitive setup unit 222 calculates the following three first latitude differences: |φ′.sub.0−φ.sub.north|, |φ′.sub.1−φ.sub.north| and |φ′.sub.2−φ.sub.north|, and the following three second latitude differences: |φ′.sub.0−φ.sub.south|, |φ′.sub.1−φ.sub.south| and |φ′.sub.2−φ.sub.south|. If all of the above three first latitude differences |φ′.sub.k−φ.sub.north| are less than a threshold TH, it is determined that the north polar image 240 is designated as the texture image and texID.sub.k=1; if all of the above three second latitude differences are less than the threshold TH, it is determined that the south polar image 250 is designated as the texture image and texID.sub.k=2; otherwise, it is determined that the pre-rotation equirectangular panoramic image 230 is designated as the texture image and texID.sub.k=0. In the complete integral vertex stream, fill in the “texID” field of the complete data structure of each of the three vertices with the texID.sub.k value, for k=0˜2.
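Step S710 can be sketched as below. The value of the threshold TH is not specified in this text, so the default of 10° (matching Ω) is an assumption, as is the use of absolute latitude differences:

```python
# Sketch of step S710: choose the texture image from the three
# pre-rotation latitudes. Returns the texID value:
#   1 = north polar image, 2 = south polar image, 0 = panorama.
PHI_NORTH, PHI_SOUTH = -90.0, 90.0

def select_texture_id(phis, TH=10.0):
    if all(abs(p - PHI_NORTH) < TH for p in phis):
        return 1
    if all(abs(p - PHI_SOUTH) < TH for p in phis):
        return 2
    return 0
```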
(52) Step S714: Calculate the pre-rotation texture coordinates (u.sub.k, v.sub.k) of the three vertices (k=0˜2) according to their pre-rotation longitude and latitude (θ′.sub.k, φ′.sub.k) and the texture IDs (texID.sub.k). In one embodiment, the following program codes are provided in the primitive setup unit 222 to obtain the pre-rotation texture coordinates.

for (k=0; k<3; k++) {
 if (texID.sub.k==0) { /* pre-rotation equirectangular panoramic image is designated as the texture image */
  s=(θ′.sub.k−θ.sub.start)/360;
  if (s<0) s=s+1;
  if (s>=1) s=s−1;
  t=(φ′.sub.k−φ.sub.start)/180;
  u.sub.k=s*Wr;
  v.sub.k=t*Hr;
 } else { /* north/south polar image is designated as the texture image */
  if (texID.sub.k==1) φ.sub.pole=φ.sub.north;
  else φ.sub.pole=φ.sub.south;
  r=|φ′.sub.k−φ.sub.pole|/Ω;
  u.sub.k=(W.sub.p/2)*(1+r*cos(θ′.sub.k*π/180));
  v.sub.k=(W.sub.p/2)*(1+r*sin(θ′.sub.k*π/180));
 }
}
(53) At the end of this step, the pre-rotation texture coordinates (u.sub.k, v.sub.k) of the three vertices (k=0˜2) are determined if any of the north and the south polar images is designated as the texture image (texID.sub.k equal to 1 or 2). In the complete integral vertex stream, the “texture coordinates (u,v)” field of the complete data structure of each of the three vertices is filled in with the pre-rotation texture coordinates (u.sub.k, v.sub.k).
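The per-vertex computation of step S714 can be sketched in Python as follows, using the constants stated earlier (names illustrative):

```python
# Sketch of step S714: pre-rotation texture coordinates (u, v) for one
# vertex, given its pre-rotation longitude/latitude and texture ID.
import math

Wr, Hr = 4096, 2048        # panorama size
W_P, OMEGA = 256, 10.0     # polar image side length, cap span
THETA_START, PHI_START = -180.0, -90.0
PHI_NORTH, PHI_SOUTH = -90.0, 90.0

def pre_rotation_texture_coords(theta, phi, tex_id):
    if tex_id == 0:   # equirectangular panorama is the texture image
        s = (theta - THETA_START) / 360.0
        if s < 0:
            s += 1
        if s >= 1:
            s -= 1
        t = (phi - PHI_START) / 180.0
        return s * Wr, t * Hr
    # north (tex_id==1) or south (tex_id==2) polar image
    phi_pole = PHI_NORTH if tex_id == 1 else PHI_SOUTH
    r = abs(phi - phi_pole) / OMEGA
    u = (W_P / 2) * (1 + r * math.cos(math.radians(theta)))
    v = (W_P / 2) * (1 + r * math.sin(math.radians(theta)))
    return u, v
```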
(54) Step S716: Determine whether texID.sub.k is equal to 0. Stated in another way, determine whether the pre-rotation equirectangular panoramic image is designated as the texture image. If YES, the flow goes to step S718; otherwise, the flow goes to step S720.
(55) Step S718: Determine whether the three pre-rotation vertices need correction. If YES, the flow goes to step S719; otherwise, the flow goes to step S720.
(56) Step S719: Correct u coordinates of the left vertex/vertices among the three pre-rotation vertices TaTbTc on the left side of the pre-rotation equirectangular panoramic image. If the three pre-rotation vertices TaTbTc need correction, u coordinates of all the left vertex/vertices among the three pre-rotation vertices TaTbTc on the left side of the pre-rotation equirectangular panoramic image are corrected (or wrapped around) along the u-axis by the primitive setup unit 222 so that the left vertex/vertices are re-located nearer to the right vertices. Specifically, the primitive setup unit 222 corrects the u coordinates of the left vertex/vertices on the left side by respectively adding Wo to the u coordinates of the left vertex/vertices to obtain new u coordinates.
(57) Finally, we obtain the pre-rotation texture coordinates (u′.sub.k, v′.sub.k) of the three vertices (k=0˜2) for the pre-rotation equirectangular panoramic image 230 assigned as the texture image. In the complete integral vertex stream, the “texture coordinates (u,v)” field of the complete data structure of each of the three vertices is filled in with the pre-rotation texture coordinates (u′.sub.k, v′.sub.k).
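The wrap-around correction of steps S718˜S719 can be sketched as below. The u-span threshold that triggers the correction is not specified here, so Wo/2 is an assumption, as is using the same value to decide which vertices count as "left" vertices:

```python
# Sketch of steps S718-S719: if a triangle straddles the seam of the
# equirectangular panorama (large u-span), wrap the left vertices
# around by adding Wo to their u coordinates.
Wo = 4096   # width of the pre-rotation equirectangular panorama

def wrap_u_coords(us, threshold=Wo / 2):
    if max(us) - min(us) <= threshold:
        return list(us)           # no correction needed
    return [u + Wo if u < threshold else u for u in us]
```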
(58) Step S720: Send the complete data structures of the three vertices to the render engine 226. The primitive/triangle is formed by the three vertices (k=0˜2). The primitive setup unit 222 sends the three complete data structures of the three vertices (k=0˜2) in the complete integral vertex stream to the render engine 226. According to the complete data structures of the three vertices in the complete integral vertex stream from the primitive setup unit 222, the render engine 226 performs triangle rasterization operations and texture mapping to render the primitive/triangle in the post-rotation equirectangular panoramic image 280. Specifically, the render engine 226 performs triangle rasterization operations for a point Q having the image coordinates (x.sub.Q, y.sub.Q) within the triangle formed by the three vertices (k=0˜2) in the post-rotation equirectangular panoramic image to generate the texture coordinates (u.sub.Q, v.sub.Q) of the point Q; afterward, the render engine 226 texture-maps the texture data from the texture space (one of the pre-rotation equirectangular panoramic image 230, the north polar image 240 and the south polar image 250) specified by texID.sub.k using any appropriate method (such as nearest-neighbour interpolation, bilinear interpolation or trilinear interpolation) according to the texture coordinates of point Q and the three vertices (k=0˜2) to generate a destination value of the point Q. In this manner, the render engine 226 sequentially generates a destination value for each point/pixel until all the points within the triangle in the post-rotation equirectangular panoramic image 280 are processed/completed.
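Among the interpolation options the text lists for the texture-mapping of step S720, bilinear interpolation can be sketched as below on a row-major 2D array; this is an illustrative sketch, not the render engine's actual implementation:

```python
# Sketch of a bilinear texture lookup: blend the four texels around
# the (non-integer) texture coordinates (u, v).
def bilinear_sample(img, u, v):
    h, w = len(img), len(img[0])
    u0, v0 = int(u), int(v)                      # top-left texel
    u1, v1 = min(u0 + 1, w - 1), min(v0 + 1, h - 1)
    fu, fv = u - u0, v - v0                      # fractional offsets
    top = img[v0][u0] * (1 - fu) + img[v0][u1] * fu
    bot = img[v1][u0] * (1 - fu) + img[v1][u1] * fu
    return top * (1 - fv) + bot * fv
```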
(59) Step S722: Determine whether all the vertices within the pseudo/combined integral vertex stream are processed by the primitive setup unit 222. If YES, the flow is terminated and the post-rotation equirectangular panoramic image 280 is produced; otherwise, return to the step S702 for the following three vertices in the pseudo/combined integral vertex stream.
(60) In one embodiment, the processing unit 205 and the primitive setup unit 222 are implemented with a general-purpose processor having a first program memory (not shown); the render engine 226 is implemented with a second program memory and a graphics processing unit (GPU) (not shown). The first program memory stores a first processor-executable program and the second program memory stores a second processor-executable program. When the first processor-executable program is executed by the general-purpose processor, the general-purpose processor is configured to function as: the processing unit 205 and the primitive setup unit 222. When the second processor-executable program is executed by the GPU, the GPU is configured to function as: the render engine 226 that performs rasterization, texture mapping and blending operations to form a pre-rotation equirectangular panoramic image 230, a post-rotation equirectangular panoramic image 280, a south polar image 250 and a north polar image 240.
(61) While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention should not be limited to the specific construction and arrangement shown and described, since various other modifications may occur to those ordinarily skilled in the art.