PANORAMIC VIDEO MAPPING METHOD BASED ON MAIN VIEWPOINT

20200143511 · 2020-05-07


    Abstract

    Disclosed are a panoramic video forward mapping method and a panoramic video inverse mapping method, which relate to the field of virtual reality (VR) videos. In the present disclosure, the forward mapping method comprises: mapping, based on a main viewpoint, Areas I, II, and III on the sphere onto corresponding areas on the plane, wherein Area I corresponds to the area with an included angle of 0°–Z.sub.1, Area II corresponds to the area with an included angle of Z.sub.1–Z.sub.2, and Area III corresponds to the area with an included angle of Z.sub.2–180°. The panoramic video forward mapping method maps a spherical source corresponding to a panoramic image A onto a plane square image B; the panoramic video inverse mapping method maps the plane square image B back to the sphere to be rendered and viewed. The present disclosure may significantly lower the resolution of a video, effectively lower the code rate for coding the panoramic video, and reduce the complexity of coding and decoding, thereby lowering the code rate while guaranteeing the video quality of the region of interest (ROI).

    Claims

    1. A panoramic video forward mapping method, comprising: mapping, based on a main viewpoint, a sphere corresponding to a panoramic image A to a plane square image B; wherein the longitude-latitude of the main viewpoint center is (lon, lat); partitioning the sphere into three areas according to the included angle between a point's direction from the spherical center and the direction from the spherical center to the main viewpoint center, denoted as Area I, Area II, and Area III, respectively, wherein Area I corresponds to the area with an angle of 0°–Z₁, Area II corresponds to the area with an angle of Z₁–Z₂, and Area III corresponds to the area with an angle of Z₂–180°; then mapping Area I into a circle on the image B with the image center as the circle center and ρ₀ as the radius; mapping Area II into a circular ring on the image B with the image center as the circle center, ρ₀ as the inner radius, and the outer radius being half of the image size; mapping Area III into four circles with the four corners of the image B as respective centers, each of the circles being tangent to the outer ring of the circular ring corresponding to Area II; when Z₁=0, ρ₀=0, and at this point Area I does not exist while only Area II and Area III exist; when Z₁=Z₂, ρ₀ is half of the image size, and at this point Area II does not exist while only Area I and Area III exist; after forward mapping with the above method, part of the region in the image B is not used, which may be filled with any pixel value.

    2. The panoramic video forward mapping method according to claim 1, wherein the mapping of the sphere corresponding to the panoramic image A to a plane square image B with a resolution of N×N specifically comprises, for each pixel point (X, Y) in the plane image B, steps of: 1) computing the coordinate on the sphere corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is North Latitude 90° or South Latitude 90°, specifically comprising steps of: A) computing the distance ρ from each pixel point (X, Y) in the plane image B to the plane image center; B) determining the area where the pixel point (X, Y) is located based on the value of ρ; when the pixel point (X, Y) is in Area I or Area II, shifting to step C); otherwise, computing the distances from the pixel point (X, Y) to the four corners of the image and taking the shortest distance, denoted as ρ′; determining whether the pixel point (X, Y) is in Area III based on the value of ρ′; when the pixel point (X, Y) is in Area III, shifting to step C); when the pixel point (X, Y) is not in Area III, the pixel point (X, Y) is an unused pixel, which may be filled with any value; then ending the operation; C) computing the value of the included angle Z between the current point and the main viewpoint center based on the area where the pixel point (X, Y) is located and the value of ρ or ρ′; D) computing the latitude corresponding to the pixel point (X, Y) based on the equation latitude = 90° − Z when the main viewpoint center is North Latitude 90°, or based on the equation latitude = Z − 90° when the main viewpoint center is South Latitude 90°; E) if the pixel point (X, Y) is in Area I or Area II, computing the longitude of the current pixel point based on the direction of longitude 0° selected for Area I and Area II on the plane and the values of X, Y; if the pixel point (X, Y) is in Area III, computing the longitude of the current pixel point based on the longitudes corresponding to the four circles and the values of X, Y; wherein the direction of longitude 0° in Areas I and II and the longitudes corresponding to the four circles in Area III may be set as desired; F) obtaining the coordinate of the pixel point on the sphere based on the longitude-latitude of the pixel point, the coordinate being the coordinate on the sphere corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is North Latitude 90° or South Latitude 90°; 2) rotating the coordinate obtained in step F) to obtain the coordinate corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is (lon, lat); 3) taking the pixel value of the corresponding position on the sphere, or obtaining the corresponding value by interpolation, based on the rotated coordinate from step 2), as the pixel value of the pixel point (X, Y) in the plane image B.

    3. The panoramic video forward mapping method according to claim 1, wherein mapping formats of the panoramic image A include, but are not limited to, a longitude-latitude image, a cubic mapping image, and a multi-channel camera-captured panoramic video.

    4. The panoramic video forward mapping method according to claim 1, wherein the values of the parameters lon, lat, Z₁, Z₂ and ρ₀ in the method may all be set as desired.

    5. The panoramic video forward mapping method according to claim 2, wherein the step A) of computing the distance ρ from the pixel point (X, Y) in the plane image B to the center of the plane image B specifically comprises: normalizing the pixel point (X, Y) in the plane image B to a range from −1 to 1, wherein the normalized coordinate is (X′, Y′); then computing the distance ρ = √(X′² + Y′²) from the point (X′, Y′) to the center of the plane image B.

    6. The panoramic video forward mapping method according to claim 2, wherein the step B) of determining the area where the pixel point (X, Y) is located is performed as such: when ρ < ρ₀, the pixel point (X, Y) is in Area I; when 1 ≥ ρ > ρ₀, the pixel point (X, Y) is in Area II; when ρ > 1, computing the distances from the pixel point (X, Y) to the four corners of the plane image B and taking the shortest distance therein, denoted as ρ′; when ρ′ ≤ √2 − 1, the pixel point (X, Y) is in Area III; when ρ′ > √2 − 1, the pixel point (X, Y) in the plane image is an unused pixel, which may be filled with any value; in the steps above, points satisfying ρ = ρ₀ may be assigned to Area I or Area II.
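The distance tests of this claim can be sketched in code. The following Python sketch assumes normalized coordinates in [−1, 1] (as introduced in claim 5) and a freely chosen ρ₀; the function name and the handling of the ρ = ρ₀ boundary are illustrative, not fixed by the claim.

```python
import math

def classify(xn, yn, rho0):
    """Classify a normalized pixel (xn, yn) in [-1, 1]^2 into Area I, II, III,
    or 'unused', following the distance tests of claim 6 (a sketch; rho0 and
    the treatment of the rho == rho0 boundary are free choices)."""
    rho = math.hypot(xn, yn)                      # distance to the image center
    if rho <= rho0:
        return "I"                                # inside the main-viewpoint circle
    if rho <= 1.0:
        return "II"                               # inside the circular ring
    # shortest distance to the four corners (+-1, +-1)
    rho_c = min(math.hypot(xn - cx, yn - cy)
                for cx in (-1.0, 1.0) for cy in (-1.0, 1.0))
    if rho_c <= math.sqrt(2) - 1:
        return "III"                              # inside a corner quarter-circle
    return "unused"                               # may be filled with any value
```

A quick check: the image center falls in Area I, a point near an edge midpoint in Area II, a point near a corner in Area III, and points outside all circles are unused.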

    7. The panoramic video forward mapping method according to claim 2, wherein the step C) of computing the value of the included angle Z between the current point and the main viewpoint center based on the area where the pixel point (X, Y) is located and the value of ρ or ρ′ specifically comprises: when the pixel point (X, Y) is in Area I, computing Z based on the equation Z = 2·arcsin(ρ/C₀), wherein C₀ is determined by the boundary condition: when ρ = ρ₀, Z = Z₁; when the pixel point (X, Y) is in Area II, solving Z based on the equation ρ² = C₁ + C₀·∫sinZ·f₁(Z)dZ, wherein C₀, C₁ are determined by the boundary conditions: when ρ = ρ₀, Z = Z₁, and when ρ = 1, Z = Z₂; and f₁(Z) is any function of Z; when the pixel point (X, Y) is in Area III, solving the angle Z based on the equation ρ′² = C₁ + C₀·∫sinZ·f₂(Z)dZ, wherein C₀, C₁ are determined by the boundary conditions: when ρ′ = 0, Z = 180°, and when ρ′ = √2 − 1, Z = Z₂; and f₂(Z) is any function of Z.

    8. The panoramic video forward mapping method according to claim 2, wherein the rotating in step 2) specifically comprises: computing the rectangular coordinates (X_sphere, Y_sphere, Z_sphere) of the point on the unit sphere based on the longitude and the latitude; then multiplying the coordinates (X_sphere, Y_sphere, Z_sphere) by the rotation matrix corresponding to rotating from North Latitude 90° or South Latitude 90° to the main viewpoint center (lon, lat) to obtain the rectangular coordinates (X′_sphere, Y′_sphere, Z′_sphere) on the sphere corresponding to the pixel point (X, Y) in the plane image B.

    9. A panoramic video inverse mapping method, comprising: mapping, based on a main viewpoint, a plane square image B back to the sphere, wherein the longitude-latitude of the main viewpoint center of the square image B is (lon, lat); Area I refers to a circle on the image B with the image center as the circle center and a radius of ρ₀, which is inversely mapped to an area on the sphere having an included angle of 0°–Z₁ with the connecting line from the main viewpoint center to the spherical center; Area II is a circular ring on the image B which takes the image center as the circle center, with an inner radius of ρ₀ and an outer radius being half of the image size, which is inversely mapped to an area on the sphere having an included angle of Z₁–Z₂ with the connecting line from the main viewpoint center to the spherical center; Area III refers to four circles with the four corners of the image B as the circle centers, each circle being tangent to the outer perimeter of the circular ring corresponding to Area II, which is inversely mapped to an area on the sphere having an included angle of Z₂–180° with the connecting line from the main viewpoint center to the spherical center; wherein when ρ₀ = 0, Z₁ = 0, and at this point Area I does not exist while only Area II and Area III exist; when ρ₀ is half of the image size, Z₁ = Z₂, and at this point Area II does not exist while only Area I and Area III exist; values of the parameters lon, lat, Z₁, Z₂ and ρ₀ may be obtained from the bitstream, but are not limited thereto.

    10. The panoramic video inverse mapping method according to claim 9, wherein for each point on the sphere with coordinates (longitude, latitude) or (X_sphere, Y_sphere, Z_sphere), the method of inversely mapping the plane square image B back to the sphere specifically comprises steps of: 1) rotating the point on the sphere to obtain the corresponding longitude-latitude (longitude′, latitude′) supposing that the current main viewpoint center is at North Latitude 90° or South Latitude 90°; 2) computing the included angle Z with the North Latitude 90° or South Latitude 90° direction based on the latitude′, and then determining the area where the current point is located based on the value of Z; 3) computing the distance ρ from the current point mapped to the plane to the center of the plane image B, or the shortest distance ρ′ among the distances from the current point mapped to the plane to the four corners of the plane image B, based on the area corresponding to the current point and the value of Z; 4) solving the coordinates (X, Y) of the current point after being mapped to the plane image B based on the ρ or ρ′ solved in step 3) and the longitude′; 5) taking the pixel value at the position (X, Y) on the plane image B, or performing interpolation on nearby pixels, as the pixel value of the point with coordinates (longitude, latitude) or (X_sphere, Y_sphere, Z_sphere) on the sphere; executing the steps 1)–5) for all points on the sphere, thereby completing inverse mapping of the panoramic video.

    11. The panoramic video inverse mapping method according to claim 10, wherein the rotating in step 1) specifically comprises: converting the coordinates (longitude, latitude) into rectangular coordinates (X_sphere, Y_sphere, Z_sphere); multiplying the coordinates (X_sphere, Y_sphere, Z_sphere) by the rotation matrix corresponding to rotating from the main viewpoint center (lon, lat) to North Latitude 90° or South Latitude 90° to obtain the rotated rectangular coordinates (X′_sphere, Y′_sphere, Z′_sphere); then computing the corresponding longitude-latitude (longitude′, latitude′) based on the rectangular coordinates (X′_sphere, Y′_sphere, Z′_sphere).

    12. The panoramic video inverse mapping method according to claim 10, wherein the step 2) of computing the included angle Z with the North Latitude 90° or South Latitude 90° direction based on the latitude′ and then determining the area where the current point is located based on the value of Z comprises: computing the value of Z based on the equation Z = 90° − latitude′ when the main viewpoint center is at North Latitude 90°; computing the value of Z based on the equation Z = latitude′ + 90° when the main viewpoint center is at South Latitude 90°; when 0° ≤ Z < Z₁, the current point is located in Area I; when Z₁ < Z < Z₂, the current point is located in Area II; when Z₂ < Z ≤ 180°, the current point is located in Area III; when Z = Z₁, the current point is located in Area I or Area II; when Z = Z₂, the current point is located in Area II or Area III.

    13. The panoramic video inverse mapping method according to claim 10, wherein the step 3) of computing the distance ρ from the current point mapped to the plane to the center of the plane image B, or the shortest distance ρ′ among the distances from the current point mapped to the plane to the four corners of the plane image B, based on the area corresponding to the current point and the value of Z specifically comprises: when the current point is in Area I, computing ρ based on the equation ρ = C₀·sin(Z/2), wherein C₀ is determined by the boundary condition: when ρ = ρ₀, Z = Z₁; when the current point is in Area II, solving ρ based on the equation ρ² = C₁ + C₀·∫sinZ·f₁(Z)dZ, wherein C₀, C₁ are determined by the boundary conditions: when ρ = ρ₀, Z = Z₁, and when ρ = 1, Z = Z₂; and f₁(Z) is any function of Z; when the current point is in Area III, solving ρ′ based on the equation ρ′² = C₁ + C₀·∫sinZ·f₂(Z)dZ, wherein C₀, C₁ are determined by the boundary conditions: when ρ′ = 0, Z = 180°, and when ρ′ = √2 − 1, Z = Z₂; and f₂(Z) is any function of Z.
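For Area I, the relation ρ = C₀·sin(Z/2) together with its boundary condition fixes C₀ = ρ₀/sin(Z₁/2). A minimal Python sketch (the function name is illustrative; ρ₀ and Z₁ are parameters that may be set as desired per claim 9):

```python
import math

def rho_area1(Z_deg, rho0, Z1_deg):
    """Inverse-mapping radius for Area I: rho = C0 * sin(Z/2), with C0
    determined by the boundary condition rho(Z1) = rho0."""
    C0 = rho0 / math.sin(math.radians(Z1_deg) / 2)
    return C0 * math.sin(math.radians(Z_deg) / 2)
```

By construction, Z = Z₁ maps back to the Area I boundary radius ρ₀, and Z = 0 (the main viewpoint center) maps to the image center.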

    14. The panoramic video inverse mapping method according to claim 10, wherein the step 4) of solving the value of the coordinates (X, Y) specifically comprises: when the point is located in Area I or Area II, obtaining the included angle between the current point and the direction of longitude 0° chosen for Area I and Area II on the plane based on the longitude′, and then solving the coordinates (X, Y) of the current point mapped to the plane based on the value of the included angle and the value of ρ; when the point is located in Area III, obtaining the coordinates (X, Y) of the current point mapped to the plane based on the values of longitude′ and ρ′ and the longitudes corresponding to the four circles.

    Description

    DRAWINGS

    [0043] FIG. 1 shows a schematic diagram of the correspondence relationship between a plane image and a sphere under panoramic image mapping, wherein through the panoramic image forward mapping method provided by the present disclosure, the Areas I, II, and III on the sphere are mapped to the Areas I, II, and III on the plane, respectively; through the panoramic image inverse mapping method provided by the present disclosure, the Areas I, II, and III on the plane are mapped to the Areas I, II, and III on the sphere, respectively;

    [0044] wherein (a) is a schematic diagram of a panoramic image on a sphere; (b) is a schematic diagram of the image mapped using the method of the present disclosure; Area I in (a) refers to the main viewpoint area with point C as the center, which corresponds to the area from 0° to Z₁; Z₁ denotes the angle ∠BOC in (a), which may be set as desired; in the present disclosure, the Area I in (a) is mapped to the circular surface of Area I in (b), wherein the radius of the circular surface of Area I in (b) is ρ₀, which may be set as desired; Area II on the sphere of (a) is a first-stage non-viewpoint area, which corresponds to the area from Z₁ to Z₂, where Z₂ denotes the angle ∠AOC in (a), which may be set as desired; in the present disclosure, Area II in (a) is mapped to the circular ring of Area II in the square of (b); Area III on the sphere of (a) is a second-stage non-viewpoint area, which corresponds to the area from Z₂ to 180°; in the present disclosure, the Area III in (a) is mapped to the circular faces of Area III in the square of (b), where Area III in the square comprises four circular faces; areas other than Area I, Area II, and Area III in the square of (b) are unused areas, which may be filled with any pixel values.

    [0045] FIG. 2 is a schematic diagram of the correspondence between sizes under the mapping and sizes on the sphere in the present disclosure, wherein (a) is a schematic diagram of computing the size of a small block of area on the sphere; (b) is a schematic diagram of computing the size of a small block of area on a plane; the size of a small area on a unit sphere may be denoted as S_sphere = sinZ·dφ·dZ, which, after being mapped to the circular face in the plane, has a corresponding size of S_circular surface = ρ·dρ·dφ; the ratio between the two sizes is

    [00004] S_circular surface/S_sphere = ρ·dρ/(sinZ·dZ);

    when the ratio is a constant, i.e.,

    [00005] ρ·dρ/(sinZ·dZ) = C,

    the relation between ρ and Z is solved to be

    [00006] ρ = C₀·sin(Z/2),

    which guarantees that the sampling density in the main viewpoint area is consistent; if the ratio is a function f(Z) that does not increase with Z, i.e.,

    [00007] ρ·dρ/(sinZ·dZ) = f(Z),

    the relation between ρ and Z is solved to be ρ² = C₁ + C₀·∫sinZ·f(Z)dZ, which guarantees that the farther from the main viewpoint area, the lower the sampling density.

    [0046] FIG. 3 is a flow diagram of the forward mapping method according to the present disclosure;

    [0047] FIG. 4 is a flow diagram of the inverse mapping method according to the present disclosure;

    [0048] FIG. 5 is a schematic diagram of computing forward mapping and inverse mapping in an embodiment of the present disclosure;

    [0049] wherein Area I and Area II choose the direction with the longitude being 0° as the forward direction of the X axis; and the correspondence relationship with respect to the longitude in Area III is obtained through the mapping computation.

    [0050] FIG. 6 is an effect diagram resulting from forward mapping according to an embodiment of the present disclosure.

    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

    [0051] Hereinafter, the present disclosure is further described through the embodiments, but the scope of the present disclosure is not limited in any manner.

    [0052] An embodiment of the present disclosure provides a main viewpoint-based panoramic image mapping method, comprising a panoramic image forward mapping method and a corresponding inverse mapping method, embodiments of which will be explained below, respectively.

    [0053] The panoramic image forward mapping method is shown in FIG. 1, wherein Areas I, II, and III on the sphere are mapped to the Areas I, II, and III on the plane, respectively. In FIG. 1, the left part (a) is a schematic diagram of a panoramic image on a sphere, and the right part (b) is a schematic diagram of an image mapped using the method of the present disclosure. Area I in (a) refers to the main viewpoint area with point C as the center, which corresponds to the area from 0° to Z₁; Z₁ denotes the angle ∠BOC in (a), which may be set as desired; in the present disclosure, the Area I in (a) is mapped to the circular surface of Area I in (b), wherein the radius of the circular surface of Area I in (b) is ρ₀, which may be set as desired; Area II on the sphere of (a) is a first-stage non-viewpoint area, which corresponds to the area from Z₁ to Z₂, where Z₂ denotes the angle ∠AOC in (a), which may be set as desired; in the present disclosure, Area II in (a) is mapped to the circular ring of Area II in the square of (b); Area III on the sphere of (a) is a second-stage non-viewpoint area, which corresponds to the area from Z₂ to 180°; in the present disclosure, the Area III in (a) is mapped to the circular faces of Area III in the square of (b), where Area III in the square comprises four circular faces; areas other than Area I, Area II, and Area III in the square of (b) are unused areas, which may be filled with any pixel values.

    [0054] As shown in FIG. 2, the size of a small area on a unit sphere may be expressed as S_sphere = sinZ·dφ·dZ, the corresponding size of which after being mapped to the circular face in the plane image is S_circular surface = ρ·dρ·dφ, wherein the ratio between the two sizes is

    [00008] S_circular surface/S_sphere = ρ·dρ/(sinZ·dZ).

    In the circular face corresponding to Area I in the square, supposing the size ratio is a constant, namely

    [00009] ρ·dρ/(sinZ·dZ) = C,

    the relation between ρ and Z is solved to be

    [00010] ρ = C₀·sin(Z/2),

    which guarantees a consistent sampling density in the circular face of Area I in the square (i.e., in the main viewpoint area). In the circular faces corresponding to Areas II and III, the size ratio is a function f(Z) that does not increase with Z (the f(Z) functions corresponding to Area II and Area III may be different), i.e.,

    [00011] ρ·dρ/(sinZ·dZ) = f(Z);

    then the relation between ρ and Z is solved to be ρ² = C₁ + C₀·∫sinZ·f(Z)dZ, which guarantees that the farther from the main viewpoint area, the lower the sampling density is.
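The constant-ratio property for Area I can be checked numerically: with ρ = C₀·sin(Z/2), the ratio ρ·dρ/(sinZ·dZ) equals C₀²/4 for every Z, since sinZ = 2·sin(Z/2)·cos(Z/2). A finite-difference sketch (illustrative code, not part of the disclosure):

```python
import math

def ratio(Z, C0=1.0, h=1e-6):
    """Finite-difference check that rho = C0*sin(Z/2) makes the area ratio
    rho*(d rho/dZ)/sin(Z) a constant, equal to C0**2/4."""
    rho = lambda z: C0 * math.sin(z / 2)
    drho = (rho(Z + h) - rho(Z - h)) / (2 * h)    # central difference for d(rho)/dZ
    return rho(Z) * drho / math.sin(Z)
```

Evaluating `ratio` at different angles Z returns C₀²/4 (0.25 for C₀ = 1) up to numerical error, confirming the equal-area behavior of the main viewpoint circle.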

    [0055] In an embodiment, a sphere corresponding to a panoramic image A of a certain format (e.g., a longitude-latitude map, a cubic map image, etc.) is mapped to a plane image B (with a resolution of N×N) corresponding to the panoramic mapping designed in the present disclosure. The specific flow is described as follows (wherein the first through sixth steps compute the coordinates on the sphere corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is at North Latitude 90°; the seventh step rotates the coordinates to the coordinates corresponding to the current main viewpoint center):

    [0056] Step 1: normalizing the pixel point (X, Y) in the plane image B to a range from −1 to 1, wherein the normalized coordinates are (X′, Y′) (here, it is optional not to normalize; however, normalization facilitates derivation of the subsequent equations).

    [0057] Step 2: computing the distance ρ = √(X′² + Y′²) from the point (X′, Y′) to the center of the plane image B.

    [0058] Step 3: if ρ ≤ ρ₀, the point is located in Area I, and the angle is

    [00012] Z = 2·arcsin(ρ/C₀)

    (C₀ in the equation is determined by the boundary condition: when ρ = ρ₀, Z = Z₁); if 1 ≥ ρ > ρ₀, the point is located in Area II, and the angle Z is solved from ρ² = C₁ + C₀·∫sinZ·f₁(Z)dZ (C₀, C₁ in the equation are determined by the boundary conditions: when ρ = ρ₀, Z = Z₁, and when ρ = 1, Z = Z₂); if ρ > 1, computing the distances from the point (X′, Y′) to the four corners of the plane image B and taking the shortest distance, denoted as ρ′; if ρ′ ≤ √2 − 1, the point is located in Area III, and the angle Z is solved from ρ′² = C₁ + C₀·∫sinZ·f₂(Z)dZ (C₀, C₁ in the equation are determined by the boundary conditions: when ρ′ = 0, Z = 180°, and when ρ′ = √2 − 1, Z = Z₂); if ρ′ > √2 − 1, the pixel point (X, Y) in the plane image B is an unused pixel, which may be filled with any value.

    [0059] Step 4: computing the latitude corresponding to the point based on the equation latitude = 90° − Z, where a positive latitude indicates North Latitude and a negative latitude indicates South Latitude.

    [0060] Step 5: computing the longitude of the current point based on the values of X′, Y′, where a positive longitude indicates East Longitude and a negative longitude indicates West Longitude.

    [0061] Step 6: obtaining the coordinates of the point on the sphere based on the longitude-latitude; i.e., the coordinates on the sphere corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is at North Latitude 90°.

    [0062] Step 7: computing the Cartesian coordinates (X_sphere, Y_sphere, Z_sphere) of the point on the unit sphere based on the longitude and the latitude; then multiplying the coordinates (X_sphere, Y_sphere, Z_sphere) by the rotation matrix corresponding to rotating from North Latitude 90° to the main viewpoint center to obtain the Cartesian coordinates (X′_sphere, Y′_sphere, Z′_sphere) on the sphere corresponding to the pixel point (X, Y) in the plane image B (here, it is optional to rotate the longitude-latitude coordinates directly, but rotation with the Cartesian coordinates facilitates computation).
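One way to realize the rotation of Step 7, assuming the axis convention of Equations 13–15 (the Y axis through the poles): tilt about the X axis by (90° − lat), then spin about the Y axis by lon. This factorization is an illustrative sketch; the disclosure does not fix a particular construction of the matrix.

```python
import math

def rotation_north_pole_to(lon_deg, lat_deg):
    """3x3 matrix taking the north pole (0, 1, 0) to the viewpoint center
    (sin(lon)cos(lat), sin(lat), cos(lon)cos(lat)); a sketch under the axis
    convention X = sin(lon)cos(lat), Y = sin(lat), Z = cos(lon)cos(lat)."""
    a = math.radians(90.0 - lat_deg)   # tilt away from the pole, about X
    b = math.radians(lon_deg)          # spin to the target longitude, about Y
    Rx = [[1, 0, 0],
          [0, math.cos(a), -math.sin(a)],
          [0, math.sin(a),  math.cos(a)]]
    Ry = [[ math.cos(b), 0, math.sin(b)],
          [0, 1, 0],
          [-math.sin(b), 0, math.cos(b)]]
    # plain 3x3 matrix product: R = Ry @ Rx
    return [[sum(Ry[i][k] * Rx[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]
```

The second column of the resulting matrix is the image of the north pole, i.e., the Cartesian coordinates of the main viewpoint center.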

    [0063] Step 8: taking the pixel value of the corresponding position on the sphere (or obtaining the corresponding value by interpolation) based on the coordinates (X′_sphere, Y′_sphere, Z′_sphere) on the sphere as the pixel value of the pixel point (X, Y) in the plane image B.

    [0064] FIG. 3 is a flow diagram of the forward mapping method according to the present disclosure. The specific computation for the forward mapping process in this embodiment is provided below:

    [0065] Step 1: normalizing the pixel point (X, Y) in the plane image B (with a resolution of N×N) to a range from −1 to 1; the normalized coordinates are (X′, Y′); the computing equations are as follows:

    [00013] X′ = 2X/N − 1 (Equation 1)

    Y′ = 2Y/N − 1 (Equation 2)

    [0066] Step 2: computing the distance ρ from the point (X′, Y′) to the center of the plane image B:

    ρ = √(X′² + Y′²) (Equation 3)

    [0067] Particularly, ρ refers to the distance from the point (X′, Y′) to the center of the plane image B.
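Equations 1–3 can be combined into a small helper; a sketch assuming integer pixel indices in an N×N image (the function name is illustrative):

```python
import math

def normalize_and_rho(X, Y, N):
    """Steps 1-2: map pixel indices (X, Y) of an N x N image to normalized
    coordinates in [-1, 1] (Equations 1-2) and return the distance rho to
    the image center (Equation 3)."""
    Xn = 2.0 * X / N - 1.0
    Yn = 2.0 * Y / N - 1.0
    return Xn, Yn, math.hypot(Xn, Yn)
```

For example, the central pixel of a 4×4 image normalizes to (0, 0) with ρ = 0, while the pixel (0, 0) normalizes to (−1, −1) with ρ = √2.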

    [0068] Step 3: if ρ ≤ ρ₀, the point is located in Area I; if 1 ≥ ρ > ρ₀, the point is located in Area II; if ρ > 1, computing the distances from the point (X′, Y′) to the four corners of the plane image B and taking the shortest distance, denoted as ρ′; if ρ′ ≤ √2 − 1, the point is located in Area III; if ρ′ > √2 − 1, the pixel point (X, Y) in the plane image B is an unused pixel, which may be filled with any value, and the subsequent steps are skipped. If the point is located in Area I, II, or III, solving the value Z based on the relationship between ρ (or ρ′) and Z; the computed relations are provided below:

    [00014] In Area I: Z = 2·arcsin(ρ/C₀) (Equation 4)

    In Area II: ρ² = C₁ + C₀·∫sinZ·f₁(Z)dZ (Equation 5)

    In Area III: ρ′² = C₁ + C₀·∫sinZ·f₂(Z)dZ (Equation 6)

    [0069] C₀, C₁ in the above equations are solved from the boundary conditions; the boundary conditions of Areas I, II, and III are provided below:

    [00015] In Area I: Z₁ = 2·arcsin(ρ₀/C₀) (Equation 7)

    In Area II: ρ₀² = (C₁ + C₀·∫sinZ·f₁(Z)dZ)|_(Z=Z₁)

    and

    1² = (C₁ + C₀·∫sinZ·f₁(Z)dZ)|_(Z=Z₂) (Equation 8)

    In Area III: (√2 − 1)² = (C₁ + C₀·∫sinZ·f₂(Z)dZ)|_(Z=Z₂)

    and

    0 = (C₁ + C₀·∫sinZ·f₂(Z)dZ)|_(Z=180°) (Equation 9)
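As a concrete instance of Equation 8, take f₁(Z) = 1 (one admissible choice of the otherwise arbitrary f₁): then ∫sinZ·dZ = −cosZ, so ρ² = C₁ − C₀·cosZ, and the two boundary conditions become a 2×2 linear system in C₀, C₁. A hedged Python sketch:

```python
import math

def area2_constants(rho0, Z1_deg, Z2_deg):
    """Solve C0, C1 for Area II under the illustrative choice f1(Z) = 1,
    i.e. rho**2 = C1 - C0*cos(Z), from the boundary conditions
    rho(Z1) = rho0 and rho(Z2) = 1 (Equation 8 with f1 = 1)."""
    c1z = math.cos(math.radians(Z1_deg))
    c2z = math.cos(math.radians(Z2_deg))
    # rho0**2 = C1 - C0*cos(Z1)  and  1 = C1 - C0*cos(Z2)
    C0 = (1.0 - rho0 ** 2) / (c1z - c2z)
    C1 = 1.0 + C0 * c2z
    return C0, C1
```

Substituting the solved constants back into ρ² = C₁ − C₀·cosZ reproduces ρ = ρ₀ at Z = Z₁ and ρ = 1 at Z = Z₂, confirming the boundary conditions.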

    [0070] Step 4: computing the latitude corresponding to the point based on Z, where a positive latitude indicates North Latitude and a negative latitude indicates South Latitude; the computing equation is provided below:

    latitude = 90° − Z (Equation 10)

    [0071] Step 5: computing the longitude of the current point based on the values of X′, Y′, where a positive longitude indicates East Longitude and a negative longitude indicates West Longitude. The direction of longitude 0° in Areas I and II of the plane image B and the longitudes corresponding to the circles of Area III may be set as desired. In this embodiment, Area I and Area II choose the direction with the longitude being 0° as the forward direction of the X axis; the correspondence relationship with respect to the longitude in Area III is shown in FIG. 5; the specific computing relations are provided below:

    [00016] In Areas I and II: longitude = arctan(Y′/X′) (Equation 11)

    In Area III (Equation 12):

    longitude = −arctan((Y′+1)/(X′+1)) − 90°, for the quarter circle at the upper-left corner;

    longitude = −arctan((Y′+1)/(X′−1)) − 90°, for the quarter circle at the lower-left corner;

    longitude = −arctan((Y′−1)/(X′+1)) + 90°, for the quarter circle at the upper-right corner;

    longitude = −arctan((Y′−1)/(X′−1)) − 90°, for the quarter circle at the lower-right corner.

    [0072] Step 6: obtaining the coordinates of the point on the sphere based on the longitude-latitude; i.e., the coordinates on the sphere corresponding to the pixel point (X, Y) in the plane image B when the main viewpoint center is at North Latitude 90°.

    [0073] Step 7: computing the Cartesian coordinates (X_sphere, Y_sphere, Z_sphere) of the point on a unit sphere based on the longitude-latitude (the X-axis, Y-axis, and Z-axis of the coordinate system are shown in FIG. 1), which are computed as follows:

    X_sphere = sin(longitude)·cos(latitude) (Equation 13)

    Y_sphere = sin(latitude) (Equation 14)

    Z_sphere = cos(longitude)·cos(latitude) (Equation 15)
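Equations 13–15 in code, with a sanity check that the result lies on the unit sphere (a sketch; the function name is illustrative):

```python
import math

def lonlat_to_xyz(lon_deg, lat_deg):
    """Equations 13-15: unit-sphere Cartesian coordinates with the Y axis
    through the poles (latitude 90 degrees maps to (0, 1, 0))."""
    lon = math.radians(lon_deg)
    lat = math.radians(lat_deg)
    return (math.sin(lon) * math.cos(lat),
            math.sin(lat),
            math.cos(lon) * math.cos(lat))
```

Under this convention the north pole maps to (0, 1, 0) and the point with longitude 0°, latitude 0° maps to (0, 0, 1), consistent with the rotation of Step 7 starting from North Latitude 90°.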

    [0074] Then, multiplying the coordinates (X_sphere, Y_sphere, Z_sphere) by the rotation matrix corresponding to rotating from North Latitude 90° to the main viewpoint center to obtain the Cartesian coordinates (X′_sphere, Y′_sphere, Z′_sphere) on the sphere corresponding to the pixel point (X, Y) in the plane image B.

    [0075] Step 8: taking the pixel value of the corresponding position on the sphere (or obtaining the corresponding value by interpolation) based on the coordinates (X.sub.sphere, Y.sub.sphere, Z.sub.sphere) on the sphere as the pixel value of the pixel point (X, Y) in the plane image B.

    [0076] At this point, all steps of the forward mapping process of this embodiment are completed. The illustrative effect of this embodiment is shown in FIG. 6.

    [0077] In another aspect, the panoramic image inverse mapping process maps the plane square image B back to the sphere. In this embodiment, the longitude-latitude of the main viewpoint center of the square image B is (lon, lat), with a resolution of N×N, wherein Area I is a circle on the image B with the image center as the circle center and ρ.sub.0 as the radius, corresponding to an area on the sphere whose included angle relative to the connecting line from the spherical center to the main viewpoint center is 0°–Z.sub.1; Area II is a circular ring on the image B with the image center as the circle center, ρ.sub.0 as the inner radius, and half of the image size as the outer radius, corresponding to an area on the sphere whose included angle relative to the connecting line from the spherical center to the main viewpoint center is Z.sub.1–Z.sub.2; Area III consists of four quarter circles with the four corners of the image B as respective circle centers, each of the four circles being tangent with the outer perimeter of the circle corresponding to Area II, corresponding to the area whose included angle relative to the connecting line from the spherical center to the main viewpoint center is Z.sub.2–180°; the values of the parameters lon, lat, Z.sub.1, Z.sub.2, and ρ.sub.0 may be obtained from the coded bitstream, but are not limited thereto.
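In normalized plane coordinates where half the image size is 1 and the corners lie at (±1, ±1), the tangency between each corner circle and the outer ring of Area II fixes the corner-circle radius at √2−1, consistent with the boundary condition of equation 27. A minimal check:

```python
import math

# Normalized image coordinates: center (0, 0), corners at (+/-1, +/-1).
outer_radius = 1.0                    # outer ring of Area II (half image size)
corner = (1.0, 1.0)
dist = math.hypot(*corner)            # center-to-corner distance = sqrt(2)
corner_radius = dist - outer_radius   # external tangency => sqrt(2) - 1
```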

    [0078] FIG. 4 is a flow diagram of the inverse mapping method according to the present disclosure. The inverse mapping process specifically comprises the steps of:

    [0079] Step 1: converting the coordinates of the point with coordinates (longitude, latitude) or (X.sub.sphere, Y.sub.sphere, Z.sub.sphere) on the sphere to the rectangular coordinates (X.sub.sphere, Y.sub.sphere, Z.sub.sphere); the computing equations are provided below (the X-axis, Y-axis, and Z-axis of the coordinate system are shown in FIG. 1; if the input is (X.sub.sphere, Y.sub.sphere, Z.sub.sphere), conversion is unnecessary):


    X.sub.sphere=sin(longitude)cos(latitude) (equation 16)


    Y.sub.sphere=sin(latitude) (equation 17)


    Z.sub.sphere=cos(longitude)cos(latitude) (equation 18)

    [0080] Step 2: multiplying the coordinates (X.sub.sphere, Y.sub.sphere, Z.sub.sphere) by the rotation matrix corresponding to rotating from the main viewpoint center (lon, lat) to North Latitude 90° (or South Latitude 90°), to obtain the rotated rectangular coordinates (X.sub.sphere, Y.sub.sphere, Z.sub.sphere);

    [0081] Step 3: normalizing the rotated coordinates (X.sub.sphere, Y.sub.sphere, Z.sub.sphere); the normalized rectangular coordinates are denoted (X′.sub.sphere, Y′.sub.sphere, Z′.sub.sphere), the computing relationship thereof being provided below:

    [00017] (X′.sub.sphere, Y′.sub.sphere, Z′.sub.sphere)=(X.sub.sphere, Y.sub.sphere, Z.sub.sphere)/√((X.sub.sphere).sup.2+(Y.sub.sphere).sup.2+(Z.sub.sphere).sup.2) (equation 19)

    [0082] then computing the corresponding longitude-latitude (longitude, latitude) based on the normalized rectangular coordinates (X′.sub.sphere, Y′.sub.sphere, Z′.sub.sphere), the computing relationship thereof being provided below:

    [00018] latitude=arcsin(Y′.sub.sphere) (equation 20)

    longitude=arctan(X′.sub.sphere/Z′.sub.sphere) (equation 21)
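The normalization and longitude-latitude recovery of equations 19-21 can be sketched together; `atan2` stands in for the arctan of equation 21 so that the full (−180°, 180°] longitude range is recovered, which is an implementation assumption:

```python
import math

def lonlat_from_vector(xs, ys, zs):
    """Normalize a rotated Cartesian point (equation 19) and recover
    (longitude, latitude) in degrees (equations 20-21)."""
    n = math.sqrt(xs * xs + ys * ys + zs * zs)
    xs, ys, zs = xs / n, ys / n, zs / n    # equation 19
    lat = math.degrees(math.asin(ys))      # equation 20
    lon = math.degrees(math.atan2(xs, zs)) # equation 21, quadrant-aware
    return lon, lat
```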

    [0083] Step 4: computing the included angle Z between the current point and the main viewpoint center based on the equation Z=90°−latitude.

    [0084] Step 5: determining the area where the current point is located based on the value of Z; in the case of 0°≤Z≤Z.sub.1, it is located in Area I; in the case of Z.sub.1<Z≤Z.sub.2, it is located in Area II; and in the case of Z.sub.2<Z≤180°, it is located in Area III.
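Steps 4 and 5 reduce to a small classifier; the partition angles Z.sub.1 and Z.sub.2 are passed in as parameters:

```python
def area_of_point(lat_deg, z1, z2):
    """Included angle Z = 90 - latitude (step 4) and area selection
    (step 5). z1 and z2 are the partition angles Z1 and Z2 in degrees."""
    z = 90.0 - lat_deg
    if z <= z1:
        return z, "I"
    if z <= z2:
        return z, "II"
    return z, "III"
```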

    [0085] Step 6: if the point is located in Area I or II, computing the distance ρ from the current point, after being mapped to the plane, to the center of the plane image B; if the point is in Area III, computing the shortest distance ρ among the distances from the current point, after being mapped to the plane, to the four corners of the plane image B, wherein ρ is solved by the following equations:

    [00019] in Area I: Z=2 arcsin(ρ/C.sub.0) (equation 22)

    in Area II: ρ.sup.2=C.sub.1+C.sub.0∫sin Z·f.sub.1(Z)dZ (equation 23)

    in Area III: ρ.sup.2=C.sub.1+C.sub.0∫sin Z·f.sub.2(Z)dZ (equation 24)
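For Area I, equation 22 inverts in closed form: Z=2 arcsin(ρ/C.sub.0) gives ρ=C.sub.0 sin(Z/2). A sketch:

```python
import math

def rho_area_1(z_deg, c0):
    """Plane radius for a point in Area I: inverting equation 22,
    Z = 2*arcsin(rho / C0) gives rho = C0 * sin(Z / 2)."""
    return c0 * math.sin(math.radians(z_deg) / 2.0)
```

With C.sub.0 chosen so that ρ(Z.sub.1)=ρ.sub.0 (equation 25), this reproduces the Area I boundary exactly.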

    [0086] C.sub.0 and C.sub.1 in the above equations are solved by the boundary conditions; the boundary conditions are provided below:

    [00020] in Area I: Z.sub.1=2 arcsin(ρ.sub.0/C.sub.0) (equation 25)

    in Area II: ρ.sub.0.sup.2=(C.sub.1+C.sub.0∫sin Z·f.sub.1(Z)dZ)|.sub.Z=Z.sub.1 and 1.sup.2=(C.sub.1+C.sub.0∫sin Z·f.sub.1(Z)dZ)|.sub.Z=Z.sub.2 (equation 26)

    in Area III: (√2−1).sup.2=(C.sub.1+C.sub.0∫sin Z·f.sub.2(Z)dZ)|.sub.Z=Z.sub.2 and 0=(C.sub.1+C.sub.0∫sin Z·f.sub.2(Z)dZ)|.sub.Z=180° (equation 27)
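To illustrate how the two boundary conditions of equation 26 pin down C.sub.0 and C.sub.1 for Area II, assume for illustration f.sub.1(Z)=1 (the weight functions f.sub.1, f.sub.2 are not specified in this excerpt). The antiderivative of C.sub.0 sin Z is −C.sub.0 cos Z, so the conditions become a 2×2 linear system:

```python
import math

def area_2_constants(rho0, z1_deg, z2_deg):
    """Solve C0, C1 for Area II from the boundary conditions of
    equation 26 under the illustrative assumption f1(Z) = 1.
    Then rho**2 = C1 - C0*cos(Z), and the conditions
        rho0**2 = C1 - C0*cos(Z1)   and   1**2 = C1 - C0*cos(Z2)
    form a linear system in (C0, C1)."""
    cz1 = math.cos(math.radians(z1_deg))
    cz2 = math.cos(math.radians(z2_deg))
    c0 = (1.0 - rho0 ** 2) / (cz1 - cz2)
    c1 = rho0 ** 2 + c0 * cz1
    return c0, c1
```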

    [0087] Step 7: computing the coordinates (X, Y) of the current point on the plane based on the values of longitude and ρ; specifically, if the point is in Area I or Area II, obtaining the included angle between the current point and the direction of longitude 0 selected in Areas I and II on the plane based on the longitude, and solving the coordinates (X, Y) of the current point mapped to the plane based on the value of that included angle and the value of ρ; if the point is located in Area III, obtaining the coordinates (X, Y) of the current point mapped to the plane based on the values of longitude and ρ, and the angles corresponding to the four circles. In this embodiment, Areas I and II choose the direction of longitude 0 as the positive direction of the X axis; the correspondence relationship for the longitude in Area III is shown in FIG. 5. The specific computing relationship is provided below:

    [00021] In Areas I and II:

    X=ρ cos(longitude), Y=ρ sin(longitude) (equation 28)

    In Area III:

    X=ρ cos(−longitude−90°)−1, Y=ρ sin(−longitude−90°)−1 for the ¼ circle at the left upper corner;

    X=ρ cos(90°−longitude)+1, Y=ρ sin(90°−longitude)−1 for the ¼ circle at the left lower corner;

    X=ρ cos(90°−longitude)+1, Y=ρ sin(90°−longitude)+1 for the ¼ circle at the right upper corner;

    X=ρ cos(−longitude−90°)+1, Y=ρ sin(−longitude−90°)+1 for the ¼ circle at the right lower corner. (equation 29)

    [0088] Step 8: unnormalizing the coordinates (X, Y), which result from normalizing to the range −1 to 1; the computation of the unnormalizing is provided below:

    [00022] X=N(X+1)/2 (equation 30)

    Y=N(Y+1)/2 (equation 31)
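For Areas I and II, equation 28 followed by the unnormalization of equations 30-31 can be sketched in one step; the function returns pixel coordinates on the N×N image B:

```python
import math

def plane_point_areas_1_2(lon_deg, rho, n):
    """Pixel position on the N x N image B for a point in Area I or II:
    equation 28 gives the normalized (X, Y) from longitude and rho, and
    equations 30-31 unnormalize them from [-1, 1] to [0, N]."""
    x = rho * math.cos(math.radians(lon_deg))  # equation 28
    y = rho * math.sin(math.radians(lon_deg))
    return n * (x + 1.0) / 2.0, n * (y + 1.0) / 2.0  # equations 30-31
```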

    [0089] Step 9: taking the pixel value at the position (X, Y) on the plane image B (or obtaining the value by interpolating from nearby pixels) as the pixel value of the point with coordinates (longitude, latitude) or (X.sub.sphere, Y.sub.sphere, Z.sub.sphere) on the sphere.

    [0090] At this point, all steps of the forward mapping process and the inverse mapping process have been completed. The forward mapping process according to the embodiments of the present disclosure may map the image A (an area on the sphere) to the plane image B (a corresponding area on the plane), while the inverse mapping process according to the embodiments of the present disclosure may map the plane image B back to the sphere for being rendered and viewed.

    [0091] It needs to be noted that the embodiments as disclosed are intended to facilitate further understanding of the present disclosure; however, those skilled in the art may understand that various substitutions and modifications are possible without departing from the spirit and scope of the present disclosure. Therefore, the present disclosure should not be limited to the contents disclosed in the embodiments, but should be governed by the appended claims.