SYSTEM AND METHOD FOR GENERATING A LIMITLESS PATH IN VIRTUAL REALITY ENVIRONMENT FOR CONTINUOUS LOCOMOTION
20220382367 · 2022-12-01
Inventors
CPC classification
G06F3/011
PHYSICS
G06F2203/012
PHYSICS
International classification
Abstract
A method for generating a limitless path in a virtual reality (VR) environment for continuous locomotion within a real physical space using a Head-Mounted-Display (HMD) device associated with a user is provided. The method includes determining a line segment between two points that corresponds to an initial path travelled by the user. The method includes detecting a boundary of the VR environment to generate a next line segment. The method includes generating a new line segment and adding it to the end of the initial path. The method includes generating and adding the new line segment to the end of the next line segment. The method includes generating an updated path by adding the new line segment in a direction at the shift angle to the direction of the next line segment. The method includes outputting the updated path as a list of two-dimensional points to render the updated path into the VR environment.
Claims
1. A processor-implemented method for generating a continuous path in a virtual reality (VR) environment for a limitless locomotion within a real physical space using a Head-Mounted-Display (HMD) device associated with a user, the method comprising: determining a line segment (L.sub.i) between an initial point (P.sub.0) and a terminal point (P.sub.1) by analyzing input data that corresponds to an initial path travelled by the user in the real physical space from the initial point (P.sub.0), wherein the input data is obtained from the HMD associated with the user, wherein the line segment is determined by L.sub.i=⟨P.sub.i, P.sub.i+1⟩
2. The processor-implemented method of claim 1, wherein the input data comprises (i) the initial point (P.sub.0) and the head-yaw (β.sub.0) of the HMD at the initial point, (ii) dimensions of the boundary of the real physical space D(x, z), and (iii) path properties that include segment length (l) and path width (w).
3. The processor-implemented method of claim 1, wherein the shift angle is determined by γ.sub.j=β.sub.i−1−π/2+((π/j)*k), where k is in range {0, j}; j>0, and the shift angle ranges between β.sub.i−1−π/2 and β.sub.i−1+π/2.
4. The processor-implemented method of claim 1, wherein the environmental properties comprise at least one of information regarding assets placed in the virtual reality environment, information about textures, or placement of assets in the virtual reality environment.
5. The processor-implemented method of claim 4, wherein points in the VR environment are determined by P.sub.i+1(x)=l*sin β.sub.i+P.sub.i(x), and P.sub.i+1(z)=l*cos β.sub.i+P.sub.i(z).
6. The processor-implemented method of claim 1, wherein the head-yaw (β.sub.i) at the i.sup.th point ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
7. The processor-implemented method of claim 1, wherein the method comprises generating the path using the line and path walls without an intersection of the path walls by maintaining the ratio of the width of the path to the length of the path at a constant, thereby avoiding the intersection of the walls in the virtual reality environment.
8. The processor-implemented method of claim 1, wherein the method comprises generating the limitless path in the virtual environment with parallel walls of the path by correlating the width of the path at the turns with the angle of the turn.
9. The processor-implemented method of claim 8, wherein the method comprises generating dynamic path segments based on a random length unit value between the length of the fixed path segment and the proximity distance, thereby utilizing the real physical space efficiently.
10. One or more non-transitory computer-readable storage media storing one or more sequences of instructions, which when executed by one or more processors, cause the one or more processors to perform a method for generating a continuous path in a virtual reality (VR) environment for a limitless locomotion within a real physical space using a Head-Mounted-Display (HMD) device associated with a user, said method comprising: determining a line segment (L.sub.i) between an initial point (P.sub.0) and a terminal point (P.sub.1) by analyzing input data that corresponds to an initial path travelled by the user in the real physical space from the initial point (P.sub.0), wherein the input data is obtained from the HMD associated with the user, wherein the line segment is determined by L.sub.i=⟨P.sub.i, P.sub.i+1⟩
11. A system for generating a continuous path in a virtual reality (VR) environment for a limitless locomotion within a real physical space using a Head-Mounted-Display (HMD) device associated with a user, the system comprising: a device processor; and a non-transitory computer-readable storage medium storing one or more sequences of instructions, which when executed by the device processor, cause the device processor to: determine a line segment (L.sub.i) between an initial point (P.sub.0) and a terminal point (P.sub.1) by analyzing input data that corresponds to an initial path travelled by the user in the real physical space from the initial point (P.sub.0), wherein the input data is obtained from the HMD associated with the user, wherein the line segment is determined by L.sub.i=⟨P.sub.i, P.sub.i+1⟩
12. The system of claim 11, wherein the input data comprises (i) the initial point (P.sub.0) and the head-yaw (β.sub.0) of the HMD at the initial point, (ii) dimensions of the boundary of the real physical space D(x, z), and (iii) path properties that include segment length (l) and path width (w).
13. The system of claim 11, wherein the shift angle is determined by γ.sub.j=β.sub.i−1−π/2+((π/j)*k), where k is in range {0, j}; j>0, and the shift angle ranges between β.sub.i−1−π/2 and β.sub.i−1+π/2.
14. The system of claim 11, wherein the environmental properties comprise at least one of information regarding assets placed in the virtual reality environment, information about textures, or placement of assets in the virtual reality environment.
15. The system of claim 11, wherein points in the VR environment are determined by P.sub.i+1(x)=l*sin β.sub.i+P.sub.i(x), and P.sub.i+1(z)=l*cos β.sub.i+P.sub.i(z).
16. The system of claim 11, wherein the head-yaw (β.sub.i) at the i.sup.th point ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
17. The system of claim 11, wherein the processor is configured to generate the path using the line and path walls without an intersection of the path walls by maintaining the ratio of the width of the path to the length of the path at a constant, thereby avoiding the intersection of the walls in the virtual reality environment.
18. The system of claim 11, wherein the processor is configured to generate the limitless path in the virtual environment with parallel walls of the path by correlating the width of the path at the turns with the angle of the turn.
19. The system of claim 11, wherein the processor is configured to generate dynamic path segments based on a random length unit value between the length of the fixed path segment and the proximity distance, thereby utilizing the real physical space efficiently.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0031]
[0032]
[0033]
[0034]
[0035]
[0036]
[0037]
[0038]
[0039]
[0040]
[0041]
DETAILED DESCRIPTION OF THE DRAWINGS
[0042] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0043] As mentioned, there remains a need for a system for creating continuous locomotion along a limitless path with an unbounded experience in a virtual reality environment without the need for additional hardware support. Various embodiments disclosed herein provide a system and method for generating a continuous path in the virtual reality environment using only a Head-Mounted-Display (HMD) with 6-Degrees-Of-Freedom for continuous locomotion with the unbounded experience. Referring now to the drawings, and more particularly to
[0044]
[0045] The user 102 may employ the HMD 104 to experience the virtual reality environment that is rendered visually by the HMD 104. The user 102 can explore, navigate, and move within the virtual reality environment rendered by the HMD 104 by moving within a real physical space.
[0046] The HMD 104 is associated with the user 102 and is communicatively connected with the virtual environment continuous locomotion generating server 108 through a network 106. In some embodiments, the network 106 includes, but is not limited to, a wireless network, a wired network, a combination of the wired network and the wireless network, the Internet, and the like.
[0047] The HMD 104 associated with the user 102 is configured to obtain input data from the real physical space, when the user 102 is present in the real physical space. In some embodiments, the input data includes HMD position information, head position information, and dimensions data of the real physical space. The HMD 104 may include at least one sensor to obtain the input data. The HMD position information may include coordinates of the user's position and orientation in the real physical space. The head position information may include HMD head-yaw in the virtual reality environment. The dimensions data of the real physical space may include boundary information of the real physical space. In some embodiments, the HMD 104 has 6-Degrees-Of-Freedom (6-DoF) tracking capability to track the orientation and position of the user 102 in the real physical space. In some embodiments, the input data includes (i) the initial point (P.sub.0) and head-yaw (β.sub.0) of the HMD 104 at the initial point (ii) dimensions of boundary of the real physical space D (x, z), and (iii) path properties that include segment length (l) and path width (w).
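For illustration only, the input tuple described above can be sketched as a small data structure. This is a hypothetical sketch; the field names (`p0`, `beta0`, `bounds`, and so on) are illustrative and are not taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class PathInput:
    """Input data obtained from the HMD (illustrative field names)."""
    p0: tuple              # initial point P0 = (x, z) in the real physical space
    beta0: float           # head-yaw of the HMD at the initial point, in radians
    bounds: tuple          # dimensions D(x, z) of the real physical space boundary
    segment_length: float  # path property l: length of one path segment
    path_width: float      # path property w: width of the rendered path

# Example values for a 5 m x 5 m room with 1 m segments
hmd_input = PathInput(p0=(0.0, 0.0), beta0=0.0, bounds=(5.0, 5.0),
                      segment_length=1.0, path_width=0.8)
```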
[0048] The virtual environment continuous locomotion generating server 108 receives the input data from the HMD 104 and outputs a path based on the input data and path properties. In some embodiments, the path is a list of 2D points or line segments representing a line. The virtual environment continuous locomotion generating server 108 generates the path and renders it into the virtual reality environment using environmental properties. The environmental properties may include information regarding assets placed in the virtual reality environment, information about textures, placement of assets in the VR environment, etc. The path properties and the environmental properties are metadata that may be provided by a designer to generate the VR environment with the desired configuration for a given use case and may be stored in the database.
[0049] The virtual environment continuous locomotion generating server 108 determines a line segment (L.sub.i) between an initial point (P.sub.0) and a terminal point (P.sub.1) by analyzing the input data that corresponds to an initial path travelled by the user in the real physical space from the initial point (P.sub.0). The initial path may include a set number of 2D points (or line segments). The initial path may be rendered into the virtual reality environment by the virtual environment continuous locomotion generating server 108. For example, the first line segment may be represented as L.sub.0=⟨P.sub.0, P.sub.1⟩.
[0050] The virtual environment continuous locomotion generating server 108 detects a boundary of the VR environment using a new point to generate a next line segment from an end of the line segment by projecting one or more rays in different directions using a shift angle. The virtual environment continuous locomotion generating server 108 generates a new line segment and adds the new line segment to the end of the initial path when the user 102 moves forward on the initial path at a certain distance using the shift angle. To detect the boundary for generating a path, the virtual environment continuous locomotion generating server 108 uses ‘j’ value that equally divides a 180° range into multiple possible rays. If a new point P.sub.i+1 is generated for generating the new line segment by the virtual environment continuous locomotion generating server 108, then the virtual environment continuous locomotion generating server 108 projects j+1 number of rays in multiple directions with certain angle γ as follows:
γ.sub.j=β.sub.i−1−π/2+((π/j)*k),
[0051] where k is in range {0, j}; j>0, and the angle ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
[0052] For example, if the value of j=4, the virtual environment continuous locomotion generating server 108 projects j+1 rays, i.e., 5 rays at equal angles between β.sub.i−1−π/2 and β.sub.i−1+π/2, namely γ.sub.0, γ.sub.1, γ.sub.2, γ.sub.3, γ.sub.4. The source of the rays is P.sub.i, and the length of each ray is equal to the path segment length plus half of the path width.
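A minimal sketch of the ray-direction computation described above, using the γ.sub.j formula given earlier (the function name `ray_angles` is illustrative, not from the specification):

```python
import math

def ray_angles(beta_prev: float, j: int) -> list:
    """Directions gamma_k = beta_prev - pi/2 + (pi/j)*k for k in {0, ..., j},
    i.e. j+1 rays spanning the half-circle in front of the previous heading."""
    return [beta_prev - math.pi / 2 + (math.pi / j) * k for k in range(j + 1)]

# With j=4 and a previous head-yaw of 0, five rays are spaced
# evenly from -pi/2 to +pi/2 (the example given in the text).
angles = ray_angles(beta_prev=0.0, j=4)
```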
[0053] The virtual environment continuous locomotion generating server 108 generates a new line segment using L.sub.i=⟨P.sub.i, P.sub.i+1⟩ and adds the new line segment to the end of the next line segment. The virtual environment continuous locomotion generating server 108 generates an updated path by adding the new line segment in a direction at the shift angle to the direction of the next line segment.
[0054] The virtual environment continuous locomotion generating server 108 is configured to output the updated path as a list of two-dimensional points to render the updated path into the virtual reality environment using environmental properties. The virtual environment continuous locomotion generating server 108 removes a line segment from the updated path when the user moves forward to cover half of the path in the virtual reality environment, thereby continuously generating new line segments, updating the initial path by adding each new line segment at the end of the initial path, and removing a line segment from the beginning of the initial path to enable the continuous locomotion in the virtual reality environment through the HMD 104 associated with the user 102.
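The add-at-end / remove-from-beginning behaviour described above resembles a sliding window over the path points. A minimal sketch follows; the removal trigger (the user covering half the path) is simplified here to a fixed window size, which is an assumption for illustration:

```python
from collections import deque

def update_path(path: deque, new_point: tuple, max_points: int) -> deque:
    """Append the newest path point and drop the oldest one once the
    window is full, mimicking the add-at-end / remove-from-start scheme."""
    path.append(new_point)
    if len(path) > max_points:
        path.popleft()  # oldest segment endpoint falls off the front
    return path

path = deque([(0.0, 0.0), (0.0, 1.0), (0.0, 2.0)])
update_path(path, (1.0, 2.5), max_points=3)
# the oldest point (0.0, 0.0) is removed; the 3 newest points remain
```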
[0055] The virtual environment continuous locomotion generating server 108 may be configured to generate (i) a path with multiple path options and (ii) a path in a reverse direction. The user 102 may choose at least one path that goes in different directions each turn. The user 102 may choose the path in the reverse direction, if the user 102 moves in a backward direction. The system 100 may be implemented with at least one bicycle, treadmill, controller, or any other external hardware.
[0056] The system 100 may be integrated with at least one of (i) an obstacle avoidance system, (ii) redirection techniques, and (iii) translation gains. The obstacle avoidance system may allow the virtual environment continuous locomotion generating server 108 to generate the path in the real physical space with obstacles. The redirection techniques and translation gains may allow the virtual environment continuous locomotion generating server 108 to generate the path larger than the real physical space.
[0057]
[0058] The input data receiving module 204 is communicatively connected with the HMD 104 associated with the user 102 and is configured to receive input data. The input data may include HMD position information, head position information, and dimensions data of the real physical space from the HMD 104. The HMD position information may include coordinates of the user's position and orientation in a real physical space. The head position information may include HMD head-yaw in the virtual reality environment. The dimensions data of the real physical space may include boundary information of the real physical space.
[0059] The line segment determining module 206 determines a line segment (L.sub.i) between an initial point (P.sub.0) and a terminal point (P.sub.1) by analyzing the input data that corresponds to an initial path travelled by the user in the real physical space from the initial point (P.sub.0). In some embodiments, the initial path is a line that consists of successive line segments connected at different angles (i.e., a set number of 2D points). In some embodiments, the input data includes (i) the initial point (P.sub.0) and the head-yaw (β.sub.0) of the HMD at the initial point, (ii) dimensions of the boundary of the real physical space D(x, z), and (iii) path properties that include segment length (l) and path width (w). In some embodiments, points in the VR environment are determined by P.sub.i+1(x)=l*sin β.sub.i+P.sub.i(x), and P.sub.i+1(z)=l*cos β.sub.i+P.sub.i(z). In some embodiments, the head-yaw (β.sub.i) at the i.sup.th point ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
[0060] The boundary detecting module 208 detects a boundary of the VR environment using a new point to generate a next line segment from an end of the line segment by projecting a plurality of rays in different directions using a shift angle. In some embodiments, the shift angle is an angle between the line segment of the initial path and the next line segment of an upcoming path. In some embodiments, the shift angle is determined by γ.sub.j=β.sub.i−1−π/2+((π/j)*k), where k is in range {0, j}; j>0, and the shift angle ranges between β.sub.i−1−π/2 and β.sub.i−1+π/2.
[0061] The new line segment adding module 210 generates a new line segment using L.sub.i=⟨P.sub.i, P.sub.i+1⟩ and adds the new line segment to the end of the initial path when the user 102 moves forward on the initial path at a certain distance using the shift angle.
[0062] The updated path generating module 212 generates an updated path by adding the new line segment in a direction at the shift angle to the direction of the next line segment. The updated path outputting module 212 is configured to output the updated path as a list of two-dimensional points to render the updated path into the virtual reality environment using environmental properties. In some embodiments, the environmental properties comprise at least one of information regarding assets placed in the virtual reality environment, information about textures, or placement of assets in the virtual reality environment. The environmental properties may be stored in the database 202.
[0063] The line segment generating module 206 removes the line segment from the updated path when the user moves forward to cover half of the path in the virtual reality environment, thereby continuously generating new line segments, updating the initial path by adding each new line segment at the end of the initial path, and removing a line segment from the beginning of the initial path to enable the continuous locomotion in the virtual reality environment through the HMD 104 associated with the user 102 until the system 100 is terminated externally by the user 102.
[0064]
[0065] As shown in
[0066] L.sub.0=⟨P.sub.0, P.sub.1⟩
[0067] L.sub.i=⟨P.sub.i, P.sub.i+1⟩
[0068] In the real physical space, coordinates of the user's position (P.sub.i+1) are calculated using the following equations:
P.sub.i+1(x)=l*sin β.sub.i+P.sub.i(x)
P.sub.i+1(z)=l*cos β.sub.i+P.sub.i(z),
[0069] where 0<=i<total number of segments generated at the start; x and z are the limits of the boundary in a plane along x-axis and z-axis and β.sub.i is head-yaw of the user (i.e. HMD's orientation along the y-axis in virtual reality environment). Using the coordinates of the user's position (P.sub.i+1), the line segment generating module 206 generates the initial path in a defined boundary of D (x, z).
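The position update above can be sketched directly, with the segment length l and head-yaw β.sub.i passed as arguments (the helper name `next_point` is illustrative):

```python
import math

def next_point(p: tuple, beta: float, l: float) -> tuple:
    """P_{i+1}(x) = l*sin(beta_i) + P_i(x); P_{i+1}(z) = l*cos(beta_i) + P_i(z)."""
    x, z = p
    return (l * math.sin(beta) + x, l * math.cos(beta) + z)

# With zero head-yaw the user advances one segment length straight along +z.
p1 = next_point((0.0, 0.0), beta=0.0, l=1.0)
```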
[0070] The virtual environment continuous locomotion generating server 108 generates an upcoming P.sub.i to generate a new line segment (L.sub.i) by detecting the proximity of P.sub.i−1 to the boundary. When P.sub.i−1 is not close to the boundary, the line segment generating module 206 uses a β.sub.i value that is set to a random value in range {β.sub.i−1−π/2, β.sub.i−1+π/2} for generating the new line segment L.sub.i for a given position P.sub.i at the boundary.
[0071]
[0072]
[0073] To detect the boundary 408 for generating a path, the boundary detecting module 208 uses a ‘j’ value that equally divides a 180° range into multiple possible rays 402A-E as shown in
γ.sub.j=β.sub.i−1−π/2+((π/j)*k),
[0074] where k is in range {0, j}; j>0, and the angle ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
[0075] For example, if the value of j=4, the boundary detecting module 208 projects j+1 rays i.e. 5 rays 402A-E at equal angles between β.sub.i−1−π/2, β.sub.i−1+π/2 such as γ.sub.0, γ.sub.1, γ.sub.2, γ.sub.3, γ.sub.4 as shown in
[0076] The boundary detecting module 208 uses one of the γ.sub.j directions out of the 5 rays to generate the path with the new line segment L.sub.i, if none of the rays 402A-E hits the boundary 408 as shown in
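A rough sketch of this boundary test, assuming a rectangular boundary D(x, z) anchored at the origin; the in-bounds check and the function name `safe_directions` are illustrative assumptions, not the patented detection itself:

```python
import math

def safe_directions(p: tuple, beta_prev: float, j: int,
                    ray_len: float, bounds: tuple) -> list:
    """Project j+1 rays from p and return the directions whose endpoints
    stay inside the rectangle [0, bx] x [0, bz]; one of the surviving
    directions would then be used for the new line segment L_i."""
    bx, bz = bounds
    survivors = []
    for k in range(j + 1):
        gamma = beta_prev - math.pi / 2 + (math.pi / j) * k
        ex = p[0] + ray_len * math.sin(gamma)   # ray endpoint, x
        ez = p[1] + ray_len * math.cos(gamma)   # ray endpoint, z
        if 0.0 <= ex <= bx and 0.0 <= ez <= bz:
            survivors.append(gamma)
    return survivors

# Near the far wall of a 5 m x 5 m room, the forward-pointing rays are
# rejected and only the sideways rays survive.
safe = safe_directions((2.5, 4.5), beta_prev=0.0, j=4,
                       ray_len=1.4, bounds=(5.0, 5.0))
```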
[0077]
[0078] As shown in
[0079] The updating of the current path continues as shown in
[0080]
[0081]
[0082]
[0083]
P.sub.tw=P.sub.w/sin(θ/2), where θ=180°−angle of turn.
[0084] Hence, narrow-walled paths may be avoided while generating limitless navigation in the virtual environment.
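The turn-width relation above can be sketched as a small helper, assuming θ is taken in degrees as in the expression (the name `turn_width` is illustrative):

```python
import math

def turn_width(path_width: float, turn_angle_deg: float) -> float:
    """Widened path width at a turn so the walls stay parallel:
    P_tw = P_w / sin(theta/2), with theta = 180 - angle of turn (degrees)."""
    theta = math.radians(180.0 - turn_angle_deg)
    return path_width / math.sin(theta / 2.0)

# A 90-degree turn gives theta = 90 degrees, so the turn width is
# P_w / sin(45 degrees), i.e. about 1.414 times the straight path width.
w = turn_width(path_width=1.0, turn_angle_deg=90.0)
```

A straight continuation (turn angle 0) gives θ = 180°, sin(θ/2) = 1, and the turn width reduces to the ordinary path width, consistent with the formula.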
[0085]
[0086]
dP.sub.l=RAND(P.sub.l,b.sub.d)
[0087] The user 102 navigates in the virtual environment by advancing through path segments of unequal length with less frequent turns. The path segments are dynamic in length: a random length unit value between the fixed path segment length and the boundary distance. Using the above equation, underutilized physical room space can be avoided to a greater extent while developing virtual environment scenes.
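A sketch of the dynamic segment length dP.sub.l=RAND(P.sub.l, b.sub.d), assuming RAND draws uniformly between its two bounds (an assumption; the specification does not fix the distribution):

```python
import random

def dynamic_segment_length(fixed_len: float, boundary_dist: float,
                           rng=random) -> float:
    """dP_l = RAND(P_l, b_d): draw a segment length between the fixed
    path segment length and the current distance to the boundary."""
    lo, hi = sorted((fixed_len, boundary_dist))
    return rng.uniform(lo, hi)

# With a 1 m fixed segment and 3.2 m of room left before the boundary,
# the drawn length lies between those two values.
length = dynamic_segment_length(fixed_len=1.0, boundary_dist=3.2)
```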
[0088]
[0089] In some embodiments, the input data includes (i) the initial point (P.sub.0) and head-yaw (β.sub.0) of the HMD at the initial point (ii) dimensions of boundary of the real physical space D(x, z), and (iii) path properties that include segment length (l) and path width (w).
[0090] In some embodiments, the shift angle is determined by γ.sub.j=β.sub.i−1−π/2+((π/j)*k), where k is in range {0, j}; j>0 and the shift angle ranges between β.sub.i−1−π/2, β.sub.i−1+π/2.
[0091] In some embodiments, the environmental properties comprise at least one of information regarding assets placed in the virtual reality environment, information about textures, or placement of assets in the virtual reality environment.
[0092] In some embodiments, points in the VR environment are determined by P.sub.i+1(x)=l*sin β.sub.i+P.sub.i(x), and P.sub.i+1(z)=l*cos β.sub.i+P.sub.i(z).
[0093] In some embodiments, the head-yaw (β.sub.i) at the i.sup.th point ranges from β.sub.i−1−π/2 to β.sub.i−1+π/2.
[0094] In some embodiments, the method includes generating the path using the line and path walls without an intersection of the path walls by maintaining the ratio of the width of the path to the length of the path at a constant to avoid the intersection of the walls in the virtual reality environment.
[0095] In some embodiments, the method includes generating the limitless path in the virtual environment with parallel walls of the path by correlating the width of the path at the turns with the angle of the turn.
[0096] In some embodiments, the method includes generating dynamic path segments based on a random length unit value between the length of the fixed path segment and the proximity distance, thereby utilizing the real physical space efficiently.
[0097] A representative hardware environment for practicing the embodiments herein is depicted in
[0098] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.