VIRTUAL REALITY IMAGE PLAYING DEVICE AND METHOD FOR PLAYING MULTIPLE VIRTUAL REALITY IMAGES BY USING ONE STREAMING IMAGE
20210394055 · 2021-12-23
Assignee
Inventors
CPC classification
A63F13/212
HUMAN NECESSITIES
H04N21/21805
ELECTRICITY
A63F2300/634
HUMAN NECESSITIES
A63F13/497
HUMAN NECESSITIES
A63F13/5255
HUMAN NECESSITIES
H04N13/383
ELECTRICITY
H04N21/23424
ELECTRICITY
H04N21/8456
ELECTRICITY
A63F13/428
HUMAN NECESSITIES
H04N21/2387
ELECTRICITY
A63F13/211
HUMAN NECESSITIES
H04N21/4728
ELECTRICITY
H04N21/6587
ELECTRICITY
International classification
A63F13/355
HUMAN NECESSITIES
A63F13/497
HUMAN NECESSITIES
H04N21/218
ELECTRICITY
H04N21/234
ELECTRICITY
Abstract
The present invention provides a virtual reality image playing device, the device comprising: a streaming image input unit for receiving, through streaming, one streaming image including multiple virtual reality images captured in different directions at an identical time and at an identical point; a streaming time operation unit for adding a play position of a currently playing virtual reality image to a reference time of a virtual reality image to be newly played; and an image playing unit for playing the streamed image on a display device. When virtual reality images captured in different directions are to be played as the user's gaze changes, the virtual reality image playing device plays the multiple virtual reality images from one streaming image by performing a streaming time jump within the streaming image, using the reference time at which each individual virtual reality image starts.
Claims
1. A virtual reality image playback device for playing back a plurality of virtual reality images from a single streaming image, the device comprising: a streaming image input unit configured to receive a streaming image including a plurality of virtual reality images taken at the same point in different directions at the same time; a streaming time calculation unit that, when a virtual reality image to be played back is changed according to a change in a user's gaze during streaming, synchronizes a virtual reality image to be newly played back by adding a play position of a virtual reality image currently played back to a reference time of the virtual reality image to be newly played back; and an image playback unit configured to play the streamed streaming image on a display device, wherein the plurality of virtual reality images are files stitched from one original image generated to implement virtual reality, the streaming image is a single file in which the plurality of virtual reality images are arranged in different time zones based on a reference time allocated to each of the plurality of virtual reality images, and the virtual reality image playback device plays back the plurality of virtual reality images from the single streaming image by using a method of jumping a streaming time of the streaming image in which a change in space is replaced with a change in time, using a reference time at which an individual virtual reality image starts, when the plurality of virtual reality images taken in different directions have to be played back according to a change in the user's gaze.
2. A virtual reality image playback device for playing back a plurality of virtual reality images from a single streaming image, the device comprising: a streaming image input unit configured to receive a streaming image including a plurality of virtual reality images taken in different directions at the same time; a streaming time calculation unit that, when a virtual reality image to be played back is changed according to a change in a user's gaze during streaming, synchronizes a virtual reality image to be newly played back; and an image playback unit configured to play the streamed streaming image on a display device, wherein the streaming image is a single file in which the plurality of virtual reality images are arranged in different time zones based on a reference time allocated to each of the plurality of virtual reality images, and the streaming time calculation unit synchronizes a first virtual reality image and a second virtual reality image included in a streaming image in which a change in space is replaced with a change in time in the following way when selectively playing back according to the user's gaze: 1) when the first virtual reality image has to be played back, a jump is made to a streaming time obtained by adding a play position of the virtual reality image currently played back to a first reference time at which the first virtual reality image starts, and 2) when the second virtual reality image has to be played back, a jump is made to a streaming time obtained by adding the play position of the virtual reality image currently played back to a second reference time at which the second virtual reality image starts.
3. A virtual reality image playback method of playing back a plurality of virtual reality images from a single streaming image by using a virtual reality image playback device receiving the single streaming image including the plurality of virtual reality images, the method comprising: (a) receiving a single streaming image including the plurality of virtual reality images taken in different directions at the same time and obtained by replacing a change in space with a change in time; (b) playing back a virtual reality image corresponding to the user's current gaze and checking a play position of the virtual reality image currently played back; and (c) synchronizing a virtual reality image to be newly played back by adding the play position of the virtual reality image currently played back to a reference time of the virtual reality image to be newly played back when the virtual reality image to be played back is changed according to a change in the user's gaze during streaming.
4. A virtual reality image providing server for playing back a plurality of virtual reality images from a single streaming image, the server comprising: an image input unit configured to receive a plurality of virtual reality images which are files stitched from one original image generated to implement virtual reality and which are taken in different directions at the same time of the original image; a streaming image generating unit configured to allocate a separate reference time to each individual virtual reality image and to generate the plurality of received virtual reality images as a single streaming image file in which a change in space is replaced with a change in time, without generating separate files, such that the plurality of virtual reality images are arranged in different time zones based on the reference time; and a streaming providing unit configured to stream the generated single streaming image to a virtual reality image playback device, wherein the streaming providing unit provides the streaming image by using a method of jumping a reference time within the single streaming image even when a virtual reality image to be played back is changed due to a change in a user's gaze.
5. The virtual reality image playback device of claim 1, wherein the image playback unit includes a wide area image playback unit configured to play back a wide area image included in the plurality of virtual reality images, and a patch image playback unit configured to play back a patch image included in the plurality of virtual reality images by overlapping the patch image on the wide area image, and the patch image is an image that represents part of the wide area image with different image quality.
6. The virtual reality image playback device of claim 1, wherein the image playback unit includes a plurality of divided image playback units configured to play back divided images included in the plurality of virtual reality images to overlap each other, and the divided images are obtained by dividing one original image into N regions that overlap each other.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0051]
[0052]
[0053]
[0054]
[0055]
[0056]
[0057] It is noted that the accompanying drawings are provided as a reference for understanding the technical idea of the present disclosure, and the scope of the present disclosure is not limited thereby.
DETAILED DESCRIPTIONS
[0058] In describing the present disclosure, detailed descriptions of related known functions are omitted when they are obvious to those skilled in the art and may unnecessarily obscure the gist of the present disclosure.
[0059] In the present disclosure, a plurality of virtual reality images are images stitched from an original image, and include a wide area image, a patch image, and a divided image.
[0060]
[0061] The concepts of the wide area image and the patch image according to the present disclosure will be described with reference to
[0062] The patch images V.sub.1 to V.sub.3 represent parts of the 360-degree virtual reality region 10. The patch images V.sub.1 to V.sub.3 may have different coverage regions, different areas, and different image quality. For example, the first patch image V.sub.1 may be a high-quality image that covers parts of the upper and lower portions of a front region. When the virtual reality content is a musical, the first patch image V.sub.1 may cover the stage of the musical. The second patch image V.sub.2 may cover part of an upper rear region, and the third patch image V.sub.3 may cover part of a lower rear region.
[0063] The patch images V.sub.1 to V.sub.3 are played back by being overlapped, or patched, on the wide area image V.sub.0. Therefore, even when any one of the patch images V.sub.1 to V.sub.3 is turned off as necessary, the wide area image V.sub.0 continues to play behind it, and thus no blank occurs in the image.
[0064] The patch images V.sub.1 to V.sub.3 are played back in synchronization with the wide area image V.sub.0, because patch images that are not synchronized with the wide area image V.sub.0 may cause dizziness to the user.
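The layering described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the names (`PatchImage`, `render_layers`) are hypothetical, and it only shows the key property that every enabled patch is drawn at the same play position as the wide area image beneath it.

```python
# Hypothetical sketch of patch-on-wide-area layering.
# PatchImage and render_layers are illustrative names, not from the patent.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PatchImage:
    name: str
    quality: str          # e.g. "high"; the wide area image may be lower quality
    enabled: bool = True  # a patch may be turned off without leaving a blank

def render_layers(play_position: float, patches: List[PatchImage]) -> List[Tuple[str, float]]:
    """Layers to draw at a play position: the wide area image first,
    then every enabled patch at the *same* play position, so the patches
    stay synchronized with the wide area image underneath them."""
    layers = [("wide_area", play_position)]
    for p in patches:
        if p.enabled:
            layers.append((p.name, play_position))
    return layers
```

Because the wide area image is always the bottom layer, disabling a patch merely reveals the lower-quality image behind it rather than a blank region.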
[0065] Asynchronous content V.sub.4 refers to content randomly inserted by the intention of a creator, regardless of the plurality of virtual reality images. The asynchronous content V.sub.4 may be a video or a specific event action. In terms of content, it may be an advertisement or an event related to a virtual reality image.
[0066] The asynchronous content V.sub.4 is not synchronized with a plurality of virtual reality images. That is, the asynchronous content V.sub.4 is played back or operated by separate trigger information regardless of synchronization between the patch images V.sub.1 to V.sub.3 and the wide area image V.sub.0. In a preferred embodiment, the trigger information includes information on whether or not a user's gaze faces a position of the asynchronous content V.sub.4.
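A minimal sketch of the trigger logic for the asynchronous content, under the preferred embodiment above where the trigger is whether the user's gaze faces the content's position. The function names and the angular-region representation are assumptions for illustration only.

```python
# Hypothetical sketch: asynchronous content V4 is started by gaze-based
# trigger information, independent of the synchronized streaming clock.
def gaze_in_region(gaze_deg: float, region: tuple) -> bool:
    """True if a gaze angle falls inside a (start, end) region in degrees;
    handles regions that wrap past 360 degrees."""
    start, end = region
    gaze = gaze_deg % 360
    if start <= end:
        return start <= gaze <= end
    return gaze >= start or gaze <= end  # wrapping region, e.g. (315, 45)

def should_trigger(gaze_deg: float, content_region: tuple, already_playing: bool) -> bool:
    # The asynchronous content is played by this trigger alone, not by
    # the synchronization shared by the wide area and patch images.
    return gaze_in_region(gaze_deg, content_region) and not already_playing
```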
[0067] The concept of the divided images V.sub.1 to V.sub.N according to the present disclosure will be described with reference to
[0068] In one embodiment, the divided images V.sub.1 to V.sub.N may be obtained by dividing one original image into N regions that are not superimposed on each other. A plurality of divided images V.sub.1 to V.sub.N may have different sizes and may have different image quality.
[0069] The plurality of divided images V.sub.1 to V.sub.N are played back in synchronization with each other. Some of the plurality of divided images V.sub.1 to V.sub.N may be turned off as necessary. Although not illustrated, the asynchronous content V.sub.4 may be displayed in a certain region of the divided images V.sub.1 to V.sub.N.
[0070]
[0071] In another embodiment, the divided images V.sub.1 to V.sub.N may be superimposed on each other by a certain region. The plurality of divided images V.sub.1 to V.sub.N are played back in synchronization with each other. In this case, the superimposed divided images are played back while overlapping each other. Some of the plurality of divided images V.sub.1 to V.sub.N may be turned off as necessary.
[0072] For example, four divided images each covering 180 degrees may be superimposed on each other by 90 degrees (V.sub.1 covers 270 degrees to 90 degrees, V.sub.2 covers 0 degrees to 180 degrees, V.sub.3 covers 90 degrees to 270 degrees, and V.sub.4 covers 180 degrees to 360 degrees). When the gaze coordinate is 45 degrees, either one of V.sub.1 and V.sub.2 may be turned on while V.sub.3 and V.sub.4 are turned off.
[0073] When the divided images are superimposed on each other in this way, the number of divided images to be played back may be reduced. Naturally, the number of divided images to be synchronized is also reduced. Accordingly, burden on the system is reduced.
[0074] In addition, when the divided images are superimposed on each other, there is an advantage in that the on/off operations of the divided images do not need to be tightly controlled when the user's gaze changes rapidly. For example, even when V.sub.1 is turned off, V.sub.2 covers the region between 0 degrees and 90 degrees, which is part of the region covered by V.sub.1; thus, even when the on/off operations of the divided images are delayed, the possibility of a blank (a failure situation where nothing is displayed in the user's gaze) is reduced. Meanwhile, the embodiment of the wide area image and the patch image may be combined with the embodiment of the divided images. For example, the patch image may be played back on part of a plurality of divided images. In this case, a divided image overlapping the patch image is understood as a wide area image.
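The on/off selection from the four-segment example above can be sketched as follows. This is an illustrative sketch under the stated 180-degree / 90-degree-overlap layout; the names (`SEGMENTS`, `select_images`) are not from the patent.

```python
# Hypothetical sketch of gaze-based selection among overlapping divided
# images: four 180-degree segments offset by 90 degrees, as in the example.
SEGMENTS = {            # (start, end) in degrees; (270, 90) wraps past 360
    "V1": (270, 90),
    "V2": (0, 180),
    "V3": (90, 270),
    "V4": (180, 360),
}

def covers(segment: tuple, gaze: float) -> bool:
    start, end = segment
    gaze %= 360
    if start <= end:
        return start <= gaze <= end
    return gaze >= start or gaze <= end  # wrapping segment

def select_images(gaze: float) -> list:
    """All divided images covering the gaze. Turning on any one of them
    suffices, so a delayed on/off switch rarely leaves a blank."""
    return [name for name, seg in SEGMENTS.items() if covers(seg, gaze)]
```

For a gaze of 45 degrees this returns both V1 and V2, matching the observation that either may be kept on while V3 and V4 are turned off.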
[0075]
[0076] As can be seen from
[0077] The virtual reality image playback device 100 according to the present disclosure receives a streaming image from the virtual reality image providing server 200 to play back a virtual reality image for a user. In a preferred embodiment, the virtual reality image playback device 100 may be a smart terminal including a smartphone or may be a wearable device worn on a user's head but is not limited thereto.
[0078] In the present disclosure, a plurality of virtual reality images are files stitched from one original image generated to implement virtual reality. The streaming image is a single file in which a plurality of virtual reality images are arranged in different time zones based on a reference time allocated to each virtual reality image.
[0079] In the present disclosure, the reference time indicates the reference point at which a specific virtual reality image starts within the single streaming image. The play position refers to how far the currently played virtual reality image has progressed. For example, for a 10-minute virtual reality image that has currently been streaming for 3 minutes and 31 seconds, the play position is 3 minutes and 31 seconds.
[0080] In the present disclosure, the reference time and the play position (or play time) are used to describe a specific position in an image file. Therefore, in this concept, the above-described terms may be replaced with terms such as a reference frame and a play frame; that is, time may equivalently be expressed in frames.
[0081] The virtual reality image playback device 100 according to the present disclosure may adopt a method of jumping the streaming time of the streaming image by using the reference time at which an individual virtual reality image starts, when a virtual reality image taken in a different direction has to be played back according to a change in the user's gaze. Accordingly, even when only a single streaming image is streamed, a plurality of virtual reality images may still be played back.
[0082] This will be described in more detail as follows. In a preferred embodiment, the virtual reality image playback device 100 may include a streaming image input unit 110, a streaming time calculation unit 120, an image playback unit 130, and a gaze change detection unit 140.
[0083] The streaming image input unit 110 receives a single streaming image.
[0084] In the related art, when streaming a plurality of virtual reality images obtained by imaging different directions, a server stores the plurality of virtual reality images and streams a new virtual reality image file each time the user's gaze changes, which places a great burden on the server. In addition, a delay occurs in the virtual reality playback device while the file being streamed is changed.
[0085] However, the streaming image input unit 110 according to the present disclosure receives a single streaming image including a plurality of virtual reality images taken in different directions at the same point and at the same time, and thus, the above-described problems are solved. That is, by consecutively attaching a plurality of virtual reality images in different time zones within a single streaming image, even when another virtual reality image has to be played back, it is possible to respond merely by jumping the playback time of the streaming image, without the hassle of changing the file being streamed.
[0086] The streaming time calculation unit 120 synchronizes a virtual reality image to be newly played back when the virtual reality image to be played back is changed according to a change in the user's gaze during streaming. In a preferred embodiment, the streaming time calculation unit 120 synchronizes the virtual reality image to be newly played back by adding the play position of the virtual reality image currently played back to the reference time of the virtual reality image to be newly played back.
[0087] This will be described in more detail as follows. The streaming time calculation unit 120 may synchronize a first virtual reality image and a second virtual reality image included in the streaming image in the following way when selectively playing back according to the user's gaze.
[0088] 1) When the first virtual reality image has to be played back, a jump is made to the streaming time obtained by adding the play position of the virtual reality image currently played back to the first reference time at which the first virtual reality image starts.
[0089] 2) When the second virtual reality image has to be played back, a jump is made to the streaming time obtained by adding the play position of the virtual reality image currently played back to the second reference time at which the second virtual reality image starts.
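The two cases above reduce to one calculation: the seek target within the single stream is the new image's reference time plus the current play position. A minimal sketch, with illustrative names (`jump_target`, `refs`) not taken from the patent:

```python
# Minimal sketch of the streaming time jump used by the streaming
# time calculation unit. Names are illustrative assumptions.
def jump_target(reference_times: dict, new_image: str, play_position: float) -> float:
    """Streaming time to seek to when the gaze switches to new_image:
    the new image's reference time plus the current play position."""
    return reference_times[new_image] + play_position

# e.g. reference times in seconds for three 10-minute images
refs = {"V1": 0.0, "V2": 600.0, "V3": 1200.0}
```

Because the images share one timeline offset only by their reference times, adding the same play position lands on the synchronized frame of the new image.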
[0090] However, in another embodiment of the present disclosure, the streaming time calculation unit may be omitted from the virtual reality image playback device. That is, the virtual reality image playback device does not calculate the streaming time to jump to; instead, the virtual reality image providing server may calculate it.
[0091] The image playback unit 130 plays back the streamed streaming image on a display device.
[0092] The gaze change detection unit 140 tracks a gaze direction of a user. To this end, the gaze change detection unit 140 may include an acceleration sensor, a tilt sensor, and so on.
[0093] The gaze change detection unit 140 may detect the gaze direction of the user to determine whether or not to change the virtual reality image being played back. When the user's gaze moves outside the space (or angle of view) of the virtual reality image currently played back, the gaze change detection unit 140 notifies the streaming time calculation unit so that the streaming time calculation unit jumps the streaming time.
[0094] Meanwhile, in another embodiment, the gaze change detection unit 140 may track the gaze direction of the user and transmit a tracking result to the virtual reality image providing server. In this embodiment, the streaming time to be jumped is calculated by the virtual reality image providing server.
[0095] In addition, the virtual reality image playback device 100 according to the present disclosure may further include a configuration such as a communication unit for receiving a streaming image.
[0096] The virtual reality image providing server 200 generates a single streaming image by combining a plurality of virtual reality images and streams the streaming image to a virtual reality image playback device. In another embodiment, a server for generating a streaming image may be physically different from a server for streaming an actual streaming image.
[0097] The virtual reality image providing server 200 may include an image input unit 210, a streaming image generating unit 220, a streaming providing unit 230, and a switching time point detection unit 240.
[0098] The image input unit 210 receives a plurality of virtual reality images which are stitched files from one original image generated to implement virtual reality and are obtained by imaging different directions at the same time of the original image.
[0099] The streaming image generating unit 220 allocates a separate reference time to each individual virtual reality image and generates the plurality of input virtual reality images as a single streaming image file, without generating separate files, such that the plurality of virtual reality images are arranged in different time zones based on the reference times. The generated streaming image may include information on the reference times.
[0100] Although not illustrated, the streaming image generating unit 220 may include a running time analysis unit that analyzes a running time when playing back a plurality of virtual reality images, a reference time allocation unit that generates a reference time not less than the running time and allocates the reference time to each of the plurality of virtual reality images, and a reference time storage unit that stores information about a reference time matched to the plurality of virtual reality images.
[0101] The streaming providing unit 230 streams the generated single streaming image to the virtual reality image playback device. When generating the streaming image, the reference times allocated to the plurality of virtual reality images may be provided to the virtual reality image playback device together with the streaming image. The streaming providing unit 230 provides the streaming image by jumping the reference time within the single streaming image even when the user's gaze changes and the virtual reality image to be played back is changed.
[0102] The switching time point detection unit 240 calculates the streaming time to jump to when the virtual reality image playback device does not calculate it directly. When the virtual reality image to be played back is changed as the user's gaze changes during streaming, the switching time point detection unit 240 synchronizes the virtual reality image to be newly played back. In a preferred embodiment, it does so by adding the play position of the virtual reality image currently played back to the reference time of the virtual reality image to be newly played back.
[0103]
[0104] As can be seen from
[0105] To this end, the running time analysis unit analyzes a running time when individual virtual reality images are played back. Because the plurality of virtual reality images are generated from a single original image, the running time of individual virtual reality images may be the same in most cases. In the embodiment of
[0106] The reference time allocation unit generates a reference time not less than the running time and allocates the reference time to each of the plurality of virtual reality images.
[0107] In a preferred embodiment, a reference time of an m.sup.th virtual reality image of N virtual reality images may be calculated by the following equation.
REFERENCE TIME.sub.m=RUNNING TIME×(m−1)
[0108] For example, the reference time S.sub.1 of V.sub.1 may be 0, the reference time of V.sub.2 may be 10 minutes, and the reference time of V.sub.3 may be 20 minutes.
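The allocation rule above can be sketched in a few lines. This is an illustrative sketch only; the function name is an assumption, and running time is expressed in seconds.

```python
# Sketch of the reference time allocation rule: the m-th of N images
# (1-indexed) starts at RUNNING TIME x (m - 1) within the single stream.
def allocate_reference_times(n_images: int, running_time: float) -> list:
    """Reference times for n_images back-to-back images of equal running time."""
    return [running_time * (m - 1) for m in range(1, n_images + 1)]
```

For three 10-minute (600-second) images this yields 0, 600, and 1200 seconds, matching the example of 0, 10 minutes, and 20 minutes.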
[0109] The reference time storage unit stores information on the reference times matched to the plurality of virtual reality images.
[0110]
[0111] As can be seen from
[0112] When it is determined that the virtual reality image V2 has to be played back because the user's gaze has changed, a jump is made to the streaming time obtained by adding the play position t.sub.play to the reference time of the virtual reality image V2. That is, adding the play position t.sub.play of 3 minutes and 30 seconds to the reference time of 10 minutes of the virtual reality image V2 yields 13 minutes and 30 seconds, which is the streaming time to jump to.
[0113] The present disclosure solves the problems of the related art by changing the time played back within one streaming file, rather than changing the file to be streamed, as described above. In brief, the present disclosure solves the problems of the related art by replacing movement in space with movement in time.
[0114]
[0115] In the virtual reality image playback method according to the present disclosure, a plurality of virtual reality images are files stitched from one original image generated to implement virtual reality. The streaming image is a single file in which the plurality of virtual reality images are arranged in different time zones based on a reference time allocated to each virtual reality image. The reference time refers to the reference point at which a specific virtual reality image starts within the single streaming image. The play position refers to how far the currently played virtual reality image has progressed. For example, for a 10-minute virtual reality image that has been streaming for 3 minutes and 31 seconds, the play position is 3 minutes and 31 seconds. In the present disclosure, the reference time and the play position (or play time) are used to describe a specific position in an image file. Therefore, in this concept, the reference time and the play position may be replaced respectively with a reference frame and a play frame; that is, time may equivalently be expressed in frames.
[0116] In the virtual reality image playback method according to the present disclosure, when a virtual reality image taken in a different direction has to be played back according to a change in the user's gaze, a method of jumping the streaming time of the streaming image is adopted, using the reference time at which the individual virtual reality image starts together with the play position. Thus, even when only a single streaming image is received, a plurality of virtual reality images may be played back.
[0117] As can be seen from
[0118] A single streaming image including a plurality of virtual reality images taken in different directions at the same point and at the same time is received (S1100).
[0119] A virtual reality image corresponding to the user's current gaze is played back, and the play position of the virtual reality image currently played back is checked (S1200).
[0120] When the virtual reality image to be played back is changed according to a change in the user's gaze during streaming, the virtual reality image to be newly played back is synchronized by adding the play position of the virtual reality image currently played back to the reference time of the virtual reality image to be newly played back (S1300).
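Steps S1200 and S1300 can be sketched as one function on the stream clock. This is an illustrative sketch, not the patent's implementation; the function and parameter names are assumptions.

```python
# Hypothetical sketch of steps S1200-S1300: derive the play position
# from the current streaming time, and on a gaze change re-anchor it
# to the new image's reference time within the same single stream.
def next_stream_time(stream_time: float, current_ref: float, new_ref: float = None) -> float:
    """Next streaming time. stream_time and the reference times are in
    seconds; new_ref is None while the gaze stays on the current image."""
    play_position = stream_time - current_ref   # S1200: check play position
    if new_ref is None:
        return stream_time                      # keep playing the same image
    return new_ref + play_position              # S1300: jump within the stream
```

For example, switching from an image with reference time 0 to one with reference time 600 s at stream time 210 s seeks to 810 s, preserving the 3-minute-30-second play position.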