SYSTEM AND METHOD FOR PROVIDING TACTILE FEEDBACK FOR USERS OF VIRTUAL REALITY CONTENT VIEWERS
20170300116 · 2017-10-19
Inventors
- Martin Lyons (Henderson, NV)
- Rolland Steil (Las Vegas, NV, US)
- Marvin A. HEIN, JR. (Las Vegas, NV, US)
- Jeremy Michael HORNIK (Chicago, IL, US)
- Bryan M. Kelly (Alamo, CA)
- Gabriel BARON (Henderson, NV, US)
CPC classification
G06F3/017
PHYSICS
G07F17/3206
PHYSICS
H04N21/6587
ELECTRICITY
G07F17/3213
PHYSICS
G06F3/016
PHYSICS
International classification
A63F13/28
HUMAN NECESSITIES
A63F13/40
HUMAN NECESSITIES
H04N21/478
ELECTRICITY
G06F3/02
PHYSICS
Abstract
A gaming system and method for integrating tactile feedback into a virtual reality environment as viewed by a virtual reality viewer is disclosed. A physical object, which may be a physical button panel, such as a button panel printed on paper, dice, playing cards, coins or chips, a floor or any other tangible object, has a view thereof incorporated into the virtual reality environment. When the user touches the physical object, the touch gesture is captured and processed by the system to interpret the touch gesture as an input. The physical object may include a printed, projected or touch screen panel, a hand-held object, a portion of a gaming machine cabinet, a table top, a floor and the like.
Claims
1. A system for providing tactile feedback for a user of a virtual reality viewer, the system including one or more servers to package and control virtual reality content delivered to the viewer responsive to user inputs and a transceiver to deliver the content to the viewer and receive and transmit inputs from the user to the one or more servers through a communication network, the system comprising: a tangible object associable with at least one user input, a user's touch of the object providing a tactile feedback to the user, the tangible object not providing any signal to the one or more servers responsive to the touch by the user; a video camera to capture real-time image data corresponding to the user's touch of the tangible object; a controller to allocate a virtual reality input function for the object, receive the image data and (i) synchronize a physical touch of the object with a generated virtual reality image corresponding to a touch of the object, (ii) allocate a virtual reality input function and corresponding signal to the touch of the object and (iii) provide the signal to the transceiver.
2. The system of claim 1 wherein the viewer includes at least one position sensor to detect when a field of view of the user includes the tangible object wherein the controller is configured to generate an augmented reality image of the object to, from the user's viewpoint, overlay one or more images on the object.
3. The system of claim 1 wherein the tangible object comprises visually defined positions to be associated with a plurality of different user inputs each having an associated virtual reality input function and corresponding signal, the controller configured to synchronize the virtual reality images to the visually defined positions and to detect from the image data a touch of one of the visually defined positions to provide its associated signal.
4. The system of claim 1 wherein the tangible object is compressible and the user's touch comprises compressing the object.
5. The system of claim 1 wherein the user's touch comprises pressing upon the object.
6. The system of claim 1 wherein the tangible object comprises a depressible button.
7. The system of claim 1 wherein the user's touch comprises moving the object.
8. The system of claim 1 wherein the object comprises a flat surface and wherein the input signal comprises a location of the user's touch on the object.
9. The system of claim 8 wherein the flat surface further comprises a haptic feedback device.
10. The system of claim 1 wherein the tangible object is selectable by the user from a set of objects viewable via the virtual reality viewer.
11. The system of claim 1 wherein the tangible object is dynamically assigned by the controller from a set of objects detectable by a camera of the system.
12. The system of claim 1 wherein at least a portion of the body of the user is depicted by the virtual reality viewer.
13. The system of claim 1 wherein the user is depicted by the virtual reality viewer as an avatar and wherein the user's touch affects the depiction of the avatar.
14. A system for providing tactile feedback for a user of a virtual reality viewer for playing a virtual gaming device, the system including one or more servers to package and control virtual reality gaming content delivered to the viewer responsive to user inputs and a transceiver to deliver the content to the viewer and receive and transmit inputs from the user to the one or more servers through a communication network, the system comprising: a physical button panel for buttons associable with different user inputs, the user's touch at a button providing a tactile feedback to the user, the panel not providing any signal to the one or more servers responsive to a button touch; a video camera to capture real-time image data corresponding to the user's touch at the button panel; a controller to allocate a virtual reality input function for the buttons, receive the image data and (i) synchronize a physical touch at the panel with a generated virtual reality image corresponding to a touch of the button, (ii) allocate a virtual reality input function and corresponding output signal to the touch of the button and (iii) provide the signal to the transceiver.
15. The system of claim 14 wherein the viewer includes position sensors to detect when the user's field of view includes the button panel wherein the controller is configured to generate an augmented reality image of the panel to, from the user's viewpoint, overlay one or more images on the panel.
16. A method for providing tactile feedback to a user of a virtual reality viewer via a system including one or more servers to package and control virtual reality content delivered to the viewer responsive to user inputs and a transceiver to deliver the content to the viewer and receive and transmit inputs from the user to the one or more servers through a communication network, the method comprising: providing a tangible object associable with at least one user input, a user's touch of the object providing a tactile feedback to the user, the tangible object not providing any signal to the one or more servers responsive to the touch by the user; capturing, with a camera, real-time image data corresponding to the user's touch of the tangible object; receiving the image data; synchronizing image data representing the touch of the tangible object with a generated virtual reality image corresponding to the touch of the object; allocating a virtual reality input function and corresponding signal to the touch of the object; and providing the signal to the transceiver.
17. The method of claim 16 wherein the viewer includes one or more position sensors to detect when the user's field of view includes the tangible object; and comprising the step of generating a virtual reality image of the physical object to, from the user's viewpoint, overlay one or more images on the physical object.
18. The method of claim 16 wherein the tangible object is compressible and the user's touch comprises compressing the object.
19. The method of claim 16 wherein the user's touch comprises pressing upon the object.
20. The method of claim 16 wherein the object comprises a flat surface and wherein the input signal comprises a location of the user's touch on the object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] While the invention is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
DETAILED DESCRIPTION
[0026] While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated. For purposes of the present detailed description, the singular includes the plural and vice versa (unless specifically disclaimed); the words “and” and “or” shall be both conjunctive and disjunctive; the word “all” means “any and all”; the word “any” means “any and all”; and the word “including” means “including without limitation.”
[0027] For purposes of illustrating an embodiment of the invention, it will, unless otherwise indicated, be described with reference to a virtual reality environment for casino games. It should be understood that the invention has utility outside of gaming, in environments having user button or other touch inputs to control an aspect of, respond to queries from, and provide other inputs relevant to the virtual reality experience of the user; for example, button-type inputs where tactile feedback to the user may enhance other computer gaming, publishing, digital photo-processing or other environments susceptible to virtual reality (“VR”) viewing by a viewer.
[0028] For purposes of the present detailed description, the terms “wagering game,” “casino wagering game,” “gambling,” “slot game,” “casino game,” and the like include games in which a player places at risk a sum of money or other representation of value, whether or not redeemable for cash, on an event with an uncertain outcome, including without limitation those having some element of skill. In some embodiments, the wagering game involves wagers of real money, as found with typical land-based or online casino games. These types of games are sometimes referred to as pay-to-play (P2P) gaming. In other embodiments, the wagering game additionally, or alternatively, involves wagers of non-cash values, such as virtual currency, and therefore may be considered a social or casual game, such as would be typically available on a social networking web site, other web sites, across computer networks, or applications on mobile devices (e.g., phones, tablets, etc.). These types of games are sometimes referred to as play-for-fun (P4F) gaming. When provided in a social or casual game format, the wagering game may closely resemble a traditional casino game, or it may take another form that more closely resembles other types of social/casual games.
[0029] Referring to
[0030] The present invention can also apply to wired networks as well where the VRV 10 is connected by a cable to, for example, a game console, PC computer or the like. In a gaming environment such as a casino where gaming supporting VR content is provided to a VRV 10 provided by or to the player, the communication network will typically be wireless.
[0031] As suggested in
[0032] The VRV 10 may also include a gyroscope, accelerometer, compass and other devices. Modern smart phones often include these devices. As such, the VRV 10 can detect movement, direction and speed of movement of the player's head. To provide the player with an immersive VR experience, the VRV 10 may be controlled to alter the VR view of the player as he/she turns their head or looks up or down. The VRV 10 provides signals responsive to detecting such movements to one or more VR rendering sources (discussed below) to alter in real-time the VR view experienced by the player. For example, the player viewing the gaming device video display 20 may turn their head to the right, resulting in the VR content streaming to the VRV 10 being altered to show a view of neighboring gaming devices, other people, or other scenery. VR cameras can acquire a 360°, live video image at a location such as in front of a gaming machine.
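The head-tracking behavior of paragraph [0032] can be illustrated with a minimal sketch (not part of the original disclosure; function and parameter names are hypothetical): a yaw angle reported by the gyroscope/compass determines which angular window of a 360° scene the rendering source streams to the VRV.

```python
# Illustrative only: map a reported head yaw to the angular window of a
# 360-degree scene that should be rendered for the viewer.
def visible_window(head_yaw_deg, fov_deg=90):
    """Return the (start, end) yaw range in degrees [0, 360) to render,
    centered on the viewer's head orientation."""
    half = fov_deg / 2.0
    start = (head_yaw_deg - half) % 360
    end = (head_yaw_deg + half) % 360
    return start, end

# Facing forward (0 degrees), a 90-degree field of view wraps around zero.
```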
[0033] To provide VR content to the VRV 10, the provider may record the VR environment or a portion thereof to be rendered to the player. For example, and with continuing reference to
[0034] As discussed above, it is known to provide for gesture recognition for VRVs, such as recognition of hand gestures and finger gestures. In environments where there is no hand-held or hand-actuated controller, there are no means, or insufficient means, for a user to have tactile feedback for inputs such as finger touches or finger slides on buttons. In the illustration of
[0035] To provide tactile feedback to the player, a physical, communicatively inert, button panel 30 is provided, an example of which is shown at
[0036] To coordinate the physical button panel 30 with the generated VR environment viewed by the player such as the gaming machine display of
[0037] As can be appreciated, to provide tactile feedback to the player, the player touches the communicatively inert physical button panel 30 as shown at 40 (
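Because the panel is communicatively inert, interpretation happens entirely on the image-processing side. A minimal hit-testing sketch (an assumption for illustration, not the disclosed implementation) shows how a fingertip position extracted from the camera image could be mapped to the input function allocated to a button region:

```python
def interpret_touch(x, y, button_regions):
    """Given a fingertip position (x, y) in panel coordinates, return the
    name of the virtual input function whose button region was touched,
    or None if the touch falls outside every region."""
    for name, (x0, y0, x1, y1) in button_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Hypothetical two-button layout in panel coordinates (pixels).
REGIONS = {"SPIN": (0, 0, 100, 40), "BET": (110, 0, 210, 40)}
```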
[0038] The physical button panel 30 can be of any configuration. Where, for example, the physical button is a laser projected button panel projected on a rigid surface, the physical button panel 30 can change based upon the game content being presented. However, the button touches are captured by the camera 14 of the VRV 10 and are not, with respect to the VR content, input via the laser projected button panel. That is, the player may wish to play a hypothetical game of “Queen's Treasure” and may so indicate through their VRV 10. The system would package for delivery to the player the associated VR content and may send a signal through the network to a laser projector to project the corresponding button panel configuration on a rigid surface to define the appropriate physical button panel 30. The VRV 10 captures the laser projected button panel and synchronizes the physical button panel 30 into the VR content for the play of “Queen's Treasure”. The player's touches at the laser projected physical button panel 30 are detected by the camera 14, which interprets the same as an appropriate input.
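The game-to-panel coordination above might be organized as follows (an assumed sketch; game titles, button names and the message format are illustrative, not part of the disclosure):

```python
# Hypothetical mapping from a selected game title to the button layout the
# system would ask the laser projector to project.
PANEL_LAYOUTS = {
    "Queen's Treasure": ("SPIN", "BET ONE", "MAX BET", "CASH OUT"),
    "default": ("SPIN", "BET"),
}

def projector_command(game_title):
    """Build the message sent through the network to the laser projector so
    the projected physical button panel matches the chosen game."""
    layout = PANEL_LAYOUTS.get(game_title, PANEL_LAYOUTS["default"])
    return {"action": "project_panel", "buttons": layout}
```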
[0039] The acquisition and incorporation of a virtual replica of the physical button panel 30 into the VR content may be through augmented reality in a fashion as described in Lyons et al., U.S. Pat. No. 9,269,219, issued Feb. 23, 2016, published Oct. 24, 2013 and titled “System and Method for Augmented Reality with Complex Augmented Reality Video Images,” the disclosure of which is incorporated by reference.
[0040] To provide the VR content to the VRV 10 in a casino environment according to an embodiment of the invention, the VRV 10 is in communication with a system 300 as illustrated in
[0041] To configure the VRV 10 according to an embodiment of the present invention where the VRV 10 is a player's smart phone, the CCS 302a may be configured to store downloadable configuration software client applications adapted to be downloaded by the player for configuring their smart phone device to receive and process data as described herein. During download, this software client may also return to the CCS 302a data related to the smart phone configuration, e.g. display size and resolution, video camera resolution and capabilities (e.g. infrared enabled), processing capabilities and operating system, as well as accessory links to receive data from smart phone devices such as the camera, gyroscope, compass and accelerometer for determining the view direction and movement of the VRV 10.
[0042] The smart phone 400 typically includes peripherals such as the camera 14, a gyroscope 418, compass 420 and speaker 422. Other peripherals such as one or more accelerometers may be provided to determine acceleration associated with the movement of the smartphone 400.
[0043] To configure the smart phone 400 into the VRV 10 and with reference to
[0044] The player accesses the system 300 and during the process confirms their credentials and acquires at 700 the appropriate software client application through a download from the CCS 302a to their smart phone 400 to arrange the controller 402 and various modules 404, 406, 410, 412, 414 at the smart phone 400 for configuration as the controller for the VRV 10 to support the features of this invention. As shown in
[0045] At 702 the player launches or initiates the client application to receive VR content. In an embodiment a video instruction may tell the player to move their head such that the VRV 10 camera 14 captures at 704 an image of the physical button panel 30. In an embodiment the controller 402, alone or with processing at the system 300, at 706 synchronizes the view of the physical button panel 30 into the VR content for the gaming device as described with reference to
[0047] To provide the player with a platform to play the virtual reality supported game, as shown in
[0048] One or more features of the present invention may be provided to a player who is remotely located from a casino enterprise by an iGaming system for either P2P or P4F gaming. That is, a player at home may desire to have a VR gaming experience to play a game for fun wagering virtual credits or, where legal, actually wagering value consideration.
[0049] The website 800 also accesses a player-centric iGaming platform level account module 804 at 806 for the player to establish and confirm credentials for play and, where permitted, access an electronic funds account (eWallet) for wagering. The account module may include or access data related to the player profile (player-centric information desired to be retained and tracked by the host), the player's eWallet and deposit and withdrawal records, registration and authentication information such as username and password, name and address information, date of birth, a copy of a government issued identification document such as a driver's license or passport, and biometric identification criteria (such as fingerprint or facial recognition data), and a responsible gaming module containing information such as self-imposed (or jurisdictionally imposed) gaming restraints such as loss limits, daily limits and duration limits. The account module 804 may also contain and enforce geo-location limits, such as geographic areas where the player may play P2P games, user device IP address confirmation and the like.
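The responsible-gaming restraints mentioned above reduce to simple checks at wager time. A minimal sketch (an assumption for illustration; the limit names are hypothetical, not from the disclosure):

```python
def wager_allowed(wager, session_loss, limits):
    """Check a proposed wager against self- or jurisdictionally imposed
    restraints such as a per-wager maximum and a session loss limit.
    Absent limits default to unbounded."""
    if wager > limits.get("max_wager", float("inf")):
        return False
    if session_loss + wager > limits.get("loss_limit", float("inf")):
        return False
    return True
```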
[0050] The account module 804 communicates at 805 with a game module 816 for completing log-ins, registrations and other activities. The game module 816 may also store or access a player's gaming history such as player tracking and loyalty club account information. The game module 816 may provide static web pages to the VRV 10 from the game module 816 through line 818 whereas, as stated above, the live VR content is provided from the gaming server 814 to the web game client through line 811.
[0051] The VR game server 814 is configured to provide interaction between the game and the player, such as receiving wager information, game selections, button interaction gesture recognition, and inter-game player selections or choices to play a game to its conclusion, as well as the random selection of game outcomes and graphics packages which, alone or in conjunction with the downloadable game client 808/web game client 802 and game module 816, provide for the display of game graphics and player interactive interfaces. At 818, player account and log-in information is provided to the gaming server 814 from the account module 804 to enable gaming. At 820, wager/credit information is provided between the account module 804 and gaming server 814 for the play of the game and may display credits/eWallet availability. At 822, player tracking information is provided to the gaming server 814 for tracking the player's play. The tracking of play may be used for purposes of providing loyalty rewards to a player, determining preferences and the like.
[0052] All or portions of the features of
[0053] In a further embodiment where a player at a physical gaming machine would like to continue gaming elsewhere in the casino in a VR environment, the player may elect to move the game being played for play using the VRV 10 at another location such as a player station 600 in a bar or restaurant. This may be advantageous where, for example, the casino venue is limited to a number of gaming machines. The player using their smart phone 400 would go through the steps to transfer the game to the mobile device such as disclosed in Hedrick et al, US Pub App 2015/0228153A1 published Aug. 13, 2015 and titled “System and Method for Remote Control Gaming Sessions Using a Mobile Device” the disclosure of which is incorporated by reference. The system 300 recognizes the request to transfer and thereafter moves the game experience to a VR experience as described above.
[0054] The acquisition of the physical button panel 30 for integration into the VR content may be through augmented reality technology as described in Lyons, et al U.S. Pat. No. 8,469,260 issued Jun. 25, 2013 and titled “System and Method for Assisted Maintenance in a Gaming Machine Using a Mobile Device” the disclosure of which is incorporated by reference. The player with the VRV 10 camera 14 acquires a video of the physical button panel 30 and in an embodiment the bar code 34. The controller 402 and/or system 300 receive the video data and use that information to overlay function graphics for the buttons.
[0055] A generic physical input device other than a button panel may take the form of a compressible ball or a cube or other multi-faceted object that fits in the player's hand. For example, the object may be constructed of foam or rubber. The object can be squeezed and released, acting as a button, when the camera 14 of VRV 10 detects the player's hand so acting on the object. In some embodiments, the blank object (as illustrated in
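One way the camera data could be reduced to a button press (an assumed sketch, not the disclosed implementation) is by tracking the object's apparent size across frames: while the foam ball is compressed below a fraction of its resting diameter, the "button" reads as pressed.

```python
def detect_squeeze(diameters_px, threshold=0.8):
    """Given the object's apparent diameter (in pixels) in successive
    camera frames, report for each frame whether the object is compressed
    below `threshold` times its resting size, i.e. whether the squeeze
    'button' is pressed in that frame."""
    resting = diameters_px[0]  # assume the first frame shows the object at rest
    return [d < threshold * resting for d in diameters_px]
```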
[0056] The above examples of buttons and a button panel may be extended to any number of tangible physical objects which are also within the scope of the various embodiments of the invention. One example of a VR game which may be made available in accordance with one or more embodiments of a system as described by
[0057] As described above with respect to buttons, the camera 14 of VRV 10 acquires a video of each physical object and its orientation on the table or in each player's hands. The controller 402 and/or system 300 receive the video data and use that information to overlay values on the cards and chips, position an avatar of each player around a virtual table and mimic their movements, etc. The values of the playing cards do not matter, nor does the color of the chips, the size of the table, etc. The inclusion of the physical objects in play of the game provides individualized tactile feedback to each player while playing a virtual game presented on the VRV 10. Once the objects are detected, the system overlays all relevant markings, such as backs and rank and suit, on the cards and colors or values on the chips according to their orientation in physical space. For example, if a card is face up, its face is shown. If not, its back is shown. Similarly, if a player “peeks” at his physical cards by physically lifting up a corner, tucks his cards under his chips to signal “staying” in Blackjack, moves one or more chips into a betting circle or the like, these actions will be represented in the virtual world via that player's presentation on the VRV 10 and also in the virtual worlds of any other players of the game.
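The overlay logic for cards can be sketched as follows (an assumption for illustration; the disclosure does not give an implementation). The physical card carries no markings that matter; the system assigns each detected card an identity and renders either that identity or a card back depending on the card's detected orientation:

```python
def overlay_for_card(card_id, face_up, assigned_values):
    """Return the marking to render over a detected physical card: its
    game-assigned rank and suit if face up, otherwise the card back."""
    return assigned_values[card_id] if face_up else "BACK"

# Hypothetical assignment made by the game engine, not by the physical cards.
ASSIGNED = {"card_1": "ace of spades", "card_2": "king of hearts"}
```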
[0058] In accordance with one or more embodiments, system 300 can also detect if any of the required objects is missing and suspend game play until all required objects are provided and ready for use. Similarly, some embodiments may require the placement of certain objects in certain locations in order for game play to start or continue. For example, the game may direct a player to place his two playing cards in a space marked by a rectangle or to place one of his chips in an ante circle depicted by VRV 10.
[0059] In accordance with still other embodiments, a single die or two or more dice may be used. Again, the player has dice he can physically hold, shake and throw in order to provide tactile feedback to his VR game. The VRV 10 tracks the dice on a tabletop or floor and represents their location on its display so they can be picked up again by the player. As with the card example above, when the player throws the dice, the face that actually lands upright is irrelevant as the image provided in the virtual world will show the outcome determined by the game engine. In accordance with some embodiments, to avoid having to track the dice and have them be picked up by the player, they may be kept in a sealed cup. When it is time to roll the dice, the player can still shake and feel the dice in the cup, but when the player makes a throwing motion, the virtual dice appear thrown while the physical dice remain in the cup. The cup is next used when it is time to throw the dice again.
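The dice example highlights that the physical throw only supplies tactile feedback; the outcome shown in the virtual world comes from the game engine. A minimal sketch (an assumption for illustration, not the disclosed implementation):

```python
import random

def resolve_throw(rng, num_dice=2):
    """Determine the dice outcome in the virtual world. The faces the
    physical dice actually land on are irrelevant; the game engine's
    random number generator decides what the VRV displays."""
    return [rng.randint(1, 6) for _ in range(num_dice)]
```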
[0060] In accordance with one or more embodiments, a floor space may become a source of feedback for the player. If a player has an open floor space available in a room, camera 14 of VRV 10 captures an image of the space and determines its size in order to determine a scale usable in the virtual scene. The floor space then becomes akin to a touchscreen surface and the location of the player's feet within the space determines where a “touch” occurs. For example, in a game of virtual roulette, the player may not be sitting at a table but, instead, be represented as an avatar in a large virtual world who can walk around on the betting layout, placing wagers or issuing commands with his feet by stepping on virtual buttons portrayed by the VRV 10. A selection or wagering action, for example, may occur if the player jumps up and down, taps his foot, etc. In these cases, the input signal includes not only an activation signal, but position information, such as x-, y- and z-coordinates, as well, all of which may be combined by the system in evaluating the nature of the input.
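The floor-as-touchscreen mapping above is essentially a change of scale between the captured floor space and the virtual layout. A sketch under assumed units (floor dimensions in meters, layout in arbitrary coordinates; names are illustrative):

```python
def foot_to_layout(foot_xy, floor_size, layout_size):
    """Scale a detected foot position within the captured floor space to a
    'touch' location on the virtual betting layout, treating the floor as
    a touchscreen surface."""
    fx, fy = foot_xy
    floor_w, floor_h = floor_size
    layout_w, layout_h = layout_size
    return (fx / floor_w * layout_w, fy / floor_h * layout_h)
```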
[0061] Similarly, in accordance with still other embodiments, a surface such as a blank tabletop provided by the player can become the scene for a 3D world portrayed by the VRV 10. The player can walk around edges of the table and see the virtual world or game from different perspectives. In accordance with still other embodiments, the tabletop may also serve as a touchscreen over which the player can “walk” around the surface 1000 of the game space with his fingers, as shown in
[0062] Alternately, the player may stay in place and, by using hand gestures or pressing virtual controls on the surface of the table, rotate or otherwise alter the presentation of the table in order to view it from different angles. In some embodiments, a haptic feedback pad may be placed on the table to provide additional feedback when dice, cards, chips and the like hit the table.
[0063] A VR game may use existing buttons and controls on an existing device such as a gaming machine. At various points in the VR game, more controls than are provided by the existing device may be required, and the game may ask a player to dynamically assign objects he can feel and also see in the rendered scene as the new controls. For example, an extra button may be required. In accordance with one or more embodiments, the game may ask the player to select a visible object that he can also feel for use as the button or control. The VRV 10 may display portions of the player's body visible to camera 14, such as his wrist, and the player may select the face of his wristwatch as the additional control. During the game, any time the player touches the face of his wristwatch, the control is activated. Similarly, various surfaces on the gaming machine itself may be selected. In another non-limiting example, the player may elect to use the center top edge of gaming machine cabinet 502 as a control. Again, during the game, any time the camera 14 of VRV 10 detects the player touching the center top edge of the gaming machine cabinet, the control is activated.
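Dynamic control assignment as described above amounts to a runtime binding between a camera-recognizable object and a game action. A minimal registry sketch (assumed; identifiers and action names are hypothetical, not from the disclosure):

```python
class ControlRegistry:
    """Bind player-selected, camera-visible objects (e.g. a wristwatch face
    or a cabinet edge) to extra game controls at runtime."""
    def __init__(self):
        self.bindings = {}

    def assign(self, object_id, action):
        """Record the player's choice of object as a new control."""
        self.bindings[object_id] = action

    def on_touch(self, object_id):
        """Return the bound action when a touch of the object is detected,
        or None if the object is not an assigned control."""
        return self.bindings.get(object_id)
```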
[0064] In accordance with some embodiments, the VR game may assign certain unused spaces on a physical device such as a physical button panel 602 or gaming machine cabinet 502 and, using augmented reality, overlay the new control in that space. When the player touches the overlaid control, the underlying surface provides tactile feedback that the additional control has been touched.
[0066] At block 1110 of
[0067] In accordance with some embodiments, the viewer includes position sensors to detect when the user's field of view includes the physical object. In these cases, the optional step of generating an augmented image of the physical object to, from the user's viewpoint, overlay one or more images on a live image of the physical object, is performed at block 1120.
[0068] At block 1130, since the physical object does not provide any signal to the one or more servers responsive to a touch by the user, a virtual reality system input function and a signal corresponding to a detected touch of the object is assigned.
[0069] At block 1140, the camera captures real-time image data corresponding to the user's touch of the physical object determined in block 1110 and the image data is sent to and received by the system's server(s) for processing at block 1150.
[0070] At block 1150, the user's touch of the physical object is synchronized with a generated virtual reality image corresponding to the touch of the object to provide visual feedback to the user. As noted above, tactile feedback to the user is provided by the physical object itself.
[0071] Finally, at block 1160, the signal assigned in block 1130 is provided to the transceiver and sent to the server.
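Blocks 1130 through 1160 can be summarized in one sketch (an assumption for illustration; names are hypothetical): the inert object emits nothing itself, so the system looks up the input function assigned to the object and forwards the corresponding signal to the transceiver.

```python
def process_touch_event(detected_object, touch_xy, input_map, transceiver_queue):
    """Handle a camera-detected touch of an inert object: look up the signal
    assigned to the object (block 1130) and queue it for the transceiver
    (block 1160). Returns the signal, or None if the object has no
    assigned input function."""
    signal = input_map.get(detected_object)
    if signal is not None:
        transceiver_queue.append((signal, touch_xy))
    return signal
```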
[0072] The order of actions as shown in
[0073] While the above invention has been described with reference to a gaming environment, it has applications to VR users in other environments where touch feedback would be advantageous. For example, at home, a user may want to engage in online banking or other eCommerce activity. They would print a physical button panel and acquire the client application for providing the VR environment. The user could virtually walk through a store or mall and use the buttons, supported by tactile feedback, to make selections.
[0074] Each of these embodiments and obvious variations thereof is contemplated as falling within the spirit and scope of the claimed invention, which is set forth in the following claims. Moreover, the present concepts expressly include any and all combinations and sub combinations of the preceding elements and aspects.