METHOD AND SYSTEM FOR USING SENSORS OF A CONTROL DEVICE FOR CONTROL OF A GAME
20180104573 · 2018-04-19
Assignee
Inventors
- Mark John Jeffery (Mesa, AZ, US)
- Robert Sunshin Komorous-King (Berkeley, CA, US)
- Manoj Kumar Rana (Gurgaon, IN)
- James S. Frey (Glenview, IL, US)
- Jordan Matthew Blackman (Los Angeles, CA, US)
CPC classification
A63F13/40
HUMAN NECESSITIES
G06F3/041
PHYSICS
G06F1/1694
PHYSICS
A63F13/92
HUMAN NECESSITIES
G06F3/017
PHYSICS
A63F13/2145
HUMAN NECESSITIES
G06F2200/1637
PHYSICS
A63F2300/105
HUMAN NECESSITIES
G06F2203/0381
PHYSICS
A63F13/26
HUMAN NECESSITIES
A63F13/428
HUMAN NECESSITIES
G06F3/0346
PHYSICS
A63F13/211
HUMAN NECESSITIES
A63F13/27
HUMAN NECESSITIES
International classification
A63F13/211
HUMAN NECESSITIES
A63F13/27
HUMAN NECESSITIES
A63F13/2145
HUMAN NECESSITIES
A63F13/40
HUMAN NECESSITIES
A63F13/92
HUMAN NECESSITIES
Abstract
A control device with a touch screen and motion sensors is held in one hand with the screen facing the user. Preferably, thumb motion of the hand holding the control device on the touch screen sensor of the control device is the input to control the motion and animations of an avatar, wherein the avatar motion is displayed on the control device touch screen or, in an embodiment, on an external display device. An important aspect of the present invention is tilting the control device, causing an angular rotation velocity, which can trigger a game event such as throwing, kicking, shooting or other action of the game.
Claims
1. A system for control of a game, comprising: a control device having motion sensors and a touch screen; a game processor; and a display device; wherein the game processor renders the game based at least in part on motion sensor data relating to tilt gestures obtained from the motion sensors to simulate and animate game events and touch sensor data obtained from the touch screen to provide placements of an avatar on the display device.
2. The system of claim 1, wherein the game processor is remote from the control device.
3. The system of claim 1, wherein the control device is a smart phone.
4. The system of claim 1, wherein the display device is external to the control device.
5. The system of claim 1, wherein the motion sensors include motion sensors external to the control device.
6. The system of claim 1, wherein the control device is held in portrait mode.
7. The system of claim 1, wherein the control device is held in one hand.
8. The system of claim 1, wherein the motion sensors include a gyroscope and an accelerometer.
9. The system of claim 1, wherein motion sensor data relating to tilt gestures includes a computed angular velocity.
10. The system of claim 9, wherein the game processor simulates and animates the game events by determining whether the computed angular velocity exceeds a predetermined threshold, and if so, selects the game event.
11. The system of claim 9, wherein the computed angular velocity includes a computed maximum angular velocity.
12. The system of claim 1, wherein the simulated and animated game events result in an animation of one of shooting a basketball, throwing an American football, and bowling a bowling ball.
13. The system of claim 11, wherein the game processor includes a physics engine and the computed maximum angular velocity is input to the physics engine.
14. The system of claim 11, wherein the simulated and animated game events include a rendering of a virtual object trajectory with initial velocity proportional to the computed maximum angular velocity.
15. The system of claim 1, wherein the simulated and animated predetermined game events result in an animation of one of hitting a golf ball, hitting a tennis ball, pitching a baseball, hitting a baseball, hitting a hockey puck, kicking a soccer ball, casting a fishing rod, and a boxing punch.
16. The system of claim 1, wherein the motion sensors include a gravity sensor.
17. The system of claim 1, wherein motion sensor data relating to tilt gestures includes a computed rotation angle relative to the X-axis of the control device.
18. The system of claim 17, wherein the simulated and animated game events at least in part includes rendering a virtual object trajectory with direction proportional to a computed rotation angle.
19. The system of claim 1 wherein the placements of an avatar on the display device at least in part correspond to changes in relative positions of touches on the touch screen.
20. The system of claim 1, wherein the game processor further displays a feedback meter providing an indication of the strength of the tilt gesture.
21. The system of claim 1, wherein the game processor further displays a feedback meter providing an indication of the direction of the tilt gesture.
22. A method for controlling a game, comprising: obtaining motion sensor data and touch sensor data from a game control device; and controlling the game based at least in part on the obtained motion sensor data and the touch sensor data; wherein the touch sensor data controls placement of an avatar on a display device and the motion sensor data controls a game event.
23. The method of claim 22, wherein the obtained motion sensor data includes data obtained from a gyroscope and an accelerometer of the game control device related to a tilting gesture.
24. The method of claim 22, wherein the obtained touch sensor data includes data obtained from a touch screen of the game control device.
25. The method of claim 22, wherein the obtained motion sensor data includes data obtained from a gyroscope and an accelerometer of the game control device and the obtained touch sensor data includes data obtained from a touch screen of the game control device, this data obtained while the control device is held in a single hand of a user.
26. The method of claim 25, wherein the control device is held in portrait mode.
27. The method of claim 22, wherein the touch sensor data corresponds to thumb touches of the holding hand.
28. The method of claim 22, further comprising, prior to obtaining the motion sensor data, computing the angular velocity data of the control device for a gesture of the game control device; wherein the obtained motion sensor data includes the computed angular velocity data of the control device.
29. The method of claim 28, wherein a computed angular velocity includes a computed maximum angular velocity.
30. The method of claim 29, wherein the computed maximum angular velocity is input to a physics engine.
31. The method of claim 22, wherein the event is one of shooting a basketball, throwing an American football, and throwing a bowling ball.
32. The method of claim 22, wherein controlling the game includes using the obtained touch sensor data and the motion sensor data to simultaneously control different aspects of game play.
33. The method of claim 31, wherein controlling the game at least in part includes rendering a virtual object trajectory with initial velocity proportional to the computed maximum angular velocity.
34. The method of claim 22, wherein the event is one of hitting a golf ball, hitting a tennis ball, pitching a baseball, hitting a baseball, hitting a hockey puck, kicking a soccer ball, casting a fishing rod, and a boxing punch.
35. The method of claim 22, wherein the motion sensor data includes gravity sensor data.
36. The method of claim 23, wherein motion sensor data relating to tilt gestures includes a computed rotation angle relative to the X-axis of the control device.
37. The method of claim 23, wherein controlling the game includes rendering a virtual object trajectory with direction proportional to a computed rotation angle.
38. The method of claim 22 wherein the touch sensor data corresponds to continuous touching of the touch screen such that placement of the avatar is proportional to the change in relative position of the touching on the touch screen for the duration of the continuous touching.
39. The method of claim 22, wherein controlling the game includes rendering a graphic to provide visual feedback to the user regarding strength of a gesture.
40. The method of claim 22, wherein controlling the game includes rendering a graphic to provide visual feedback to the user regarding direction of a gesture.
41. The method of claim 22, wherein the game is a fighting game.
42. The method of claim 22, wherein the game is a virtual reality game.
43. A method for controlling a game, comprising: obtaining motion sensor data from a game control device related to a tilt gesture; computing a maximum angular velocity of the control device from the obtained motion sensor data; and controlling the game based at least in part on the computed maximum angular velocity.
44. The method of claim 43, wherein controlling the game includes determining whether the computed maximum angular velocity exceeds a predetermined threshold, and if so, selecting a predetermined game event.
45. The method of claim 43, wherein controlling the game results in an animation of one of shooting a basketball, throwing an American football, and bowling a bowling ball.
46. The method of claim 43, wherein the computed maximum angular velocity is input to a physics engine.
47. The method of claim 43, wherein controlling the game at least in part includes rendering a virtual object trajectory with initial velocity proportional to the maximum computed angular velocity.
48. The method of claim 43, wherein the obtained motion sensor data includes data obtained from a gyroscope and an accelerometer of the game control device.
49. The method of claim 48, wherein the obtained motion sensor data includes data obtained by applying a low pass filter to accelerometer data.
50. The method of claim 43, further comprising, prior to obtaining the motion sensor data, computing the rotation angle of the control device relative to the X-axis for a gesture of the game control device; wherein the obtained motion sensor data includes the computed rotation angle data of the control device.
51. The method of claim 50, wherein the event at least in part includes rendering a virtual object trajectory with direction proportional to the computed rotation angle data.
52. The method of claim 43, wherein the controlling the game results in an animation of one of hitting a tennis ball, pitching a baseball, hitting a baseball, hitting a hockey puck, kicking a soccer ball, casting a fishing rod, and a boxing punch.
52. The method of claim 43, wherein a rendered graphic provides visual feedback to the user regarding strength of a gesture.
53. The method of claim 50, wherein a rendered graphic provides visual feedback to the user regarding the direction of a gesture.
54. The method of claim 43, wherein the game is a fighting game.
55. The method of claim 43, wherein the game is a virtual reality game.
56. A method for controlling a game, comprising: obtaining motion sensor data from a game control device related to a tilt gesture; computing a maximum angular velocity of the control device from the obtained motion sensor data; controlling the game based at least in part on the computed maximum angular velocity; and displaying a feedback meter providing an indication of the strength of the tilt gesture.
57. The method of claim 56, wherein controlling the game includes determining whether the computed maximum angular velocity exceeds a predetermined threshold, and if so, selecting a predetermined game event.
58. The method of claim 56, wherein controlling the game results in an animation of one of shooting a basketball, throwing an American football, and bowling a bowling ball.
59. The method of claim 56, wherein controlling the game at least in part includes rendering a virtual object trajectory with initial velocity proportional to the computed maximum angular velocity.
60. The method of claim 56, wherein the obtained motion sensor data includes data obtained from a gyroscope and an accelerometer of the game control device.
61. The method of claim 56, further comprising, prior to obtaining the motion sensor data, computing the rotation angle of the control device relative to the X-axis for a gesture of the game control device; wherein the obtained motion sensor data includes the computed rotation angle data of the control device.
62. The method of claim 61, wherein the event at least in part includes rendering a virtual object trajectory with direction proportional to the computed rotation angle data.
63. The method of claim 56, wherein the controlling the game results in an animation of one of hitting a tennis ball, pitching a baseball, hitting a baseball, hitting a hockey puck, kicking a soccer ball, casting a fishing rod, and a boxing punch.
64. A cloud-based gaming system, comprising: a plurality of control devices each having motion sensors and a touch screen; a gaming server including a gaming rules engine; and a plurality of display devices; wherein the plurality of control devices and display devices are connected via the Internet to the gaming server; wherein the gaming rules engine manages game play for a plurality of users for a plurality of games being concurrently played, each user using one of the control devices to control play in a respective game; and wherein the gaming server receives motion sensor data from each of the control devices to control a respective game being played, the motion sensor data relating to tilt gestures to control a game event for a respective game.
65. The system of claim 64, wherein the motion sensor data includes data from a gyroscope and an accelerometer.
66. The system of claim 64, wherein control of the game by the gaming server includes determining whether a computed maximum angular velocity exceeds a predetermined threshold, and if so, selecting a predetermined game event.
67. The system of claim 64, wherein control of the game by the gaming server includes animation of one of shooting a basketball, throwing an American football, and bowling a bowling ball.
68. The system of claim 66, wherein control of the game by the gaming server includes rendering a virtual object trajectory with initial velocity proportional to the computed maximum angular velocity.
69. The system of claim 64, wherein control of the game by the gaming server includes animation of one of hitting a tennis ball, pitching a baseball, hitting a baseball, hitting a hockey puck, kicking a soccer ball, casting a fishing rod, and a boxing punch.
70. The system of claim 64, wherein control of the game by the gaming server includes rendering a graphic providing visual feedback to the user regarding strength of a gesture.
71. The system of claim 70, wherein the rendered graphic is displayed on one of the control devices.
72. The system of claim 64, wherein the game is a virtual reality game.
74. The system of claim 64, wherein the plurality of control devices are in a stadium.
75. The system of claim 64, wherein at least one display device is a digital board in a stadium.
76. The system of claim 64, wherein the game is a virtual reality game.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
[0055] For clarity and consistency, the following definitions are provided for use herein:
[0056] As used herein, a control device refers to a portable device having sensors, including, but not limited to, a gyroscope, accelerometer and a touch sensor. In certain embodiments, the sensors are integral to the control device. However, in other embodiments, the sensors can include external sensors. In certain embodiments the control device may have integrated memory and a processor, and in other embodiments the processing may be enabled in a console or PC based system or other mobile device, connected via a cable or wirelessly to the control device.
[0057] As used herein, a display device is any display with the capability to display a web page, 3-D graphics engine output or any other downloadable application. A display device also includes a virtual reality headset with the ability to connect to a control device.
[0058] As used herein, a sensor is any device collecting data. Non-limiting examples of a sensor can be a gyroscope, touch sensor, accelerometer, camera, audio input, Doppler depth sensor, infrared motion sensor or thermal imaging camera.
[0059] As used herein, an animation is any graphically rendered output for a game, typically rendered by a graphics engine at the appropriate frame rate for a display device. Non-limiting illustrative examples of animations include pictures, lines, shapes, textures, videos, 3D renderings such as moving balls, or 3D rendered avatar movement.
[0061] A representative display 308 usable in conjunction with the present invention is an LED-backlit IPS LCD, 750×1334 pixels, 16M colors, with an integrated capacitive 3D touchscreen that is the touch sensor illustrated by 102. A representative motion sensor 101 usable in conjunction with the present invention is the M10 Motion coprocessor gyroscope, and the representative accelerometer is the M10 Motion coprocessor. However, it is to be understood that the present invention is not limited to the motion or touch sensor technology currently available. As shown, additional sensors 310 may be connected (wirelessly or via a cable) to the control device 300.
[0062] The exemplary mobile device 325 is illustrated in the referenced figure.
[0064] The methods and systems described herein are not limited to mobile devices such as Apple and Android smartphones, and the control device is not required to connect to the Internet. The disclosed technology for the sensors, internal or external to the device, is understood to be non-limiting, and the quality of the sensor outputs is expected to improve over time. As an illustrative example, the touch screen sensor usable in conjunction with the present invention can be based upon any of various methods such as resistive, capacitive, optical imaging, or another method of touch detection such as a personal computer mouse.
[0066] It is to be understood that there may be a multitude of control device sensors, and hence the specific sensors used, and the specific outputs of a sensor used, are understood to be non-limiting. It is to be further understood that multiple sensors may be used simultaneously, and that while the invention is illustrated by examples with a single control device 300, the method is extensible to multiple sensors or control devices. As illustrative non-limiting examples, (1) a control device 300 could be held in one hand and additional sensors 310 worn on a wrist, or (2) a control device 300 could be held in each hand and a sensor 310 included in a virtual reality display device headset. These examples are understood to be non-limiting; the methods and systems of the present invention are extensible to an arbitrary number of sensors attached to different parts of the body, such as the ankles, elbows, knees, and head.
[0067] An embodiment of the invention is for simultaneous touch and gesture input to a mobile device 325 in order to control a sports game. While the prior art discloses independently (1) touch input and (2) motion gesture input to control a game, there are significant synergies to combining these two previously independent modalities. Preferably, the mobile device 325 is held in one hand in portrait mode, with the screen 308 facing the user. Preferably, in this illustrative embodiment, touch input to the touch screen 308 by the thumb of the holding hand controls game avatar movement in any direction, and tilting gestures trigger shooting of objects, such as a basketball, soccer ball or other object, displayed 200 on the screen of the mobile device 308 or other external display device 350. Furthermore, the currently disclosed method of gesture analysis enables a game of skill shooting long or short, and left, right, or off the backboard, as an illustrative example for a basketball game. The disclosed invention is therefore a method and system of one-handed control of a game with high fidelity, and overcomes significant limitations of the prior art, which typically requires button and joystick input to a mobile device or controller held with two hands in landscape mode.
[0068] These and other novel elements of the invention will become apparent from the following detailed description of the invention in the context of control of a sports game for basketball and then with respect to other sports games including football, bowling, soccer, baseball and tennis. However, it is to be understood that the following examples are not meant to be limiting.
[0069] Basketball Game Embodiment
[0071] The animation controller 150 detects specific basketball-related events such as dribbling across the court, arm movements, shooting a ball or attempting to block a shot (events 140) based in part upon the sensor inputs 002 and 003, the logic of the layered logic trees used by the logic engine 130, and listener protocols used by the event manager 135. As an example, a thumb input 003 such as a screen swipe to move a player on the court, sensed by the touch sensor 002, will trigger an event 140 in the animation controller 150 to push a specific animation from the database 145 for rendering and display 200 by the 3D graphics display engine 210. For a gesture input 002, the logic engine 130 creates an event 140 that triggers the physics engine 175 to render a ball flight simulation and an animation 145 related to the event 140 (a basketball shot, for example). An important aspect of the present invention is the use of multiple concurrent sensor 100 inputs to control a game and the resulting blended animation data 180 that results in blended game output 215 rendered on the game display 200.
[0072] The inventive method is illustrated in more detail through various examples.
[0075] An additional feature of the invention is a feedback meter 155, illustrated in an embodiment.
[0077] It is customary in the art to place an orthogonal coordinate system (X, Y, Z) on the control device 300 such that Y is along the long axis of the device, X is perpendicular to Y and is on the short axis, and Z is perpendicular to both X and Y. Motion sensor 101 outputs are then referenced relative to this object coordinate system.
[0079] Sensor Kinetics by Innoventions, Inc. has a sensor fusion gravity sensor, which produces a similar data output.
[0080] As the control device 300 is rotated in space, the rotation can be detected from the X, Y, Z gravity data (g_X, g_Y, g_Z). Typical gravity data outputs of motion sensors 101 have maximum ranges from +9.8 m/sec^2 to −9.8 m/sec^2. The magnitude of the X earth gravity vector g_X is related to the angle θ 008 by:

g_X = g sin(θ).

[0081] In a first-order Taylor series approximation, sin(θ) ≈ θ, so that:

g_X = g sin(θ) ≈ g·θ.

[0082] Hence, θ ≈ g_X/g. Therefore, the gravity sensor data g_X/g is approximately equal to the angle θ 008 in radians.
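The small-angle relation above can be checked numerically. The following Python sketch (function names illustrative) computes the exact rotation angle from the X gravity component and compares it with the first-order approximation g_X/g:

```python
import math

def tilt_angle_from_gravity(g_x, g=9.8):
    """Exact rotation angle (radians) recovered from the X gravity component."""
    # Clamp to [-1, 1] to guard against sensor noise slightly exceeding g.
    return math.asin(max(-1.0, min(1.0, g_x / g)))

def tilt_angle_small(g_x, g=9.8):
    """First-order approximation: the angle in radians is simply g_x / g."""
    return g_x / g
```

For small tilts the two agree closely; the approximation error grows as θ³/6, which is negligible for the gesture angles involved here.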
[0085] As shown, touch and gesture controls can simultaneously control an avatar for running and shooting during a virtual basketball game. In a preferred embodiment the touch input is understood to be continuous, with the shot gesture 075 performed at any time by the user 010. An aspect of the invention is that no virtual joystick is rendered on the display 308 of the control device 300. The touch motion is instead centered about the last point the finger or thumb was placed on the control device touch sensor. Furthermore, a preferred embodiment has hundreds of animations in the content database, and the simultaneous gesture and touch sensor inputs trigger a multitude of animations which are rendered as blended game output; illustrative non-limiting examples include dunking, crossovers and spin moves. Hence, the exemplary illustrations are to be understood as showing a very small subset of possible gestures, movements and events for an actual virtual basketball game.
[0086] An additional and important aspect of the invention is the graphical feedback meter 155, which is updated periodically in proportion to the magnitude of the sensor 100 inputs. Preferably the updates occur at the frame rate of the system 400, and the feedback meter dynamically registers the strength of a gesture 075.
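As a minimal illustrative sketch of this per-frame update (the maximum angular velocity used for normalization is an assumption, not a value from the disclosure), the feedback meter fill could be computed as:

```python
def feedback_meter_fill(angular_velocity, max_angular_velocity=10.0):
    """Meter fill fraction (0..1) proportional to instantaneous gesture strength.

    max_angular_velocity is a hypothetical full-scale value in rad/sec;
    the fill is clamped so the meter never over- or under-fills."""
    return max(0.0, min(1.0, abs(angular_velocity) / max_angular_velocity))
```

Called once per frame with the current gyroscope reading, this yields the dynamic strength indication described above.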
[0087] The method and system 400 are not limited to a single player basketball shooting game.
[0088] A feature of the invention, illustrated in the embodiment of
[0089] The blended game output 215 and feedback meter 155 are not limited to rendering on the display 308 of the control device 300.
[0090] The invention has at least three embodiments incorporating a control device 300 and a display device 200: (1) the control device 300 is also the display device 200, such as a mobile smart phone; (2) the control device 300 is connected to an external display device 200 via a cable, Bluetooth or other local area network; and (3) the control device 300 is connected to the display device 200 via a cloud-based gaming platform 500. In embodiment (2) the display device may be connected to a gaming console such as a PlayStation 4 or Xbox One, or a personal computer (PC). In embodiment (3) it is to be understood that the display device and control device are Internet enabled, whereas in the other two embodiments, (1) and (2), the display and control device are not required to be connected to the Internet. Hence, the connection method of the control device to the display device is understood to be non-limiting.
[0091] Cloud-Based Gaming Platform Embodiment
[0093] As shown, the three major components of the gaming platform 500 are the control devices 300, a gaming server 450, and display devices 350. The gaming server 450 includes a gaming rules engine 451 that manages a plurality of games being played. As shown, the gaming rules engine 451 has access to a user database 455 and a gaming resources database 460. The user database 455 stores login information and game information. For basketball, the game information can include data for each shot made during the game, the player's current score, current level number, etc. The gaming resources database 460 can include graphical content for simulating the game on the display device 350.
[0094] In the illustrated embodiment, the gaming server 450 is cloud-based, enabling global connectivity via the Internet 475. For each user, the user's control device 300 and display device 350 can be simultaneously connected to the gaming server 450 through separate and distinct Internet connections 425.
[0095] Illustrative Preferred Embodiments
[0096] In the description of the present invention, exemplary methods for performing various aspects of the present invention are disclosed. It is to be understood that the methods and systems of the present invention disclosed herein can be realized by executing computer program code written in a variety of suitable programming languages, such as C, C++, C#, Objective-C, Visual Basic, and Java. It is to be understood that in some embodiments, substantial portions of the application logic may be performed on the display device using, for example, the AJAX (Asynchronous JavaScript and XML) paradigm to create an asynchronous web application. Furthermore, it is to be understood that in some embodiments the software of the application can be distributed among a plurality of different servers.
[0097] It is also to be understood that the software of the invention will preferably further include various web-based applications written in HTML, PHP, Javascript, XML and AJAX, accessible by the clients using a suitable browser (e.g., Safari, Microsoft Edge, Internet Explorer, Mozilla Firefox, Google Chrome, Opera) or downloadable and executable as a stand-alone application on a suitably configured display device. Furthermore, the graphics engine software may be one of Unreal, Unity, GameMaker or other software system capable of rendering 2D and/or 3D graphics on a display device 350.
[0098] In a preferred embodiment where the display device 350 is the control device 300, we use the Unity 3D game engine primarily for the implementation of the system 400. For the alternate preferred embodiment of the cloud-based system 500, preferably we install Unity on both the control device 300 and display device 350, and use the WebSocket protocol to communicate via the gaming server 450.
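As a hedged illustration of this communication path (the disclosure does not specify a message format, so all field names below are hypothetical), one frame of control-device sensor data might be serialized for transmission over the WebSocket connection as:

```python
import json
import time

def make_sensor_message(user_id, gyro, accel, touch):
    """Serialize one frame of control-device sensor data for the gaming server.

    gyro and accel are (x, y, z) tuples; touch is an (x, y) tuple or None.
    All field names are illustrative, not taken from the disclosure."""
    return json.dumps({
        "user": user_id,
        "t": time.time(),
        "gyro": {"x": gyro[0], "y": gyro[1], "z": gyro[2]},
        "accel": {"x": accel[0], "y": accel[1], "z": accel[2]},
        "touch": {"x": touch[0], "y": touch[1]} if touch else None,
    })
```

The gaming server would decode such messages and feed the tilt and touch data to the gaming rules engine 451 for the corresponding game.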
[0099] Preferably, Unity 5 is used with a frame rate of 30 frames per second, such that the system 400 is updated every 33 msec. However, the frame rate is limited by the computing power of the control device 300 and display device 350, and we anticipate higher frame rates in the future.
[0100] Gesture Sensing
[0103] The gesture recognition and analysis for the basketball game embodiment is performed as follows:
[0104] 1) Measure and store input gyroscope data via the call Input.gyro.rotationRateUnbiased, for the gesture
[0105] 2) Measure the input acceleration data via the call Input.acceleration, for the gesture
[0106] 3) Check if the gyroscope measurement meets a minimum instantaneous rotation threshold (3.5 rad/sec in the basketball embodiment); if so, begin the gesture co-routine.
[0107] 4) The gesture co-routine performs its own task every frame as follows:
[0108] (a) Store a new variable for the peak rotation velocity during this shot gesture. Initially populate this variable with the instantaneous sensor measurement for the frame on which the co-routine was started.
[0109] (b) For the duration of this co-routine, preferably 250 ms in the basketball embodiment, compare instantaneous gyroscope measurements to the stored peak values, replacing peak values with the instantaneous measurements if they are larger.
[0110] (c) Store a new variable for the peak x-axis acceleration during this shot gesture. Initially populate this variable with the instantaneous sensor measurement for the frame on which the co-routine was started.
[0111] (d) For the duration of this co-routine, preferably 250 ms in the basketball embodiment, compare the absolute value of the instantaneous accelerometer measurements to the absolute value of the stored peak values, replacing peak values with the instantaneous measurements if they are larger.
[0112] (e) The co-routine is finished once the maxima are located; the stored peak values are final and are passed to the PlayerShotCalculator class to create a target position and trajectory for the shot. The stored peak gyro value is used to adjust the target position forward/back, and to increase/decrease the ball flight time. The stored peak x-axis acceleration is used to adjust the target position left/right.
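The gesture recognition and analysis steps above can be sketched as follows (in Python rather than Unity C#; the threshold follows the basketball embodiment, while the window length in frames is an assumption derived from 250 ms at 30 frames per second):

```python
THRESHOLD = 3.5      # rad/sec minimum instantaneous rotation (basketball embodiment)
WINDOW_FRAMES = 8    # ~250 ms at 30 fps (assumed frame count)

def detect_shot_gesture(gyro_samples, accel_x_samples):
    """Peak detection over the gesture window, following steps 1-4 above.

    Scans per-frame gyroscope samples until one crosses the threshold
    (step 3), then takes the peak rotation velocity (step 4(b)) and the
    peak-by-absolute-value x-axis acceleration (step 4(d)) over the
    following window. Returns (peak_rotation, peak_accel_x), or None if
    no gesture is detected."""
    for i, rate in enumerate(gyro_samples):
        if rate >= THRESHOLD:
            window = slice(i, i + WINDOW_FRAMES)
            peak_rot = max(gyro_samples[window])
            peak_acc = max(accel_x_samples[window], key=abs)
            return peak_rot, peak_acc
    return None
```

The returned peaks correspond to the values a PlayerShotCalculator-style class would consume: the gyro peak for forward/back distance and flight time, the acceleration peak for left/right aim.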
[0113] Touch Sensing
[0114] Touch input to the control device is straightforward and preferably uses the Unity APIs Input.GetMouseButtonDown, Input.GetMouseButton and Input.mousePosition: GetMouseButtonDown returns true only on the frame the user first pressed the mouse button or touched the screen, GetMouseButton returns true every frame while the button is held down or a touch remains on the screen, and Input.mousePosition returns the pixel coordinates of a touch or mouse position.
[0115] To capture touch movement, every frame we check if the user begins a touch with GetMouseButtonDown(0). If so, we store the touch position with Input.mousePosition. We then check if the user continues to touch the screen with GetMouseButton(0), and compare the current touch position with the touch position stored when the touch first began. If the user is no longer touching the screen, we reset the relevant values to 0. The advantage of this method is a virtual joystick that is always centered where the user first touches the screen. If the user stops touching the screen, the joystick will be re-centered wherever the user begins touching again.
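The always-recentered virtual joystick described above can be sketched as follows (in Python rather than Unity C#; the class and method names are illustrative stand-ins for the Unity API calls named above):

```python
class VirtualJoystick:
    """Invisible joystick centered where the user first touches the screen."""

    def __init__(self):
        self.origin = None  # no active touch

    def touch_began(self, x, y):
        """Analogue of GetMouseButtonDown(0): store the initial touch position."""
        self.origin = (x, y)

    def touch_moved(self, x, y):
        """Analogue of GetMouseButton(0): offset of the current touch from the
        stored origin, used to drive avatar movement."""
        if self.origin is None:
            return (0.0, 0.0)
        return (x - self.origin[0], y - self.origin[1])

    def touch_ended(self):
        """Touch lifted: reset the relevant values so the next touch re-centers."""
        self.origin = None
```

Because the origin is re-stored on every new touch, no joystick graphic need be rendered, matching the no-virtual-joystick aspect described in paragraph [0085].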
[0116] Animation Database
[0117] The database 145 of the system 400 preferably comprises various graphical and animation files. Preferably, animations are encoded in the FBX (Filmbox) format and encode texture, mesh, bone, and animation data. The animations are captured from human movements via a motion capture (MOCAP) studio. Representative MOCAP systems include Vicon, Qualisys, Xsens, OptiTrack, and iPi.
[0118] The method to capture, clean and import MOCAP FBX files into a graphic engine, such as Unity 5, is well known to those skilled in the art. Furthermore, the method of animation control via a blended logic tree is also well known to those skilled in the art. The inventive method disclosed in the preferred embodiment herein, however, is to use multiple sensor 100 inputs to control the animations 145, wherein the input control simultaneously includes both touch and gesture.
[0119] The illustrative embodiments of the disclosed method do not, however, require Unity. As an illustrative example, on Android devices, access to the gyroscope is obtained with SensorManager's getDefaultSensor(Sensor.TYPE_GYROSCOPE) in the SDK. Touches are accessed by the MainActivity by overriding the onTouchEvent(MotionEvent event) method, and touches are accessed by a view by registering a View.OnTouchListener with the view's setOnTouchListener(). Hence the platforms (iOS/Android), SDK, calls, and graphics engine are non-limiting to the method disclosed herein.
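Whatever SDK supplies the gyroscope samples (SensorManager on Android, CMMotionManager on iOS), the tilt-gesture trigger reduces to the same platform-neutral check: a game event fires when the angular rotation velocity crosses a threshold. The TiltTrigger class and the threshold value below are illustrative assumptions, not from the patent.

```java
// Platform-neutral sketch of the tilt-gesture trigger: the game event
// (throw, kick, shoot, ...) fires on the sample whose absolute angular
// velocity exceeds a per-game tuned threshold.
class TiltTrigger {
    private final float threshold;   // rad/s; tuned per game (assumed units)

    TiltTrigger(float threshold) { this.threshold = threshold; }

    // Returns true when this gyroscope sample should trigger the game event.
    boolean onGyroSample(float angularVelocity) {
        return Math.abs(angularVelocity) > threshold;
    }
}
```

The platform-specific sensor callback would simply feed each sample to onGyroSample and dispatch the game event on a true result.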
[0120] Gaming Platform
[0121] For the cloud-based game platform 500 embodiment, we implement the method 250 as a native application 306 for both Apple iOS and Android control devices 300. Data capture on an Apple device is enabled via the Apple iOS CMMotionManager object to capture device motion data: attitude, accelerometer, and gravity. We use the gravity property of the CMAcceleration subclass of the CMDeviceMotion object to capture the gravity sensor data. We use the attitude property of the CMAttitude subclass of the CMDeviceMotion object to capture the attitude sensor data. We call the startDeviceMotionUpdatesToQueue:withHandler: method of the CMMotionManager object to begin the data capture. Data is captured at intervals of 1/100th of a second; we set the data capture interval using the deviceMotionUpdateInterval property.
[0122] In a preferred embodiment 500, the gaming engine 450 is implemented using Amazon Web Services, and the web-enabled display 350 is implemented for all major commercially available compatible web browsers (e.g., Firefox and Safari). Preferably, we use the Unity 5 graphics engine called from the application 306 and, in an embodiment, install Unity 3D 5 in an appropriate HTML 5.0 web page of the display device 350. In an alternate preferred embodiment, the Unity 5 graphics engine is compiled as a stand-alone native application and downloaded to the display device, wherein the application has the ability to connect to the internet via the WebSocket protocol and receive input data from the control device 300 via the gaming server 450.
[0123] We communicate data in the platform 500 using web socket connections. The control device 300 uses the WebSocket API to send data to the gaming server 450 and to the browser 350; the Unity 3D graphics engine is installed on both the control device 300 and the web-enabled display 350. A web socket connection with the browser is persistent for the duration of a played game.
[0124] We use the WebSocket API to receive data from the control device 300 and communicate with the Unity game engines. As an example, when UnityAndroid completely loads, it sends a callback to our native app, gameLoadedOnDevice(). In the UnityWeb case, it sends a socket callback to a native browser app. The native browser app sends the details of the play result back to UnityWeb by calling unity.sendMessage(unity function). To replicate the device's behavior on the web-enabled display 350, UnityAndroid or UnityiOS does all the socket communication with the server via the native app only. Appropriate methods that handle the socket calls are defined in the native app 306; Unity simply calls the respective methods whenever needed. The native app also listens for the responses to network calls and communicates these data back to Unity via unity.sendMessage(unity function).
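The message flow above relies on relaying small event payloads over the socket. A minimal sketch of such a message envelope follows; the wire format (a keyed string) and the SocketMessage class are invented for illustration, since the patent does not specify a serialization.

```java
// Hedged sketch of a message envelope relayed over the web socket: the
// native app encodes a trigger event for the socket send, and the
// display-side code decodes it before handing it to Unity.
class SocketMessage {
    final String event;   // e.g. "shot" (hypothetical event name)
    final float value;    // e.g. a peak gyro magnitude

    SocketMessage(String event, float value) {
        this.event = event;
        this.value = value;
    }

    // Serialize for the socket send (illustrative format).
    String encode() { return event + ":" + value; }

    // Parse a received message on the display side.
    static SocketMessage decode(String wire) {
        int i = wire.indexOf(':');
        return new SocketMessage(wire.substring(0, i),
                                 Float.parseFloat(wire.substring(i + 1)));
    }
}
```

In the described architecture, the decoded event would then be forwarded to the graphics engine via unity.sendMessage.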
[0125] The method 400 algorithm keeps running in the background once a user 010 starts UnityAndroid or UnityiOS. Whenever the method 400 detects sensor 100 input, and subject to the logic 130, it sends the trigger event 140 to UnityAndroid or UnityiOS and a web socket call to UnityWeb. It is to be understood that the software and system calls disclosed in this preferred embodiment will change in the future, and the embodiment is therefore non-limiting.
[0126] For clarity in the basketball example, we illustrated the method using a single control device 300 with integrated sensors 100; however, this example is non-limiting.
[0127] In-Stadium Game Embodiment
[0128] U.S. Published Patent Application 2016/0260319 to Jeffery et al., entitled Method and System for a Control Device to Connect and Control a Display Device, the contents of which are incorporated herein by reference in their entirety, has previously disclosed multiple users playing a sports game simultaneously on a digital board in a stadium.
[0129] These screens were originally made of 16 or more small flood-beam CRTs (cathode ray tubes), each providing 2 to 16 pixels. The newest model JumboTron and Jumbovision screens are now large-scale LED displays. Both the newer and older versions enable multiple device connections and can be connected with various audio and video formats. These systems can display almost any type of format, connected via any of the following: VGA, DVI, HDMI, and coaxial, with USB connectivity on the latest systems. That is, JumboTrons can project computers, smartphones, Blu-ray players, and other digital devices. Hence, it is straightforward to display a game output 200 of the invention, such as a web page in an embodiment, on a JumboTron, and create a display device 350 for thousands of simultaneous users. However, it is understood that the example is illustrative and non-limiting.
[0130] The mode of play for the embodiment illustrated in
[0131] Illustrative Sports Game Embodiments
[0132] In the following description we illustrate a multitude of possible variations of the present invention applied to video and mobile games such as football, bowling, tennis, baseball, hockey, soccer, fishing, and a third-person fighting game. These examples are understood to be illustrative and non-limiting. For brevity, we disclose the embodiments via the respective touch and gesture inputs and the corresponding avatar 015 game output 200 for each example, since these sensor 100 inputs and the method 400 enable the game output 200. Where applicable, we point out unique features of the invention illustrated by the specific embodiments.
[0143] It is to be understood that many additional games may be derived from the touch and gesture control method illustrated in
[0144] Virtual Reality Game Control
[0145] The methods and systems of the disclosed invention are also applicable to virtual reality (VR) game applications. A representative VR headset is the Samsung Gear VR, comprising mechanical lenses, a track pad, and two proprietary buttons (collectively, sensors 100). An Android mobile phone 300 is clipped into the Gear VR headset and provides the display 308 and processor 303, illustrated in
[0146] The Oculus Rift (Oculus VR) is an illustrative VR system that is powered by an external personal computer (PC). The Oculus includes a headset with an architecture similar to the control device 300, with a communication interface 301, an OLED panel for each eye as display 308, a RAM memory controller 305, and a power supply 307. The communication interface 301 controls various inputs, including a headphone jack, an XBOX One controller, motion sensor 101 inputs, HDMI, USB 3.0 and USB 2.0, and 3D mapped-space input via a constellation camera system. The OLED panel for each eye is HD, or optionally UHD, and uses a low-persistence display technology that renders an image for 2 milliseconds of each frame. The RAM memory controller 305 renders 3D audio with 6DOF input (3-axis rotational tracking plus 3-axis positional tracking) through a USB-connected IR LED sensor, which tracks via the constellation method. The power supply 307 is enabled via a USB connection to the PC connected to the constellation cameras. The PC required to operate the Oculus has the following minimum specifications: a CPU equivalent to an Intel Core i5-4590, at least 8 GB of RAM, at least an AMD Radeon R9 290 or Nvidia GeForce GTX 970 graphics card, an HDMI 1.3 output, three USB 3.0 ports, and one USB 2.0 port, with Windows 8 or newer. The Oculus supports two additional external sensor devices 310, called Oculus Touch, one for each hand, each with two buttons, a touch-sensitive joystick, and motion sensors. As illustrative prior art, shooting in an Oculus game is typically controlled by a button press on the external sensor device 310.
[0148] In the illustrative embodiment of
[0149] While this invention has been described in conjunction with the various exemplary embodiments outlined above, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art. Accordingly, the exemplary embodiments of the invention, as set forth above, are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of the invention.