Position-dependent gaming, 3-D controller, and handheld as a remote
10981055 · 2021-04-20
Assignee
Inventors
- Jeffrey R. Stafford (Redwood City, CA)
- Yunpeng Zhu (Foster City, CA, US)
- Steven Osman (San Francisco, CA, US)
CPC classification (all codes in Section A, Human Necessities)
- A63F13/92
- A63F13/5258
- A63F13/213
- A63F13/65
- A63F13/5255
- A63F2300/105
- A63F13/216
- A63F2300/69
- A63F13/23
- A63F2300/301
- A63F13/26
- A63F13/211
- A63F2300/5553
- A63F2300/6607
- A63F13/426
International classification
- A63F13/426 (Human Necessities)
- G06F3/00 (Physics)
- A63F13/5258 (Human Necessities)
- A63F13/213 (Human Necessities)
- G06T19/00 (Physics)
- A63F13/26 (Human Necessities)
Abstract
Methods and systems for using a position of a mobile device with an integrated display as an input to a video game or other presentation are presented. Embodiments include rendering an avatar on a mobile device such that it appears to overlay a competing user in the real world. Using the mobile device's position, view direction, and the other user's mobile device position, an avatar (or vehicle, etc.) is depicted at an apparently inertially stabilized location of the other user's mobile device or body. Some embodiments may estimate the other user's head and body positions and angles and reflect them in the avatar's gestures.
Claims
1. A system for augmented video, the system comprising: one or more processors; and one or more memories storing computer-readable instructions that, upon execution by the one or more processors, cause the system to: determine a distance and a direction between a first mobile device and a first display, wherein the first mobile device comprises a second display that is different from the first display; determine a first position coordinate of the first mobile device using the distance and the direction; determine a position for a displayable object associated with a second mobile device, wherein the position is determined based on the first position coordinate of the first mobile device and a second position coordinate of the second mobile device; and cause the first mobile device to display the displayable object in the position on the second display.
2. The system of claim 1, wherein the first mobile device is configured to control the first display.
3. The system of claim 1, wherein the first mobile device and the second mobile device are video game controllers, and wherein the first display presents first content of a virtual game environment, wherein the second display presents second content of the virtual game environment, and wherein the displayable object comprises an avatar.
4. The system of claim 1, wherein the distance and the direction are determined based on a first reference point on the first mobile device and a first origin on the first display, wherein the first position coordinate of the first mobile device comprises first coordinates of the first reference point defined relative to the first origin.
5. The system of claim 4, wherein the second position coordinate of the second mobile device comprises second coordinates of a second reference point on the second mobile device defined relative to a second origin of a third display.
6. The system of claim 5, wherein determining the position for the displayable object comprises determining a relative distance and a relative direction between the first mobile device and the second mobile device based on the first coordinates, the second coordinates, the first origin, and the second origin.
7. The system of claim 1, wherein the displayable object is displayed as a 180° rotated image through a center of the second display.
8. The system of claim 1, wherein the first mobile device comprises glasses having an integrated display in at least one lens of the glasses.
9. The system of claim 1, wherein the distance is determined based at least in part on a marker on the first display, and wherein the direction is determined based at least in part on a video camera of the first mobile device.
10. The system of claim 1, wherein the distance and the direction are determined based at least in part on an off-board camera system of the system, wherein the off-board camera system is fixed to the first display.
11. A method for augmenting video, the method implemented by a system and comprising: determining a distance and a direction between a first mobile device and a first display, wherein the first mobile device comprises a second display that is different from the first display; determining a first position coordinate of the first mobile device using the distance and the direction; determining a position for a displayable object associated with a second mobile device, wherein the position is determined based on the first position coordinate of the first mobile device and a second position coordinate of the second mobile device; and causing the first mobile device to display the displayable object in the position on the second display.
12. The method of claim 11, wherein the first mobile device and the second mobile device are video game controllers, and wherein the first display presents first content of a virtual game environment, wherein the second display presents second content of the virtual game environment, and wherein the displayable object comprises an avatar.
13. The method of claim 11, wherein the distance and the direction are determined based on a first reference point on the first mobile device and a first origin on the first display, wherein the first position coordinate of the first mobile device comprises first coordinates of the first reference point defined relative to the first origin.
14. The method of claim 13, wherein the second position coordinate of the second mobile device comprises second coordinates of a second reference point on the second mobile device defined relative to a second origin of a third display.
15. The method of claim 14, wherein determining the position for the displayable object comprises determining a relative distance and a relative direction between the first mobile device and the second mobile device based on the first coordinates, the second coordinates, the first origin, and the second origin.
16. One or more non-transitory computer-readable storage media storing instructions that, upon execution by one or more processors of a system, cause the system to: determine a distance and a direction between a first mobile device and a first display, wherein the first mobile device comprises a second display that is different from the first display; determine a first position coordinate of the first mobile device using the distance and the direction; determine a position for a displayable object associated with a second mobile device, wherein the position is determined based on the first position coordinate of the first mobile device and a second position coordinate of the second mobile device; and cause the first mobile device to display the displayable object in the position on the second display.
17. The one or more non-transitory computer-readable storage media of claim 16, wherein the first mobile device and the second mobile device are video game controllers, and wherein the first display presents first content of a virtual game environment, wherein the second display presents second content of the virtual game environment, and wherein the displayable object comprises an avatar.
18. The one or more non-transitory computer-readable storage media of claim 16, wherein the distance and the direction are determined based on a first reference point on the first mobile device and a first origin on the first display, wherein the first position coordinate of the first mobile device comprises first coordinates of the first reference point defined relative to the first origin.
19. The one or more non-transitory computer-readable storage media of claim 18, wherein the second position coordinate of the second mobile device comprises second coordinates of a second reference point on the second mobile device defined relative to a second origin of a third display.
20. The one or more non-transitory computer-readable storage media of claim 19, wherein determining the position for the displayable object comprises determining a relative distance and a relative direction between the first mobile device and the second mobile device based on the first coordinates, the second coordinates, the first origin, and the second origin.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) A further understanding of the nature and advantages of the present invention may be realized by reference to the following drawings. In the appended figures, similar components or features may have the same reference label.
(14) The figures will now be used to illustrate different embodiments in accordance with the invention. The figures are specific examples of embodiments and should not be interpreted as limiting embodiments, but rather exemplary forms and procedures.
DETAILED DESCRIPTION
(15) Generally, methods and systems are described for multi-player video games and other interactive video presentations in which augmented video is presented on a user's mobile device display based on the relative position of another user. A user can hold up the device and see an avatar, vehicle, game marker, target crosshairs, or other object in the place where the other user is sitting. In some embodiments, the other user's avatar on the display can move, look around, etc. in the same manner as the other user's physical movements. For example, if the other user turns toward the first user, the display shows the avatar turning toward the first user.
(16) In some embodiments, the users can be located in different rooms across town, but their avatars are rendered on their respective mobile device's screens as if their avatars were seated next to each other in the same room. A common reference point for each of the players can be the center of his or her fixed display.
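As a minimal sketch of the idea in this paragraph (function and variable names are illustrative, not from the patent): if each player's position is expressed relative to the center of his or her own fixed display, subtracting the two coordinate pairs yields the offset at which the remote player's avatar should appear to the local player, even when the players are in different rooms.

```python
import math

def relative_offset(local_xy, remote_xy):
    """Offset of the remote player's device from the local player's device.

    Each position is assumed to be expressed relative to that player's own
    fixed-display center, which serves as the shared reference point.
    Returns (distance, bearing_deg), with bearing 0 degrees meaning
    straight toward the display.
    """
    dx = remote_xy[0] - local_xy[0]
    dy = remote_xy[1] - local_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy))
    return distance, bearing
```

Because both origins are display centers, the offset computed in one room maps directly onto the other room, which is what lets the avatars appear "seated next to each other."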
(17) This description provides examples only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the ensuing description of the embodiments will provide those skilled in the art with an enabling description for implementing embodiments of the invention. Various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention.
(18) Thus, various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, it should be appreciated that, in alternative embodiments, the methods may be performed in an order different from that described, and that various steps may be added, omitted, or combined. Also, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner.
(19) It should also be appreciated that the following systems, methods, and software may individually or collectively be components of a larger system, wherein other procedures may take precedence over or otherwise modify their application. Also, a number of steps may be required before, after, or concurrently with the following embodiments.
(21) A “coordinate” is any of a set of numbers, characters, or symbols used in specifying the location of a point on a one-dimensional line, on a two-dimensional surface, or in three-dimensional space. Coordinates may be orthogonal, such as Cartesian, polar and/or cylindrical, or spherical coordinates, or non-orthogonal, such as those describing a location on the surface of a sphere.
(22) A “view direction” or “look angle” is a direction in space toward which a user's face is pointed or a corresponding user's mobile device is pointed. A view direction can include azimuth and elevation angles relative to the user. A view direction can include a bearing direction in relation to a fixed point.
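A view direction given as azimuth and elevation angles can be converted to a unit vector for rendering purposes. The sketch below assumes a right-handed convention with azimuth measured clockwise from "straight toward the display" and elevation above the horizontal; the names and axis choices are illustrative, not from the patent.

```python
import math

def view_vector(azimuth_deg, elevation_deg):
    """Unit vector for a view direction.

    azimuth_deg: bearing from the forward reference direction, clockwise.
    elevation_deg: angle above the horizontal plane.
    Returns (x, y, z) = (right, up, forward).
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = math.cos(el) * math.sin(az)  # right component
    y = math.sin(el)                 # up component
    z = math.cos(el) * math.cos(az)  # forward component
    return (x, y, z)
```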
(23) A mobile device can include a handheld device, such as a PlayStation® Portable; a user-worn device, such as glasses with an integrated display; or other electronic devices.
(24) Using the coordinates representing the position and view direction, the mobile display can be used as a secondary display to ‘look around’ the virtual environment. For example, a player driving a virtual tank can slew his mobile device to the left to see enemy troops to the left side outside of the view of the fixed display. As another example, the player can use his mobile device display to zoom into the horizon of the display. The mobile device can act as virtual binoculars to better resolve figures in the distance that might be a threat.
(27) Although polar/cylindrical coordinates are used here in the examples, other coordinate systems can be used, such as Cartesian and spherical coordinate systems.
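For illustration, the conversions between the cylindrical coordinates used in the examples and Cartesian coordinates are straightforward (function names here are hypothetical):

```python
import math

def cylindrical_to_cartesian(r, theta_deg, z):
    """r: horizontal range from the origin (e.g., the display center);
    theta_deg: azimuth angle; z: height above the origin's plane."""
    t = math.radians(theta_deg)
    return (r * math.cos(t), r * math.sin(t), z)

def cartesian_to_cylindrical(x, y, z):
    """Inverse conversion; azimuth is returned in degrees."""
    return (math.hypot(x, y), math.degrees(math.atan2(y, x)), z)
```

Either representation carries the same information, so a system can track positions in whichever form its sensors report and convert as needed.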
(29) If mobile device 108 is slewed to the right, then avatar 414 disappears off the left side of the display. If mobile device 108 is slewed to the left, then avatar 414 disappears off the right side of the display. In some embodiments, it can appear as if the embedded display is transparent and the view of the room in the background is the same, except for the other player being overlaid with graphics depicting an avatar.
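A sketch of this slewing behavior (all names and the linear-projection model are illustrative assumptions, not from the patent): the avatar's horizontal screen position follows from the difference between the device's yaw and the bearing toward the other player, and the avatar leaves the screen once that difference exceeds half the field of view.

```python
def avatar_screen_x(device_yaw_deg, avatar_bearing_deg,
                    fov_deg=60.0, screen_w=1920):
    """Horizontal pixel position at which to draw the avatar so the display
    behaves like a transparent window. Slewing the device right (increasing
    yaw) moves the avatar left, and vice versa. Returns None when the avatar
    is outside the device's field of view."""
    # signed angular difference, wrapped into (-180, 180]
    rel = (avatar_bearing_deg - device_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2.0:
        return None  # avatar has slid off the edge of the display
    return screen_w / 2.0 + (rel / (fov_deg / 2.0)) * (screen_w / 2.0)
```

With the device pointed directly at the other player the avatar is centered; yawing 10 degrees to the right shifts the avatar a third of the way toward the left edge under the default 60-degree field of view.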
(33) This mirrored movement can be useful to simulate games in which players play across from one another, such as tennis, handball, chess, etc. This can be used by players in the same room with the same, central fixed display or by players in different rooms with their own displays.
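The 180° rotation through the display center mentioned in claim 7 and used for these across-the-net games can be sketched as follows (names are illustrative):

```python
def mirror_through_center(x, y, width, height):
    """Rotate a point 180 degrees about the center of a display of the given
    pixel dimensions, so an opposing player's movements appear mirrored, as
    when two players face each other across a tennis court or chess board."""
    return (width - x, height - y)
```

Note that the display center maps to itself, and a point near one corner maps to the diagonally opposite corner.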
(36) Camera 920 can also be enabled to track faces as is known in the art. Facial tracking technology can directly determine the position and view direction of a player's head, eyes, nose, etc. A camera on mobile device 908 can also be used to track the player's head.
(37) Video game console 922 connects to camera 920 and fixed display 102. Video game console 922 connects wirelessly, through wireless port 924, with mobile device 908 over wireless link 926. Wireless link 926 can be radio frequency, infrared, etc. The camera may output the position of tracked objects to console 922, or the camera may output raw video to console 922, in which case console 922 processes the raw video to determine the position, velocity, etc. of tracked objects.
(38) Console 922 can send the coordinates of the tracked objects to mobile device 908 along with the determined view direction of mobile device 908. Mobile device 908 can then use the coordinates and view direction to render an avatar in the correct position on its screen.
(39) In some embodiments, wireless link 926 can be used to send remote control-like commands to the video display. For example, a cellular phone can be used to turn up or down the volume on a television.
(41) The position of mobile device 1008 can be used as an input to a video game. For example, a user can pace around the living room floor, marking the locations of her battleships for a virtual board game of Battleship®. In another example, a virtual game of ‘Marco Polo’ can be played in which players attempt to guess the locations of other players without the use of their eyes. A player could move around his or her T.V. room in order to simulate a virtual position on a field or in a pool.
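A crude hit test for such position-based play might look like the following sketch (the function name and the choice of a fixed radius are assumptions for illustration):

```python
import math

def within_tag_range(guess_xy, player_xy, radius=1.0):
    """True when a guessed position falls within `radius` (e.g., meters) of
    another player's tracked position -- a simple 'Marco Polo' style check."""
    return math.dist(guess_xy, player_xy) <= radius
```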
(42) In other embodiments, the mobile device can automatically determine its position and view direction using a Global Positioning System (GPS) receiver, accelerometer-based inertial system, mechanical or solid-state gyroscope, electronic magnetic compass, radio frequency triangulation, and/or other methods known in the art.
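When both an inertial source (fast but drifting) and a magnetic compass (absolute but noisy) are available, their heading estimates are commonly blended with a complementary filter. The sketch below is a generic illustration of that technique, not an implementation from the patent; the blending weight is an arbitrary illustrative value.

```python
def fuse_heading(gyro_heading_deg, compass_heading_deg, alpha=0.98):
    """Complementary filter: trust the gyro-integrated heading for
    short-term changes and nudge it toward the compass reading to
    correct long-term drift. alpha near 1.0 favors the gyro."""
    # signed difference wrapped into (-180, 180] so the correction
    # takes the short way around the circle
    diff = (compass_heading_deg - gyro_heading_deg + 180.0) % 360.0 - 180.0
    return (gyro_heading_deg + (1.0 - alpha) * diff) % 360.0
```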
(45) A graphics subsystem 1240 is further connected with data bus 1235 and the components of the computer system 1200. The graphics subsystem 1240 includes a graphics processing unit (GPU) 1245 and graphics memory 1250. Graphics memory 1250 includes a display memory (e.g., a frame buffer) used for storing pixel data for each pixel of an output image. Graphics memory 1250 can be integrated in the same device as GPU 1245, connected as a separate device with GPU 1245, and/or implemented within memory 1210. Pixel data can be provided to graphics memory 1250 directly from the CPU 1205. Alternatively, CPU 1205 provides the GPU 1245 with data and/or instructions defining the desired output images, from which the GPU 1245 generates the pixel data of one or more output images. The data and/or instructions defining the desired output images can be stored in memory 1210 and/or graphics memory 1250. In an embodiment, the GPU 1245 includes 3D rendering capabilities for generating pixel data for output images from instructions and data defining the geometry, lighting, shading, texturing, motion, and/or camera parameters for a scene. The GPU 1245 can further include one or more programmable execution units capable of executing shader programs.
(46) The graphics subsystem 1240 periodically outputs pixel data for an image from graphics memory 1250 to be displayed on display device 1255. Display device 1255 can be any device capable of displaying visual information in response to a signal from the computer system 1200, including CRT, LCD, plasma, and OLED displays. Computer system 1200 can provide the display device 1255 with an analog or digital signal.
(47) In accordance with various embodiments, CPU 1205 is one or more general-purpose microprocessors having one or more processing cores. Further embodiments can be implemented using one or more CPUs with microprocessor architectures specifically adapted for highly parallel and computationally intensive applications, such as media and interactive entertainment applications.
(48) The components of the system 108 of
(49) It should be noted that the methods, systems, and devices discussed above are intended merely to be examples. It must be stressed that various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, it should be appreciated that, in alternative embodiments, the methods may be performed in an order different from that described, and that various steps may be added, omitted, or combined. Also, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. Also, it should be emphasized that technology evolves and, thus, many of the elements are examples and should not be interpreted to limit the scope of the invention.
(50) Specific details are given in the description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the embodiments.
(51) Also, it is noted that the embodiments may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
(52) Moreover, as disclosed herein, the term “memory” or “memory unit” may represent one or more devices for storing data, including read-only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, or other computer-readable mediums for storing information. The term “computer-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, a sim card, other smart cards, and various other mediums capable of storing, containing, or carrying instructions or data.
(53) Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium such as a storage medium. Processors may perform the necessary tasks.
(54) Having described several embodiments, it will be recognized by those of skill in the art that various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the invention. For example, the above elements may merely be a component of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description should not be taken as limiting the scope of the invention.