Patent classifications
A63F13/25
USING GAZE TRACKING TO EFFECT PLAYER CHOICES IN MULTI PLAYER INTERACTIVE NARRATIVES
The present disclosure is directed to controlling outcomes in a game that includes multiple different users playing respective roles of specific virtual characters. The multiple different users may be present at a same physical location, or the different users may be located at different physical locations when movement of their eyes is tracked. Here, a user may choose one of a set of provided selections by simply looking at the chosen selection for a period of time or by looking at the chosen selection and performing an action or gesture. This functionality allows multiple different users to control actions performed by different specific characters via an online multiplayer system. Depending on what a first user looks at, that first user or a second user may be provided with a corresponding set of selections or audio/visual content via respective gaming devices operated by the first and the second user.
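The dwell-based choice mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the `Option` bounds, the 1.5-second dwell threshold, and the sample format are all assumptions.

```python
# Hypothetical sketch of dwell-time gaze selection: a choice is confirmed
# when the tracked gaze stays on one on-screen option for a minimum duration.
from dataclasses import dataclass

DWELL_SECONDS = 1.5  # assumed dwell threshold


@dataclass
class Option:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float  # screen-space bounding box of the option

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def select_by_gaze(samples, options, dwell=DWELL_SECONDS):
    """samples: iterable of (timestamp, x, y) gaze points, in time order."""
    current, start = None, None
    for t, x, y in samples:
        hit = next((o for o in options if o.contains(x, y)), None)
        if hit is not current:
            current, start = hit, t     # gaze moved: restart the dwell timer
        elif current is not None and t - start >= dwell:
            return current              # dwell threshold met: confirm selection
    return None
```

The same loop extends naturally to the "look plus gesture" variant: instead of returning on the dwell timeout, the looked-at option becomes the pending choice that a separate gesture event confirms.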
BASKETBALL ARCHITECTURE
A basketball system comprising a court system having shot locations defined by shot location lines on the play surface and shot areas defined by shot area lines on the play surface; a goal system comprising a respective backboard suspended above the play surface at each opposing farthest end of the play surface; an instrumentation system comprising sensors to detect activities within the boundary line of the court system; and a respective penalty box for each team in which players, coaches, and referees can serve a penalty assessed for an infraction.
AUDIO DEVICE CONFIGURED FOR DAISY CHAINING
A method and system for daisy chaining tournament audio controllers, where the method comprises, in a headset coupled to a first audio controller, the first audio controller being in a daisy chain of audio controllers: receiving a chat signal from a second audio controller in the daisy chain of audio controllers, receiving a microphone signal from a microphone in the headset, summing the chat signal with the microphone signal, communicating the summed signal to a third audio controller in the daisy chain, and communicating the chat signal to the headset. The microphone signal may be removed from the summed chat signal and microphone signal by adding a second microphone signal 180 degrees out of phase with the microphone signal. The chat signal may be summed with the microphone signal at an amplitude set by a user of the headset after the removal of the microphone signal.
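The summing and phase-inversion cancellation described in this abstract reduce to simple per-sample arithmetic. A minimal sketch, assuming floating-point sample lists and ignoring timing alignment between the chained controllers (function names are illustrative, not from the patent):

```python
# Hypothetical sketch of the daisy-chain mixing: the incoming chat signal is
# summed with the local microphone, and the mic contribution can later be
# removed by adding an inverted (180-degree out-of-phase) copy of the mic.

def mix_for_downstream(chat, mic):
    """Sum the chat signal with the local mic, sample by sample."""
    return [c + m for c, m in zip(chat, mic)]


def remove_mic(summed, mic):
    """Cancel the mic by adding its 180-degree out-of-phase copy (-mic)."""
    return [s + (-m) for s, m in zip(summed, mic)]


def apply_user_volume(signal, gain):
    """Scale a signal to the amplitude set by the headset user."""
    return [gain * s for s in signal]
```

In this model, adding the inverted mic copy to the sum recovers the original chat signal exactly, which the user-set gain then scales for playback.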
INFORMATION PROCESSING DEVICE, CONTROL METHOD OF INFORMATION PROCESSING DEVICE, AND PROGRAM
An information processing device obtains information regarding the position of each fingertip of a user in a real space, and determines contact between a virtual object set within a virtual space and a finger of the user. The information processing device sets the virtual object in a partly deformed state such that a part of the virtual object, the part corresponding to the position of the finger determined to be in contact with the object among the fingers of the user, is located more to a far side from a user side than the finger, and displays the virtual object having the shape set thereto as an image in the virtual space on a display device.
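The contact-and-deformation rule can be illustrated with a reduced, depth-only model: when a fingertip reaches or passes the object's surface, the touched part of the surface is displaced so it sits farther from the user than the finger. This is a hypothetical one-dimensional sketch, not the patented method; the depth convention (larger value = farther from the user) and the displacement offset are assumptions.

```python
# Hypothetical 1D-depth sketch of the contact/deformation rule: a fingertip
# at or beyond the surface depth counts as contact, and the contacted part
# of the surface is pushed just past the fingertip (to the far side).

def deform_surface(surface_depths, finger_depths, offset=0.01):
    """surface_depths[i]: object surface depth at sample i (larger = farther).
    finger_depths: {sample index: fingertip depth} for projected fingertips."""
    deformed = list(surface_depths)
    for i, fd in finger_depths.items():
        if fd >= surface_depths[i]:      # finger at or beyond surface: contact
            deformed[i] = fd + offset    # displace the part behind the finger
    return deformed
```

A full implementation would operate on a 3D mesh and deform a neighborhood around each contact point, but the per-contact decision is the same comparison.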
TERMINAL APPARATUS, CONTROL METHOD, AND CONTROL PROGRAM
A terminal apparatus may include a display unit that displays a game content positioned in a virtual space, a storage unit that stores information related to a predetermined position in the virtual space, a retrieving unit that retrieves instructions for moving including a moving direction given by a player, a calculating unit that calculates a route from a position of the game content in the virtual space to the predetermined position, a determining unit that determines whether or not the calculated route satisfies a predetermined condition related to the moving direction, and a display control unit that controls display of the game content such that the game content moves to the predetermined position along the route when it is determined that the route satisfies the predetermined condition.
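The determining unit's check can be sketched as a direction-agreement test: the content moves to the stored position only if the player's input direction roughly matches the direction of the computed route. This is a hypothetical interpretation; the abstract does not specify the condition, so the dot-product test and its threshold are assumptions.

```python
# Hypothetical sketch of the route/direction condition: accept the route to
# the predetermined position only when the player's moving direction agrees
# with the route direction (cosine similarity above an assumed threshold).
import math


def route_matches_direction(start, target, move_dir, cos_threshold=0.7):
    rx, ry = target[0] - start[0], target[1] - start[1]   # route vector
    norm_r = math.hypot(rx, ry)
    norm_m = math.hypot(*move_dir)
    if norm_r == 0 or norm_m == 0:
        return False                                      # no route or no input
    cos = (rx * move_dir[0] + ry * move_dir[1]) / (norm_r * norm_m)
    return cos >= cos_threshold
```

When the test passes, the display control unit would animate the content along the route to the predetermined position; otherwise the content follows the raw input direction.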
User Interface Menu Transitions
Techniques for better information sharing and control switching in a graphical user interface (GUI) are described. In an example, windows are added to a dynamic area of a menu based on an execution of a menu application. The menu is presented in the GUI and each window is shown in a first state to provide quick information about the corresponding application. Upon a user selection of the window, the window is presented in a second state. Based on the window being in the second state, an application module is updated to present an overlay window reproducing the window in the second state. The overlay window is presented coextensive with and over the window in the second state.
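The two window states and the coextensive overlay can be modeled as a small state transition. A minimal sketch under stated assumptions: the state names, the `Window` fields, and the default bounds are illustrative, not from the patent.

```python
# Hypothetical sketch of the menu-window transition: a window starts in a
# compact first state showing quick info; selecting it moves it to a second
# state, and an overlay window is created with identical bounds so it is
# presented coextensive with and over the selected window.
from dataclasses import dataclass


@dataclass
class Window:
    app: str
    state: str = "glance"            # first state: quick information
    bounds: tuple = (0, 0, 320, 180)


def select_window(win):
    win.state = "expanded"           # second state, after user selection
    # the application module's overlay reproduces the window in-place
    return Window(app=win.app, state="overlay", bounds=win.bounds)
```

Because the overlay shares the selected window's exact bounds, control can switch to the application module without any visible change to the user.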
METHODS AND SYSTEMS FOR INTERACTIVE GAMING PLATFORM SCENE GENERATION UTILIZING CAPTURED VISUAL DATA AND ARTIFICIAL INTELLIGENCE-GENERATED ENVIRONMENT
Methods and systems are provided for interactive gaming platform scene generation utilizing captured visual data and artificial intelligence-generated environment, with the gaming platform including at least a user device that has or is coupled to a display, with the gaming platform configured to obtain recorded footage associated with an environment pertinent to a game playable via the user device, to generate, based on the recorded footage, one or more video frames for use during playing of the game via the user device, and to display the one or more video frames via the display during the playing of the game via the user device. The recorded footage may be processed using artificial intelligence, and the one or more video frames may be generated using the artificial intelligence and based on the processing of the recorded footage.
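The footage-to-frames pipeline above can be outlined as two stages: AI analysis of the recorded footage, then frame generation from the analysis. The model class below is a stand-in used to make the sketch runnable, not a real AI API or the patented system.

```python
# Hypothetical sketch of the scene-generation pipeline: recorded footage is
# processed by an AI component whose output drives generation of the video
# frames displayed during play. StubSceneModel is a placeholder model.
class StubSceneModel:
    def process(self, footage):
        # stand-in analysis: one feature tag per recorded clip
        return [f"feat:{clip}" for clip in footage]

    def generate_frames(self, features):
        # stand-in generation: one frame label per feature
        return [f"frame<{f}>" for f in features]


def generate_scene_frames(recorded_footage, model):
    features = model.process(recorded_footage)   # AI processing of the footage
    return model.generate_frames(features)       # frames shown during the game
```

A real system would substitute the stub with an actual vision/generation model and stream the resulting frames to the user device's display.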