Patent classifications
A63F13/323
Method of automating application program operation in a visual display ecosystem
A method of automating application program operation in a visual display ecosystem is implemented with use of a smart device communicating with the visual display ecosystem through a short-range wireless bidirectional communication link. The method entails user creation of a display spatial data template in the visual display ecosystem and storage of the display spatial data template in the smart device located outside the visual display ecosystem. The user creates the display spatial data template in real time during application program operation by manipulating an input device connected to a command processing device on which the application program is operating. The user can retrieve the stored display spatial data template during application program operation at a later time. The system is advantageous in that the storage and retrieval of the display spatial data template make minimal use of ecosystem resources, including processing power, bandwidth capacity, and display screen real estate.
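The store-and-retrieve workflow described above might be sketched as follows. This is a minimal illustration, not the patented implementation: the class name, the use of JSON serialization, and the idea of a template as a mapping from element names to screen coordinates are all assumptions introduced here for clarity.

```python
import json


class SmartDeviceStore:
    """Illustrative off-ecosystem store for display spatial data templates.

    The templates live on the smart device, so the visual display
    ecosystem spends no processing power or screen space holding them.
    """

    def __init__(self):
        self._templates = {}

    def save(self, name, template):
        # Serialize so only a compact payload crosses the
        # short-range wireless link.
        self._templates[name] = json.dumps(template)

    def load(self, name):
        # Retrieved later, during a subsequent application session.
        return json.loads(self._templates[name])


# The user records screen positions in real time while operating the
# application, then names and stores the resulting template.
store = SmartDeviceStore()
store.save("raid_layout", {"button_a": (120, 340), "menu": (40, 20)})
restored = store.load("raid_layout")
```

Note that JSON round-tripping turns coordinate tuples into lists; a real implementation would pick a serialization format suited to the wireless link's bandwidth.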
Synchronized augmented reality gameplay across multiple gaming environments
Various embodiments of the invention disclosed herein provide techniques for implementing augmented reality (AR) gameplay across multiple AR gaming environments. A synchronized AR gaming application executing on an AR gaming console detects that a first gaming console that is executing an AR gaming application has exited a first AR gaming environment and entered a second AR gaming environment. The synchronized AR gaming application connects to a communications network associated with the second AR gaming environment. The synchronized AR gaming application detects, via the communications network, a sensor associated with the second AR gaming environment. The synchronized AR gaming application alters execution of the AR gaming application based at least in part on sensor data received via the sensor to enable the AR gaming application to continue executing as the first gaming console exits the first AR gaming environment and enters the second AR gaming environment.
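The environment hand-off sequence in this abstract (detect exit/entry, connect to the new network, discover its sensors, alter execution) can be sketched roughly as below. Every name here is illustrative, and the network/sensor topology is modeled as a simple dictionary rather than a real communications stack.

```python
class SyncedARApp:
    """Illustrative synchronized AR gaming application."""

    def __init__(self, networks):
        # Assumed topology: environment name -> sensors reachable on
        # that environment's communications network.
        self.networks = networks
        self.active_sensors = []

    def on_environment_change(self, new_env):
        # Connect to the network associated with the entered
        # environment and detect its sensors.
        self.active_sensors = self.networks.get(new_env, [])
        return self.active_sensors

    def alter_execution(self, sensor_data):
        # Adjust game state based on sensor data from the new
        # environment so gameplay continues across the transition.
        return {"env_scale": sensor_data.get("room_scale", 1.0)}


app = SyncedARApp({"living_room": ["depth_cam"], "garage": ["lidar"]})
sensors = app.on_environment_change("living_room")
state = app.alter_execution({"room_scale": 2.0})
```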
Signal activated liquid release for virtual, mixed and augmented reality
The present application describes liquid release of fluid, preferably water, onto the head or heads of a player or players of an electronic interactive game, which may incorporate virtual reality, mixed reality or augmented reality. In most embodiments, the gameset includes headgear that includes a liquid reservoir for dispensing a liquid on a player. The gameset may also include an electronic display to visually present imagery to the player, and a transmitter that transmits a signal to a receiver coupled to the headgear to dispense liquid from the liquid reservoir onto a player.
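The signal-to-dispense control loop might look like the following sketch. The class, the millilitre accounting, and the clamping behavior are assumptions made here for illustration; the patent describes the hardware, not this logic.

```python
class HeadgearReceiver:
    """Illustrative receiver coupled to the headgear's liquid reservoir."""

    def __init__(self, reservoir_ml):
        self.reservoir_ml = reservoir_ml

    def on_signal(self, dispense_ml):
        # On receiving a dispense signal, release the requested amount,
        # clamped to what the reservoir actually holds.
        amount = min(dispense_ml, self.reservoir_ml)
        self.reservoir_ml -= amount
        return amount


receiver = HeadgearReceiver(reservoir_ml=100)
first = receiver.on_signal(30)   # a gameplay event triggers a splash
second = receiver.on_signal(100) # reservoir can only cover the remainder
```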
Cross-device accessory input and output for enhanced gaming experience
In non-limiting examples of the present disclosure, systems, methods and devices for providing collaborative use of computing resources in videogame execution are provided. A list comprising an identity of a plurality of mobile games executable on a mobile computing device (e.g., smart phone, tablet) and controllable, on the mobile computing device, by one or more input devices for a primary computing device (e.g., personal computer, game console) may be surfaced. A selection of one of the plurality of mobile games may be received. The mobile computing device and the primary computing device may be paired. A video data stream of the selected game being executed on the mobile computing device may be received. The video data stream may be displayed on a display device associated with the primary computing device.
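The surface/select/pair/stream flow in this abstract can be sketched as two cooperating device classes. The class names, the in-process "pairing", and the placeholder frame payloads are all assumptions for illustration; a real system would pair over a wireless transport and stream encoded video.

```python
class MobileDevice:
    """Illustrative mobile computing device hosting the games."""

    def __init__(self, games):
        self.games = games

    def list_games(self):
        # The list surfaced on the primary device for selection.
        return list(self.games)

    def stream(self, game):
        # Yield frames of the selected game (placeholder payloads).
        for i in range(3):
            yield f"{game}-frame-{i}"


class PrimaryDevice:
    """Illustrative primary computing device (PC or console)."""

    def __init__(self):
        self.mobile = None
        self.displayed = []

    def pair(self, mobile):
        self.mobile = mobile

    def play(self, game):
        # Receive the video data stream and show it on the display
        # associated with the primary device.
        for frame in self.mobile.stream(game):
            self.displayed.append(frame)


mobile = MobileDevice(["PuzzleQuest"])
primary = PrimaryDevice()
primary.pair(mobile)
primary.play("PuzzleQuest")
```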
Information processing device, information processing method, and program for tampering detection
A method is disclosed that works effectively even in the case where a tampering detection method has become known to the public or in the case where program code for executing detection itself has been tampered with. A vibration-information obtaining unit obtains, from a player terminal, vibration history information representing a time series of the results of detection by a vibration detecting unit of the player terminal. An operation-information obtaining unit obtains, from the player terminal, operation history information representing a time series of the results of detection by a touch-operation input unit of the player terminal. An operation-information validity determining unit determines the validity of the operation history information obtained from the operation-information obtaining unit by comparing the operation history information with the vibration history information for which the validity has been determined by a vibration-information validity determining unit.
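The core validity check, comparing the operation history against the independently trusted vibration history, might be sketched as follows. The function name, the timestamp representation, and the tolerance window are assumptions introduced here; the abstract does not specify the comparison rule.

```python
def validate_operations(op_times, vib_times, tol=0.05):
    """Illustrative cross-check of two detection time series.

    An operation (touch) event is treated as plausible only if the
    terminal also registered a physical vibration within `tol` seconds
    of it; a history of touches with no matching vibrations suggests
    the operation data was injected rather than produced by a finger.
    """
    return all(
        any(abs(t - v) <= tol for v in vib_times)
        for t in op_times
    )


# Genuine play: every touch coincides with a detected vibration.
genuine = validate_operations([1.00, 2.00], [1.01, 2.02])
# Tampered play: a touch at t=5.0 has no physical counterpart.
tampered = validate_operations([1.00, 5.00], [1.01])
```

The strength of this scheme, as the abstract notes, is that the vibration channel is validated separately, so forging the operation history alone is not enough.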
Head-mounted display tracking
A virtual reality (VR) head-mounted display (HMD), a computer-implemented method, and a VR tracking system are described. Generally, a VR HMD includes an inertial measurement unit (IMU) and an optical sensor. When a second VR HMD is located in a same physical environment, the VR HMD can be operated to track a motion of the second VR HMD in the physical environment. For example, image data captured by the VR HMD in addition to inertial data of both VR HMDs are used to determine a three dimensional (3D) physical position of the second VR HMD and to track the 3D physical position over time. Three degrees of freedom (DOF) or six DOF for the motion are derived from the tracking.
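The fusion of optical and inertial estimates described above can be illustrated with a simple complementary blend. This is a sketch only: the patent does not disclose a filter, and the weighting scheme, function name, and tuple representation of 3D position are assumptions made here.

```python
def fuse_position(optical_pos, imu_pos, alpha=0.9):
    """Illustrative blend of an optical fix with an IMU estimate.

    optical_pos: 3D position of the second HMD derived from image data
                 captured by the observing HMD's optical sensor.
    imu_pos:     3D position dead-reckoned from the second HMD's
                 inertial measurement unit (IMU) data.
    alpha:       weight on the optical measurement; the remainder
                 falls on the inertial estimate.
    """
    return tuple(
        alpha * o + (1 - alpha) * i
        for o, i in zip(optical_pos, imu_pos)
    )


# Tracking the second HMD over time: each step blends the two sources.
fused = fuse_position((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), alpha=0.9)
```

Repeating this per frame yields the tracked 3D position over time, from which the three- or six-degree-of-freedom motion can be derived.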