A63F13/27

Systems and methods for determining projected target location of a handheld object

A projected target location of a handheld object is determined by applying translation factors, scaling factors, and offsets to a location, on a two-dimensional plane, of a reference element of the handheld object detected by a camera. The translation factors are determined based on a difference between a calibration location on the plane and an initial location of the reference element corresponding to the calibration location, and serve to shift the location of the reference element to generate the projected target location. The scaling factors are determined based on an estimated length of a user's arm holding the handheld object, and serve to scale the location of the reference element to generate the projected target location. The offsets are determined based on polynomial equations, and serve to extend the distance between the projected target location and the calibration location.
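The translate-scale-offset pipeline described above can be sketched as follows. This is a minimal illustration, not the patent's actual mathematics: the function names, the choice to scale about the calibration point, and the radial application of the polynomial offset are all assumptions.

```python
def project_target(detected, initial, calibration, scale, offset_coeffs):
    """Map a detected reference-element location (x, y) on the camera plane
    to a projected target location, per the translate/scale/offset scheme."""
    # Translation: shift by the difference between the calibration location
    # and the initial reference-element location.
    x = detected[0] + (calibration[0] - initial[0])
    y = detected[1] + (calibration[1] - initial[1])
    # Scaling: scale about the calibration point (factor derived from the
    # estimated arm length in the abstract; here just a number).
    x = calibration[0] + (x - calibration[0]) * scale
    y = calibration[1] + (y - calibration[1]) * scale
    # Offset: a polynomial in the distance from the calibration point,
    # applied radially to extend that distance.
    dx, dy = x - calibration[0], y - calibration[1]
    r = (dx * dx + dy * dy) ** 0.5
    extra = sum(c * r ** i for i, c in enumerate(offset_coeffs))
    if r > 0:
        x += dx / r * extra
        y += dy / r * extra
    return (x, y)
```

With an identity scale and zero offset coefficients the detected location passes through unchanged, which makes the calibration step easy to sanity-check.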

Digital Video Structural Support System
20230161538 · 2023-05-25 ·

A digital video ramp assembly incorporates process-formed structural LED tiles and modular components that join into a scalable system of structural LED tiles, forming a complete LED display structure. The integrated, LED-embedded tiles include interlocking and inter-trans-positioning features, and some configurations require additional structural framing.

System and method for identifying b-roll conditions in live streams or live rendered content

A video stream management system includes a video controller that live renders video. The video stream management system also includes a display that is communicatively coupled to the video controller and displays a primary video feed that includes the live rendered video. The video controller, the display, or a combination thereof embeds a pixel pattern in the primary video feed. Additionally, the video stream management system monitors one or more displayed images on the display to identify an error in the primary video feed.
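The embedded-pattern check can be sketched as below. The pattern's location, size, and tolerance are illustrative assumptions: a known pixel pattern is written into the live-rendered feed, and the monitor flags an error whenever a captured frame no longer contains it (e.g. because b-roll or other content has replaced the primary feed).

```python
# Tiny checkerboard embedded at the frame's top-left corner (an assumed location).
PATTERN = [[255, 0], [0, 255]]

def embed_pattern(frame):
    """Write the known pattern into the corner of a frame (list of pixel rows)."""
    for r, row in enumerate(PATTERN):
        for c, value in enumerate(row):
            frame[r][c] = value
    return frame

def feed_has_error(frame, tolerance=8):
    """Return True when the embedded pattern is absent from the displayed image,
    i.e. the primary video feed is not what is actually being shown."""
    return any(
        abs(frame[r][c] - value) > tolerance
        for r, row in enumerate(PATTERN)
        for c, value in enumerate(row)
    )
```

The tolerance absorbs compression or capture noise; a real system would likely use a pattern robust to scaling and color shifts rather than raw pixel equality.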

MULTIPLAYER SOMATOSENSORY SYSTEM, METHOD, AND APPARATUS COMBINING VIRTUAL AND REALITY, AND MEDIUM

This disclosure describes a multiplayer somatosensory system, method, and apparatus combining virtual and real environments, and a medium, belonging to the field of human-computer interaction. The system includes a computer device, a simulation carrier, n simulation firearms, and a display apparatus, where the simulation carrier includes a driver's seat and n passenger seats; the display apparatus is configured to display a virtual environment picture and n aiming points provided by the computer device; and the computer device is configured to control a virtual carrier to change its driving direction in the virtual environment, control the virtual carrier to change its driving speed in the virtual environment, and shoot a virtual object aimed at by the i-th aiming point in the virtual environment in response to a shooting operation on the i-th simulation firearm of the n simulation firearms.

GAME DEVICE
20220314098 · 2022-10-06 ·

A game device includes a display surface, a game space assigned to the display surface, a sensor system configured to detect an impact site of an object on the display surface, an acquisition system configured to detect the position of the object and/or of a player in at least a part of the game space, and a computing unit configured to determine a target field on the display surface using the position, wherein the game device is configured to display the target field on the display surface and to determine whether the impact site lies in the target field.
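The final determination in the abstract, whether the impact site lies in the target field, reduces to a point-in-region test. The sketch below assumes a circular target field placed at the detected position; the real device's field shape and placement rule are not specified in the abstract.

```python
def target_field_from_position(position, radius=0.5):
    """Place a circular target field on the display surface, centered on a
    position detected by the acquisition system (circle model is an assumption)."""
    return (position[0], position[1], radius)

def impact_in_target(impact, field):
    """Check whether an impact site detected by the sensor system lies
    inside the target field (compare squared distances to avoid a sqrt)."""
    cx, cy, radius = field
    dx, dy = impact[0] - cx, impact[1] - cy
    return dx * dx + dy * dy <= radius * radius
```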

INTERACTIVE ENVIRONMENT WITH PORTABLE DEVICES
20220317782 · 2022-10-06 ·

An interactive system includes a portable device configured to be carried by a user as the user travels through an interactive attraction. The portable device includes a trigger device and first ultra-wideband (UWB) circuitry. A central system includes second UWB circuitry and one or more processors. The one or more processors are configured to determine a location and an orientation of the portable device within the interactive attraction based on communication between the first UWB circuitry and the second UWB circuitry, receive an indication of actuation of the trigger device via communication between the first UWB circuitry and the second UWB circuitry, and display a virtual projectile on a display screen of the interactive attraction based on the location and the orientation of the portable device during the actuation of the trigger device.
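Displaying the virtual projectile from the device's UWB-derived location and orientation amounts to intersecting the device's aim ray with the screen plane. A 2D simplification (all names and the flat-screen geometry are assumptions for illustration):

```python
import math

def projectile_screen_x(device_x, screen_distance, yaw_rad):
    """Horizontal screen coordinate where the aim ray hits a flat display
    screen at `screen_distance` in front of the device. `yaw_rad` is the
    device's orientation angle from the screen normal, e.g. from UWB ranging."""
    return device_x + screen_distance * math.tan(yaw_rad)
```

A device at x = 1 m, two meters from the screen and yawed 45 degrees, hits the screen at x = 3 m; at zero yaw the hit point is directly ahead of the device.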

Method and apparatus for cloud gaming
11652863 · 2023-05-16 ·

Aspects of the disclosure provide methods and apparatuses for cloud gaming. In some examples, an apparatus for cloud gaming includes processing circuitry. For example, the processing circuitry receives a video sequence and metadata associated with the video sequence. The video sequence includes a sequence of picture frames generated in response to gaming control information, and the metadata is indicative of the gaming control information. The processing circuitry can configure encoding parameters based on the metadata that is indicative of the gaming control information. Then, the processing circuitry can encode the video sequence into a coded video bitstream, based on the encoding parameters.
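One plausible reading of "configure encoding parameters based on the metadata" is that control inputs signaling rapid motion lead to a different rate/search trade-off. The field names and the specific mapping below are illustrative assumptions, not the patent's scheme:

```python
def configure_encoder(metadata):
    """Derive encoding parameters from metadata that is indicative of the
    gaming control information (hypothetical fields and thresholds)."""
    params = {"gop_length": 60, "motion_search_range": 16}
    # Fast camera rotation implies large inter-frame displacement, so refresh
    # more often and widen the motion search.
    if metadata.get("camera_rotation_speed", 0.0) > 1.0:  # rad/s
        params["gop_length"] = 15
        params["motion_search_range"] = 64
    return params
```

The point of the metadata path is that the encoder can react before analyzing the pixels, since the control information already predicts how the frames will change.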

Computer-controlled sidewalk tiles

Command instruction data can be generated via a computerized control system, with the instruction data being formatted to prompt a plurality of tile units to change their output. The instruction data can be sent from the control system to the tile units, with each of the tile units including a tile controller connected to one or more tiles embedded in one or more sidewalk floors, and with each of the tiles including a user interface output device. At least part of the instruction data from the control system can be received via a tile controller of a tile unit. At least part of the instruction data can be processed via the tile controller. In response, the user interface output device of the tile can be signaled via the tile controller to change the output of the output device.
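The command flow above can be sketched as follows; the message format, unit/tile identifiers, and the color-state model of the output device are all assumptions. The control system broadcasts instruction data, and each tile controller processes only the commands addressed to its own tiles.

```python
class TileController:
    """Controller for one tile unit; each embedded tile's user-interface
    output device is modeled here as a simple color state."""

    def __init__(self, unit_id, tile_ids):
        self.unit_id = unit_id
        self.outputs = {tile_id: "off" for tile_id in tile_ids}

    def receive(self, instruction_data):
        """Process the part of the instruction data addressed to this tile
        unit and signal the matching tiles' output devices to change."""
        for command in instruction_data:
            if command["unit"] == self.unit_id and command["tile"] in self.outputs:
                self.outputs[command["tile"]] = command["output"]

# The control system sends one batch of instruction data to all units;
# this controller acts only on the command addressed to it.
controller = TileController("unit-1", ["t1", "t2"])
controller.receive([
    {"unit": "unit-1", "tile": "t1", "output": "green"},
    {"unit": "unit-2", "tile": "t9", "output": "red"},
])
```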