Patent classifications
A63F13/217
In-vehicle gaming systems and methods
A gaming system of a vehicle includes: a game application embodying an interactive game and stored in memory; a sensor of the vehicle configured to determine a present condition while the vehicle is moving; a gaming module of the vehicle, the gaming module configured to, while the vehicle is moving: execute the game application; display a virtual environment of the interactive game via one or more displays in the vehicle; output sound of the interactive game via one or more speakers in the vehicle; control action within the virtual environment of the interactive game based on user input received via one or more input devices of the vehicle; and adjust one or more characteristics of the virtual environment of the interactive game based on the present condition.
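The final limitation, adjusting the virtual environment from a sensed vehicle condition, can be illustrated with a minimal Python sketch. All names (`adjust_environment`, the condition keys, the mapping rules) are hypothetical, not from the patent:

```python
def adjust_environment(environment: dict, condition: dict) -> dict:
    """Return a copy of the virtual environment tuned to the vehicle's
    present condition (hypothetical mapping, for illustration only)."""
    env = dict(environment)
    # Example rule: faster real-world speed scales in-game scenery speed.
    if "speed_kph" in condition:
        env["scenery_speed"] = condition["speed_kph"] / 100.0
    # Example rule: real rain enables an in-game rain effect.
    if condition.get("weather") == "rain":
        env["weather_effect"] = "rain"
    return env

base = {"scenery_speed": 0.0, "weather_effect": "clear"}
adjusted = adjust_environment(base, {"speed_kph": 120, "weather": "rain"})
```

The sketch keeps the original environment unmodified, mirroring a gaming module that recomputes characteristics each time the sensor reports a new condition.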
REMOTE CAMERA AUGMENTED REALITY SYSTEM
There is disclosed a system for remote tracking of augmented reality media through a real-world environment. The system may use an augmented reality platform incorporating a camera and motion sensors to move about a physical space while augmented reality objects or characters are superimposed within that physical space. The augmented reality platform may move intelligently to keep the augmented reality media within frame as it moves or is adjusted. The augmented reality platform may provide a basis upon which to build for superimposing augmented reality elements onto augmented reality media and the augmented reality platform as they move about within a space together. The platform may be used, for example, in a racing or platforming style game or experience.
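The "move intelligently to keep the media within frame" behavior amounts to a tracking controller. A minimal Python sketch of one plausible piece, a horizontal pan correction with a deadband (all names and the normalization are assumptions, not from the disclosure):

```python
def pan_correction(target_x: float, frame_width: float, deadband: float = 0.1) -> float:
    """Return a normalized pan command in [-1, 1] that re-centers the
    tracked AR media in the camera frame (illustrative controller).

    target_x: target's horizontal pixel position in the frame.
    frame_width: frame width in pixels.
    deadband: normalized offset below which no correction is issued.
    """
    # Normalized offset of the target from frame center, in [-0.5, 0.5].
    offset = target_x / frame_width - 0.5
    if abs(offset) < deadband:
        return 0.0  # target is close enough to center; hold position
    # Scale to a full-range pan command and clamp.
    return max(-1.0, min(1.0, 2.0 * offset))
```

A real platform would feed a command like this to its motion actuators each frame, with a matching term for tilt and distance.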
INTERACTIVE AUDIOVISUAL SYNCHRONIZATION FOR VENUES
Exemplary venues allow members of an audience at different locations within the venue to simultaneously interact with interactive content. The interactive content can convey one or more requisite actions to be performed by the members of the audience. However, those requisite actions can reach members of the audience at different instants in time. These exemplary venues compensate for the different arrival times such that the accuracy and/or synchronization of one or more response actions, which are performed by the members of the audience in response to the requisite actions, no longer depends upon their distance from the interactive content. Rather, the accuracy and/or synchronization of the response actions depends on the performance, for example the timing, of the response actions themselves.
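The compensation can be pictured as subtracting the cue's propagation delay to each seat before scoring a response. A minimal Python sketch, assuming an audio cue traveling at the speed of sound and hypothetical function and variable names:

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air

def compensated_reaction(response_t: float, cue_t: float, distance_m: float) -> float:
    """Reaction time with the cue's acoustic propagation delay removed,
    so scoring no longer depends on distance from the content source."""
    propagation = distance_m / SPEED_OF_SOUND
    return (response_t - cue_t) - propagation

# Two audience members with identical true reactions (0.5 s):
near = compensated_reaction(0.5, 0.0, 0.0)          # front row
far = compensated_reaction(0.6, 0.0, 34.3)          # 34.3 m back: cue arrives 0.1 s late
```

After compensation, both members score the same reaction time even though their raw response timestamps differ.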
MEMORY-BASED MOTIVATIONAL MODE
A method and system for providing a memory-based motivational mode in virtual reality is disclosed. A plurality of virtual reality datasets is stored, each virtual reality dataset including data regarding a different set of stimuli and corresponding biometric data of a player. Launching a virtual reality session includes presenting a current set of stimuli to the player in a virtual reality environment of an interactive content title. Real-time biometric data is tracked and associated with the current set of stimuli presented to the player. The current set of stimuli of the virtual reality environment is updated based on a set of stimuli from an identified virtual reality dataset that includes biometric data corresponding to an identified change.
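The selection step, picking stored stimuli whose recorded biometric response matches a desired change, can be sketched in a few lines of Python. The dataset fields, stimulus names, and the heart-rate-based matching are all hypothetical illustrations:

```python
def select_stimuli(datasets: list, desired_change: float) -> str:
    """Pick the stored stimuli whose recorded biometric change (here, a
    heart-rate delta) best matches the desired change (illustrative)."""
    best = min(datasets, key=lambda d: abs(d["hr_change"] - desired_change))
    return best["stimuli"]

# Hypothetical stored VR datasets: stimuli plus the biometric change observed.
library = [
    {"stimuli": "calm_forest", "hr_change": -12},
    {"stimuli": "boss_battle", "hr_change": 25},
]
```

To calm an over-stimulated player, the system would query with a negative desired change and swap in the matching stimuli; to motivate, a positive one.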
MIXED REALITY SYSTEM FOR CONTEXT-AWARE VIRTUAL OBJECT RENDERING
A computer-implemented method in conjunction with mixed reality gear (e.g., a headset) includes imaging a real scene encompassing a user wearing a mixed reality output apparatus. The method includes determining data describing a real context of the real scene, based on the imaging; for example, identifying or classifying objects, lighting, sound or persons in the scene. The method includes selecting a set of content including content enabling rendering of at least one virtual object from a content library, based on the data describing a real context, using various selection algorithms. The method includes rendering the virtual object in the mixed reality session by the mixed reality output apparatus, optionally based on the data describing a real context (“context parameters”). An apparatus is configured to perform the method using hardware, firmware, and/or software.
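The content-selection step, choosing library content from context parameters, admits many "selection algorithms"; one minimal Python sketch scores entries by tag overlap with the detected context. Entry names, tags, and the scoring rule are assumptions for illustration:

```python
def select_content(library: list, context_tags: list) -> str:
    """Select the library entry whose tags best overlap the context
    parameters detected in the real scene (illustrative algorithm)."""
    def score(entry: dict) -> int:
        # Count how many detected context tags the entry matches.
        return len(set(entry["tags"]) & set(context_tags))
    return max(library, key=score)["name"]

# Hypothetical content library with context tags per virtual object.
library = [
    {"name": "rain_sprite", "tags": ["rain", "outdoor"]},
    {"name": "fireplace_elf", "tags": ["indoor", "dim_light"]},
]
```

A headset that classified the scene as an indoor, dimly lit room would render the matching object; richer systems might weight tags or learn the scoring function.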
SYSTEM FOR GENERATING SIMULATED ANIMAL DATA AND MODELS
A method for generating and distributing simulated animal data includes a step of receiving a set of real animal data at least partially obtained from one or more sensors that receive, store, or send information related to one or more targeted individuals. Simulated animal data is generated from at least a portion of real animal data or one or more derivatives thereof. Finally, the simulated animal data is provided to a computing device. Characteristically, one or more parameters or variables of the one or more targeted individuals can be modified.
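The generation step, deriving simulated data from real sensor data with modifiable parameters, can be sketched minimally in Python. The linear scale/offset transform and all names are hypothetical; a real system could apply far richer derivations:

```python
def simulate(real_samples: list, scale: float = 1.0, offset: float = 0.0) -> list:
    """Derive simulated animal data from real sensor samples.

    scale and offset stand in for the 'one or more parameters or
    variables' that can be modified (illustrative transform only).
    """
    return [scale * s + offset for s in real_samples]

# Real heart-rate samples, and a simulated set with a doubled parameter.
real_hr = [60, 70]
simulated_hr = simulate(real_hr, scale=2.0)
```

The simulated output would then be provided to a computing device, for example as training data for a model of the targeted individuals.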
CONTROLLING IOT DEVICES THROUGH AR OBJECT INTERACTION
Systems and methods for controlling an Internet of Things (IoT) device through interaction with an augmented reality (AR) object include pairing an AR object with an IoT device, presenting the AR object on a display of an AR camera device of a user, receiving an interaction signal indicating that the user has interacted with the AR object on the display of the AR camera device, and sending a control signal to the IoT device paired with the AR object in response to the interaction signal. A second user may request presentation of the AR object on the AR display of the AR camera device of the user when the user's AR camera device is located at particular world coordinates. Also, the control signal may be sent when a particular series of interactions with the AR object has been completed, as during game play.
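The pairing plus sequence-gated control signal can be sketched as a small state machine in Python. Class, method, and device names are hypothetical, not from the disclosure:

```python
class ARIoTBridge:
    """Pairs AR objects with IoT devices and emits a control signal only
    after a required interaction sequence completes (illustrative sketch)."""

    def __init__(self):
        self.pairings = {}  # ar_object -> (device, required interaction sequence)
        self.progress = {}  # ar_object -> interactions observed so far
        self.sent = []      # control signals "sent" to devices

    def pair(self, ar_object: str, device: str, sequence=("tap",)):
        self.pairings[ar_object] = (device, list(sequence))
        self.progress[ar_object] = []

    def interact(self, ar_object: str, action: str):
        """Record an interaction; fire the control signal when the most
        recent interactions match the required sequence."""
        device, required = self.pairings[ar_object]
        self.progress[ar_object].append(action)
        if self.progress[ar_object][-len(required):] == required:
            self.sent.append((device, "toggle"))
            self.progress[ar_object] = []  # reset for the next round of play

bridge = ARIoTBridge()
bridge.pair("lamp_icon", "smart_lamp", ["tap", "swipe"])
bridge.interact("lamp_icon", "tap")    # sequence incomplete: no signal yet
bridge.interact("lamp_icon", "swipe")  # sequence complete: signal sent
```

In the game-play scenario from the abstract, the required sequence would be the series of in-game interactions that must be completed before the paired device is actuated.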