A63F13/53

Information processing apparatus and warning presentation method

Methods and apparatus provide for: performing information processing on the basis of data received from a device that detects a user position; generating an image to be displayed as a result of the information processing; determining that the user needs to be warned when the user position indicates that the user has moved out of a play area set in an object space with reference to a detection field of the device; and causing the image generating section to superpose a warning image on the image to be displayed. The determining includes setting positions of boundary surfaces of the play area, with respect to the detection field of the device, within a predetermined range from the user.
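The warning decision above can be sketched as a simple containment test: boundary surfaces are set within a predetermined range of the user, and a warning is raised when the detected position leaves them. This is an illustrative sketch; the names (`PlayArea`, `play_area_around`, `needs_warning`, `margin`) and the axis-aligned shape are assumptions, not details from the abstract.

```python
from dataclasses import dataclass

@dataclass
class PlayArea:
    """Play area defined by boundary surfaces (min/max per horizontal axis)."""
    min_x: float
    max_x: float
    min_z: float
    max_z: float

def play_area_around(user_x: float, user_z: float, margin: float) -> PlayArea:
    """Set boundary surfaces within a predetermined range (margin) from the user."""
    return PlayArea(user_x - margin, user_x + margin,
                    user_z - margin, user_z + margin)

def needs_warning(area: PlayArea, x: float, z: float) -> bool:
    """Warn when the detected user position falls outside the play area."""
    return not (area.min_x <= x <= area.max_x and area.min_z <= z <= area.max_z)

area = play_area_around(0.0, 0.0, margin=1.5)
print(needs_warning(area, 0.5, 0.5))  # inside the boundaries -> False
print(needs_warning(area, 2.0, 0.0))  # outside the boundaries -> True
```

When `needs_warning` returns true, a renderer would superpose the warning image on the frame being displayed.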

DIGITAL CHARACTER WITH DYNAMIC INTERACTIVE BEHAVIOR
20230009454 · 2023-01-12

A virtual experience system provides dynamically interactive virtual characters in an environment. The system may be built in a modular fashion, enabling new behaviors and functionality to be added without significant alterations to the existing parts of the system. For example, behavior may be modeled using a model with multiple layers. Each layer determines responses to external stimuli based on one or more factors, with the behavior of a character being determined based on a combination of the layers of the model.
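The layered model described above can be sketched as follows: each layer scores candidate responses to a stimulus, and the character's behavior comes from combining the layers (here, a weighted sum). All names, layers, and weights are illustrative assumptions, not details from the patent.

```python
def mood_layer(stimulus):
    """A layer reacting to social stimuli."""
    if stimulus == "greeting":
        return {"wave": 0.8, "ignore": 0.2}
    return {"ignore": 1.0}

def goal_layer(stimulus):
    """A task-focused layer that mildly discourages interruptions."""
    return {"ignore": 0.5, "wave": 0.1}

# Modular design: adding a behavior means appending a (layer, weight) pair,
# without altering the existing layers or the combination logic.
LAYERS = [(mood_layer, 1.0), (goal_layer, 0.5)]

def choose_behavior(stimulus):
    """Combine the layers' scores and pick the highest-scoring response."""
    scores = {}
    for layer, weight in LAYERS:
        for action, score in layer(stimulus).items():
            scores[action] = scores.get(action, 0.0) + weight * score
    return max(scores, key=scores.get)

print(choose_behavior("greeting"))  # wave: 0.85 beats ignore: 0.45 -> "wave"
```

The modularity claim maps onto the `LAYERS` list: new layers plug in without changing `choose_behavior` or the existing layers.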

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND NON-TRANSITORY STORAGE MEDIUM STORING PROGRAM
20230215123 · 2023-07-06

An information processing apparatus forms a virtual space that is viewable by a user by using a display device, and includes an information acquirer configured to acquire information that causes at least one of movement of a position of the user and a change in an orientation of the user in the virtual space. The information processing apparatus includes a control circuit configured to output, to the display device, based on the information that is acquired, an image in which a scene being viewed by the user in the virtual space is changed in a predetermined time range.
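One way to read "changed in a predetermined time range" is that the output scene transitions gradually rather than jumping at once. The sketch below interpolates the displayed orientation across a fixed duration; the function name, parameters, and linear interpolation are assumptions for illustration only.

```python
TIME_RANGE = 0.5  # assumed predetermined duration of the scene change, in seconds

def output_yaw(start_yaw: float, target_yaw: float, elapsed: float) -> float:
    """Orientation sent to the display device while the scene change is in progress.

    Interpolates linearly from start to target over TIME_RANGE, then holds.
    """
    t = min(elapsed / TIME_RANGE, 1.0)
    return start_yaw + (target_yaw - start_yaw) * t

print(output_yaw(0.0, 90.0, 0.25))  # halfway through the range -> 45.0
print(output_yaw(0.0, 90.0, 1.0))   # past the range -> 90.0
```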

SMART TARGET CO-WITNESSING HIT ATTRIBUTION SYSTEM AND METHOD

A smart target co-witnessing hit attribution system includes a network, a projectile-firing device that includes a projectile repository and an infrared emitter, a smart target that includes a piezoelectric sensor, an infrared sensor, and control circuitry, and an extended-reality gaming application. After confirming a successful hit on the smart target, the application receives target state data and updates gaming metric data to attribute a successful impact on the smart target by the projectile-firing device.
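The "co-witnessing" idea above can be sketched as requiring two independent sensors to agree before a hit is attributed: a piezoelectric impact reading and an infrared detection identifying the firing device. The function name, threshold value, and metric shape are illustrative assumptions.

```python
PIEZO_THRESHOLD = 0.5  # assumed minimum impact amplitude for a physical hit

def attribute_hit(piezo_amplitude, ir_emitter_id, metrics):
    """Update gaming metric data only when both sensors co-witness the hit.

    piezo_amplitude: reading from the target's piezoelectric sensor.
    ir_emitter_id: identity decoded by the infrared sensor, or None if absent.
    metrics: dict mapping firing-device id -> attributed hit count.
    """
    if piezo_amplitude >= PIEZO_THRESHOLD and ir_emitter_id is not None:
        metrics[ir_emitter_id] = metrics.get(ir_emitter_id, 0) + 1
        return True
    return False

metrics = {}
print(attribute_hit(0.9, "blaster-7", metrics))  # both sensors agree -> True
print(attribute_hit(0.9, None, metrics))         # impact but no IR witness -> False
print(metrics)                                   # {'blaster-7': 1}
```

Requiring both signals filters out stray impacts (no IR signature) and stray IR illumination (no physical impact) before any metric is updated.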

Reconfiguring reality using a reality overlay device

Virtual entities are displayed alongside real world entities in a wearable reality overlay device worn by the user. Information related to an environment proximate to the wearable device is determined. For example, a position of the wearable device may be determined, a camera may capture an image of the environment, etc. Virtual entity image information representative of an entity desired to be virtually displayed is processed based on the determined information. An image of the entity is generated based on the processed image information as a non-transparent region of a lens of the wearable device, enabling the entity to appear to be present in the environment to the user. The image of the entity may conceal a real world entity that would otherwise be visible to the user through the wearable device. Other real world entities may be visible to the user through the wearable device.
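The concealment behavior above amounts to per-pixel compositing: where the virtual entity's mask is set, the lens pixel becomes non-transparent and shows the entity, hiding the real-world object behind it; elsewhere the lens stays transparent. A tiny one-dimensional "scanline" stands in for the lens here, and all names are assumptions for illustration.

```python
TRANSPARENT = None  # pass-through: the real world remains visible at this pixel

def composite(entity_pixels, mask):
    """Return lens contents: entity color where masked, transparent elsewhere."""
    return [px if m else TRANSPARENT for px, m in zip(entity_pixels, mask)]

# The entity occupies the middle two pixels; the ends stay see-through.
lens = composite(["red", "red", "red", "red"], [False, True, True, False])
print(lens)  # [None, 'red', 'red', None]
```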

Peripersonal boundary-based augmented reality game environment

A method of providing an augmented reality game environment within a game space includes: obtaining, by a processor, sensor data for the game space; determining, by the processor, a position of a player in the game space based on the sensor data; generating, by the processor, player image data of a peripersonal boundary of the player based on the determined position of the player for rendering a representation of the peripersonal boundary in the game space, the peripersonal boundary being disposed about, and spaced from, the determined position; obtaining, by the processor, player data for the player via an input modality, the player data being indicative of a player directive to modulate the peripersonal boundary; adjusting, by the processor, a size of the peripersonal boundary as a function of the player data; and updating, by the processor, the player image data based on the adjusted size of the peripersonal boundary.
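The modulation step above can be sketched as adjusting the radius of a boundary disposed about the player's position in response to player directives. The function name, directive vocabulary, step size, and clamping range are assumptions for illustration.

```python
MIN_RADIUS, MAX_RADIUS = 0.3, 3.0  # assumed limits on the boundary size, in meters

def adjust_boundary(radius: float, directive: str) -> float:
    """Grow or shrink the peripersonal boundary per the player directive.

    Unknown directives leave the radius unchanged; the result is clamped
    so the boundary always stays spaced from the player's position.
    """
    step = {"expand": 0.25, "contract": -0.25}.get(directive, 0.0)
    return min(MAX_RADIUS, max(MIN_RADIUS, radius + step))

radius = 1.0
for directive in ["expand", "expand", "contract"]:
    radius = adjust_boundary(radius, directive)
print(radius)  # 1.0 + 0.25 + 0.25 - 0.25 -> 1.25
```

After each adjustment, the player image data would be regenerated so the rendered representation reflects the new boundary size.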