A63F13/5255

Simulation system, processing method, and information storage medium

A simulation system includes a processor including hardware. The processor performs a moving body process of moving a moving body corresponding to a user wearing a head mounted display (HMD) in a virtual space, a virtual camera control process of controlling a virtual camera that moves in accordance with the movement of the moving body, and a display process of generating an image as viewed from the virtual camera in the virtual space as a display image of the HMD. In the display process, the processor performs an image effect process for motion sickness prevention: the image of a display object in a distance range farther than a given distance range from the virtual camera is blurred, compared with display objects within that given range, to generate the display image.
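The distance-based blur described above can be sketched as a per-pixel blend between the original frame and a blurred copy, switched by a depth threshold. This is a minimal NumPy sketch; the box blur, the array names, and the threshold are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def box_blur(img, radius=1):
    """Crude box blur by averaging shifted copies (wraps at edges)."""
    acc = np.zeros_like(img, dtype=float)
    taps = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            taps += 1
    return acc / taps

def sickness_prevention_effect(image, depth, near_limit):
    """Keep pixels within near_limit sharp; replace farther pixels
    with their blurred counterparts to de-emphasize the background."""
    blurred = box_blur(image)
    out = image.astype(float).copy()
    far = depth > near_limit
    out[far] = blurred[far]
    return out
```

Keeping the near range sharp while softening the distance is what reduces the vection cues associated with simulator sickness; a production renderer would do this with a depth-of-field post-process rather than a CPU-side box blur.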

LOCATION BASED AUGMENTED REALITY GAMING SYSTEM
20230098253 · 2023-03-30 ·

An augmented reality (AR) gaming system: samples a first signal encoding data representative of an object's location in a game space, the first signal being, or originating from, a first sensor's signal that identifies the object; samples at least a second signal encoding data representative of the object's location in the game space, the at least a second signal being, or originating from, at least a second sensor's signal that identifies the object, where the first sensor is different from the at least a second sensor; tracks the location of the object in the game space based upon the data representative of the object's location from both the first signal and the at least a second signal; and causes visual graphics to be provided to an AR apparatus wearable by a player in response to the tracking of the location of the object in the game space.
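Combining the two sensors' position estimates can be sketched as an inverse-variance weighted average, a common fusion rule. The patent does not specify the fusion math, so this is a hedged illustration with assumed inputs:

```python
import numpy as np

def fuse_locations(loc_a, var_a, loc_b, var_b):
    """Fuse two noisy position estimates; the less noisy sensor
    (smaller variance) gets proportionally more weight."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * np.asarray(loc_a) + w_b * np.asarray(loc_b)) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is tighter than either input
    return fused, fused_var
```

With equal variances this reduces to the midpoint; as one sensor's variance grows, the estimate slides toward the other sensor, which is the behavior a multi-sensor tracker needs when, say, an optical sensor loses line of sight.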

Systems and methods for detecting objects within the boundary of a defined space while in artificial reality

A system generates a plurality of spatial points based on depth measurements of physical objects. The system determines, based on the plurality of spatial points, an occupancy score for each voxel within a plurality of voxels. The system identifies, based on the gaze of a user, a first set of occupied voxels that are in the field of view of the user and a second set of occupied voxels that are outside the field of view of the user. The system updates the occupancy scores of the first set of occupied voxels by temporally decaying one or more of the plurality of spatial points within the first set of occupied voxels. The system maintains the occupancy scores of the second set of occupied voxels. The system detects intrusions in a predefined subspace within a physical space based on the updated occupancy scores of the first set of occupied voxels.
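The field-of-view-dependent update can be sketched as: decay scores only where the user is looking, so stale points fade once the re-observed space is empty, while scores outside the view are frozen. The array names, decay factor, and intrusion threshold below are assumptions for illustration:

```python
import numpy as np

def update_occupancy(scores, in_fov, hits, decay=0.9):
    """Voxels in the field of view decay toward fresh measurements;
    voxels outside it keep their previous scores unchanged."""
    out = scores.copy()
    out[in_fov] = scores[in_fov] * decay + hits[in_fov]
    return out

def detect_intrusions(scores, subspace, threshold=0.5):
    """Indices of occupied voxels inside the predefined subspace."""
    return np.flatnonzero(subspace & (scores > threshold))
```

Freezing out-of-view voxels is the key design choice: a real object behind the user should not fade just because no new depth samples land on it.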

METHOD FOR QUASI-RANDOM PLACEMENT OF VIRTUAL ITEMS IN AN EXTENDED REALITY (XR) SPACE

A method for quasi-random placement of a virtual item in an XR space includes: accessing a previously generated spatial mapping mesh (SMM) of the XR space; compiling a record from the SMM of open spaces between surfaces of physical elements in the XR space, with corresponding positions and dimensions; selecting from the open spaces a spawn position for a virtual character and a random set of other positions; and filtering the random set to form a subset. The method then performs a collision analysis to assign a score to each position in the subset, based in part on the accessibility of that position for the virtual character beginning from the spawn position, and places the virtual item at the position in the subset whose score is as high as or higher than that of all other positions in the subset. The method is carried out before user interaction with any virtual element in the XR space.
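The sampling-and-scoring step can be sketched as drawing candidate positions at random and keeping the best-scored one. Here `score_fn` is an assumed stand-in for the collision/accessibility analysis, and all parameter names are illustrative:

```python
import random

def place_item(open_positions, spawn, score_fn, sample_size=8, seed=None):
    """Quasi-randomly sample candidate positions from the open spaces,
    then place the item at the candidate score_fn rates highest
    relative to the spawn position."""
    rng = random.Random(seed)
    k = min(sample_size, len(open_positions))
    candidates = rng.sample(open_positions, k)
    return max(candidates, key=lambda pos: score_fn(spawn, pos))
```

Sampling before scoring keeps placement cheap in a large mesh: the expensive accessibility analysis runs only on the small candidate subset rather than on every open position.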

METHODS FOR PREDEFINING VIRTUAL STAIRCASES CONNECTING PLATFORMS IN EXTENDED REALITY (XR) ENVIRONMENTS

A method predefines a virtual staircase connecting real platforms in an XR space by: accessing a previously generated spatial mapping mesh (SMM); compiling a record from the SMM of available surfaces; identifying available platforms provided by the physical elements and available open spaces therebetween; selecting a first platform at a first level and a second platform at a significantly lower level; selecting a staircase start location at a first edge of the first platform, partly based on predetermined criteria; stacking virtual blocks linearly to form a virtual staircase, with a first virtual block contacting and extending outwards from the first platform, and a last virtual block contacting the second platform at a staircase end location; and performing a collision analysis. If the staircase end location satisfies the predetermined criteria, and if no collisions are detected, the current virtual staircase is displayed in subsequent user interactions with the XR space.
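The block-stacking geometry can be sketched in a 2D side view: block positions are interpolated linearly from the start edge to the end platform, with the block count set by the vertical drop and a per-step rise. All numeric parameters here are illustrative assumptions, not values from the patent:

```python
import math

def build_staircase(start, end, step_rise=0.2):
    """Return block centers stacked linearly from the start edge to the
    end platform; the first block extends out from the start edge and
    the last block contacts the end platform."""
    (x0, y0), (x1, y1) = start, end
    n = max(1, math.ceil(abs(y1 - y0) / step_rise))  # blocks needed for the drop
    return [(x0 + i / n * (x1 - x0), y0 + i / n * (y1 - y0))
            for i in range(1, n + 1)]
```

A real implementation would follow this with the collision analysis the abstract describes, discarding the staircase if any block intersects a physical surface in the mesh.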

METHOD TO REGULATE JUMPS AND FALLS BY PLAYABLE CHARACTERS IN XR SPACES

A method for regulating falls of a user-controlled character through gaps between surfaces in an extended reality (XR) space in which the user is playing a game includes compiling a record from a previously generated spatial mapping mesh (SMM) of the XR space of surfaces of real elements present in that space, with corresponding positions and dimensions; and, after the game begins, determining whether, if the character approaches a substantially vertical gap between an edge of a first surface at a first level and a second surface, at a second level lower than the first level, continuing motion of the character to fall through the vertical gap will be permitted or prevented. A similar method for regulating jumps rather than falls of a user-controlled character through gaps between surfaces in an extended reality (XR) space in which the user is playing a game is also described.
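The permit/prevent decision can be sketched as threshold checks on the gap geometry. The drop and span limits below are invented thresholds standing in for whatever game rules the method applies:

```python
def fall_permitted(edge_height, lower_height, max_safe_drop=1.5):
    """Allow a fall through a vertical gap only when the drop from the
    first surface down to the second is positive and within the limit."""
    drop = edge_height - lower_height
    return 0.0 < drop <= max_safe_drop

def jump_permitted(gap_width, drop, max_span=1.2, max_safe_drop=1.5):
    """Allow a jump across a gap only when both the horizontal span
    and the resulting height change are within their limits."""
    return gap_width <= max_span and abs(drop) <= max_safe_drop
```

The check runs when the character approaches an edge, so motion can be stopped at the boundary rather than corrected after an impossible fall or jump has begun.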

HYBRID LENS FOR HEAD MOUNT DISPLAY
20230033435 · 2023-02-02 ·

A lens assembly, related methods and constituent optical elements are described. The assembly may be used to direct and focus light for various applications. In one instance, the lens assembly is used in conjunction with one or more sources of light such as projected images or video as part of a virtual reality system. The lens assembly includes two or more optical elements arranged to receive light or direct light through different spatial regions of the assembly at different focal powers corresponding to a first user viewing zone and a second user viewing zone. In one instance, the first user viewing zone is a peripheral viewing zone and the second viewing zone is a primary or non-peripheral viewing zone (or vice versa).
