G06T2215/16

ACCURATE POSITIONING OF AUGMENTED REALITY CONTENT

A system for accurately positioning augmented reality (AR) content within a coordinate system such as the World Geodetic System (WGS) may include AR content tethered to trackable physical features. As the system is used by mobile computing devices, each mobile device may calculate and compare relative positioning data between the trackable features. The system may connect and group the trackable features hierarchically, as measurements are obtained. As additional measurements are made of the trackable features in a group, the relative position data may be improved, e.g., using statistical methods.
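The statistical refinement of relative position data between grouped trackable features might be sketched as follows; the `FeaturePair` class and the simple per-axis mean are illustrative assumptions, not the patented method.

```python
import statistics

class FeaturePair:
    """Accumulates repeated relative-position measurements between two
    trackable physical features and maintains a refined estimate."""

    def __init__(self):
        self.samples = []  # list of (dx, dy, dz) measurements

    def add_measurement(self, dx, dy, dz):
        self.samples.append((dx, dy, dz))

    def estimate(self):
        # Refine the relative position by averaging each axis across all
        # measurements; more samples shrink the effect of per-reading noise.
        return tuple(statistics.fmean(axis) for axis in zip(*self.samples))

# Three noisy measurements of the same feature-to-feature offset.
pair = FeaturePair()
pair.add_measurement(2.0, 0.1, -1.0)
pair.add_measurement(2.2, -0.1, -1.0)
pair.add_measurement(1.8, 0.0, -1.0)
print(pair.estimate())  # averaged relative offset, close to (2.0, 0.0, -1.0)
```

In practice a weighted estimator (e.g., inverse-variance weighting per measurement) would replace the plain mean, but the grouping idea is the same: every new measurement of a pair in the group tightens the estimate.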

VR-Based Treatment System and Method
20230047622 · 2023-02-16

An XR-based system (a virtual reality, augmented reality, or mixed reality system) is provided to visualize and resolve at least one condition of a subject. Movement of the subject's body is captured by at least one motion tracking device, and a dynamic virtual representation of the subject's body, generated from the captured physical traits, is rendered in the extended reality environment. The dynamic virtual representation is synchronized with the movement of the subject's body. The system generates a virtual representation of at least one condition of the subject in response to one or more inputs, overlays or renders that virtual representation of the condition on the virtual representation of the body, and receives and processes one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition in the extended reality environment.

Systems and methods for reconstruction and rendering of viewpoint-adaptive three-dimensional (3D) personas

An exemplary method includes maintaining a receiver-side mesh-vertices list, receiving duplicative-vertex information from a sender, responsively reducing the receiver-side mesh-vertices list in accordance with the received duplicative-vertex information, and rendering, using the reduced receiver-side mesh-vertices list, viewpoint-adaptive three-dimensional (3D) personas of a subject, at least in part by weighting video pixel colors from the different vantage points of the video cameras that capture video streams of the subject, the weighting being performed according to the respective geometric relationship of each video-camera vantage point to a user-selected viewpoint.
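The vantage-point weighting could be sketched as below; using clamped cosine similarity between the camera direction and the selected viewpoint as the weight is an assumption for illustration, not the claimed geometric relationship.

```python
def blend_pixel(viewpoint_dir, cameras):
    """Blend per-camera colors for one output pixel, weighting each camera
    by how closely its vantage direction aligns with the user-selected
    viewpoint. Directions are unit 3-vectors; `cameras` is a list of
    (direction, (r, g, b)) tuples."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # Cosine similarity clamped at zero: cameras facing away contribute nothing.
    weights = [max(0.0, dot(viewpoint_dir, d)) for d, _ in cameras]
    total = sum(weights) or 1.0
    return tuple(
        sum(w * color[i] for w, (_, color) in zip(weights, cameras)) / total
        for i in range(3)
    )

# Viewpoint aligned with the first camera: its color dominates completely.
cams = [((1, 0, 0), (255, 0, 0)), ((0, 1, 0), (0, 0, 255))]
print(blend_pixel((1, 0, 0), cams))  # (255.0, 0.0, 0.0)
```

For a viewpoint between two cameras, both weights are positive and the result is a normalized blend of their colors.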

Method and device for allowing simulator to analyze radio wave environment in wireless communication system

Disclosed is a 5G or pre-5G communication system for supporting a data transmission rate higher than that of a 4G communication system such as LTE. The present invention relates to a method by which a simulator analyzes a radio wave environment in a wireless communication system, the method comprising the steps of: receiving, by the simulator, geographic information and position information by which a transmitter and a receiver can be positioned within the geographic information; generating, by the transmitter of the simulator arranged at a random position in accordance with the position information, radio waves in at least one direction of a sphere having a fixed radius; grouping the generated radio waves into at least one group on the basis of their traveling routes; setting each group as an operation unit (warp/wavefront) for a graphics processing unit (GPU); and analyzing the radio wave environment by using the GPU in which the operation units are set.
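The grouping step could be sketched on the CPU side as below; the warp size of 32, the route-bucketing key, and the chunking scheme are illustrative assumptions (real warp/wavefront sizes and route criteria depend on the GPU and the simulator).

```python
WARP_SIZE = 32  # assumed operation-unit size (NVIDIA warps are 32 threads)

def group_rays(rays, route_key):
    """Group generated rays by their traveling route (a coarse bucket
    returned by route_key) and split each group into warp-sized operation
    units, so each GPU warp processes rays with similar paths and avoids
    divergent branching."""
    buckets = {}
    for ray in rays:
        buckets.setdefault(route_key(ray), []).append(ray)
    units = []
    for route, members in sorted(buckets.items()):
        for i in range(0, len(members), WARP_SIZE):
            units.append((route, members[i:i + WARP_SIZE]))
    return units

# 70 rays sharing one route -> three operation units of 32, 32, and 6 rays.
rays = list(range(70))
units = group_rays(rays, route_key=lambda r: 0)
print([len(u) for _, u in units])  # [32, 32, 6]
```

Keeping rays with the same traveling route in one warp is what makes the GPU mapping pay off: all lanes follow the same control path through the propagation code.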

Blending virtual environments with situated physical reality

Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to a HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
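The depth-map selection and compositing steps might look like the following sketch; representing frames as small 2D grids and using a fixed depth threshold are assumptions for illustration, not the embodiments' actual pipeline.

```python
def select_near_objects(depth_map, max_depth):
    """Select portions of a depth map within max_depth of the user,
    producing a boolean mask of physical-object pixels to composite
    into the virtual environment (0.0 means no depth reading)."""
    return [[0.0 < d <= max_depth for d in row] for row in depth_map]

def composite(virtual_frame, passthrough_frame, mask):
    """Blend pass-through pixels over the virtual frame wherever the mask
    marks a nearby physical object, keeping the user aware of obstacles
    while they roam the virtual environment."""
    return [
        [p if m else v for v, p, m in zip(vrow, prow, mrow)]
        for vrow, prow, mrow in zip(virtual_frame, passthrough_frame, mask)
    ]

depth = [[0.5, 4.0], [1.2, 0.0]]  # metres; 0.0 = no reading
mask = select_near_objects(depth, max_depth=2.0)
frame = composite([["V", "V"], ["V", "V"]], [["P", "P"], ["P", "P"]], mask)
print(frame)  # [['P', 'V'], ['P', 'V']]
```

A real implementation would mesh the selected depth samples into virtual geometry and alpha-blend rather than hard-switch per pixel, but the select-then-composite structure is the same.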

Interactive virtual reality system
11580708 · 2023-02-14

Provided herein are methods, apparatuses, and computer program products for generating first and second three-dimensional interactive environments. The first three-dimensional interactive environment may contain one or more engageable virtual interfaces that correspond to one or more items. Upon engagement with a virtual interface, the second three-dimensional interactive environment is produced to provide a virtual simulation related to the one or more items.

SYSTEM AND METHOD FOR PROVIDING PERSONALIZED TRANSACTIONS BASED ON 3D REPRESENTATIONS OF USER PHYSICAL CHARACTERISTICS

The disclosed systems, components, methods, and processing steps are directed to determining user-item fit characteristics of an item for a user body part by: accessing a three-dimensional (3D) reconstructed model of the user body part; accessing information about one or more 3D reference models of the item, the information for each 3D reference model including respective dimensional measurement, spatial, and geometrical attributes; performing a 3D matching process based on the 3D reconstructed model and the accessed information of the one or more 3D reference models to determine a best-fitting 3D reference model; integrating the best-fitting 3D reference model with the 3D reconstructed model to provide a 3D best-fit representation; and displaying the 3D best-fit representation along with visual indications of user-item fit characteristics.
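A drastically simplified stand-in for the 3D matching process is sketched below; reducing "dimensional measurement, spatial, and geometrical attributes" to two scalar dimensions, and the ring-size data, are hypothetical illustrations only.

```python
def best_fitting_model(user_dims, reference_models):
    """Pick the reference model whose dimensional measurements most closely
    match the reconstructed user body part, using summed absolute error
    across shared dimensions (a scalar stand-in for full 3D matching)."""
    def fit_error(model_dims):
        return sum(abs(user_dims[k] - model_dims[k]) for k in user_dims)
    return min(reference_models, key=lambda m: fit_error(m["dims"]))

# Hypothetical ring sizes matched against a reconstructed finger scan.
user = {"circumference_mm": 54.0, "width_mm": 16.5}
models = [
    {"size": "6", "dims": {"circumference_mm": 51.9, "width_mm": 16.5}},
    {"size": "7", "dims": {"circumference_mm": 54.4, "width_mm": 17.3}},
    {"size": "8", "dims": {"circumference_mm": 57.0, "width_mm": 18.2}},
]
print(best_fitting_model(user, models)["size"])  # "7"
```

The per-dimension errors that the winner minimizes are exactly the kind of information the system could surface as the "visual indications of user-item fit characteristics."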

AUGMENTING A VIEW OF A REAL-WORLD ENVIRONMENT WITH A VIEW OF A VOLUMETRIC VIDEO OBJECT

A method of augmenting a view of a real-world environment with a view of a volumetric video object on a user device is disclosed. The method includes determining current pose information (CPI) indicating a current pose of the view of the real-world environment and a desired pose of the volumetric video object in the real-world environment. The method further includes sending the CPI to a remote server. The method further includes receiving, from the remote server, a rendered view of the volumetric video object that has been rendered in accordance with the CPI. The method also includes augmenting the view of the real-world environment by at least mapping the rendered view of the volumetric video object onto a planar mapping surface arranged according to the desired pose of the volumetric video object.
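The final mapping step could be sketched as a billboard overlay; modelling frames as 2D character grids and using a simple row/column anchor in place of a posed planar surface are assumptions for illustration.

```python
def augment_frame(background, rendered_object, anchor):
    """Overlay a server-rendered view of a volumetric video object onto the
    real-world view by writing it onto a planar region ("billboard")
    anchored at the object's desired position. Space characters act as
    transparent pixels that keep the real-world view visible."""
    out = [row[:] for row in background]
    r0, c0 = anchor
    for dr, row in enumerate(rendered_object):
        for dc, px in enumerate(row):
            if px != " ":
                out[r0 + dr][c0 + dc] = px
    return out

bg = [["."] * 4 for _ in range(3)]       # camera view of the real world
obj = [["X", " "], ["X", "X"]]           # rendered view received from server
print(augment_frame(bg, obj, anchor=(1, 1)))
```

In the actual method the planar surface would be positioned and oriented in 3D from the CPI before projection, but the device-side work remains this cheap: composite a received image, rather than render the volumetric object locally.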

CUSTOMIZED ANIMATED ART

A method for providing an animated art experience to a user includes a user device receiving an image of an art piece selected by the user. The user device obtains information about the art piece. The user device presents a three-dimensional (3D) animated image that corresponds with the selected art image. Upon receiving an action by the user caused by a rotation or tilt of the user device, the user device provides a depth perspective view in correlation with the action and associated viewer angle of the art image such that further portions of the art image become visible. A background and a foreground of the image appear to move naturally as actions and associated viewer angles change.
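The depth-perspective response to device tilt might be sketched as per-layer parallax offsets; the layer names, depth factors, and 45-degree tilt range are hypothetical parameters, not the patent's values.

```python
def parallax_offsets(tilt_deg, layers, max_tilt_deg=45.0):
    """Compute per-layer horizontal offsets for a depth-perspective view.
    Layers closer to the viewer (higher depth factor) shift more as the
    device tilts, so the foreground and background appear to move
    naturally and further portions of the art image become visible."""
    t = max(-1.0, min(1.0, tilt_deg / max_tilt_deg))  # normalized tilt
    return {name: round(t * depth_factor * shift_px, 2)
            for name, (depth_factor, shift_px) in layers.items()}

layers = {
    "background": (0.2, 40),  # barely moves
    "midground":  (0.6, 40),
    "foreground": (1.0, 40),  # moves the most
}
print(parallax_offsets(22.5, layers))
```

At half of the maximum tilt, the foreground shifts 20 px while the background shifts only 4 px, which is the differential motion that produces the depth illusion.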

VIRTUAL REALITY SIMULATOR AND VIRTUAL REALITY SIMULATION PROGRAM
20230008807 · 2023-01-12

A VR (Virtual Reality) simulator projects or displays a virtual space image on a screen that is installed at a position distant from a user in a real space and does not move integrally with the user. More specifically, the VR simulator acquires a real user position, namely the position of the user's head in the real space. The VR simulator acquires a virtual user position, namely the position in a virtual space corresponding to the real user position. Then, the VR simulator acquires the virtual space image by imaging the virtual space with a camera placed at the virtual user position, based on virtual space configuration information indicating a configuration of the virtual space. Here, the VR simulator adjusts a focal length of the camera such that the perspective corresponding to the distance between the real user position and the screen is cancelled.
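One way to realize the focal-length adjustment is to tie the virtual camera's field of view to the user-screen geometry, so the rendered frustum exactly spans the physical screen as seen from the user's head; the function below is a sketch under that assumption, expressing the focal length as a vertical field of view.

```python
import math

def camera_fov_deg(user_to_screen_m, screen_height_m):
    """Vertical field of view for a virtual camera placed at the virtual
    user position, chosen so the rendered frustum exactly spans the real
    screen as seen from the user's head. Moving closer widens the FOV
    (i.e., shortens the focal length), cancelling the perspective that
    the fixed, distant screen would otherwise introduce."""
    half_angle = math.atan((screen_height_m / 2.0) / user_to_screen_m)
    return math.degrees(2.0 * half_angle)

# Standing 1 m from a 2 m tall screen: a 90 degree vertical FOV.
print(round(camera_fov_deg(1.0, 2.0), 1))  # 90.0
```

Recomputing this every frame from the tracked head position keeps the on-screen image consistent with what the user would see through a window at the screen's location, even though the screen itself never moves with the user.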