G06F3/04815

System and method for facilitating user defined virtual space

A system and method for facilitating a user-defined virtual space is disclosed. One or more virtual space locations and/or activities may be correlated with user-specified geolocations. In some implementations, the user-specified geolocations may be verified against one or more spatial requirements before the user-selected space-geolocation correlations are recorded. A user request to initiate an action or activity in the virtual space may be received. Before the requested action or activity is executed in the virtual space, the user's current geolocation may be verified against the geolocation specified in the space-geolocation correlation corresponding to the virtual space location or activity indicated in the request.
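As a rough illustration of the verification step described above, the sketch below checks a user's current geolocation against a recorded space-geolocation correlation before permitting the requested virtual-space action. The `castle_gate` entry, its coordinates, and the allowed radius are invented for the example; the abstract does not specify how correlations are stored or matched.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical recorded space-geolocation correlations:
# virtual-space location -> (lat, lon, allowed radius in meters)
correlations = {
    "castle_gate": (40.7580, -73.9855, 100.0),
}

def verify_request(space_location, current_lat, current_lon):
    """Allow the requested virtual-space action only if the user's
    current geolocation falls within the correlated real-world area."""
    lat, lon, radius = correlations[space_location]
    return haversine_m(current_lat, current_lon, lat, lon) <= radius
```

A request made at the correlated geolocation passes the check; one made far away does not.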

Interactive virtual reality system
11580708 · 2023-02-14

Provided herein are methods, apparatuses, and computer program products for generating first and second three-dimensional interactive environments. The first three-dimensional interactive environment may contain one or more engageable virtual interfaces that correspond to one or more items. Upon engagement with a virtual interface, the second three-dimensional interactive environment is produced to provide a virtual simulation related to the one or more items.
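A minimal sketch of the two-environment flow, with invented item and environment names (the abstract does not name any concrete items or simulations): engaging a virtual interface in the first environment produces a second environment simulating the corresponding item.

```python
# Hypothetical mapping from an engageable interface to the simulation
# that the second environment presents.
items = {"globe": "spinning-globe-simulation"}

class Environment:
    """A three-dimensional interactive environment."""
    def __init__(self, name, interfaces=()):
        self.name = name
        self.interfaces = list(interfaces)

def engage(env, interface):
    """Engaging a virtual interface in the first environment produces a
    second environment providing a simulation related to the item."""
    if interface not in env.interfaces:
        raise ValueError("interface not present in this environment")
    return Environment(items[interface])

showroom = Environment("showroom", interfaces=["globe"])
simulation = engage(showroom, "globe")
```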

Systems and methods for controlling virtual scene perspective via physical touch input

Systems, methods, and non-transitory computer readable media for controlling perspective in an extended reality environment are disclosed. In one embodiment, a non-transitory computer readable medium contains instructions to cause a processor to perform the steps of: outputting, for presentation via a wearable extended reality appliance (WER-appliance), first display signals reflective of a first perspective of a scene; receiving first input signals caused by a first multi-finger interaction with a touch sensor; in response, outputting for presentation via the WER-appliance second display signals to modify the first perspective of the scene, causing a second perspective of the scene to be presented via the WER-appliance; receiving second input signals caused by a second multi-finger interaction with the touch sensor; and, in response, outputting for presentation via the WER-appliance third display signals to modify the second perspective of the scene, causing a third perspective of the scene to be presented via the WER-appliance.
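The sequence of perspectives can be sketched as successive transforms driven by multi-finger input. The mapping below (two-finger drag orbits the scene, pinch zooms) and the sensitivity constants are assumptions for illustration; the abstract does not specify which gesture modifies which perspective parameter.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Perspective:
    yaw: float    # degrees
    pitch: float  # degrees
    zoom: float   # scale factor

def apply_multi_finger(p, drag_dx, drag_dy, pinch_scale):
    """Return a new perspective: a two-finger drag orbits the scene,
    a pinch changes zoom. Sensitivities are illustrative constants."""
    return Perspective(
        yaw=(p.yaw + 0.25 * drag_dx) % 360.0,
        pitch=max(-89.0, min(89.0, p.pitch + 0.25 * drag_dy)),
        zoom=max(0.1, min(10.0, p.zoom * pinch_scale)),
    )

# First perspective, then a second after a drag, then a third after a pinch.
first = Perspective(yaw=0.0, pitch=0.0, zoom=1.0)
second = apply_multi_finger(first, drag_dx=120.0, drag_dy=0.0, pinch_scale=1.0)
third = apply_multi_finger(second, drag_dx=0.0, drag_dy=0.0, pinch_scale=1.5)
```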

Systems and methods for seat selection in virtual reality
11579744 · 2023-02-14

The embodiments described herein provide technologies and techniques for using available data (from a variety of data sources) to provide an integrated virtual reality experience. Embodiments described herein include systems and methods for acquiring flight information, wherein the flight information includes seating information regarding the layout and availability of seats from one or more data sources; providing the flight information in a virtual reality environment; receiving, from a virtual reality device, a user's movements of an avatar in the virtual reality environment, wherein the avatar represents an individual having pre-stored information; determining, in the virtual reality environment, a position of the avatar with respect to a first seat zone surrounding a first available seat; and assigning the avatar to the first available seat in response to the virtual reality computing system receiving a deliver command while the avatar is in the vicinity of the first seat zone surrounding the first available seat.
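The seat-assignment step can be sketched as a proximity check against a zone surrounding each available seat, triggered by the deliver command. The seat IDs, positions, and zone radius below are invented for the example.

```python
import math

# Hypothetical seating data: seat id -> cabin position and availability.
seats = {
    "12A": {"pos": (0.0, 0.0), "available": True},
    "12B": {"pos": (1.0, 0.0), "available": False},
}
SEAT_ZONE_RADIUS = 0.75  # radius of the zone surrounding each seat

def assign_seat(avatar_pos, deliver_command):
    """Assign the avatar to an available seat when a deliver command is
    received while the avatar is within the seat's surrounding zone."""
    if not deliver_command:
        return None
    for seat_id, seat in seats.items():
        if not seat["available"]:
            continue
        if math.dist(avatar_pos, seat["pos"]) <= SEAT_ZONE_RADIUS:
            seat["available"] = False  # seat is now taken
            return seat_id
    return None
```

A deliver command inside the zone of seat 12A assigns it; a second attempt finds no available seat in range.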

Cloud-based cyber shopping mall system
11580591 · 2023-02-14

A cloud-based cyber shopping mall system is provided. The cloud-based cyber shopping mall system includes a cloud-based server module, a scene movement module, a product introduction module, a customer service module, a shopping cart module, a style change module, and an exhibition area switching module. The cloud-based server module includes a cloud-based database and a cloud-based shopping mall webpage. Through the scene movement module, the product introduction module, the customer service module, and the shopping cart module, a user can click on related options adjacent to any product viewed on the cloud-based shopping mall webpage to view an introduction of the product, contact customer service, and place an order. Through the style change module, the user can change the style of a three-dimensional product model on the cloud-based shopping mall webpage.
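As a minimal sketch of two of the modules named above, the classes below model a style change for a three-dimensional product model and a shopping cart for placing an order. All product IDs and style names are invented; the abstract does not describe the modules' internals.

```python
class StyleChangeModule:
    """Changes the style of a three-dimensional product model."""
    def __init__(self):
        # Hypothetical catalog: product id -> available model styles.
        self.styles = {"chair-01": ["oak", "walnut", "white"]}
        self.current = {"chair-01": "oak"}

    def change_style(self, product_id, style):
        if style not in self.styles[product_id]:
            raise ValueError(f"unknown style: {style}")
        self.current[product_id] = style
        return self.current[product_id]

class ShoppingCartModule:
    """Collects products the user has chosen to order."""
    def __init__(self):
        self.items = []

    def add(self, product_id):
        self.items.append(product_id)
```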

DIGITAL REALITY PLATFORM PROVIDING DATA FUSION FOR GENERATING A THREE-DIMENSIONAL MODEL OF THE ENVIRONMENT

The present invention relates to three-dimensional reality capture of an environment, wherein data from various kinds of measurement devices are fused to generate a three-dimensional model of the environment. In particular, the invention relates to a computer-implemented method for the registration and visualization of a 3D model provided by various types of reality capture devices and/or by various surveying tasks.
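Registration of data from multiple capture devices can be sketched as transforming each device's local points into a common model frame with a rigid pose. The device names, points, and poses below are invented; how the poses are estimated (e.g. from targets or iterative alignment) is outside this sketch.

```python
import math

def apply_pose(points, yaw_deg, tx, ty, tz):
    """Transform a capture device's local (x, y, z) points into the
    common model frame using a rigid pose (yaw rotation + translation)."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

# Hypothetical scans from two devices, each in its own local frame,
# fused into one model once their registration poses are known.
scanner_points = [(1.0, 0.0, 0.0)]
camera_points = [(0.0, 1.0, 0.5)]

model = (apply_pose(scanner_points, yaw_deg=0.0, tx=0.0, ty=0.0, tz=0.0)
         + apply_pose(camera_points, yaw_deg=90.0, tx=1.0, ty=0.0, tz=0.0))
```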
