Patent classifications
A63F13/25
Sorting computer applications or computer files and indicating a sort attribute in a user interface
Techniques are described for an intuitive and efficient GUI. In an example, UI elements are presented on a GUI. Each one of the UI elements can correspond to a set of computer applications or a set of computer files having attributes, and each one of the UI elements can be selected to launch, as applicable, a computer application, a computer file, or a page about the computer application or computer file. A request to sort the UI elements can be received and can indicate at least one of the attributes as a sort factor. The UI elements are sorted accordingly and presented in an updated arrangement on the GUI. An attribute used in the sorting is presented within or in proximity to the UI elements, whereas another attribute not used in the sorting is not presented in the updated arrangement.
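The abstract describes sorting UI elements by a chosen attribute and surfacing only that attribute next to each element. A minimal Python sketch of that idea follows; the names (`UIElement`, `sort_elements`) are hypothetical and not drawn from the patent itself.

```python
from dataclasses import dataclass, field

@dataclass
class UIElement:
    title: str
    attributes: dict = field(default_factory=dict)  # e.g. {"size_mb": 10, "rating": 4.5}

def sort_elements(elements, sort_attribute, descending=True):
    """Sort UI elements by one attribute and pair each element with a label.

    Only the attribute used as the sort factor is surfaced as a label in
    the updated arrangement; all other attributes are omitted from display.
    """
    ordered = sorted(
        elements,
        key=lambda e: e.attributes[sort_attribute],
        reverse=descending,
    )
    return [(e, f"{sort_attribute}: {e.attributes[sort_attribute]}") for e in ordered]
```

A caller would re-render the element list from the returned pairs, showing each element's title alongside the single sort-attribute label.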
Wireless headphone controller
A wireless Bluetooth® headphone controller comprises a game controller having a pair of wireless Bluetooth® headphones removably stored within the game controller housing, and a detachable pair of Bluetooth® headphones in a separate case capable of being in electrical communication with the game controller.
Hybrid lens for head mount display
A lens assembly, related methods and constituent optical elements are described. The assembly may be used to direct and focus light for various applications. In one instance, the lens assembly is used in conjunction with one or more sources of light such as projected images or video as part of a virtual reality system. The lens assembly includes two or more optical elements arranged to receive light or direct light through different spatial regions of the assembly at different focal powers corresponding to a first user viewing zone and a second user viewing zone. In one instance, the first user viewing zone is a peripheral viewing zone and the second viewing zone is a primary or non-peripheral viewing zone (or vice versa).
Environment model with surfaces and per-surface volumes
In one embodiment, a method includes receiving sensor data of a scene captured using one or more sensors, generating (1) a number of virtual surfaces representing a number of detected planar surfaces in the scene and (2) a point cloud representing detected features of objects in the scene based on the sensor data, assigning each point in the point cloud to one or more of the number of virtual surfaces, generating occupancy volumes for each of the number of virtual surfaces based on the points assigned to the virtual surface, generating a datastore including the number of virtual surfaces, the occupancy volumes of each of the number of virtual surfaces, and a spatial relationship between the number of virtual surfaces, receiving a query, and sending a response to the query, the response including an identified subset of the virtual surfaces in the datastore that satisfy the query.
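The core of the method is assigning point-cloud points to nearby planar surfaces and deriving a per-surface occupancy volume. A simplified Python sketch is below; the plane representation (unit normal plus offset, plane equation n·p = d), the distance threshold, and the use of an axis-aligned bounding box as the occupancy volume are all assumptions for illustration, not details from the patent.

```python
def assign_points_to_surfaces(points, surfaces, max_dist=0.5):
    """Assign each 3-D point to its nearest planar virtual surface.

    points:   iterable of (x, y, z) tuples.
    surfaces: list of (normal, offset) planes, with plane equation n.p = d.
    Returns a dict mapping surface index -> list of assigned points; points
    farther than max_dist from every plane remain unassigned.
    """
    assignment = {i: [] for i in range(len(surfaces))}
    for p in points:
        best_i, best_d = None, max_dist
        for i, (n, d) in enumerate(surfaces):
            dist = abs(sum(a * b for a, b in zip(n, p)) - d)
            if dist < best_d:
                best_i, best_d = i, dist
        if best_i is not None:
            assignment[best_i].append(p)
    return assignment

def occupancy_volume(points):
    """Axis-aligned bounding box of a surface's assigned points,
    serving as a simple per-surface occupancy volume."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))
```

The resulting surfaces and occupancy volumes could then be stored in the datastore and filtered against spatial queries (e.g. "horizontal surfaces larger than X").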
VR SYSTEM AND POSITIONING AND TRACKING METHOD OF VR SYSTEM
A VR system and a positioning and tracking method of the VR system are provided. The VR system includes a head-mounted display and a gamepad. The head-mounted display includes a first inertial sensor, a first processor, and first cameras. The gamepad includes a second inertial sensor and second cameras. The first processor is configured to obtain 6DOF data of the head-mounted display in a world coordinate system based on the inertial data of the head-mounted display obtained by the first inertial sensor and the external environment images of the head-mounted display obtained by the first cameras, obtain 6DOF data of the gamepad in the world coordinate system based on the inertial data of the gamepad obtained by the second inertial sensor and the external environment images of the gamepad obtained by the second cameras, and generate 6DOF data of the gamepad on the head-mounted display.
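Once both world-frame 6DOF poses are known, the gamepad's pose relative to the head-mounted display follows from standard rigid-transform composition: T_headset_gamepad = inv(T_world_headset) · T_world_gamepad. A pure-Python sketch using 4x4 homogeneous matrices is below; the helper names are hypothetical and this is one conventional way to realize the step, not necessarily the patent's.

```python
def mat_mul(A, B):
    """Multiply two 4x4 matrices (lists of lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(T):
    """Invert a rigid transform: R -> R^T, t -> -R^T t."""
    R = [row[:3] for row in T[:3]]
    t = [T[i][3] for i in range(3)]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    nt = [-sum(Rt[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [Rt[0] + [nt[0]], Rt[1] + [nt[1]], Rt[2] + [nt[2]], [0, 0, 0, 1]]

def gamepad_in_headset(T_world_headset, T_world_gamepad):
    """Express the gamepad's 6DOF pose in the headset's coordinate frame."""
    return mat_mul(invert_rigid(T_world_headset), T_world_gamepad)
```

For example, a headset at world position (1, 0, 0) and a gamepad at (3, 0, 0), both unrotated, yield a gamepad position of (2, 0, 0) in the headset frame.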
MIXED-REALITY DEVICE POSITIONING BASED ON SHARED LOCATION
Techniques and systems are provided for positioning mixed-reality devices within mixed-reality environments. The devices, which are configured to perform inside out tracking, transition between position tracking states in mixed-reality environments and utilize positional information from other inside out tracking devices that share the mixed-reality environments to identify/update positioning of the devices when they become disoriented within the environments and without requiring an extensive or full scan and comparison/matching of feature points that are detectable by the devices with mapped feature points of the maps associated with the mixed-reality environments. Such techniques can conserve processing and power consumption that would be required when performing a full or extensive scan and comparison of matching feature points. Such techniques can also enhance the accuracy and speed of positioning mixed-reality devices.
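The relocalization idea here is that a disoriented device can recover its world pose from a peer's shared world pose plus an observed relative pose, instead of rescanning the full map. A toy 2-D Python sketch of that pose composition follows; the 2-D (x, y, theta) state and the function name are illustrative assumptions, not the patent's formulation.

```python
import math

def compose(pose_peer_world, pose_self_in_peer):
    """Recover a device's world pose from a peer's shared world pose.

    Each pose is (x, y, theta). pose_self_in_peer is the disoriented
    device's pose as observed in the peer's local frame; composing it
    with the peer's world pose yields the device's world pose without
    a full scan-and-match against the environment map.
    """
    xa, ya, ta = pose_peer_world
    xb, yb, tb = pose_self_in_peer
    return (xa + xb * math.cos(ta) - yb * math.sin(ta),
            ya + xb * math.sin(ta) + yb * math.cos(ta),
            ta + tb)
```

A single shared pose plus one relative observation replaces an extensive feature-point match, which is the processing and power saving the abstract highlights.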