Patent classifications
A63F13/5255
GAMING DEVICE WITH ROTATABLY PLACED CAMERAS
A method to identify positions of fingers of a hand is described. The method includes capturing images of a first hand using a plurality of cameras that are part of a wearable device. The wearable device is attached to a wrist of a second hand, and the plurality of cameras of the wearable device is disposed around the wearable device. The method includes repeating capturing of additional images of the first hand, the images and the additional images captured to produce a stream of captured image data during a session of presenting a virtual environment in a head mounted display (HMD). The method includes sending the stream of captured image data to a computing device that is interfaced with the HMD. The computing device is configured to process the captured image data to identify changes in positions of the fingers of the first hand.
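The capture loop described above can be sketched as a generator that polls each wrist-mounted camera once per iteration and bundles the frames into one stream for the HMD-connected computing device. The `read()` method and `session_active` callable are assumed interfaces, not from the patent:

```python
import time

def capture_stream(cameras, session_active):
    """Repeatedly capture one frame from each camera disposed around the
    wearable and yield the bundle as captured image data. `cameras` is a
    list of objects with a read() method (assumed interface); `session_active`
    is a callable that is truthy while the HMD session is running."""
    while session_active():
        frames = [cam.read() for cam in cameras]
        yield {"timestamp": time.time(), "frames": frames}
```

The computing device would consume this stream and run its finger-tracking analysis on each bundle.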
USING 6DOF POSE INFORMATION TO ALIGN IMAGES FROM SEPARATED CAMERAS
Techniques for aligning images generated by an integrated camera physically mounted to an HMD with images generated by a detached camera physically unmounted from the HMD are disclosed. A 3D feature map is generated and shared with the detached camera. Both the integrated camera and the detached camera use the 3D feature map to relocalize themselves and to determine their respective 6 DOF poses. The HMD receives the detached camera's image of the environment and the 6 DOF pose of the detached camera. A depth map of the environment is accessed. An overlaid image is generated by reprojecting a perspective of the detached camera's image to align with a perspective of the integrated camera and by overlaying the reprojected detached camera's image onto the integrated camera's image.
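The reprojection step described above can be sketched with standard pinhole-camera geometry: back-project each detached-camera pixel using the depth map, transform the 3D points through the two 6DOF poses, and project into the integrated camera's image plane. This is a minimal point-splatting sketch, not the patent's full pipeline; all names are illustrative:

```python
import numpy as np

def reproject_to_integrated(img, depth, K_det, K_int, T_world_det, T_world_int):
    """Warp the detached camera's (grayscale) image into the integrated
    camera's viewpoint. `depth` is the environment depth map in the detached
    camera's frame; T_* are 4x4 world-from-camera poses obtained by
    relocalizing against the shared 3D feature map; K_* are 3x3 intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)]).reshape(3, -1).astype(float)
    # Back-project pixels to 3D points in the detached camera's frame
    pts_det = np.linalg.inv(K_det) @ pix * depth.reshape(1, -1)
    pts_h = np.vstack([pts_det, np.ones((1, pts_det.shape[1]))])
    # Change of frame: detached camera -> world -> integrated camera
    pts_int = np.linalg.inv(T_world_int) @ T_world_det @ pts_h
    # Project into the integrated camera's image plane
    proj = K_int @ pts_int[:3]
    uv = np.rint(proj[:2] / proj[2]).astype(int)
    out = np.zeros_like(img)
    ok = (uv[0] >= 0) & (uv[0] < w) & (uv[1] >= 0) & (uv[1] < h) & (proj[2] > 0)
    out[uv[1, ok], uv[0, ok]] = img.reshape(-1)[ok]
    return out
```

The resulting image, now aligned with the integrated camera's perspective, would then be overlaid onto the integrated camera's own image. A production system would additionally resolve occlusions and fill holes left by the forward warp.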
REALITY VS VIRTUAL REALITY RACING
A method for displaying a virtual vehicle includes: calculating a virtual world comprising the virtual vehicle and a representation of a physical object at a virtual position; calculating a virtual position of a point of view within the virtual world based on a position of the point of view at a racecourse; and calculating a portion of the virtual vehicle within the virtual world that is visible from the virtual position of the point of view, wherein the portion of the virtual vehicle visible from the virtual position of the point of view comprises a portion of the virtual vehicle that is unobscured, from the virtual position of the point of view, by the representation of the physical object at the virtual position of the physical object.
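The visibility calculation described above amounts to an occlusion test: a fragment of the virtual vehicle is shown only where the representation of the physical object does not lie between it and the point of view. A per-pixel z-buffer comparison is one way to sketch this (illustrative only, not the patent's stated method):

```python
import numpy as np

def visible_vehicle_mask(vehicle_depth, object_depth):
    """Per-pixel visibility from the virtual position of the point of view:
    a vehicle fragment is unobscured where it is closer to the viewpoint
    than the representation of the physical object. Both inputs are depth
    maps rendered from that viewpoint, with np.inf where a surface is
    absent at that pixel."""
    return np.isfinite(vehicle_depth) & (vehicle_depth < object_depth)
```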
MANAGEMENT OF STREAMING VIDEO DATA
User action data characterizing action by a player in a game environment executing at a user client is received at a server. The game environment is created by the user client separate from the server. Data characterizing a selected viewing position is received. The selected viewing position is different than a player viewing position. The selected viewing position characterizes a viewing location within the game environment. A recreated game environment is generated from the user action data at the server. A video stream of the recreated game environment is generated. The video stream includes video from a perspective of the selected viewing position. The video stream is transmitted to a viewing client. Related apparatus, systems, articles, and techniques are also described.
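The recreation step described above relies on the game step being deterministic: replaying the client's action stream through the same step function rebuilds the environment server-side, after which the server can render from any spectator-selected viewing position. A toy sketch, with all names hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GameState:
    player_pos: tuple = (0.0, 0.0)
    tick: int = 0

def step(state, action):
    """One deterministic game step; `action` is a (dx, dy) move event."""
    x, y = state.player_pos
    dx, dy = action
    return GameState((x + dx, y + dy), state.tick + 1)

def recreate(user_actions):
    """Server-side recreation: replay the client's action stream through
    the same deterministic step function used by the user client."""
    state = GameState()
    for action in user_actions:
        state = step(state, action)
    return state

def render_from(state, view_pos):
    """Stand-in for the video renderer: report the player as seen from the
    spectator-selected viewing position rather than the player's own view."""
    px, py = state.player_pos
    vx, vy = view_pos
    return (px - vx, py - vy)
```

Each rendered frame would then be encoded into the video stream transmitted to the viewing client.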
Virtual reality control system
According to one aspect of the present invention, a virtual reality control system for providing chemical accident response training content includes a sensor detecting a light signal, a display displaying an image, at least one controller controlling the display, and a simulator displayed as a valve in the image. The controller is configured to acquire first position data related to a user and second position data related to the simulator based on the light signal; to acquire first virtual position data indicating a character corresponding to the user and second virtual position data indicating the valve; and to display the character and the valve on the display, along with a gas within a predetermined distance from the valve. At least a portion of the gas is not displayed when the character moves while at least a portion of the character is in contact with the valve.
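The controller's display rule can be sketched as a simple predicate on the two virtual positions: gas near the valve is suppressed while the character moves in contact with it. The contact threshold is an assumption; the patent only specifies "a predetermined distance":

```python
import math

CONTACT_DIST = 0.3  # assumed contact threshold, in scene units

def update_display(char_virtual_pos, valve_virtual_pos, char_moving):
    """Gas-suppression rule from the abstract: the gas near the valve is
    hidden while the character moves with a portion in contact with it."""
    in_contact = math.dist(char_virtual_pos, valve_virtual_pos) <= CONTACT_DIST
    return {"show_gas": not (in_contact and char_moving),
            "in_contact": in_contact}
```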
Modular augmented and virtual reality ride attraction
An amusement park system includes a modular attraction system having a ride vehicle having seats to accommodate passengers, an on-board system integrated with the ride vehicle and having on-board game systems connected via a network. Each on-board game system is configured to provide an augmented reality (AR) experience, or a virtual reality (VR) experience, or both, via a respective visual experience generator device. The AR experience, or the VR experience, or both, is provided within a game shared between the on-board game systems. The on-board game systems are integrated into the ride vehicle, and are connected via the network to one another in a manner that allows for ready removal of all or a portion of one of the on-board game systems without affecting operation of the remaining on-board game systems.
Method, apparatus and device for view switching of virtual environment, and storage medium
A viewing angle switching method is provided. The viewing angle switching method includes: displaying a first user interface, the first user interface including an environment picture and a viewing angle switching region, the environment picture including a three-dimensional virtual environment observed from a first viewing angle direction by a virtual object, and the viewing angle switching region including at least one viewing angle jumping element used for viewing angle jumping; receiving a viewing angle jumping signal triggered on a target viewing angle jumping element among the at least one viewing angle jumping element; determining a second viewing angle direction corresponding to the target viewing angle jumping element; and displaying a second user interface, the second user interface including the environment picture of the three-dimensional virtual environment observed from the second viewing angle direction by the virtual object.
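The determining step described above is essentially a lookup from jumping element to viewing direction: tapping an element re-renders the environment picture from that direction without moving the virtual object. The element names and direction values below are illustrative assumptions, not fixed by the patent:

```python
# Hypothetical element-to-direction table for the viewing angle switching region
JUMP_DIRECTIONS = {
    "front": (0.0, 0.0),    # (yaw, pitch) in degrees
    "back":  (180.0, 0.0),
    "top":   (0.0, -90.0),
}

def on_jump_signal(current_direction, target_element):
    """Determine the second viewing angle direction for the triggered
    jumping element; an unknown element leaves the first direction as-is."""
    return JUMP_DIRECTIONS.get(target_element, current_direction)
```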