Patent classifications
H04N13/332
System and method for interactive 360 video playback based on user location
A system, method, and Head-Mounted Display (HMD) apparatus for recording a video and playing it back to a viewer in a Virtual Reality (VR) environment. A geographical area is recorded with an omnidirectional video camera by dividing the area into a plurality of portions and recording each portion in a separate video section while moving the camera in different directions. Time points in each video section are associated with virtual locations of the viewer. At a time point that offers the viewer a choice of directions to proceed, the system receives the viewer's choice and presents a video section corresponding to the viewer's virtual location and desired direction of movement. The viewer's choice may be indicated by detecting the direction of the viewer's field of view or by receiving the viewer's response to a banner notification.
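The selection step described above can be pictured as a lookup from the viewer's virtual location and chosen direction to a recorded video section. A minimal sketch, with hypothetical location names, direction labels, and file names (the patent does not specify a data structure):

```python
# Hypothetical mapping from (virtual location, chosen direction) to the
# video section recorded while moving that way from that location.
sections = {
    ("plaza", "north"): "clip_plaza_north.mp4",
    ("plaza", "east"): "clip_plaza_east.mp4",
    ("market", "north"): "clip_market_north.mp4",
}

def next_section(virtual_location, chosen_direction):
    """Return the video section for the viewer's location and desired
    direction of movement, or None if no section was recorded that way."""
    return sections.get((virtual_location, chosen_direction))
```

In practice the chosen direction would come from the detected field of view or a banner-notification response, as the abstract describes.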
Wearable Virtual Retinal Display
Disclosed are a method and an input device for inputting information, such as user commands, to a computer by a person wearing a virtual retinal display device that allows the person to concurrently see the real world.
Systems and methods for reducing hops associated with a head mounted system
Systems and methods for reducing hops associated with a head mounted display are described. The head mounted display includes a communications circuit for receiving and transmitting interactive media associated with a game program via a network. The interactive media is processed by a game cloud system and streamed directly to the communications circuit of the head mounted display. The head mounted display further includes a user input circuit for receiving an action from a user to generate an input, which includes position and motion detected by the user input circuit. The head mounted display includes a game processing circuit for decoding the interactive media received from the network. The game processing circuit drives a portion of interactivity associated with the game program, which is generated based on the input.
HYPER-CONNECTED AND SYNCHRONIZED AR GLASSES
Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in the environment of the wearer, a microphone arranged to capture an audio stream in that environment, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with the other devices in the session. Each electronic eyewear device also generates annotations, and receives annotations from other users in the session, for display with the selectively shared video stream on the display of the device that provided that stream. An annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
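The selective-sharing behavior can be sketched as per-device flags within an always-on session. This is a hypothetical data model, not the patent's implementation; the class and field names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class EyewearDevice:
    """One electronic eyewear device and its sharing choices."""
    name: str
    share_audio: bool = False
    share_video: bool = False

@dataclass
class Session:
    """An always-on session joining several eyewear devices."""
    devices: list = field(default_factory=list)

    def join(self, device):
        self.devices.append(device)

    def streams_shared_by(self, device):
        """Return which of the device's streams are shared into the session."""
        streams = []
        if device.share_audio:
            streams.append("audio")
        if device.share_video:
            streams.append("video")
        return streams
```

Annotations from other users would then be routed back for display on the device whose video stream is being shared.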
Electronic device and control method thereof
An electronic device according to the present invention includes: a processor; and a memory storing a program which, when executed by the processor, causes the electronic device to: perform control to change a display region of an image in accordance with an orientation change of the electronic device or in accordance with accepting a user operation and display the display region of the image on a screen; and determine a clipping region of the image to be clipped from the image based on a position of the display region of the image, wherein the image includes the display region and the clipping region and the clipping region is wider than the display region.
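The relationship between display region and clipping region above can be sketched as computing a clipping rectangle that contains and exceeds the display region, clamped to the image bounds. The margin size and rectangle representation are assumptions; the abstract only states that the clipping region is wider than the display region:

```python
def clipping_region(display_x, display_y, display_w, display_h,
                    image_w, image_h, margin=64):
    """Return (x, y, w, h) of a clipping region centered on the display
    region and extended by `margin` pixels on each side, clamped to the
    image. `margin` is a hypothetical parameter."""
    x = max(0, display_x - margin)
    y = max(0, display_y - margin)
    w = min(image_w, display_x + display_w + margin) - x
    h = min(image_h, display_y + display_h + margin) - y
    return x, y, w, h
```

Clipping a region wider than what is displayed lets the device redraw immediately when the orientation changes slightly, without re-reading the full image.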
EXTENDED REALITY HEADSET POSITIONING AND STABILIZATION
An extended reality headset is configured to position and stabilize the headset on a face when worn. For example, the headset can include an external frame with first and second side pieces coupled to a display structure and configured to provide lateral stabilization. In some examples, the headset can include a front head-engaging structure that is rotationally coupled to the external frame via a pivot point. The headset can also include a rear head-engaging structure coupled to the external frame. In some examples, the rear head-engaging structure can include a tensioning mechanism to adjust the headset to fit various head shapes. Additionally, the headset can include a flexible strap coupled to the front head-engaging structure and the tensioning mechanism. In some examples, applying tension to the flexible strap via the tensioning mechanism can cause the front head-engaging structure to rotate about the pivot point, providing a secure fit.