H04N13/371

ELECTRONIC DEVICE, SERVER AND METHODS FOR VIEWPORT PREDICTION BASED ON HEAD AND EYE GAZE

A method performed by an electronic device for requesting tiles relating to a viewport of an ongoing omnidirectional video stream is provided. The ongoing omnidirectional video stream is provided by a server to be displayed to a user of the electronic device. The electronic device predicts, for an impending time period, a future head gaze of the user in relation to the current head gaze of the user, based on: the current head gaze relative to the position of the user's shoulders, a limitation of the head gaze bounded by that shoulder position, and the current eye gaze and eye movements of the user. The electronic device then sends a request to the server for tiles relating to the viewport for the impending time period, selected based on the predicted future head gaze of the user.
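The prediction and tile-selection steps described above can be sketched as follows. This is a minimal illustration, not the patented method: the gain, the anatomical turn limit, the tile layout, and all function names are assumptions chosen for the example.

```python
import math

def predict_head_yaw(head_yaw, shoulder_yaw, eye_yaw_offset,
                     gain=0.5, max_turn=math.radians(70)):
    """Predict a future head yaw (radians) for the impending period.

    Hypothetical model: the head drifts toward the current eye-gaze
    direction (eye_yaw_offset, relative to the head), but cannot
    rotate more than max_turn away from the shoulder orientation.
    """
    predicted = head_yaw + gain * eye_yaw_offset
    # Bound the prediction by the limit imposed by the shoulder position.
    low, high = shoulder_yaw - max_turn, shoulder_yaw + max_turn
    return max(low, min(high, predicted))

def tiles_for_viewport(center_yaw, tile_count=8, viewport_tiles=3):
    """Map a predicted yaw to indices of equirectangular tiles to request."""
    width = 2 * math.pi / tile_count
    center = int((center_yaw % (2 * math.pi)) // width)
    half = viewport_tiles // 2
    return [(center + i) % tile_count for i in range(-half, half + 1)]
```

The electronic device would call `predict_head_yaw` once per prediction interval and request the tile indices returned by `tiles_for_viewport` from the server.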

METHOD FOR REALIZING 3D IMAGE DISPLAY, AND 3D DISPLAY DEVICE
20230007233 · 2023-01-05

Provided is a method for realizing 3D image display, comprising: detecting a posture change of a 3D display device, wherein the 3D display device comprises a multi-viewpoint 3D display screen, the multi-viewpoint 3D display screen comprises a plurality of composite pixels and a plurality of spherical gratings covering the plurality of composite pixels, each composite pixel of the plurality of composite pixels comprises a plurality of composite subpixels, and each composite subpixel of the plurality of composite subpixels comprises a plurality of subpixels corresponding to a plurality of viewpoints; and, when the posture change of the 3D display device is detected, adjusting a display orientation of the displayed 3D image so that the 3D image is kept in the initial display orientation it had before the posture change of the 3D display device. A 3D display device, a computer-readable storage medium, and a computer program product are further provided.
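The orientation-compensation step can be illustrated with a small sketch, under the assumption that the posture change is a rotation about the screen normal measured in degrees; the function name and the normalization to (-180, 180] are choices made for the example, not details from the patent.

```python
def compensated_image_rotation(initial_device_angle, current_device_angle):
    """Return the rotation (degrees) to apply to the displayed 3D image
    so it keeps its initial display orientation after a posture change.

    The image is counter-rotated by the device's rotation since the
    initial posture, normalized into (-180, 180].
    """
    delta = current_device_angle - initial_device_angle
    compensation = -delta % 360
    if compensation > 180:
        compensation -= 360
    return compensation
```

For example, if the device is rotated 90 degrees clockwise, the image is rotated 90 degrees counterclockwise, so the viewer perceives no change in its orientation.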

Computer-readable non-transitory storage medium, web server, and calibration method for interpupillary distance

An object of the present invention is to obtain calibration data more easily in a VR (Virtual Reality) device. A user wearing a pair of VR goggles visually recognizes overlapped marker images displayed in the 360-degree VR space. A stationary state is detected when the images for the right and left eyes overlap. When the stationary state satisfies a predetermined condition set in advance, the marker image at the center of the plurality of marker images displayed on the display in this state is set as the marker image for calibration setting. Calibration data of the interpupillary distance is acquired based on the marker image for calibration setting, and the acquired calibration data is set as the calibration data used for subsequent reproduction of images.
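The stationary-state detection that gates the calibration can be sketched as follows. This is an illustrative stand-in, assuming the predetermined condition is "the tracked marker position stays within a small tolerance over a window of recent samples"; the thresholds and function name are invented for the example.

```python
def is_stationary(samples, tolerance=0.5, min_samples=30):
    """Detect the stationary state used to trigger IPD calibration.

    samples is a list of (x, y) tracked positions. The state counts as
    stationary when the last min_samples positions all lie within
    tolerance of their mean (thresholds are illustrative).
    """
    if len(samples) < min_samples:
        return False
    window = samples[-min_samples:]
    mx = sum(p[0] for p in window) / min_samples
    my = sum(p[1] for p in window) / min_samples
    return all((p[0] - mx) ** 2 + (p[1] - my) ** 2 <= tolerance ** 2
               for p in window)
```

Once this predicate holds, the device would select the centered marker image and derive the interpupillary-distance calibration data from it.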

EYEWEAR DISPLAY DEVICE FOR DISPLAYING A VIRTUAL IMAGE IN A FIELD OF VIEW OF A USER, AUGMENTED REALITY EYEWEAR DISPLAY DEVICE
20220408075 · 2022-12-22

Disclosed is an eyewear display device for displaying a virtual image in a field of view of a user, comprising: a frame unit; a line-shaped screen unit attached to the frame unit for emitting light as computer-generated image information in a first direction; and at least two partially transparent beam splitter units attached to the frame unit, designed to be operated as scanner units at a defined scanner frequency, for deflecting the light emitted in the first direction from the screen unit into a second directional range corresponding to the field of view of the user when the eyewear display device is used as intended. The object is to provide an eyewear display device, augmented reality (AR) glasses, by which the virtual image is displayed in as large a sub-area of the field of view as possible and whose form factor corresponds as closely as possible to that of ordinary glasses.

Systems and methods for using peripheral vision in virtual, augmented, and mixed reality (xR) applications

Systems and methods for using peripheral vision in virtual, augmented, and mixed reality (collectively referred to as “xR”) applications are described. In some embodiments, an Information Handling System (IHS) may include a processor and a memory coupled to the processor, the memory having program instructions stored thereon that, upon execution, cause the IHS to: render an object in a peripheral field-of-view of a user; detect at least one of: the user's eye movement, or the user's head rotation; and determine whether to re-render the object based upon the detection.
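The re-render decision described above can be sketched as a simple predicate. The rule below is an assumption for illustration: re-render only when a detected eye movement or head rotation exceeds a threshold and the peripherally rendered object then falls inside an assumed foveal region; all angles, thresholds, and names are invented for the example.

```python
def should_rerender(object_angle, gaze_angle, head_delta, eye_delta,
                    foveal_limit=30.0, motion_threshold=5.0):
    """Decide whether a peripherally rendered object must be re-rendered.

    Illustrative rule: re-render when the detected eye movement or head
    rotation exceeds motion_threshold AND the object now lies within
    foveal_limit of the updated gaze direction, where a low-detail
    peripheral rendering would become noticeable. Angles in degrees.
    """
    moved = (abs(head_delta) >= motion_threshold
             or abs(eye_delta) >= motion_threshold)
    new_gaze = gaze_angle + head_delta + eye_delta
    in_fovea = abs(object_angle - new_gaze) <= foveal_limit
    return moved and in_fovea
```

The IHS would evaluate this predicate on each detection event and trigger a re-render of the object only when it returns true, saving rendering work while the object stays in the periphery.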

Methods and systems for creating virtual and augmented reality

Configurations are disclosed for presenting virtual reality and augmented reality experiences to users. The system may comprise an image capturing device to capture one or more images, the one or more images corresponding to a field of view of a user of a head-mounted augmented reality device, and a processor communicatively coupled to the image capturing device to extract a set of map points from the one or more images, to identify a set of sparse points and a set of dense points from the extracted set of map points, and to perform a normalization on the set of map points.
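The sparse/dense split and normalization described above can be sketched as follows. This is an assumed interpretation for illustration: "dense" is taken to mean a point with enough neighbors within a radius, and "normalization" is taken as centering on the mean; the abstract does not specify either, and all names and thresholds are invented.

```python
def split_and_normalize(points, radius=1.0, dense_neighbors=3):
    """Classify 2D map points as sparse or dense, then normalize them.

    A point counts as dense if at least dense_neighbors other points
    lie within radius of it; all points are then centered on the mean
    (a simple stand-in for the normalization step).
    """
    def neighbor_count(i):
        xi, yi = points[i]
        return sum(1 for j, (xj, yj) in enumerate(points)
                   if j != i and (xi - xj) ** 2 + (yi - yj) ** 2 <= radius ** 2)

    dense = [p for i, p in enumerate(points) if neighbor_count(i) >= dense_neighbors]
    sparse = [p for i, p in enumerate(points) if neighbor_count(i) < dense_neighbors]
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    normalized = [(p[0] - mx, p[1] - my) for p in points]
    return sparse, dense, normalized
```

In a mapping pipeline, the dense set might feed surface reconstruction while the sparse set feeds pose tracking; here the split merely illustrates the classification step named in the abstract.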