Patent classifications
H04N13/144
GLASSES-TYPE MOBILE TERMINAL AND METHOD OF OPERATING THE SAME
A glasses-type mobile terminal includes a display configured to display a virtual reality image and a controller configured to acquire reality information from a mobile terminal connected to the glasses-type mobile terminal and to control the virtual reality image when a reality-returning time, indicating that viewing of the virtual reality image should be finished, is reached based on the acquired reality information.
Single depth tracked accommodation-vergence solutions
While a viewer is viewing a first stereoscopic image comprising a first left image and a first right image, a left vergence angle of the viewer's left eye and a right vergence angle of the viewer's right eye are determined. A virtual object depth is determined based at least in part on (i) the left vergence angle of the left eye and (ii) the right vergence angle of the right eye. A second stereoscopic image comprising a second left image and a second right image for the viewer is rendered on one or more image displays. The second stereoscopic image is subsequent to the first stereoscopic image and is projected from the one or more image displays to a virtual object plane at the virtual object depth.
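The core geometric step the abstract describes, deriving a single virtual object depth from the two eyes' vergence angles, amounts to intersecting the two gaze rays. A minimal sketch of that triangulation, assuming eyes separated by an interpupillary distance `ipd` and angles measured inward from straight ahead (the function name and parameters are illustrative, not from the patent):

```python
import math

def vergence_to_depth(theta_left, theta_right, ipd=0.063):
    """Estimate virtual object depth (metres) from inward vergence angles
    (radians).

    With the left eye at x = -ipd/2 and the right eye at x = +ipd/2, each
    gaze ray rotates inward by its vergence angle, so the rays intersect at
        z = ipd / (tan(theta_left) + tan(theta_right)).
    """
    denom = math.tan(theta_left) + math.tan(theta_right)
    if denom <= 0:
        # Parallel or diverging gaze: treat the depth as effectively infinite.
        return float('inf')
    return ipd / denom
```

A display system along these lines could then place its virtual object plane at the returned depth before rendering the subsequent stereoscopic image.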
LIGHT FIELD DISPLAY METROLOGY
Examples of a light field metrology system for use with a display are disclosed. The light field metrology system may capture images of a projected light field and determine focus depths (or lateral focus positions) for various regions of the light field using the captured images. The determined focus depths (or lateral positions) may then be compared with intended focus depths (or lateral positions) to quantify the imperfections of the display. Based on the measured imperfections, an appropriate error correction may be performed on the light field to correct for the measured imperfections. The display can be an optical display element in a head mounted display, for example, an optical display element capable of generating multiple depth planes or a light field display.
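The comparison step, measured focus depths per region versus intended ones, reduces to a per-region residual that a correction stage can consume. A hedged sketch under that reading (region keys and the sign convention are assumptions, not specified by the patent):

```python
def depth_corrections(intended, measured):
    """Per-region depth error for a light field display.

    `intended` and `measured` map a region identifier to a focus depth
    (e.g. in dioptres or metres). The returned correction is the amount
    to add to each region's measured depth to reach the intended depth.
    """
    return {region: intended[region] - measured[region] for region in intended}
```

A calibration pipeline might threshold these residuals and only apply corrections to regions whose error exceeds the display's depth-plane tolerance.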
METHOD AND APPARATUS FOR PROVIDING PERSONAL 3-DIMENSIONAL IMAGE USING CONVERGENCE MATCHING ALGORITHM
Disclosed herein are a method and apparatus for providing a personalized three-dimensional (3D) image. The apparatus for providing a personalized three-dimensional image using convergence matching may include a fixed screen set as a horopter region, a calculation unit configured to calculate a shift value of an image projected onto the screen based on a distance from a virtual stereo camera to a target object, a convergence matching unit configured to match a convergence angle to the horopter region based on the shift value, and a controller configured to control the convergence matching unit to maintain the convergence angle when a user's gaze is shifted to a nearby object having the same depth value as the target object.
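The shift value the calculation unit derives is, in conventional stereo geometry, a function of the camera baseline, focal length, and object distance: the on-screen disparity is f·b/Z, and splitting it between the two views centres the target object on the screen (horopter) plane. A minimal sketch of that calculation, assuming this standard model (the patent does not give the exact formula, so the function and its parameters are illustrative):

```python
def horizontal_shift(baseline_m, focal_px, distance_m):
    """Per-eye horizontal image shift (pixels) that places an object at
    `distance_m` from the virtual stereo camera onto the screen plane.

    Standard pinhole stereo disparity: d = focal_px * baseline_m / distance_m.
    Shifting each eye's image by half the disparity, in opposite directions,
    makes the target's convergence angle coincide with the horopter region.
    """
    disparity = focal_px * baseline_m / distance_m
    return disparity / 2.0
```

Because the shift depends only on depth, a controller can keep the same shift (and hence the same convergence angle) when the user's gaze moves to a nearby object with the same depth value, as the abstract describes.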
Method and apparatus for photographing and projecting moving images in three dimensions
A digital cinematographic and projection process that provides 3D stereoscopic imagery that is not adversely affected by the standard frame rate of 24 frames per second, as is the convention in the motion picture industry worldwide. A method for photographing and projecting moving images in three dimensions includes recording a moving image with a first and a second camera simultaneously and interleaving a plurality of frames recorded by the first camera with a plurality of frames recorded by the second camera. The step of interleaving includes retaining odd numbered frames recorded by the first camera and deleting the even numbered frames, retaining even numbered frames recorded by the second camera and deleting the odd numbered frames, and creating an image sequence by alternating the retained images from the first and second camera.
SELF-CALIBRATING DISPLAY SYSTEM
A self-calibrating display system includes a stereoscopic, near-eye display device, a docking unit, and one or more cameras. The display device includes one or more coupling structures, in addition to one or more microprojectors configured to project a right calibration image and a left calibration image. The docking unit includes one or more complementary coupling structures, each being releasably lockable to a coupling structure of the display device, to prevent movement of the display device relative to the docking unit. The one or more cameras are configured to acquire a secondary image of the right calibration image and a secondary image of the left calibration image.
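Once the docking unit has locked the display in place and the cameras have acquired secondary images of the left and right calibration images, a calibration routine can compare where fiducial features were detected against where the microprojectors intended them. A hedged sketch of that comparison, averaged over several fiducials (the abstract does not specify this computation; names and the pixel-offset model are assumptions):

```python
def mean_offset(expected_px, detected_px):
    """Average (dx, dy) translation, in pixels, between intended fiducial
    positions in a calibration image and their positions detected in the
    camera's secondary image.

    The returned offset is what the renderer would add to its output so the
    projected fiducials land where they were intended.
    """
    n = len(expected_px)
    dx = sum(e[0] - d[0] for e, d in zip(expected_px, detected_px)) / n
    dy = sum(e[1] - d[1] for e, d in zip(expected_px, detected_px)) / n
    return (dx, dy)
```

Running this separately on the right and left calibration images would yield independent corrections for the two microprojectors.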
Interactive virtual reality display providing accommodation depth cues
An interactive display includes a display capable of generating displayed images, and first and second eyepiece assemblies each including one or more variable-focus lenses. The eyepiece assemblies, variable-focus lenses and display allow the user to perceive a virtual 3D image while providing visual depth cues that cause the eyes to accommodate at a specified fixation distance. The fixation distance can be adjusted by changing the focal power of the variable-focus lenses.
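The relationship between fixation distance and lens focal power is the standard thin-lens one: to make the eyes accommodate at distance d, the light reaching them must have a vergence of 1/d dioptres. A minimal sketch of the adjustment such a variable-focus system implies (illustrative only; the patent does not give this formula):

```python
def power_for_fixation(distance_m):
    """Focal power (dioptres) whose output vergence corresponds to a
    fixation distance of `distance_m` metres: P = 1 / d."""
    return 1.0 / distance_m

def power_change(old_distance_m, new_distance_m):
    """Change in variable-focus lens power needed to move the perceived
    fixation distance from `old_distance_m` to `new_distance_m`."""
    return power_for_fixation(new_distance_m) - power_for_fixation(old_distance_m)
```

Shifting fixation from 2 m to 0.5 m, for instance, requires the variable-focus lenses to add 1.5 dioptres of power, which is what lets the accommodation cue track the displayed depth.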