Patent classifications
H04N13/341
NEAR-EYE DISPLAY DEVICE
The present invention relates to a near-eye display device. The near-eye display device includes a display, a first lens disposed in front of the display so as to be spaced apart from the display by a predetermined distance, a dynamic aperture adjustment element disposed adjacent to the first lens to dynamically control an aperture size of the first lens and a horizontal position of the aperture on a plane perpendicular to an optical axis, a main optics lens disposed to be spaced apart from the first lens by a predetermined distance, and a control system configured to control the dynamic aperture adjustment element.
Presenting video streams on a head-mountable device
In various implementations, a method of presenting video streams at a head-mountable device (HMD) includes generating a first video stream at a first frame rate for a first display portion. In some implementations, the first frame rate indicates a rate at which frames are presented by the first display portion. In various implementations, the method includes generating a second video stream at a second frame rate for a second display portion. In some implementations, the second frame rate indicates a rate at which frames are presented by the second display portion. In some implementations, the second frame rate is within a threshold relative to the first frame rate. In various implementations, the method includes temporally shifting the second video stream relative to the first video stream so that a majority of refresh times of the first display portion are different from refresh times of the second display portion.
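The temporal-shifting idea in this abstract can be sketched briefly: two display portions run at (near-)equal frame rates, and the second stream is phase-shifted so that most of its refresh instants fall between those of the first. The frame rate, duration, and half-period shift below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of temporally shifting one video stream relative
# to another so that a majority of refresh times differ between the
# two display portions. Frame rate and shift values are assumptions.

def refresh_times(frame_rate_hz, duration_s, phase_shift_s=0.0):
    """Return the refresh timestamps (in seconds) for one display portion."""
    period = 1.0 / frame_rate_hz
    n_frames = int(duration_s * frame_rate_hz)
    return [phase_shift_s + i * period for i in range(n_frames)]

# First display portion: 60 Hz, no shift.
first = refresh_times(frame_rate_hz=60.0, duration_s=1.0)

# Second display portion: same rate, shifted by half a frame period so
# its refreshes land between those of the first display portion.
second = refresh_times(frame_rate_hz=60.0, duration_s=1.0,
                       phase_shift_s=0.5 / 60.0)

# With a half-period shift, no refresh instants coincide at all.
overlap = {round(t, 6) for t in first} & {round(t, 6) for t in second}
```

A smaller shift (or slightly different frame rates within the stated threshold) would still leave most, though not all, refresh times distinct.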
ADVANCED REFRACTIVE OPTICS FOR IMMERSIVE VIRTUAL REALITY
A display device has a display, operable to generate a real image, and an optical system, comprising one or more lenslets, arranged to generate a virtual sub-image from a partial real image on the display, by each lenslet projecting light from the display to an eye position. The sub-images combine to form a virtual image viewable from the eye position. At least one lenslet is symmetric with respect to a plane, and the display surface is cylindrical with its axis perpendicular to that plane.
Image Processing Apparatus, Image Processing Method, and Image Communication System
Methods and apparatus provide for: capturing an image of an object, which includes a face of a person wearing an optical display apparatus by which to observe a stereoscopic image that contains a first parallax image and a second parallax image obtained when the object in a three-dimensional (3D) space is viewed from different viewpoints; identifying the optical display apparatus included in the image of the object; and generating an image of the face of the person that does not include the optical display apparatus by excluding the identified optical display apparatus, and instead by adding features of the face of the person to a region in which the identified optical display apparatus is excluded.
IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND IMAGE DISPLAY SYSTEM
There is provided an image processing device that processes a projection image presented to a plurality of persons at the same time. The image processing device specifies an overlapping area in which fields of view of two or more users overlap based on information on each user, classifies objects included in the overlapping area into a first object group and a second object group, generates a common image common to all users, made up of the first object group, generates individual images different for each user, made up of the second object group, and determines an output protocol for displaying the individual images.
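The classification step described above can be sketched as follows. The object records and the `shared` flag are illustrative assumptions standing in for whatever criterion the device uses to split objects into the first (common) and second (individual) groups.

```python
# Hypothetical sketch: split objects inside an overlapping field of view
# into a common group (rendered once for all users) and individual
# groups (rendered per user). Object names/flags are assumptions.

objects = [
    {"name": "background",           "shared": True},
    {"name": "annotation_for_user_a", "shared": False, "owner": "a"},
    {"name": "annotation_for_user_b", "shared": False, "owner": "b"},
]

# First object group: common to all users.
first_group = [o for o in objects if o["shared"]]
# Second object group: different for each user.
second_group = [o for o in objects if not o["shared"]]

# The common image is built from the first group...
common_image = [o["name"] for o in first_group]

# ...while each user's individual image is built from their share of
# the second group.
individual_images = {}
for obj in second_group:
    individual_images.setdefault(obj["owner"], []).append(obj["name"])
```

The output protocol the abstract mentions would then decide how each user's individual image is overlaid on the shared common image.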
CALIBRATION OF STEREOSCOPIC DISPLAY USING WAVEGUIDE COMBINER
Examples are disclosed that relate to calibration of a stereoscopic display system of an HMD via an optical calibration system comprising a waveguide combiner. One example provides an HMD device comprising a first image projector and a second image projector configured to project a stereoscopic image pair, and an optical calibration system. The optical calibration system comprises a first optical path indicative of an alignment of the first image projector, a second optical path indicative of an alignment of the second image projector, a waveguide combiner in which the first and second optical paths combine into a shared optical path, and one or more boresight sensors configured to detect calibration image light traveling along one or both of the first and second optical paths.
Display method of image
A display method of an image is disclosed. A position of a vergence surface of a user is obtained through a gaze tracking device. An image is provided by a display, the image is located at a virtual image surface, and the image has an offset between different view directions. A controller is coupled to the gaze tracking device and the display. The controller receives information on the position of the vergence surface obtained through the gaze tracking device, performs algorithmic processing on the information to obtain the offset, and transmits display information including the offset to the display. An eye of the user focuses on an accommodation surface when viewing the image, and the position of the accommodation surface is different from the position of the virtual image surface.
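One way to picture the offset computation above is a similar-triangles model: given the tracked vergence distance and the fixed distance of the virtual image surface, derive the horizontal shift between the two view directions that makes the lines of sight converge at the vergence surface. This is an illustrative sketch, not the patented algorithm; the interpupillary distance and virtual-image distance are assumed values.

```python
# Illustrative sketch (not the patented algorithm): compute a horizontal
# offset between the two view directions from the tracked vergence
# distance, so the percept converges at the vergence surface rather
# than at the fixed virtual image surface.

IPD_M = 0.063           # interpupillary distance in meters (assumed)
VIRTUAL_IMAGE_M = 2.0   # distance of the virtual image surface (assumed)

def view_offset(vergence_distance_m):
    """Horizontal offset (meters, measured on the virtual image surface).

    Simple similar-triangles model: the offset is the extra convergence
    needed at the virtual image plane to place the fused image at the
    tracked vergence distance.
    """
    return IPD_M * VIRTUAL_IMAGE_M * (1.0 / vergence_distance_m
                                      - 1.0 / VIRTUAL_IMAGE_M)

# When the user verges exactly at the virtual image surface, the offset
# is zero; verging nearer than the surface yields a positive offset.
```

The controller would recompute this offset as the gaze tracker reports new vergence positions and send it to the display with each frame.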