WEARABLE VIDEO HEADSET AND METHOD FOR CALIBRATION
20170289535 · 2017-10-05 ·

During calibration, a wearable video headset displays a pattern on a partially transparent display positioned in a field of view of a user's eye. The user has a hand-held marker that includes a scaled version of the displayed pattern. The user moves the marker toward or away from the user's eye until the pattern on the marker appears to be the same size as the pattern on the display. When the sizes match, the headset measures a distance between a forward-facing camera and the hand-held marker. The headset uses the measured distance, and geometrical relationships, to determine the spacing between the user's eye and the display. Such calibration can ensure that the images displayed to the user mesh realistically with the surroundings, which remain partially visible through the partially transparent display of the video headset.
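
The size-matching step reduces to a similar-triangles relation: when the marker pattern (a scaled copy of the displayed pattern) appears the same size as the displayed pattern, both subtend the same angle at the eye. A minimal sketch of that geometry, assuming the camera-to-eye offset lies along the viewing axis (function and parameter names are illustrative, not from the patent):

```python
def eye_to_display_distance(camera_to_marker_m, camera_to_eye_offset_m, marker_scale):
    """Estimate the eye-to-display spacing from the matched-size condition.

    With the marker pattern marker_scale times larger than the displayed
    pattern, equal apparent sizes imply equal subtended angles, so by
    similar triangles:
        h / d_display == (marker_scale * h) / d_marker
    """
    # distance from the eye to the hand-held marker along the viewing axis
    d_marker = camera_to_marker_m + camera_to_eye_offset_m
    return d_marker / marker_scale
```

For example, a marker pattern ten times the displayed size, held 0.48 m from the forward-facing camera with a 0.02 m camera-to-eye offset, implies an eye-to-display spacing of 0.05 m.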

Waterproof virtual reality goggle and sensor system
09740010 · 2017-08-22 ·

A waterproof 3D virtual reality system is described. The system includes several key components: a submersion tank; an enveloping medium; a waterproof head-mounted display system containing a sensor for tracking the user's head movements and an optical element that displays virtual reality images; waterproof hand sensors; electronic surface sensors; a computer/controller that both receives and transmits location and speed data from sensors worn by the user; and a computer-implemented virtual reality video input signal that is dynamically modified in accordance with movement of the user's head and/or hand sensors. A method of enhancing the user's overall virtual experience is also described. In this method, the user occupies a position of partial submersion in a body of fluid, such that the fluid provides buoyancy and resistance to the user.

NEAR-TO-EYE DISPLAY DEVICE AND AUGMENTED REALITY APPARATUS
20220308343 · 2022-09-29 ·

There is provided a near-to-eye display device, including: an optical waveguide; at least one in-coupling grating on a surface of the optical waveguide, configured to couple received parallel light into the optical waveguide for propagation by total internal reflection; a light out-coupling structure on the surface of the optical waveguide, configured to extract the light propagating by total internal reflection in the optical waveguide as outgoing light from the optical waveguide; and an optical lens configured to receive the outgoing light, maintain the outgoing direction of light having a first polarization direction, and converge or diverge light having a second polarization direction. There is further provided an augmented reality apparatus including the near-to-eye display device.
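
Whether a diffracted order is trapped in the waveguide follows from the grating equation and the critical angle for total internal reflection. A sketch of that check (all parameter values and names are illustrative assumptions, not figures from the patent):

```python
import math

def tir_in_waveguide(wavelength_nm, pitch_nm, n_guide, order=1, incidence_deg=0.0):
    """Check whether the m-th diffraction order of an in-coupling grating
    propagates by total internal reflection inside the waveguide.

    Grating equation (air-side incidence):
        n_guide * sin(theta_d) = sin(theta_i) + m * wavelength / pitch
    TIR requires theta_d to exceed the glass/air critical angle asin(1/n_guide).
    """
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / pitch_nm
    s_inside = s / n_guide
    if abs(s_inside) > 1.0:
        return False  # evanescent: this order does not propagate at all
    theta_d = math.asin(s_inside)        # diffraction angle inside the guide
    theta_c = math.asin(1.0 / n_guide)   # critical angle at the guide surface
    return theta_d > theta_c
```

For green light (532 nm) at normal incidence in a guide of index 1.8, a 400 nm grating pitch yields an in-guide angle of about 47.6°, above the ~33.7° critical angle, so the light is trapped; a 600 nm pitch does not steer it steeply enough.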

Camera System

A device for MR/VR systems includes a two-dimensional array of cameras that capture images of respective portions of a scene. The cameras are positioned along a spherical surface so that they have adjacent fields of view. The entrance pupils of the cameras are positioned at or near the user's eye, while the cameras also form optimized images at the sensor. Methods for reducing the number of cameras in an array, as well as methods for reducing the number of pixels read from the array and processed by the imaging pipeline, are also described.

IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD FOR SAME

In the present specification, an image processing device and an image processing method for the same are disclosed. An image processing device for processing images according to an embodiment of the present invention comprises: a receiving unit for receiving content; a controller for turning retinal scanning on, measuring an accommodation, calculating a focal depth based on the measured accommodation, and controlling generation of an image according to the calculated focal depth; and an output unit for outputting the generated image.

Determining inter-pupillary distance

A head-mounted display device is disclosed that includes an at least partially see-through display, a processor, and a non-volatile storage device holding instructions executable by the processor to: select an image that corresponds to a physical object viewable by the user; display the image at a perceived offset to the physical object; in response to alignment user input, move a perceived position of the image relative to the physical object; output an instruction to provide completion user input when the image appears to align with the physical object; when the completion user input is received, determine the inter-pupillary distance of the user; and calibrate the head-mounted display device based on the inter-pupillary distance.
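
One way such an alignment can yield the inter-pupillary distance is by similar triangles: the lateral shift the user applies to the image on the display plane, for an object at known distance, is proportional to the lateral offset of that eye's pupil from the nominal optical axis. A sketch under those assumptions (the geometry and all names here are illustrative; the patent does not specify this particular computation):

```python
def pupil_offset_from_alignment(shift_on_display_m, display_dist_m, object_dist_m):
    """Infer one eye's lateral pupil offset from its alignment shift.
    Similar triangles along the line of sight give:
        shift = offset * (object_dist - display_dist) / object_dist
    """
    return shift_on_display_m * object_dist_m / (object_dist_m - display_dist_m)

def interpupillary_distance(shift_left_m, shift_right_m, display_dist_m, object_dist_m):
    # The pupils sit on opposite sides of the nominal axis, so the IPD is
    # the sum of the two unsigned lateral offsets.
    left = pupil_offset_from_alignment(shift_left_m, display_dist_m, object_dist_m)
    right = pupil_offset_from_alignment(shift_right_m, display_dist_m, object_dist_m)
    return abs(left) + abs(right)
```

With the display plane 0.02 m from each eye and the physical object 2 m away, symmetric alignment shifts of 0.03168 m per eye would imply an IPD of 0.064 m.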

3D DISPLAY APPARATUS, METHOD, AND APPLICATIONS
20170269353 · 2017-09-21 ·

A 3D display apparatus and method that address the vergence-accommodation conflict. A display screen component includes a display screen pixel array adapted to display a display screen image, a microlens imaging component including an array of microlenses corresponding to the display screen pixel array that can form a virtual or a real image of the display screen image, and a controllable movement component coupled to the imaging component or the display screen, wherein the imaging component and the display screen are controllably movable relative to each other, further wherein upon a controlled movement of the imaging component relative to the display screen, a location of the virtual or the real image along an optical axis is controllably changed.
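
The dependence of image location on the screen-to-microlens spacing follows from the thin-lens equation: small controlled movements of the screen relative to the lens array sweep the virtual (or real) image along the optical axis. A minimal paraxial sketch, assuming the Cartesian sign convention 1/v − 1/u = 1/f (names and values are illustrative):

```python
def image_distance(focal_len_mm, screen_to_lens_mm):
    """Thin-lens location of the screen image formed by a microlens.

    Cartesian convention: 1/v - 1/u = 1/f, with the object (screen) at
    u = -screen_to_lens_mm.  v < 0 means a virtual image on the screen
    side of the lens; v > 0 means a real image on the viewer side.
    """
    u = -screen_to_lens_mm
    return 1.0 / (1.0 / focal_len_mm + 1.0 / u)
```

For a 5 mm focal-length microlens, moving the screen from 4 mm to 3 mm in front of the lens pulls the virtual image from 20 mm to 7.5 mm behind the lens, illustrating how the controllable movement component repositions the image plane to manage accommodation.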

FOCUS ADJUSTING HEADSET

A virtual reality (VR) headset adjusts the phase of light of a virtual scene received from a display element using a spatial light modulator (SLM) to accommodate changes in vergence for a user viewing objects in the virtual scene. The VR headset receives virtual scene data that includes depth information for components of the virtual scene, and the SLM adjusts a wavefront of the light of the virtual scene by generating a phase function that applies phase delays based on the depth values. Individual phase delays shift components of the virtual scene, based on the depth values, to a target focal plane to accommodate a user at a vergence depth for a frame of the virtual scene. Further, the SLM can provide optical defocus by shifting components of the virtual scene with the phase delays for depth of field blur.
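
A shift of a component from its scene depth to a target focal plane corresponds to a dioptric power, which a phase-only SLM can impose as a paraxial quadratic phase profile. A sketch of that relationship (the formulas are standard Fourier optics; the function names and the 550 nm default are assumptions, not from the patent):

```python
import math

def defocus_power(depth_m, target_depth_m):
    """Dioptric power needed to shift a component at depth_m (meters)
    to the target focal plane at target_depth_m: P = 1/z_t - 1/z."""
    return 1.0 / target_depth_m - 1.0 / depth_m

def slm_phase(r_mm, power_diopters, wavelength_nm=550.0):
    """Paraxial thin-lens phase delay (radians) at radius r_mm for a lens
    of the given power: phi(r) = -pi * r^2 * P / wavelength."""
    r_m = r_mm * 1e-3
    lam_m = wavelength_nm * 1e-9
    return -math.pi * r_m**2 * power_diopters / lam_m
```

For instance, pulling a component from 2 m scene depth to a target plane at 0.5 m requires +1.5 D of power, and a positive power gives an increasingly negative (converging) phase toward the edge of the aperture.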

DISPLAY APPARATUS INCLUDING VISION CORRECTION LENS

Provided is a display apparatus including a vision correction lens. The display apparatus may include: an image forming device configured to form a virtual image; a vision correction lens configured to correct the eyesight of a viewer; a combiner configured to mix the virtual image with light from an outside landscape that has passed through the vision correction lens, and to provide the mixed virtual image and light to the viewer; and a virtual image positioner configured to adjust, according to a state of the eyes of the viewer, a depth of a virtual image plane at which the virtual image is viewed, wherein the combiner may be arranged between the vision correction lens and the eyes of the viewer.

Pancake lens assembly and optical system thereof

An optical lens assembly and a head-mounted display (HMD) are provided. The optical lens assembly includes a first optical element including a partial reflector and a quarter-wave plate, a second optical element including a reflective polarizer, and a varifocal lens disposed inside a cavity formed by the first optical element and the second optical element. The varifocal lens is a liquid crystal (LC) lens stack including a plurality of LC lenses. An LC lens of the plurality of LC lenses has a plurality of optical states, including an additive state that adds optical power to the varifocal lens and a subtractive state that removes optical power from the varifocal lens. The plurality of optical states provides a range of adjustment of optical power for the optical lens assembly.
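
The range of adjustment follows from enumerating the per-lens states of the stack. A sketch, assuming (beyond the abstract) that each LC lens also has a neutral zero-power state and that the stack's net power is the sum of the per-lens contributions:

```python
from itertools import product

def achievable_powers(lens_powers):
    """Net optical powers (diopters) reachable by an LC lens stack in which
    each lens can be subtractive (-P), neutral (0), or additive (+P)."""
    states_per_lens = [(-p, 0.0, p) for p in lens_powers]
    return sorted({sum(combo) for combo in product(*states_per_lens)})
```

A two-lens stack with 0.5 D and 1.0 D lenses reaches seven distinct net powers from −1.5 D to +1.5 D in 0.5 D steps, illustrating how a small number of binary/ternary elements spans a useful varifocal range.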