OPTICAL DISPLAY SYSTEM AND METHOD, AND DISPLAY DEVICE

The present disclosure provides an optical display system and method, and a display device. The optical display system includes: a display screen; a light split member configured to split light from the display screen into a first polarized light and a second polarized light with different polarization directions; a first optical waveguide configured to guide the first polarized light to a light exit side of the optical display system; and a second optical waveguide located at a light exit side of the first optical waveguide, spaced apart from the first optical waveguide, and configured to at least partially transmit the first polarized light and guide the second polarized light to the light exit side of the optical display system.

System and Method for Generating Compact Light-Field Displays through Varying Optical Depths
20230124178 · 2023-04-20

A system and method for generating compact light-field displays through varying optical depths delivers digital content more effectively and efficiently. The system includes a field-evolving cavity with a cavity exit pupil, a relay mechanism, and a system enclosure with an enclosure exit pupil. The field-evolving cavity modifies the light-field displays before outputting them through the cavity exit pupil. More specifically, the field-evolving cavity includes at least one display panel, which initially generates the light-field displays, and at least one optical-tuning mechanism, which subsequently modifies them to varying optical depths. The system enclosure houses the field-evolving cavity and the relay mechanism. The relay mechanism directs the light-field displays from the cavity exit pupil to the enclosure exit pupil, which outputs them to a user.

Enhanced optical and perceptual digital eyewear
11630311 · 2023-04-18

Improved wearable optics are disclosed. The wearable optics comprise a frame member and a lens, together with circuitry within the frame member for enhancing the use of the wearable optics. A system and method in accordance with the present invention are directed to a variety of ways to enhance the use of eyeglasses: (1) media focals, that is, utilizing the wearable optics for their intended purpose and enhancing that use with imaging techniques that improve the vision of the user; (2) telecommunications enhancements that allow the eyeglasses to be integrated with telecommunication devices such as cell phones; and (3) entertainment enhancements that allow the wearable optics to be integrated with devices such as MP3 players and radios.

Multi-depth augmented reality display

A system includes an image realisation device for forming a source image and projection optics for rendering a display image on a display screen, wherein the display image is a virtual image corresponding to the source image. The projection optics have an optical axis, and the image realisation device includes a first image realisation surface at a first distance along the optical axis and a second image realisation surface at a second, different distance along the optical axis. The first and second image realisation surfaces overlap, and the first and second image realisation surfaces include multiple regions, each region switchable between a transparent state and an image realisation state such that the source image may be formed on a region of the first or second image realisation surface and projected through the projection optics to render the display image on the display screen at a first or second apparent depth.

Multi-depth display apparatus

An imaging system includes an image realisation device and projection optics for rendering a display image on a display screen. The image realisation device includes a first image realisation surface tilted relative to an optical axis such that a first point in a first region of the image realisation surface is at a first distance from the focal point of the projection optics and a second point in a second region of the image realisation surface is at a second different distance from the focal point of the projection optics. A first source image formed on the first region and projected through the projection optics renders the first display image on the display screen at a first apparent depth, and a second source image formed on the second region and projected through the projection optics renders the second display image on the display screen at a second apparent depth.
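Both multi-depth approaches above rest on the same optics: moving the image-realisation surface closer to the focal plane of the projection optics pushes the virtual image deeper. A minimal sketch of that relationship, assuming an idealized thin lens under the real-is-positive convention (the focal length and distances below are illustrative assumptions, not values from either patent):

```python
def virtual_image_distance(u_mm: float, f_mm: float) -> float:
    """Apparent depth |v| of the virtual image of an image surface at
    distance u from a thin lens of focal length f, for u < f.
    Real-is-positive thin-lens equation: 1/u + 1/v = 1/f; v < 0 => virtual,
    so the apparent depth is |v| = u*f / (f - u)."""
    if not 0 < u_mm < f_mm:
        raise ValueError("a virtual image requires 0 < u < f")
    return u_mm * f_mm / (f_mm - u_mm)
```

With an assumed 50 mm lens, a surface at 45 mm appears about 450 mm away while one at 48 mm appears about 1200 mm away, which is how switching between two surfaces (or two regions of a tilted surface) yields two apparent depths.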

Recognizing objects in a passable world model in augmented or virtual reality systems
11663789 · 2023-05-30

One embodiment is directed to a system for enabling two or more users to interact within a virtual world comprising virtual world data. The system comprises a computer network of one or more computing devices, each comprising memory, processing circuitry, and software stored at least in part in the memory and executable by the processing circuitry to process at least a portion of the virtual world data. At least a first portion of the virtual world data originates from a first user virtual world local to a first user, and the computer network is operable to transmit the first portion to a user device for presentation to a second user, such that the second user may experience the first portion from the second user's own location and aspects of the first user virtual world are effectively passed to the second user.

Multi-depth exit pupil expander
11662575 · 2023-05-30

An example head-mounted display device includes a light projector and an eyepiece. The eyepiece includes a light guiding layer and a first focusing optical element. The first focusing optical element includes a first region having a first optical power, and a second region having a second optical power different from the first optical power. The light guiding layer is configured to: i) receive light from the light projector, ii) direct at least a first portion of the light to a user's eye through the first region to present a first virtual image to the user at a first focal distance, and iii) direct at least a second portion of the light to the user's eye through the second region to present a second virtual image to the user at a second focal distance.
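The two focal distances follow directly from the two optical powers: collimated light leaving the waveguide through a region of negative power diverges as if from a point 1/|P| metres away. A hedged sketch of that conversion (the diopter values are illustrative assumptions, not taken from the patent):

```python
def focal_distance_m(optical_power_diopters: float) -> float:
    """Virtual-image distance for collimated light passing through a thin
    element of negative optical power P (in diopters): the emerging rays
    diverge as if from a point 1/|P| metres in front of the user's eye."""
    if optical_power_diopters >= 0:
        raise ValueError("a virtual image at finite depth needs negative power")
    return 1.0 / abs(optical_power_diopters)

# e.g. a -2 D region would place one virtual image at 0.5 m,
# while a -0.5 D region would place the other at 2.0 m.
```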

Determining gaze depth using eye tracking functions

A device includes a camera assembly and a controller. The camera assembly is configured to capture images of both eyes of a user. Using the captured images, the controller determines a location for each pupil of each eye of the user. The determined pupil locations and captured images are used to determine eye tracking parameters which are used to compute values of eye tracking functions. With the computed values and a model that maps the eye tracking functions to gaze depths, a gaze depth of the user is determined. An action is performed based on the determined gaze depth.
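The abstract leaves the eye tracking functions and the depth model unspecified; a common, simple stand-in estimates gaze depth from vergence, the angle between the two eyes' gaze rays. The following sketch assumes symmetric fixation on the midline and is purely illustrative, not the patented model:

```python
import math

def gaze_depth_from_vergence(ipd_m: float, vergence_rad: float) -> float:
    """Distance to the fixation point given the interpupillary distance (IPD)
    and the vergence angle between the two gaze rays, assuming the fixation
    point lies on the midline between the eyes."""
    # Each eye rotates inward by half the vergence angle; the right triangle
    # with half the IPD opposite and the gaze depth adjacent gives:
    return (ipd_m / 2.0) / math.tan(vergence_rad / 2.0)
```

A gaze depth estimated this way could then drive an action such as selecting which focal plane of a multi-depth display to render into.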

Display systems and imaging systems with dynamically controllable optical path lengths

An optical subsystem for use in a display system or an imaging system comprises a plurality of reflective surfaces collectively arranged to provide variable control of device-internal path lengths of light traveling to an imaging sensor or to the eye of a viewer. The optical subsystem can be used to present multiple images concurrently at different apparent depths as perceived by the user.

OPTICAL SYSTEM OF AUGMENTED REALITY HEAD-UP DISPLAY

An optical system includes a picture generation unit (PGU), a correcting optical unit configured to create, in a direction of a horizontal field of view (HFoV), a monotonic variation of an optical path length of light rays propagating from the PGU, and a combiner configured to redirect light rays propagating from the correcting optical unit toward an eye box, producing one or more virtual images observable from the eye box. The optical system provides a virtual image surface inclined in the direction of the HFoV for displaying the virtual images. The virtual image surface has a non-zero angle between projections on a horizontal plane defined by a first axis perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, and a second axis parallel to a line of sight and extending from the eye box through the intersection point.