Patent classifications
G02B2027/0134
Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking
A Head-Mounted Display (HMD) system, together with associated techniques for performing accurate and automatic inside-out positional, user body, and environment tracking for virtual or mixed reality, is disclosed. The system uses computer vision methods and data fusion from multiple sensors to achieve real-time tracking. High frame rate and low latency are achieved by performing part of the processing on the HMD itself.
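The abstract above does not specify the fusion algorithm; a common lightweight choice for combining a gyroscope with an accelerometer is a complementary filter. The sketch below is an illustrative one-axis example (the function name, gains, and signature are assumptions, not taken from the patent):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step for a single orientation angle (radians).

    Trusts the integrated gyroscope rate short-term (low noise, drifts)
    and the accelerometer-derived angle long-term (noisy, drift-free).
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Running this once per sensor sample keeps the estimate responsive while bounding gyroscope drift, which matters for the low-latency requirement the abstract mentions.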
Light output system with reflector and lens for highly spatially uniform light output
In some embodiments, optical systems with a reflector and a lens proximate a light output opening of the reflector provide light output with high spatial uniformity and high efficiency. The reflectors are shaped to provide substantially angularly uniform light output and the lens is configured to transform this angularly uniform light output into spatially uniform light output. The light output may be directed into a spatial light modulator, which modulates the light to project an image.
Liquid crystal diffractive devices with nano-scale pattern and methods of manufacturing the same
An optical device includes a liquid crystal layer having a first plurality of liquid crystal molecules arranged in a first pattern and a second plurality of liquid crystal molecules arranged in a second pattern. The first and second patterns are separated from each other by a distance of between about 20 nm and about 100 nm along a longitudinal or a transverse axis of the liquid crystal layer. The first and second pluralities of liquid crystal molecules are configured as first and second grating structures that can redirect light of visible or infrared wavelengths.
Systems and methods for temporarily disabling user control interfaces during attachment of an electronic device
Systems and methods for disabling user control interfaces during attachment of a wearable electronic device to a portion of a user's clothing or an accessory are disclosed. The wearable electronic device can include inertial measurement units (IMUs), optical sources, optical sensors, or electromagnetic sensors. Based on information provided by these sensors, an electrical processing and control system can determine that the device is being grasped and picked up for attachment, or is in the process of being attached, to a portion of the user's clothing or an accessory, and can temporarily disable one or more user control interfaces disposed on the outside of the device.
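The abstract leaves the detection logic unspecified; one minimal heuristic an IMU-based implementation might use is flagging "handling" when the accelerometer magnitude deviates from rest (≈1 g). The function name and threshold below are illustrative assumptions:

```python
import math

GRAVITY = 9.81  # m/s^2; a stationary device reads ~1 g

def is_being_handled(accel_samples, threshold=2.0):
    """Return True if any (ax, ay, az) sample, in m/s^2, deviates
    from gravity by more than `threshold`, suggesting the device is
    being grasped or moved (assumed heuristic, not the patent's)."""
    for ax, ay, az in accel_samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(magnitude - GRAVITY) > threshold:
            return True
    return False
```

A control system could gate the touch interfaces on this flag, re-enabling them once readings settle back near 1 g.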
Three-dimensional display device, three-dimensional display system, head-up display, and movable object
A three-dimensional display device includes a display panel, a barrier panel, and a controller that controls the display panel and the barrier panel. The controller defines multiple first image areas and multiple second image areas in the display panel, spaces the first image areas at first intervals in a first direction, and causes a first image viewable by a first eye of a user to be displayed in the first image areas and a second image viewable by a second eye of the user in the second image areas. The controller also defines, in the barrier panel, multiple first transmissive areas that transmit the image light at a first transmissivity and multiple second transmissive areas that transmit the image light at a second transmissivity, spaces the first transmissive areas at second intervals in the first direction, and performs an irregular process at third intervals in the first direction.
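The abstract does not give the interval geometry, but for a conventional two-view parallax barrier the slit pitch follows from similar triangles: it is slightly under twice the pixel pitch so left- and right-eye pixel columns converge at the viewing distance. The formula below is the standard textbook relation, not taken from the patent:

```python
def barrier_pitch(pixel_pitch_mm, viewing_distance_mm, gap_mm):
    """Slit pitch for a two-view parallax barrier.

    `gap_mm` is the separation between the display panel and the
    barrier panel. Standard geometry: pitch = 2p * d / (d + g).
    """
    return 2.0 * pixel_pitch_mm * viewing_distance_mm / (viewing_distance_mm + gap_mm)
```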
Mixed reality system
A mixed reality direct retinal projector system may include a headset that uses a reflective holographic combiner to direct light from a light engine into an eye box corresponding to a user's eye. The light engine may include light sources coupled to projectors that independently project light to the holographic combiner from different projection points. The light sources may be in a unit separate from the headset that may be carried on a user's hip, or otherwise carried or worn separately from the headset. Each projector may include a collimating and focusing element, an active focusing element, and a two-axis scanning mirror to project light from a respective light source to the holographic combiner. The holographic combiner may be recorded with a series of point-to-point holograms; each projector interacts with multiple holograms to project light onto multiple locations in the eye box.
Methods and systems of automatic calibration for dynamic display configurations
Systems and methods are described for capturing, using a forward-facing camera associated with an augmented reality (AR) head-mounted display (HMD), images of portions of first and second display devices in an environment, where the first and second display devices display first and second portions of content related to an AR presentation, and for displaying a third portion of the content on the AR HMD, the third portion being determined based on the images captured by the forward-facing camera. Moreover, the first and second display devices may be active stereo displays, and the AR HMD may simultaneously function as shutter glasses.
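One simple geometric step such a calibration might perform, once the two displays are detected in the camera image, is computing the region between them where the HMD renders the bridging third portion. The rectangle convention and function below are illustrative assumptions:

```python
def gap_between_displays(rect_a, rect_b):
    """Horizontal gap between two detected display rectangles.

    Rectangles are (x0, y0, x1, y1) in HMD camera coordinates,
    assumed side by side and non-overlapping. Returns the region
    between them (where the HMD could render bridging content).
    """
    left, right = sorted((rect_a, rect_b), key=lambda r: r[0])
    return (left[2], min(left[1], right[1]), right[0], max(left[3], right[3]))
```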
Methods of rendering light field images for integral-imaging-based light field display
A method for rendering light field images of a 3D scene in an HMD using an integral-imaging-based light field display is disclosed. The method includes providing integral imaging (InI) optics including a microdisplay, the InI optics having a central depth plane (CDP) associated therewith; providing an eyepiece in optical communication with the InI optics, the eyepiece and the InI optics together forming InI-HMD optics; sampling the 3D scene using a simulated virtual array of cameras so that each camera captures a respective portion of the 3D scene to create a plurality of elemental images; and displaying the elemental images on the microdisplay.
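The sampling step amounts to placing one virtual camera per elemental image on a regular grid. A minimal sketch of generating such a grid, centered on the optical axis (the pitch value and layout are assumptions for illustration):

```python
def virtual_camera_grid(rows, cols, pitch_mm):
    """(x, y) positions of simulated cameras, one per elemental image,
    centered on the optical axis and spaced by the lenslet pitch."""
    return [((c - (cols - 1) / 2.0) * pitch_mm,
             (r - (rows - 1) / 2.0) * pitch_mm)
            for r in range(rows) for c in range(cols)]
```

Each camera would then render its portion of the 3D scene into the corresponding elemental image tile on the microdisplay.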
Third-party accessible application programming interface for generating 3D symbology
A system that employs a third-party accessible application programming interface (API) to generate symbology for a three-dimensional view is disclosed. In embodiments, the third-party accessible API is running on or configured to communicate with at least one controller for an aircraft display system. The third-party accessible API is configured to receive a set of parameters for generating three-dimensional symbology. The controller is configured to receive the three-dimensional symbology from the third-party accessible API. The controller is further configured to generate a three-dimensional view that includes proprietary symbology and the three-dimensional symbology from the third-party accessible API at a display of the aircraft display system.
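The abstract does not define the API's parameter set; the sketch below shows one plausible shape for it — a request object a third party might submit and a stub that maps it to a drawable symbol description. All names, fields, and the return format are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SymbologyRequest:
    """Hypothetical parameter set a third party might pass to the API."""
    symbol_type: str   # e.g. "waypoint", "traffic"
    label: str
    lat_deg: float
    lon_deg: float
    alt_ft: float

def generate_symbology(req: SymbologyRequest) -> dict:
    """Turn a request into a 3D symbol description that the display
    controller could merge with its proprietary symbology."""
    return {
        "type": req.symbol_type,
        "label": req.label,
        "position": (req.lat_deg, req.lon_deg, req.alt_ft),
    }
```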
Display device, head-up display, and moving body
A display device includes: a first panel including a first image-forming surface for forming a first image visually recognized by a user and a plurality of first pixels; a second panel including a second image-forming surface for forming a second image visually recognized by the user and a plurality of second pixels; and a half-wavelength plate located between the first image-forming surface and the second image-forming surface. The half-wavelength plate has an optical axis and is configured to transmit incident light from either one of the first panel and the second panel and to emit the light toward the other panel. A polarization direction of the light emitted from the half-wavelength plate is determined by the polarization direction of the incident light and the direction of the optical axis.
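The stated dependence of output polarization on input polarization and optical-axis direction follows the standard half-wave plate relation: the incident polarization is mirrored about the optical axis. A small sketch of that relation (in degrees, for an ideal plate):

```python
def half_wave_output_angle(incident_deg, axis_deg):
    """Polarization angle after an ideal half-wave plate.

    The incident linear polarization is reflected about the plate's
    optical axis, so the output angle is 2*axis - input (mod 180).
    """
    return (2.0 * axis_deg - incident_deg) % 180.0
```

For example, with the optical axis at 45 degrees, horizontally polarized light (0 degrees) exits vertically polarized (90 degrees).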