G02B27/00

INFRARED PROJECTION LENS ASSEMBLY AND INFRARED PROJECTION MODULE
20230052783 · 2023-02-16 ·

An infrared projection lens assembly includes, in order from the image side to the image source side: a stop, a first lens with positive refractive power, a second lens with refractive power, and a third lens with positive refractive power, wherein a radius of curvature of an image-side surface of the first lens is R1, half of a maximum view angle (field of view) of the infrared projection lens assembly is HFOV, a focal length of the infrared projection lens assembly is f, and the following condition is satisfied: −18.2 < R1/(sin(HFOV)·f) < −1.53, so as to provide projected light at a large angle.
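The stated design condition can be checked directly. A minimal sketch, with hypothetical design values (the abstract gives only the inequality, not any concrete radius or focal length):

```python
import math

def check_r1_condition(r1_mm: float, hfov_deg: float, f_mm: float) -> bool:
    """Check the abstract's design condition: -18.2 < R1 / (sin(HFOV) * f) < -1.53.

    r1_mm: radius of curvature of the first lens's image-side surface (R1)
    hfov_deg: half of the maximum field of view (HFOV), in degrees
    f_mm: focal length of the lens assembly (f)
    """
    ratio = r1_mm / (math.sin(math.radians(hfov_deg)) * f_mm)
    return -18.2 < ratio < -1.53

# Hypothetical values for illustration only:
print(check_r1_condition(r1_mm=-2.5, hfov_deg=60.0, f_mm=1.0))  # True: ratio ≈ -2.89
```

Note that the condition can only be satisfied when R1 and f have opposite signs, i.e. the first lens's image-side surface is concave toward the image side for a positive focal length.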

WORLD LOCK SPATIAL AUDIO PROCESSING
20230046341 · 2023-02-16 ·

A method for providing a world-locked experience to a user of a headset in an immersive reality application includes receiving, from an immersive reality application, a first audio waveform from a first acoustic source to provide to a user of a headset, determining a direction of arrival for the first acoustic source relative to the headset, and providing, to a speaker in the headset, an audio signal including the first audio waveform and intended for an ear of the user of the headset, wherein the audio signal includes a time delay and an amplitude for the first audio waveform based on the direction of arrival for the first acoustic source relative to the user of the headset. A non-transitory, computer-readable medium storing instructions which, when executed by a processor, cause a system to perform the above method, and the system, are also provided.
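The abstract only states that the per-ear audio signal carries a time delay and an amplitude derived from the direction of arrival. A toy sketch of that idea, substituting a Woodworth-style spherical-head interaural time difference and a simple cosine shadowing gain (both illustrative stand-ins, not the patent's model; the head radius and gain constants are assumptions):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
HEAD_RADIUS = 0.0875    # m; nominal spherical-head radius (assumption)

def ear_delay_and_gain(azimuth_deg: float, ear: str) -> tuple[float, float]:
    """Illustrative per-ear time delay (s) and amplitude gain for a source.

    azimuth_deg: direction of arrival relative to the head; 0 = straight
                 ahead, +90 = toward the left ear. Valid for |azimuth| <= 90.
    ear: "left" or "right".
    """
    az = math.radians(azimuth_deg)
    ear_az = math.pi / 2 if ear == "left" else -math.pi / 2
    # Interaural time difference: the ear facing away receives the wave later.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (abs(az) + abs(math.sin(az)))
    near_ear = "left" if azimuth_deg >= 0 else "right"
    delay = 0.0 if ear == near_ear else itd
    # Interaural level difference proxy: attenuate with angular distance to the ear.
    gain = 0.6 + 0.4 * math.cos(az - ear_az)
    return delay, gain

left = ear_delay_and_gain(45.0, "left")    # near ear: zero delay, higher gain
right = ear_delay_and_gain(45.0, "right")  # far ear: delayed and attenuated
```

Applying the same delay/gain pair to the waveform before playback is what keeps the source "world-locked" as the head turns, since the direction of arrival is recomputed relative to the headset.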

METHOD AND SYSTEM FOR GAZE-BASED CONTROL OF MIXED REALITY CONTENT
20230048185 · 2023-02-16 ·

Systems and methods are presented for discovering and positioning content into augmented reality space. A method includes forming a three-dimensional (3D) map of surroundings of a user of an augmented reality (AR) head mounted display (HMD); determining a depth-wise location of a gaze point of a user based on eye gaze direction and eye vergence; determining a visual guidance line pathway in the 3D map; guiding an action of the user along the visual guidance line pathway at one or more identified focal points; and rendering a mixed reality (MR) object along the visual guidance line pathway at a location corresponding to a direction of the user’s gaze.
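The depth-wise gaze location from eye vergence can be illustrated with a common geometric model (an assumption here, not necessarily the patent's): the two visual axes form an isosceles triangle over the interpupillary distance, so the fixation depth follows from the convergence angle.

```python
import math

def gaze_depth_from_vergence(ipd_m: float, vergence_deg: float) -> float:
    """Estimate the depth-wise gaze distance from eye vergence.

    Assumes symmetric convergence:
        depth = (IPD / 2) / tan(vergence / 2)
    ipd_m: interpupillary distance in metres
    vergence_deg: total convergence angle between the two visual axes
    """
    return (ipd_m / 2.0) / math.tan(math.radians(vergence_deg) / 2.0)

# Example: a 63 mm IPD and a 3.6° vergence angle give roughly a 1 m fixation depth.
depth = gaze_depth_from_vergence(0.063, 3.6)
```

Larger vergence angles map to nearer fixation depths, which is what lets the system place the MR object at the depth the user is actually looking at along the guidance line.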

SYSTEM AND METHOD FOR ENHANCING VISUAL ACUITY

A head-wearable display system includes a target object detection module receiving multiple image pixels of a first portion and a second portion of a target object, together with the corresponding depths; a first light emitter emitting multiple first-eye light signals to display a first-eye virtual image of the first portion and the second portion of the target object for a viewer; a first light direction modifier for respectively varying a light direction of each of the multiple first-eye light signals emitted from the first light emitter; a first collimator; and a first combiner for redirecting and converging the multiple first-eye light signals towards a first eye of the viewer. The first-eye virtual image of the first portion of the target object, in a first field of view, has a greater number of the multiple first-eye light signals per degree than the first-eye virtual image of the second portion of the target object, in a second field of view.

Apparatuses, Methods and Computer Programs for Controlling a Microscope System
20230046644 · 2023-02-16 ·

Examples relate to apparatuses, methods and computer programs for controlling a microscope system, and to a corresponding microscope system. An apparatus for controlling a microscope system comprises an interface for communicating with a camera module. The camera module is suitable for providing camera image data of a head of a user of the microscope system. The apparatus comprises a processing module configured to obtain the camera image data from the camera module via the interface. The processing module is configured to process the camera image data to determine information on an angular orientation of the head of the user relative to a display of the microscope system. The processing module is configured to provide a control signal for a robotic adjustment system of the microscope system based on the information on the angular orientation of the head of the user.
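The abstract leaves the mapping from head orientation to control signal unspecified. A minimal sketch of one plausible reading, assuming a dead-banded proportional controller that steers the robotic adjustment system to re-centre the display toward the user's head (the dead-band and gain values are invented for illustration):

```python
def control_signal(head_yaw_deg: float, head_pitch_deg: float,
                   deadband_deg: float = 5.0, gain: float = 0.5) -> dict:
    """Toy proportional control law for the robotic adjustment system.

    head_yaw_deg / head_pitch_deg: angular orientation of the user's head
    relative to the display normal, as determined from the camera image data.
    Returns commanded angular rates for the adjustment system.
    """
    def axis(err_deg: float) -> float:
        if abs(err_deg) <= deadband_deg:
            return 0.0           # small misalignments are ignored
        return gain * err_deg    # otherwise steer proportionally toward the head

    return {"yaw_rate": axis(head_yaw_deg), "pitch_rate": axis(head_pitch_deg)}
```

The dead-band keeps the display still during small, incidental head movements, so the robot only repositions when the user deliberately shifts posture.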

OPTICAL ELEMENT FOR INFLUENCING LIGHT DIRECTIONS, ARRANGEMENT FOR IMAGING A MULTIPLICITY OF ILLUMINATED OR SELF-LUMINOUS SURFACES, AND ILLUMINATION DEVICE
20230047322 · 2023-02-16 ·

An optical element including a plate-shaped substrate with a light-entrance surface and a light-exit surface, a multiplicity of imaging elements formed on the light-exit surface and a multiplicity of diaphragms formed on the light-entrance surface. Each diaphragm includes a transparent geometric region in an opaque region. The optical element can be switched between two operating modes B1 and B2 such that some of the imaging elements change their focal length between values f1 and f2 and/or, some of the diaphragms change their aperture width and/or their position. Exactly one diaphragm is associated with each imaging element in mode B1 so that light passing through the diaphragm is imaged or collimated by the associated imaging element. Consequently, light arriving in the optical element through the diaphragms and then through the light-entrance surface has, after passing through the associated imaging elements in the two operating modes B1 and B2, different propagation angles.

ZOOM OPTICAL SYSTEM, OPTICAL APPARATUS AND METHOD FOR MANUFACTURING THE ZOOM OPTICAL SYSTEM
20230048508 · 2023-02-16 ·

A variable magnification optical system (ZL) comprises a front group (GA) and a rear group (GB). The rear group (GB) has a first focusing lens group (GF1) and a second focusing lens group (GF2). From focusing on an object at infinity to focusing on a short-distance object, the front group (GA) is fixed with respect to an image surface, and the first focusing lens group (GF1) and the second focusing lens group (GF2) move on different trajectories along an optical axis. The variable magnification optical system satisfies the following conditional expressions:

0.25 < βF1t/βF1w < 2.00
0.25 < βF2w/βF2t < 2.00

where βF1t is the magnification of the first focusing lens group (GF1) in a telephoto end state, βF1w is the magnification of the first focusing lens group (GF1) in a wide-angle end state, βF2t is the magnification of the second focusing lens group (GF2) in the telephoto end state, and βF2w is the magnification of the second focusing lens group (GF2) in the wide-angle end state.
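Both conditional expressions can be verified numerically. A small sketch (the magnification values in the example are hypothetical; the abstract specifies only the bounds):

```python
def satisfies_focus_group_conditions(beta_f1t: float, beta_f1w: float,
                                     beta_f2t: float, beta_f2w: float) -> bool:
    """Check the two conditional expressions from the abstract:
        0.25 < βF1t/βF1w < 2.00
        0.25 < βF2w/βF2t < 2.00
    β values are the focusing-group magnifications at the telephoto (t)
    and wide-angle (w) end states.
    """
    r1 = beta_f1t / beta_f1w
    r2 = beta_f2w / beta_f2t
    return 0.25 < r1 < 2.00 and 0.25 < r2 < 2.00

# Hypothetical design: ratios of 1.2 and 0.67 satisfy both bounds.
ok = satisfies_focus_group_conditions(1.2, 1.0, 1.5, 1.0)
```

Note the asymmetry: the first condition is telephoto over wide-angle, while the second is wide-angle over telephoto, so the two ratios are formed in opposite directions.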

Light field near-eye display and method thereof for generating virtual reality images
20230045962 · 2023-02-16 ·

A method for generating virtual reality images in a light field near-eye display includes the steps of: shifting a display image according to at least one change vector of a plurality of eye-movement parameters; and calculating a compensation mask according to a simulated image and superimposing the compensation mask on a target image to generate a superimposed target image, wherein the brightness distributions of the simulated image and the compensation mask are opposite to each other. The light field near-eye display is also provided. In this way, the light field near-eye display and its method for generating virtual reality images improve the uniformity of the image and expand the eye box size.
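One simple reading of "opposite brightness distributions" (an assumption; the abstract does not define the mask computation) is that the mask is an inverted copy of the simulated image's brightness, added to the target so that regions the display renders dark are pre-brightened:

```python
import numpy as np

def superimpose_compensation_mask(target: np.ndarray,
                                  simulated: np.ndarray) -> np.ndarray:
    """Sketch of the compensation-mask step for a light-field near-eye display.

    target / simulated: float images in [0, 1] of the same shape; `simulated`
    models the display's (non-uniform) rendering of a uniform input.
    """
    # Brightness distribution opposite to the simulated image: bright where
    # the simulated image is dark, and vice versa.
    mask = simulated.max() - simulated
    superimposed = target + mask
    # Renormalize into displayable range.
    return np.clip(superimposed / superimposed.max(), 0.0, 1.0)
```

Under this reading the mask acts as a flat-field pre-compensation: the summed image counteracts the display's non-uniform response, which is consistent with the abstract's stated goal of improving image uniformity.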

HEAD-MOUNTED DEVICE
20230048991 · 2023-02-16 ·

A head-mounted device includes a first light field camera, a second light field camera, a first light field display, a second light field display and a supporting structure. Each of the first light field camera and the second light field camera includes, in order from an object side to an image side, a lens group, a collimator and an image sensor. Each of the lens groups includes a plurality of lens units. The lens units are arranged in a two-dimensional lens array, and each of the lens units includes a lens container and a plurality of lens elements. A first engaging structure is disposed between at least two adjacent lens elements of the lens elements.

XR RENDERING FOR 3D AUDIO CONTENT AND AUDIO CODEC
20230051841 · 2023-02-16 ·

A device includes a memory configured to store instructions and also includes one or more processors configured to execute the instructions to obtain audio data corresponding to a sound source and metadata indicative of a direction of the sound source. The one or more processors are configured to execute the instructions to obtain direction data indicating a viewing direction associated with a user of a playback device. The one or more processors are configured to execute the instructions to determine a resolution setting based on a similarity between the viewing direction and the direction of the sound source. The one or more processors are also configured to execute the instructions to process the audio data based on the resolution setting to generate processed audio data.
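The similarity-driven resolution selection can be sketched with cosine similarity between the two directions (the similarity measure, threshold, and the two resolution levels are all assumptions; the abstract does not say what the resolution setting controls, though spherical-harmonic order or filter length are possibilities):

```python
import math

def resolution_setting(view_dir: tuple, source_dir: tuple,
                       high: int = 48, low: int = 12) -> int:
    """Pick an audio-rendering resolution from direction similarity.

    view_dir: the user's viewing direction as a 3-vector
    source_dir: the sound source's direction (from metadata) as a 3-vector
    Returns the higher setting when the source lies near the viewing direction.
    """
    dot = sum(v * s for v, s in zip(view_dir, source_dir))
    norm = (math.sqrt(sum(v * v for v in view_dir))
            * math.sqrt(sum(s * s for s in source_dir)))
    similarity = dot / norm  # cosine similarity in [-1, 1]
    # Sources the user is facing get the higher-fidelity processing.
    return high if similarity > 0.5 else low
```

The point of the scheme is a compute/quality trade-off: audio from sources the user is looking at is processed at full fidelity, while off-axis or behind-the-head sources can be rendered more cheaply.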