G02B27/017

IMAGE BASED DETECTION OF FIT FOR A HEAD MOUNTED WEARABLE COMPUTING DEVICE

A system and method of detecting display fit measurements and/or ophthalmic measurements for a head mounted wearable computing device including a display device are provided. An image of a fitting frame worn by the user is captured by the user through an application running on the computing device. One or more keypoints, features, and/or landmarks are detected in the image including the fitting frame. A three-dimensional pose of the fitting frame is determined based on the detected keypoints, features, and/or landmarks and on configuration information associated with the fitting frame. The display device of the head mounted wearable computing device can then be configured based on the three-dimensional pose of the fitting frame as captured in the image.
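The abstract does not specify how the three-dimensional pose is computed from the detected landmarks. As an illustrative sketch only (not the patented method), assuming the fitting frame's landmark coordinates are known from its configuration information and 3D estimates of the detected landmarks are available, a rigid pose can be recovered with the Kabsch algorithm; `estimate_frame_pose` and its inputs are hypothetical names:

```python
import numpy as np

def estimate_frame_pose(model_pts, observed_pts):
    """Rigid pose (R, t) aligning known fitting-frame landmark coordinates
    (model_pts, Nx3) to their observed 3D positions (observed_pts, Nx3),
    via the Kabsch algorithm: R @ model + t ~= observed."""
    mc = model_pts.mean(axis=0)          # model centroid
    oc = observed_pts.mean(axis=0)       # observed centroid
    H = (model_pts - mc).T @ (observed_pts - oc)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```

In practice a single 2D image would first require a perspective-n-point step (e.g., as offered by computer-vision libraries) to obtain the 3D landmark estimates assumed above.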

Force-Based Peripheral Control Toggle
20230046333 · 2023-02-16

A smartwatch is configured to control a peripheral device. The smartwatch can be toggled between its normal operating mode and a mode for controlling the peripheral device. The toggling function is bound to a virtual button. Force applied to the virtual button is measured so that multiple functions, including the toggling function, can be accessed by applying different amounts of force to the button for a predetermined amount of time.
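One way to read the force-plus-duration gating is as a small classifier over a stream of force samples. The thresholds, sample rate, and function names below are hypothetical, chosen only to illustrate the idea:

```python
TOGGLE_FORCE_N = 3.0   # hypothetical force threshold (newtons)
HOLD_S = 0.5           # hypothetical required hold duration (seconds)

def classify_press(samples, dt):
    """Classify a stream of force readings sampled every dt seconds.

    Returns "toggle_mode" if the force stays at or above the threshold
    for the hold duration, "tap" for any lighter touch, else "none".
    """
    hold = 0.0
    for f in samples:
        if f >= TOGGLE_FORCE_N:
            hold += dt
        else:
            hold = 0.0          # threshold not sustained: restart the timer
        if hold >= HOLD_S:
            return "toggle_mode"
    return "tap" if max(samples) > 0 else "none"
```

A real implementation would add debouncing and further force bands for the other functions mentioned in the abstract.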

WORLD LOCK SPATIAL AUDIO PROCESSING
20230046341 · 2023-02-16

A method for providing a world-locked experience to a user of a headset in an immersive reality application includes receiving, from the application, a first audio waveform from a first acoustic source; determining a direction of arrival of the first acoustic source relative to the headset; and providing, to a speaker in the headset, an audio signal including the first audio waveform and intended for an ear of the user, wherein the audio signal includes a time delay and an amplitude for the first audio waveform based on the direction of arrival of the first acoustic source relative to the user. A non-transitory, computer-readable medium storing instructions which, when executed by a processor, cause a system to perform the above method, and the system itself, are also provided.
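The per-ear time delay and amplitude can be sketched with a standard spherical-head (Woodworth) interaural-time-difference model and a crude head-shadow gain. The constants and names below are assumptions for illustration, not the patent's parameters:

```python
import math

HEAD_RADIUS = 0.0875     # m, nominal spherical-head radius (assumed)
SPEED_OF_SOUND = 343.0   # m/s

def interaural_delay(azimuth_rad):
    """Woodworth approximation of the interaural time difference (seconds)."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

def render_ear(azimuth_rad):
    """Per-ear (delay_s, gain) for a source at the given azimuth.

    Positive azimuth = source to the listener's right; the far ear
    receives a delayed, attenuated copy of the waveform.
    """
    itd = interaural_delay(abs(azimuth_rad))
    near, far = ("right", "left") if azimuth_rad > 0 else ("left", "right")
    gain_far = 0.5 * (1.0 + math.cos(azimuth_rad))  # crude head-shadow model
    return {near: (0.0, 1.0), far: (itd, gain_far)}
```

A production renderer would instead apply measured head-related transfer functions, but the delay/gain pair above captures the two cues the abstract names.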

METHOD AND SYSTEM FOR GAZE-BASED CONTROL OF MIXED REALITY CONTENT
20230048185 · 2023-02-16

Systems and methods are presented for discovering and positioning content in augmented reality space. A method includes forming a three-dimensional (3D) map of the surroundings of a user of an augmented reality (AR) head mounted display (HMD); determining a depth-wise location of a gaze point of the user based on eye gaze direction and eye vergence; determining a visual guidance line pathway in the 3D map; guiding an action of the user along the visual guidance line pathway at one or more identified focal points; and rendering a mixed reality (MR) object along the visual guidance line pathway at a location corresponding to the direction of the user’s gaze.
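Recovering the depth-wise gaze location from vergence can be illustrated with simple triangulation over an assumed interpupillary distance; `gaze_depth` and `IPD_M` are hypothetical names, not the patent's:

```python
import math

IPD_M = 0.063  # assumed interpupillary distance (meters)

def gaze_depth(left_dir, right_dir):
    """Depth of the gaze point from the vergence angle between the
    two eyes' gaze direction vectors (3-tuples)."""
    dot = sum(a * b for a, b in zip(left_dir, right_dir))
    nl = math.sqrt(sum(a * a for a in left_dir))
    nr = math.sqrt(sum(b * b for b in right_dir))
    verg = math.acos(max(-1.0, min(1.0, dot / (nl * nr))))
    if verg < 1e-6:
        return float("inf")  # gaze rays parallel: looking far away
    # Symmetric-convergence triangle: half the IPD over the half-angle.
    return (IPD_M / 2.0) / math.tan(verg / 2.0)
```

For example, eyes converging on a point 1 m ahead have gaze directions of roughly (±0.0315, 0, 1), and the formula returns 1.0 m.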

DEVICES AND METHODS FOR AUTOMATED DELIVERY OF OPHTHALMOLOGICAL MEDICATIONS
20230051700 · 2023-02-16

Disclosed are various embodiments of an apparatus for instillation of eye drops, an automatic dispensing device, and a method to dispense eye drops. The apparatus includes a dispensing device, a control system, and a housing. The dispensing device is configured to dispense a dosage of a fluid medication from an eye drop bottle having an opening. The control system is operatively connected to the dispensing device. The housing comprises a main wearable headset and a cover, is configured to contain the dispensing device and the control system, and is configured to allow passage of the dosage of the fluid medication through an aperture in the main wearable headset. Also disclosed is an automatic dispensing device comprising a motor and a detection system to dispense a fluid from a bottle.

XR RENDERING FOR 3D AUDIO CONTENT AND AUDIO CODEC
20230051841 · 2023-02-16

A device includes a memory configured to store instructions and also includes one or more processors configured to execute the instructions to obtain audio data corresponding to a sound source and metadata indicative of a direction of the sound source. The one or more processors are configured to execute the instructions to obtain direction data indicating a viewing direction associated with a user of a playback device. The one or more processors are configured to execute the instructions to determine a resolution setting based on a similarity between the viewing direction and the direction of the sound source. The one or more processors are also configured to execute the instructions to process the audio data based on the resolution setting to generate processed audio data.
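The similarity-based resolution setting can be sketched as cosine similarity between the viewing direction and the source direction, mapped through thresholds; the level names and cutoffs below are assumptions for illustration:

```python
import math

def resolution_setting(view_dir, source_dir,
                       levels=(("high", 0.94), ("medium", 0.5))):
    """Pick an audio-processing resolution level from the angular
    similarity between the user's viewing direction and the sound
    source direction (both 3-tuples, any magnitude)."""
    dot = sum(a * b for a, b in zip(view_dir, source_dir))
    nv = math.sqrt(sum(a * a for a in view_dir))
    ns = math.sqrt(sum(b * b for b in source_dir))
    sim = dot / (nv * ns)            # cosine similarity in [-1, 1]
    for name, threshold in levels:   # ordered from most to least similar
        if sim >= threshold:
            return name
    return "low"
```

Sources near the viewing direction are rendered at high resolution, while off-axis sources can be processed more cheaply.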

DISPLAY DEVICE AND ELECTRONIC DEVICE INCLUDING THE SAME
20230048041 · 2023-02-16

Disclosed is a display device including a display panel that includes a plurality of pixels and a display area displaying an image; a panel controller that receives an external input signal from an external source and, based on the external input signal, generates a control signal for dividing the display area into a first area and a second area disposed adjacent to the first area; and an instrument module that stretches the first area and the second area of the display panel in response to the control signal. The number of pixels per unit area in the first area is different from the number of pixels per unit area in the second area.

FULL-FIELD METROLOGY TOOL FOR WAVEGUIDE COMBINERS AND META-SURFACES
20230046330 · 2023-02-16

Embodiments described herein provide for metrology tools and methods of obtaining a full-field optical field of an optical device to determine multiple metrology metrics of the optical device. A metrology tool is utilized to split a light beam into a first light path and a second light path. The first light path and the second light path are combined into a combined light beam and delivered to a detector. The detector measures the intensity of the combined light beam. A first equation and a second equation are utilized in combination with the intensity measurements to determine an amplitude and a phase Ψ at a reference point directly adjacent to a second surface of the optical device.
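The abstract leaves the two equations unspecified. Standard four-step phase-shifting interferometry recovers amplitude-related modulation and phase from intensity measurements in the same spirit, and is shown here purely as an illustration, not as the patent's equations:

```python
import math

def phase_from_four_steps(i0, i90, i180, i270):
    """Four-step phase-shifting interferometry.

    Assumes intensities follow I_k = A + B*cos(phi + delta_k) with
    reference-arm phase shifts delta_k of 0, 90, 180, and 270 degrees.
    Returns (phi, B): the interference phase and modulation amplitude.
    """
    # I270 - I90 = 2*B*sin(phi);  I0 - I180 = 2*B*cos(phi)
    phase = math.atan2(i270 - i90, i0 - i180)
    modulation = 0.5 * math.sqrt((i270 - i90) ** 2 + (i0 - i180) ** 2)
    return phase, modulation
```

Applying this per pixel over the combined beam yields the full-field phase map from which metrics of the waveguide combiner or meta-surface could be derived.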

Metasurfaces with light-redirecting structures including multiple materials and methods for fabricating

Display devices include waveguides with metasurfaces as in-coupling and/or out-coupling optical elements. The metasurfaces may be formed on a surface of the waveguide and may include a plurality or an array of sub-wavelength-scale (e.g., nanometer-scale) protrusions. Individual protrusions may include horizontal and/or vertical layers of different materials which may have different refractive indices, allowing for enhanced manipulation of light redirecting properties of the metasurface. Some configurations and combinations of materials may advantageously allow for broadband metasurfaces. Manufacturing methods described herein provide for vertical and/or horizontal layers of different materials in a desired configuration or profile.

Visual-inertial tracking using rolling shutter cameras

Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera, and an image of an environment is captured. The image includes feature points captured at particular capture times. A number of poses for the rolling shutter camera are computed based on the initial pose and the sensed movement of the eyewear device; the number of computed poses is responsive to the sensed movement. A computed pose is selected for each feature point in the image by matching the capture time of the feature point to the time of the computed pose. The position of the eyewear device within the environment is then determined using the feature points and their selected computed poses.
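The per-feature pose selection can be sketched by mapping each feature point's image row to a capture time within the rolling-shutter readout and picking the nearest computed pose. All names and the linear row-to-time model below are assumptions for illustration:

```python
def row_capture_time(frame_start, readout_s, row, n_rows):
    """Capture time of an image row, assuming rows are read out
    linearly over readout_s seconds starting at frame_start."""
    return frame_start + readout_s * (row / (n_rows - 1))

def select_pose(feature_row, n_rows, frame_start, readout_s, timed_poses):
    """timed_poses: list of (timestamp, pose) computed from the initial
    pose plus inertially sensed motion. Returns the pose whose timestamp
    is closest to the feature's row capture time."""
    t = row_capture_time(frame_start, readout_s, feature_row, n_rows)
    return min(timed_poses, key=lambda tp: abs(tp[0] - t))[1]
```

With fast device motion, more poses would be computed across the readout interval, tightening the match between each feature's capture time and its selected pose.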