G02B27/017

Reducing head mounted display power consumption and heat generation through predictive rendering of content

Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
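The abstract above describes a three-step pipeline: identify eye tracking information, predict a change in the user's ability to visually process an element, then render selectively. A minimal sketch of that idea, assuming (hypothetically) that gaze velocity is the tracked signal and that saccadic suppression is the predicted processing change; the threshold value and function names are illustrative, not taken from the patent:

```python
# Illustrative threshold: gaze velocities above this suggest a saccade,
# during which visual processing is suppressed (value is an assumption).
SACCADE_VELOCITY_THRESHOLD = 300.0  # degrees per second

def predict_can_process(gaze_velocity_deg_s: float) -> bool:
    """Predict whether the user will be able to visually process
    new content at the future time, based on current gaze velocity."""
    return gaze_velocity_deg_s < SACCADE_VELOCITY_THRESHOLD

def selectively_render(element: str, gaze_velocity_deg_s: float) -> str:
    """Render the AR element only when the user is predicted able to
    see it; skipping the render saves display power and reduces heat."""
    if predict_can_process(gaze_velocity_deg_s):
        return f"render:{element}"
    return "skip"
```

For example, a slow, fixating gaze (50 deg/s) would render the element, while a fast saccadic gaze (500 deg/s) would skip it.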

Pixel intensity modulation using modifying gain values

A visual perception device has a look-up table stored in a laser driver chip. The look-up table includes relational gain data to compensate for brighter areas of a laser pattern, where pixels are located more closely together than in areas where the pixels are farther apart, and to compensate for differences in intensity of individual pixels when pixel intensities are altered by design characteristics of an eyepiece.
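The gain compensation described can be sketched as follows, assuming a simple model in which local pixel density of the scan pattern is known and brightness scales with density; the inverse-density gain rule and all names here are assumptions for illustration, not details from the patent:

```python
def build_gain_lut(pixel_densities: list[float]) -> list[float]:
    """Build a look-up table of gain values. Denser regions of the
    laser scan pattern appear brighter, so each region's gain is
    inversely proportional to its pixel density."""
    max_density = max(pixel_densities)
    return [max_density / d for d in pixel_densities]

def apply_gain(intensities: list[float], lut: list[float]) -> list[float]:
    """Modulate per-pixel intensity by the corresponding LUT gain."""
    return [i * g for i, g in zip(intensities, lut)]
```

With densities [1.0, 2.0, 4.0], the gains come out as [4.0, 2.0, 1.0], so the densest (brightest) region is attenuated relative to the sparsest.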

Method and device for carrying out eye gaze mapping

The invention relates to a device and a method for performing an eye gaze mapping (M), in which at least one point of vision (B) and/or a viewing direction of at least one person (10) in relation to at least one scene recording (S) of a scene (12) viewed by the at least one person (10) is mapped onto a reference (R). At least a part of an algorithm (A1, A2, A3) for performing the eye gaze mapping (M) is thereby selected from multiple predetermined algorithms (A1, A2, A3) as a function of at least one parameter (P), and the eye gaze mapping (M) is performed on the basis of the at least one part of the algorithm (A1, A2, A3).

Adjustable waveguide assembly and augmented reality eyewear with adjustable waveguide assembly

An adjustable frame assembly for augmented reality eyewear. The frame assembly includes a face portion for supporting at least one waveguide that creates an eye box, a support rest for supporting the face portion on a user, and a coupling for adjusting the position of the face portion relative to the support rest. This enables movement of the waveguide eye box relative to the support rest to position the eye box in front of the wearer's eyes.

Systems and methods for controlling virtual scene perspective via physical touch input

Systems, methods, and non-transitory computer-readable media for controlling perspective in an extended reality environment are disclosed. In one embodiment, a non-transitory computer-readable medium contains instructions to cause a processor to perform the steps of: outputting for presentation via a wearable extended reality appliance (WER-appliance), first display signals reflective of a first perspective of a scene; receiving first input signals caused by a first multi-finger interaction with a touch sensor; in response, outputting for presentation via the WER-appliance second display signals to modify the first perspective of the scene, causing a second perspective of the scene to be presented via the WER-appliance; receiving second input signals caused by a second multi-finger interaction with the touch sensor; and in response, outputting for presentation via the WER-appliance third display signals to modify the second perspective of the scene, causing a third perspective of the scene to be presented via the WER-appliance.

Measurement method and system
11579442 · 2023-02-14

Methods and systems for determining an individual gaze value are disclosed herein. An exemplary method involves: (a) receiving gaze data for a first wearable computing device, wherein the gaze data is indicative of a wearer-view associated with the first wearable computing device, and wherein the first wearable computing device is associated with a first user-account; (b) analyzing the gaze data from the first wearable computing device to detect one or more occurrences of one or more advertisement spaces in the gaze data; (c) based at least in part on the one or more detected advertisement-space occurrences, determining an individual gaze value for the first user-account; and (d) sending a gaze-value indication, wherein the gaze-value indication indicates the individual gaze value for the first user-account.
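Steps (b) and (c) of the exemplary method can be sketched as follows. This is a hedged illustration only: the detection of advertisement spaces is reduced to set membership, and the per-occurrence weighting is an assumed model, not one specified in the abstract:

```python
def count_ad_occurrences(gaze_frames: list[str], ad_space_ids: set[str]) -> int:
    """Step (b): analyze wearer-view gaze data to detect occurrences
    of known advertisement spaces (here, simple membership checks)."""
    return sum(1 for frame in gaze_frames if frame in ad_space_ids)

def individual_gaze_value(gaze_frames: list[str],
                          ad_space_ids: set[str],
                          value_per_occurrence: float = 0.01) -> float:
    """Step (c): determine an individual gaze value for the user-account
    based on the detected advertisement-space occurrences."""
    return count_ad_occurrences(gaze_frames, ad_space_ids) * value_per_occurrence
```

A gaze-value indication (step (d)) would then simply transmit the returned number in association with the first user-account.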

INFORMATION PROCESSING DEVICE THAT DISPLAYS A VIRTUAL OBJECT RELATIVE TO REAL SPACE
20230041942 · 2023-02-09

An information processing device including a display unit, a detector, and a first control unit, and a method of using the same. The display unit may be a head-mounted display and is capable of providing the user with a field of view of a real space and a virtual object. The detector detects an azimuth of the display unit around at least one axis, and display of the virtual object is controlled based on the detected azimuth.

INTERACTION PERIPHERAL, DETECTION METHOD, VIRTUAL REALITY HEADSET, METHOD FOR REPRODUCING A REAL POINT IN VIRTUAL SPACE, DEVICE AND METHOD FOR VIRTUALISING A REAL SPACE
20230042781 · 2023-02-09

An interaction peripheral, a method for detecting a real point, a virtual reality headset, a method for reproducing a real point in virtual space, and a device and a method for virtualising a real space, particularly allowing a plane to be obtained in two, three or n dimensions of a real space which may be reproducible in virtual reality. An interaction peripheral, which can be connected to a virtual reality headset, includes a range finder which can supply, to the headset, a measurement signal including a relative position measurement of a real point of a real space, the real point being sighted by the range finder. The measurement signal enables reproduction of the measured real point in a virtual space generated by the headset. Thus, the real point can be reproduced in virtual space while reducing the risk of errors, because the measurement tools are simple interaction peripherals handled by a user.

ULTRASOUND DEVICES FOR MAKING EYE MEASUREMENTS
20230043585 · 2023-02-09

The disclosed ultrasound devices may include at least one ultrasound transmitter positioned and configured to transmit ultrasound signals toward a user's face to reflect off a facial feature of the user's face and at least one ultrasound receiver positioned and configured to receive and detect the ultrasound signals reflected off the facial feature. At least one processor may be configured to receive data from the at least one ultrasound receiver and to determine, based on the received data from the at least one ultrasound receiver, at least one of the following eye measurements: an interpupillary distance of the user; an eye relief; or a position of a head-mounted display relative to the facial feature of the user. Various other devices, systems, and methods are also disclosed.

Pilot and passenger seat

The present invention achieves technical advantages as pilot and passenger seating. An aircraft employs a pilot seat comprising a contoured structure having ergonomically formed and padded surfaces, with left and right arm supports that include an articulated control knob, movable along three orthogonal axes and rotatable about a vertical axis to provide one or more aircraft steering functions, and a touch-sensitive control surface for controlling one or more power system components. A passenger seat comprises a contoured structure having ergonomically formed and padded surfaces, a headrest, a seat, and left and right support members adapted to cradle a portion of a passenger's body to support the passenger during travel.