G02B27/0093

Gaze tracking apparatus and systems

A system configured to perform an eye tracking process using a head-mountable eye-tracking arrangement. The system comprises: an eye tracking unit, located on the head-mountable arrangement, operable to detect motion of one or both of the user's eyes; a relative motion identification unit operable to identify motion of the head-mountable arrangement relative to the user's head; and a correction unit operable to determine a correction to the eye tracking process in dependence upon the identified motion of the head-mountable arrangement relative to the user's head.
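As a rough illustration of the correction step, the sketch below subtracts the angular error implied by a detected headset slip from a raw gaze estimate under a small-angle model. The function name `correct_gaze`, the (x, y) slip representation, and the eye-to-camera distance are all hypothetical choices for illustration, not details taken from the patent.

```python
import numpy as np

def correct_gaze(raw_gaze_deg, headset_offset_mm, eye_to_camera_mm=30.0):
    """Apply a small-angle correction to a raw gaze estimate.

    raw_gaze_deg: (x, y) gaze angles from the eye tracker, in degrees.
    headset_offset_mm: (dx, dy) detected slip of the headset relative
        to the user's head, in millimetres.
    eye_to_camera_mm: assumed eye-to-camera distance (hypothetical).
    """
    # A lateral slip of the tracking camera shifts the apparent gaze
    # direction by roughly atan(offset / distance).
    offset = np.asarray(headset_offset_mm, dtype=float)
    correction_deg = np.degrees(np.arctan2(offset, eye_to_camera_mm))
    return np.asarray(raw_gaze_deg, dtype=float) - correction_deg
```

With no detected slip the estimate passes through unchanged; a 30 mm slip at a 30 mm eye-to-camera distance would cancel a 45° apparent deflection.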

Head mounted display apparatus

Occlusion is faithfully expressed, even in binocular vision, in an AR display provided by a head mounted display apparatus or the like. A head mounted display apparatus 10 includes a lens 12, a lens 13, a camera 14, a camera 15, and a control processor 16. A CG image for a right eye is displayed on the lens 12; a CG image for a left eye is displayed on the lens 13. The camera 14 captures an image for the right eye, and the camera 15 captures an image for the left eye. Based on the images captured by the cameras 14 and 15, the control processor 16 generates a CG image for the right eye in which occlusion as seen by the right eye is expressed and a CG image for the left eye in which occlusion as seen by the left eye is expressed, and projects the generated CG images onto the lenses 12 and 13. The center of the lens of the camera 14 is provided at the same position as the center of the lens 12, and the center of the lens of the camera 15 is provided at the same position as the center of the lens 13.
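The per-eye occlusion idea can be sketched as a depth test: a CG pixel is rendered on a lens only where the virtual content is nearer than the real scene seen by that eye's co-located camera. The function name and the depth-map interface below are assumptions for illustration, not the patented processing.

```python
import numpy as np

def composite_with_occlusion(cg_rgb, cg_depth, real_depth):
    """Per-eye occlusion compositing sketch.

    cg_rgb: HxWx3 CG image for one eye.
    cg_depth: HxW depth of the CG content (metres).
    real_depth: HxW depth of the real scene from that eye's camera.
    Returns the masked CG image and the visibility mask.
    """
    visible = cg_depth < real_depth               # CG wins where nearer
    out = np.where(visible[..., None], cg_rgb, 0)  # 0 = transparent on lens
    return out, visible
```

Run once per eye, with each camera placed at its lens center so the occlusion boundaries match that eye's viewpoint.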

Wearable device and control method therefor
11579683 · 2023-02-14

A wearable device is disclosed. The wearable device comprises: a camera; a sensor; a display; a laser projector; and a processor for identifying a user's sight line on the basis of sensing data obtained by the sensor, identifying location information of at least one object included in an image obtained by the camera, and controlling the laser projector to provide, on the display, augmented reality (AR) information related to the object on the basis of the identified sight line and the location information of the object. The laser projector comprises: a light emitting element for emitting laser light; a focus adjusting element for adjusting the focus of the emitted laser light; and a scanning mirror for controlling the scan direction of the focus-adjusted light. The processor controls a driving state of the focus adjusting element on the basis of the identified sight line and the location information of the object.
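A minimal sketch of the gaze-driven focus decision: find which detected object the sight line lands on and use its depth to drive the focus adjusting element. The dictionary layout, field names (`bbox`, `depth_m`), and fallback depth are hypothetical; the abstract does not specify this data structure.

```python
def select_focus_depth(gaze_xy, objects, default_depth=2.0):
    """Pick the focus depth for the object under the user's gaze.

    gaze_xy: (x, y) gaze point in image coordinates.
    objects: list of dicts with 'bbox' (x0, y0, x1, y1) and 'depth_m',
        i.e. the location information from the camera image.
    Returns the gazed object's depth, or a default resting depth.
    """
    gx, gy = gaze_xy
    for obj in objects:
        x0, y0, x1, y1 = obj["bbox"]
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return obj["depth_m"]
    return default_depth
```

The returned depth would then set the driving state of the focus adjusting element so the projected AR annotation appears at the gazed object's distance.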

INFORMATION PROCESSING DEVICE THAT DISPLAYS A VIRTUAL OBJECT RELATIVE TO REAL SPACE
20230041942 · 2023-02-09

An information processing device including a display unit, a detector, and a first control unit, and a method of using the same. The display unit may be a head-mounted display and is capable of providing the user with a field of view of a real space and a virtual object. The detector detects an azimuth of the display unit around at least one axis, and display of the virtual object is controlled based on the detected azimuth.

INTELLIGENT SYSTEM FOR CONTROLLING FUNCTIONS IN A COMBAT VEHICLE TURRET
20230045581 · 2023-02-09

A system for controlling turret functions of a land-based combat vehicle includes: a plurality of image detection sensors for recording sequences of images having an at least partial view of a 360° environment of the land-based combat vehicle; at least one virtual, augmented or mixed reality headset to be worn by an operator, the headset presenting the at least partial view of the environment of the land-based combat vehicle on a display and including a direction sensor for tracking the orientation imparted to the headset by movement of the operator's head and eye tracking means for tracking the operator's eye movements; and a control unit including at least one computing unit for receiving as input and processing: images supplied by the plurality of image detection sensors; headset position and orientation data supplied by the direction sensor; and eye position data supplied by the eye tracking means.

WEARABLE APPARATUS FOR ACTIVE SUBSTITUTION
20230045237 · 2023-02-09

A hearing aid and related systems and methods. In one implementation, a hearing aid system may comprise a wearable camera configured to capture images from an environment of a user, a microphone configured to capture sounds from the environment of the user, and a processor. The processor may be programmed to receive images captured by the camera; receive audio signals representative of sounds captured by the microphone; operate in a first mode to cause a first selective conditioning of a first audio signal; determine, based on analysis of at least one of the images or the audio signals, to switch to a second mode to cause a second selective conditioning of the first audio signal; and cause transmission of the first audio signal selectively conditioned in the second mode to a hearing interface device configured to provide sound to an ear of the user.
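The two-mode flow can be sketched as a decision rule plus two conditioning functions. The specific gains, the noise-gate threshold, and the cues used to trigger the switch (a visible speaker, detected speech) are illustrative assumptions; the abstract leaves the analysis and conditioning unspecified.

```python
import numpy as np

def choose_mode(speaker_visible, speech_detected):
    """Switch to the second mode when image/audio analysis suggests
    directed speech (simplified stand-in for the patented analysis)."""
    return 2 if (speaker_visible and speech_detected) else 1

def condition(audio, mode):
    """First mode: mild gain. Second mode: gate low-level samples,
    then apply a stronger gain (hypothetical conditioning choices)."""
    audio = np.asarray(audio, dtype=float)
    if mode == 1:
        return 1.2 * audio
    gated = np.where(np.abs(audio) < 0.05, 0.0, audio)  # crude noise gate
    return 2.0 * gated
```

The conditioned signal would then be transmitted to the hearing interface device for playback at the user's ear.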

Head Up Display Apparatus With a Bright Energy Efficient Backlight for a Vehicle
20230041447 · 2023-02-09

A head up display apparatus for a vehicle includes an imaging unit that generates a projection light beam with display content. The imaging unit includes a transmissive display indication layer with selectively controllable display elements distributed over an area, a matrix backlight that provides backlighting for it with selectively controllable light sources distributed along the transmissive display indication layer, and a collimation array with collimators arranged between a light source and the transmissive display indication layer. A projection panel in the beam path of the projection light beam generated by the imaging unit reflects the projection light beam to a user, the projection panel being arranged in the beam path such that a virtual display image is generated behind it in the visual field of the user.

EYE TRACKING SYSTEMS AND METHODS

Methods and systems for tracking an individual's eye, by tracking one or more ocular axes, are presented. The technique comprises the following: (i) illuminating the eye, over an area of the cornea extending over the pupil, with first and second incident light beams having a transverse cross-sectional area smaller than a predetermined value with respect to an area of the pupil and propagating coaxially along a first optical path defined by central axes of the first and second incident light beams, wherein said first incident light beam is configured to be reflected from the cornea and said second incident light beam is configured to pass through the cornea and the pupil and to be reflected from a retina region of the eye; (ii) detecting respective first and second reflected light beams; (iii) adjusting the first optical path such that said first reflected light beam propagates along said first optical path and said second reflected light beam propagates along a second optical path having a predetermined spatial relationship with said first optical path, whereby said predetermined spatial relationship is indicative of said ocular axis being along at least said first optical path; and (iv) tracking said ocular axis of the eye under changes in gaze direction of said eye by repeating (i) to (iii).
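Steps (i)–(iv) amount to a closed-loop alignment: measure where the corneal and retinal reflections land, steer the illumination path until they satisfy the predetermined relationship, and repeat as gaze changes. The loop below abstracts this with two hypothetical callbacks (`measure_reflections`, `adjust_path`); the real system would drive optics, not Python callables.

```python
def track_ocular_axis(measure_reflections, adjust_path, path,
                      tol=1e-3, max_iter=50):
    """Iterate steps (i)-(iii) until the reflections satisfy the
    predetermined spatial relationship.

    measure_reflections(path) -> (corneal_err, retinal_err): angular
        offsets of the two reflected beams from the desired geometry.
    adjust_path(path, corneal_err, retinal_err) -> new path estimate.
    Returns the converged path, taken to lie along the ocular axis.
    """
    for _ in range(max_iter):
        c_err, r_err = measure_reflections(path)
        if abs(c_err) < tol and abs(r_err) < tol:
            return path  # relationship holds: path is on the axis
        path = adjust_path(path, c_err, r_err)
    return path
```

Step (iv), tracking under gaze changes, is simply re-entering this loop with the last converged path as the starting estimate.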

MULTI-LAYER REPROJECTION TECHNIQUES FOR AUGMENTED REALITY

This disclosure provides systems, devices, apparatus and methods, including computer programs encoded on storage media, for multi-layer reprojection techniques for augmented reality. A display processor may obtain a layer of graphics data including a plurality of virtual objects. Each of the plurality of virtual objects may be associated with at least one bounding box of a plurality of bounding boxes. The display processor may further obtain metadata indicative of at least one edge of the at least one bounding box of the plurality of bounding boxes, and metadata corresponding to reprojection instructions associated with each of the plurality of bounding boxes. The display processor may reproject the plurality of virtual objects based on the metadata indicative of the at least one edge of the at least one bounding box and the metadata corresponding to the reprojection instructions.
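A toy version of per-bounding-box reprojection: each box's pixels are moved according to that box's own instruction, independently of the rest of the layer. Real reprojection instructions would encode full warps (e.g. homographies driven by head pose); pure integer translations and the function name `reproject_layer` are simplifying assumptions.

```python
import numpy as np

def reproject_layer(layer, boxes, shifts):
    """Reproject each bounding box of a graphics layer independently.

    layer: HxW array of pixel values (0 = empty).
    boxes: list of (x0, y0, x1, y1) box edges from the metadata.
    shifts: per-box (dx, dy) reprojection instructions, here reduced
        to integer translations for illustration.
    """
    out = np.zeros_like(layer)
    h, w = layer.shape
    for (x0, y0, x1, y1), (dx, dy) in zip(boxes, shifts):
        for y in range(y0, y1):
            for x in range(x0, x1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    out[ny, nx] = layer[y, x]
    return out
```

Keeping one instruction per box lets near and far virtual objects in the same layer be reprojected by different amounts, which is the point of the multi-layer scheme.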

HEAD-UP DISPLAY DEVICE AND MOBILE OBJECT
20230045329 · 2023-02-09

A display region has a curved surface shape in which upper and lower end portions are disposed closer to a visual field than a reference plane and a central portion is disposed farther from the visual field than the reference plane. A first convergence angle difference, between the convergence angle from the eye position to the upper end portion and the convergence angle from the eye position to a first point on the reference plane through the upper end portion; a second convergence angle difference, between the convergence angle to the central portion and the convergence angle to a second point on the reference plane through the central portion; and a third convergence angle difference, between the convergence angle to the lower end portion and the convergence angle to a third point on the reference plane through the lower end portion, each fall within four milliradians.
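The convergence-angle arithmetic behind the four-milliradian bound can be checked directly: for a point straight ahead, the convergence angle is twice the arctangent of half the interpupillary distance over the viewing distance. The 64 mm IPD and the example distances below are illustrative values, not figures from the patent.

```python
import math

def convergence_angle(ipd_m, distance_m):
    """Convergence angle (radians) subtended at the two eyes by a
    point straight ahead at the given distance."""
    return 2.0 * math.atan2(ipd_m / 2.0, distance_m)

def convergence_difference_mrad(ipd_m, d_surface_m, d_reference_m):
    """Difference, in milliradians, between converging on the curved
    display surface and on the corresponding reference-plane point."""
    return 1000.0 * abs(convergence_angle(ipd_m, d_surface_m)
                        - convergence_angle(ipd_m, d_reference_m))
```

For a 64 mm IPD, a surface point at 2.00 m versus a reference point at 2.05 m differs by well under 4 mrad, consistent with the stated bound.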