DIGITAL ILLUMINATION ASSISTED GAZE TRACKING FOR AUGMENTED REALITY NEAR TO EYE DISPLAYS
20220350153 · 2022-11-03
Assignee
Inventors
CPC classification
G02B6/4298 · G02B6/4204 · G02B27/4272 · G02B6/0023 · G02B6/005 · G02B2027/0187 · G02B27/0179 · G02B27/0093 (all PHYSICS)
International classification
Abstract
A gaze tracking platform for a human-machine interface device, such as a wearable Augmented Reality Near-to-Eye Display. The gaze tracking method, digital illumination assisted analog feedback tracking, employs neither an auxiliary camera nor digital image processing of the human eye image, both of which confront challenges in gaze tracking speed, power, cost, and space. Instead, an analog-digital hybrid method tracks the gaze, inspired by the groove tracking method widely adopted in optical data storage systems. In the method, a digital micromirror device generates an angularly modulated infrared illuminating beam. The cornea reflects the infrared light, and a segmented photodiode detects the reflection while providing a feedback servo signal to the digital micromirror device controller. The feedback signal, integrated over time, provides the variation of the gaze. Moreover, the infrared, angularly modulated illumination is time-multiplexed with information displayed at visible wavelengths. In this manner, a single display device is dual-used for information display and gaze tracking, which especially benefits augmented reality devices in achieving a small device form factor.
Claims
1. A process having steps for image projection and gaze tracking, the steps comprising: a. illuminating a digital micromirror device with an infrared light; b. generating an infrared illuminating beam using the digital micromirror device, the infrared illuminating beam having a modulated angle; c. reflecting the infrared illuminating beam off of a user's cornea; d. tracking a user's gaze by detecting the reflection of the infrared illuminating beam off of the user's cornea using a quad detector; e. providing a feedback signal to the digital micromirror device, wherein the feedback signal represents an overall variation of the user's gaze; f. adjusting the angle of the infrared illuminating beam based upon the feedback signal; g. integrating the feedback servo signal over time; and h. time-multiplexing the infrared illuminating beam.
2. The process of claim 1, wherein the angle of the infrared illuminating beam is adjusted by the digital micromirror device by turning on a pixel.
3. The process of claim 1, further comprising the step of: collimating the infrared illuminating beam using a projection lens with a holographic chromatic corrector.
4. The process of claim 1, further comprising the step of: relaying the reflection of the infrared illuminating beam to the quad detector using at least one infrared holographic optical element lens.
5. A process having steps for image projection and gaze tracking, the steps comprising: a. illuminating a digital micromirror device via a light source, wherein the digital micromirror device is a micro electro mechanical system having an array of micro mirrors; b. turning on and off the micro mirrors in a synchronous manner to the light source; c. forming a visible image by pulse width modulation of the micro mirrors; d. incorporating an infrared light emitting diode into the pulse width modulation; e. generating an infrared illuminating beam using the digital micromirror device, the infrared illuminating beam having a modulated angle; f. reflecting the infrared illuminating beam off of a user's cornea; g. tracking a user's gaze by detecting the reflection of the infrared illuminating beam off of the user's cornea using a quad detector; h. providing a feedback signal to the digital micromirror device, wherein the feedback signal represents the overall variation of the user's gaze; i. adjusting the angle of the infrared illuminating beam based upon the feedback signal; j. integrating the feedback servo signal over time; and k. time-multiplexing the infrared illuminating beam.
6. The process of claim 5, wherein the light source is a red green blue light emitting diode.
7. The process of claim 5, further comprising the step of: combining the visible and infrared light using a dichroic beam splitter.
8. The process of claim 5, further comprising the step of: in and out-coupling the visible image using a holographic optical element coupler.
9. The process of claim 5, wherein the micro mirrors can turn on and off approximately 190 times within 1/60 sec at a 23 kHz frame rate.
10. The process of claim 5, wherein the IR light source is collimated by the projection lens with a holographic chromatic corrector that corrects the difference in focal length between visible and IR light.
11. The process of claim 5, further comprising the step of: relaying the reflection of the infrared illuminating beam to the quad detector using at least one infrared holographic optical element lens.
12. A digital illumination assisted gaze tracking for augmented reality near to eye display device, comprising: a digital micromirror device having a controller; an illumination beam generated by the digital micromirror device along an optical path, having an angle of incidence; a quad detector being in communication with the digital micromirror device and positioned along the optical path such that it can receive a reflected illumination beam having a gaze angle; and a servomechanism controller in communication with the digital micromirror device controller, such that a feedback loop is formed wherein a signal produced by the quad detector is sent to the digital micromirror device, wherein the digital micromirror device controller adjusts the angle of incidence.
13. The device of claim 12, wherein the digital micromirror device is a micro electro mechanical system having an array of micro mirrors and pixels, wherein the angle of incidence is adjusted by turning on a pixel.
14. The device of claim 12, wherein the digital micromirror device generates a visible image.
15. The device of claim 12, further comprising a projection lens with a holographic chromatic corrector along the optical path.
16. The device of claim 12, further comprising a holographic optical element along the optical path.
17. The device of claim 12, further comprising a light emitting diode positioned to illuminate the digital micromirror device.
18. The device of claim 12, further comprising an infrared light source positioned to illuminate the digital micromirror device.
19. The device of claim 18, further comprising a dichroic beam splitter positioned between the infrared light source and the digital micromirror device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] The present invention will be more fully understood and appreciated by reading the following Detailed Description in conjunction with the accompanying drawings, in which:
DETAILED DESCRIPTION OF EMBODIMENTS
[0064] The present disclosure describes digital illumination assisted gaze tracking for augmented reality near to eye displays.
[0065] According to an aspect is a method for image projection and gaze tracking, comprising the steps of individually turning on and off an array of micro mirrors in a synchronous manner to light sources; and forming a color image by Pulse Width Modulation (PWM) of the micro mirrors of the DMD, with dual use of the DMD as an information display and gaze tracking device, wherein red green blue (RGB) LEDs and an IR LED are used in the PWM.
[0066] In an aspect of the present invention, the optical method does not employ an additional camera for gaze tracking, but instead shares optics and a micro display device with the information projector.
[0067] A cross sectional schematic of the NED with proposed gaze tracking system 10 is depicted in
[0068] The IR illumination path 14 through DMD 18 is illustrated with reference to
Gaze Tracking Servo Principle
[0069]
[0070] Change in the angle of incidence moves the Purkinje image to the optical axis, thus the spot on the quad detector 34 is centered as depicted in
[0071] Based on the signals I.sub.x and I.sub.y, the DMD driver turns on pixels so that the illumination angle changes to move the detector spot, keeping the signals I.sub.x and I.sub.y at zero. Thus, the DMD 18 and projection lens digitally steer the IR light and change the angle of incidence (AOI) of the IR beam on the cornea by selecting corresponding pixels of the DMD 18. The feedback signals I.sub.x and I.sub.y are integrated, which provides the overall variation of the gaze (θ.sub.LS,x, θ.sub.LS,y). Since the IR illumination is time-multiplexed with the visible image, a sample hold function (not shown in
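The servo principle above can be sketched numerically. The following is an illustrative model only, not the claimed apparatus: the quadrant layout, the normalization of I.sub.x and I.sub.y, and the loop gain are assumptions; the source specifies only that the error signals are nulled by steering the DMD and integrated to recover the gaze variation.

```python
def quad_error(a, b, c, d):
    """Normalized error signals from four quadrant intensities.

    Assumed layout: a = top-left, b = top-right, c = bottom-left,
    d = bottom-right. I_x and I_y are zero when the spot is centered.
    """
    total = a + b + c + d
    ix = ((b + d) - (a + c)) / total   # horizontal spot offset
    iy = ((a + b) - (c + d)) / total   # vertical spot offset
    return ix, iy


class GazeServo:
    """Toy feedback loop: null the detector error, integrate the correction."""

    def __init__(self, gain=0.5):
        self.gain = gain
        self.theta_x = 0.0   # integrated gaze estimate (arbitrary units)
        self.theta_y = 0.0

    def step(self, a, b, c, d):
        ix, iy = quad_error(a, b, c, d)
        # Steering command applied to the illumination angle; its running
        # sum is the overall gaze variation (theta_LS,x, theta_LS,y).
        self.theta_x += self.gain * ix
        self.theta_y += self.gain * iy
        return self.theta_x, self.theta_y
```

A centered spot (equal quadrant intensities) yields zero error and no correction; a spot displaced toward the right quadrants yields a positive I.sub.x that accumulates in the gaze estimate.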
Verification of Principle by Ray Trace Modeling and Experiment
[0072] The principle is verified by ray trace and experiment.
[0073]
[0074] The feedback signal I.sub.y has good linearity up to a gaze angle of θ.sub.LS,y˜8°. The detection range is limited to 10 degrees for this system due to the small clear aperture of the lens used for the experiment.
Dual use of Digital Micromirror Device (DMD) for Image Projection and Gaze Tracking
[0075] A distinct differentiation of the proposed approach from the state of the art is the dual use of a micro-display device for image projection and gaze tracking. A DMD is a Micro Electro Mechanical System (MEMS) based display device. An array of micro mirrors is individually turned on and off in a synchronous manner to the light sources (RGB light emitting diodes). A color image is generated by Pulse Width Modulation (PWM) of the micro mirrors of the DMD 18. In addition to the RGB LEDs, an IR LED is incorporated in the PWM sequence as depicted in
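The time-multiplexed PWM sequence can be illustrated with a simple frame-schedule generator. This is a sketch under stated assumptions: the slot ordering (R, G, B, then IR), equal slot durations, and binary-weighted bit-planes are illustrative choices, not specifics from the source, which states only that an IR slot is incorporated into the visible PWM sequence.

```python
FRAME_PERIOD_S = 1 / 60          # one video frame
SLOTS = ("R", "G", "B", "IR")    # assumed ordering and equal slot weighting

def frame_schedule(n_bitplanes=4):
    """Return (source, start_s, duration_s) tuples for one frame.

    Visible colors get binary-weighted PWM bit-planes; the IR slot is a
    single pulse used for gaze-tracking illumination, time-multiplexed
    with the visible image on the same DMD.
    """
    slot_len = FRAME_PERIOD_S / len(SLOTS)
    events, t = [], 0.0
    for src in SLOTS:
        if src == "IR":
            events.append((src, t, slot_len))   # one IR tracking pulse
            t += slot_len
        else:
            weights = [2 ** i for i in range(n_bitplanes)]
            scale = slot_len / sum(weights)
            for w in weights:
                events.append((src, t, w * scale))
                t += w * scale
    return events
```

Summing the event durations recovers exactly one frame period, confirming that the IR illumination borrows time from, rather than adds to, the display frame.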
[0076] With currently available conventional DLP systems, such as the DLP4100 paired with the DLP7000, rates up to 42 kHz are achieved with reduced vertical resolution, or 24 kHz with full resolution. For a dynamic DMD IR source, a system as shown in
Scalability of the Proposed Gaze Tracking Method
[0077] While proof of the concept of the gaze tracking method is critical, it is also important to determine the scalability of the approach. Implementation of the system in AR glasses and a recovery algorithm for the gaze tracking system after blinking of the eye are addressed.
Extension of FOV for Gaze Tracking
[0078] As demand for Field of View (FOV) increases, the demand for the FOV of gaze tracking also increases. The proposed gaze tracking system detects the Purkinje image from the first surface of the cornea. The full detectable range of the gaze angle is upper-bounded by θ.sub.LS=2 tan.sup.−1((D−d)/(2L))˜44 deg in full FOV, where D=11.5 mm (diameter of the cornea), d=1 mm (illumination beam diameter), and L=13 mm (distance between the center of eye rotation and the surface of the cornea). The FOV of gaze tracking is matched to the current FOV of AR devices, 40 degrees. Another limiting factor is the size of the optics. As
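The geometric bound above can be checked directly from the stated dimensions; the helper below simply evaluates the formula with the values given in the text.

```python
import math

def full_gaze_range_deg(D_mm=11.5, d_mm=1.0, L_mm=13.0):
    """Full detectable gaze range theta_LS = 2*atan((D - d) / (2*L)).

    Defaults are the values from the text: D = cornea diameter,
    d = illumination beam diameter, L = distance from the center of
    eye rotation to the cornea surface.
    """
    return math.degrees(2 * math.atan((D_mm - d_mm) / (2 * L_mm)))
```

With the stated values the bound evaluates to approximately 44 degrees of full FOV, comfortably covering the 40-degree FOV of current AR devices.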
[0079] On the other hand, the optical architecture for VR gaze tracking would be different. A recently published paper by the present inventors addresses diffraction-based FOV extension for DMD displays. The application of the method to wide-FOV gaze tracking for AR and VR applications will be researched with a ray trace model.
Recovery from Blinking Search Algorithm
[0080] One of the challenges in gaze tracking is blinking of the eye, and the variation of the gaze angle before and after the blink. Since the servo signal is not available during the blink, alternative information is needed to restart gaze tracking. The gaze search algorithm is tested by digitally controlled illumination with the DMD 18. Once the servo signal is lost, the display sequence depicted in
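One plausible form of such a search, sketched below as a coarse scan over candidate illumination angles until the corneal reflection reappears, is a hypothetical reconstruction: the source does not specify the scan pattern or reacquisition criterion, and `detector_sum`, `set_angle`, and the threshold are illustrative stand-ins for the quad-detector readout and DMD pixel selection.

```python
def reacquire(detector_sum, set_angle, grid, threshold=0.5):
    """Scan candidate illumination angles after a blink.

    detector_sum(angle): stand-in for the summed quad-detector signal
        when the DMD illuminates at the given angle.
    set_angle(angle):    stand-in for selecting the corresponding DMD pixels.
    grid:                iterable of candidate angles (coarse raster).

    Returns the first angle whose detector sum exceeds the threshold,
    i.e. where the Purkinje reflection is seen again, or None.
    """
    for angle in grid:
        set_angle(angle)
        if detector_sum(angle) > threshold:
            return angle   # reflection found; closed-loop servo resumes here
    return None
```

Once an angle with a detectable reflection is found, the closed-loop servo of the main tracking method takes over from that angle.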
[0081] Alternatively, focused illumination and detection by a quad detector or position sensitive detector is employed. The configuration is an open-loop gaze tracking that eliminates the need to use the DMD as part of the illumination optical path.
[0082]
[0083] The IR light focused towards the center of eye rotation (14 mm measured from the surface of the cornea) is reflected and generates a virtual image (the Purkinje image) about 3 mm away from the surface of the cornea.
[0084] The displacement of the Purkinje image (dx, dy, dz) occurs as the gaze changes in the horizontal (y-direction) and vertical (x-direction), as well as along the optical axis (z-direction).
where m is the magnification of the Purkinje image, R is the radius of the cornea, 7.5 mm, and l is the position of the focal point of the IC2 measured from the surface of the cornea, which is 13 mm. The Purkinje image linearly shifts with θ.sub.g. Accordingly, on the quad detector, the position of the Purkinje image shifts (
[0085] The gaze tracking operation is confirmed by ray trace modeling (
[0086] The fundamental advantage of the proposed gaze tracking approach is that the angular bandwidth (FOV) required for gaze tracking, both for illumination and gaze detection, is substantially smaller than the angular bandwidth of the image guide. To achieve +/−30 degrees of gaze tracking, only +/−13 degrees of FOV is needed for the IR path with an f.sub.IC2=35 mm coupler for detection of the shift of the Purkinje image. For illumination, the "fixed" and converging illumination requires close to zero TIR bandwidth. The IR bandwidth requirement is within the display bandwidth of a single-layer image guide with n=1.5, +/−14 degrees. Current advances in high-index glass/polymer substrates allow a high-index image guide; n=1.7 provides +/−20 degrees, which easily supports the +/−13 degrees of bandwidth for gaze tracking. Moreover, analog gaze tracking with a single quad detector has advantages in speed, cost, size, and computational power, and therefore power consumption, as compared to camera-based approaches. Also, the converging illumination that overfills the region of the cornea makes this approach robust to interference from the eyelid and eyelashes and aids recovery from blinking, as discussed in a later section. Moreover, the approach is prescription-eyeglasses ready, which potentially increases adoption of gaze tracking considering that about 50% of the population owns eyewear.
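The quoted bandwidth figures can be roughly reproduced with a common estimate for a diffractive single-layer image guide. This is an assumed model, not taken from the source: it takes the in-air half-FOV as asin(n·(sin θ.sub.max − sin θ.sub.TIR)/2), where θ.sub.TIR = asin(1/n) is the critical angle and θ.sub.max, assumed here to be 80 degrees, is a practical upper limit on the internal propagation angle.

```python
import math

def half_fov_deg(n, theta_max_deg=80.0):
    """Estimated in-air half-FOV of a single-layer diffractive image guide.

    theta_TIR = asin(1/n) is the total-internal-reflection critical angle;
    theta_max_deg (assumed) caps the internal propagation angle.
    """
    theta_tir = math.asin(1.0 / n)
    theta_max = math.radians(theta_max_deg)
    return math.degrees(
        math.asin(n * (math.sin(theta_max) - math.sin(theta_tir)) / 2))
```

Under this assumption the estimate gives roughly +/−14 degrees for n=1.5 and +/−20 degrees for n=1.7, consistent with the figures quoted in the text.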
[0087] While various embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, embodiments may be practiced otherwise than as specifically described and claimed. Embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
[0088] The above-described embodiments of the described subject matter can be implemented in any of numerous ways. For example, some embodiments may be implemented using hardware, software or a combination thereof. When any aspect of an embodiment is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single device or computer or distributed among multiple devices/computers.