Eyeball Tracking System and Method based on Light Field Sensing
20220365342 · 2022-11-17
Inventors
CPC classification
G02B2027/0198 (PHYSICS)
G06F3/011 (PHYSICS)
G02B2027/0187 (PHYSICS)
G02B27/0179 (PHYSICS)
International classification
Abstract
An eyeball tracking method based on light field sensing is provided. First, light intensity image data of plenoptic images of the respective eyes and direction data of infrared light are captured by light field cameras in real time. Depth information of the plenoptic images is then obtained from the light intensity image data and the direction data of the infrared light, and models with curvature are formed from the depth information. The regions where the models are located are determined as eyeball image plane regions, and normal vectors of the respective eyeball image plane regions, together with the positions of those normal vectors relative to the respective light field cameras, are determined to obtain the fixation directions of both eyes and complete tracking.
Claims
1. An eyeball tracking system based on light field sensing, comprising an infrared light illumination source, two light field cameras, and an eyeball tracking processor, wherein the infrared light illumination source is configured to emit infrared light to both eyes; the two light field cameras are configured to respectively capture light intensity image data of plenoptic images of respective eyes and direction data of the infrared light in real time; and the eyeball tracking processor is configured to obtain depth information of the plenoptic images according to the light intensity image data and the direction data of the infrared light, form models with curvature according to the depth information, determine regions where the models are respectively located as eyeball image plane regions, and determine normal vectors of respective eyeball image plane regions and positions of the normal vectors relative to respective light field cameras to determine fixation directions of both eyes.
2. The eyeball tracking system based on light field sensing of claim 1, further comprising a display element, wherein the display element is configured to display virtual display content to both eyes.
3. The eyeball tracking system based on light field sensing of claim 2, further comprising an optical block, wherein the optical block is configured to guide the infrared light from the display element to pupils so that both eyes receive the infrared light.
4. The eyeball tracking system based on light field sensing of claim 3, further comprising a processor, wherein the processor is configured to receive data regarding the fixation directions of both eyes determined by the eyeball tracking processor and to fit the data regarding the fixation directions of both eyes in the virtual display content of a head-mounted virtual device.
5. The eyeball tracking system based on light field sensing of claim 4, wherein a wave band of the infrared light illumination source is 850 nm.
6. The eyeball tracking system based on light field sensing of claim 5, wherein the infrared light illumination source and the two light field cameras have a synchronous flickering frequency.
7. The eyeball tracking system based on light field sensing of claim 6, wherein the two light field cameras are 60 Hz cameras.
8. An eyeball tracking method based on light field sensing, implemented in the eyeball tracking system based on light field sensing of claim 1, the method comprising: capturing light intensity image data of plenoptic images of respective eyes and direction data of infrared light in real time through the two light field cameras; obtaining depth information of the plenoptic images according to the light intensity image data and the direction data of the infrared light; forming models with curvature according to the depth information, and determining regions where the models are respectively located as eyeball image plane regions; and determining normal vectors of respective eyeball image plane regions and positions of the normal vectors relative to respective light field cameras to determine fixation directions of both eyes.
9. The eyeball tracking method based on light field sensing of claim 8, wherein determining regions where the models are respectively located as eyeball image plane regions comprises: obtaining a centroid of the respective model; calculating position coordinates of the centroid in the model; and mapping the position coordinates into the respective plenoptic image to form center coordinates of the respective eyeball image plane region.
10. The eyeball tracking method based on light field sensing of claim 9, wherein determining regions where the models are respectively located as eyeball image plane regions further comprises: obtaining a maximum width of the respective model; and using the center coordinates as a circle center, and using the maximum width as a diameter to form the respective eyeball image plane region.
11. A non-transitory computer-readable storage medium, having a computer program stored therein which, when executed by a processor, implements the method of claim 8.
12. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method according to claim 8.
13. A non-transitory computer-readable storage medium, having a computer program stored therein which, when executed by a processor, implements the method of claim 9.
14. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method according to claim 9.
15. A non-transitory computer-readable storage medium, having a computer program stored therein which, when executed by a processor, implements the method of claim 10.
16. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method according to claim 10.
17. The eyeball tracking method based on light field sensing of claim 8, further comprising: displaying virtual display content to both eyes.
18. The eyeball tracking method based on light field sensing of claim 17, further comprising: fitting data regarding the fixation directions of both eyes in the virtual display content of a head-mounted virtual device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0039] In related solutions, the same light source is arranged in each of two eyeball tracking modules. During calibration or use, the light rays emitted by the identical light sources in the two eyeball tracking modules are therefore likely to interfere with each other, especially for a user wearing myopia glasses. This increases errors in the calculation results and degrades the positional accuracy of eyeball tracking.
[0040] Aiming at the above problem, the embodiments of the present disclosure provide an eyeball tracking system and method based on light field sensing. Exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
[0041] In order to illustrate the eyeball tracking system based on light field sensing provided in the embodiments of the present disclosure,
[0042] The following description of exemplary embodiments is merely illustrative and is in no way intended to limit the present disclosure or its application or use. Technologies and devices known to those having ordinary skill in the related art may not be discussed in detail; however, where appropriate, such technologies and devices shall be regarded as part of the description.
[0043] As shown in
[0044] In the embodiment shown in
[0045] In the embodiments shown in
[0046] In the embodiments shown in
[0047] As can be concluded from the above implementation manner, in the eyeball tracking system based on light field sensing provided in the embodiments of the present disclosure, light intensity image data of plenoptic images of the respective eyes and direction data of infrared light are first captured by the light field cameras in real time. Depth information of the plenoptic images is obtained from the light intensity image data and the direction data of the infrared light, models with curvature are formed from the depth information, and the regions where the models are located are determined as eyeball image plane regions. Normal vectors of the respective eyeball image plane regions and the positions of those normal vectors relative to the respective light field cameras are then determined to obtain the fixation directions of both eyes and complete tracking. Because the light field camera can directly capture the direction data of the infrared light, eyeball tracking can be realized without calculating the cornea center of the user's eyes by means of a spherical reflection model or a corneal glint position, and without positioning an external illumination source at a specific position relative to the light field camera. Only one infrared light illumination source is needed, and it can be placed at an arbitrary position, so the light field cameras can be placed with more degrees of freedom inside an integrated VR device. This avoids the problem of mutual interference between two infrared light sources and greatly improves the data quality, stability, and tracking precision of eyeball tracking.
[0048] As shown in
[0049] At S110, light intensity image data of plenoptic images of respective eyes and direction data of infrared light are captured in real time through the two light field cameras.
[0050] At S120, depth information of the plenoptic images is obtained according to the light intensity image data and the direction data of the infrared light.
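The disclosure does not fix a particular depth-recovery algorithm for S120. As one possible realization, a minimal sketch of obtaining depth from a plenoptic image by brute-force disparity search between two hypothetical sub-aperture views (the views, `baseline`, and `focal_length` are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def depth_from_subapertures(left, right, baseline, focal_length, max_disp=16):
    """Estimate per-pixel depth from two sub-aperture views of a
    plenoptic image via brute-force disparity search (absolute-difference cost)."""
    h, w = left.shape
    cost = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        # shift the right view by d pixels and compare against the left view
        shifted = np.roll(right, d, axis=1)
        cost[d] = np.abs(left - shifted)
    disparity = cost.argmin(axis=0).astype(float)
    disparity[disparity == 0] = 0.5  # avoid division by zero at zero disparity
    # standard stereo triangulation: depth is inversely proportional to disparity
    return baseline * focal_length / disparity
```

In a real light field camera the many sub-aperture views would be aggregated rather than a single pair, but the depth-from-ray-direction principle is the same.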
[0051] At S130, models with curvature are formed according to the depth information, and regions where the models are located are determined as eyeball image plane regions.
[0052] At S140, normal vectors of respective eyeball image plane regions and positions of the normal vectors relative to respective light field cameras are determined to determine fixation directions of both eyes to complete tracking.
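One way to realize the normal-vector determination of S140 is to fit a plane to the 3D points of an eyeball image plane region and orient the resulting normal toward the known camera position; the gaze direction then follows from that normal. This is a sketch under those assumptions, not the disclosure's prescribed computation:

```python
import numpy as np

def fixation_normal(points, camera_pos):
    """Fit a plane to the 3D points of an eyeball image plane region and
    return its unit normal, oriented from the region toward the camera."""
    centroid = points.mean(axis=0)
    # SVD of the centered points: the right singular vector with the
    # smallest singular value is the plane normal
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    # flip the normal if it points away from the camera
    if np.dot(normal, camera_pos - centroid) < 0:
        normal = -normal
    return normal / np.linalg.norm(normal)
```

The position of this normal relative to the light field camera (its anchor point `centroid` and the vector itself) is what S140 uses to resolve the fixation direction.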
[0053] As shown in
[0054] At S131-1, a centroid of the respective model is obtained.
[0055] At S131-2, position coordinates of the centroid in the model are calculated.
[0056] At S131-3, the position coordinates are mapped into the respective plenoptic image to form center coordinates of the respective eyeball image plane region.
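Steps S131-1 through S131-3 can be sketched as follows, assuming a simple pinhole projection to map the 3D centroid into the plenoptic image; `focal_length`, `cx`, and `cy` are illustrative camera parameters, not values given by the disclosure:

```python
import numpy as np

def centroid_image_coords(model_points, focal_length, cx, cy):
    """Compute the centroid of a 3D model (S131-1/S131-2) and project it
    into the plenoptic image (S131-3) with a pinhole model whose
    principal point is (cx, cy)."""
    centroid = model_points.mean(axis=0)   # centroid position in the model
    x, y, z = centroid
    u = focal_length * x / z + cx          # map into image coordinates
    v = focal_length * y / z + cy
    return (u, v)
```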
[0057] In the embodiment shown in
[0058] At S132-1, a maximum width of the respective model is obtained.
[0059] At S132-2, the center coordinates are used as a circle center, and the maximum width is used as a diameter to form the respective eyeball image plane region.
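Steps S132-1 and S132-2 amount to building a circular region with the center coordinates as circle center and the model's maximum width as diameter. A minimal sketch of that region as a boolean image mask (the mask representation is an assumption for illustration):

```python
import numpy as np

def eyeball_region_mask(shape, center, max_width):
    """Boolean mask of the eyeball image plane region: a circle centered
    at the mapped centroid coordinates with the model's maximum width
    as its diameter."""
    h, w = shape
    cy, cx = center
    yy, xx = np.mgrid[0:h, 0:w]
    # a pixel belongs to the region if it lies within radius max_width / 2
    return (yy - cy) ** 2 + (xx - cx) ** 2 <= (max_width / 2) ** 2
```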
[0060] As described above, in the eyeball tracking method based on light field sensing provided in the embodiments of the present disclosure, light intensity image data of plenoptic images of the respective eyes and direction data of infrared light are first captured by the light field cameras in real time. Depth information of the plenoptic images is obtained from the light intensity image data and the direction data of the infrared light, models with curvature are formed from the depth information, and the regions where the models are located are determined as eyeball image plane regions. Normal vectors of the respective eyeball image plane regions and the positions of those normal vectors relative to the respective light field cameras are then determined to obtain the fixation directions of both eyes and complete tracking. Because the light field camera can directly capture the direction data of the infrared light, eyeball tracking can be realized without calculating the cornea center of the user's eyes by means of a spherical reflection model or a corneal glint position, and without positioning an external illumination source at a specific position relative to the light field camera. Only one infrared light illumination source is needed, and it can be placed at an arbitrary position, so the light field cameras can be placed with more degrees of freedom inside an integrated VR device. This avoids the problem of mutual interference between two infrared light sources and greatly improves the data quality, stability, and tracking precision of eyeball tracking.
[0061] The eyeball tracking system and method based on light field sensing proposed according to the embodiments of the present disclosure are described above by way of example with reference to the accompanying drawings. However, those having ordinary skill in the art should understand that various improvements can be made to the eyeball tracking system and method based on light field sensing proposed in the embodiments of the present disclosure, without departing from the content of the present disclosure. Therefore, the scope of protection of the present disclosure should be determined by the content of the appended claims.
[0062] The embodiments of the present disclosure also provide a computer-readable storage medium. The computer-readable storage medium stores a computer program. The computer program is configured to perform, when executed, the operations in any one of the above method embodiments.
[0063] In an exemplary embodiment, the computer-readable storage medium may include, but is not limited to, various media capable of storing a computer program such as a U disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a mobile hard disk, a magnetic disk, or an optical disc.
[0064] The embodiments of the present disclosure also provide an electronic device, which includes a memory and a processor. The memory stores a computer program. The processor is configured to run the computer program to perform the operations in any one of the above method embodiments.
[0065] In an exemplary embodiment, the electronic device may further include a transmission device and an input/output device. The transmission device is connected to the processor, and the input/output device is connected to the processor.
[0066] Specific examples in the embodiments may refer to the examples described in the above embodiments and exemplary implementation manners, and details are not described herein in the embodiments.
[0067] It is apparent that those having ordinary skill in the art should understand that the above modules or operations of the present disclosure may be implemented by a general-purpose computing device, and they may be centralized on a single computing device or distributed over a network composed of multiple computing devices. They may be implemented with program codes executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the operations shown or described may be performed in an order different from that described herein. Alternatively, they may be separately made into individual integrated circuit modules, or multiple modules or operations therein may be combined into a single integrated circuit module for implementation. As such, the present disclosure is not limited to any particular combination of hardware and software.
[0068] The above is only the exemplary embodiments of the present disclosure, not intended to limit the present disclosure. As will occur to those having ordinary skill in the art, the present disclosure is susceptible to various modifications and changes. Any modifications, equivalent replacements, improvements and the like made within the principle of the present disclosure shall fall within the scope of protection of the present disclosure.
INDUSTRIAL APPLICABILITY
[0069] As described above, the eyeball tracking method based on light field sensing provided by the embodiments of the present disclosure has the following beneficial effects. Eyeball tracking can be realized without calculating the cornea center of eyes of a user depending on a spherical reflection model or a flickering position of the cornea and without positioning an external illumination source at a specific position relative to a light field camera, thereby greatly improving the data quality, stability and tracking accuracy of eyeball tracking.