METHOD AND DEVICE FOR DETERMINING A REFRACTIVE ERROR
20230064322 · 2023-03-02
Inventors
- Alexander Leube (Aalen, DE)
- Torsten Strasser (Tübingen, DE)
- Arne Ohlendorf (Tübingen, DE)
- Eberhart Zrenner (Tübingen, DE)
- Siegfried Wahl (Donzdorf, DE)
CPC classification
A61B3/032
HUMAN NECESSITIES
A61B3/103
HUMAN NECESSITIES
G06T3/40
PHYSICS
International classification
A61B3/032
HUMAN NECESSITIES
A61B3/00
HUMAN NECESSITIES
G06T3/40
PHYSICS
Abstract
A method, a device, and a computer program product for determining a refractive error of an eye of a user are disclosed, as well as a method for producing a spectacle lens. The method for determining includes: a) displaying an image with a spatial modulation to the user; b) optionally, recording a reaction of the user to a variation of the spatial modulation over time; c) detecting a point in time at which a perception threshold of the user is reached; and d) determining the refractive error of the user from the spatial modulation, wherein the image contains a source image with several picture elements, wherein values for an image parameter are assigned to the picture elements, and wherein the spatial modulation is generated such that the values of the image parameter determine the values of a modulation parameter of the spatial modulation in the image.
Claims
1. A method for determining at least one refractive error of at least one eye of a user, the method comprising the following steps: a) displaying at least one image to a user using a screen, wherein the at least one image contains at least one spatial modulation; c) detecting a point in time at which a perception threshold of the user is indicated by a reaction of the user using an evaluation unit; and d) determining a value for the at least one refractive error of the at least one eye of the user from the at least one spatial modulation in the at least one image at the point in time using the evaluation unit, wherein the at least one image contains a source image having a plurality of picture elements, wherein values for at least one image parameter are assigned to the plurality of picture elements, and wherein the at least one spatial modulation in the at least one image is generated in a manner that the values of the at least one image parameter of the plurality of picture elements determine the values of at least one modulation parameter of the at least one spatial modulation in the at least one image.
2. The method according to claim 1, further comprising the following step: b) recording the reaction of the user to at least one variation of the at least one spatial modulation over time using an input unit.
3. The method according to claim 1, wherein the at least one image parameter of the picture elements is selected from at least one of an intensity, a grayscale, a color, a polarization, or a temporal variation of the picture elements.
4. The method according to claim 1, wherein the perception threshold is selected from at least one of a contrast threshold, a color threshold, a polarization threshold, or a temporal threshold.
5. The method according to claim 1, wherein a type of the at least one spatial modulation is selected from at least one of a pulse width modulation, a frequency modulation, or an amplitude modulation.
6. The method according to claim 1, wherein the at least one spatial modulation comprises at least one spatial frequency, and wherein the at least one spatial frequency corresponds to at least one carrier frequency of the modulation.
7. The method according to claim 6, wherein a phase of the at least one carrier frequency is, additionally, modulated.
8. The method according to claim 1, wherein the at least one variation of the at least one spatial modulation in the at least one image over time is selected from at least one of: (i) varying at least one spatial frequency of the at least one spatial modulation; (ii) varying a distance between the at least one image and the at least one eye of the user; and (iii) rotating the at least one image in a plane perpendicular with respect to a direction of view of the user.
9. The method according to claim 8, wherein the rotating of the at least one image in the plane being perpendicular with respect to the direction of view of the user is performed after, before, or concurrently with at least one of the varying of the at least one spatial frequency of the at least one spatial modulation or the varying of the distance between the at least one image and the at least one eye of the user.
10. The method according to claim 1, wherein the variation of the at least one spatial modulation over time is performed until the perception threshold of the user for the source image as comprised by the at least one image is indicated by the reaction of the user.
11. The method according to claim 1, wherein the value for the refractive error of the at least one eye of the user is determined by demodulating the at least one image.
12. The method according to claim 11, wherein at least one filter is used for demodulating the at least one image.
13. A computer program product comprising executable instructions for performing a method for determining a refractive error of at least one eye of a user, wherein the method comprises the following steps: a) displaying at least one image to a user using a screen, wherein the at least one image comprises at least one spatial modulation; c) detecting a point in time at which a perception threshold of the user is indicated by a reaction of the user using an evaluation unit; and d) determining a value for the at least one refractive error of the at least one eye of the user from the at least one spatial modulation in the at least one image at the point in time using the evaluation unit, wherein the at least one image contains a source image having a plurality of picture elements, wherein values for at least one image parameter are assigned to the plurality of picture elements, wherein the at least one spatial modulation in the at least one image is generated in a manner that the values of the at least one image parameter of the plurality of picture elements determine the values of at least one modulation parameter of the at least one spatial modulation in the at least one image.
14. The computer program product according to claim 13, wherein the method further comprises the following step: b) recording the reaction of the user to at least one variation of the at least one spatial modulation over time using an input unit.
15. The computer program product according to claim 13, wherein the at least one image parameter of the picture elements is selected from at least one of an intensity, a grayscale, a color, a polarization, or a temporal variation of the picture elements.
16. The computer program product according to claim 13, wherein the perception threshold is selected from at least one of a contrast threshold, a color threshold, a polarization threshold, or a temporal threshold.
17. The computer program product according to claim 13, wherein a type of the at least one spatial modulation is selected from at least one of a pulse width modulation, a frequency modulation, or an amplitude modulation.
18. The computer program product according to claim 13, wherein the at least one spatial modulation comprises at least one spatial frequency, wherein the at least one spatial frequency corresponds to at least one carrier frequency of the modulation.
19. The computer program product according to claim 18, wherein a phase of the at least one carrier frequency is, additionally, modulated.
20. The computer program product according to claim 13, wherein the at least one variation of the at least one spatial modulation in the at least one image over time is selected from at least one of: (i) varying at least one spatial frequency of the at least one spatial modulation; (ii) varying a distance between the at least one image and the at least one eye of the user; and (iii) rotating the at least one image in a plane perpendicular with respect to a direction of view of the user.
21. The computer program product according to claim 20, wherein the rotating of the at least one image in the plane being perpendicular with respect to the direction of view of the user is performed after, before, or concurrently with at least one of the varying of the at least one spatial frequency of the at least one spatial modulation or the varying of the distance between the at least one image and the at least one eye of the user.
22. The computer program product according to claim 13, wherein the variation of the at least one spatial modulation over time is performed until the perception threshold of the user for the source image as comprised by the at least one image is indicated by the reaction of the user.
23. The computer program product according to claim 13, wherein the value for the refractive error of the at least one eye of the user is determined by demodulating the at least one image.
24. The computer program product according to claim 23, wherein at least one filter is used for demodulating the at least one image.
25. A computer-readable medium carrying a computer program comprising instructions which, when executed by a computer, cause the computer to carry out at least one of the features of the method of claim 1 by using at least one computer program.
26. A method for producing at least one spectacle lens for the at least one eye of the user, wherein the producing of the spectacle lens comprises processing a lens blank, wherein the processing of the lens blank is based on instructions configured to compensate at least one refractive error of the at least one eye of the user; and determining of the refractive error of the at least one eye of the user by performing the following steps: a) displaying at least one image to a user using a screen, wherein the at least one image comprises at least one spatial modulation; c) detecting a point in time at which a perception threshold of the user is indicated by a reaction of the user using an evaluation unit; and d) determining a value for the at least one refractive error of the at least one eye of the user from the at least one spatial modulation in the at least one image at the point in time using the evaluation unit, wherein the at least one image contains a source image having a plurality of picture elements, wherein values for at least one image parameter are assigned to the plurality of picture elements, wherein the at least one spatial modulation in the at least one image is generated in a manner that the values of the at least one image parameter of the plurality of picture elements determine the values of at least one modulation parameter of the at least one spatial modulation in the at least one image.
27. The method according to claim 26, wherein the determining of the refractive error of the at least one eye of the user further comprises the following step: b) recording the reaction of the user to at least one variation of the at least one spatial modulation over time using an input unit.
28. A device for determining a refractive error of at least one eye of a user, the device comprising: a screen, wherein the screen is configured to display at least one image and at least one variation of at least one spatial modulation in the at least one image to a user; and an evaluation unit, wherein the evaluation unit is configured to detect a point in time at which a perception threshold of the user is indicated by a reaction of the user, and to determine a value for the at least one refractive error of the at least one eye of the user from the at least one spatial modulation in the at least one image at the point in time, wherein the at least one image comprises a source image having a plurality of picture elements, wherein values for at least one image parameter are assigned to the plurality of picture elements, wherein the evaluation unit is further configured to generate the at least one spatial modulation in the at least one image in a manner that the values of the at least one image parameter of the plurality of picture elements determine the values of at least one modulation parameter of the at least one spatial modulation in the at least one image.
29. The device according to claim 28, wherein the device further comprises: an input unit, wherein the input unit is configured to record a reaction of the user to the at least one variation of the at least one spatial modulation in the at least one image over time.
30. The device according to claim 28, wherein the device further comprises at least one of: a processing unit, wherein the processing unit is configured to generate the at least one image and the at least one variation of the at least one spatial modulation in the at least one image; or a distance meter, wherein the distance meter is configured to determine a distance between the at least one image and the at least one eye of the user.
31. The device according to claim 30, wherein at least one of the screen, the input unit, the evaluation unit, the processing unit, or the distance meter is comprised by at least one of a mobile communication device or a virtual reality headset.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0126] The disclosure will now be described with reference to the drawings wherein:
DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0132] In the exemplary embodiment as shown in
[0133] In addition, the device 110 may be configured to rotate the screen 118 displaying the image 120 within a plane being perpendicular with respect to a direction of view 126 of the user 114. In the exemplary device 110 as illustrated in
[0134] As further illustrated in
[0135] As shown in
[0136] The spatial modulation 132 used for generating the image 120 as depicted in
[0137] According to the present disclosure, the value for the image parameter “shade” of the picture elements 130 within the spatial period 138, thus, determines whether the value of “0.6” or of “0.4” is used for the duty cycle within the spatial period 138. Therefore, the duty cycle assumes the value of “0.6” in the black sections of the source image 128, while the duty cycle assumes the value of “0.4” in the white sections of the source image 128. As a result, the image 120 has, on one hand, first areas comprising the black sections of the source image 128 and the thick black stripes 134 having the value of “0.6” for the duty cycle and, on the other hand, second areas comprising the white sections of the source image 128 and the thin black stripes 134 having the value of “0.4” for the duty cycle. Thus, the value of the duty cycle may vary between adjacent spatial periods 138, whereby the length of the spatial period 138 is, however, left unaffected.
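For illustration only, the duty-cycle encoding of paragraph [0137] can be sketched in Python. This is not the claimed implementation; the array layout, the period of 10 pixels, and the function name `pwm_encode` are assumptions made for the sketch. Dark picture elements of the source image select the duty cycle "0.6", light ones "0.4", while the spatial period of the stripes is left unaffected:

```python
import numpy as np

def pwm_encode(source, period=10, duty_dark=0.6, duty_light=0.4):
    """Encode a binary source image (1 = dark, 0 = light) as a striped,
    pulse-width-modulated image with a fixed spatial period.

    Within each period, the fraction of dark columns (the duty cycle) is
    chosen per pixel from the source image, so the local mean brightness
    carries the source content while the stripe frequency stays constant.
    """
    cols = np.arange(source.shape[1])
    phase = (cols % period) / period                 # position within a period
    duty = np.where(source == 1, duty_dark, duty_light)
    # a pixel is rendered dark (0.0) when its phase lies inside the duty window
    return np.where(phase[None, :] < duty, 0.0, 1.0)

# toy source image: left half dark, right half light
src = np.zeros((4, 40), dtype=int)
src[:, :20] = 1
img = pwm_encode(src)
# dark half: 60% dark columns -> mean brightness 0.4
# light half: 40% dark columns -> mean brightness 0.6
```

The stripe frequency is identical in both halves; only the duty cycle, and hence the local mean brightness, differs, which is what makes the source image recoverable by low-pass demodulation later in the description.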
[0138] In addition, the image 120 as illustrated in
[0139] Alternatively or in addition, at least one other modulation type (not depicted here) can be used for the at least one spatial modulation 132, wherein the other modulation type may, particularly, be selected from an amplitude modulation or a frequency modulation. Using the amplitude modulation, on one hand, would, as further depicted in
[0140] As further illustrated in
[0141] varying the spatial frequency of the spatial modulation 132;
[0142] varying the distance 122 between the image 120 and the at least one eye 112 of the user 114; and
[0143] rotating the image 120 in a plane perpendicular with respect to the direction of view 126 of the user 114.
[0144] As indicated above, the spatial frequency of the spatial modulation 132 may be varied, either in a direct fashion according to embodiment (i), by displaying a different image 120 having a different spatial frequency to the user 114 or, in an indirect fashion according to embodiment (ii), by varying the distance 122 between the image 120 and the at least one eye 112 of the user 114. As further indicated above, the image 120 may be rotated in a plane being perpendicular to the direction of view 126 of the user 114 according to embodiment (iii) by using a rotating unit configured to physically rotate the image 120. However, the image 120 may also be rotated in a plane that is perpendicular to the direction of view 126 of the user 114 in a virtual fashion by rotating the image 120 as displayed on the screen 118. In addition, further ways of rotating the image 120 may also be conceivable.
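A virtual rotation according to embodiment (iii) can be illustrated by generating the modulation pattern directly at a given orientation, which probes different meridians of the eye without physically moving the screen. The following Python sketch is illustrative only; the grating form, the image size, and the function name `grating` are assumptions:

```python
import numpy as np

def grating(size, cycles, theta_deg):
    """Sinusoidal grating with the given spatial frequency (cycles across
    the image) at orientation theta_deg; varying theta_deg corresponds to
    rotating the image in a plane perpendicular to the direction of view.
    """
    theta = np.deg2rad(theta_deg)
    y, x = np.mgrid[0:size, 0:size] / size
    # coordinate along the modulation axis after rotation
    u = x * np.cos(theta) + y * np.sin(theta)
    return 0.5 + 0.5 * np.sin(2 * np.pi * cycles * u)

g0 = grating(64, 8, 0)     # stripes modulated along x (vertical stripes)
g90 = grating(64, 8, 90)   # stripes modulated along y (horizontal stripes)
```

Sweeping `theta_deg` while repeating the threshold measurement is one way to sample the refractive values of the different main sections needed for the cylindrical power described in paragraph [0149].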
[0145] As depicted in
[0146] As further illustrated in
[0147] In accordance with the present disclosure, the evaluation unit is configured to detect a point in time at which a perception threshold of the image 120 for the user 114 is indicated by the reaction of the user 114 to the image 120, in particular to the variation of the spatial modulation 132 in the image 120 over time. As defined above, the perception threshold may refer to a first event, in which the user may be able to recognize the source image 128 within the image 120, or to a further event, in which the user can just still recognize the source image 128 before the source image 128 vanishes from the image 120 as recognized by the user 114.
[0148] Accordingly, the perception threshold can be used to determine the point in time at which the first event or the further event may occur. However, by knowing the point in time as deduced in this fashion, respective values can be derived for the at least one modulation parameter of the spatial modulation 132 in the image 120 at the point in time, on one hand, and, due to the relationship between the modulation parameter used for generating the spatial modulation 132 in the image 120 and the at least one image parameter initiating the spatial modulation 132 in the image 120, for the at least one image parameter at the point in time, on the other hand. In accordance with the present disclosure, such a deduction can be used to determine a value for the at least one refractive error of the at least one eye 112 of the user 114. The value for the at least one refractive error of the at least one eye 112 of the user 114 can, subsequently, be reported to at least one of the user 114 or the assistant, preferably, via the screen 118 after the end of displaying the one or more images 120 to the user 114. However, further fashions for reporting the value of the at least one refractive error of the at least one eye 112 of the user 114 are conceivable.
[0149] In particular, by varying the spatial frequency of the spatial modulation 132, either according to embodiment (i) or to embodiment (ii), without rotating the image 120 according to embodiment (iii) as defined above, a refractive value of a single main section of a spherocylindrical lens which may be required to correct this part of the refractive error of the user 114 can, thus, be determined, on one hand. On the other hand, rotating the image 120 according to embodiment (iii) allows determining a cylindrical power which refers, as defined above, to an algebraic difference between the refractive values of the main sections and the cylinder axis that indicates the direction of the main section of the spectacle lens whose apex refractive index is used as reference.
[0150] In a particularly preferred embodiment, the desired value for the refractive error of the at least one eye of the user 114 can be determined by demodulating the image which has, previously, been modulated by the at least one type of modulation using the at least one modulation parameter. Herein, a demodulation of the image 120 can be considered as filtering the image at the point in time, whereby a filter which can be used for the demodulation of the image 120 may, thus, comprise additional information related to the point in time. In particular, a frequency which is used for modulating the source image 128 can be considered as proportional to a power of the filter which is, thus, proportional to the desired value for the refractive error of the user 114.
[0151] As already described above, the filter as used for the demodulation, typically, comprises a low pass filter, in particular the two-dimensional equivalent of the low pass filter which is, usually, denoted by the terms “Gauss filter” or “Sinc² filter,” which is configured to remove high frequency portions from the image 120. As schematically depicted in
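The low-pass demodulation of paragraphs [0150] and [0151] can be sketched as follows. This is an illustrative Python sketch under assumed parameters (stripe period of 10 pixels, filter width `sigma=6`), not the claimed procedure; it uses a two-dimensional Gauss filter to remove the high-frequency carrier stripes so that only the local mean brightness, i.e., the source image content, remains:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# PWM-style image: stripes with duty cycle 0.6 on the left half
# (dark source region) and 0.4 on the right half (light source region)
w = 80
cols = np.arange(w)
phase = (cols % 10) / 10
row = np.where(phase < np.where(cols < w // 2, 0.6, 0.4), 0.0, 1.0)
img = np.tile(row, (20, 1))

# 2-D Gaussian low-pass: the carrier stripes are averaged out, leaving the
# low-frequency brightness difference that encodes the source image
recovered = gaussian_filter(img, sigma=6)
left = recovered[10, 10:30].mean()    # over the dark source region
right = recovered[10, 50:70].mean()   # over the light source region
```

After filtering, the dark source region remains darker than the light one (means near 0.4 and 0.6), so thresholding the filtered image reconstructs the source image; a filter too narrow to remove the carrier would leave the stripes visible instead.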
[0152] Alternatively or in addition, an optical filter may be used for the same purpose with similar efficiency as illustrated in
[0153] Corresponding to each of
[0155] In a displaying step 212 according to step a), the image 120 is displayed to the user 114, wherein the image 120 comprises the at least one spatial modulation 132 as described above in more detail.
[0156] In a recording step 214 according to step b), the reaction of the user 114 to the variation of the spatial modulation 132 in the image 120 may be recorded over time. Herein, the spatial modulation 132 in the image 120 can be varied, preferably by using at least one of the embodiments (i) to (iii) as indicated above in more detail. Herein, the recording step 214 may be repeated with different images 120 displaying at least one of a different value of the spatial modulation 132 or a different orientation of the spatial modulation 132 in a plane perpendicular to the direction of view 126 of the user 114 until the point in time is detected in a detecting step 216.
[0157] In the detecting step 216 according to step c), the point in time is detected at which the perception threshold of the user 114 is indicated by a reaction of the user 114 to the particular image 120 as currently displayed to the user 114. For this purpose, the input unit 144 as described above in more detail may be used, however, further kinds of input units may also be feasible. In an alternative embodiment, the perception threshold of the user 114 may be indicated by a reaction of the user 114 without requiring an input unit, wherein the reaction of the user 114 may, preferably, be at least one physiological reaction, in particular, selected from at least one of the physiological reactions as described above in more detail. In this alternative embodiment, performing the recording step 214 according to step b) may be dispensable.
[0158] In a determining step 218 according to step d), the value of at least one modulation parameter of the at least one spatial modulation 132 in the image 120 at the point in time as detected in the detecting step 216 is used for determining a desired value 220 for the at least one refractive error of the at least one eye 112 of the user 114. For details concerning the determining step 218, reference can be made to the description above.
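The sequence of steps 212 to 218 can be summarized in a minimal Python sketch. It is illustrative only: the callback `perceives`, the frequency range, and the linear calibration at the end are assumptions standing in for the real response recording and for the demodulation-based determination described above:

```python
def find_threshold(perceives, freqs):
    """Sketch of the method 210: display modulations of increasing spatial
    frequency (displaying step 212), record the user's reaction to each
    (recording step 214), and detect the first frequency at which the
    source image is no longer recognized (detecting step 216), which
    marks the perception threshold.
    """
    for f in freqs:
        if not perceives(f):   # reaction of the user to the image at f
            return f           # threshold reached at this frequency
    return None                # threshold not reached within the range

# hypothetical observer who stops recognizing the image at 12 cycles/degree
threshold = find_threshold(lambda f: f < 12, range(1, 30))

# determining step 218: map the threshold to a refractive-error estimate;
# the factor below is purely illustrative, the real mapping follows from
# the demodulation described in paragraphs [0150] and [0151]
refraction_estimate = -0.1 * threshold
```

In practice the loop over `freqs` corresponds to repeating the recording step 214 with different images 120 until the point in time is detected in the detecting step 216.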
[0159] The foregoing description of the exemplary embodiments of the disclosure illustrates and describes the present invention. Additionally, the disclosure shows and describes only the exemplary embodiments but, as mentioned above, it is to be understood that the disclosure is capable of use in various other combinations, modifications, and environments and is capable of changes or modifications within the scope of the concept as expressed herein, commensurate with the above teachings and/or the skill or knowledge of the relevant art.
[0160] All publications, patents and patent applications cited in this specification are herein incorporated by reference, and for any and all purposes, as if each individual publication, patent or patent application were specifically and individually indicated to be incorporated by reference. In the case of inconsistencies, the present disclosure will prevail.
LIST OF REFERENCE SIGNS
[0161] 110 device
[0162] 112 eyes
[0163] 114 user
[0164] 116 electronic device
[0165] 118 screen
[0166] 120 image
[0167] 122 distance
[0168] 124 distance meter
[0169] 126 direction of view
[0170] 128 source image
[0171] 130 picture element
[0172] 132 spatial modulation
[0173] 134 stripes
[0174] 136 spatial distance
[0175] 138 spatial period
[0176] 140 zig-zag pattern
[0177] 142 row
[0178] 144 input unit
[0179] 146 keyboard
[0180] 148 finger
[0181] 150 evaluation unit
[0182] 152 connection
[0183] 154 reconstructed source image
[0184] 156 reconstructed source image
[0185] 160 grayscale value
[0186] 162 location
[0187] 164 course
[0188] 166 course
[0189] 210 method
[0190] 212 displaying step
[0191] 214 recording step
[0192] 216 detecting step
[0193] 218 determining step
[0194] 220 value for the refractive error