METHOD AND DEVICE FOR MEASURING AN OPTICAL LENS FOR INDIVIDUAL WEARING SITUATIONS BY A USER
20210325276 · 2021-10-21
CPC classification
A61B3/1005
HUMAN NECESSITIES
A61B3/107
HUMAN NECESSITIES
A61B3/0025
HUMAN NECESSITIES
A61B3/103
HUMAN NECESSITIES
G01B11/245
PHYSICS
Abstract
An apparatus for measuring a cornea of a subject contains an image capturing device configured to capture image data of an iris of the subject from a plurality of viewpoints by imaging beam paths which pass through the cornea, and a computing unit for providing a mathematical model of an anterior eye section of the subject including a mathematical model of the cornea and the iris. The computing unit further identifies and registers image features of the iris which are present in a plurality of images of the image data, and determines deviations between actual positions of the image features of the iris in the images captured from the plurality of viewpoints and their expected positions in consideration of the mathematical model of the cornea and the position of the iris. Parameters of the mathematical model of the cornea are adapted so as to minimize the deviations, and a metric of the cornea is determined from the adapted model.
Claims
1. An apparatus for measuring a cornea of a subject, the apparatus comprising: an image capturing device configured to capture image data of an iris of the subject from a plurality of viewpoints by imaging beam paths which pass through the cornea; and a computing unit configured to: provide a mathematical model of an anterior eye section of the subject including a mathematical model of the cornea and the iris; identify and register image features of the iris, which are present in a plurality of images of the image data; determine deviations between actual positions of the image features of the iris in the images captured from the plurality of viewpoints and expected positions of the image features of the iris in the images captured from the plurality of viewpoints in consideration of the mathematical model of the cornea and the position of the iris; adapt parameters of the mathematical model of the cornea so as to minimize the deviations; and determine a metric of the cornea from the adapted mathematical model of the cornea.
2. The apparatus as claimed in claim 1, wherein the computing unit is configured to evaluate a refractive power or an astigmatism from the adapted mathematical model of the cornea.
3. The apparatus as claimed in claim 1, wherein the mathematical model comprises a relative position of the iris with respect to the cornea.
4. The apparatus as claimed in claim 1, wherein the apparatus is configured to calculate the shape of the cornea situated between the iris and the image capturing device without prior knowledge of the appearance of the iris.
5. The apparatus as claimed in claim 1, wherein the apparatus is configured to perform a correlation of previously unknown image features in images of the iris recorded from different viewpoints.
6. The apparatus as claimed in claim 1, wherein the computing unit is configured to calculate the shape of the cornea based on a system of equations from imaging beam paths which are captured at the respective known positions.
7. The apparatus as claimed in claim 1, wherein the image capturing device comprises: a first camera configured to capture first image data from a first viewpoint; and a second camera configured to capture second image data from a second viewpoint, wherein the computing unit is configured to determine the three-dimensional shape of the cornea on the basis of the first image data and the second image data.
8. The apparatus as claimed in claim 1, wherein the computing unit is configured to iteratively determine the three-dimensional shape of the cornea with an integration method.
9. The apparatus as claimed in claim 1, wherein the computing unit is configured to determine the three-dimensional shape of the cornea on the basis of back tracing the respective imaging beam paths entering the image capturing device.
10. The apparatus as claimed in claim 1, wherein the computing unit is configured to determine the three-dimensional shape of the cornea by dividing at least one of a front surface or a back surface of the cornea into surface elements and to determine an alignment of the surface elements.
11. The apparatus as claimed in claim 10, wherein the computing unit is configured to determine a three-dimensional shape of the front surface and the back surface of the cornea on the basis of the alignment of the surface elements.
12. The apparatus as claimed in claim 1, wherein the computing unit is configured to determine the three-dimensional shape of the cornea taking account of a boundary condition that a front surface or a back surface of the cornea has a parameterizable area.
13. The apparatus as claimed in claim 12, wherein the parameterizable area includes a sphere, a section of the sphere, a torus, or a section of the torus.
14. The apparatus as claimed in claim 1, wherein the computing unit is further configured to determine the three-dimensional shape of the cornea taking account of one or more known contact points of the cornea.
15. The apparatus as claimed in claim 1, wherein the computing unit is configured to determine a spatial refractive index distribution of the cornea to be measured.
16. A method for measuring the cornea of a subject, the method comprising the steps of: capturing image data of an iris of the subject from a plurality of viewpoints by imaging beam paths which pass through the cornea; providing a mathematical model of an anterior eye section of the subject including a mathematical model of the cornea and the iris; identifying and registering image features of the iris, which are present in a plurality of images of the image data; determining deviations between actual positions of the image features of the iris in the images captured from the plurality of viewpoints and expected positions of the image features of the iris in the images captured from the plurality of viewpoints in consideration of the mathematical model of the cornea and the position of the iris; adapting parameters of the mathematical model of the cornea so as to minimize the deviations; and determining a metric of the cornea from the adapted mathematical model of the cornea.
17. The method of claim 16, further comprising a preceding step of calibrating a camera system for capturing the image data of the iris from the plurality of viewpoints.
18. A computer program product being stored on a non-transitory storage medium and comprising instructions that, upon execution of the program by a computer, cause the computer to perform the method of claim 16.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0048] The disclosure will now be described with reference to the drawings.
DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0066] The apparatus 10 shown in the drawings comprises a display device 20, which is configured to display a test structure 21.
[0067] The apparatus 10 further comprises an image capturing device 30 which is configured to capture image data of the test structure 21 from a plurality of viewpoints 31, 31′, 31″ by way of imaging beam paths 32 which pass through the lens 100. In principle, the imaging beam paths from the various viewpoints could be recorded successively by a single camera which is arranged successively at the various positions. Typically, however, a plurality of cameras are provided in order to capture the image data in parallel. It is understood that mixed forms may also be provided. By way of example, the image capturing device 30 can comprise a first camera 33 and a second camera 34, wherein the first camera 33 is configured to capture first image data from a first viewpoint 31 and the second camera 34 is configured to capture second image data from a second viewpoint 31′. The measurement volume 200 is located between the test structure 21, which is displayable on the display device 20, and the image capturing device 30.
[0068] The apparatus 10 further comprises a computing unit 40. By way of example, the computing unit 40 can be a computer, a microcontroller, an FPGA or the like. The computing unit 40 is configured to determine a three-dimensional shape of the lens 100 on the basis of the image data; and to calculate an optical effect of the lens 100 on the basis of the three-dimensional shape. Expressed differently, a two-stage procedure is proposed, in which the three-dimensional shape of the lens is initially determined and only then is the optical effect of the lens calculated from its three-dimensional shape.
[0069] This approach as per the present disclosure will be explained in more detail below with reference to the drawings.
[0072] However, in this case, the spectacles 101 with the optical lens 100 are arranged with a large tilt in the measurement region, and so the beam deflection caused by the optical lens 100 only reproduces the actual optical effect in a wearing position with limited accuracy.
[0074] The inventors have recognized that such uncertainty or ambiguity of the optical effect can be resolved by virtue of recording the test structure from a plurality of viewpoints and consequently capturing a multiplicity of imaging beam paths.
[0075] A simplified example of beam paths through a lens 100 is reproduced in the drawings.
[0076] To this end, the computing unit can be configured to model the lens 100, typically as a composed surface made of parameterizable surface elements, as shown in the drawings.
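By way of illustration only, the following minimal sketch (Python/NumPy; all names and values are hypothetical and not part of the disclosure) shows a lens surface modeled as a grid of parameterizable surface elements, each carrying a sample point and a unit normal, together with Snell's law in vector form for refracting a camera ray at such an element:

```python
import numpy as np

def refract(d, n, eta):
    """Snell's law in vector form: refract unit direction d at a surface
    element with unit normal n facing the incoming ray, eta = n1 / n2."""
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        raise ValueError("total internal reflection")
    return eta * d + (eta * cos_i - np.sqrt(1.0 - sin2_t)) * n

# Composed surface: a grid of surface elements, each described by a sample
# point and a unit normal (here a shallow spherical cap of radius R).
R = 100.0                                    # radius of curvature, mm
xs, ys = np.meshgrid(np.linspace(-10, 10, 21), np.linspace(-10, 10, 21))
zs = R - np.sqrt(R**2 - xs**2 - ys**2)       # sag of the front surface
normals = np.stack([xs, ys, zs - R], axis=-1) / R   # unit normals, towards -z

# Refract one incoming camera ray at the element it strikes:
d_in = np.array([0.0, 0.0, 1.0])             # ray travelling along +z
d_out = refract(d_in, normals[10, 15], 1.0 / 1.5)   # air into n = 1.5 glass
```

In a full reconstruction, each element's normal would be a free parameter adjusted so that rays refracted at the composed surface reproduce the captured imaging beam paths.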
[0077] Optionally, the apparatus can be embodied as an apparatus for measuring a spatial refractive index distribution of an optical lens arranged in a measurement volume. To this end, provision can typically be made of an interface which is configured to receive lens geometry data, which describe a three-dimensional shape of the lens. In this case, the shape of the lens need not be calculated; instead, it can serve as an input parameter for calculating the spatial refractive index distribution of the lens on the basis of the image data and the lens geometry data.
[0081] Light from defined sources at defined origins of the test structure 21 passes through the lens 100 and is captured by the image capturing device 30 from different viewing angles by means of a calibrated camera system. The refractive surfaces of the lens body are reconstructed from the resulting images.
[0082] The principle works with one camera, two cameras, or more cameras. Two cameras are used in an exemplary embodiment, as a good cost/benefit ratio is obtained in this case. Even more cameras can be used to further increase the accuracy.
[0083] The image capturing device 30 or its cameras 33, 34 is/are calibrated in such a way that a function is known by means of which a unique chief light ray (camera ray), defined by its origin and direction in 3D, can be derived for each sensor coordinate. This calibration can be carried out according to the related art. Alternatively, a known optical design of the camera and/or of an employed objective can be included in the model of the camera instead of the above-described camera calibration.
[0084] By way of example, the display device 20 can have self-luminous sources, such as light-emitting diodes arranged in an array, a TFT or LED display, a 3D display, laser sources, a polarization display, or else a collimated, selectively structured illumination unit. Light can also be shone onto the display device. By way of example, a display device onto which light is shone may have test charts (e.g., a point pattern or checker pattern), a 3D pattern which is in particular regular, an unknown feature-rich flat image (wherein positions can be estimated during operation), or else an unknown feature-rich 3D scene (wherein positions are estimated during the optimization).
[0085] The computing unit 40 can use further information for determining the three-dimensional shape. The reconstruction of the three-dimensional shape may in particular also be based on the known viewpoints or positions of the camera, from which the image data are captured, and a known position of the test structure. In the present example, the image data can be locations of the imaging of light beams, entering the cameras, on the camera detectors. The light beams entering the image capturing device can be calculated from the image data and the known viewpoints. A calibration of the image capturing device can serve as a basis for this.
[0086] Optionally, the computing unit 40 can further be configured to determine the three-dimensional shape of the lens taking account of one or more boundary conditions. By way of example, a contact point or stop 51 may be predetermined. The relative position of the lens 100 is known at this point and can be taken into account when determining the three-dimensional shape of the lens. Further, information such as the shape of a front and/or back surface of the lens, a refractive index or material, etc., may be predetermined. Optionally, the apparatus can be embodied to read information present on the lens, for example in the form of an engraving or a marker 140, and take this information into account when determining the three-dimensional shape and/or when calculating the optical effect.
[0087] A particularly advantageous application of the present disclosure lies in the measurement of spectacle lenses, in particular the measurement of progressive spectacle lenses—also known as varifocal spectacle lenses. Simpler spectacle lenses such as spherical, aspherical, toric or prismatic lenses can, however, likewise be measured using the apparatus proposed.
[0088] Optionally, the computing unit can be configured to calculate an ISO vertex power or a vertex power in a specified measuring appliance configuration in order to provide comparable data. By providing wearer-specific data, such as the distance of a pupil from the spectacle lens (vertex distance) and its relative position (e.g., face form angle or “as worn” pantoscopic angle), it is possible to calculate use vertex powers.
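For the vertex-power conversion mentioned in [0088], the standard vertex correction formula S′ = S / (1 − d·S) relates the power S at one reference plane to the effective power at a plane shifted by d along the light path. A minimal illustrative sketch (hypothetical names and values, not from the disclosure):

```python
def effective_vertex_power(power_dpt: float, shift_mm: float) -> float:
    """Effective power (dioptres) when the reference plane is shifted by
    shift_mm towards the eye, via S' = S / (1 - d*S) with d in metres."""
    d = shift_mm / 1000.0
    return power_dpt / (1.0 - d * power_dpt)

# A +5.00 dpt lens specified at the spectacle plane, referenced 12 mm
# closer to the cornea (a typical vertex distance):
print(round(effective_vertex_power(5.0, 12.0), 2))  # ~5.32 dpt
```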
[0089] Optionally, a plurality of test objects can be measured simultaneously in the measurement space. In the case where a pair of spectacles with a left and a right spectacle lens is measured, the computing unit can be further embodied to determine a position and relative position of the spectacle lenses with respect to one another. From this, it is possible to calculate further information, such as the distance of the optical channels for example. A transparent body with zones of different effects can also be provided as a plurality of test objects. By way of example, this can be a pair of spectacles with two lenses or a lens with a plurality of zones—bifocal lens, trifocal lens or multifocal lens.
[0091] Optionally, the measuring method can be preceded by step 905 for calibrating the apparatus.
[0092] A corresponding method for calibrating the apparatus may, in turn, include the following steps: In a first calibration step, a test structure is provided on the display device. In a second calibration step, a first distance is set between the image capturing device and the display device and image data of the test structure are captured from the first distance by means of the image capturing device.
[0094] In a further step of the method for calibrating the apparatus, a second distance can be set between the image capturing device and the display device and image data of the test structure are captured from the second distance by means of the image capturing device. From this, a direction of incident light beams, captured by the image capturing device, and corresponding image points in the image data can be determined in a further step.
[0096] In a first step S1011, a test structure is displayed on the display device. By way of example, this can be an entire point or stripe pattern. In a further step S1012, image data of the test structure are captured by the image capturing device. In step S1013, it is possible to determine positions of features of the test structure, for example the positions of pattern points in the image data (corresponding to positions on a detector surface of the image capturing device). Here, there can be a camera calibration step S1001, as explained above.
[0097] In a step S1021, a complete or partial pattern of a test structure can be displayed on the display device. In a further step S1022, image data of the test structure are captured by the image capturing device. In step S1023, pattern points can be associated with image points in the image data. In particular, it is possible to provide a sequence of different test patterns in order to resolve a possible ambiguity when associating pattern points with image points in the image data. Expressed differently, luminous spots in the image data captured by the image capturing device can be assigned to a position of the luminous points on the display device, and hence also to the calculated light beams, which were incident in the image capturing device. As an alternative or in addition thereto, the computing unit can be configured to determine neighborhood relationships from an overall pattern of a test structure.
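The "sequence of different test patterns" used in step S1023 to resolve association ambiguities could, for instance, be a temporal binary coding as known from structured-light methods; the patent does not prescribe a specific coding, so the following one-dimensional sketch is merely one possible realization:

```python
import numpy as np

def binary_patterns(n_cols: int):
    """One stripe pattern per bit: pattern k is bright where bit k of the
    display column index is set."""
    n_bits = int(np.ceil(np.log2(n_cols)))
    cols = np.arange(n_cols)
    return [(cols >> k) & 1 for k in range(n_bits)]

def decode(observed_bits):
    """Recover the display column index seen by one camera pixel from the
    sequence of bright/dark observations."""
    return sum(int(bit) << k for k, bit in enumerate(observed_bits))

patterns = binary_patterns(1024)
observed = [p[637] for p in patterns]   # simulate a pixel imaging column 637
assert decode(observed) == 637          # ambiguity resolved uniquely
```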
[0098] In a step S1031, a planar illumination can be provided on the display device. By way of example, all pixels of the display device could display “white”. As a consequence, a contour of the lens could stand out and a contour of the lens can be determined in step S1032. In a step S1033, a relative position and dimensions of the lens can be determined on the basis of the captured contour. Expressed differently, a relative position of the lens in the measurement volume can be determined in a simple manner.
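Steps S1031 to S1033 might be prototyped with a simple intensity threshold on a backlit frame; the following toy sketch uses synthetic data (all values hypothetical):

```python
import numpy as np

# Synthetic backlit frame: bright "white" background, darker ring where
# light is deflected at the lens rim.
yy, xx = np.mgrid[0:480, 0:640]
rim = np.abs(np.hypot(xx - 320, yy - 240) - 150) < 3
frame = np.where(rim, 0.2, 1.0)

mask = frame < 0.5                      # pixels belonging to the contour
ys, xs = np.nonzero(mask)
cx, cy = xs.mean(), ys.mean()           # rough lens position in the image
diameter_px = xs.max() - xs.min()       # rough lateral extent
print(f"centre ({cx:.0f}, {cy:.0f}), diameter {diameter_px} px")
```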
[0099] In step S1041, there can be a calculation of a “best fitting” parameterizable lens. Typically, a “best fitting” parameterizable lens which could lie in the measurement volume of the appliance can be ascertained by back propagation of the camera light beams. A parameterizable lens is understood to mean a lens that can be described by a few parameters, such as radius, thickness, or refractive index. These include spherical and toric lenses, for example. Toric lenses represent a good general-purpose compromise and may be applied here. In a more specific exemplary embodiment, it may be sufficient to define individual “toric zones” on the lens and to describe the spectacle lens only there. By way of example, a first of these zones could be a “far region” of a progressive lens, and a second of these zones could be a “near region” of the progressive lens. In addition to the location of the lens or of the individual surfaces, further parameters could be the radii, the thickness, and the refractive index.
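The back propagation of camera rays through a candidate parameterizable lens in step S1041 can be illustrated with a deliberately simplified stand-in: a two-dimensional paraxial thin-lens model whose focal length and axial position are fitted with a least-squares solver so that back-propagated camera rays land on the known source positions. This toy model and all names in it are our own choice, not the parameterization used in the disclosure:

```python
import numpy as np
from scipy.optimize import least_squares

Z_CAM = 600.0                        # camera reference plane, mm from display
F_TRUE, ZL_TRUE = 120.0, 250.0       # ground truth: focal length, lens position

def back_propagate(y_cam, slope, f, z_lens):
    """Trace camera rays back towards the display (at z = 0) through a
    thin lens (paraxial 2D): returns the heights at which they land."""
    y_lens = y_cam + slope * (Z_CAM - z_lens)   # free propagation to the lens
    slope2 = slope - y_lens / f                 # thin-lens slope change
    return y_lens + slope2 * z_lens             # on to the display plane

rng = np.random.default_rng(0)
y_cam = rng.uniform(-5, 5, 40)                  # ray heights at the camera
slope = rng.uniform(-0.05, 0.05, 40)            # ray slopes towards display
h_src = back_propagate(y_cam, slope, F_TRUE, ZL_TRUE)   # "known" sources

def residuals(params):
    f, z_lens = params
    return back_propagate(y_cam, slope, f, z_lens) - h_src

fit = least_squares(residuals, x0=[80.0, 200.0])
print(fit.x)   # recovers approximately [120.0, 250.0]
```

In the same spirit, the fit could be extended to the radii, thickness, and refractive index of a thick spherical or toric lens, as named in [0099].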
[0100] In step S1042, a “best fitting” gradient surface of the front and/or back surface of the lens can be determined by inverse ray tracing of the camera rays. Consequently, a surface of the “best fitting” parameterizable lens determined in step S1041 can be described as a gradient surface, and the gradients at the locations of the beam passage can be varied in such a way that the positions of the luminous points on the display device are hit exactly by back propagation of the camera rays. In simple terms, the three-dimensional shape of the lens is therefore adapted in such a way that the light beams received by the image capturing device and the associated beam sources on the display device fit together.
[0101] In step S1043, a front and/or back surface of the lens can be obtained by integration from the gradient surface. Expressed differently, a (continuous) new surface is determined from a piecewise gradient surface or a gradient surface determined for surface elements. Here, this could be the front surface or the back surface of the lens.
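For the integration of a gradient surface into a continuous surface in step S1043, one classical choice (ours, not prescribed by the disclosure) is the Frankot-Chellappa FFT integrator, which returns the least-squares surface consistent with a given gradient field:

```python
import numpy as np

def integrate_gradients(p, q):
    """Least-squares integration of a gradient field (p = dz/dx, q = dz/dy)
    into a surface z, via the Frankot-Chellappa FFT method."""
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2.0 * np.pi
    wy = np.fft.fftfreq(h) * 2.0 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                      # avoid division by zero at DC
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                          # free integration constant
    return np.real(np.fft.ifft2(Z))

# Round trip on a periodic synthetic surface with analytic gradients:
k = 2.0 * np.pi / 128
y, x = np.mgrid[0:128, 0:128]
z_true = np.cos(k * x) * np.cos(k * y)
p = -k * np.sin(k * x) * np.cos(k * y)     # analytic dz/dx
q = -k * np.cos(k * x) * np.sin(k * y)     # analytic dz/dy
print(np.abs(integrate_gradients(p, q) - z_true).max())  # ~1e-15
```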
[0102] According to step S1044, steps S1042 and S1043 can be repeated iteratively. By way of example, the steps could be repeated until a quality criterion has been satisfied. Optionally, if a sufficient quality cannot be reached, step S1041 can also be included in the iteration loop, in order to take account of alternative lens geometries. A three-dimensional shape of the lens can be available as a result of the iteration.
[0103] In a further exemplary embodiment, which may assist with the understanding of the disclosure, a shape of the lens can be predetermined and, instead, a spatial refractive index distribution within the lens can be iteratively determined in an analogous fashion.
[0104] One or more variables can subsequently be determined from the determined three-dimensional shape (optionally including the refractive index). A use value, in particular a user-specific use value, can be calculated in step S1052. To this end, wearer-specific data, such as the corneal vertex distance (the distance between the cornea and the back vertex of the spectacle lens), can be provided in step S1051. An ISO vertex power can be determined in step S1053. A vertex power in an appliance configuration can be determined in step S1054.
[0105] If a plurality of lenses or spectacle lenses were arranged in the measurement volume at the same time, it is optionally possible to determine additional parameters, such as the spacing of the progression channels.
[0106] It is understood that the aforementioned steps can be carried out by the computing unit and that the latter can be configured accordingly for the purposes of carrying out the steps.
[0107] A calibrated camera of the image capturing device can be described by a set of functions which assigns, to each pixel coordinate (x, y), a light beam in space of the form

(x₀(x, y), y₀(x, y), 0) + t · (dx(x, y), dy(x, y), 1), t ∈ ℝ,

where (x₀, y₀, 0) describes a point of the light beam in a reference plane of the image capturing device, typically a point of the light beam in a reference plane in the lens system of a camera of the image capturing device, and (dx, dy, 1) describes the direction vector of the incident beam. Consequently, the set of functions consists of four functions: x₀(x, y), y₀(x, y), dx(x, y), and dy(x, y), where x and y describe the pixel coordinates in image data of the image capturing device, a camera in this case.
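The set of four functions from [0107] might be held, for example, as per-pixel interpolants; the sketch below uses simple linear stand-ins (hypothetical coefficients, not calibration results from the disclosure) and evaluates a pixel coordinate to a 3D camera ray:

```python
import numpy as np

# Hypothetical calibration result: each of the four functions x0(x, y),
# y0(x, y), dx(x, y), dy(x, y) as a simple linear model (a real
# calibration would use, e.g., spline or polynomial fits per pixel).
CAL = {
    "x0": lambda x, y: 0.002 * x,         # reference-plane intersection
    "y0": lambda x, y: 0.002 * y,
    "dx": lambda x, y: 1e-4 * (x - 320),  # direction components
    "dy": lambda x, y: 1e-4 * (y - 240),
}

def pixel_to_ray(x: float, y: float):
    """Chief ray for sensor pixel (x, y): a point in the reference plane
    (z = 0) and a direction vector with unit z component."""
    origin = np.array([CAL["x0"](x, y), CAL["y0"](x, y), 0.0])
    direction = np.array([CAL["dx"](x, y), CAL["dy"](x, y), 1.0])
    return origin, direction

o, d = pixel_to_ray(400.0, 300.0)
point_at_z = o + 250.0 * d   # where this ray is at z = 250 (t equals z)
```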
[0108] Such a set of functions can be determined by virtue of a test structure, e.g., a point pattern, being displayed on the display device and being observed by the cameras of the image capturing device from different distances. For this purpose, the display device and the image capturing device of the apparatus can be arranged at different known distances from one another, as described above.
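Per pixel, the determination described in [0108] reduces to fitting a line through the display points observed at the known distances; with two distances this is direct. A minimal sketch with hypothetical values:

```python
def ray_from_two_planes(p1, z1, p2, z2):
    """Recover (x0, y0, dx, dy) for one pixel from the display points
    p1, p2 (in mm) that it imaged at the two known distances z1, z2."""
    dx = (p2[0] - p1[0]) / (z2 - z1)
    dy = (p2[1] - p1[1]) / (z2 - z1)
    x0 = p1[0] - z1 * dx                 # back to the z = 0 reference plane
    y0 = p1[1] - z1 * dy
    return x0, y0, dx, dy

# One pixel saw the pattern point at (12.0, 5.0) mm with the display at
# z = 300 mm and at (14.0, 6.0) mm with the display at z = 400 mm:
print(ray_from_two_planes((12.0, 5.0), 300.0, (14.0, 6.0), 400.0))
# -> (6.0, 2.0, 0.02, 0.01)
```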
[0111] Optionally, a relative spatial position of a contact point 51, as illustrated in exemplary fashion in the drawings, can be taken into account when determining the three-dimensional shape of the lens.
[0112] It is understood that the explanations made above can apply accordingly to the exemplary embodiments below, and vice versa. To avoid repetition, further aspects, in particular, are intended to be discussed below. Features of the aforementioned exemplary embodiments and the exemplary embodiments below can advantageously be combined with one another.
[0113] The inventors have recognized that the concepts described herein can also be advantageously used for measuring the cornea.
[0115] The image capturing device 30 can again have the same or a similar configuration as described above for the measurement of a lens.
[0116] The inventors have recognized that the cornea 1201 situated between the iris 1202 and the image capturing device 30 can also be calculated without knowledge of how the iris 1202 looks. Although the structure or pattern of the iris 1202 is unknown a priori, the iris is usually richly structured. The inventors have recognized that a multiplicity of image features of the iris can be identified and subsequently evaluated in respect of their position in a plurality of images of the image data which were recorded from different positions. To this end, a system of equations can be set up from the imaging beam paths 32 which are captured at the respective known positions; the shape of the cornea 1201 can be calculated therefrom.
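The system-of-equations idea of [0116] can be illustrated with a strongly simplified, self-contained toy model (single spherical anterior surface, one iris feature, two viewpoints; all names and values hypothetical): camera rays belonging to the same iris feature are refracted at a candidate corneal sphere, and the corneal radius is chosen so that the refracted rays meet in a single point.

```python
import numpy as np
from scipy.optimize import minimize_scalar

N_CORNEA = 1.376   # assumed refractive index of the cornea

def unit(v):
    return v / np.linalg.norm(v)

def refract(d, n, eta):
    """Snell's law in vector form; d is the unit ray direction, n the unit
    normal facing the incoming ray, eta = n1 / n2 (small angles assumed,
    no total-internal-reflection handling)."""
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    return eta * d + (eta * cos_i - np.sqrt(1.0 - sin2_t)) * n

def make_camera_ray(p_iris, d_out, r):
    """Forward-trace a ray from an iris point out through a spherical
    anterior corneal surface (apex at the origin, centre at (0, 0, r)),
    then reverse it so that it looks like a back-traced camera ray."""
    c = np.array([0.0, 0.0, r])
    oc = p_iris - c
    b = np.dot(oc, d_out)
    t = -b + np.sqrt(b**2 - (np.dot(oc, oc) - r**2))  # exit from inside
    p = p_iris + t * d_out
    d_exit = refract(d_out, (c - p) / r, N_CORNEA)    # cornea -> air
    return p + 5.0 * d_exit, -d_exit                  # point outside, reversed

def miss_distance(r, rays):
    """Refract two camera rays at a candidate corneal sphere of radius r
    and return their distance of closest approach behind the surface."""
    c = np.array([0.0, 0.0, r])
    pts, dirs = [], []
    for o, d in rays:
        oc = o - c
        b = np.dot(oc, d)
        t = -b - np.sqrt(b**2 - (np.dot(oc, oc) - r**2))  # first hit
        p = o + t * d
        pts.append(p)
        dirs.append(refract(d, (p - c) / r, 1.0 / N_CORNEA))  # air -> cornea
    w = pts[0] - pts[1]
    cr = np.cross(dirs[0], dirs[1])
    return abs(np.dot(w, cr)) / np.linalg.norm(cr)

# Synthetic data: one iris feature seen from two viewpoints, true r = 7.8 mm.
R_TRUE = 7.8
p_iris = np.array([0.5, 0.2, 3.6])     # feature ~3.6 mm behind the apex
rays = [make_camera_ray(p_iris, unit(np.array(d)), R_TRUE)
        for d in ([0.05, 0.0, -1.0], [-0.06, 0.04, -1.0])]

res = minimize_scalar(lambda r: miss_distance(r, rays),
                      bounds=(6.5, 9.5), method="bounded")
print(res.x)   # close to 7.8: corneal radius recovered from consistency
```

With many features and viewpoints, the same residual generalizes to the deviation minimization over all parameters of the corneal model described in the claims.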
[0118] On the basis of this correlation or association analysis, it is possible to reconstruct a multiplicity of beam paths.
[0120] In conclusion, the solutions disclosed herein can facilitate, in particular, a simplified contactless measurement of lens elements arranged in a measurement volume, or else a contactless measurement of the cornea with reduced strain on a light-sensitive subject, in the field of ophthalmic optics.
[0121] The foregoing description of the exemplary embodiments of the disclosure illustrates and describes the present invention. Additionally, the disclosure shows and describes only the exemplary embodiments but, as mentioned above, it is to be understood that the disclosure is capable of use in various other combinations, modifications, and environments and is capable of changes or modifications within the scope of the concept as expressed herein, commensurate with the above teachings and/or the skill or knowledge of the relevant art.
[0122] The term “comprising” (and its grammatical variations) as used herein is used in the inclusive sense of “having” or “including” and not in the exclusive sense of “consisting only of.” The terms “a” and “the” as used herein are understood to encompass the plural as well as the singular.
[0123] All publications, patents and patent applications cited in this specification are herein incorporated by reference, and for any and all purposes, as if each individual publication, patent or patent application were specifically and individually indicated to be incorporated by reference. In the case of inconsistencies, the present disclosure will prevail.