APPARATUS AND METHOD FOR DETERMINING OPTICAL PARAMETERS
20170336654 · 2017-11-23
Assignee
Inventors
- Peter Seitz (Muenchen, DE)
- Markus Tiemann (Muenchen, DE)
- Gregor Esser (Muenchen, DE)
- Werner Mueller (Oetisheim, DE)
CPC classification
G02C13/005
PHYSICS
A61B3/08
HUMAN NECESSITIES
International classification
Abstract
An apparatus for determining optical parameters of a user with spectacles arranged in the use position on the head of the user includes at least one projection device designed and arranged for marking a partial region of the head of the user and/or of the spectacles of the user with a light projection; at least one image recording device designed and arranged for generating image data at least from the marked partial region of the head of the user and/or of the spectacles of the user; and a data processing device with a user data determining device, which is designed to determine user data from the marked partial region of the head and/or of the spectacles on the basis of the generated image data, wherein the user data comprise spatial information in the three-dimensional space of points of the partial region of the head and/or of the spectacles, and a parameter determining device, which is designed to determine optical parameters of the user on the basis of the user data.
Claims
1. An apparatus for determining optical parameters of a user with spectacles arranged in a use position on the head of the user, comprising: at least one projection device which is designed and arranged to mark a partial region of the head of the user and/or of the spectacles of the user with a light projection, at least one image recording device which is designed and arranged to generate image data of at least the marked partial region of the head of the user and/or of the spectacles of the user, and a data processing device comprising: a user data determination device which is designed to determine user data of the marked partial region of the head and/or of the spectacles using the generated image data, wherein the user data include spatial information in three-dimensional space of points of the partial region of the head and/or of the spectacles, and a parameter determination device which is designed to determine optical parameters of the user using the user data.
2. The apparatus according to claim 1, wherein the projection device is designed and arranged so that specific individual points on the head and/or the spectacles of the user are marked in the image data by the light projection.
3. The apparatus according to claim 2, wherein the projection device is designed and arranged so that, in the image data, at least one of the following user points is specifically marked: a pupil center point, an outer temporal frame point, an inner nasal frame point, an inner frame point above the pupil and/or an inner frame point below the pupil.
4. The apparatus according to claim 1, wherein the projection device is designed and arranged so that the light projection in the image data at least partially has the form of at least one line, at least one line intersection and/or at least one point.
5. The apparatus according to claim 1, wherein the projection apparatus is calibrated relative to the image recording device, and the user data determination device uses information about this calibration to determine the user data.
6. The apparatus according to claim 1, wherein the projection device has an adjustable projection direction.
7. The apparatus according to claim 1, wherein the light projection provided by the projection device contains at least one projection plane that at least partially projects a line on the exposed partial region of the head and/or of the spectacles of the user.
8. The apparatus according to claim 7, wherein the projection device is calibrated relative to the image recording device so that the optical axis of the image recording device and the projection plane intersect at an intersection angle of at least 10° and at most 70°, which intersection angle is known in advance.
9. The apparatus according to claim 1, wherein the light projection provided by the projection device contains at least two projection planes that intersect at a predetermined angle.
10. The apparatus according to claim 1, wherein the user data determination device is designed and arranged so that it determines the user data from image data that are generated with a single recording of the image recording device.
11. The apparatus according to claim 7, wherein the user data determination device is designed and arranged so that it determines the user data from two sets of image data from different recording positions, and uses the line generated by the light projection to identify corresponding image points in the two sets of image data.
12. The apparatus according to claim 1, wherein the projection device provides the light projection in an invisible wavelength.
13. The apparatus according to claim 12, further comprising a preview output device that displays on which partial region of the head and/or of the spectacles of the user the invisible light projection is aligned.
14. The apparatus according to claim 1, wherein the apparatus is designed as a portable, mobile apparatus.
15. A method to determine optical parameters of a user with spectacles arranged in the use position on the head of the user, wherein: a partial region of the head of the user and/or of the spectacles of the user is marked with a light projection, image data of at least the marked partial region of the head of the user and/or of the spectacles of the user are generated, user data of the marked partial region of the head and/or of the spectacles are determined using the generated image data, wherein the user data include spatial information in three-dimensional space of points of the partial region of the head and/or of the spectacles, and optical parameters of the user are determined using the user data.
16. The method according to claim 15, wherein the user data are determined under consideration of a calibration of the light projection relative to the positioning and alignment of an image recording device to generate the image data.
17. The method according to claim 16, wherein a distance between the position of the image recording device and the position of the user is estimated from the image data by means of triangulation, taking into account the calibration.
18. The method according to claim 15, wherein the light projection is adjusted in the generation of the image data so that a projection plane of the light projection travels through both pupils of the user.
19. A computer program product comprising program parts which, when loaded into a computer, are designed to implement the method according to claim 15.
Description
[0088] The invention is explained in detail in the following using aspects of the invention presented in the figures.
[0094] Both the image recording device 11 and the projection device 12 are aligned on a subject 1 that serves merely to illustrate the measurement method and is essentially cuboid in shape.
[0095] The image recording device 11 and the projection device 12 are calibrated to one another. In the coordinate system drawn in
[0096] The direction vector of the optical axis 16 and the direction vector of the projection direction 17 intersect at an angle known in advance. The size of this angle and the separation of the image recording device 11 from the projection device 12 are components of the calibration of the apparatus. The separation of the image recording device 11 from the projection device 12 serves as a triangulation basis with which the separation of the subject 1 from the image recording device 11 may be determined; in the shown exemplary embodiment, this is a separation in the Y-direction.
[0097] The projection device 12 generates a light projection that is radiated in the projection direction 17 and marks a partial region of the subject 1. The components of the apparatus are arranged so that this marking is included in the image data acquired by the image recording device 11. In other words, the marking caused by the light projection is visible in the recording.
[0098] In the figures, the Z-direction is arranged essentially vertically in the reference system of the user; the Y-direction is arranged essentially horizontally from the image recording device 11 toward the user 2 or the subject 1; and the X-direction is arranged essentially horizontally through the two pupil center points of the user 2 and/or orthogonal to the optical axis 16 of the image recording device 11.
[0099] In a perspective, schematic presentation,
[0100] The marking 14 generated by the projection device 12 essentially has the form of a solid line. In the recording captured by the image recording device 11, a portion of the marking 14 appears further below (as viewed in the Z-direction), another portion appears further above (see
[0101] The separation of the points of the marked partial region from the image recording device 11 and/or from the apparatus 10 can thus be determined from the image data by determining the position of the marked points in the recording, in particular their position on the Z-axis in the shown embodiment. Via the calibration of the apparatus 10, the separation (in the Y-direction) of the marked points on the subject 1 from the image recording device 11 can then be calculated by means of triangulation.
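The triangulation just described can be sketched as follows. This is a minimal illustration under assumed conventions, not the document's implementation: the concrete geometry (vertical baseline, tilt angle `theta`, focal length `f` in pixels) and the function name are hypothetical.

```python
import math

def depth_from_marking(z_img, f, baseline, theta):
    """Estimate the separation (Y-direction) of a marked point from the camera.

    Assumed laser-triangulation geometry: the image recording device sits
    at the origin looking along +Y; the projection device is offset
    vertically by `baseline` (Z-direction) and emits its light plane tilted
    by `theta` toward the optical axis, so a point at distance Y is lit at
    height Z = baseline - Y * tan(theta). A pinhole camera with focal
    length `f` (in pixels) images that height at z_img = f * Z / Y;
    solving for Y yields the expression below.
    """
    return f * baseline / (z_img + f * math.tan(theta))
```

With a baseline of 0.10 m, a tilt of 10° and f = 1000 px, a marked point imaged at z_img = 50 px lies at roughly 0.44 m; higher image positions correspond to nearer points, which is exactly the variation of the marking along the Z-axis that is evaluated above.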
[0102] As shown in
[0103] In addition or as an alternative to this, the display 15 may be designed and provided to display a preview of the image data together with the marking 14 generated by the projection device 12. The display 15 thus generates a preview of the image to be recorded and thereby serves as a preview output device. This is especially advantageous if the projection device 12 generates radiation in a non-visible wavelength, for example in the infrared, which is not visible to a human user. To adjust and/or align the apparatus 10 relative to the subject 1, an operator (for example an optician) may see on the display 15 which partial region of the subject 1 the projection device 12 marks.
[0106] The separation of the marked points on the frame of the spectacles 4, as well as of the marked points on the head of the user 2, from the image recording device 11 may be calculated by means of triangulation from the different Z-positions of the marking 14′ in the recording recorded by the image recording device 11. User data in three-dimensional space may thereby be calculated. User data may be calculated by means of a user data determination device that may access a microprocessor of the apparatus 10. Optical parameters of the user 2 may be determined from the user data by a parameter determination device.
[0107] The apparatus 10 uses active illumination in order to determine the optical parameters, such as centering data and individual parameters of the user, in a design for video centering by means of a 3D reconstruction.
[0108] As an alternative to the apparatus shown in the figures, a stereo camera system that generates image data from two different perspectives may also be used, as known, for example, from document DE 10 2005 003 699 A1. The active illumination provided via an additional projection device 12 may then be used to solve, and/or to accelerate the solving of, the correspondence problem that arises when mutually corresponding points must be identified in the recordings from the two different perspectives.
[0109] The user 2 may use the image recording device 11 or another point of the apparatus 10 as a fixation object that he fixates during the recording of the image data. The distance of the used fixation object from the pupils of the user may subsequently be used for a convergence correction. Alternatively, a point that can be determined relative to the user 2 may be used as a fixation point, for example the nasal root of the user in a mirror image, wherein the mirror image is provided by a mirror that is attached to the image recording device 11 at a position known in advance.
Embodiment with a Stereo Camera System
[0110] In a stereo camera system as mentioned above as an apparatus for determining optical parameters of the user, the marking may be used in order to localize mutually corresponding points in image data that are recorded simultaneously or with a chronological offset. This may take place via automatic and/or manual image processing. The manual evaluation may thereby be reduced to the first set of image data, provided the epipolar lines and the marking projected as a line do not coincide in the two sets of image data. Starting from a point selected in the first set of image data, the corresponding point in the second set of image data is established by the epipolar line and its intersection point with the projected marking. This intersection point may be determined via automatic image processing, for example.
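This correspondence step can be sketched as follows, assuming a known fundamental matrix F (3×3, as nested lists) relating the two sets of image data; the function names are hypothetical and the sketch is illustrative only.

```python
def epipolar_line(F, x):
    """Epipolar line l' = F·x in the second image for a pixel x = (u, v)
    selected in the first image, as homogeneous coefficients (a, b, c)."""
    u, v = x
    return tuple(F[i][0] * u + F[i][1] * v + F[i][2] for i in range(3))

def line_intersection(l1, l2):
    """Homogeneous intersection of two image lines a*x + b*y + c = 0,
    e.g. the epipolar line and the projected marking line."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    w = a1 * b2 - b1 * a2   # zero would mean the two lines coincide/are parallel
    return ((b1 * c2 - c1 * b2) / w, (c1 * a2 - a1 * c2) / w)
```

The corresponding point is then `line_intersection(epipolar_line(F, x), marking_line)`; the proviso that the epipolar line and the projected line must not coincide corresponds to the denominator `w` being nonzero.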
[0111] The stereo camera system may also have only one image recording device that records two sets of image data from different viewing angles toward the user with chronological offset. In this instance, the calibration of the camera positions is not necessarily known in advance and may be determined from the recorded image data in a method step. For calibration, for example, the position of the pupils 3L and/or 3R relative to the frame of the spectacles may be used. Since the pupils and the frame are located at different distances from the two positions of the image recording device, a corresponding offset results depending on the observation direction.
[0112] In addition to the two pupils of the user, five characteristic points on the frame, each selected in both sets of image data, may be used for the determination of the orientation. If a moving point is predetermined as a fixation object, two other, invariant points on the user and/or the frame may be selected instead of the two pupils of the user. From these seven points in total, a 3D reconstruction of the points from the acquired image data may take place via the 7-point algorithm known from epipolar geometry.
Embodiment with One Set of Image Data
[0113] In the embodiment of the apparatus as presented in the figures, the optical parameters are determined from a single recording (thus from a single set of image data). The image recording device 11 and the projection device 12 are hereby calibrated relative to one another, internally to the apparatus. The marking 14 or 14′ generated by the projection device 12 in the image data enables a calculation of the user data in three dimensions. An alignment of the marking 14, 14′ may thereby be set in which the marking generated as a line is arranged essentially orthogonal to the epipolar plane. A triangle may thereby be used for the triangulation, wherein the triangulation basis is aligned essentially vertically (in the Z-direction in the figures).
[0114] Via machine-based and/or automatic image processing of the image data and/or a manual selection of the two pupil center points, the pupillary distance (for example) may be determined in three dimensions from the image data. Additional parameters may be determined from the marked points and the generated user data, for example the face form angle and the horizontal section of the topography of the eye, as well as, approximately, the corneal vertex distance, the disc length to the left, the disc length to the right etc. An evaluation of the image data of a single recording hereby takes place along the projected marking (along the line in the shown embodiment). The evaluation can thereby be implemented very quickly.
[0115] Additional optical parameters may be determined via an additional second projection of an additional second line, for example a vertical line (not shown in Figures). In contrast to the essentially horizontally aligned first marking 14 and 14′, this essentially vertically aligned second marking that is generated by (for example) a second projection device can be calibrated by means of a second triangulation basis. For this, for example, a horizontal distance (in the X-direction in the shown embodiment) of the second projection device from the image recording device 11 may be known in advance. In any event, a direction vector of the second projection device relative to the optical axis 16 as well as a position of the second projection device relative to the image recording device 11 may be known in advance as additional information.
[0116] Via this additional, essentially vertical marking, additional optical parameters may be calculated, for example the forward inclination, the disc height, the corneal vertex distance etc.
[0117] This second marking may thereby be positioned so that it is arranged over a pupil, thus travels essentially vertically through the right or left pupil of the user.
[0118] In one embodiment, multiple, in particular two, such vertical line-shaped markings are projected onto the head-with-spectacles system. In one exemplary embodiment, two mutually parallel vertical line-shaped markings are used whose distance from one another, at a predetermined measurement distance from the camera, corresponds essentially to the typical pupillary distance (approximately 64 mm). Additional optical parameters may thereby be determined, specifically separately for each eye.
[0119] A triggering of the image recording device, thus a recording of the image data, may take place automatically. A detection, thus a recording of the image data, may hereby be executed during the positioning of the apparatus if suitable trigger conditions are satisfied and detected, for example automatically detected pupil positions.
[0120] In the apparatus 10 presented in the figures, the light projection generated by the projection device 12 has a projection direction 17 that generates an essentially horizontal marking 14′ on a user 2. The marking generated by the projection device 12 is hereby preferably aligned parallel to the image rows in the image data of the image recording device, in particular if the latter is designed as a digital camera. This enables a simple triangulation basis and a simple distance determination of the marked image points in the image data.
[0121] The marking enables a simplified selection of individual points in the image data, especially at intersection points of the marking with the frame edge and/or with the pupil center points.
[0123] The recording contains image data of both pupils of the user as well as of the frame of the spectacles 4. The recording was captured by the apparatus 10 counter to the zero viewing direction of the user, and is designed as a two-dimensional recording. The image data contain the marking 14′ generated by the projection device 12 by means of the projection plane 13. The marking 14′ in the recording thereby essentially has the form of a line. In order to clarify the different height of the marking 14′ (thus the different position on the Z-axis) in the image data, the variation of the projected line in the Z-axis direction in
[0124] Upon triggering the recording, the projected marking 14′ was aligned so that it travels through the two pupil center points PMR and PML of the user.
[0125] The curve of the marking 14′ is described in the following from left to right through the recording shown in
[0126] From there, the marking 14′ travels essentially with mirror symmetry across the left eye and frame half, in particular across the left inner nasal frame point INL and the left pupil center point PML, up to the left outer temporal frame point ATL.
[0127] The image data of the recording shown in
[0128] All six of these points ATR, PMR, INR, INL, PML and ATL are specifically marked in the image data by means of the marking line provided by the projection device 12. As explained above, the optical parameters pupillary distance, disc length to the left, disc length to the right, face form angle, the horizontal section of the topography of the eye, the approximate corneal vertex distance etc. may be calculated from the three-dimensional spatial information of these six selected points of the head-with-spectacles system.
[0129] With an additional second light projection, and thus a second marking in the vertical direction, the right inner frame point above the pupil IOR, once again the right pupil center point PMR, and the right inner frame point below the pupil IUR may additionally be marked. These points are shown in
[0130] Under consideration of information about the calibration of this second light projection (for example in the horizontal direction), spatial information in three-dimensional space may be determined for each point marked by the second light projection, especially for the three specifically marked points IOR, PMR and IUR. With the three-dimensional spatial information of these additional points, the additional optical parameters of forward inclination, disc height, corneal vertex distance etc. may be calculated for the right eye.
[0131] Analogous to this, the additional points of the left inner frame point above the pupil IOL, again the left pupil center point PML, and the left inner frame point below the pupil IUL may be marked with an additional third light projection parallel to the second light projection in the vertical direction. These points are shown in
[0132] Under consideration of information about the calibration of this third light projection (for example in the horizontal direction), spatial information in three-dimensional space may be determined for each point marked by the third light projection, especially for the three specifically marked points IOL, PML and IUL. With the three-dimensional spatial information of these additional points, the additional optical parameters of forward inclination, disc height, corneal vertex distance etc. may be calculated for the left eye.
[0133] Upon triggering the recording, these optional second and third marking lines are aligned so that each marking line travels vertically (parallel to the Z-axis) through the respective pupil center point PMR or PML of the user.
[0134] The calculation of two examples of optical parameters from the three-dimensional spatial information of the points cited above is described in the following:
[0135] The optical parameter “pupillary distance” may be calculated as the length between the points PMR and PML in three-dimensional space. As an additional optical parameter, the pupillary distance may furthermore be divided into a right pupillary distance and a left pupillary distance. For this, a pupil center plane may be defined that has the same distance from the points INL and INR, and thus is arranged between these two points. The intersection point of this pupil center plane with a connecting line of the two points PMR and PML provides a division of the optical parameter “pupillary distance” into the right pupillary distance (as the distance from this intersection point to PMR) and the left pupillary distance (as the distance from this intersection point to PML).
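The division just described can be sketched as follows, with coordinates as (X, Y, Z) tuples in millimeters; a minimal illustration under assumed conventions, not the document's implementation.

```python
import math

def pupillary_distances(PMR, PML, INR, INL):
    """Total pupillary distance plus its right/left division.

    The pupil center plane is taken perpendicular to the segment INR-INL
    through its midpoint (and is thus equidistant from INR and INL);
    intersecting it with the connecting line PMR-PML yields the dividing
    point between right and left pupillary distance.
    """
    pd = math.dist(PMR, PML)
    mid = [(a + b) / 2 for a, b in zip(INR, INL)]   # point on the plane
    n = [a - b for a, b in zip(INR, INL)]           # plane normal
    d = [a - b for a, b in zip(PML, PMR)]           # line direction PMR -> PML
    t = (sum(nc * (mc - pc) for nc, mc, pc in zip(n, mid, PMR))
         / sum(nc * dc for nc, dc in zip(n, d)))    # line/plane intersection parameter
    split = [pc + t * dc for pc, dc in zip(PMR, d)]
    return pd, math.dist(PMR, split), math.dist(split, PML)
```

For a frame whose nasal frame points sit slightly off-center, the right and left pupillary distances differ accordingly while summing to the total pupillary distance.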
[0136] The optical parameter “face form angle” may be calculated in a horizontal projection from the angle between the straight lines that are provided by the sections ATR-INR and ATL-INL.
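A corresponding sketch for the face form angle, projecting the four frame points into the horizontal X-Y plane; again an illustration under assumed conventions with a hypothetical function name.

```python
import math

def face_form_angle(ATR, INR, ATL, INL):
    """Face form angle in degrees: the angle in the horizontal (X-Y)
    projection between the line ATR-INR and the line ATL-INL."""
    u = (INR[0] - ATR[0], INR[1] - ATR[1])   # right lens line, X-Y components only
    v = (INL[0] - ATL[0], INL[1] - ATL[1])   # left lens line, X-Y components only
    cos_ang = abs(u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
    return math.degrees(math.acos(min(1.0, cos_ang)))
```

Taking the absolute value of the dot product treats the segments as undirected lines, so the result lies in [0°, 90°]; a completely flat frame gives 0°.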
[0137] In general, more and/or other points than the ten points explicitly cited above may be used for the calculation of the optical parameters. With the cited ten points, only an approximate calculation is possible for the optical parameters “disc length” and “disc height”. For a precise calculation of these parameters, the boxing system mentioned above, which may be taken into account in the user data, may be used.
[0138] To determine the boxing system as a component of the user data, a boundary of the spectacle lens may be selected (for example by the user data determination device) via a rectangle in the image data. Boundary lines of this rectangle can thereby be shifted only along predefined directions. An upper boundary line may thus be arranged as a horizontal line in the disc plane and be depicted accordingly as a projection in the image data; this upper boundary line may then be shifted only along a vertical direction, for example. For an inner boundary line, which is depicted as a vertical line in the disc plane, analogously only a horizontal shift may be provided. The three-dimensional spatial information of the vertices of the boxing system may be determined via the already selected points for which three-dimensional spatial information is present, as well as via scaling factors linking these points.
REFERENCE LIST
[0139] 1 subject
[0140] 2 user
[0141] 3R right pupil
[0142] 3L left pupil
[0143] 4 spectacles
[0144] 10 apparatus
[0145] 11 image recording device
[0146] 12 projection device
[0147] 13 projection plane
[0148] 14 marking
[0149] 14′ marking
[0150] 15 display
[0151] 16 optical axis
[0152] 17 projection direction
[0153] PMR right pupil center point
[0154] PML left pupil center point
[0155] ATR right outer temporal frame point
[0156] ATL left outer temporal frame point
[0157] INR right inner nasal frame point
[0158] INL left inner nasal frame point
[0159] IOR right inner frame point above the pupil
[0160] IOL left inner frame point above the pupil
[0161] IUR right inner frame point below the pupil
[0162] IUL left inner frame point below the pupil