OCULAR BIOMETRY SYSTEMS AND METHODS
20210235987 · 2021-08-05
CPC classification
A61B3/107
HUMAN NECESSITIES
A61B3/11
HUMAN NECESSITIES
A61B3/12
HUMAN NECESSITIES
A61B3/117
HUMAN NECESSITIES
International classification
A61B3/14
HUMAN NECESSITIES
A61B3/00
HUMAN NECESSITIES
A61B3/107
HUMAN NECESSITIES
A61B3/117
HUMAN NECESSITIES
Abstract
Parameters of an eye are measured by capturing images of the eye when at least one light source is shone into the eye and analyzing the captured images. An ocular biometry system includes a light source configured to generate a light beam, cameras configured to capture images of the eye when the light beam passes through the eye, and processors configured to identify features in the captured images. The features represent the light beam passing from one part of the eye to another part of the eye. One or more parameters of the eye are determined from the identified features. The light beam can be adjusted to be incident on the eye in a number of positions and multiple light beams can be used.
Claims
1. An ocular biometry system comprising: a light source configured to generate a light beam for incidence on an eye; a beam adjustment mechanism configured to adjust the light beam to be incident on the eye in a plurality of incidence positions, wherein in at least one of the plurality of incidence positions the light beam is incident on the eye non-centrally; first and second cameras configured to capture a plurality of images of the eye, each of the plurality of images being of the light beam passing through the eye when the light beam is in a different incidence position of the plurality of incidence positions; and one or more processors configured to: identify a plurality of features in each of the plurality of images, each of the plurality of features being representative of the light beam passing from one part of the eye to another part of the eye; and determine, from the identified plurality of features in the plurality of images, one or more parameters of the eye.
2. An ocular biometry system as claimed in claim 1, wherein the parameters determined by the one or more processors are one or more of the parameters selected from the group consisting of: axial length; anterior chamber depth; posterior chamber depth; lens thickness; corneal radius/curvature; anterior lens radius/curvature; posterior lens radius/curvature; and retinal radius/curvature.
3. An ocular biometry system as claimed in claim 1, wherein the plurality of features identified in the captured images are representative of the light beam passing from and to one or more of the parts of the eye selected from the group consisting of: cornea; anterior chamber (aqueous humor); posterior chamber; lens; vitreous humor; and retina.
4. An ocular biometry system as claimed in claim 1, wherein the one or more processors identify the plurality of features in the captured images by identifying regions of relatively high intensity light in the captured images, the regions of relatively high intensity light corresponding to the plurality of features.
5. An ocular biometry system as claimed in claim 1, wherein the one or more processors determine an optical path length between two locations in the eye from positions of the features in the captured images, and calculate a geometric path length between the two locations in the eye from the optical path length.
6. (canceled)
7. An ocular biometry system as claimed in claim 1, wherein the beam adjustment mechanism comprises: a reflector, the light beam being reflected by the reflector before entering the eye; and a reflector adjustment mechanism configured to adjust the orientation and/or position of the reflector.
8. An ocular biometry system as claimed in claim 1, wherein the light source comprises one or more light sources configured to generate the light beam, wherein the light beam is a first light beam for incidence on the eye, and the one or more light sources are further configured to generate a second light beam for incidence on the eye, the first and second light beams being separated by a distance when incident on the eye, and further wherein the first and second cameras are configured to capture images of the eye when the first and second light beams pass through the eye.
9. (canceled)
10. (canceled)
11. An ocular biometry system as claimed in claim 8, wherein the one or more light sources are configured such that the first and second light beams are incident on the eye symmetrically with respect to an axis of the eye.
12. An ocular biometry system as claimed in claim 8, wherein the beam adjustment mechanism is configured to adjust the first and second light beams to be incident on the eye in a plurality of incidence positions, wherein each of the plurality of images is of the first and/or second light beams passing through the eye when the first and second light beams are in different incidence positions of the plurality of incidence positions.
13. An ocular biometry system as claimed in claim 12, wherein the beam adjustment mechanism comprises a first beam adjustment mechanism configured to adjust the first light beam to be incident on the eye in a plurality of incidence positions and a second beam adjustment mechanism configured to adjust the second light beam to be incident on the eye in a plurality of incidence positions.
14. An ocular biometry system as claimed in claim 8, wherein the ocular biometry system comprises third and fourth cameras configured to capture images of the eye when the first and second light beams pass through the eye.
15. An ocular biometry system as claimed in claim 14, wherein the first and third cameras are positioned symmetrically relative to the eye and are configured to capture images of a first set of parts of the eye, and the second and fourth cameras are positioned symmetrically relative to the eye and are configured to capture images of a second set of parts of the eye.
16.-20. (canceled)
21. A processor-implemented method of measuring a parameter of an eye, the method comprising: receiving a plurality of images of the eye, each of the plurality of images being of a light beam passing through the eye when the light beam is in one of a plurality of incidence positions, wherein in at least one of the plurality of incidence positions the light beam is incident on the eye non-centrally; identifying a plurality of features in each of the plurality of images, each of the plurality of features being representative of the light beam passing from one part of the eye to another part of the eye; and determining the parameter from the identified plurality of features in the plurality of images.
22. A processor-implemented method as claimed in claim 21, wherein the method comprises determining one or more parameters of the eye, the parameters being one or more of the parameters selected from the group consisting of: axial length; anterior chamber depth; posterior chamber depth; lens thickness; corneal radius/curvature; anterior lens radius/curvature; posterior lens radius/curvature; and retinal radius/curvature.
23. A processor-implemented method as claimed in claim 21, wherein the plurality of features identified in the images are representative of the light beam passing from and to one or more of the parts of the eye selected from the group consisting of: cornea; anterior chamber (aqueous humor); posterior chamber; lens; vitreous humor; and retina.
24. A processor-implemented method as claimed in claim 21, wherein the method comprises identifying the plurality of features in the images by identifying regions of relatively high intensity light in the captured images, the regions of relatively high intensity light corresponding to the plurality of features.
25. A processor-implemented method as claimed in claim 21, wherein the method comprises determining an optical path length between two locations in the eye from positions of the features in the images, and calculating a geometric path length between the two locations in the eye from the optical path length.
26. A processor-implemented method as claimed in claim 21, wherein the method comprises: controlling a beam adjustment mechanism to adjust the light beam to be incident on the eye in the plurality of incidence positions.
27. A processor-implemented method as claimed in claim 21, wherein the method comprises: controlling a first beam adjustment mechanism to adjust a first light beam to be incident on the eye in a plurality of incidence positions; and controlling a second beam adjustment mechanism to adjust a second light beam to be incident on the eye in a plurality of incidence positions.
28. (canceled)
29. (canceled)
30. A method of measuring a parameter of an eye, the method comprising: shining a light beam into the eye; adjusting the light beam to be incident on the eye in a plurality of incidence positions, wherein in at least one of the plurality of incidence positions the light beam is incident on the eye non-centrally; capturing a plurality of images of the eye, each of the plurality of images being of the light beam passing through the eye when the light beam is in a different incidence position of the plurality of incidence positions; identifying a plurality of features in each of the plurality of images, each of the plurality of features being representative of the light beam passing from one part of the eye to another part of the eye; and determining the parameter from the identified plurality of features in the plurality of images.
31.-44. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0065] One or more embodiments of the invention will be described below by way of example only, and without intending to be limiting, with reference to the accompanying drawings.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
[0075] Embodiments of the present invention are directed towards ocular biometry systems and methods for measuring parameters of an eye, which may be referred to as eye biometrics. In the context of the present invention, “biometrics” are understood to mean measurements of the body. In general terms some embodiments of the invention involve measuring parameters of an eye by capturing images of the eye when at least one light source is shone into the eye. The parameters of the eye are determined from analysis of the captured images.
[0077] In the context of the specification, “light” will be understood to mean electromagnetic radiation, including visible and non-visible parts of the electromagnetic spectrum. Preferred forms of the present technology use “non-visible light”, i.e. those parts of the electromagnetic spectrum that cannot be seen by the eye being measured. If visible light is shone into the eye, the eye will typically adjust in some way to accommodate for the light, for example by altering the shape of the lens, so this may result in altered measurements of one or more parameters of the eye.
[0078] Ocular Biometry System
[0079] An ocular biometry system for measuring parameters of the eye is illustrated in
[0080] Ocular biometry system 200 comprises a light source 202 configured to generate a light beam 203 which is made incident on, i.e. shone into, eye 201. In the embodiment shown in
[0081] In the embodiment of the technology described here, light source 202 is a source of non-visible light and light beam 203 is a beam of non-visible light. For example, light source 202 may be an infra-red laser. As explained above, one reason to use non-visible light in system 200 is to avoid the eye 201 adjusting to accommodate for the light, for example by altering the shape of the lens, which may result in altered measurements of one or more parameters of the eye.
[0082] The ocular biometry system 200 may further comprise other optical components acting on the light beam 203 before it is incident on the eye 201. In some embodiments, optical components configured to reduce the width of the light beam 203 may be provided, for example an opaque member comprising pinholes configured to transmit a portion of light beam 203.
[0083] Light source 202, reflector 204 and any other optical components provided in system 200 may be housed in a housing 205. To measure parameters of an eye 201, the patient is stationed in front of housing 205. Ocular biometry system 200 may comprise one or more eye positioning mechanisms, for example a chin rest and forehead support, to enable the patient to position themselves and their eye in the desired position with stability and in comfort.
[0084] The system may further comprise a sight 206 for the patient to look at during use of system 200. The sight 206 may be positioned optically far from the patient so that the eye 201 accommodates to viewing into the distance. A system of mirrors may be used to position sight 206 optically far from, but geometrically (i.e. physically) close to, the patient, for example if it is not possible to position sight 206 geometrically (i.e. physically) far from the patient. In other implementations the sight 206 may be positioned another distance from the patient, e.g. optically closer to the patient if it is desirable to measure parameters of the eye with the eye in a particular accommodation configuration.
[0085] The ocular biometry system 200 shown in
[0086] In this specification the term “camera” refers to any image capturing device or system. It will be understood that the cameras 207 used are configured to capture images in the part of the electromagnetic spectrum corresponding to the light source 202, e.g. infra-red. Cameras used in embodiments of the invention may be still frame or continuously filming cameras. Further, when this specification refers to capturing an image it will be understood that the image may be obtained in digital form, i.e. the image may be represented by digital image data. Reference to an “image” in this specification will be understood to refer either to the visual representation of what is imaged or to the data that is representative of the image, or both.
[0087] In preferred embodiments of the invention, cameras 207 are stereo cameras with focal lengths selected to obtain clear images of the eye based on the typical size of the human eye and the distance of the camera from the eye.
[0088] While the system 200 in the embodiment of
[0089] In one example of a four-camera arrangement of the system 200 in
[0090] Ocular biometry system 200 may comprise one or more camera adjustment mechanisms configured to adjust the positions and/or orientations of cameras 207. For example cameras 207 may be mounted on camera mounts able to move and rotate relative to the eye 201.
[0091] Ocular biometry system 200 also comprises a control system 208. Control system 208 is configured to communicate with other components of system 200, including cameras 207, light source 202, reflector 204 and a beam adjustment mechanism(s) (not shown in
[0092] Control system 208 is shown in more detail in
[0093] The processor 304 may be any suitable device known to a person skilled in the art. Although the processor 304 and memory 306 are illustrated as being within a single unit, it should be appreciated that this is not intended to be limiting, and that the functionality of each as herein described may be performed by multiple processors and memories, which may or may not be remote from each other or from the ocular biometry system 200. The instructions 308 may include any set of instructions suitable for execution by the processor 304. For example, the instructions 308 may be stored as computer code on the computer-readable medium. The instructions may be stored in any suitable computer language or format. Data 310 may be retrieved, stored or modified by processor 304 in accordance with the instructions 308. The data 310 may also be formatted in any suitable computer readable format. Again, while the data is illustrated as being contained at a single location, it should be appreciated that this is not intended to be limiting; the data may be stored in multiple memories or locations. The data 310 may also include a record 312 of control routines for aspects of the system 300.
[0094] The hardware platform 302 may communicate with a display device 314 to display the results of processing of the data. The hardware platform 302 may communicate over a network 316 with user devices (for example, a tablet computer 318a, a personal computer 318b, or a smartphone 318c), or one or more server devices 320 having associated memory 322 for the storage and processing of data collected by the local hardware platform 302. It should be appreciated that the server 320 and memory 322 may take any suitable form known in the art, for example a “cloud-based” distributed server architecture. The network 316 may comprise various configurations and protocols including the Internet, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, whether wired or wireless, or a combination thereof.
[0095] Linear/Axial Measurements
[0096] A method for measuring parameters of the eye will first be described with reference to
[0097] Calibration and Set Up
[0098] In step 401 the cameras 207 are calibrated. Any appropriate calibration technique may be used, for example positioning, in front of the eye 201, heat resistant material on which a pattern is printed, and imaging that material with the cameras 207. An exemplary technique is explained in: Gschwandtner M., Kwitt R., Uhl A., Pree W., "Infrared camera calibration for dense depth map construction," 2011 IEEE Intelligent Vehicles Symposium (IV), 5 June 2011, pp. 857-862.
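By way of illustration only, one thing calibration yields is a mapping from pixel coordinates to physical distances. The following sketch (the function name, marker layout and all numbers are hypothetical, not part of the described system) estimates a single pixels-per-millimetre scale from a printed pattern of markers with known spacing:

```python
import numpy as np

def pixels_per_mm(marker_px, spacing_mm):
    """Estimate the image scale from detected calibration markers.

    marker_px  : (N, 2) array of marker centres in pixel coordinates,
                 laid out in a single row on the printed pattern.
    spacing_mm : known physical distance between adjacent markers (mm).
    """
    marker_px = np.asarray(marker_px, dtype=float)
    # Mean pixel distance between adjacent markers along the row.
    step_px = np.mean(np.linalg.norm(np.diff(marker_px, axis=0), axis=1))
    return step_px / spacing_mm

# Synthetic example: markers imaged 40 px apart, printed 5 mm apart.
markers = [(100, 200), (140, 200), (180, 200), (220, 200)]
scale = pixels_per_mm(markers, spacing_mm=5.0)  # 8 px per mm
```

A full intrinsic/extrinsic calibration, as in the cited paper, additionally recovers focal length and lens distortion; this sketch shows only the simplest scale-recovery step.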
[0099] In step 402 the centre of the eye 201 is located. Any suitable technique to locate the centre of eye 201 may be used. In certain embodiments of the invention, one of the cameras 207 is positioned directly in front of eye 201 on (or as close as possible to) the optical axis of eye 201 and an image of the eye, including the iris, is captured by the camera. A circle detection method is performed on the captured image to identify the iris in the captured image, and a circle centre location method is performed to locate the centre of the circle, which is assumed to correspond to the centre of eye 201 (i.e. the optical axis).
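The circle centre location step can be sketched as follows. This is a minimal algebraic least-squares circle fit on synthetic iris-edge points, standing in for whichever circle detection and centre location methods are used in practice (the edge points here are fabricated for illustration):

```python
import numpy as np

def fit_circle_centre(points):
    """Algebraic least-squares circle fit: returns (cx, cy, r).

    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + c for cx, cy, c, where
    c = r^2 - cx^2 - cy^2, using a linear least-squares system.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# Synthetic iris edge: points on a circle centred at (320, 240), radius 60 px.
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
edge = np.column_stack([320 + 60 * np.cos(theta),
                        240 + 60 * np.sin(theta)])
cx, cy, r = fit_circle_centre(edge)  # centre taken as the eye's optical axis
```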
[0100] If an ocular biometry system 200 such as illustrated in
[0101] In step 403 the light source is targeted at the centre of the eye so that the light beam 203 is incident as closely as possible along the optical axis of the eye 201. The ocular biometry system 200 comprises a beam adjustment mechanism configured to adjust the incidence of the light beam 203 on the eye 201. The beam adjustment mechanism may comprise one or more mechanisms to adjust the position and/or orientation of components of the ocular biometry system 200. For example the beam adjustment mechanism may comprise a mechanism for moving housing 205. In one embodiment, with the eye 201 in position, the housing 205 is coarsely adjusted so that the reflector 204 is generally at eye level. The light source 202 is activated so that light beam 203 is incident on the eye 201. In the embodiment shown, light beam 203 reflects off reflector 204 before entering the eye. The beam adjustment mechanism may further comprise one or more mechanisms for adjusting the position and/or orientation of the light source 202 and/or the reflector 204 in order to adjust the light beam 203 to be incident on the centre of the eye 201, for example as a fine adjustment step after the coarse adjustment of the housing 205.
[0102] The patient may be asked to look at sight 206 during this process so that the eye 201 accommodates to viewing into the distance.
[0103] In step 404 the cameras 207 are positioned to capture images of the eye 201. In certain embodiments the cameras 207 are moved such that camera 207a is positioned inferior to the eye 201 to image generally anterior parts of eye 201, for example the cornea and lens, and camera 207b is positioned superior to the eye 201 to image multiple parts of eye 201, for example the cornea, lens and retina, as shown in
[0104] The calibration and set up steps described above may not be required in all exemplary methods. For example, methods according to embodiments of the invention may be performed using an ocular biometry system already configured to perform said methods.
[0105] Imaging
[0106] In step 405 light source 202 is activated and light beam 203 is shone into eye 201, such as is shown in
[0107] In some embodiments the captured images are transmitted from the camera to control system 208. It will be appreciated that any suitable method of transmission may be used, including wireless or wired data transmission. Auxiliary image information may also be provided from the camera to control system 208, either contemporaneously with or subsequently to the sending of the captured images. Auxiliary image information may be additional information related to the captured images, for example properties of the camera taking an image (e.g. make, model, shutter speed, focal length, aperture settings, ISO, etc.), the location of the camera in the system or any other information that may be required or useful to analyse the captured images.
[0108] In embodiments of the invention the control system 208 limits the activation time of the light source 202 (or light sources in embodiments with multiple light sources, such as described below). The maximum safe exposure time for the patient depends on the type of light generated by the light source, and the activation time is limited accordingly so that the patient is exposed only to safe amounts of light.
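By way of example only, such an exposure limit could be enforced by a simple timed wrapper in the control system. The class name, limit value and polling design below are purely illustrative; real limits depend on the source type and the applicable exposure-safety standard:

```python
import time

class TimedLightSource:
    """Illustrative control-system wrapper that enforces a maximum
    activation time for a light source. The default limit is a
    placeholder, not a value taken from any safety standard."""

    def __init__(self, max_exposure_s=0.25):
        self.max_exposure_s = max_exposure_s
        self.on = False
        self._t0 = None

    def activate(self):
        # Record the activation time so the limit can be enforced.
        self.on = True
        self._t0 = time.monotonic()

    def poll(self):
        """Called periodically by the control loop; switches the source
        off once the exposure limit is reached. Returns the on/off state."""
        if self.on and time.monotonic() - self._t0 >= self.max_exposure_s:
            self.on = False
        return self.on
```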
[0109] Image Analysis
[0110] On receipt of the captured images the control system 208 may store the captured images in memory 306 for immediate processing or for processing at a later time. Alternatively, the control system 208 may send the captured images to a remote memory, for example memory 322 via server 320, for processing at a later time.
[0111] In step 406 the captured images are analysed by the one or more processors 304. In
[0112] In some embodiments the processor 304 may perform one or more image pre-processing steps on the captured images, for example noise reduction or averaging of images from multiple cameras to lessen the effects of small eye movements on the analysis.
[0114] In certain embodiments of the invention, the processor 304 is configured to analyse images 700 and 750 to identify features in the images that are representative of light beam 203 passing from one part of the eye to another part of the eye. In the embodiment presently described, the features correspond to regions of relatively high intensity light in images 700 and 750 and the processor 304 identifies the regions of relatively high intensity light using conventional image analysis techniques.
[0115] When light beam 203 passes through the eye 201 it passes through the cornea, anterior chamber (aqueous humour), lens and vitreous humour, and is incident on the retina. Along this path the beam 203 passes from one medium to another in several locations, marked as A, B, C and D: the corneal surface (A), the anterior lenticular surface (B), the posterior lenticular surface (C) and the retina (D).
[0120] At each of these points the beam 203 is refracted and partly reflected. This causes a scattering of some of the light in beam 203, which is seen by cameras 207 as a ‘halo’ or region of higher intensity light compared to other parts of the field of view.
[0121] In other embodiments other points in the eye 201 may also be identified through features in the captured images, for example the posterior chamber and iris.
[0122] In images 700 and 750 the regions of higher intensity light labelled A, B, C and D correspond to the locations A, B, C and D within the eye 201 shown in
[0123] If the camera is further away from the nose compared to the laser (i.e. the camera is temporal), then the region of higher light intensity closest to the camera (temporal) corresponds to the posterior eye (i.e. retina). If the camera is closer to the nose compared to the laser (i.e. the camera is nasal), then the point closest to the camera (nasal) corresponds to the anterior eye (i.e. cornea).
[0124] In image 700 the furthest temporal high intensity region (i.e. the region furthest from the nose) corresponds to the corneal surface reflection point A. The subsequent high intensity regions lie along a straight line in the image leading away from the region corresponding to corneal reflection point A, in the order in which the beam 203 meets the reflection points as it passes into the eye 201, i.e. A then B then C then D. The number of these reflection points that appear in each of images 700 and 750 depends on the field of view of the cameras 207 capturing the respective image.
[0125] Therefore, in image 700 (captured by inferior camera 207a) the high intensity region furthest from the nose (the left-most region in image 700) is identified by the processor 304 as corresponding to the corneal surface reflection point A, while regions B and C are recognised as corresponding to the anterior lenticular surface reflection point B and the posterior lenticular surface reflection point C respectively. Since the part of the retina on which beam 203 is incident is not in the field of view of camera 207a, there is no high intensity light region corresponding to the retinal surface reflection point D in image 700.
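The labelling of detected regions in beam order can be sketched as follows. For this sketch it is assumed (per the temporal-camera case described above) that image x increases toward the nose, so the most temporal, smallest-x region is corneal point A; a nasal camera would reverse the ordering:

```python
def label_regions(xs, labels=("A", "B", "C", "D")):
    """Assign reflection-point labels to region x-coordinates in beam order.

    Assumes image x increases toward the nose, so the most temporal
    (smallest-x) region is corneal point A and subsequent regions follow
    in the order the beam meets them (A, then B, then C, then D).
    """
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    return {labels[k]: xs[i] for k, i in enumerate(order)}

# Three regions detected in an image like image 700 (A, B, C in view):
labelled = label_regions([40, 10, 70])  # {'A': 10, 'B': 40, 'C': 70}
```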
[0126] Image 700 includes another high intensity light region R. It has been found that minor reflections within the eye can lead to other regions of high intensity in the images captured by cameras 207. Such regions may be identified by the processor 304 as not corresponding to important reflection points within the eye if they do not lie on the same straight line as the other high intensity regions in the image, as is the case with region R in image 700. In some embodiments, the processor 304 is therefore configured to ignore high intensity regions not on a straight line with the other regions in an image.
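A minimal sketch of this collinearity filter is given below. It uses a small exhaustive RANSAC-style search (adequate for the handful of regions per image); the tolerance value and coordinates are illustrative only:

```python
from itertools import combinations
import math

def filter_collinear(points, tol=2.0):
    """Keep the largest subset of roughly collinear points.

    Tries the line through every pair of points and keeps the line with
    the most inliers (perpendicular distance <= tol pixels). Stray
    reflections, like region R, fall off the dominant line and are dropped.
    """
    best = []
    for (x0, y0), (x1, y1) in combinations(points, 2):
        dx, dy = x1 - x0, y1 - y0
        norm = math.hypot(dx, dy)
        if norm == 0:
            continue
        inliers = [
            (px, py)
            for px, py in points
            if abs((px - x0) * dy - (py - y0) * dx) / norm <= tol
        ]
        if len(inliers) > len(best):
            best = inliers
    return best

# A, B, C lie roughly on a line; R at (70, 180) is a stray reflection.
regions = [(10, 100), (60, 102), (110, 104), (70, 180)]
kept = filter_collinear(regions, tol=5.0)  # R is rejected
```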
[0127] In image 750 (captured by superior camera 207b) the high intensity region furthest from the nose (the left-most region in image 750) is recognised by the processor 304 as corresponding to the corneal surface reflection point A, while regions B and D are recognised as corresponding to the anterior lenticular surface reflection point B and the retinal reflection point D respectively. It has been found that the posterior lenticular surface reflection point C does not appear in an image captured by a camera in the position of camera 207b in the embodiment of
[0128] The positions of the high intensity light regions A, B, C and D in images 700 and 750 are analysed using conventional image feature recognition techniques. For example, in the embodiment of the invention resulting in images 700 and 750, the high intensity regions are generally circular, and circle detection techniques are used to identify the position of the centre of each circle within the image. Co-ordinates are allocated by the processor 304 to each of the regions.
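As a simplified stand-in for those conventional techniques, the following sketch locates bright-region centres by thresholding and a flood-fill grouping, returning an intensity-weighted centroid per region (the synthetic frame and threshold are illustrative only):

```python
import numpy as np

def bright_region_centres(image, threshold):
    """Find centres of connected regions brighter than `threshold`.

    Pixels above the threshold are grouped into 4-connected components
    by flood fill; each component's intensity-weighted centroid (x, y)
    is returned in scan order.
    """
    img = np.asarray(image, dtype=float)
    mask = img > threshold
    seen = np.zeros_like(mask, dtype=bool)
    centres = []
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                stack, pixels = [(sy, sx)], []
                seen[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                weights = np.array([img[p] for p in pixels])
                ys = np.array([p[0] for p in pixels])
                xs = np.array([p[1] for p in pixels])
                centres.append((float(np.average(xs, weights=weights)),
                                float(np.average(ys, weights=weights))))
    return centres

# Two synthetic 'halo' blobs on a dark background.
frame = np.zeros((8, 12))
frame[2:4, 2:4] = 200.0    # blob centred at (x=2.5, y=2.5)
frame[5:7, 8:11] = 180.0   # blob centred at (x=9.0, y=5.5)
centres = bright_region_centres(frame, threshold=50.0)
```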
[0129] In step 406 it will be understood that, when identifying features in an image, the processor 304 may operate by identifying such features from the image data representative of the image. In some embodiments it may not be necessary for the processor 304 to first construct the visual representation of the image from the image data in order to be able to identify the features.
[0130] Alternatively, or additionally, the processor 304 may be configured to construct a visual representation of the image from the image data and identify the features from the visual representation.
[0131] Biometry Calculations
[0132] In step 407 the processor 304 determines the optical path length (OPL) between two or more locations in eye 201 from the positions of the features in the captured images. In the case of images 700 and 750, the processor determines the apparent positions of any two or more of locations A, B, C and D and calculates the apparent distance (the OPL) between those locations in eye 201. Those skilled in the art will appreciate how to determine the OPLs from the detected positions of the regions A, B, C and/or D in the images and the calibration information using conventional techniques. In one example, image thresholding techniques may be used.
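In the simplest case, an OPL is just the pixel distance between two labelled region centres converted to millimetres with the calibration scale. A one-line sketch (the scale value is hypothetical):

```python
import math

def optical_path_length(p1_px, p2_px, px_per_mm):
    """Apparent (optical) distance in mm between two labelled regions,
    converting pixel coordinates with the calibration scale."""
    return math.dist(p1_px, p2_px) / px_per_mm

# e.g. regions A and B detected 50 px apart at an assumed 10 px/mm:
opl_ab = optical_path_length((0, 0), (30, 40), px_per_mm=10.0)  # 5.0 mm
```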
[0133] The OPL between any two points A, B, C and D may differ from the geometric path length (GPL) between the same points because of refraction in the different media in the eye distorting the path of light beam 203 and the path of the light reflected from the reflection points and captured by the cameras 207. In step 408 the processor 304 calculates the GPL(s) between two or more locations in eye 201 from the corresponding calculated OPL(s).
[0134] Two exemplary methods are described here for calculating one or more GPLs from the OPL(s) determined from the captured images. Other methods may be used in other embodiments of the invention.
[0135] Method 1: Apparent Depth Calculation

Generally speaking, the method performed by the processor 304 in this embodiment applies Snell's law at each boundary at which the light beam passes between two different media within the eye to correct the optical distortion seen by the cameras 207.
[0136] Since point A is the corneal surface, the interface is an air/cornea interface and the optical and geometric positions of point A are the same.
[0137] To determine the geometric position of point B (the anterior lenticular surface), Snell's law is applied. This will now be described with reference to
[0138] According to Snell's law for the light captured by superior camera 207b:
n_air · sin α_A = n_aqueous · sin α_B′
where n_air is the refractive index of air, n_aqueous is the refractive index of the aqueous humour, and angles α_A and α_B′ are, respectively, the angle between the incident light beam 203 and the light received by camera 207b reflected from the surface of the cornea (point A), and the apparent angle between the incident light beam 203 and the light received by camera 207b reflected from the anterior surface of the lens (point B), i.e. virtual point B′.
[0139] For small angles as per the configuration in
therefore: n_aqueous · OPL_AB1 ≈ AB_1
[0140] where OPL_AB1 is the optical path length between A and B as seen by superior camera 207b (i.e. AB′, the distance between A and B′ as shown in
[0141] A similar calculation can be made for the light captured by inferior camera 207a. For the sake of clarity in the figure the distances and angles have not been marked in
n_aqueous · OPL_AB2 ≈ AB_2
[0142] where OPL_AB2 is the optical path length between A and B as seen by inferior camera 207a, and AB_2 is the geometric path length between A and B.
[0143] Clearly the geometric path length must be the same in both cases:
AB_1 = AB_2.
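By way of illustration only, the small-angle apparent-depth relation above can be checked numerically. The refractive index is a nominal schematic-eye value and the OPL numbers are fabricated; neither is prescribed by the method:

```python
# Nominal refractive index of the aqueous humour (an assumption for this
# sketch, not a value specified by the described system).
N_AQUEOUS = 1.336

def geometric_from_optical(opl_mm, n_medium):
    """Small-angle apparent-depth relation: GPL ~= n_medium * OPL."""
    return n_medium * opl_mm

# Apparent A-to-B' distances seen by the two cameras (illustrative, mm):
opl_ab1 = 2.20   # superior camera 207b
opl_ab2 = 2.21   # inferior camera 207a

ab1 = geometric_from_optical(opl_ab1, N_AQUEOUS)
ab2 = geometric_from_optical(opl_ab2, N_AQUEOUS)

# The two geometric estimates should agree (AB_1 = AB_2) up to noise.
consistent = abs(ab1 - ab2) < 0.05
```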
[0144] The linking ratio φ between the optical path lengths according to superior camera 207b and inferior camera 207a may be written as the tensor convolution:
φ = OPL_AB1 · OPL_AB2.
[0145] Having effectively undistorted reflection point B, the geometric path lengths BC and BD may be calculated using a similar calculation.
[0146] Finally, the geometric path lengths AD and CD may be calculated by repeating the above steps at each reflection point (B, C and D) and replacing these points in the above equation. These steps can be performed independently, undistorting points A, B, C and D one at a time. Alternatively, matrices of the incident (SC_1) and refraction (SC_2) angles at all the points (A, B, C and D) can be created, and all OPLs can be estimated from a convolution of these matrices:
OPL = SC_1 · SC_2
[0147] Then, the GPLs (physical distances between points A, B, C and D) may be determined together, by the summation/integration of all the OPLs convoluted with the refractive indices of media of the eye (i.e. aqueous humour, lens and vitreous humour):
[0148] where the GPL is a matrix in the form:
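The per-medium summation described in paragraph [0147] can be illustrated as follows. The refractive index values and the dictionary interface are illustrative assumptions (the specification does not give index values, and the lens in particular has a gradient index approximated here by a single figure):

```python
# Combine per-segment optical path lengths into a total geometric distance,
# applying one refractive index per ocular medium the beam traverses
# (GPL ~ n * OPL in the small-angle approximation used above).

REFRACTIVE_INDEX = {
    "aqueous": 1.336,   # aqueous humour, segment A-B (typical textbook value)
    "lens": 1.406,      # crystalline lens, segment B-C (single-index approximation)
    "vitreous": 1.336,  # vitreous humour, segment C-D (typical textbook value)
}

def total_geometric_length(opl_by_medium):
    """Sum n_i * OPL_i over the media, giving the total geometric
    path length (e.g. the axial distance from A to D)."""
    return sum(REFRACTIVE_INDEX[medium] * opl
               for medium, opl in opl_by_medium.items())
```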
[0149] Method 2: Correlation Function
[0150] In an alternative embodiment a secondary modality is used to undistort the optical path lengths between points A, B, C and D determined from the images captured by cameras 207. In this embodiment, the method 400 is performed on a number of test subject eyes and the optical path lengths between one or more of points A, B, C and D are determined from the captured images. The same parameters are measured for the same test subject eyes with an alternative measuring technique, such as magnetic resonance imaging (MRI), ultrasound or interferometry (e.g. using the Lenstar™ or IOLMaster™ devices).
[0151] If measurements are made for a sufficiently large sample of test subject eyes, then a correlation function between geometric path lengths determined by the alternative eye biometric/parameter measurement method and the optical path lengths determined by the method 700 may be determined. This correlation function may subsequently be used to calculate geometric path lengths between two locations in an eye corresponding to optical path lengths determined by method 400.
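The correlation-function approach can be sketched as an ordinary least-squares fit. The linear model form is an assumption for illustration; the specification leaves the form of the correlation function open:

```python
# Fit a linear correlation GPL ~ a * OPL + b between optical path lengths
# from the camera method and geometric path lengths measured on the same
# test subject eyes by a reference modality (e.g. ultrasound).

def fit_correlation(opls, gpls):
    """Ordinary least-squares fit of gpl = a * opl + b over the sample."""
    n = len(opls)
    mean_x = sum(opls) / n
    mean_y = sum(gpls) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(opls, gpls))
    sxx = sum((x - mean_x) ** 2 for x in opls)
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

def apply_correlation(opl, a, b):
    """Convert a newly measured optical path length to a geometric one."""
    return a * opl + b
```

In practice the fit would be performed once on the test-subject sample and the resulting (a, b) stored for use on patient measurements.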
[0152] The result of either method 1 or method 2 as described above is one or more geometric path lengths between locations in eye 201, the geometric path lengths being parameters of the eye/biometrics. In the above described example, in which the distances between locations A, B, C and D (as labelled in
[0157] Once calculated, the determined parameters may be stored in memory 306 or memory 322, output via display device 314 or communicated to other devices, for example over network 316. In preferred embodiments the parameters are used to assess refractive errors in the patient.
[0158] Multiple Light Beam Incidence Positions

In some embodiments of the invention additional parameters of the eye may be determined by shining the light beam 203 into the eye 201 in a number of incidence positions and determining parameters for each incidence position. This enables parameters of the eye to be determined in multiple locations within the eye. In some embodiments parameters of the eye are determined at multiple locations along an axis of the eye in a first direction, for example the inferior-superior or lateral directions. In this way a two-dimensional model of parts of the eye along an axis can be generated. In some embodiments parameters of the eye are additionally determined along a second axis of the eye in a second direction. In this way a three-dimensional model of parts of the eye can be generated in the manner of a raster scan of the eye using the plurality of light beam incidence positions.
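The raster-scan arrangement of incidence positions can be sketched as a simple grid generator. The two-axis grid, step size and centring on the optical axis are illustrative assumptions:

```python
# Generate a raster-scan grid of beam incidence positions over two axes
# (e.g. inferior-superior and lateral), centred on the optical axis.

def raster_positions(n_rows, n_cols, step_mm):
    """Return (y, x) offsets in mm for each incidence position,
    row by row, centred so the middle of the grid is (0, 0)."""
    y0 = -(n_rows - 1) * step_mm / 2.0
    x0 = -(n_cols - 1) * step_mm / 2.0
    return [(y0 + r * step_mm, x0 + c * step_mm)
            for r in range(n_rows) for c in range(n_cols)]
```

Determining eye parameters at each grid position then yields the point set from which a three-dimensional model of parts of the eye can be assembled.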
[0159] To achieve multiple light beam incidence positions, in some embodiments the ocular biometry system comprises a beam adjustment mechanism to adjust the light beam 203 to be incident on eye 201 in a plurality of incidence positions. Exemplary beam adjustment mechanisms have been described above in relation to targeting the light beam 203 at the centre of eye 201 in step 403. The same or similar beam adjustment mechanisms may be used to achieve multiple light beam incidence positions during the image acquisition process.
[0160] In step 409 the control system 208 controls the beam adjustment mechanism to adjust the light beam to be incident on the eye in a plurality of incidence positions. In the embodiment shown in
[0161] In certain embodiments, the system 200 may comprise a lens or other optical component configured to cause all light beams incident on eye 201 to travel in parallel. For example, the light beams may all be parallel to the optical axis of the eye. This may be achieved in one embodiment by locating a lens between reflector 204 and eye 201 with the point of reflection of the light beam from the reflector 204 being the focal point of the lens. In another embodiment two reflectors may be used, with the focal point of the lens being located between the reflectors. In other embodiments, another afocal arrangement of optical components acting on incident light may be provided. Such arrangements may be advantageous to ensure the optical properties of the light beam entering the eye are the same as those of the light beam generated by the light source.
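The geometry of this afocal arrangement can be illustrated numerically: a beam pivoting about the front focal point of a lens of focal length f exits parallel to the optical axis, laterally displaced by f·tan(θ). The focal length and angle values are illustrative:

```python
import math

# Lateral offset of the output beam in the afocal arrangement, where the
# reflector's pivot point sits at the focal point of the lens: every
# scanned beam leaves the lens parallel to the optical axis, displaced
# by f * tan(theta).

def parallel_beam_offset(focal_length_mm, scan_angle_deg):
    """Lateral displacement of the parallel output beam for a given
    reflector scan angle."""
    return focal_length_mm * math.tan(math.radians(scan_angle_deg))
```

This is why the arrangement preserves the optical properties of the beam while still allowing the incidence position on the eye to be scanned.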
[0162] Cameras 207 capture images of the light beam 203 passing through eye 201 for each of the plurality of light beam incidence positions.
[0163] In some embodiments the control system 208 controls activation of the light source 202 such that the light source repeatedly turns on and off, and control system 208 further controls the beam adjustment mechanism to adjust the components of the system (e.g. the orientation of reflector 204) that determine the incidence position of the light beam 203 while the light source 202 is turned off. Once the beam adjustment mechanism has suitably adjusted reflector 204, for example, the control system 208 re-activates the light source 202. This sequence is repeated until images have been captured for the required number of incidence positions. In this manner the total exposure time of the eye 201 to the light source can be reduced to improve safety.
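The exposure-minimising acquisition sequence can be sketched as a control loop. The device interfaces used here (the light source, reflector and camera objects and their method names) are hypothetical stand-ins, not APIs from the specification:

```python
# Acquisition loop in which the source is off while the reflector is
# re-aimed, and on only for the capture itself, minimising total exposure.

def acquire(light_source, reflector, cameras, incidence_positions):
    """Capture one image set per incidence position, keeping the light
    source switched off during every reflector adjustment."""
    frames = []
    for position in incidence_positions:
        light_source.off()       # beam off while re-aiming
        reflector.aim(position)  # beam adjustment mechanism moves
        light_source.on()        # expose only for the capture
        frames.append([cam.capture() for cam in cameras])
    light_source.off()           # leave the source off when finished
    return frames
```

A shutter-based embodiment would follow the same loop shape, with the on/off calls replaced by shutter open/close commands.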
[0164] In an alternative embodiment the ocular biometry system comprises a shutter configured to selectively block the light beam from the light source. Control system 208 is configured to control the shutter to expose the eye to the light beam once the beam adjustment mechanism has adjusted the components of the system as required. The shutter may be controlled by selectively moving it between a first position in which it blocks the light beam from the light source and a second position in which it does not block the light beam from the light source, for example.
[0165] The number of incidence positions, and therefore the number of captured images of the light beam 203 passing through eye 201, and the spacing between incidence positions, may be selected depending on the parameters of the eye desired to be obtained. In one embodiment a sufficient number of incidence positions are provided such that light beam 203 is incident across a sector of the retina subtending an angle of substantially 60°, as parameters across such a range may be particularly clinically useful in some circumstances. For example, this ensures parameters are determined for parts of the retina including the macula and blind spot.
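The relationship between sector coverage and position count can be made concrete with a small calculation. The angular step between incidence positions is an illustrative choice; the specification does not prescribe one:

```python
import math

# Number of incidence positions needed for the beam to sweep a retinal
# sector of a given angular subtense, at a given angular step.

def positions_for_sector(sector_deg=60.0, step_deg=1.0):
    """Positions needed to span the sector, including both end points."""
    return math.floor(sector_deg / step_deg) + 1
```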
[0166] In step 410 the geometric path lengths/eye parameters are calculated by processor 304 for each of the light beam incidence positions. This step comprises similar methods to those described above in relation to steps 406, 407 and 408 applied to each of the images captured by cameras 207 for each light beam incidence position. The result of this step, referring to the labelling in
[0167] Increasing the number of incidence positions may increase the accuracy of the calculated parameters of the eye but may lengthen the duration of the scan (period of taking measurements) and the time and complexity of calculations.
[0168] Multiple Light Sources
[0169] In some embodiments of the invention multiple light sources are shone into the eye. Such embodiments may collect more information on the structure of the eye.
[0170] An exemplary embodiment is shown in
Ocular biometry system 800 comprises two light sources 802a and 802b positioned laterally next to each other from the perspective of the patient whose eye 801 is being measured when standing in front of ocular biometry system 800. Light sources 802a and 802b project light beams 803a and 803b respectively incident on eye 801. Similarly to the configuration of system 200, each of light beams 803a and 803b is initially projected in the superior direction and is reflected off reflectors 804a and 804b respectively before entering the eye 801 (light sources 802a and 802b are shown in
[0172] In the embodiment of
[0173] In an alternative embodiment a single light source is provided and the ocular biometry system comprises a beam splitter to split the light beam from the single light source into two light beams for incidence on the eye, and reflectors to reflect the split light beams in parallel towards the eye. This avoids the expense of two light sources.
[0174] When incident on eye 801 the light beams 803a and 803b are spaced apart by a distance. In the embodiment shown in
[0175] The light sources 802, reflectors 804 and, if present, beam splitter, may be housed in a housing 805.
[0176] Ocular biometry system 800 further comprises a plurality of cameras 807 configured to capture images of the eye 801 when the first and second light beams 803 pass through the eye. At least two cameras 807 are provided. In the case of two cameras 807, they are positioned in a similar manner to the cameras 207 as described in relation to the embodiment shown in
[0177] In the embodiment of
[0178] Ocular biometry system 800 may also comprise a control system similar to that described with reference to
[0179] The projection of light beams 803 into the eye 801, the capture of images of the light beams when passing through the eye using cameras 807, and the calculation of optical path lengths and geometric path lengths to determine parameters of the eye is performed in a similar manner to that described in relation to ocular biometry system 200 above. Since two light beams 803 are incident on the eye, reflection points A, B, C and D are determined for each light beam, represented as A_1, A_2, B_1, B_2, etc. in
[0180] The parameters of the eye determined by applying the above-described method to the image data captured from ocular biometry system 800 are parameters of the eye at the positions at which the light beams 803 pass through the eye.
[0181] Multiple Light Sources and Multiple Light Beam Incidence Positions
[0182]
[0183] In certain embodiments, the system 800 may comprise one or more lenses or other optical components configured to cause all light beams from the same light source 802 incident on eye 801 to travel in parallel. For example, the light beams may all be parallel to the optical axis of the eye. This may be achieved in one embodiment by locating a lens between reflector 804 and eye 801 with the point of reflection of the light beam from the reflector 804 being the focal point of the lens. In another embodiment two reflectors for each light beam may be used, with the focal point of the lens being located between the reflectors. In other embodiments, another afocal arrangement of optical components acting on incident light may be provided. As has been explained above, such arrangements may be advantageous to ensure the optical properties of the light beam entering the eye are the same as those of the light beam generated by the light source.
[0184] Eye parameters are determined for multiple light beam incidence positions using the system shown in
[0185] Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise”, “comprising”, and the like, are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense, that is to say, in the sense of “including, but not limited to”.
[0186] The entire disclosures of all applications, patents and publications cited above and below, if any, are herein incorporated by reference.
[0187] Reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that that prior art forms part of the common general knowledge in the field of endeavour in any country in the world.
[0188] The invention may also be said broadly to consist in the parts, elements and features referred to or indicated in the specification of the application, individually or collectively, in any or all combinations of two or more of said parts, elements or features.
[0189] Where in the foregoing description reference has been made to integers or components having known equivalents thereof, those integers are herein incorporated as if individually set forth.
[0190] It should be noted that various changes and modifications to the presently preferred embodiments described herein will be apparent to those skilled in the art. Such changes and modifications may be made without departing from the spirit and scope of the invention and without diminishing its attendant advantages. It is therefore intended that such changes and modifications be included within the present invention.