OPTICAL SYSTEM FOR REAL-TIME CLOSED-LOOP CONTROL OF FUNDUS CAMERA AND IMPLEMENTATION METHOD THEREFOR

20220060634 · 2022-02-24

Abstract

An optical system for real-time closed-loop control of a fundus camera, and an implementation method therefor. The optical system comprises an optical path structure composed of a fundus camera, light sources (LS1, LS2), a plurality of lenses (L1, L2, L2′, L3′) and dividing mirrors (DM1, DM2), and further comprises an orthogonal steering mirror group comprising a first steering mirror (SM1) moving in a horizontal direction and a second steering mirror (SM2) moving in a vertical direction. The optical system converts fundus motion information obtained from a fundus camera image into residual motion information that has already been compensated by the first steering mirror (SM1) and the second steering mirror (SM2), and, using a relationship between control parameters, operates the first steering mirror (SM1) and the second steering mirror (SM2) in real time via a translation control instruction to compensate for translational motion, and/or controls the fundus camera via a fundus rotation control instruction to compensate for fundus rotation. By improving the optical system of the fundus camera in this way, the optical system is given a real-time closed-loop control function, enabling real-time optical tracking of a fundus/retina position and target.

Claims

1. An optical system for real-time closed-loop control of a fundus camera, comprising an optical path structure composed of a fundus camera, a light source, a plurality of lenses, and a dividing mirror, and characterized by further comprising an orthogonal steering mirror group, the orthogonal steering mirror group comprising a first steering mirror SM1 moving in a horizontal direction and a second steering mirror SM2 moving in a vertical direction; the optical system is arranged to convert fundus motion information obtained from an image of the fundus camera into residual motion information that has been compensated by the SM1 and SM2, and, using a relationship between control parameters, to manipulate the SM1 and SM2 in real time by a translation control instruction to compensate for a translational motion and/or to control the fundus camera by a fundus rotation control instruction to compensate for a fundus rotation.

2. The optical system for real-time closed-loop control of the fundus camera according to claim 1, characterized in that the relationship between the control parameters is expressed by equation (1):
(x.sub.t+1, y.sub.t+1, θ.sub.t+1)=(x.sub.t, y.sub.t, θ.sub.t)+g(Δx.sub.t, Δy.sub.t, Δθ.sub.t)  (1) wherein (x.sub.t, y.sub.t) is the translation control instruction accumulated on the first steering mirror SM1 and the second steering mirror SM2 at a current time point, θ.sub.t is the fundus/retinal rotation control instruction accumulated at the current time point; (Δx.sub.t, Δy.sub.t) is a residual fundus translation amount obtained from the image of the fundus camera, Δθ.sub.t is a residual fundus rotation amount obtained from the image; (x.sub.t+1, y.sub.t+1) is the translation control instruction that needs to be updated for the SM1 and SM2 at a next sampling time point, θ.sub.t+1 is the fundus/retinal rotation control instruction that needs to be updated at the next sampling time point; index t represents a time sequence; g is a gain of the closed-loop control system.

3. The optical system for real-time closed-loop control of the fundus camera according to claim 1, characterized in that the control instructions for controlling the SM1 and SM2 are configured to be sent from a personal computer or a dedicated processor connected to the fundus camera of the optical system.

4. The optical system for real-time closed-loop control of the fundus camera according to claim 1, characterized in that the SM1 and SM2 are a 6210H biaxial scanning mirror from CTI or an S334-2SL two-dimensional steering mirror from PI.

5. (canceled)

6. An optical system for real-time closed-loop control of a fundus camera, comprising an optical path structure composed of a fundus camera, a light source, a plurality of lenses, and a dividing mirror, and characterized in that the fundus camera is disposed on an eyeball rotation signal compensation device; an orthogonal steering mirror group is disposed in the optical path, the orthogonal steering mirror group comprising a first steering mirror SM1 moving in a horizontal direction and a second steering mirror SM2 moving in a vertical direction; the optical system is arranged to convert fundus motion information obtained from an image of the fundus camera into residual motion information that has been compensated by the SM1 and SM2, and, using a relationship between control parameters, to manipulate the SM1 and SM2 in real time by a translation control instruction to compensate for a translational motion and/or to control the eyeball rotation signal compensation device by a fundus rotation control instruction to compensate for a fundus rotation.

7. The optical system for real-time closed-loop control of the fundus camera according to claim 6, characterized in that the relationship between the control parameters is expressed by equation (1)′:
(x.sub.t+1, y.sub.t+1, θ.sub.t+1)=(x.sub.t, y.sub.t, θ.sub.t)+g(Δx.sub.t, Δy.sub.t, Δθ.sub.t)  (1)′ wherein (x.sub.t, y.sub.t) is the translation control instruction accumulated on the first steering mirror SM1 and the second steering mirror SM2 at a current time point, θ.sub.t is the rotation control instruction accumulated on the eyeball rotation signal compensation device at the current time point; (Δx.sub.t, Δy.sub.t) is a residual fundus translation amount obtained from the image of the fundus camera, Δθ.sub.t is a residual fundus rotation amount obtained from the image of the fundus camera; (x.sub.t+1, y.sub.t+1) is the translation control instruction that needs to be updated for the SM1 and SM2 at a next sampling time point, θ.sub.t+1 is the fundus/retinal rotation control instruction that needs to be updated for the eyeball rotation signal compensation device at the next sampling time point; index t represents a time sequence; g is a gain of the closed-loop control system.

8. The optical system for real-time closed-loop control of the fundus camera according to claim 6, characterized in that the eyeball rotation signal compensation device is a rotating stage capable of rotating the fundus camera along an optical axis to optically compensate for the fundus rotation amount in real time.

9. The optical system for real-time closed-loop control of the fundus camera according to claim 6, characterized in that the control instructions for controlling the SM1 and SM2 are configured to be sent from a personal computer or a dedicated processor connected to the fundus camera of the optical system.

10. The optical system for real-time closed-loop control of the fundus camera according to claim 6, characterized in that the SM1 and SM2 are a 6210H biaxial scanning mirror from CTI or an S334-2SL two-dimensional steering mirror from PI.

11. An optical system for real-time closed-loop control of a fundus camera, comprising an optical path structure composed of a fundus camera, a light source, a plurality of lenses, and a dividing mirror, and characterized in that the fundus camera is disposed on an eyeball rotation signal compensation device; an orthogonal steering mirror group is disposed in the optical path, the orthogonal steering mirror group comprising a first steering mirror SM1 moving in a horizontal direction and a second steering mirror SM2 moving in a vertical direction; the optical system is arranged to obtain a reference image from the fundus camera, to import fundus position information from outside or extract it from a real-time video using a cross-correlation algorithm, to calculate an offset amount, comprising a translation amount and a rotation amount, between any current image and the reference image, and, using a relationship between control parameters, to manipulate the SM1 and SM2 in real time by a translation control instruction to compensate for a translational motion and/or to control the eyeball rotation signal compensation device by a fundus rotation control instruction to compensate for a fundus rotation.

12. The optical system for real-time closed-loop control of the fundus camera according to claim 11, characterized in that the relationship between the control parameters is expressed by equation (1)″:
(x.sub.t+1, y.sub.t+1, θ.sub.t+1)=(x.sub.t, y.sub.t, θ.sub.t)+g(Δx.sub.t, Δy.sub.t, Δθ.sub.t)  (1)″ wherein (x.sub.t, y.sub.t) is the translation control instruction accumulated on the first steering mirror SM1 and the second steering mirror SM2 at a current time point, θ.sub.t is the rotation control instruction accumulated on the eyeball rotation signal compensation device at the current time point; (Δx.sub.t, Δy.sub.t) is a residual fundus translation amount obtained from the image of the fundus camera, Δθ.sub.t is a residual fundus rotation amount obtained from the image of the fundus camera; (x.sub.t+1, y.sub.t+1) is the translation control instruction that needs to be updated for the SM1 and SM2 at a next sampling time point, θ.sub.t+1 is the fundus/retinal rotation control instruction that needs to be updated for the eyeball rotation signal compensation device at the next sampling time point; index t represents a time sequence; g is a gain of the closed-loop control system.

13. The optical system for real-time closed-loop control of the fundus camera according to claim 11, characterized in that the eyeball rotation signal compensation device is a mechanical device capable of compensating for an eyeball rotation signal in real time.

14. The optical system for real-time closed-loop control of the fundus camera according to claim 11, characterized in that the control instructions for controlling the SM1 and SM2 are configured to be sent from a personal computer or a dedicated processor connected to the fundus camera of the optical system.

15. The optical system for real-time closed-loop control of the fundus camera according to claim 11, characterized in that the SM1 and SM2 are a 6210H biaxial scanning mirror from CTI or an S334-2SL two-dimensional steering mirror from PI.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0030] FIG. 1 is a schematic diagram of an optical system of an existing fundus camera;

[0031] FIG. 2 is an example of a fundus image captured by an existing fundus camera;

[0032] FIG. 3 is a schematic diagram of an optical system in which an existing primary fundus camera (i.e., a primary system) integrates a secondary system (i.e., an auxiliary system);

[0033] FIG. 4 is a schematic diagram of an optical system in which an existing primary fundus camera (i.e., a primary system) integrates a plurality of auxiliary systems;

[0034] FIG. 5 is a schematic diagram of an optical system structure for real-time closed-loop control of a fundus camera according to a first embodiment of the present invention;

[0035] FIG. 6 is a schematic structural diagram of an optical system for real-time closed-loop control of a fundus camera according to a second embodiment of the present invention;

[0036] FIG. 7 is a schematic diagram of a rotating imaging camera for compensating rotational motion of a fundus according to the present invention;

[0037] FIG. 8 is a schematic diagram of an optical system in which a primary fundus camera imaging system is used to implement closed-loop control of a secondary system according to the first embodiment of the present invention;

[0038] FIG. 9 is a schematic diagram of an optical system in which a primary fundus camera imaging system is used to implement closed-loop control of a secondary system according to the second embodiment of the present invention;

[0039] FIG. 10 is a schematic diagram of a correspondence relationship between imaging spaces of the primary and auxiliary systems according to the present invention.

DETAILED DESCRIPTION OF THE INVENTION

[0040] Hereinafter, the technical solution of the present invention will be further described in detail in connection with the drawings and embodiments of the present invention.

[0041] FIG. 1 is a schematic diagram of an optical system of an existing fundus camera. Incident light emitted from a light source (LS) 1 is collimated by a lens L1 and reaches a dividing mirror (DM) 1. A portion of the incident light is reflected by the DM1 and enters a lens L2 to reach the fundus (Eye), namely the retina. The light reflected from the fundus passes back through the lens L2, the dividing mirror DM1, and a focusing lens L3, and is finally received by the fundus camera.

[0042] In an embodiment of the present invention, the fundus camera may be operated independently, or may be operated by a PC or under the control of a dedicated processor. The fundus camera shown in FIG. 1 may be used to obtain an image of the fundus, as shown in FIG. 2. The image may be in color or black-and-white.

[0043] FIG. 3 is a schematic diagram of an optical system in which an existing primary fundus camera (i.e., a primary system, or known as a first system) integrates a secondary system (i.e., an auxiliary system).

[0044] In a clinical application, a secondary optical system is usually integrated into the primary fundus camera. In an embodiment of the present invention, a fundus camera with functions of closed-loop control and real-time optical tracking of the fundus position is defined as the primary system; meanwhile, another optical system integrated into the primary system, in either a common-path or non-common-path configuration, is defined as the auxiliary system.

[0045] As shown in FIG. 3, a light emitted by a light source LS2 of the auxiliary system reaches a dividing mirror DM2 after passing through optical elements of the auxiliary system, such as a lens L2 and scanning mirrors. In general, the DM2 transmits all light passing through the primary system, but reflects all light of the auxiliary system. The optical elements in the auxiliary system, such as scanning mirrors, may be controlled by the primary system or operated independently.

[0046] FIG. 4 is a schematic diagram of an optical system in which an existing primary fundus camera (i.e., a primary system) integrates a plurality of auxiliary systems.

[0047] As shown in FIG. 4, an optical system integrates a plurality of auxiliary systems on the basis of FIG. 3. Each auxiliary system may have its own specific function; for example, one is used for OCT (Optical Coherence Tomography), and another is used to project a focused beam onto the fundus for treatment purposes. The optical elements in each auxiliary system, such as scanning mirrors and other optical components, may be controlled by the primary system or operated independently.

[0048] FIG. 5 is a schematic diagram of an optical system structure for real-time closed-loop control of a fundus camera according to a first embodiment of the invention.

[0049] In an embodiment of the present invention, by improving the above-mentioned optical system structure of the fundus camera, the improved optical system of the fundus camera has a function of real-time closed-loop control and optical tracking of the fundus position/retinal target.

[0050] As shown in FIG. 5, the optical system for real-time closed-loop control of the fundus camera is provided with an orthogonal steering mirror group, which includes two orthogonally moving one-dimensional steering mirrors (SM): a first steering mirror SM1 moving in a horizontal direction and a second steering mirror SM2 moving in a vertical direction. Of course, the steering mirrors SM1 and SM2 may move in any direction and at any angle in a 360-degree space, as long as the motion axes of the steering mirrors SM1 and SM2 satisfy an orthogonal relationship.

[0051] In an embodiment of the present invention, a 6210H biaxial scanning mirror from CTI (Cambridge Technology Inc.) is used as the steering mirror elements.

[0052] In another embodiment of the present invention, the steering mirrors SM1 and SM2 may also be replaced with a two-dimensional steering mirror having two orthogonal motion axes. An implementable element is an S334-2SL two-dimensional steering mirror from PI (Physik Instrumente).

[0053] As shown in FIG. 5, the control instructions/signals for the steering mirrors SM1 and SM2 may be provided by a PC or a dedicated processor.

[0054] A simple implementation applies a cross-correlation algorithm to obtain a fundus position signal (x, y, θ) from the fundus image signal collected by the fundus camera, wherein (x, y) is a translation amount of the eyeball/retina and θ is a rotation amount of the eyeball/retina obtained from the fundus image. Specifically, a fundus image previously obtained in the time sequence is used as a reference image, defined as R, and a fundus image subsequently obtained from the fundus camera at any time point is defined as T.sub.k, wherein the index k (=1, 2, 3, . . . ) denotes the time sequence, all of which occur after the reference image. The cross-correlation algorithm xcorr(T.sub.k, R) is performed to obtain the spatial relative relationship (x, y, θ) between T.sub.k and R. The cross-correlation xcorr(T.sub.k, R) may be implemented by a conventional Fast Fourier Transform (FFT) or by other methods.
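The translation part of this cross-correlation step can be sketched as follows. This is a minimal illustration assuming FFT-based phase correlation on circularly shifted images; the function name `xcorr_shift` and the small regularizer are illustrative choices, not part of the patent, and recovery of the rotation component θ is omitted.

```python
import numpy as np

def xcorr_shift(template, reference):
    """Estimate the (x, y) translation of `template` relative to
    `reference` by FFT-based phase correlation. Recovering the rotation
    component θ would need an extra step (e.g. log-polar resampling),
    which is omitted from this sketch."""
    f_t = np.fft.fft2(template)
    f_r = np.fft.fft2(reference)
    cross_power = f_t * np.conj(f_r)
    cross_power /= np.abs(cross_power) + 1e-12  # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # convert the peak index to a signed shift (handle wrap-around)
    shifts = np.array(peak, dtype=float)
    size = np.array(corr.shape, dtype=float)
    shifts[shifts > size / 2] -= size[shifts > size / 2]
    dy, dx = shifts
    return dx, dy
```

For a circularly shifted copy of the reference, the correlation peak lands exactly at the shift, so the estimate is exact; real fundus frames would add noise, deformation, and boundary effects.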

[0055] The above three parameters (x, y, θ) may generally describe a motion of the eyeball/fundus target relatively completely.

[0056] In the embodiment shown in FIG. 5, a closed-loop control method is disclosed. The so-called closed-loop control means that the steering mirrors SM1 and SM2 are disposed in the optical path before the fundus camera, which serves as the signal detector. Before the fundus image signal enters the fundus camera, the image drift caused by fundus motion has already been compensated by the steering mirrors SM1 and SM2 in real time. Therefore, the image drift and rotation amounts obtained by the fundus camera at any time point are actually residual motion information. As mentioned above, the residual motion information is still obtained from the image through the cross-correlation algorithm. In this way, in the time-space domain, the above-mentioned relationship of the control parameters may be expressed in the form of equation (1):


(x.sub.t+1, y.sub.t+1, θ.sub.t+1)=(x.sub.t, y.sub.t, θ.sub.t)+g(Δx.sub.t, Δy.sub.t, Δθ.sub.t)  (1)

[0057] wherein (x.sub.t, y.sub.t) is the translation control instruction accumulated on the steering mirrors SM1 and SM2 at the current time point, and θ.sub.t is the fundus/retinal rotation control instruction accumulated at the current time point (in certain cases, such as when there is only translation without rotation, θ.sub.t is 0); (Δx.sub.t, Δy.sub.t) is the residual fundus translation amount obtained from the image of the fundus camera, and Δθ.sub.t is the residual fundus rotation (angle) amount obtained from the image; (x.sub.t+1, y.sub.t+1) is the translation control instruction that needs to be updated for the steering mirrors SM1 and SM2 at the next sampling time point, and θ.sub.t+1 is the fundus/retinal rotation control instruction that needs to be updated (by controlling the fundus camera) at the next sampling time point. The index t represents the time sequence, and g is the gain of the closed-loop control system.
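As a worked illustration, the accumulation rule of equation (1) reduces to a few lines of arithmetic. This is a hedged sketch: `closed_loop_step` and the toy simulation are hypothetical, the gain value 0.5 is arbitrary, and real mirror/rotation-stage drivers are replaced by pure arithmetic.

```python
def closed_loop_step(state, residual, gain=0.5):
    """Equation (1): add the gain-weighted residual motion, measured from
    the already-compensated image, to the accumulated control commands."""
    x, y, theta = state
    dx, dy, dtheta = residual
    return (x + gain * dx, y + gain * dy, theta + gain * dtheta)

def track(true_pose, steps=30, gain=0.5):
    """Toy simulation: the fundus sits at a fixed pose; the residual seen
    by the camera is whatever the mirrors have not yet compensated."""
    state = (0.0, 0.0, 0.0)
    for _ in range(steps):
        residual = tuple(t - s for t, s in zip(true_pose, state))
        state = closed_loop_step(state, residual, gain)
    return state
```

With 0 < g ≤ 1 the residual shrinks by a factor of (1 − g) per sample, so the accumulated commands converge to the true pose.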

[0058] In the above equation (1), the eyeball/retinal rotation signal may be compensated in a digital manner, or compensated in an optical-mechanical manner as shown in FIG. 6 below.
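For the digital route, the residual rotation can be removed in software by resampling the image about its center. The sketch below uses inverse nearest-neighbor mapping; the function name `derotate` and the interpolation choice are illustrative, not from the patent, and the sign convention for θ depends on the camera's coordinate frame.

```python
import math
import numpy as np

def derotate(img, theta_rad):
    """Resample a 2-D image by the measured rotation angle about its
    center, using inverse nearest-neighbor mapping, so the fundus
    rotation seen in the image can be undone digitally."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    c, s = math.cos(theta_rad), math.sin(theta_rad)
    for y in range(h):
        for x in range(w):
            # inverse map: which source pixel lands at output (y, x)?
            xs = c * (x - cx) - s * (y - cy) + cx
            ys = s * (x - cx) + c * (y - cy) + cy
            xi, yi = int(round(xs)), int(round(ys))
            if 0 <= xi < w and 0 <= yi < h:
                out[y, x] = img[yi, xi]
    return out
```

A production system would use an optimized, interpolating rotation routine; the explicit loops here only make the inverse mapping visible.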

[0059] FIG. 6 is a schematic structural diagram of an optical system for real-time closed-loop control of a fundus camera according to a second embodiment of the present invention.

[0060] In the optical-mechanical compensation, one method is to mount the fundus camera on an eyeball rotation signal compensation device, such as a rotation stage, so that the fundus camera may rotate about the optical axis for real-time optical compensation of the fundus rotation amount. In this case, θ.sub.t is the rotation control instruction accumulated on the rotation stage at the current time point, Δθ.sub.t is the residual fundus rotation amount obtained from the fundus camera image, and θ.sub.t+1 is the rotation control instruction that needs to be updated for the rotation stage at the next sampling time point. In this embodiment, the remainder of the optical path structure is the same as that shown in FIG. 5.

[0061] As shown in FIG. 6, the optical system for real-time closed-loop control of the fundus camera also provides another eyeball rotation signal compensation device, such as a mechanical device for real-time compensation of the eyeball rotation signal. In this embodiment, a two-dimensional imaging camera is mounted on a rotation stage (see the dashed rectangular box), and the rotation axis of the rotation stage coincides with the optical axis of the optical system.

[0062] As shown in equation (1), the fundus position information is usually obtained from the camera image by a cross-correlation method. First, a reference image is selected, and the fundus position information is imported from an external file or extracted from a real-time video; then, in the following time sequence, an offset amount, including a translation amount and a rotation amount, of any subsequent image relative to this reference image is calculated, such as (Δx.sub.t, Δy.sub.t, Δθ.sub.t) in equation (1).

[0063] FIG. 7 is a schematic diagram of a rotating imaging camera for compensating rotational motion of a fundus according to the present invention. The fundus rotation may be caused by many factors, including eyeball rotation, head rotation, or other causes, but for a fundus imaging system the final result is a rotation of the fundus image; these are therefore collectively referred to herein as fundus rotation.

[0064] As shown in FIG. 7, the lower part of FIG. 7A is the reference image, and the upper part is the position of the imaging camera. The lower part of FIG. 7B shows that, due to the rotation of the fundus, the image obtained by the fundus camera also rotates, such as "rotate by θ degrees" in the figure.

[0065] The cross-correlation algorithm obtains the rotation angle θ from the images in FIGS. 7A and 7B in a closed-loop manner according to equation (1). The angle θ may then be sent to the rotation stage that carries the fundus (imaging) camera, making the photosensitive surface of the imaging camera also rotate by θ degrees to compensate for the rotation amount θ of the fundus. The equivalent result of the rotation compensation is to restore the image obtained by the camera to the position of the lower image in FIG. 7A.

[0066] FIG. 8 is a schematic diagram of an optical system in which a primary fundus camera imaging system is used to implement closed-loop control of a secondary system according to the first embodiment of the present invention.

[0067] As shown in FIG. 8, a closed-loop control fundus tracking signal of the primary system is used to drive one or more similar optical-mechanical devices of the auxiliary system, so that the auxiliary system may also achieve the purpose of tracking the fundus position.

[0068] As shown in FIG. 8, the closed-loop control of the primary fundus camera imaging system still employs the relationship of equation (1). In order for the auxiliary system to also track the eyeball motion, the eyeball motion signal (x, y, θ) obtained by the primary imaging system needs to be converted for the scanning mirrors or tracking mirrors of the auxiliary system. The tracking mirrors of the primary imaging system, namely the steering mirrors SM1 and SM2, are used for tracking within the primary imaging system, and the scanning mirrors or tracking mirrors of the auxiliary system are used for optical tracking within the auxiliary system.

[0069] The spatial transformation relationship f(x, y, θ; x′, y′, θ′) between the tracking mirrors SM1 and SM2 of the primary imaging system and the scanning mirrors of the auxiliary system is obtained by system calibration. As such, at any sampling time point, the control signals sent to the tracking mirrors of the auxiliary system according to equation (1) satisfy the following relationship:


(x′.sub.t+1, y′.sub.t+1, θ′.sub.t+1)=f(x, y, θ; x′, y′, θ′)(x.sub.t+1, y.sub.t+1, θ.sub.t+1)  (2)

[0070] The result of the above equation (2) is used to adjust the position of the scanning mirrors of the auxiliary system to implement real-time tracking of the target by the auxiliary system. However, this group of signals does not include the control signals for the auxiliary system's own unique functions, such as OCT scanning of the fundus.
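The primary-to-auxiliary conversion of equation (2) can be sketched as a calibrated linear map. Purely for illustration this assumes the FIG. 10 geometry (auxiliary imaging surface half the size of the primary, with a shared center), so the calibration reduces to a diagonal matrix; the names `F` and `to_auxiliary` are hypothetical.

```python
import numpy as np

# Hypothetical calibration result for a case like FIG. 10: the auxiliary
# imaging surface is half the size of the primary with a shared center,
# so translations scale by 1/2 and rotation passes through unchanged.
F = np.diag([0.5, 0.5, 1.0])

def to_auxiliary(primary_cmd, transform=F):
    """Equation (2): map the primary tracking command (x, y, theta)
    onto the auxiliary system's scanning/tracking mirrors."""
    return transform @ np.asarray(primary_cmd, dtype=float)
```

In a real instrument the transform would come from a one-time system calibration rather than being written down by hand, and it need not be diagonal.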

[0071] FIG. 9 is a schematic diagram of an optical system in which a primary fundus camera imaging system is used to implement closed-loop control of a secondary system according to the second embodiment of the present invention.

[0072] FIG. 9 shows another embodiment of the present invention, in which the optical tracking closed-loop control of the primary imaging system is used to drive the tracking of one or more auxiliary imaging systems.

[0073] As shown in FIG. 9, the tracking mirrors of the primary imaging system, namely steering mirrors SM1 and SM2, are shared by all auxiliary systems. In other words, after the real-time compensation of the SM1 and SM2, the eyeball motion signal reaching the auxiliary system has also been compensated. If the auxiliary system is an imaging system, such as an OCT, the real-time image of the OCT has been stabilized by the SM1 and SM2.

[0074] In FIG. 9, the scanning mirrors of the auxiliary system are used only for its own purposes, such as the B-scan and C-scan of the OCT, or for navigating the spatial position of a focused laser beam projected onto the fundus. However, for particular reasons in the design of the optical system, the primary system and the auxiliary system may have different optical magnifications, spatial offsets, and the like. In that case, in order for the auxiliary system to accurately track the fundus position, it is still necessary to use a spatial transformation relationship such as equation (2) to convert the eyeball motion information of the primary system to the scanning mirrors of the auxiliary system to implement the optical tracking of the auxiliary system.

[0075] FIG. 10 is a schematic diagram of a correspondence relationship between imaging spaces of the primary and auxiliary systems according to the present invention.

[0076] As shown in FIG. 10, it is assumed that the size of the imaging surface of the primary optical system is exactly twice the size of the imaging surface of the auxiliary system, and that the imaging centers of the two systems coincide. This is very common in clinical practice; for example, the auxiliary system "digs out" a local area from the primary system for optical/digital magnification or other forms of imaging. Using the spatial transformation relationship of equation (2), it is then easy to obtain:


x′.sub.t+1=(x.sub.t+1)/2  (3)


y′.sub.t+1=(y.sub.t+1)/2  (4)


θ′.sub.t+1=θ.sub.t+1  (5)

[0077] Obviously, the spatial mapping relationship between the primary system and the auxiliary system may also take forms other than that of FIG. 10, in which case equations (3)-(5) obtained from equation (2) will have different results.

[0078] Once the design of the optical system is determined, this fixed relationship may generally be obtained by a one-time calibration measurement and calculation.

[0079] The foregoing descriptions are only preferred embodiments of the present invention, and are not used to limit the protection scope of the present invention.