Light microscope with automatic focusing
20200371335 · 2020-11-26
Inventors
CPC classification
G06T7/80 (PHYSICS)
H04N23/959 (ELECTRICITY)
G02B21/34 (PHYSICS)
H04N23/673 (ELECTRICITY)
H04N23/676 (ELECTRICITY)
International classification
G06T7/80 (PHYSICS)
Abstract
In a method for automatically focusing a microscope, an overview image that shows a sample and an environment is recorded. An image processing algorithm ascertains in the overview image at least one boundary of an object that is not the sample itself. A focus setting is then ascertained at a location of at least one of the ascertained boundaries.
Claims
1. A method for automatically focusing a microscope, comprising ascertaining, by means of an image processing algorithm, at least one boundary of an object that is not a sample itself in an overview image showing the sample and an environment, and determining a focus setting at a location of at least one of the ascertained boundaries.
2. The method of claim 1, wherein an actuating device performs a relative movement between the sample and a detection beam path with the result that the detection beam path is directed at the location of the ascertained boundary, and subsequently the focus setting is determined.
3. The method of claim 1, wherein the image processing algorithm comprises a segmentation algorithm or detection algorithm via which boundaries of objects are ascertained in the overview image, wherein the segmentation algorithm or detection algorithm is based on a machine learning algorithm.
4. The method of claim 1, wherein the image processing algorithm ascertains in the overview image a plurality of locations of one or more boundaries and selects one or more of the locations of the boundaries on the basis of predetermined criteria that have been taught using a machine learning algorithm for determining the focus setting.
5. The method of claim 1, wherein the location of a boundary selected is one of a plurality of boundary portions, wherein the boundary portion is selected in dependence on its alignment relative to an illumination direction in the subsequent determination of the focus setting.
6. The method of claim 1, further comprising providing a learning mode to a user, in which: reference overview images of reference samples are represented, a marking tool is provided for the user with which the user can mark one or more boundaries in each reference overview image, a machine learning algorithm of the image processing algorithm establishes, based on the reference overview images and the boundaries marked by the user, criteria with which a boundary is ascertained for determining the focus setting.
7. The method of claim 1, wherein the boundary ascertained in the overview image is or comprises an edge of the object.
8. The method of claim 1, wherein a plurality of edges and at least one corner of the object are ascertained in the overview image, and the corner is used as the location of the boundary of the object where the focus setting is determined.
9. The method of claim 1, wherein the object is a cover slip and the boundary is or comprises a cover slip edge or a cover slip corner, or the object is a microscope slide and the boundary is or comprises a microscope slide periphery, or in that the object is a multiwell plate and the boundary is or comprises a well periphery, or the object is a sticker, a printed area or a mark on a microscope slide and the boundary is or comprises a periphery of the sticker, of the printed area or of the mark.
10. The method of claim 1, wherein determining the focus setting comprises at least: recording a plurality of microscope images with different height focusing at the location of the ascertained boundary; determining a respective image sharpness quality factor of the microscope images; determining the microscope image or microscope images having the maximum image sharpness quality factor; and establishing a focus setting based on the microscope image or microscope images having the maximum image sharpness quality factor.
11. The method of claim 10, wherein, as part of the determination of the focus setting, a cover slip thickness is also determined, wherein a cover slip periphery is ascertained in the overview image as a boundary of the object, wherein during the determination of the microscope images with a maximum image sharpness quality factor, two microscope images with a local maximum in the image sharpness quality factor are ascertained, wherein a cover slip thickness is ascertained from the difference between height values of the two microscope images.
12. The method of claim 1, further comprising fine focusing on sample structures based on the determined focus setting, and recording at least one sample image with the fine focusing.
13. The method of claim 1, wherein as part of the determination of the focus setting, a microscope slide inclination is also ascertained, wherein at least three locations of a boundary are ascertained on which then in each case a focus setting that indicates a height position of the respective location is ascertained, wherein a microscope slide inclination is ascertained based on the at least three height positions.
14. The method of claim 13, further comprising selecting the locations of the boundary that are used for ascertaining the microscope slide inclination such that the sample is located laterally between these locations of the boundary.
15. The method of claim 13, wherein the number of locations of the boundary used for determining the microscope slide inclination is selected in dependence on a lateral distance between said locations.
16. The method of claim 13, further comprising using the ascertained microscope slide inclination for a focus setting in subsequent sample examinations, wherein for different lateral positions of the sample a respective focus setting is adjusted in accordance with the microscope slide inclination.
17. The method of claim 1, wherein, as part of the ascertainment of the focus setting, an orientation alignment is ascertained by ascertaining in the overview image at least one cover slip periphery and one microscope slide boundary as boundaries, ascertaining a focus setting and thus a height position in each case for the cover slip periphery and the microscope slide boundary, ascertaining, using a comparison of the height positions of the cover slip periphery and of the microscope slide boundary, whether the cover slip is located above or below the microscope slide.
18. The method of claim 17, wherein the ascertained microscope slide inclination is taken into account for the ascertainment of the orientation alignment by calculating a height offset based on the microscope slide inclination for the height positions to be compared.
19. The method of claim 1, further comprising ascertaining at least one cover slip periphery and one microscope slide boundary as boundaries in the overview image, determining a focus setting and thus a height position in each case for the cover slip periphery and the microscope slide boundary, and setting a focus setting for subsequent sample examinations to a height position, wherein the height position is between the determined height positions of the cover slip periphery and the microscope slide boundary.
20. A computer program product comprising instructions that, upon execution by a computing unit of a light microscope, cause the latter to carry out the method of claim 1.
21. A light microscope with automatic focusing, comprising an electronic control and evaluation device, which comprises an image processing algorithm that is designed to ascertain at least one boundary of an object that is not a sample itself in an overview image showing the sample and an environment, and wherein the electronic control and evaluation device is configured to determine a focus setting at a location of at least one ascertained boundary.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0043] A better understanding of the invention and various other features and advantages of the present invention will become readily apparent from the following description in connection with the schematic drawings, which are shown by way of example only, and not limitation, wherein like reference numerals may refer to like or substantially similar components.
[0051] Identical and identically acting constituent parts are generally identified by the same reference signs in the figures.
DETAILED DESCRIPTION OF EMBODIMENTS
[0057] In the light microscope 100 shown, measurement is effected in reflected light, that is to say both illumination light 5 and detection light 6 travel in part via the same optics elements 11-13, wherein a beam splitter 19, for example a colour or polarization beam splitter 19, is arranged in the common beam path. The beam splitter 19 in the example shown is transmissive for detection light 6 and reflective for illumination light 5, although this may also be designed to be the other way around. In the example depicted, the illumination light 5 and the detection light 6 also travel via the same objective 20 or 20B, although the invention is not limited to such designs. In other designs according to the invention, measurements are not performed in reflected light and the illumination light 5 and detection light 6 do not travel via the same optics elements or the same objective.
[0058] First, an overview image of the sample 62 is recorded using the light microscope 100. One or more suitable locations at which focusing takes place are ascertained using the overview image. Next, sample images with a larger imaging scale (magnification) than the overview image can be recorded. The recording of the overview image and subsequent measurements can be effected with different objectives 20, 20B. Depending on the setup, the same image sensor 30 or different image sensors can be used for the recording of the overview image and for subsequent measurements for ascertaining a focus setting or for recording sample images.
[0059] The invention makes automatic focusing possible, wherein the necessary calculation and control steps are performed by an electronic control and evaluation device 50. The control and evaluation device 50 is configured to evaluate image data of the image sensor 30, change the focusing of the light microscope and set a lateral position of the sample 62. For setting the lateral position of the sample 62, the control and evaluation device 50 controls, for example, an actuating device 42, which can comprise in particular a movable sample stage. Alternatively, a scanner can be driven by the control and evaluation device 50 to scan a specific lateral region. A lateral position designates the position of the sample 62 perpendicular to the detection axis, that is to say perpendicular to the direction in which the detection beam path is incident on the sample 62. The focusing is effected in the direction of the detection axis, which is also referred to as the z-direction. The lateral position is consequently defined by x-y coordinates. Different focus settings differ in the relative z-position between the sample 62 and the detection plane that is sharply imaged via the objective 20 or 20B and optics elements 11-16 onto the image sensor 30. For changing the focusing, the control and evaluation device 50 can adjust the sample 62 itself in the z-direction, in particular via a movable sample stage. Alternatively, a zoom optical unit can be present and be driven by the control and evaluation device 50 such that the zoom optical unit effects a focus adjustment. The zoom optical unit can be formed in the objective 20, 20B or separately therefrom.
[0060] The automatic focusing will now be described in more detail with reference to
[0062] The invention exploits the knowledge that other structures, in particular edges of objects that are not the sample itself, are frequently significantly easier to see, can be detected more easily in automated fashion and, owing to a large capture range, are suitable for ascertaining a focus setting. To this end, an image processing algorithm that is part of the control and evaluation device 50 first ascertains a boundary of an object. In the example illustrated, the object is the cover slip 63, and the boundary 63A of the cover slip 63 is formed by the edges 63B to 63E thereof.
[0063] The image processing algorithm can comprise a segmentation algorithm or detection algorithm to identify the boundary. In this case, a machine learning algorithm that was previously trained for this type of object can be used. This exploits the knowledge that, independently of the specific sample, uniformly and regularly shaped objects are frequently used. For example, cover slips are generally rectangular or circular. The size of the cover slip can also be trained easily if a multiplicity of measurements are performed on different samples having cover slips of the same type (or on the same sample). For training the image processing algorithm to find a boundary, the following factors can be considered: the boundary forms a symmetric shape, for example a rectangle, square, circle or, in particular in the case of arrangement inaccuracies, a parallelogram, trapezium or ellipse. Depending on the field of view recorded with the overview image 60, the object may additionally be visible in its entirety, with the result that the boundary forms a closed shape.
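As an illustration of this segmentation step, the following sketch finds the corners of a bright rectangular object in a synthetic grayscale overview image using plain global thresholding. It is a minimal stand-in, in Python/NumPy, for the trained segmentation or detection algorithm described above; the function name and threshold heuristic are purely illustrative.

```python
import numpy as np

def find_boundary_corners(overview, threshold=None):
    """Locate the corners of a bright rectangular object (e.g. a cover
    slip) in a grayscale overview image via simple global thresholding.

    A stand-in for the segmentation/detection step; a real system would
    typically use a trained machine-learning model instead.
    """
    img = np.asarray(overview, dtype=float)
    if threshold is None:
        threshold = img.mean()          # crude global threshold
    ys, xs = np.nonzero(img > threshold)
    if len(xs) == 0:
        return None                     # no object found
    # Axis-aligned corner estimates of the segmented region.
    x0, x1 = int(xs.min()), int(xs.max())
    y0, y1 = int(ys.min()), int(ys.max())
    return [(x0, y0), (x1, y0), (x0, y1), (x1, y1)]

# Synthetic overview: dark background with a bright rectangle.
overview = np.zeros((100, 120))
overview[20:80, 30:100] = 1.0
corners = find_boundary_corners(overview)
print(corners)  # [(30, 20), (99, 20), (30, 79), (99, 79)]
```

Any of the returned corners could then serve as the location at which the focus setting is determined, corresponding to the cover slip corner 65 in the example.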
[0064] In the example illustrated, the image processing algorithm ascertains the boundary 63A, that is to say the periphery of the cover slip 63. A location 65 of the cover slip boundary 63A is now used for a subsequent ascertainment of a focus setting. The location 65 designates a region of the boundary 63A, for example part of a cover slip edge 63B-63E or, as in the example illustrated, a cover slip corner where two cover slip edges 63D, 63E meet.
[0065] The control and evaluation device now moves the sample stage such that the detection beam path is centred on the location 65. In principle, such a lateral movement between the sample and the detection beam path can also be effected in a different way. Subsequently, a plurality of images (below: microscope images) that differ in the z-focusing are recorded at greater magnification than in the overview image. To set different focusing, the control and evaluation device changes either the z-position of the sample or the z-position of the detection focus, as described above. Two such microscope images 70 and 71, which were recorded with different focusing, are shown in
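The selection of the sharpest microscope image from such a z-stack can be sketched as follows. The variance-of-Laplacian metric is one common choice of image sharpness quality factor, not one prescribed by this disclosure:

```python
import numpy as np

def sharpness(img):
    """Image sharpness quality factor Q: variance of a discrete
    Laplacian response (one common focus metric among many)."""
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

def best_focus(z_positions, stack):
    """Return the z position whose microscope image maximises Q."""
    q = [sharpness(img) for img in stack]
    return z_positions[int(np.argmax(q))]

# Synthetic stack: a sharp step image flanked by smooth ramps.
sharp = np.zeros((20, 20)); sharp[:, 10:] = 1.0
ramp = np.tile(np.linspace(0.0, 1.0, 20), (20, 1))
z_best = best_focus([-5.0, 0.0, 5.0], [ramp, sharp, ramp])
print(z_best)  # 0.0
```

The sharp step image yields a higher quality factor than the smooth ramps, so the middle z position is selected as the focus setting.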
[0066] For example, different sample images can be recorded with the focus setting thus ascertained. For a clearer definition of terminology, sample images are intended to designate recordings of the image sensor that are directed at the sample 62 itself rather than at the periphery of an object (cover slip 63) that is not the sample itself, like the aforementioned microscope images 70, 71. With this terminology, the microscope images recorded for focus ascertainment differ from the sample images in their x-y positions.
[0067] Alternatively, the focus setting thus ascertained can also be regarded as a coarse focus setting and be used for subsequent fine focusing before sample images are recorded. After all, owing for example to the cover slip thickness, a slight deviation between the detection plane and the z-position of the sample can be present in the ascertained focus setting. The accuracy can lie, for example, in the range of typical cover slip thicknesses of approximately 20 μm. In the fine focusing, the detection beam path can be directed in particular at the sample 62 itself, and then images can be recorded with different fine focus settings that lie around the previously ascertained (coarse) focus setting. Next, the fine focus setting for which the recorded image has the greatest quality factor is ascertained and used for the subsequent sample recordings.
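A minimal sketch of such a fine-focus search around a coarse focus value follows. Here `record` and `quality` are assumed callables standing in for the microscope's image acquisition and the sharpness quality factor; they are not part of any real microscope API, and the scan span and step count are illustrative defaults.

```python
import numpy as np

def fine_focus(coarse_z, record, quality, span=10.0, steps=9):
    """Scan fine focus settings symmetrically around a coarse focus
    value and return the one with the greatest quality factor.

    `record(z)` and `quality(img)` are assumed callables for image
    acquisition and the sharpness quality factor, respectively.
    """
    zs = np.linspace(coarse_z - span / 2, coarse_z + span / 2, steps)
    qs = [quality(record(z)) for z in zs]
    return float(zs[int(np.argmax(qs))])

# Toy demonstration: the "image" is just z, and quality peaks at z = 3.
z_fine = fine_focus(3.0, record=lambda z: z,
                    quality=lambda img: -(img - 3.0) ** 2)
print(z_fine)  # 3.0
```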
[0068] Automated focusing can be reliably performed in the manner described. Advantageously, the capture range is large, i.e., the range of different z-focus settings from which a focus adjustment to a sharply imaging focus setting can be determined.
[0069] In
[0070] In the above examples, the cover slip periphery 63A is detected by the image processing algorithm as the boundary of the object that is not the sample 62 itself. However, the image processing algorithm can also be designed to additionally or alternatively determine other boundaries. For example, it can identify the microscope slide periphery 61A in the overview image 60 and select a portion of one of the edges 61B-61E or one of the corners of the microscope slide 61 as the location of the microscope slide periphery 61A for the subsequent ascertainment of the focus setting.
[0071] In further alternatives, other boundaries are selected, for example the periphery 64A of a sticker or printed area 64 on the microscope slide 61. However, it is not necessarily known in this case whether the sticker/printed area 64 is located on the sample side of the microscope slide 61 or on the opposite side of the microscope slide 61.
[0072] In further embodiment variants, other sample vessels are used, for example a multiwell plate.
[0073] In one variant of the invention, the cover slip thickness is additionally determined. In this case, a stack of microscope images 68-73 is recorded at a location of a boundary of the cover slip, as shown in
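The thickness estimate from the two local maxima of the image sharpness quality factor Q over such a z-stack (the two cover slip surfaces each produce a sharpness peak) can be sketched as follows; the height values and the simple peak-finding heuristic are assumptions for illustration:

```python
import numpy as np

def cover_slip_thickness(z_positions, q):
    """Estimate cover slip thickness from the sharpness curve of a
    z-stack recorded at the cover slip periphery: the two strongest
    local maxima of Q correspond to the two cover slip surfaces."""
    q = np.asarray(q, dtype=float)
    z = np.asarray(z_positions, dtype=float)
    # Interior local maxima of the quality-factor curve.
    peaks = [i for i in range(1, len(q) - 1)
             if q[i] > q[i - 1] and q[i] >= q[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need two local maxima for a thickness estimate")
    top_two = sorted(peaks, key=lambda i: q[i], reverse=True)[:2]
    return abs(z[top_two[0]] - z[top_two[1]])

z = [0, 30, 60, 90, 120, 150, 180]       # focus heights (illustrative units)
q = [0.1, 0.9, 0.2, 0.1, 0.3, 1.0, 0.2]  # sharpness quality factors
print(cover_slip_thickness(z, q))        # 120.0
```

The thickness is simply the difference between the height values of the two microscope images with a local maximum in Q, as stated in claim 11.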
[0074] The invention thus makes reliable automated focusing possible, wherein as part of the ascertainment of a suitable focus setting, additional measurement results are optionally also obtained, for example relating to the cover slip thickness, the microscope slide inclination or the microscope slide orientation. It is not necessary for the focusing that the sample be clearly visible. The computer program product according to the invention can also be realized with conventional light microscopes, that is to say no additional hardware is strictly necessary in the case of such light microscopes.
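The microscope slide inclination mentioned above can be estimated, for example, by fitting a plane through the height positions measured at three or more boundary locations. The following least-squares sketch is one possible realization, not the disclosed implementation; the coordinate values are illustrative.

```python
import numpy as np

def slide_inclination(points):
    """Fit a plane z = a*x + b*y + c through at least three (x, y, z)
    height measurements; (a, b) are the inclination slopes in x and y."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return float(a), float(b), float(c)

# Three boundary locations (x, y) with measured focus heights z.
a, b, c = slide_inclination([(0, 0, 1.0), (10, 0, 1.5), (0, 10, 1.0)])
```

For subsequent sample examinations, the focus setting at a lateral position (x, y) can then be corrected by the height offset a*x + b*y relative to c, in line with the inclination compensation of claims 13 to 16.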
LIST OF REFERENCE SIGNS
[0075]
5 Illumination light
6 Detection light
Detection beam path
Light source
11-16 Optics elements
19 Beam splitter
20 Objective with which a focus setting is ascertained
20B Objective with which an overview image is recorded
30 Image sensor
42 Actuating device for moving the sample
50 Control and evaluation device
60 Overview image
61 Microscope slide
61A Boundary / periphery of the microscope slide 61
61B-61E Edges of the microscope slide 61
62 Sample
63 Cover slip
63A Boundary / periphery of the cover slip 63
63B-63E Edges of the cover slip 63
64 Sticker/printed area on the microscope slide 61
64A Boundary / periphery between sticker/printed area 64 and microscope slide 61
65 Site of the boundary for ascertaining the focus setting
66 Multiwell plate
66A Well peripheries of the multiwell plate 66
68-73 Microscope images with different z-focusing
100 Light microscope
Q Image sharpness quality factor of the microscope images 68-73