Microscope and Method for Determining a Distance to a Sample Reference Plane
20220018652 · 2022-01-20
CPC classification
G02B21/365
PHYSICS
G01B11/14
PHYSICS
International classification
G01B11/14
PHYSICS
Abstract
A method for determining a distance of a sample reference plane of a sample carrier from a reference plane of a microscope, the microscope including a sample stage for the sample carrier and a camera, comprises the following steps: taking an overview image of the sample carrier by means of the camera; evaluating the overview image and thus detecting at least one characteristic of the sample carrier; ascertaining contextual data of the characteristic from a data set; and determining the distance of the sample reference plane from the reference plane based on the characteristic and the contextual data of the sample carrier. A microscope configured to determine the distance of the sample reference plane of the sample carrier from the reference plane is also described.
Claims
1. A method for determining a distance of a sample reference plane of a sample carrier from a reference plane of a microscope, the microscope comprising a sample stage for the sample carrier and a camera, wherein the method comprises the following steps: taking an overview image of the sample carrier by means of the camera; evaluating the overview image and thus detecting at least one characteristic of the sample carrier; ascertaining contextual data of the characteristic from a data set; and determining the distance of the sample reference plane from the reference plane based on the characteristic and the contextual data of the sample carrier.
2. The method of claim 1, wherein the characteristic is a reference object that has at least one of a known size and a known geometry as contextual data.
3. The method of claim 2, wherein the reference object is a calibration slide, an adhesive label, a holder for sample carriers, a structure on the holder for sample carriers or characters on the sample carrier.
4. The method of claim 1, wherein image areas of the sample carrier or of sample carrier parts are identified in the overview image as a characteristic; the contextual data indicate geometric data relating to physical dimensions or a shape of the sample carrier or sample carrier parts; and the distance of the sample reference plane from the reference plane is determined based on the shape or size of the identified image areas and the associated geometric data relating to the actual physical size or shape.
5. The method of claim 4, wherein the sample carrier parts whose image areas are identified in the overview image represent at least one sample receptacle; the contextual data comprise an indication of a physical shape or size of the sample receptacle; and the distance is determined based on the size or shape of the image areas of the at least one sample receptacle and based on the indication of the physical shape or size of the sample receptacle.
6. The method of claim 1, wherein a classification is carried out in which an object carrier type of the sample carrier is identified as a characteristic and contextual data are stored in the data set for different object carrier types.
7. The method of claim 6, wherein object carrier heights for different object carrier types are stored in the data set as contextual data; and the distance of the sample reference plane from the reference plane is determined based on the object carrier height of the presently provided object carrier type.
8. The method of claim 6, wherein the classification comprises a distinction of whether or not the sample carrier has a lid.
9. The method of claim 1, wherein data relating to distance-dependent overview-image depictions of different object carrier types are stored in the data set; the contextual data comprise stored data relating to a distance-dependent overview-image depiction of the identified object carrier type; and the distance is determined by evaluating geometric properties of a depiction of the sample carrier in the overview image while taking into account the stored data relating to the distance-dependent overview-image depiction of the identified object carrier type.
10. The method of claim 1, wherein a reference model implicitly comprising the contextual data is trained by means of machine learning with the step of evaluating the overview image and thus detecting a characteristic of the sample carrier and the step of ascertaining contextual data of the characteristic.
11. The method of claim 1, wherein the detection of the characteristic occurs by means of a machine learning model.
12. The method of claim 1, further comprising monitoring by means of image analysis whether a characteristic is contained in the overview image or in a further overview image and, if this is the case, automatically determining the distance.
13. A microscope with a distance determination system, the distance determination system comprising a sample stage for a sample carrier; a camera aimed at the sample stage; an evaluation unit, which is connected to the camera, the evaluation unit comprising: a) a data storage system for storing an overview image of the sample carrier at the sample stage; b) a trained machine learning model, wherein the trained machine learning model includes a trained reference model that is trained by a set of annotated training images of sample carriers and thus configured so that a characteristic of the sample carrier is detectable in the captured overview image; and c) a distance determination unit adapted to determine a distance of a sample reference plane from a reference plane of the microscope based on the detected characteristic of the sample carrier at the sample stage and associated contextual data.
14. The microscope of claim 13, wherein the evaluation unit is configured to carry out the method of claim 1.
15. The microscope of claim 13, further comprising a component that is configured to be controlled as a function of the determined distance.
16. The microscope of claim 15, wherein the component is a focus drive or an alarm system.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0051] A better understanding of the invention and of various other features and advantages of the present invention will become readily apparent from the following description in connection with the schematic drawings, which are provided by way of example only and not limitation, wherein like reference numerals may refer to like or substantially similar components.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0060] Different example embodiments are described in the following with reference to the figures. As a rule, similar components and components that function in a similar manner are designated by the same references.
Example Embodiment of FIG. 1
[0061] An example embodiment of a microscope 100 according to the invention is shown schematically in FIG. 1.
[0062] The microscope 100 comprises a sample stage 9, which is height-adjustable via a focus drive 19 and on which a sample carrier 10 can be positioned. The type of sample carrier 10 provided can vary depending on the measurement situation. In the illustrated example, the sample carrier 10 is a microtiter plate with a plurality of wells or sample receptacles 11 in which a sample can be received.
[0063] The microscope 100 further comprises at least one objective 15, which defines an optical axis 16 and which is used to observe one of the samples. The objective 15 conducts detection light from the sample to a microscope camera 17. Further optional components arranged between the objective 15 and the microscope camera 17 are not illustrated in the purely schematic drawing. Illuminating light is conducted via an optional condenser 14 onto the sample.
[0064] The microscope 100 further comprises a stand or microscope stand 20 via which microscope components—such as the components of the microscope 100 mentioned in the foregoing—are supported.
[0065] The microscope 100 further comprises a camera 30, which is an overview camera 31 here with which an overview image of the sample carrier 10 can be captured.
[0066] Potential sample carriers 10 can differ significantly with respect to their shape, size and the number of sample receptacles 11 they comprise. A depth of the sample receptacle 11, and thus a z-plane of the sample to be analyzed, can also vary depending on the sample carrier 10. The object carrier height H, which can vary according to the sample carrier 10, is indicated in FIG. 1.
[0067] The aim here is to obtain a height datum of the sample carrier 10 in a manner that is as simple, quick and reliable as possible. To this end, at least one overview image of the overview camera 31 is evaluated. The evaluation is carried out with an evaluation unit 91, which can be constituted, for example, by a computer. The evaluation unit 91 in this example comprises a data storage system 92 with which in particular an overview image is stored, a machine learning model M for evaluating the overview image, and a distance determination unit A. The evaluation unit 91 can be configured to carry out the steps of the machine learning model M and of the distance determination unit A in particular by running corresponding software. These steps are described in greater detail with reference to FIGS. 5 to 8.
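The division of labor inside the evaluation unit 91 can be sketched as follows; the class and function names are illustrative stand-ins chosen for this sketch, not part of the described microscope software:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List


@dataclass
class EvaluationUnit:
    # Stand-in for machine learning model M: maps an overview image
    # to a detected characteristic C (e.g. reference object, carrier type).
    detect_characteristic: Callable[[Any], Any]
    # Stand-in for distance determination unit A: maps the detected
    # characteristic C to the distance z.
    determine_distance: Callable[[Any], float]
    # Stand-in for data storage system 92.
    stored_images: List[Any] = field(default_factory=list)

    def process(self, overview_image: Any) -> float:
        self.stored_images.append(overview_image)       # store in 92
        characteristic = self.detect_characteristic(overview_image)
        return self.determine_distance(characteristic)  # distance z
```

A usage example would pass in concrete detection and distance routines, e.g. `EvaluationUnit(detect_fn, distance_fn).process(image)`.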
[0068] As illustrated in
[0069] A viewing direction of the overview camera 31 is oblique to the optical axis 16 of the objective 15. In the example illustrated in
Example Embodiment of FIG. 2
[0071] In further variants of
[0072] In further variations of
[0073] The evaluation of an overview image is explained in the following.
FIGS. 3 and 4
[0076] The perspective depiction in the overview image 40 depends on the distance z. In particular size, shape and perspective distortion depend on the distance z. It is thus possible to infer the distance z from the size or geometry in the overview image 40 if the actual physical size or geometry of the object is known. For example, if the physical size of the chessboard pattern or of another reference object R is known, then the distance z can be calculated as a function of the size and perspective distortion of the chessboard pattern in the overview image 40. In principle, the size of the reference object R alone can suffice for this calculation, although lateral variations of the sample carrier 10 generally also influence size in the overview image 40 if the imaging does not occur orthogonally to the sample carrier 10. It is thus often possible to achieve more precise results when distortion is (additionally) evaluated, for example how the angles of the chessboard pattern in the overview image 40 deviate from right angles.
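The size-based inference described above can be sketched with a simple pinhole-camera relation; the focal length in pixels is an assumed calibration value of this sketch, and the relation deliberately ignores the perspective distortion that, as noted, can refine the result:

```python
def distance_from_apparent_size(apparent_size_px: float,
                                physical_size_mm: float,
                                focal_length_px: float) -> float:
    """Pinhole-camera sketch: an object of known physical size S imaged
    with focal length f at apparent size s lies at z = f * S / s
    (distances measured along the viewing direction)."""
    if apparent_size_px <= 0:
        raise ValueError("apparent size must be positive")
    return focal_length_px * physical_size_mm / apparent_size_px
```

For example, a 10 mm chessboard square appearing 50 px wide under an assumed focal length of 1000 px yields z = 200 mm.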
[0077] Instead of or in addition to the reference object R, it is also possible to evaluate the shape of the sample carrier 10 in the overview image 40, for example which angles form its edges or how the essentially circular sample receptacles 11 are distorted. It is further expedient in this connection to consider a plurality of sample receptacles 11 together, as a different perspective distortion between the latter provides further information regarding the position of the sample carrier 10 relative to the camera 30.
[0078] As implied in
[0079] Different illustrative evaluation options for determining the distance z are described in the following with reference to FIGS. 5 to 8.
FIG. 5
[0081] First, the overview image 40 is fed to an image processing algorithm. As Step S1, the image processing algorithm runs a detection of a characteristic C of the sample carrier 10 in the overview image 40. The characteristic C can be, for example, a reference object R, as described with reference to FIGS. 3 and 4.
[0082] Alternatively or additionally, an object carrier type can be identified as the characteristic C. The image processing algorithm is designed to differentiate between different groups of predetermined object carrier types, which differ, for example, in height, in shape, in the number and size of provided sample receptacles, in the distance between such sample receptacles or in the lateral dimensions of the sample carrier. It is also possible to identify any labels or characters present on the sample carrier by means of image processing in order to identify the object carrier type.
[0083] The image processing algorithm is thus designed to ascertain whether or not a given predetermined object (the characteristic) is present in the overview image 40, wherein the depiction of this characteristic C varies with the distance z. The image processing algorithm also identifies which pixels in the overview image 40 belong to this characteristic C. This process is called classification.
[0084] Step S2 follows, in which contextual information or data K of the characteristic C is determined from a data set D. The data set D can be included in the aforementioned data storage system 92. If the characteristic C is a reference object R, then the contextual data K can be, for example, a physical geometry or size of this reference object R. If the characteristic C is the identified object carrier type, the contextual data K can be the height H of this object carrier type. Corresponding contextual data K can be stored in the data set for numerous different reference objects R and/or object carrier types.
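Step S2 amounts to a lookup in the data set D. The following sketch shows one possible layout; the keys and all numeric values are assumptions chosen for illustration, not values from the description:

```python
# Illustrative data set D: contextual data K keyed by characteristic C.
DATA_SET_D = {
    # object carrier types -> contextual data K: carrier height H in mm
    "microtiter_plate_96": {"height_mm": 14.4},
    "slide_with_coverslip": {"height_mm": 1.2},
    # reference objects R -> contextual data K: physical size in mm
    "chessboard_square": {"size_mm": 10.0},
}


def ascertain_contextual_data(characteristic: str) -> dict:
    """Step S2: look up contextual data K for a detected characteristic C."""
    try:
        return DATA_SET_D[characteristic]
    except KeyError:
        raise KeyError(f"no contextual data stored for {characteristic!r}")
```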
[0085] The distance z is then determined by the distance determination unit in Step S3 based on the characteristic C and the contextual data K provided via the data set D. For example, Step S3 can be implemented by Step S3a, which uses the shape and size of the characteristic C in the overview image 40 identified in S1. Since the physical shape and size of the characteristic C are known from the data set D, the distance z can be estimated from the shape and size of the characteristic C in the overview image 40.
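Steps S1 to S3a chain together as follows; the detector is passed in as a stand-in for the image processing algorithm, and the pinhole relation used for Step S3a is an assumption of this sketch:

```python
from typing import Callable, Dict, Tuple


def determine_distance_z(overview_image: object,
                         detect: Callable[[object], Tuple[str, float]],
                         data_set: Dict[str, float],
                         focal_length_px: float) -> float:
    """Chains Step S1 (detect characteristic C and its apparent size in
    the overview image), Step S2 (look up its physical size as contextual
    data K in data set D) and Step S3a (estimate distance z from the
    apparent versus physical size)."""
    characteristic, apparent_size_px = detect(overview_image)     # S1
    physical_size_mm = data_set[characteristic]                   # S2
    return focal_length_px * physical_size_mm / apparent_size_px  # S3a
```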
[0086] The steps described with reference to
FIGS. 6 and 7
[0088] Annotated training images 41 are used in this connection. The training images 41 comprise different overview images for which the respective distances are known and specified as the target variable T. This is illustrated in FIG. 6.
[0089] With reference to
[0090] The trained reference model P is able to evaluate an overview image 40 and to use characteristics C in the overview image 40 to infer the distance z. The trained reference model P uses contextual data K derived from the training images 41 in this scenario, wherein the contextual data K take the form of weights. The data set D is thus constituted by the reference model P and its weights.
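A minimal stand-in for this training can be written as one-parameter gradient descent; the feature choice (reciprocal apparent object size), the squared-error loss and all numbers are assumptions of the sketch, and the single learned weight plays the role of the contextual data K stored implicitly as weights:

```python
def train_reference_model(features, targets, lr=400.0, epochs=200):
    """Toy stand-in for training reference model P: fits one weight w
    by gradient descent so that the predicted distance w * x matches
    the annotated target distances T, where x is an image feature
    (here 1 / apparent object size in px)."""
    w = 0.0
    n = len(features)
    for _ in range(epochs):
        # gradient of the mean squared error loss L over the training set
        grad = sum(2 * ((w * x) - t) * x for x, t in zip(features, targets)) / n
        w -= lr * grad
    return w
```

With training pairs generated from the pinhole relation z = (f * S) * (1/s), the model recovers the product f * S as its weight, i.e. the contextual data it needs to map new overview images to distances.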
FIG. 8
[0092] A second ML model M2, which represents a distance determination unit A, follows. Its reference model P2 is intended to estimate the sought distance z from the output of the first ML model M1. During the training process, the second ML model M2 receives the outputs of the first ML model M1 as well as the already known distances as target variables T, as illustrated in FIG. 8.
[0093] In variants of the illustrated example embodiment, the second ML model M2 can also be replaced by a classic image processing algorithm without machine learning components. In this case, e.g., lengths and shapes of the image areas C1 of the characteristics C output by the first ML model M1 are measured. An analytic model can be stored which computes a distance z from these lengths or shapes. The analytic model can determine the distance z in particular from the roundness of sample receptacles or from the displayed angles of sample carrier structures in the overview image which physically form a right angle, e.g., coverslip edges, corners of rectangular sample receptacles or an outer corner of the sample carrier.
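One way to write such an analytic model for the roundness of a sample receptacle, under the assumptions (made for this sketch only) that the well is physically circular, that the ellipse major axis is unaffected by the tilt, and that a pinhole relation with an assumed focal length applies:

```python
import math


def distance_and_tilt_from_well(major_axis_px: float,
                                minor_axis_px: float,
                                well_diameter_mm: float,
                                focal_length_px: float) -> tuple:
    """A circular well of known diameter D appears as an ellipse when
    viewed obliquely: the axis ratio (roundness) gives the viewing tilt
    via arccos(minor / major), and the major axis gives the distance
    via the pinhole relation z = f * D / d_major."""
    if not 0 < minor_axis_px <= major_axis_px:
        raise ValueError("expected 0 < minor axis <= major axis")
    tilt_rad = math.acos(minor_axis_px / major_axis_px)
    distance = focal_length_px * well_diameter_mm / major_axis_px
    return distance, tilt_rad
```

For instance, a well with an assumed 6.8 mm diameter imaged as a 50 px by 25 px ellipse under an assumed focal length of 1000 px would be at 136 mm, viewed at a 60 degree tilt.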
[0094] The illustrated example embodiments share the advantage that a sample reference plane can be determined easily and quickly without requiring a complex apparatus. It is possible to combine features of different example embodiments. Individual features can also be omitted or varied within the scope of the attached claims.
List of References
[0095] 1 Sample reference plane
[0096] 2 Reference plane
[0097] 9 Sample stage
[0098] 10 Sample carrier
[0099] 11 Sample receptacle of the sample carrier 10
[0100] 12 Lid of the sample carrier 10
[0101] 14 Condenser
[0102] 15 Objective/Microscope objective
[0103] 16 Optical axis of the objective 15
[0104] 17 Microscope camera
[0105] 19 Focus drive
[0106] 20 Microscope stand
[0107] 30 Camera
[0108] 31 Overview camera
[0109] 40 Overview image
[0110] 41, 41a-41f Training images
[0111] 90 Distance determination system
[0112] 91 Evaluation unit
[0113] 92 Data storage system
[0114] 100 Microscope
[0115] A Distance determination unit
[0116] C Characteristic
[0117] C1 Image area of a characteristic C in the overview image 40
[0118] D Data set
[0119] H Object carrier height
[0120] K Contextual data
[0121] L Loss function
[0122] M, M1, M2 Machine learning model
[0123] P, P1, P2 Reference model of a machine learning model
[0124] R Reference object on the sample carrier 10
[0125] S1, S2, S3, S3a Steps of a method according to the invention
[0126] T, T1 Target variables during the training of a machine learning model
[0127] Ta-Tf Predetermined target variables for different training images 41a-41f
[0128] z Distance