APPARATUS AND METHOD FOR CAPTURING IMAGE DATA

20220137383 · 2022-05-05

    Abstract

    An apparatus includes a detection beam path, along which detection radiation is guided, and a means for splitting the detection radiation between first and second detection paths, with a detector being in each detection path. A microlens array is arranged upstream of at least one detector. The first detector has a first spatial resolution, and the second detector has a second spatial resolution that is lower than the first spatial resolution. Also, the first detector has a first temporal resolution, and the second detector has a second temporal resolution that is higher than the first temporal resolution. The captured image data are computationally combined to form a three-dimensionally resolved resulting image.

    Claims

    1. Apparatus (1) for capturing image data, comprising: a detection beam path, along which detection radiation of at least one microscope (11) is guided or is guidable; and a means for splitting the detection radiation (2) between a first detection path (3) and a second detection path (6); a first detector (4) in the first detection path (3) and a second detector (7) in the second detection path (6), wherein a microlens array (5, 8) is arranged upstream of at least one of the detectors (4, 7); wherein the first detector (4) has a first spatial resolution and the second detector (7) has a second spatial resolution, and the first spatial resolution is higher than the second spatial resolution; and/or the first detector (4) has a first temporal resolution and the second detector (7) has a second temporal resolution, wherein the first temporal resolution is lower than the second temporal resolution, characterized in that an evaluation unit (10) for evaluating the captured image data of the first and second detectors (4, 7) is present, wherein the evaluation unit (10) is configured in a manner such that the evaluation of the image data of both detectors (4, 7) takes place and a three-dimensionally resolved resulting image is produced from said evaluated image data.

    2. Apparatus according to claim 1, characterized in that the first detector (4) has a higher spatial resolution than the second detector (7); a pinhole (23) is present in the first detection path (3) in an intermediate image plane and optically upstream of the first detector (4), with the result that confocal capturing of the detection radiation takes place by means of the first detector (4); and a microlens array (8) is arranged upstream of the second detector (7).

    3. Apparatus (1) according to claim 1, characterized in that the first spatial resolution is higher than the second spatial resolution at least by a factor of 1.5, and the first temporal resolution is lower than the second temporal resolution at least by a factor of 2.

    4. Apparatus (1) according to claim 1, characterized in that the means for splitting the detection radiation (2) is a beam splitter, a dichroic beam splitter, or a switchable mirror.

    5. Microscope (11) comprising: a light source (15); a detection beam path, along which detection radiation of at least one microscope (11) is guided or is guidable; and a means for splitting the detection radiation (2) between a first detection path (3) and a second detection path (6); a first detector (4) in the first detection path (3) and a second detector (7) in the second detection path (6), wherein a microlens array (5, 8) is arranged upstream of at least one of the detectors (4, 7); wherein the first detector (4) has a first spatial resolution and the second detector (7) has a second spatial resolution, and the first spatial resolution is higher than the second spatial resolution; and/or the first detector (4) has a first temporal resolution and the second detector (7) has a second temporal resolution, wherein the first temporal resolution is lower than the second temporal resolution, characterized in that an evaluation unit (10) for evaluating the captured image data of the first and second detectors (4, 7) is present, wherein the evaluation unit (10) is configured in a manner such that the evaluation of the image data of both detectors (4, 7) takes place and a three-dimensionally resolved resulting image is produced from said evaluated image data.

    6. Microscope (11) according to claim 5, characterized in that the light source (15), in particular a laser light source (15), and an objective (18) functioning as an illumination objective are present in an illumination beam path, wherein widefield illumination is generated.

    7. Microscope (11) according to claim 5, characterized in that the light source (15), an objective (18) functioning as an illumination objective, and an apparatus for generating a light sheet (19) are present in an illumination beam path, wherein the light sheet (19) is generated on the object side upstream of the objective (18) in a sample space (20).

    8. Microscope (11) according to claim 6, characterized in that the light source (15) is embodied for providing pulsed illumination light, in particular having pulse durations in the picosecond or femtosecond range.

    9. Microscope (11) according to claim 7, characterized in that the apparatus for generating a light sheet (19) is a cylindrical optical unit or a scanning apparatus (17), wherein an illumination light beam of the light source (15) that is shaped due to the effect of the cylindrical optical unit or an illumination light beam of the light source (15) that is deflected by means of the scanning apparatus (17) is directed into an entrance location in an objective pupil (EP) of the objective (18), said entrance location lying outside of the optical axis (oA) of the objective (18).

    10. Microscope (11) according to claim 5, characterized in that settable optical means (21) are present in the illumination beam path, the effect of which is that a thickness of the light sheet (19) transversely with respect to a light sheet plane is settable.

    11. Method for capturing image data, in which detection radiation of at least one microscope (11) is split between a first detection path (3) and a second detection path (6), wherein a microlens array (5, 8) is present at least in one of the two detection paths (3, 6) and is arranged optically upstream of a detector (4, 7) that is likewise present there; in the first detection path (3), the detection radiation is captured by means of a first detector (4) with a first temporal resolution and a first spatial resolution, in the second detection path (6), the detection radiation is captured by means of a second detector (7) with a second temporal resolution and a second spatial resolution, wherein the first temporal resolution is lower than the second temporal resolution and/or the first spatial resolution is higher than the second spatial resolution, and the captured image data of both detectors (4, 7) are computationally combined to form a three-dimensionally resolved resulting image.

    12. Method according to claim 11, characterized in that image data of the higher spatial resolution or image data of the higher temporal resolution are used for computationally increasing the spatial resolution and/or the temporal resolution of the image data with the lower spatial resolution or the lower temporal resolution, respectively.

    13. Method according to claim 11, characterized in that in each case only a selected portion of the detector elements of the second detector (7) is read out.

    14. Method according to claim 11, characterized in that the detection radiation is guided temporally alternately along the first detection path (3) and the second detection path (6), wherein the switching time points and the time durations of the switching are established on the basis of the frame rate of the first detector (4).

    15. Method according to claim 11, characterized in that the computational combination of the image data and/or the combination of the image data or of the images to form a resulting image are/is performed with the application of machine learning, in particular with the application of CNNs (convolutional neural networks).

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0078] The invention will be described in more detail below on the basis of exemplary embodiments and figures. In the figures:

    [0079] FIG. 1 shows a schematic illustration of a first exemplary embodiment of an apparatus according to the invention;

    [0080] FIG. 2 shows a schematic illustration of a second exemplary embodiment of an apparatus according to the invention;

    [0081] FIG. 3 shows a schematic illustration of a third exemplary embodiment of an apparatus according to the invention;

    [0082] FIG. 4 shows a schematic illustration of a first exemplary embodiment of a microscope with an apparatus according to the invention and with means for generating a light sheet;

    [0083] FIG. 5 shows a schematic illustration of a second exemplary embodiment of a microscope with an apparatus according to the invention and with means for generating a light sheet;

    [0084] FIG. 6 shows a flowchart of a first configuration of the method according to the invention; and

    [0085] FIG. 7 shows a flowchart of a second configuration of the method according to the invention.

    DETAILED DESCRIPTION

    [0086] A general setup of an apparatus 1 according to the invention has along a beam path a means 2 for splitting detection radiation in the form of a beam splitter (beam splitter 2), the effect of which is to split detection radiation between a first detection path 3, having a first detector 4 and, arranged upstream thereof, a first microlens array 5, and a second detection path 6, having a second detector 7 and, arranged upstream thereof, a second microlens array 8. The microlens arrays 5 and 8 are each arranged in a pupil. If optical lenses 9 are specified in the exemplary embodiments, they optionally also stand for corresponding combinations of optical elements (lens systems).

    [0087] The first detector 4 allows a first spatial resolution that is higher than the spatial resolution of the second detector 7. A temporal resolution of the first detector 4 (slowcam) is lower than the temporal resolution of the second detector 7 (fastcam). In further embodiments, the first and second detectors 4 and 7 can also be arranged in the respectively other detection path 3 or 6.

    [0088] Detection radiation that comes from a microscope 11 and is focused due to the effect of a tube lens 9TL passes through an optional field stop 14 in an intermediate image plane, passes to an optical lens 9, and is split due to the effect of the beam splitter 2 between the first detection path 3 and the second detection path 6. The imaging of the pupil plane of the microscope 11, in particular of a pupil plane (back focal plane) of the microscope objective 18 (see e.g. FIG. 4), into the planes of the microlens arrays 5 and 8 is effected via the lens system 9TL, 9. The lens 9TL functions as a tube lens, while the downstream lens 9 acts as a Fourier lens, i.e. brings about a Fourier transform of the detection radiation.

    [0089] The image data captured by the detectors 4, 7 are fed to an evaluation unit 10 in the form of a computer or an FPGA. The latter is configured such that the captured image data are evaluated taking into account location information, angle information, and intensity values. For example, either the captured portions of the angle information of each detector 4, 7 are computationally combined to form a three-dimensionally resolved image per detector, which images are subsequently combined to form a resulting image, or the captured portions of the angle information of both detectors 4, 7 are combined directly to form a three-dimensionally resolved resulting image.
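    By way of a non-limiting illustration, the light-field evaluation described above can be reduced to two steps: splitting the raw sensor image into angular sub-views (one view per pixel position under each microlens) and shift-and-add refocusing of those views onto a chosen plane. The following minimal numpy sketch assumes square microlenses covering an integer number of pixels each; the function names and the nearest-integer shifts are illustrative only and are not taken from the patent.

```python
import numpy as np

def extract_views(raw, n_lens, n_ang):
    """Split a raw MLA image into n_ang x n_ang angular sub-views.

    raw has one (n_ang x n_ang) pixel block under each of the
    n_lens x n_lens microlenses; view (u, v) collects pixel (u, v)
    from every block, i.e. one viewing angle across the whole field.
    """
    views = np.empty((n_ang, n_ang, n_lens, n_lens))
    for u in range(n_ang):
        for v in range(n_ang):
            views[u, v] = raw[u::n_ang, v::n_ang]
    return views

def refocus(views, shift):
    """Shift-and-add refocusing: shifting each angular view in
    proportion to its angle before summing selects one focal plane."""
    n_ang = views.shape[0]
    c = (n_ang - 1) / 2
    acc = np.zeros(views.shape[2:])
    for u in range(n_ang):
        for v in range(n_ang):
            du = int(round(shift * (u - c)))
            dv = int(round(shift * (v - c)))
            acc += np.roll(views[u, v], (du, dv), axis=(0, 1))
    return acc / n_ang ** 2
```

    With shift = 0 the refocused image reduces to the plain average of the angular views, i.e. to the nominal focal plane; sweeping the shift produces a focal stack from a single exposure.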

    [0090] The evaluation unit 10 is optionally connected to a display 13, for example a monitor, on which the image data and/or the resulting image or a resulting image stack are representable. Moreover, the evaluation unit 10 is optionally connected to a control unit 12, which can be in particular a constituent part of the microscope 11. In further possible embodiments, the control unit 12 is not an integral part of the microscope 11, but can be connected to the latter in a manner suitable for transmitting data (see e.g. schematically in FIG. 2).

    [0091] The control unit 12 is configured for generating control signals on the basis of results of the evaluation unit 10. Said control signals can serve for controlling functions of the microscope 11.

    [0092] In a further exemplary embodiment of the apparatus 1, the microlens arrays 5, 8 are arranged in each case in a nominal image plane nBE (FIG. 2). From there, the captured detection radiation is directed onto the respective detector 4 or 7 due to the effect of the microlenses. The nominal image plane nBE thus represents an intermediate image plane. An optical lens 9 functioning as a focusing lens is arranged upstream of the beam splitter 2. If a (point) light source (not shown), the light of which is intended to be captured, is located in an object plane (focal plane) of the objective 18 (see, for example, FIG. 4), the point light source is imaged (in an idealized fashion) in the shape of a point onto the microlens array (MLA) 5, 8. If the point light source is situated above or below the object plane in the detection direction, it is imaged not exactly into the nominal image plane nBE but behind or in front of it (see, by way of example, FIG. 2). The spatial position of the point light source can be computed for example by means of the correspondingly configured evaluation unit 10 on the basis of the intensity values that have been captured by means of the individual pixels and additionally represent location information, and on the basis of the captured angle information.
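    As an illustration of this computation, a toy-model sketch: the RMS radius of the spot that a point source produces in the plane of the MLA grows with defocus, so a hypothetical linear calibration factor suffices to recover the magnitude of the axial offset. Both the calibration constant and the function names below are assumptions for illustration; a real evaluation unit would calibrate against a known z-stack and would use the angle information to obtain the sign of the offset.

```python
import numpy as np

def spot_radius(img):
    """RMS radius of an intensity spot; grows as the point source
    moves out of the nominal image plane nBE."""
    ys, xs = np.indices(img.shape)
    w = img / img.sum()
    cy, cx = (w * ys).sum(), (w * xs).sum()
    return np.sqrt((w * ((ys - cy) ** 2 + (xs - cx) ** 2)).sum())

def axial_position(img, radius_per_um):
    """Map spot radius to |z| with a linear calibration factor
    (radius_per_um is a hypothetical calibration constant)."""
    return spot_radius(img) / radius_per_um
```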

    [0093] The third exemplary embodiment of the apparatus 1 likewise has the microlens arrays 5 and 8 in each case in a nominal image plane nBE (FIG. 3). The detection radiation is focused into the nominal image plane nBE due to the effect of optical lenses 9 arranged in the respective detection paths 3 and 6.

    [0094] The exemplary embodiments illustrated in FIGS. 1 to 3 can be used for example for two-channel light field microscopy. Spectral components of the detection radiation can be separated by means of the beam splitter 2.

    [0095] In an exemplary embodiment of the apparatus 1 according to the invention in a microscope 11 (FIG. 4), a light source 15 for providing laser light as excitation light, optical lenses 9, a light-directing device 17 or scanning apparatus 17, a color splitter 16, and an objective 18 with an entrance pupil EP, said objective functioning as an illumination objective, are present in an excitation beam path. The light source 15, in particular in the form of a laser light source, can optionally be operated in a pulsed manner.

    [0096] An optical lens 9 and the beam splitter 2, by means of which detection radiation is directed along the first detection path 3 with the first microlens array 5 and the first detector 4 and/or along the second detection path 6 with the second microlens array 8 and the second detector 7, are arranged in a detection beam path (symbolized by interrupted solid lines). The detectors 4 and 7 are connected to the evaluation unit 10 and the latter is connected to the control unit 12 in a manner suitable for exchanging data. By means of the control unit 12, it is possible to generate control commands that serve for controlling the scanning apparatus 17 (henceforth also: scanner 17). In further embodiments, the light source 15 can also be controlled by the control unit 12.

    [0097] During the operation of the microscope 11 comprising the apparatus 1 according to the invention, laser light emitted by the laser light source 15 is focused and passes to the scanning apparatus 17. The scanning apparatus 17, which is controlled by the control unit 12, deflects the laser light in a controlled manner in an x-direction x and/or in a y-direction y. The scanning apparatus 17 can be used to vary the angle of incidence and an entrance location of the excitation light in the entrance pupil EP (objective pupil).

    [0098] The excitation light, after passing through the dichroic color splitter 16, is directed into an entrance location in the entrance pupil EP that lies away from the optical axis oA of the objective 18. As a result, a light sheet 19 which is inclined with respect to the optical axis oA in a correspondingly inclined light sheet plane is generated on the object side by way of the objective 18. If a sample is located in a sample space 20 upstream of the objective 18, the light sheet 19 can be directed into said sample.

    [0099] Optionally settable optical means 21, such as a zoom optical unit or a stop, for example, the effect of which is that a thickness of the light sheet 19 transversely with respect to the light sheet plane is settable (only shown by indication), can be present in the excitation beam path (=illumination beam path). The settable optical means 21 can be controlled by means of the control unit 12.

    [0100] Due to the effect of the light sheet 19 formed from the excitation light, fluorescence can be excited in the sample and be emitted as detection light (detection radiation). Emitted detection light is collected by the objective 18, which serves both as the illumination objective and as the detection objective. In the color splitter 16, the detection light, which has a longer wavelength than the excitation light, is reflected into the further course of the detection beam path and passes via the beam splitter 2 to the first microlens array 5 and/or the second microlens array 8. The microlenses, shown by indication, can be regarded as individual imaging systems. The image points brought about by the individual microlenses are captured as image data by correspondingly positioned detector elements of the detectors 4 and 7, respectively, and are fed to the evaluation unit 10.

    [0101] One further possible embodiment of the invention is illustrated in FIG. 5, based on FIG. 4. A further dichroic beam splitter 22 is arranged in the illumination beam path between the light source 15 and the scanning apparatus 17. Due to the effect of said beam splitter, detection radiation which, coming from the sample space 20, has passed through the beam splitter 16 and the subsequent optical elements and has been converted into a stationary beam (descanned) due to the effect of the scanning apparatus 17 is directed into the last section of the first detection path 3. In this exemplary embodiment, the beam splitter 16 (also) functions for splitting the captured detection radiation between the first and second detection paths 3, 6 (beam splitter 2) and can be dichroic or split detection radiation in a specific ratio. The detection radiation is focused by means of an optical lens 9 into an intermediate image plane, in which a pinhole 23 in the form of a pinhole stop or a slit stop is situated. Due to the effect of the pinhole 23, the portions that originate from out-of-focus regions are removed from the beam of the detection radiation or at least largely reduced. By way of example, a secondary electron multiplier (photomultiplier tube, PMT), an array of a plurality of PMTs, or a two-dimensional detector (see above) can be used as the first detector 4. The first detector 4 is connected to the evaluation unit 10. The latter is in turn connected to the scanning apparatus 17 in order to obtain data relating to a respectively current alignment of the scanning apparatus 17. On the basis of the current alignment, a position in an X-Y-plane can be assigned to the individual image data captured by means of the first detector 4. 
Information relating to the axial position (position in the z-direction, Z-position) can be ascertained on the basis of the known position of the current focal plane of the objective 18 and optionally taking account of a point spread function (PSF) that is known for the image capturing system. Image data can be captured at different z-positions (z-stack). In this way, three-dimensionally resolved image data can be captured with the first detector 4. Owing to the design of the first detection path 3 as a confocal detection path, a higher spatial resolution by comparison with the second detector 7 is achieved. If switching is effected alternately between capturing by means of the first detection path 3 (confocal) and the second detection path 6, the settable optical means 21 can accordingly be controlled to generate either an illumination light spot or a light sheet 19.
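    A minimal sketch of this bookkeeping, with hypothetical names: each PMT sample is tagged with the scanner-derived lateral position (x, y) and the index z of the current focal plane, accumulated into a volume, and averaged wherever a voxel is visited more than once. PSF handling and interpolation are omitted for brevity.

```python
import numpy as np

def assemble_zstack(samples, shape):
    """Place point-detector (PMT) samples into a 3-D volume.

    samples: iterable of (z, y, x, intensity) tuples, where z is the
    focal-plane index and (y, x) comes from the scanner alignment.
    Repeated visits to the same voxel are averaged.
    """
    vol = np.zeros(shape)
    hits = np.zeros(shape)
    for z, y, x, val in samples:
        vol[z, y, x] += val
        hits[z, y, x] += 1
    return vol / np.maximum(hits, 1)  # untouched voxels stay 0
```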

    [0102] Due to the effect of the beam splitter 2, 16, the second detection path 6 receives detection radiation that is imaged onto the second detector 7 and captured. A microlens array 8 is arranged upstream of the second detector 7. In further embodiments, the beam splitter 2, 16 can be replaced by a switchable mirror, for example. The image data captured by means of the first detector 4 and the second detector 7 are combined by the evaluation unit 10 and a three-dimensional resulting image is computed.

    [0103] The method according to the invention can be carried out in two alternative configurations. In the first alternative (FIG. 6), the detection radiation is directed into the first and/or the second detection path 3, 6 and is captured there by means of the detector 4, 7 that is respectively present for example in accordance with the principle of light field technology. In each case, a three-dimensionally resolved image is computed both from the captured image data of the slowcam of the first detection path 3 and also from the image data of the fastcam of the second detection path 6. Subsequently, the three-dimensionally resolved images of the two detection paths 3, 6 are combined to form a resulting image.
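    The final combination step of this first variant can be sketched as follows. The resampling method (nearest-neighbour lookup) and the equal default weighting are illustrative assumptions, not prescribed by the method: the coarser fastcam stack is brought onto the slowcam grid and the two stacks are blended with a settable weight.

```python
import numpy as np

def fuse_stacks(slow, fast, w_slow=0.5):
    """Blend two 3-D stacks: upsample the coarser fastcam stack to
    the slowcam grid by nearest-neighbour lookup, then average."""
    fz = slow.shape[0] / fast.shape[0]
    fy = slow.shape[1] / fast.shape[1]
    fx = slow.shape[2] / fast.shape[2]
    zi = (np.arange(slow.shape[0]) / fz).astype(int)
    yi = (np.arange(slow.shape[1]) / fy).astype(int)
    xi = (np.arange(slow.shape[2]) / fx).astype(int)
    fast_up = fast[np.ix_(zi, yi, xi)]  # fastcam stack on slowcam grid
    return w_slow * slow + (1 - w_slow) * fast_up
```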

    [0104] In an alternative configuration of the method, the captured image data of the slowcam and of the fastcam are combined to form a resulting three-dimensionally resolved image without previously generating at least one three-dimensionally resolved image for each of the detection paths 3, 6 or for each of the detectors 4, 7 (FIG. 7).

    [0105] In further configurations of the method, the image data captured by means of the first detector 4 and the second detector 7 or respectively images calculated therefrom can be mapped to one another using a CNN (convolutional neural network) and a three-dimensionally resolved resulting image can be calculated.
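    The building block of such a CNN-based mapping is a convolution over the stacked detector images. The following numpy sketch shows a single convolutional layer with ReLU activation acting on the two images as input channels; in practice the kernel would be learned from training data, and the single layer with 'valid' padding is a simplification for illustration only.

```python
import numpy as np

def conv2d(x, k):
    """'valid' 2-D convolution of a (C, H, W) input with a
    (C, kh, kw) kernel, summed over channels; the basic CNN op."""
    C, H, W = x.shape
    _, kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[:, i:i + kh, j:j + kw] * k).sum()
    return out

def fuse_cnn(slow_img, fast_img, kernel):
    """Map the two detector images to one fused image with a single
    convolutional layer plus ReLU; a trained network would stack
    many such layers and learn `kernel` from data."""
    x = np.stack([slow_img, fast_img])         # two input channels
    return np.maximum(conv2d(x, kernel), 0.0)  # ReLU activation
```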

    LIST OF REFERENCE SIGNS

    1 Apparatus
    2 Means for splitting the detection radiation/beam splitter
    3 First detection path
    4 First detector (slowcam)
    5 First microlens array
    6 Second detection path
    7 Second detector (fastcam)
    8 Second microlens array
    9 Optical lens
    10 Evaluation unit
    11 Microscope
    12 Control unit
    13 Display
    14 Field stop
    15 Light source
    16 Color splitter
    17 Scanning apparatus
    18 Objective
    19 Light sheet
    20 Sample space
    21 Settable optical means
    22 Beam splitter (for confocal beam path)
    23 Pinhole
    EP Entrance pupil
    oA Optical axis
    nBE Nominal image plane