APPARATUS AND METHOD FOR CAPTURING IMAGE DATA
20220137383 · 2022-05-05
CPC classification
G02B21/0072 (PHYSICS)
G02B21/18 (PHYSICS)
G02B21/367 (PHYSICS)
G02B21/008 (PHYSICS)
Abstract
An apparatus includes a detection beam path, along which detection radiation is guided, and a means for splitting the detection radiation between first and second detection paths, with a detector being in each detection path. A microlens array is arranged upstream of at least one detector. The first detector has a first spatial resolution, and the second detector has a second spatial resolution that is lower than the first spatial resolution. Also, the first detector has a first temporal resolution, and the second detector has a second temporal resolution that is higher than the first temporal resolution. Captured image data are computationally combined to form a three-dimensionally resolved resulting image.
Claims
1. Apparatus (1) for capturing image data, comprising: a detection beam path, along which detection radiation of at least one microscope (11) is guided or is guidable; and a means for splitting the detection radiation (2) between a first detection path (3) and a second detection path (6); a first detector (4) in the first detection path (3) and a second detector (7) in the second detection path (6), wherein a microlens array (5, 8) is arranged upstream of at least one of the detectors (4, 7); wherein the first detector (4) has a first spatial resolution and the second detector (7) has a second spatial resolution, and the first spatial resolution is higher than the second spatial resolution; and/or the first detector (4) has a first temporal resolution and the second detector (7) has a second temporal resolution, wherein the first temporal resolution is lower than the second temporal resolution, characterized in that an evaluation unit (10) for evaluating the captured image data of the first and second detectors (4, 7) is present, wherein the evaluation unit (10) is configured in a manner such that the evaluation of the image data of both detectors (4, 7) takes place and a three-dimensionally resolved resulting image is produced from said evaluated image data.
2. Apparatus according to claim 1, characterized in that the first detector (4) has a higher spatial resolution than the second detector (7); a pinhole (23) is present in the first detection path (3) in an intermediate image plane and optically upstream of the first detector (4), with the result that confocal capturing of the detection radiation takes place by means of the first detector (4); and a microlens array (8) is arranged upstream of the second detector (7).
3. Apparatus (1) according to claim 1, characterized in that the first spatial resolution is higher than the second spatial resolution at least by a factor of 1.5, and the first temporal resolution is lower than the second temporal resolution at least by a factor of 2.
4. Apparatus (1) according to claim 1, characterized in that the means for splitting the detection radiation (2) is a beam splitter, a dichroic beam splitter, or a switchable mirror.
5. Microscope (11) comprising: a light source (15); a detection beam path, along which detection radiation of at least one microscope (11) is guided or is guidable; and a means for splitting the detection radiation (2) between a first detection path (3) and a second detection path (6); a first detector (4) in the first detection path (3) and a second detector (7) in the second detection path (6), wherein a microlens array (5, 8) is arranged upstream of at least one of the detectors (4, 7); wherein the first detector (4) has a first spatial resolution and the second detector (7) has a second spatial resolution, and the first spatial resolution is higher than the second spatial resolution; and/or the first detector (4) has a first temporal resolution and the second detector (7) has a second temporal resolution, wherein the first temporal resolution is lower than the second temporal resolution, characterized in that an evaluation unit (10) for evaluating the captured image data of the first and second detectors (4, 7) is present, wherein the evaluation unit (10) is configured in a manner such that the evaluation of the image data of both detectors (4, 7) takes place and a three-dimensionally resolved resulting image is produced from said evaluated image data.
6. Microscope (11) according to claim 5, characterized in that the light source (15), in particular a laser light source (15), and an objective (18) functioning as an illumination objective are present in an illumination beam path, wherein widefield illumination is generated.
7. Microscope (11) according to claim 5, characterized in that the light source (15), an objective (18) functioning as an illumination objective, and an apparatus for generating a light sheet (19) are present in an illumination beam path, wherein the light sheet (19) is generated on the object side upstream of the objective (18) in a sample space (20).
8. Microscope (11) according to claim 6, characterized in that the light source (15) is embodied for providing pulsed illumination light, in particular having pulse durations in the picosecond or femtosecond range.
9. Microscope (11) according to claim 7, characterized in that the apparatus for generating a light sheet (19) is a cylindrical optical unit or a scanning apparatus (17), wherein an illumination light beam of the light source (15) that is shaped due to the effect of the cylindrical optical unit or an illumination light beam of the light source (15) that is deflected by means of the scanning apparatus (17) is directed into an entrance location in an objective pupil (EP) of the objective (18), said entrance location lying outside of the optical axis (oA) of the objective (18).
10. Microscope (11) according to claim 5, characterized in that settable optical means (21) are present in the illumination beam path, the effect of which is that a thickness of the light sheet (19) transversely with respect to a light sheet plane is settable.
11. Method for capturing image data, in which detection radiation of at least one microscope (11) is split between a first detection path (3) and a second detection path (6), wherein a microlens array (5, 8) is present at least in one of the two detection paths (3, 6) and is arranged optically upstream of a detector (4, 7) that is likewise present there; in the first detection path (3), the detection radiation is captured by means of a first detector (4) with a first temporal resolution and a first spatial resolution, in the second detection path (6), the detection radiation is captured by means of a second detector (7) with a second temporal resolution and a second spatial resolution, wherein the first temporal resolution is lower than the second temporal resolution and/or the first spatial resolution is higher than the second spatial resolution, and the captured image data of both detectors (4, 7) are computationally combined to form a three-dimensionally resolved resulting image.
12. Method according to claim 11, characterized in that image data of the higher spatial resolution or image data of the higher temporal resolution are used for computationally increasing the spatial resolution and/or the temporal resolution of the image data with the lower spatial resolution or the lower temporal resolution, respectively.
13. Method according to claim 11, characterized in that in each case only a selected portion of the detector elements of the second detector (7) is read out.
14. Method according to claim 11, characterized in that the detection radiation is guided temporally alternately along the first detection path (3) and the second detection path (6), wherein the switching time points and the time durations of the switching are established on the basis of the frame rate of the first detector (4).
15. Method according to claim 11, characterized in that the computational combination of the image data and/or the combination of the image data or of the images to form a resulting image are/is performed with the application of machine learning, in particular with the application of CNNs (convolutional neural networks).
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0078] The invention will be described in more detail below on the basis of exemplary embodiments and figures.
DETAILED DESCRIPTION
[0086] A general setup of an apparatus 1 according to the invention has along a beam path a means 2 for splitting detection radiation in the form of a beam splitter (beam splitter 2), the effect of which is to split detection radiation between a first detection path 3, having a first detector 4 and, arranged upstream thereof, a first microlens array 5, and a second detection path 6, having a second detector 7 and, arranged upstream thereof, a second microlens array 8. The microlens arrays 5 and 8 are each arranged in a pupil. If optical lenses 9 are specified in the exemplary embodiments, they optionally also stand for corresponding combinations of optical elements (lens systems).
[0087] The first detector 4 allows a first spatial resolution that is higher than the spatial resolution of the second detector 7. A temporal resolution of the first detector 4 (slowcam) is lower than the temporal resolution of the second detector 7 (fastcam). In further embodiments, the first and second detectors 4 and 7 can also be arranged in the respectively other detection path 3 or 6.
[0088] Detection radiation that comes from a microscope 11 and is focused due to the effect of a tube lens 9TL passes through an optional field stop 14 in an intermediate image plane, passes to an optical lens 9, and is split due to the effect of the beam splitter 2 between the first detection path 3 and the second detection path 6. The pupil plane of the microscope 11, in particular a pupil plane (back focal plane) of the microscope objective 18, is imaged into the planes of the microlens arrays 5 and 8.
[0089] The image data captured by the detectors 4, 7 are fed to an evaluation unit 10 in the form of a computer or an FPGA. The evaluation unit 10 is configured such that the captured image data are evaluated taking into account the location information, the angle information, and the intensity values. For example, either the portions of the angle information captured per detector 4, 7 are computationally combined as image data to form in each case a three-dimensionally resolved image, and these images are subsequently combined to form a resulting image, or the portions of the angle information captured by both detectors 4, 7 are combined as image data to form a three-dimensionally resolved resulting image.
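The first of the two combination variants described above can be sketched, purely for illustration, as a weighted fusion of two already reconstructed three-dimensional image stacks. All array shapes, the nearest-neighbour upsampling, and the simple averaging rule are assumptions for the sketch, not the patented evaluation algorithm:

```python
import numpy as np

def fuse_volumes(vol_slow, vol_fast, w_slow=0.5):
    """Illustrative fusion of two 3D stacks reconstructed from the
    slowcam (detector 4) and the fastcam (detector 7).
    Weighting and upsampling scheme are assumed."""
    # Bring the lower-resolution fastcam volume onto the slowcam grid
    # (nearest-neighbour repeat; a real system would interpolate).
    factor = vol_slow.shape[1] // vol_fast.shape[1]
    vol_fast_up = vol_fast.repeat(factor, axis=1).repeat(factor, axis=2)
    return w_slow * vol_slow + (1.0 - w_slow) * vol_fast_up

vol_slow = np.ones((8, 64, 64))    # high spatial resolution stack
vol_fast = np.zeros((8, 32, 32))   # lower spatial resolution stack
result = fuse_volumes(vol_slow, vol_fast)
print(result.shape)                # (8, 64, 64)
```

The averaging here merely stands in for the evaluation unit 10; any fusion rule operating on the two stacks could take its place.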
[0090] The evaluation unit 10 is optionally connected to a display 13, for example a monitor, on which the image data and/or the resulting image or a resulting image stack are representable. Moreover, the evaluation unit 10 is optionally connected to a control unit 12, which can in particular be a constituent part of the microscope 11. In further possible embodiments, the control unit 12 is not an integral part of the microscope 11, but can be connected to the latter in a manner suitable for transmitting data.
[0091] The control unit 12 is configured for generating control signals on the basis of results of the evaluation unit 10. Said control signals can serve for controlling functions of the microscope 11.
[0092] In a further exemplary embodiment of the apparatus 1, the microlens arrays 5, 8 are arranged in each case in a nominal image plane nBE.
[0093] The third exemplary embodiment of the apparatus 1 likewise has the microlens arrays 5 and 8 in each case in a nominal image plane nBE.
[0094] Further exemplary embodiments are illustrated in the figures.
[0095] In an exemplary embodiment, the apparatus 1 according to the invention is integrated in a microscope 11.
[0096] An optical lens 9 and the beam splitter 2, by means of which detection radiation is directed along the first detection path 3 with the first microlens array 5 and the first detector 4 and/or along the second detection path 6 with the second microlens array 8 and the second detector 7, are arranged in a detection beam path (symbolized by dashed lines). The detectors 4 and 7 are connected to the evaluation unit 10, and the latter is connected to the control unit 12 in a manner suitable for exchanging data. By means of the control unit 12, it is possible to generate control commands that serve for controlling the scanning apparatus 17 (henceforth also: scanner 17). In further embodiments, the light source 15 can also be controlled by the control unit 12.
[0097] During the operation of the microscope 11 comprising the apparatus 1 according to the invention, laser light emitted by the laser light source 15 is focused and passes to the scanning apparatus 17. The scanning apparatus 17, which is controlled by the control unit 12, deflects the laser light in a controlled manner in an x-direction x and/or in a y-direction y. The scanning apparatus 17 can be used to vary the angle of incidence and an entrance location of the excitation light in the entrance pupil EP (objective pupil).
[0098] The excitation light, after passing through the dichroic color splitter 16, is directed into an entrance location in the entrance pupil EP that lies away from the optical axis oA of the objective 18. As a result, a light sheet 19 which is inclined with respect to the optical axis oA in a correspondingly inclined light sheet plane is generated on the object side by way of the objective 18. If a sample is located in a sample space 20 upstream of the objective 18, the light sheet 19 can be directed into said sample.
[0099] Optionally settable optical means 21, such as a zoom optical unit or a stop, for example, the effect of which is that a thickness of the light sheet 19 transversely with respect to the light sheet plane is settable (only shown by indication), can be present in the excitation beam path (=illumination beam path). The settable optical means 21 can be controlled by means of the control unit 12.
[0100] Due to the effect of the light sheet 19 formed from the excitation light, fluorescence can be excited in the sample and be emitted as detection light (detection radiation). Emitted detection light is collected by the objective 18, which serves both as the illumination objective and as the detection objective. In the color splitter 16, the detection light, which has a longer wavelength than the excitation light, is reflected into the further course of the detection beam path and passes via the beam splitter 2 to the first microlens array 5 and/or the second microlens array 8. The microlenses, shown by indication, can be regarded as individual imaging systems. The image points brought about by the individual microlenses are captured as image data by correspondingly positioned detector elements of the detectors 4 and 7, respectively, and are fed to the evaluation unit 10.
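The capture geometry described above — each microlens of array 5 or 8 acting as an individual imaging system that forms a small image on its own patch of detector elements — can be indicated by the following sketch. The regular square lenslet layout and the pixel count per lenslet are assumptions for illustration:

```python
import numpy as np

def split_into_subimages(frame, lenslet_px):
    """Cut a raw sensor frame into the sub-images formed by the
    individual microlenses (square lenslet grid assumed)."""
    h, w = frame.shape
    ny, nx = h // lenslet_px, w // lenslet_px
    # Result is indexed by lenslet (i, j), then by pixel within
    # that lenslet's sub-image.
    return frame[:ny * lenslet_px, :nx * lenslet_px].reshape(
        ny, lenslet_px, nx, lenslet_px).swapaxes(1, 2)

frame = np.arange(64, dtype=float).reshape(8, 8)  # toy 8x8 sensor frame
subs = split_into_subimages(frame, lenslet_px=4)
print(subs.shape)  # (2, 2, 4, 4): 2x2 lenslets, 4x4 pixels each
```

Each sub-image carries the angle information for its location, which is what the evaluation unit 10 exploits for the three-dimensional reconstruction.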
[0101] In one further possible embodiment of the invention, a pinhole 23 is arranged in the first detection path 3 in an intermediate image plane optically upstream of the first detector 4, with the result that confocal capturing of the detection radiation takes place by means of the first detector 4.
[0102] Due to the effect of the beam splitter 2, 16, the second detection path 6 receives detection radiation that is imaged onto the second detector 7 and captured. A microlens array 8 is arranged upstream of the second detector 7. In further embodiments, the beam splitter 2, 16 can be replaced by a switchable mirror, for example. The image data captured by means of the first detector 4 and the second detector 7 are combined by the evaluation unit 10 and a three-dimensional resulting image is computed.
[0103] The method according to the invention can be carried out in two alternative configurations. In the first alternative, the image data captured in each of the detection paths 3, 6 are first combined to form in each case a three-dimensionally resolved image, and these images are subsequently combined to form a resulting image.
[0104] In an alternative configuration of the method, the captured image data of the slowcam and of the fastcam are combined to form a resulting three-dimensionally resolved image without previously generating at least one three-dimensionally resolved image for each of the detection paths 3, 6 or for each of the detectors 4, 7.
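The temporal side of the joint combination can be sketched as follows: for each high-rate fastcam frame, the nearest slowcam frame is selected and blended in directly, without a per-detector reconstruction step. The timing model, nearest-frame selection, and equal weighting are purely illustrative assumptions:

```python
import numpy as np

def fuse_streams(slow_frames, slow_times, fast_frames, fast_times):
    """Illustrative joint combination of a slow, high-spatial-resolution
    stream with a fast, low-spatial-resolution stream (rules assumed)."""
    out = []
    for t, fast in zip(fast_times, fast_frames):
        # Pick the slowcam frame closest in time to this fastcam frame.
        i = int(np.argmin([abs(t - ts) for ts in slow_times]))
        slow = slow_frames[i]
        k = slow.shape[0] // fast.shape[0]
        fast_up = fast.repeat(k, axis=0).repeat(k, axis=1)
        out.append(0.5 * (slow + fast_up))
    return np.stack(out)

slow = [np.full((16, 16), 1.0), np.full((16, 16), 3.0)]
fast = [np.full((8, 8), v) for v in (0.0, 2.0, 4.0, 6.0)]
res = fuse_streams(slow, [0.0, 1.0], fast, [0.0, 0.4, 0.6, 1.0])
print(res.shape)  # (4, 16, 16): fastcam frame rate, slowcam pixel grid
```

The output sequence inherits the fastcam's temporal resolution while drawing spatial detail from the slowcam, which is the trade-off the two-detector arrangement is built around.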
[0105] In further configurations of the method, the image data captured by means of the first detector 4 and the second detector 7 or respectively images calculated therefrom can be mapped to one another using a CNN (convolutional neural network) and a three-dimensionally resolved resulting image can be calculated.
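The CNN-based mapping mentioned above can be indicated, very schematically, by a single convolution-plus-ReLU stage implemented in plain NumPy. The kernel size, the mean-filter weights, and the single-layer structure are illustrative assumptions; an actual system would use a trained multi-layer network:

```python
import numpy as np

def conv2d(img, kernel):
    """Minimal 'valid' 2D convolution, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Standard CNN nonlinearity."""
    return np.maximum(x, 0.0)

# One conv + ReLU stage mapping a fastcam frame toward the slowcam
# representation; real weights would be learned, these are a 3x3 mean.
kernel = np.full((3, 3), 1.0 / 9.0)
frame = np.ones((10, 10))
feature = relu(conv2d(frame, kernel))
print(feature.shape)  # (8, 8)
```

A trained network would stack many such layers and learn the mapping between the two detectors' image data from example pairs.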
LIST OF REFERENCE SIGNS
[0106] 1 Apparatus
[0107] 2 Means for splitting the detection radiation/beam splitter
[0108] 3 First detection path
[0109] 4 First detector (slowcam)
[0110] 5 First microlens array
[0111] 6 Second detection path
[0112] 7 Second detector (fastcam)
[0113] 8 Second microlens array
[0114] 9 Optical lens
[0115] 10 Evaluation unit
[0116] 11 Microscope
[0117] 12 Control unit
[0118] 13 Display
[0119] 14 Field stop
[0121] 15 Light source
[0120] 16 Color splitter
[0122] 17 Scanning apparatus
[0123] 18 Objective
[0124] 19 Light sheet
[0125] 20 Sample space
[0126] 21 Settable optical means
[0127] 22 Beam splitter (for confocal beam path)
[0128] 23 Pinhole
[0129] EP Entrance pupil
[0130] oA Optical axis
[0131] nBE Nominal image plane