Scanning microscope using pulsed illumination and MSIA
11143855 · 2021-10-12
Assignee
Inventors
- A. E. Dixon (Waterloo, CA)
- Savvas Damaskinos (Waterloo, CA)
- Alfonso Ribes (Waterloo, CA)
- Jasper Hayes (Waterloo, CA)
CPC classification
H04N25/711
ELECTRICITY
G02B21/16
PHYSICS
G02B21/0032
PHYSICS
H04N25/13
ELECTRICITY
H04N23/10
ELECTRICITY
G02B21/0084
PHYSICS
H04N23/74
ELECTRICITY
H04N23/125
ELECTRICITY
G02B21/367
PHYSICS
International classification
Abstract
According to one aspect, an instrument for scanning a specimen is provided. The instrument includes a scanning stage for supporting the specimen, a detector having a plurality of pixels, the scanning stage and the detector movable relative to each other to move the specimen in a scan direction during a scan, and a pulsed illumination source synchronized with the motion of the specimen on the scanning stage. At least some of the pixels of the detector are operable to collect light emitted from the specimen during the scan due to the pulsed illumination source and generate corresponding image data. The instrument may further include a processor operable to perform MSIA on the image data to generate an image of the specimen.
Claims
1. An instrument for scanning a specimen, comprising: a. a scanning stage for supporting the specimen; b. a detector having a plurality of pixels, the scanning stage and the detector movable relative to each other to move the specimen in a scan direction at a constant scan speed during a scan; c. a pulsed illumination source synchronized with the motion of the specimen on the scanning stage; and d. a processor operable to perform Moving Specimen Image Averaging (MSIA) on image data to generate an image of the specimen; e. wherein at least some of the pixels of the detector are operable to collect light emitted from or reflected by the specimen during the scan due to the pulsed illumination source; and f. wherein a color of the pulsed illumination source is changed each time the specimen has moved a distance that causes the image of the specimen projected onto the detector to move, at a constant image speed with respect to the detector, a distance equal to the distance between lines of pixels in the detector.
2. The instrument of claim 1, wherein the detector is a monochrome detector.
3. The instrument of claim 1, wherein the pulsed illumination source provides illumination of one or more colors.
4. The instrument of claim 1, wherein the pulsed illumination source provides sequential R, G and B illumination.
5. The instrument of claim 1, wherein the pulsed illumination source is configured for brightfield imaging.
6. The instrument of claim 1, wherein the pulsed illumination source is configured for fluorescence imaging.
7. The instrument of claim 1, wherein the detector is a color area detector.
8. The instrument of claim 7, wherein the color area detector array incorporates a filter array.
9. The instrument of claim 8, wherein the colors in the filter array are chosen to transmit spectrally-resolved light such that a final image is a spectrally-resolved or multispectral image of the specimen.
10. The instrument of claim 8, wherein the filter array includes one or more of an RGGB Bayer filter, a mosaic scan filter array, or a scan color filter array.
11. The instrument of claim 7, wherein the color area detector array incorporates one of a mosaic scan filter array or a scan color filter array, and the colors in the filter array are chosen to transmit spectrally-resolved light such that a final image is a spectrally-resolved or multispectral image of the specimen.
12. The instrument of claim 1, wherein the pulsed illumination source provides pulsed white light.
13. The instrument of claim 1, wherein an exposure time for each image frame is set using a camera's internal or external shutter.
14. The instrument of claim 1, wherein an exposure time for each image frame is set by a pulse width of the pulsed illumination source.
15. The instrument of claim 1, wherein the instrument scans the specimen in one of brightfield and fluorescence.
16. The instrument of claim 1, wherein the specimen is illuminated from below by a light source.
17. The instrument of claim 1, wherein the specimen is illuminated from above by a light source.
18. The instrument of claim 1, wherein the detector comprises a plurality of active areas of pixels, each active area acting as a separate MSIA detector.
19. The instrument of claim 1, wherein the pulsed illumination source provides illumination in three or more colors.
20. The instrument of claim 1, wherein the pulsed illumination source provides illumination in four or more colors.
21. The instrument of claim 1, wherein distortion is corrected on an active area of rows of pixels of the detector.
22. The instrument of claim 18, wherein the active areas are near the center of the detector.
23. The instrument of claim 1, wherein a valid line of image data is defined as having been exposed at least twice for each color emitted by the pulsed illumination source.
24. The instrument of claim 1, wherein a valid line of image data is defined as having been exposed at least four times for each color emitted by the pulsed illumination source.
25. The instrument of claim 1, wherein MSIA occurs with a region of interest greater than 2 lines.
26. The instrument of claim 1, wherein MSIA occurs with a region of interest greater than n lines.
Description
BRIEF DESCRIPTION OF THE DIAGRAMS
DESCRIPTION OF VARIOUS EMBODIMENTS
(15) Turning now to
(16) In some cases, the tissue specimen 100 may be a biological specimen, which is commonly covered with a transparent cover slip (not shown).
(17) Light passing through the specimen is collected by infinity-corrected microscope objective 115 which is focused on the specimen by motorized positioner 120. The microscope objective 115 and tube lens 125 form a real image of the specimen on a monochrome area detector array 304, inside camera 305.
(18) An image of the specimen is collected by moving the microscope slide at constant speed using motorized stage 105 (controlled by computer 320) through connection 340 (or other communication mechanism including Wi-Fi or Bluetooth) in a direction perpendicular to the rows of detector pixels of the monochrome detector array 304 in camera 305.
(19) The shutter in camera 305 (which may be part of detector array 304) is actuated by frame grabber 310, which is controlled by computer 320 through connection 315 (or other communication mechanism). The shutter is open for only a short time for each exposure, for example for the time it takes for the image projected onto the detector array by tube lens 125 to move a distance across the detector array that is less than or equal to half the distance between adjacent rows in the array.
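The shutter-timing bound described above (the image may travel at most half the row pitch during one exposure) can be expressed numerically. The sketch below is illustrative only; the pixel pitch and image speed are assumed values, not parameters from the patent.

```python
def max_exposure_s(pixel_pitch_um: float, image_speed_um_per_s: float) -> float:
    """Maximum shutter-open time so that the image projected onto the
    detector array moves at most half the distance between adjacent rows
    during a single exposure."""
    return (pixel_pitch_um / 2.0) / image_speed_um_per_s

# Illustrative values: 3.45 um pixel pitch, image moving 10 000 um/s
print(max_exposure_s(3.45, 10_000.0))  # -> 0.0001725 s (172.5 us)
```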
(20) Computer 320 combines a sequence of contiguous area images from the array to construct an image of one strip across the specimen, as shown in
(21) Turning now to
(22) In
(23) When the entire array is used for imaging, geometric distortion makes it difficult to assemble the strip image, and a distortion correction is commonly performed on each frame image (i.e., using software) before assembling the frame images to produce image strips.
(24) One technique to avoid the effect of geometric distortion without using a software geometric distortion correction when assembling an image strip is to define a small detector area of interest that has the width of the detector (all columns are included) but only includes a small number of rows near the center of the array (64 rows for example), and to assemble the image strip using, for this example, frames that are 4000 columns wide but only 64 rows high.
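The distortion-avoiding area of interest described above amounts to cropping each frame to a band of central rows while keeping every column. A minimal sketch, assuming a hypothetical 3000×4000 sensor and NumPy arrays for the frame data:

```python
import numpy as np

def central_rows(frame: np.ndarray, n_rows: int) -> np.ndarray:
    """Return an active area of n_rows rows centered vertically in the
    frame (where geometric distortion is smallest), keeping all columns."""
    top = (frame.shape[0] - n_rows) // 2
    return frame[top:top + n_rows, :]

frame = np.zeros((3000, 4000), dtype=np.uint16)  # illustrative sensor size
roi = central_rows(frame, 64)
print(roi.shape)  # -> (64, 4000)
```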
(27) At the same time, the shutter in camera 305 is actuated, for example by the falling edge of the camera trigger signal at the top of
(28) This particular example matches the scan example shown in
(29) Turning now to
(30) Notably, only one pass is needed to generate an RGB strip image. The illumination color cycles from red in frame 1, to green in frame 2, to blue in frame 3, back to red in frame 4, and so on. A ⅔ overlap between frames is very efficient (it minimizes the required frame rate and illumination pulse rate). Considering only one color at a time, the capture scheme is the same as
(31) In this example, the exposure time is set by the camera shutter, and is typically less than or equal to the time for the specimen to move a distance such that the image of the specimen projected onto the detector array moves a distance equal to 1/10 of the distance between adjacent pixel positions on the array.
(32) Since the frame images of the three colors overlap, only one color has been measured in the first set of lines in the strip image (in this case corresponding to the top ⅓ of a frame), and data for the other two colors will have been acquired in the last ⅓ of a frame. The bold black arrow indicates the first row where all colors are available. In other words, a series of lines at the beginning of each image strip will be invalid (will not contain data for all 3 colors).
(33) From a computer memory point of view each color can be accumulated separately. Frame 1 (red) goes into the ‘red’ memory, frame 2 into the ‘green’ memory, etc. Every third frame goes back into the same memory (e.g. frames 1, 4, 7, etc. go back into the ‘red’ memory).
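The per-color memory routing described above can be sketched as follows; the frame contents and array sizes are placeholders for illustration.

```python
import numpy as np

COLORS = ("red", "green", "blue")

def accumulate(frames):
    """Route a cyclic R, G, B frame sequence into per-color accumulators:
    frames 1, 4, 7, ... go to the 'red' memory; frames 2, 5, 8, ... to
    'green'; frames 3, 6, 9, ... to 'blue'."""
    memory = {c: [] for c in COLORS}
    for i, frame in enumerate(frames):  # i = 0 corresponds to frame 1
        memory[COLORS[i % 3]].append(frame)
    return memory

# Six placeholder frames, each filled with its frame number 1..6
frames = [np.full((4, 4), k, dtype=np.uint16) for k in range(1, 7)]
mem = accumulate(frames)
print([int(f[0, 0]) for f in mem["red"]])    # -> [1, 4]
print([int(f[0, 0]) for f in mem["green"]])  # -> [2, 5]
```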
(34) The key advantage of a single pass scheme (
(35) The single pass scheme shown in
(37) Turning now to
(38) In
(39) As before, the 8H×12V detector area is used for simple illustration only; commonly sensors with 3000 rows and 4000 columns are used, for example (and geometric distortion correction of each image frame is commonly performed in software before assembling the image strips). A scan velocity of 4 lines/sec has been chosen for this example, but higher and lower scan velocities may commonly be used.
(40) Often only an active area of rows of pixels near the center of the sensor is used so that no geometric distortion correction is necessary (when using an active area of 128 rows×4000 columns, for example, the frame rate of the detector array is higher than when the entire detector array is used, so that scan speed of the scanning instrument can be maintained even though a smaller number of lines of image pixel data are collected in each frame).
(41) As before, during scanning there is typically specimen motion only in the same direction as the columns of pixels in the detector array; frame image data has been moved to the right in the diagram for illustration purposes only.
(42) Combining Pulsed Illumination with Moving Specimen Image Averaging (MSIA):
(44) Note that each small square in a frame in this figure represents one entire row of pixels. Frame 1 represents the first frame of data collected with R illumination. Frame 2 represents the first frame with G illumination (with the frame moved to the right in the diagram for illustration; however, the pixel data represented by Frame 2 lies directly on top of the data for Frame 1, just translated 2 rows in the scan direction).
(45) Frame capture continues as the scan proceeds, but note that when the data for Frame 6 is acquired, all pixels in the last two rows of pixels of Frame 1 and the first two rows of Frame 6 have been exposed twice in R, twice in B, and twice in G.
(46) Frame 6 is the first frame in the scan in which the first two lines have been exposed six times, resulting in the first two lines of MSIA data in the scanned image. These can be called the first two “valid” lines of data, since they are the first lines in the strip image that have been fully exposed (twice in R, twice in G, and twice in B).
(47) When data from the first pixel of line 11 in Frame 1 is averaged with the data from the first pixel in line 5 of Frame 4, the result is an averaged value for R in the pixel at top left of the final strip image. The G value for that same pixel is the average of the pixel data exposed in G from the left pixel position in Frame 2, line 9 and Frame 5, line 3. The B value for that same pixel position comes from averaging the data from the left pixel position in Frame 3, line 7, and Frame 6, line 1.
(48) As the scan continues, an image strip is stored in computer memory in which each image pixel contains R, G and B information, and each color value is the result of averaging two measurements. This results in pixels that have Signal/Noise increased by the square root of the number of measurements averaged (in this case the square root of 2, so the increase in S/N is a factor of 1.4). Generally, all of the measurements to acquire an RGB image strip are made in a single scan, and an image of the entire specimen can be assembled by butting together strips from a series of scans.
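The MSIA averaging step and the resulting square-root S/N gain can be illustrated with a small sketch; the pixel values below are invented for illustration only.

```python
import math
import numpy as np

def msia_average(exposures):
    """Average n repeated exposures of the same image line. For
    uncorrelated noise, the expected S/N gain over a single exposure
    is the square root of the number of measurements averaged."""
    stack = np.stack(exposures).astype(np.float64)
    return stack.mean(axis=0), math.sqrt(len(exposures))

# Two red exposures of the same image line (illustrative pixel values)
line_a = np.array([100.0, 102.0, 98.0])
line_b = np.array([102.0, 98.0, 102.0])
avg, gain = msia_average([line_a, line_b])
print(avg.tolist(), round(gain, 2))  # -> [101.0, 100.0, 100.0] 1.41
```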
(50) Frame 1 represents the first frame of data collected with R illumination. Frame 2 represents the first frame with G illumination (with the frame moved to the right in the diagram for illustration; however, the pixel data represented by Frame 2 lies directly on top of the data for Frame 1, just translated 1 row in the scan direction). Frame capture continues as the scan proceeds, but note that when the data for Frame 12 is acquired, all pixels in the last row of pixels of Frame 1 and the first row of Frame 12 have been exposed four times in R, four times in G, and four times in B. Frame 12 is the first frame in the scan in which the first line has been exposed 12 times, resulting in the first line of MSIA data in the scanned image.
(51) We call this the first “valid” line of data, since this is the first line in the strip image that has been fully exposed (4 times in R, 4 times in G, and 4 times in B). The four R values are averaged, the four G values are averaged, and the four B values are averaged for each pixel in that line of data, resulting in the first line of RGB pixel values in the strip image.
(52) As the scan continues, an image strip is stored in computer memory in which each image pixel contains R, G and B information, and each color value is the result of averaging four measurements. This results in pixels that have Signal/Noise increased by the square root of the number of measurements averaged (in this case the square root of 4, so the increase in S/N is a factor of 2). All of the measurements to acquire an RGB image strip are made in a single scan, and an image of the entire specimen can be assembled by butting together strips from a series of scans.
(53) Commonly, we use an active area of 126 lines at the center of a 3000×4000 pixel sensor array, so each color at each pixel position in the resulting strip image can be exposed 126/3=42 times if the illumination color is changed and the shutter is triggered each time the specimen has moved a distance that causes the image of the specimen projected onto the sensor array to move a distance equal to the distance between lines of pixels in the array (resulting in a S/N increase of a factor of the square root of 42, or 6.5).
(54) As a further example using an active area of 126 lines, if the illumination color is changed and the shutter is triggered each time the specimen has moved a distance that causes the image of the specimen projected onto the sensor array to move a distance equal to twice the distance between lines of pixels in the sensor array, each pixel in the resulting strip image will be illuminated 126/6=21 times, and the increase in S/N is a factor of the square root of 21, or 4.6.
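The exposure counts and S/N factors worked out in the two examples above follow a simple relation: exposures per color equal the number of active lines divided by the number of colors times the row step per pulse. A small helper, assuming integer division as in the examples:

```python
import math

def msia_exposures(active_lines: int, n_colors: int, row_step: int):
    """Exposures per color per pixel for an MSIA scan, and the resulting
    S/N improvement factor (square root of the exposure count)."""
    n = active_lines // (n_colors * row_step)
    return n, math.sqrt(n)

print(msia_exposures(126, 3, 1))  # 126/3 = 42 exposures; S/N gain ~6.5
print(msia_exposures(126, 3, 2))  # 126/6 = 21 exposures; S/N gain ~4.6
```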
(55) Note—the active area that can be chosen for data acquisition in a particular sensor array may not match the number of lines required by the MSIA calculation. In this case, an active area that has more lines than necessary may be chosen, and data from rows of pixels at the top and bottom of the active area are commonly discarded to match the number of rows of data with the number required for the MSIA calculation.
(56) In the example described immediately above, if the sensor array allows an active area of 128 lines to be chosen, one line of data at the top and one at the bottom of each acquired frame image can be discarded to match the 126-line requirement for the MSIA calculation described above. This is the “required frame image” as used below.
(59) However, this is not always the case. For example, when the third camera trigger signal in line 1 triggers the shutter to open, the illumination time has been set for a shorter time than the time the shutter is open, so in this case the effective exposure time (line 4) has been determined by the illumination time, not the time the shutter was open.
(60) When the fourth camera trigger signal opens the camera shutter, this example shows that the camera shutter is opened before the illumination is turned on, but is closed before the illumination is turned off (i.e., there is an offset in the timing), resulting in the effective exposure time shown in line 4 which has been determined by a combination of the two events in lines 2 and 3.
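The effective exposure time described in this timing example is the overlap of the shutter-open interval and the illumination interval. A minimal sketch with hypothetical times:

```python
def effective_exposure(shutter_open, shutter_close, light_on, light_off):
    """Effective exposure is the overlap of the shutter-open interval and
    the illumination interval (zero if the intervals do not overlap)."""
    return max(0.0, min(shutter_close, light_off) - max(shutter_open, light_on))

# Illumination shorter than the shutter window: the light sets the exposure
print(effective_exposure(0.0, 10.0, 2.0, 6.0))  # -> 4.0
# Offset timing: the shutter closes before the illumination turns off
print(effective_exposure(0.0, 5.0, 2.0, 8.0))   # -> 3.0
```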
(62) In this example, each image pixel position has been exposed twice for each of the four colors, the two measurements for each color are averaged together, and the final strip image is comprised of four colors for each pixel position, each of which has its S/N increased by a factor of 1.4 because of averaging (this is an MSIA=2 image).
(63) The microscope in
(65) In some cases, the tissue specimen 1201 may be a biological specimen, which is commonly stained with a fluorescent stain (including those containing quantum dots) and is commonly covered with a transparent cover slip (not shown). Light from the illumination source 1200 is reflected toward specimen 1201 by beamsplitter 1210 and passes through microscope objective 115 to illuminate the area of the specimen being frame imaged.
(66) Fluorescent light emitted from the specimen is collected by infinity-corrected microscope objective 115 which is focused on the specimen by motorized positioner 120, passes through beamsplitter 1210 and fluorescence emission filter 1220. Fluorescence emission filter 1220 is chosen to pass the fluorescence wavelengths emitted by a fluorophore (and/or other source of fluorescence) in the specimen. The microscope objective 115 and tube lens 125 form a real image of the specimen on a monochrome area detector array 304, inside camera 305. An image of the specimen is collected by moving the microscope slide at constant speed using motorized stage 105 (controlled by computer 320) through connection 340 (or other communication mechanism including Wi-Fi or Bluetooth) in a direction perpendicular to the rows of detector pixels of the monochrome detector array 304 in camera 305.
(67) The shutter in camera 305 (which may be part of detector array 304) is actuated by frame grabber 310 in synchronism with the position of the moving stage, which is controlled by computer 320 through connection 315 (or other communication mechanism). The shutter is open for only a short time for each exposure, for example for the time it takes for the image projected onto the detector array by tube lens 125 to move a distance across the detector array that is less than or equal to half the distance between adjacent rows in the array.
(68) Computer 320 combines a sequence of contiguous frame images from the array to construct an image of one strip across the specimen. When the instrument is configured for MSIA imaging, each image pixel position is imaged more than once, and the pixel data is averaged (or added, for weak fluorophores). Strips are then assembled to form a complete fluorescence image of the specimen.
(69) When there is more than one fluorophore in the specimen, multiple fluorophores can be imaged in a single scan by changing the color of the illumination from light source 1200 (setting the excitation wavelength for a particular fluorophore) and changing emission filter 1220 (setting the emission wavelength for that fluorophore); both changes can be made at the same time and synchronized with the scan. In some cases, a single beam splitter and emission filter combination may be used with multiple excitation wavelengths, enabling the imaging system to acquire multiple fluorophores merely by changing the color of the excitation source only. When the instrument is configured for MSIA imaging, each image pixel position for each fluorophore is imaged more than once, and the pixel data for each fluorophore is averaged (or added, for weak fluorophores).
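The synchronized cycling of excitation color and emission filter can be sketched as a per-frame schedule; the fluorophore names and wavelengths below are illustrative assumptions, not values from the patent.

```python
# Hypothetical per-frame schedule for imaging two fluorophores in a single
# scan: each frame cycles the excitation color together with its matching
# emission filter setting.
FLUOROPHORES = [
    {"name": "DAPI", "excitation_nm": 385, "emission_filter_nm": 450},
    {"name": "FITC", "excitation_nm": 490, "emission_filter_nm": 525},
]

def frame_schedule(n_frames: int):
    """Yield (frame number, fluorophore settings), cycling through the
    fluorophore list so both changes stay synchronized with the scan."""
    for i in range(n_frames):
        yield i + 1, FLUOROPHORES[i % len(FLUOROPHORES)]

for frame, f in frame_schedule(4):
    print(frame, f["name"], f["excitation_nm"], f["emission_filter_nm"])
```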
(70) When the instrument shown in