System and method for determining yeast cell viability and concentration
11320362 · 2022-05-03
Assignee
Inventors
- Aydogan Ozcan (Los Angeles, CA)
- Alborz Feizi (Walnut Creek, CA, US)
- Alon Greenbaum (West Hollywood, CA, US)
CPC classification
G01N2015/1454
PHYSICS
G06F18/2148
PHYSICS
International classification
Abstract
A lens-free microscope system for automatically analyzing yeast cell viability in a stained sample includes a portable, lens-free microscopy device that includes a housing containing a light source coupled to an optical fiber, the optical fiber spaced several centimeters away from an image sensor disposed at one end of the housing, wherein the stained sample is disposed on the image sensor or a sample holder adjacent to the image sensor. Hologram images are transferred to a computing device having image processing software contained therein, the image processing software identifying yeast cell candidates of interest from back-propagated images of the stained sample, whereby a plurality of spatial features are extracted from the yeast cell candidates of interest and subjected to a trained machine learning model to classify the yeast cell candidates of interest as live or dead.
Claims
1. A method for automatically analyzing yeast cell viability in a stained sample comprising: loading the stained sample in a portable, lens-free microscopy device that comprises a housing containing a light source at one end, and an image sensor disposed at another end of the housing, wherein a stained sample is disposed on the image sensor or a sample holder disposed adjacent to the image sensor; illuminating the stained sample with the light source and capturing a hologram image of the stained sample with the image sensor; transferring the captured hologram image to a computing device operatively connected to the lens-free microscopy device, the computing device having image processing software loaded therein for automatically analyzing yeast cell viability; and wherein the image processing software automatically analyzes yeast cell viability by (i) dividing the full hologram image field-of-view into a plurality of smaller tiled field-of-views, (ii) back-propagating the tiled field-of-views to a plurality of distances (z.sub.2) from the image sensor, (iii) identifying yeast cell candidates at each z.sub.2 distance and inputting yeast cell candidates into a pre-trained machine learning model to generate a mean classification score for the tiled field-of-views as well as classifying the yeast cell candidates as stained or unstained by classification score; (iv) selecting, for each tiled field-of-view, the distance z.sub.2 with the largest mean classification score; (v) identifying the stained and unstained yeast cells in the selected tiles from (iv) for the full hologram image field-of-view.
2. The method of claim 1, wherein the image processing software outputs for display on the computing device a viability percentage for the full hologram image field-of-view or a sub-field-of-view.
3. The method of claim 1, wherein the image processing software outputs for display on the computing device one or more of stained concentration, unstained concentration, and total yeast cell concentration.
4. The method of claim 1, wherein the full hologram image field-of-view is divided into smaller tiled field-of-views that are processed in parallel or sequentially.
5. The method of claim 1, wherein the machine learning model utilizes one or more spatial features of each yeast cell candidate selected from the group consisting of yeast cell candidate's area, perimeter or circumference, maximum pixel value on a phase image, maximum pixel value on an amplitude image, minimum pixel value on the phase image, minimum pixel value on the amplitude image, mean pixel value on the phase image, mean pixel value on the amplitude image, standard deviation of pixel values on the phase image, and the standard deviation of pixel values on the amplitude image.
6. The method of claim 1, wherein the computing device comprises one of a desktop computer, laptop computer, tablet computer, or mobile phone.
7. The method of claim 1, wherein identifying the stained and unstained yeast cells in the selected tiles from (iv) for the full hologram image field-of-view comprises removing stained and unstained yeast cells having a classification score below a threshold cutoff value.
8. The method of claim 1, wherein the image processing software labels stained and unstained yeast cells in a graphical user interface that is displayed on the computing device.
9. The method of claim 1, wherein the machine learning model comprises a Support Vector Machine (SVM) learning model.
10. A method for automatically analyzing yeast cell viability in a stained sample comprising: loading the stained sample in a portable, lens-free microscopy device that comprises a housing containing a light source at one end, and an image sensor disposed at another end of the housing, wherein a stained sample is disposed on the image sensor or a sample holder disposed adjacent to the image sensor; illuminating the stained sample with the light source and capturing a hologram image of the stained sample with the image sensor; transferring the captured hologram image to a computing device operatively connected to the lens-free microscopy device, the computing device having image processing software loaded therein for automatically analyzing yeast cell viability; and wherein the image processing software automatically analyzes yeast cell viability by (i) dividing the full hologram image field-of-view into a plurality of smaller tiled field-of-views, (ii) back-propagating each of the tiled field-of-views to pre-stored distances (z.sub.2) from the image sensor obtained in an auto-focus operation, (iii) inputting yeast cell candidates from the tiled field-of-views into a pre-trained machine learning model to generate classification scores for the yeast cell candidates, (iv) identifying the stained and unstained yeast cells in the tiled field-of-views for the full hologram image field-of-view based on the classification scores.
11. The method of claim 10, wherein the image processing software outputs for display on the computing device a viability percentage for the full hologram image field-of-view or a sub-field-of-view.
12. The method of claim 10, wherein the image processing software outputs for display on the computing device one or more of stained concentration, unstained concentration, and total yeast cell concentration.
13. The method of claim 10, wherein the full hologram image field-of-view is divided into smaller tiled field-of-views that are processed in parallel or sequentially.
14. The method of claim 10, wherein the machine learning model utilizes one or more spatial features of each yeast cell candidate selected from the group consisting of yeast cell candidate's area, perimeter or circumference, maximum pixel value on a phase image, maximum pixel value on an amplitude image, minimum pixel value on the phase image, minimum pixel value on the amplitude image, mean pixel value on the phase image, mean pixel value on the amplitude image, standard deviation of pixel values on the phase image, and the standard deviation of pixel values on the amplitude image.
15. The method of claim 10, wherein the computing device comprises one of a desktop computer, laptop computer, tablet computer, or mobile phone.
16. The method of claim 10, wherein identifying the stained and unstained yeast cells in the tiled field-of-views for the full hologram image field-of-view comprises removing stained and unstained yeast cells having a classification score below a threshold cutoff value.
17. The method of claim 10, wherein the image processing software labels stained and unstained yeast cells in a graphical user interface that is displayed on the computing device.
18. The method of claim 10, wherein the machine learning model comprises a Support Vector Machine learning model.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS
(14) In one embodiment, the sample 30 is held on or within an optically transparent sample holder 32. For example, a liquid sample 30 that contains the stained yeast cells to be imaged is placed between two coverslips that define a small volume or chamber between the two coverslips where the sample is held. The sample holder 32 is placed, in one embodiment, directly on the active surface of the image sensor 18 so that the lower coverslip rests directly on the image sensor 18. In another embodiment, the sample holder 32 may be omitted entirely and the sample containing the stained yeast cells is placed directly on the active surface of the image sensor 18. Any number of types of stains may be used. In the experiments described herein, methylene blue was used. Methylene blue is used to identify live yeast cells: live cells remain unstained while dead cells are stained by the methylene blue dye.
(15) Still referring to
(16) The computing device 40 includes or is associated with a display 44 that contains a graphical user interface (GUI) 46 that is used to run samples, analyze data, and display results to the user (see
(17) For example, as seen in
(18) The image processing software 42 that is used to automatically classify candidate yeast cells 50 as stained (dead) or unstained (live) utilizes a trained machine learning model 60 as seen in
(19) To generate the SVM model 60, training data that included known cells that were stained and unstained populated the model as seen in
(20)
(21) The sub-FOV holograms are then back-propagated to a range of distances (z.sub.2) from the plane of the image sensor 18. This may occur sequentially or in parallel. For each of the back-propagated images, yeast cell candidates 50 are identified using intensity thresholding and mathematical morphology operations. For each of the yeast cell candidates 50, the ten (10) spatial features described above are extracted from the amplitude and the phase images (best seen in
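The candidate-identification and feature-extraction steps described above can be sketched as follows. This is a minimal, illustrative Python example (NumPy and SciPy), not the patented software's exact pipeline: the threshold value, the morphology operations, and the perimeter estimate are assumed choices, and the amplitude and phase images are assumed to be already reconstructed.

```python
import numpy as np
from scipy import ndimage

def extract_cell_features(amplitude, phase, threshold):
    """Identify yeast cell candidates by intensity thresholding and
    mathematical morphology, then extract the ten spatial features
    used by the classifier (illustrative sketch)."""
    # Threshold the amplitude image; cells are assumed darker than background.
    mask = amplitude < threshold
    # Binary opening removes speckle-like noise (assumed clean-up step).
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(mask)
    features = []
    for i in range(1, n + 1):
        region = labels == i
        area = int(region.sum())
        # Perimeter approximated as region pixels bordering the background.
        eroded = ndimage.binary_erosion(region)
        perimeter = int(region.sum() - eroded.sum())
        amp_px, ph_px = amplitude[region], phase[region]
        features.append([
            area, perimeter,
            ph_px.max(), amp_px.max(),    # max pixel values (phase, amplitude)
            ph_px.min(), amp_px.min(),    # min pixel values
            ph_px.mean(), amp_px.mean(),  # mean pixel values
            ph_px.std(), amp_px.std(),    # standard deviations
        ])
    return np.array(features)
```

Each row of the returned array is the ten-element feature vector for one yeast cell candidate 50, ready to be fed to the trained classifier.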
(22) In the embodiment described above, the auto-focusing operation for each tile 72 or sub-FOV is performed prior to labeling and viability calculations. In another embodiment, the image processing software 42 may already contain the optimum or ideal distances z.sub.2 for each tile 72 or sub-FOV. For example, with reference to
(23) In one preferred embodiment, prior to final labeling of candidate cells 50 as either stained or unstained, a thresholding operation is performed to remove spurious or phantom candidate cells 50 such as clumps, dust particles, and twin-image artifacts in the image. These are removed based on comparing the SVM classification score against a threshold value or distance from the decision boundary. Most of these micro-objects lie close to the decision boundary and have the lowest absolute SVM classification scores. An SVM score threshold was determined in order to exclude some of these false classifications from the viability calculations. The number of cell candidates eliminated based on this SVM classification score threshold is approximately 15% of the total number of yeast cell candidates 50 in a given FOV. These cell candidates 50 falling below the threshold cutoff are eliminated from consideration. The remaining cells 50 are classified into stained and unstained cell categories based on their SVM classification scores and are accordingly labelled on the reconstructed image.
Experimental
(24) A lens-free microscope system 10 was tested to demonstrate how a compact and cost-effective microscope imaging platform can be used to rapidly measure yeast cell concentration and viability. The system was given the name Automatic Yeast Analysis Platform (AYAP), which is based on digital in-line holography and on-chip microscopy and rapidly images a large field-of-view of 22.5 mm.sup.2. The lens-free microscope weighs ˜70 grams and utilizes a partially-coherent illumination source and an opto-electronic image sensor chip. A touch-screen user interface based on a tablet-PC was developed to reconstruct the holographic shadows captured by the image sensor and use a support vector machine (SVM) model to automatically classify live and dead cells in a yeast sample stained with methylene blue.
(25) Materials and Methods
(26) Sample Preparation
(27) Distillers Active Dry Yeast (DADY) was rehydrated in distilled water. 1:1 volume of 0.1% w/v Methylene Blue was added to the yeast solution to stain the dead cells (i.e., dead cells exhibit a dark stain). The microfluidic counting chamber that was used consists of two coverslips and an adhesive tape (CS Hyde, 45-3A-1) used as a spacer between the two coverslips. In order to build the microfluidic chamber, adhesive tape was cut in the shape of a square and was attached to a coverslip (0.13-0.17 mm thickness). Before adding the yeast solution to the chamber, a second coverslip was placed on top of the adhesive tape, with a small opening at the edge. The sample was slowly injected into the microfluidic chamber through the small opening. The yeast solution disperses through the chamber via capillary action, allowing uniform distribution of the yeast cells within the imaging FOV. Lastly, the top coverslip was slid to close the small opening and to prevent evaporation.
(28) Design of the Field-Portable Lens-Free Microscope
(29) The sample on the coverslip holder was directly placed on top of a CMOS image sensor chip (ON Semiconductor, MT9J003STM) with a pixel size of 1.67 μm. An LED with a peak wavelength of 590 nm (Kingbright, WP7113SYC/J3) was used as the illumination source. Of course, other wavelengths may also be used. A hole was drilled into the lens of the LED using a 300 μm-diameter drill bit. A multimode fiber (100 μm core diameter, Thorlabs, AFS-105/125Y) was inserted into the drilled hole and fixed using optical glue. The beam exiting the optical fiber passes through a band-pass filter (4 nm bandwidth, centered around 590 nm, Thorlabs, FB590-10) to improve the temporal coherence of the illumination light at the image sensor plane. The distance between the cleaved end of the optical fiber and the image sensor is approximately 6 cm. A 3V coin battery powers the LED. All the components fit within a 3D printed housing (3D printer: Stratasys, Dimensions Elite) made using acrylonitrile butadiene styrene (ABS) material.
(30) Hologram Reconstruction
(31) The captured holograms of the sample are back-propagated to the object plane using the angular spectrum method. The angular spectrum method is a technique for modeling the propagation of a wave field and involves expanding a complex wave field into a summation of an infinite number of plane waves. The hologram is first transformed to the spatial frequency domain using a fast Fourier transform (FFT). Then a phase factor, which is a function of the wavelength, propagation distance, and refractive index of the medium, is multiplied with the angular spectrum. Finally it is inverse-Fourier-transformed to the spatial domain to obtain the back-propagated image of the specimen. For cell viability analysis, the image reconstruction method did not perform any additional phase retrieval or twin image elimination routines, although these could also be used for refinement of the reconstructed images, if needed.
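The angular spectrum reconstruction described above can be sketched in a few lines of Python using NumPy's FFT routines. This is an illustrative implementation, not the patented software; the sign convention of the propagation phase factor and the suppression of evanescent components are assumptions of this sketch.

```python
import numpy as np

def angular_spectrum_propagate(hologram, z, wavelength, dx, n_medium=1.0):
    """Propagate a captured hologram a distance z (meters) using the
    angular spectrum method: FFT to the spatial-frequency domain,
    multiply by a phase factor, inverse FFT back. A negative z
    back-propagates toward the object plane under this convention.
    dx is the pixel pitch (e.g., 1.67e-6 m for the sensor used here)."""
    ny, nx = hologram.shape
    # Spatial-frequency grids matching the FFT layout.
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # The refractive index of the medium scales the wavenumber.
    k = n_medium / wavelength
    arg = k**2 - FX**2 - FY**2
    # Phase factor: function of wavelength, distance, and refractive index.
    # Evanescent components (arg < 0) are zeroed out (assumed handling).
    H = np.where(arg > 0,
                 np.exp(1j * 2 * np.pi * z * np.sqrt(np.maximum(arg, 0))),
                 0)
    spectrum = np.fft.fft2(hologram)
    return np.fft.ifft2(spectrum * H)
```

The amplitude and phase images used for feature extraction would then be `np.abs(...)` and `np.angle(...)` of the returned complex field.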
(32) Automated Counting and Labelling of Imaged Cells Using Machine-Learning
(33) An SVM-based machine-learning algorithm was used to classify stained and unstained cells from a reconstructed digital hologram and quantify cell viability and concentration. The SVM model that was used was based on ten (10) spatial features extracted from each cell candidate: area, perimeter or circumference, maximum pixel value on the phase image, maximum pixel value on the amplitude image, minimum pixel value on the phase image, minimum pixel value on the amplitude image, mean pixel value on the phase image, mean pixel value on the amplitude image, standard deviation of the pixel values on the phase image, and the standard deviation of the pixel values on the amplitude image. The training data was populated from two experiments, where 260 stained and 260 unstained cells were manually identified on the reconstructed digital hologram and individually confirmed using a high-resolution bench-top microscope (Olympus BX51, 10× objective lens with 0.3NA, and Retiga 2000R CCD camera) as ground truth. In order to validate the predictive capabilities of this library, 5-fold cross-validation was performed. Based on this cross-validation, it was found that the percentage of unstained cells correctly identified was 96.5%, the percentage of stained cells correctly identified was 96.9%, the percentage of unstained cells falsely identified as stained was 3.1%, and finally the percentage of stained cells falsely identified as unstained was 3.5%.
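The SVM training and 5-fold cross-validation described above can be sketched with scikit-learn. The kernel choice is an assumption of this sketch (the patent does not specify the SVM's hyperparameters), and the synthetic usage below stands in for the 260 stained / 260 unstained feature vectors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def train_viability_classifier(features, labels):
    """Train an SVM on the ten spatial features per cell candidate and
    report 5-fold cross-validation accuracy, mirroring the validation
    described above. Linear kernel is an assumed choice."""
    clf = SVC(kernel="linear")
    cv_accuracy = cross_val_score(clf, features, labels, cv=5).mean()
    clf.fit(features, labels)  # final model trained on all data
    return clf, cv_accuracy
```

Usage with stand-in data shaped like the patent's training set (260 cells per class, 10 features each):

```python
rng = np.random.default_rng(1)
stained = rng.normal(0.0, 1.0, (260, 10))
unstained = rng.normal(4.0, 1.0, (260, 10))
X = np.vstack([stained, unstained])
y = np.array([1] * 260 + [0] * 260)  # 1 = stained, 0 = unstained
clf, acc = train_viability_classifier(X, y)
```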
(34) The image processing and cell classification algorithm digitally divides the full-FOV hologram into six tiles (each with a FOV of ˜3.8 mm.sup.2—see
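The per-tile auto-focus criterion, selecting the z.sub.2 whose reconstruction yields the largest mean classification score, can be sketched as below. The `propagate`, `find_candidates`, and `classify` callables are hypothetical stand-ins for the propagation, candidate-detection, and SVM-scoring steps described elsewhere in this document.

```python
import numpy as np

def autofocus_tile(tile_hologram, z_range, propagate, find_candidates, classify):
    """For one tile (sub-FOV), reconstruct at each candidate distance z2 and
    keep the reconstruction whose cell candidates have the largest mean
    absolute SVM classification score (the focus criterion sketched here)."""
    best_z, best_score, best_image = None, -np.inf, None
    for z in z_range:
        image = propagate(tile_hologram, z)
        candidates = find_candidates(image)
        if len(candidates) == 0:
            continue  # no cells detected at this distance
        # In-focus cells sit far from the SVM decision boundary, so the
        # mean |score| is assumed to peak at the correct distance.
        mean_score = float(np.mean(np.abs([classify(c) for c in candidates])))
        if mean_score > best_score:
            best_z, best_score, best_image = z, mean_score, image
    return best_z, best_image
```

Running this independently for each of the six tiles (sequentially or in parallel) yields the per-tile focus distances that can later be re-used when the same batch of sample holders is imaged.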
(35)
(36) The focus criterion described above is also used to select the reconstructed image at the optimal z.sub.2 distance for labelling and cell viability calculations using the same trained classifier. Next, among all the cell candidates within a given sub-FOV, the majority of clumps, dust particles, and twin-image related artifacts are removed based on an SVM classification score threshold. Most of these non-cell micro-objects lie close to the decision boundary (i.e., boundary separating a cell being classified as stained or unstained) and have the lowest absolute classification scores. An SVM score threshold was determined in order to exclude some of these false classifications from the viability calculations. The number of cell candidates eliminated based on this SVM classification score threshold is approximately 15% of the total number of cell candidates in a given FOV. The remaining cells, classified into stained and unstained cell categories based on their SVM classification scores, were accordingly labelled using color markings on the reconstructed image, and the viability percentage of the entire FOV was calculated by dividing the number of unstained cells by the total number of cells.
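The score thresholding and viability computation above reduce to a few lines. The sign convention here (positive SVM scores meaning unstained) is an assumption for illustration only.

```python
import numpy as np

def viability_from_scores(scores, score_threshold):
    """Discard candidates whose |SVM score| falls below the threshold
    (clumps, dust, twin-image artifacts near the decision boundary),
    then compute viability = unstained / total cells, as a percentage.
    Positive scores are assumed to mean 'unstained' (live)."""
    scores = np.asarray(scores, dtype=float)
    kept = scores[np.abs(scores) >= score_threshold]
    if kept.size == 0:
        return None  # no confident cell candidates remain
    unstained = int((kept > 0).sum())
    return 100.0 * unstained / kept.size
```

For example, with scores `[2.0, -1.5, 0.1, 3.0, -0.05]` and a threshold of 0.5, the two low-magnitude candidates are discarded and viability is computed over the remaining three cells.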
(37) Touch-Screen Graphical User Interface (GUI)
(38) A custom-designed touch-screen interface based on a tablet-PC (Lenovo Yoga 2, Intel Core i7, 8 GB RAM) was created to work with the field-portable lens-free microscope. This interface allows the user to load a previously captured sample hologram or directly capture a new hologram using the field-portable microscope, with the image-capture settings configured automatically. For example, as seen in
(39) Next, the user has the ability to run the machine-learning algorithm on the holographic image that is captured. This may be initiated by depressing the “Find Viability” button illustrated in
(40) Results and Discussion
(41) There are a large number of methods that can be used for quantifying the viability of yeast cells. One of the established methods of determining cell viability is exclusion staining. In this method, dead cells are stained, and after counting the number of stained and unstained cells, a number between 0% and 100% is used to indicate the cell viability of the sample. There are multiple exclusion stains used in industry to perform yeast viability testing. One commonly used stain is methylene blue, which is inexpensive, can be stored at room temperature, and has a relatively low toxicity to humans. However, conventional methylene blue exclusion testing methods suffer from (1) false positive results at longer exposure times, and (2) operator subjectivity, which is an important disadvantage compared to fluorescence-based staining methods (e.g., using a fluorescent based stain such as propidium iodide).
(42) The lens-free microscope platform described herein does not suffer from these reported disadvantages of methylene blue because (1) it captures an image of the sample over a large field-of-view and volume (˜4.5 μL) in less than ten (10) seconds, therefore, reducing false positives, and (2) the machine-learning algorithm eliminates operator subjectivity. For these reasons, methylene blue provides a very good staining method for the lens-free microscope platform due to its more practical and cost-effective nature.
(43) The automated yeast viability and concentration results obtained using methylene blue in the lens-free computational imaging system were compared with manual measurements of viability and concentration based on the fluorescence staining of dead cells using propidium iodide. These two methods were compared at various levels of cell viability and concentrations.
(44) Each sample under test was divided into two sub-samples of equal volume, staining one with methylene blue and the other with propidium iodide. For each test, four to five 10× objective lens (NA=0.3) images of the propidium iodide stained samples were captured and manually labelled using benchtop fluorescence microscopy. A single lens-free image of the methylene blue sample was captured via the AYAP microscope device described herein. As described previously, AYAP divides the large FOV into six tiles and processes each tile independently. In the experiments it was found that when using the same batch of cover slips for the sample holder, the optimal propagation distances are consistent from sample holder to sample holder, eliminating the need for repeated digital auto-focusing, which makes the total analysis time for each sample less than thirty (30) seconds, even using a modest tablet-PC for computations.
(45) In the experiments, viability of the yeast cells was varied by mixing different ratios of heat-killed yeast with the original yeast solution, and linear regression analysis was performed for each method (i.e., AYAP using methylene blue vs. benchtop fluorescence microscopy using propidium iodide), the results of which are summarized in
(46) TABLE 1

                          PI Manual Counting                 AYAP
  % Heat-Killed      # of 10x    Total # of cells    # of tiles      Total # of cells
  Yeast              images      identified          (single FOV)    identified
  0%                 5           418                 6               1886
  25%                5           457                 6               1834
  50%                4           229                 6               1448
  75%                5           538                 6               1942
  100%               5           361                 6               1578

  Linear Regression        Slope                Y-Intercept       Slope                Y-Intercept
  Viability                −0.6977 ± 0.01508    72.34 ± 0.9237    −0.7470 ± 0.03135    77.25 ± 1.920
  Concentration of
  unstained cells          −2751 ± 417.1        282876 ± 25543    −3956 ± 457.4        403343 ± 28009
(47) These results show that the AYAP measurements agree very well with the gold-standard fluorescence-based exclusion staining method. The slopes and Y-intercepts are also summarized in Table 1, which further illustrate the similarity of the results of these two methods.
(48) In order to test the performance of AYAP at various yeast concentrations, serial dilution was performed and analyzed using linear regression.
(49) TABLE 2

                          PI Manual Counting                 AYAP
  Serial Dilution    # of 10x    Total # of cells    # of tiles      Total # of cells
  Factor             images      identified          (single FOV)    identified
  1.000              5           1439                6               4975
  0.500              5           775                 6               2081
  0.250              5           304                 6               878
  0.125              5           166                 6               518

  Linear Regression        Slope                Y-Intercept       Slope                Y-Intercept
  Viability                −0.4331 ± 1.310      82.39 ± 0.7548    9.464 ± 5.238        76.90 ± 3.018
  Concentration of
  unstained cells          1127000 ± 66950      −14190 ± 38580    1265000 ± 101100     −104000 ± 58240
(50) Once again, AYAP measurements agree well with the fluorescence-based exclusion stain within a concentration range between approximately 1.4×10.sup.5 and 1.4×10.sup.6 cells/mL. Above this concentration range, cell overlap and clumps increase, leading to measurement and cell counting inaccuracies. Below this concentration range, on the other hand, the variability in concentration measurements due to statistical counting error increases, a limitation shared by other microscopy-based cell counting schemes due to the low number of cells per imaging FOV. Similarly, existing hemocytometers that are commonly used for laboratory and industrial applications claim accurate measurements between a minimum concentration of ˜2.5×10.sup.5 cells/mL and a maximum concentration of ˜8×10.sup.6 cells/mL, and samples with larger concentrations of cells are diluted. For example, for fermentation applications, the yeast sample is typically diluted by a factor of 10 to 1,000 prior to manual counting with a hemocytometer. Therefore, the dynamic range of cell densities of the AYAP system is quite relevant for various cell counting applications.
(51) These results illustrate that the viability percentages and concentrations measured using the AYAP microscope system are in close agreement to the gold-standard fluorescent staining method. The small differences between the two methods may be attributed to a few factors: (1) the channel height of our microfluidic sample holder or chamber may slightly vary from test to test leading to changes in the sample volume, which may cause the comparisons to have some systematic error; and (2) the machine-learning algorithm currently ignores cell clumps, whereas in the manual counting results for the fluorescent stain, the cells within the clumps were counted.
(52) In addition to these comparisons between AYAP and the fluorescence-based standard exclusion method, a control experiment was performed to compare the viability percentages obtained from propidium iodide manual counting and methylene blue manual counting, both using a standard benchtop microscope, in order to isolate the differences between the two stains with everything else kept the same. For this experiment, the rehydrated yeast sample was divided into six samples of equal volume. Three samples were stained via propidium iodide and three samples were stained via methylene blue. Five different 10× objective lens images were captured from each sample (fluorescence and bright-field for propidium iodide and methylene blue, respectively) and manually labelled. As seen in the graph of
(53) Notably, the AYAP design is cost-effective and field-portable: it weighs approximately 70 grams (excluding the tablet-PC) and has dimensions of 4×4×12 cm. Furthermore, the viability stain used in the platform, methylene blue, is commercially available and does not require special storage conditions, making it especially appealing for field use. The platform also allows for rapid assessment of yeast viability and concentration: it performs automatic labelling in 5-10 minutes when using the auto-focusing mode and in less than thirty (30) seconds when auto-focusing is not needed (i.e., previously stored auto-focus settings are re-used). These processing times can be further improved by using more powerful tablet-PCs or laptops. To put these computation times into perspective, manual counting of some of the more confluent samples took more than an hour of lateral scanning using a benchtop microscope with a 10× objective lens.
(54) AYAP achieves accurate yeast viability and concentration analysis because the on-chip nature of the microscopy platform allows imaging of a large FOV of ˜22.5 mm.sup.2, which is more than an order of magnitude larger than the FOV of a typical 10× objective lens (1-2 mm.sup.2), and therefore it permits the analysis of a significantly larger number of cells in a short amount of time.
(55) Furthermore, the large imaging FOV is captured in less than ten (10) seconds, limiting the number of false positives associated with staining methods that expose cells to toxic environments. Finally, operator/user subjectivity is also eliminated in the system by using a machine-learning based statistical cell classification algorithm running on a tablet-PC.
(56) While embodiments of the present invention have been shown and described, various modifications may be made without departing from the scope of the present invention. The invention, therefore, should not be limited except to the following claims and their equivalents.