Multi-Spectral Auto-Fluorescence Based Stainless and Slide-Free Virtual Histology
20230162410 · 2023-05-25
Inventors
- Rishikesh Pandey (Unionville, CT, US)
- Guoan Zheng (Vernon, CT)
- Alan Kersey (South Glastonbury, CT, US)
CPC Classification
- G06T2207/20016 (Physics)
- G06T7/30 (Physics)
- G01N21/6486 (Physics)
- G06T3/40 (Physics)
International Classification
- G06T3/40 (Physics)
Abstract
A system for and method of producing a virtually stained histological tissue sample is provided that includes: a) acquiring a plurality of autofluorescence (AF) images of an unstained tissue sample, each AF image of the plurality of images produced by interrogating the tissue sample at an AF excitation wavelength configured to produce AF emissions at an AF emission wavelength, wherein the AF excitation wavelength and the AF emission wavelength used to produce each AF image of the plurality of AF images is different from the AF excitation wavelength and the AF emission wavelength used to produce the other AF images of the plurality of AF images; b) virtually staining the tissue sample using the plurality of AF images using artificial intelligence to represent a coloration of at least one histological stain; and c) producing a virtually stained histological tissue sample from the virtual staining.
Claims
1. A method of producing a virtually stained histological tissue sample, comprising: acquiring a plurality of autofluorescence (AF) images of an unstained tissue sample, each AF image of the plurality of images produced by interrogating the tissue sample at an AF excitation wavelength configured to produce AF emissions at an AF emission wavelength, wherein the AF excitation wavelength and the AF emission wavelength used to produce each AF image of the plurality of AF images is different from the AF excitation wavelength and the AF emission wavelength used to produce the other AF images of the plurality of AF images; virtually staining the tissue sample using the plurality of AF images using artificial intelligence to represent a coloration of at least one histological stain; producing a virtually stained histological tissue sample from the virtual staining.
2. The method of claim 1, further including acquiring at least one reflectance image of the tissue sample produced by interrogating the tissue sample at a reflectance excitation wavelength; and wherein said virtually staining the tissue sample includes using the at least one reflectance image.
3. The method of claim 2, wherein the AF excitation wavelengths are different from the reflectance excitation wavelength.
4. The method of claim 1, wherein the virtual staining includes virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a first histological stain and virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a second histological stain.
5. The method of claim 4, wherein the first histological stain is hematoxylin and eosin.
6. The method of claim 1, further comprising producing a virtually stained immunohistological tissue sample from the virtual staining.
7. The method of claim 1, further comprising focusing each said AF image of the plurality of AF images.
8. The method of claim 7, further comprising registering each said AF image of the plurality of AF images with the others of the plurality of AF images.
9. The method of claim 7, wherein each said AF image of the plurality of AF images has a resolution, and further comprising increasing the resolution of each said AF image of the plurality of AF images with one another.
10. A system for producing a virtually stained histological tissue sample, comprising: an excitation light source; one or more light detectors; and a system controller in communication with the excitation light source, the one or more light detectors, and a non-transitory memory storing instructions, which instructions when executed cause the system controller to: control the excitation light source and the one or more light detectors to acquire a plurality of autofluorescence (AF) images of an unstained tissue sample, each AF image of the plurality of images produced by interrogating the tissue sample at an AF excitation wavelength produced by the excitation light source, the AF excitation wavelength configured to produce AF emissions at an AF emission wavelength, and wherein the AF excitation wavelength and the AF emission wavelength used to produce each AF image of the plurality of AF images is different from the AF excitation wavelength and the AF emission wavelength used to produce the other AF images of the plurality of AF images; virtually stain the tissue sample using the plurality of AF images using artificial intelligence to represent a coloration of at least one histological stain; and produce a virtually stained histological tissue sample from the virtual staining.
11. The system of claim 10, wherein the instructions when executed cause the system controller to control the excitation light source and the one or more light detectors to acquire at least one reflectance image of the unstained tissue sample by interrogating the tissue sample at a reflectance excitation wavelength, and wherein the virtually staining of the tissue sample includes using the at least one reflectance image.
12. The system of claim 11, wherein the AF excitation wavelengths are different from the reflectance excitation wavelength.
13. The system of claim 10, wherein the virtual staining includes virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a first histological stain and virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a second histological stain.
14. The system of claim 13, wherein the first histological stain is hematoxylin and eosin.
15. The system of claim 10, wherein the instructions when executed cause the system controller to produce a virtually stained immunohistological tissue sample from the virtual staining.
16. The system of claim 10, wherein the instructions when executed cause the system controller to focus each said AF image of the plurality of AF images.
17. The system of claim 16, wherein the instructions when executed cause the system controller to register each said AF image of the plurality of AF images with the others of the plurality of AF images.
18. The system of claim 16, wherein each said AF image of the plurality of AF images has a resolution, and wherein the instructions when executed cause the system controller to increase the resolution of each said AF image of the plurality of AF images with one another.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
SUMMARY
[0017] According to an aspect of the present disclosure, a method of producing a virtually stained histological tissue sample is provided that includes: a) acquiring a plurality of autofluorescence (AF) images of an unstained tissue sample, each AF image of the plurality of images produced by interrogating the tissue sample at an AF excitation wavelength configured to produce AF emissions at an AF emission wavelength, wherein the AF excitation wavelength and the AF emission wavelength used to produce each AF image of the plurality of AF images is different from the AF excitation wavelength and the AF emission wavelength used to produce the other AF images of the plurality of AF images; b) virtually staining the tissue sample using the plurality of AF images using artificial intelligence to represent a coloration of at least one histological stain; and c) producing a virtually stained histological tissue sample from the virtual staining.
[0018] In any of the aspects or embodiments described above and herein, the method may include acquiring at least one reflectance image of the tissue sample produced by interrogating the tissue sample at a reflectance excitation wavelength, and the virtually staining of the tissue sample may include using the at least one reflectance image.
[0019] In any of the aspects or embodiments described above and herein, the AF excitation wavelengths may be different from the reflectance excitation wavelength.
[0020] In any of the aspects or embodiments described above and herein, the virtual staining may include virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a first histological stain, and virtually staining the unstained tissue sample using the plurality of AF images using artificial intelligence to represent the coloration of a second histological stain.
[0021] In any of the aspects or embodiments described above and herein, the first histological stain may be hematoxylin and eosin.
[0022] In any of the aspects or embodiments described above and herein, the method may include producing a virtually stained immunohistological tissue sample from the virtual staining.
[0023] In any of the aspects or embodiments described above and herein, the method may include focusing each AF image of the plurality of AF images.
[0024] In any of the aspects or embodiments described above and herein, the method may include registering each AF image of the plurality of AF images with the others of the plurality of AF images.
[0025] In any of the aspects or embodiments described above and herein, each AF image of the plurality of AF images has a resolution, and the method may include increasing the resolution of each AF image of the plurality of AF images with one another.
[0026] According to another aspect of the present disclosure, a system for producing a virtually stained histological tissue sample is provided that includes an excitation light source, one or more light detectors, and a system controller. The system controller is in communication with the excitation light source, the one or more light detectors, and a non-transitory memory storing instructions. The instructions when executed cause the system controller to: a) control the excitation light source and the one or more light detectors to acquire a plurality of autofluorescence (AF) images of an unstained tissue sample, each AF image of the plurality of images produced by interrogating the tissue sample at an AF excitation wavelength produced by the excitation light source, the AF excitation wavelength configured to produce AF emissions at an AF emission wavelength, and wherein the AF excitation wavelength and the AF emission wavelength used to produce each AF image of the plurality of AF images is different from the AF excitation wavelength and the AF emission wavelength used to produce the other AF images of the plurality of AF images; b) virtually stain the tissue sample using the plurality of AF images using artificial intelligence to represent a coloration of at least one histological stain; and c) produce a virtually stained histological tissue sample from the virtual staining.
[0027] In any of the aspects or embodiments described above and herein, the instructions when executed may cause the system controller to control the excitation light source and the one or more light detectors to acquire at least one reflectance image of the unstained tissue sample by interrogating the tissue sample at a reflectance excitation wavelength, and wherein the virtually staining of the tissue sample includes using the at least one reflectance image.
[0028] In any of the aspects or embodiments described above and herein, the instructions when executed may cause the system controller to produce a virtually stained immunohistological tissue sample from the virtual staining.
[0029] In any of the aspects or embodiments described above and herein, the instructions when executed may cause the system controller to focus each AF image of the plurality of AF images.
[0030] In any of the aspects or embodiments described above and herein, the instructions when executed may cause the system controller to register each AF image of the plurality of AF images with the others of the plurality of AF images.
[0031] In any of the aspects or embodiments described above and herein, each AF image of the plurality of AF images has a resolution, and the instructions when executed may cause the system controller to increase the resolution of each AF image of the plurality of AF images with one another.
[0032] The foregoing features and elements may be combined in various combinations without exclusivity, unless expressly indicated otherwise. These features and elements as well as the operation thereof will become more apparent in light of the following description and the accompanying drawings. It should be understood, however, the following description and drawings are intended to be exemplary in nature and non-limiting.
DETAILED DISCLOSURE
[0033] The rapid availability of histological results is important in cancer care, including in diagnosis, in surgery, and in the pathology lab. As will be described below, the present disclosure provides a histology system and method that dramatically decreases the amount of time required to produce useful histological results and does not require multiple tissue sections; it can be used to produce useful information for biopsy evaluation and for triaging tissue sections in the surgical pathology lab.
[0034] The present disclosure leverages the fact that biomolecules present in different tissues provide discernible and repeatable autofluorescence [6-8] and reflectance [6] spectral patterns. These endogenous fluorescence signatures offer useful information that can be mapped to the functional, metabolic, and morphological attributes of a biological sample, and can therefore be used for diagnostic purposes. Biomolecular changes occurring in the cell and tissue state during pathological processes and disease progression alter the amount and distribution of endogenous fluorophores, and these alterations form the basis for classification. Tissue autofluorescence (AF) has been proposed to detect various malignancies, including cancer, by measuring either the differential intensity [7] or the lifetimes of the intrinsic fluorophores [8]. Biomolecules present in tissue, such as tryptophan, collagen, elastin, nicotinamide adenine dinucleotide (NADH), flavin adenine dinucleotide (FAD), and porphyrins, provide discernible and repeatable autofluorescence spectral patterns. Label-free approaches based on vibrational spectroscopy, such as stimulated Raman [9] and infrared (IR) microscopies [10], have also been proposed, but they are slower, limited to a smaller field of view (FOV), require sophisticated instrumentation, and are prohibitively expensive. Autofluorescence-based virtual histology, which requires autofluorescence microscopic images, has been reported [11], but only on fixed tissue sections.
[0035] The present disclosure also leverages the fact that different histological stains produce distinct colorations that are used for identification purposes. Embodiments of the present disclosure are operable to produce useful histological information by creating a plurality of “virtually stained” images of an unstained tissue section, which may include multiple virtually stained images based on a coloration that is associated with a particular histological stain, or which may include multiple virtually stained images each having a coloration that is associated with a different respective histological stain. The stain coloration from a single histological stain may not, in some instances, provide enough information to differentiate all tissues, cellular structures, or chemical substances. The present disclosure makes it possible to “virtually stain” a single tissue section to produce multiple images relating to a specific histological stain (e.g., H&E) coloration, and also to produce multiple images of a single tissue section relating to multiple different histological stain colorations, and thereby provides a robust means for producing the requisite information for a histological analysis.
[0036] The present disclosure system includes an excitation light source, one or more light detectors, and a system controller that is configured to perform the functionality described herein. The present disclosure system is not limited to any particular excitation light source and light detector configuration, and the system may include additional elements; e.g., light filtration elements, etc. PCT application number PCT/US2022/032526, commonly assigned with the present application and hereby incorporated by reference in its entirety, discloses an example of an acceptable light source, light detector, and system controller configuration that may be used to produce AF images and diffuse reflectance images of a tissue sample section.
[0037] The excitation light source may be configured to produce excitation light centered at a plurality of distinct wavelengths, or may include a white light source coupled with filtering that enables distinct wavelengths to be produced. The excitation wavelengths are those that will produce useful AF emissions and/or useful reflectance signals from a tissue sample section; e.g., wavelengths based on the photometric properties associated with one or more biomolecules (or tissue types, etc.) of interest. Wavelengths in the ultraviolet (UV) region (e.g., about 100-400 nm) and in the visible region (e.g., about 400-700 nm) are non-limiting examples of excitation wavelengths that produce useful AF emissions and/or useful reflectance signals from a tissue sample section. Non-limiting examples of acceptable excitation light sources include lasers and light emitting diodes (LEDs) that may be centered at particular wavelengths, or a tunable excitation light source configured to selectively produce light centered at respective different wavelengths. The present disclosure is not limited to any particular type of excitation light source.
[0038] The present disclosure system may utilize a variety of different light detector types to sense light and provide signals representative thereof. Non-limiting examples of acceptable light detectors include those that convert light energy into an electrical signal, such as photodiodes, avalanche photodiodes, a charge coupled device (“CCD”) array, an intensified charge coupled device (“ICCD”) array, a complementary metal-oxide-semiconductor (“CMOS”) image sensor, or the like. The light detector may take the form of a camera.
[0039] The system controller is in communication with system components such as the light source and the light detector, and may be in communication with additional system components. The system controller may be in communication with system components to control the operation of the respective component and/or to receive signals from and/or transmit signals to that component to perform the functions described herein. The system controller may include any type of computing device, computational circuit, processor(s), CPU, computer, or the like capable of executing a series of instructions that are stored in memory. The instructions may include an operating system and/or executable software modules such as program files, system data, buffers, drivers, utilities, and the like. The executable instructions may apply to any functionality described herein to enable the system to accomplish the same algorithmically and/or through coordination of system components. The system controller includes or is in communication with one or more memory devices. The present disclosure is not limited to any particular type of memory device, and the memory device may store instructions and/or data in a non-transitory manner. Examples of memory devices that may be used include read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. The system controller may include, or may be in communication with, an input device that enables a user to enter data and/or instructions, and may include, or be in communication with, an output device configured, for example, to display information (e.g., a visual display or a printer) or to transfer data. Communications between the system controller and other system components may be via a hardwire connection or via a wireless connection.
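The controller-directed acquisition described above can be illustrated as a simple loop over excitation/emission wavelength pairs. The following is a minimal sketch only: the class names, hardware interface, and the particular (excitation, emission) pairs are hypothetical stand-ins and are not part of the disclosure.

```python
# Minimal sketch of a multi-spectral AF acquisition loop.
# All hardware classes and wavelength pairs below are hypothetical
# stand-ins for whatever excitation source / detector the system uses.

from dataclasses import dataclass

@dataclass
class AFImage:
    excitation_nm: float
    emission_nm: float
    pixels: list  # placeholder for the detector frame

class SimulatedSystem:
    """Stand-in for the excitation light source, detector, and controller."""
    def set_excitation(self, wavelength_nm):
        self._ex = wavelength_nm
    def capture(self, emission_nm):
        # A real implementation would trigger the detector and read out a
        # frame filtered at the AF emission wavelength.
        return AFImage(self._ex, emission_nm, pixels=[[0.0]])

def acquire_af_stack(system, band_pairs):
    """Acquire one AF image per distinct (excitation, emission) pair."""
    stack = []
    for ex_nm, em_nm in band_pairs:
        system.set_excitation(ex_nm)          # interrogate at the AF excitation wavelength
        stack.append(system.capture(em_nm))   # record the AF emission image
    return stack

# Example: three hypothetical UV/visible excitation-emission pairs (nm).
pairs = [(280, 340), (365, 450), (405, 500)]
stack = acquire_af_stack(SimulatedSystem(), pairs)
```

Each image in the resulting stack is tagged with its distinct excitation/emission pair, mirroring the requirement that each AF image be produced at a different wavelength combination.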
[0040] Some embodiments of the present disclosure may include optical filtering elements configured to filter excitation light, or optical filtering elements configured to filter emitted light (including reflected light), or both.
[0041] An exemplary embodiment of a present disclosure system 20 is diagrammatically illustrated in the accompanying drawings.
[0042] As indicated above, processes for producing AF images of a tissue sample section and processes for producing diffuse reflectance images of a tissue sample section are known, and the present disclosure is not limited to any particular processes. PCT application number PCT/US2022/032526, commonly assigned with the present application and hereby incorporated by reference in its entirety, discloses acceptable processes that may be used to produce AF images and diffuse reflectance images of a tissue sample section.
[0043] The present disclosure uses artificial intelligence (AI) techniques, including machine learning (collectively, “AI techniques”), to produce virtually stained histological images that are representative of known histological stains. A trained present disclosure system is operable to produce a virtually stained histological image (or virtual immunohistological image) from a plurality of AF images (or a mosaic of AF images), each image acquired at a different excitation and emission wavelength pair, and in some instances also using one or more reflectance images. Embodiments of the present disclosure may be configured to produce a virtually stained histological image representative of a particular type of histological stain (e.g., H&E), and other embodiments may be configured to produce more than one type of virtually stained histological image; e.g., a first virtually stained histological image representative of H&E stain, a second virtually stained histological image representative of Van Gieson stain, a third virtually stained histological image representative of Toluidine Blue stain, a fourth virtually stained histological image representative of Alcian Blue stain, and so on.
[0044] The AI aspect of the present disclosure may be trained in a variety of different ways.
[0045] The training process begins with producing a plurality of AF images (and possibly one or more reflectance images) of a tissue sample section. The method then includes generating a virtually stained image of the tissue sample section based on the AF images (and reflectance image when used) of the tissue sample. The virtually stained image is based on a selected histological stain (e.g., H&E) so that the virtually stained image is representative of the coloration that would have been produced if that tissue sample section had actually been stained by the chosen histological stain (e.g., H&E). The process of generating the virtually stained images may be described as being performed in a “generator network”.
[0046] The virtually stained image is subsequently evaluated relative to a corresponding actual image (e.g., a bright light image) of the tissue sample section stained with the selected histological stain (e.g., H&E) to identify differences between them. The process of identifying the discrepancies between the actual histological image and the generated virtual histological image may be described as being performed in a “discriminator network”. The discrepancies between the actual histological image and the generated virtual histological image may then be formulated as a loss function. The loss function may be communicated to the discriminator network and to the generator network for use in backpropagation. The generator network, in turn, may utilize the loss function to generate a corrected virtual image which is then communicated to the discriminator network. This process may be performed iteratively until the discriminator network cannot distinguish between the generated virtually stained image and the actual image; i.e., at this point the generator network “wins”. The AI process is then trained to associate certain colorations with certain AF image elements (and possibly certain reflectance image elements). This process is repeated on a number of tissue sample sections for each histological stain sufficient to produce a desired degree of accuracy. In this manner, the generator network learns the statistical transformation between the plurality of multispectral AF images (and the reflectance images, when used) and the corresponding bright-field histological images of the same tissue block. The input from the loss function enables the discriminator network to learn how to distinguish between a true bright-field histological stained image of a tissue sample section and the generator network's output virtual histological image.
During training, the generator network artificially manufactures virtual histological images and the discriminator network assesses the similarity of each to an actual histological image. By way of backpropagation, the discriminator network's classification helps the generator network update its weights and thereby fine-tune the virtual histological images being produced. Ultimately, after several iterations, the generator network begins to output higher-quality virtually stained histological images and the discriminator network becomes better at distinguishing the virtually stained histological images from the actual histological images. Once the network is trained, it can produce virtually stained images representative of histological stains, including H&E and other histological stains as well as immunohistological stains, from a panel of input AF and reflectance images.
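The input/output relationship that the trained generator network learns can be made concrete with a greatly simplified stand-in: a per-pixel linear mapping from a multi-channel AF stack to an RGB stain coloration, fit by ordinary least squares on paired data. This is an illustrative toy only; the disclosed system learns the transformation adversarially with generator and discriminator networks, and the channel count and data below are invented for illustration.

```python
import numpy as np

# Toy stand-in for the trained generator: learn a linear per-pixel mapping
# from N_CHANNELS autofluorescence intensities to an RGB "stain" coloration.
# The disclosed system learns this transformation with an adversarial
# (generator/discriminator) network; a least-squares fit is used here only
# to make the paired-training input/output relationship concrete.

rng = np.random.default_rng(0)
N_PIXELS, N_CHANNELS = 500, 4                 # hypothetical AF channel count

af_stack = rng.random((N_PIXELS, N_CHANNELS))  # per-pixel AF intensities
true_map = rng.random((N_CHANNELS, 3))         # synthetic AF -> RGB mapping
stained_rgb = af_stack @ true_map              # paired "actually stained" target

# Fit the mapping from paired (AF, actually-stained) training data.
learned_map, *_ = np.linalg.lstsq(af_stack, stained_rgb, rcond=None)

# Apply it to a new unstained sample to produce a virtual staining.
new_af = rng.random((10, N_CHANNELS))
virtual_rgb = new_af @ learned_map
```

Because the synthetic target is exactly linear in the AF channels, the fit recovers the mapping; the real networks instead learn a highly nonlinear transformation whose quality is judged by the discriminator rather than by a fixed loss alone.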
[0049] The disclosed virtual histology method differs from the previously reported autofluorescence-based virtual histology [11] in numerous ways. For example, the present disclosure embodiments may use a multi-modal approach that utilizes a panel of AF and reflectance images, rather than a single AF image. As another example, embodiments of the present disclosure may not require the preparation of any slide and may be performed on non-fixed, unsectioned tissue samples. As yet another example, embodiments of the present disclosure may be described as using rapid “snapshot” imaging that is very useful and practical in clinical settings. In addition to its applicability to biopsy diagnosis and the triaging of tissue samples, this approach will also be applicable to frozen section analysis.
[0053] While the principles of the disclosure have been described above in connection with specific apparatuses and methods, it is to be clearly understood that this description is made only by way of example and not as limitation on the scope of the disclosure. Specific details are given in the above description to provide a thorough understanding of the embodiments. However, it is understood that the embodiments may be practiced without these specific details.
[0054] The singular forms “a,” “an,” and “the” refer to one or more than one, unless the context clearly dictates otherwise. For example, the term “comprising a sample” includes single or plural samples and is considered equivalent to the phrase “comprising at least one sample.” The term “or” refers to a single element of stated alternative elements or a combination of two or more elements unless the context clearly indicates otherwise. As used herein, “comprises” means “includes.” Thus, “comprising A or B,” means “including A or B, or A and B,” without excluding additional elements.
[0055] It is noted that various connections are set forth between elements in the present description and drawings (the contents of which are included in this disclosure by way of reference). It is noted that these connections are general and, unless specified otherwise, may be direct or indirect and that this specification is not intended to be limiting in this respect. Any reference to attached, fixed, connected or the like may include permanent, removable, temporary, partial, full and/or any other possible attachment option.
[0056] No element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. No claim element herein is to be construed under the provisions of 35 U.S.C. 112(f) unless the element is expressly recited using the phrase “means for.” As used herein, the terms “comprise”, “comprising”, or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
[0057] While various inventive aspects, concepts, and features of the disclosures may be described and illustrated herein as embodied in combination in the exemplary embodiments, these various aspects, concepts, and features may be used in many alternative embodiments, either individually or in various combinations and sub-combinations thereof. Unless expressly excluded herein, all such combinations and sub-combinations are intended to be within the scope of the present application. Still further, while various alternative embodiments as to the various aspects, concepts, and features of the disclosures (such as alternative materials, structures, configurations, methods, devices, and components, and so on) may be described herein, such descriptions are not intended to be a complete or exhaustive list of available alternative embodiments, whether presently known or later developed. Those skilled in the art may readily adopt one or more of the inventive aspects, concepts, or features into additional embodiments and uses within the scope of the present application even if such embodiments are not expressly disclosed herein. For example, in the exemplary embodiments described above within the Detailed Description portion of the present specification, elements may be described as individual units and shown as independent of one another to facilitate the description. In alternative embodiments, such elements may be configured as combined elements. It is further noted that various method or process steps for embodiments of the present disclosure are described herein. The description may present method and/or process steps as a particular sequence. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. As one of ordinary skill in the art would appreciate, other sequences of steps may be possible.
Therefore, the particular order of the steps set forth in the description should not be construed as a limitation.
References
[0058] 1. L. C. Cahill et al., Rapid virtual hematoxylin and eosin histology of breast tissue specimens using a compact fluorescence nonlinear microscope. Lab Invest 98, 150-160 (2018).
[0059] 2. C. Elfgen et al., Comparative analysis of confocal microscopy on fresh breast core needle biopsies and conventional histology. Diagnostic Pathology 14, 58 (2019).
[0060] 3. P. Pradhan et al., Computational tissue staining of non-linear multimodal imaging using supervised and unsupervised deep learning. Biomed. Opt. Express 12, 2280-2298 (2021).
[0061] 4. D. A. Orringer et al., Rapid intraoperative histology of unprocessed surgical specimens via fibre-laser-based stimulated Raman scattering microscopy. Nat Biomed Eng 1, 0027 (2017).
[0062] 5. U.S. Pat. Pub. No. 2021/0043331, Method and System for Digital Staining of Label-Free Fluorescence Images Using Deep Learning.
[0063] 6. T. M. Bydlon, R. Nachabe, N. Ramanujam, H. J. Sterenborg, B. H. Hendriks, Chromophore based analyses of steady-state diffuse reflectance spectroscopy: current status and perspectives for clinical adoption. J Biophotonics 8, 9-24 (2015).
[0064] 7. M. Wang et al., Autofluorescence Imaging and Spectroscopy of Human Lung Cancer. Applied Sciences 7, 32 (2017).
[0065] 8. M. Marsden et al., Intraoperative Margin Assessment in Oral and Oropharyngeal Cancer Using Label-Free Fluorescence Lifetime Imaging and Machine Learning. IEEE Transactions on Biomedical Engineering 68, 857-868 (2021).
[0066] 9. B. Sarri et al., Stimulated Raman histology: one to one comparison with standard hematoxylin and eosin staining. Biomed. Opt. Express 10, 5378-5384 (2019).
[0067] 10. M. Schnell et al., All-digital histopathology by infrared-optical hybrid microscopy. Proc Natl Acad Sci USA 117, 3388-3396 (2020).
[0068] 11. Y. Rivenson et al., Virtual histological staining of unlabelled tissue-autofluorescence images via deep learning. Nature Biomedical Engineering 3, 466-477 (2019).
[0069] 12. C. Jiang et al., Blind deblurring for microscopic pathology images using deep learning networks. arXiv preprint arXiv:2011.11879 (2020).