System and a method for imaging using lens-less holographic microscopy
12436497 · 2025-10-07
Inventors
- Mohammad Mujtaba Mansoor (Irvine, CA, US)
- Jian Gao (Rancho Mission Viejo, CA, US)
- James Davis Trolinger (Costa Mesa, CA, US)
- Jacob George (Downey, CA, US)
CPC classification
- H04N23/81 (ELECTRICITY)
- G03H1/0866 (PHYSICS)
- G03H2226/02 (PHYSICS)
- G03H2001/005 (PHYSICS)
- G03H1/0443 (PHYSICS)
- G03H2001/0458 (PHYSICS)
International classification
- G03H1/00 (PHYSICS)
Abstract
A lens-less system for holographic imaging or a holographic imaging device is provided. The method/device includes a stationary image sensor to capture an image of a sample illuminated by light from a stationary illumination source. A reference lens-less holographic image may be captured and used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image. Since real wavefronts produced by a diverging point source are neither perfectly spherical nor planar but a combination of both qualities, theoretical estimates for wavefront reconstruction based on perfectly planar or spherical incident waves cannot be applied accurately. The method/device here provides a solution by performing a calibrated wavefront reconstruction based on equations governing coherent light propagation for both spherical waves and planar waves with a mathematical correlation between numerical magnification and propagation depth to produce accurate three-dimensional details of the object.
Claims
1. A lens-less system for holographic imaging, comprising: a stationary image sensor configured to capture an image of a sample comprising an object of interest, the stationary image sensor being located at an object-sensor distance z.sub.1 from the sample on a first side of the sample; a stationary illumination source configured to illuminate the sample positioned at an object-source distance z.sub.2 from the sample on a second side opposite the first side of the sample, wherein the stationary image sensor captures a lens-less holographic image of the object illuminated with light originating from the stationary illumination source; and a processor configured to perform wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object, wherein the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, wherein the divergent coherent light is scattered by the object of interest in the sample to produce scattered light that interferes with the divergent coherent light to produce interference patterns, wherein the lens-less holographic image of the sample captured by the stationary image sensor comprises the interference patterns, and wherein the processor is further configured to apply equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
2. The lens-less system of claim 1, wherein the sample is disposed on a first surface of a substrate, and wherein the stationary illumination source is located closer to a second surface of the substrate than the sample.
3. The lens-less system of claim 1, wherein the stationary image sensor captures a reference lens-less holographic image of the light originating from the stationary illumination source, and wherein the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
4. The lens-less system of claim 1, wherein the sample is disposed within a container, wherein the stationary image sensor captures a reference lens-less holographic image of the container without the sample, and wherein the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
5. The lens-less system of claim 1, wherein the object-source distance z.sub.2 is greater than the object-sensor distance z.sub.1.
6. The lens-less system of claim 1, wherein a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges between 0.01 and 1 or within 10-30 percent of 0.1, 0.2, 0.3, 0.4, or 0.5.
7. The lens-less system of claim 1, wherein the processor is further configured to apply a depth calibration based on a mathematical correlation between a numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to a use of either a spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
8. A method of imaging, comprising: illuminating a sample comprising an object via a stationary illumination source positioned at an object-source distance z.sub.2 from the sample on a first side of the sample; capturing a lens-less holographic image of the sample illuminated with light originating from the stationary illumination source via a stationary image sensor positioned at an object-sensor distance z.sub.1 from the sample on a second side opposite the first side of the sample; and performing wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object, wherein the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, wherein the divergent coherent light is scattered by the object of interest in the sample to produce scattered light that interferes with the divergent coherent light to produce interference patterns, wherein the lens-less holographic image of the sample captured by the stationary image sensor comprises the interference patterns, and wherein performing the wavefront reconstruction of the lens-less holographic image of the sample comprises applying equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
9. The method of claim 8, wherein the sample is disposed on a first surface of a substrate, and wherein the stationary illumination source is located closer to a second surface of the substrate than the sample.
10. The method of claim 8, further comprising: capturing a reference lens-less holographic image of the light originating from the stationary illumination source; and applying the reference lens-less holographic image as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
11. The method of claim 8, wherein the sample is disposed within a container, the method further comprising: capturing a reference lens-less holographic image of the container without the sample; and applying the reference lens-less holographic image as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
12. The method of claim 8, wherein the object-source distance z.sub.2 is greater than the object-sensor distance z.sub.1, or a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges between 0.01 and 1 or within 10-30 percent of 0.1, 0.2, 0.3, 0.4, or 0.5.
13. The method of claim 8, wherein performing the wavefront reconstruction of the lens-less holographic image of the sample further comprises applying a depth calibration based on a mathematical correlation between a numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to a use of either a spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
14. A holographic imaging device, comprising: a stationary illumination source configured to illuminate a sample comprising an object, the stationary illumination source positioned at an object-source distance z.sub.2 from the sample on a first side of the sample; and a stationary imaging module configured to capture a lens-less holographic image of the sample illuminated with light originating from the stationary illumination source, the stationary imaging module being located at an object-sensor distance z.sub.1 from the sample on a second side opposite the first side of the sample, wherein: the stationary imaging module is further configured to perform wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object, the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, the divergent coherent light is scattered by the object of interest in the sample to produce scattered light that interferes with the divergent coherent light to produce interference patterns, the lens-less holographic image of the sample captured by the stationary imaging module comprises the interference patterns, and the stationary imaging module is further configured to apply equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns to produce the three-dimensional details of the object.
15. The holographic imaging device of claim 14, wherein: the stationary imaging module is further configured to capture a reference lens-less holographic image of the light originating from the stationary illumination source, and the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
16. The holographic imaging device of claim 14, wherein: the interference patterns in the lens-less holographic image of the sample are processed by using a depth calibration based on a mathematical correlation between a numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to a use of either a spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
17. The lens-less system of claim 1, wherein the sample is disposed on a first surface of a substrate, and wherein the stationary illumination source is located closer to the sample than a second surface of the substrate.
18. The method of claim 8, wherein the sample is disposed on a first surface of a substrate, and wherein the stationary illumination source is located closer to the sample than a second surface of the substrate.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) For a more complete understanding of the principles disclosed herein, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings.
(13) It is to be understood that the figures are not necessarily drawn to scale, nor are the objects in the figures necessarily drawn to scale in relationship to one another. The figures are depictions that are intended to bring clarity and understanding to various embodiments of apparatuses, systems, and methods disclosed herein. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. Moreover, it should be appreciated that the drawings are not intended to limit the scope of the present teachings in any way.
DETAILED DESCRIPTION
(14) In accordance with one or more embodiments, a lens-less system for holographic imaging (also referred to herein as a lens-less holographic imaging device, a lens-less holoscope instrument, a lens-less holoscope, or simply a holoscope) is provided. Throughout the disclosure, the terms holographic imaging system and holographic imaging device are used interchangeably and meant to describe the lens-less system for holographic imaging. As described herein, the disclosed holographic imaging system/device may be constructed as a low-cost, lens-less, digital inline microscope that allows a user to visualize a sample (e.g., a microscopic sample or a sample containing an object or an organism of interest) in three dimensions.
(16) As further illustrated in
(17) As illustrated in
(18) In one or more embodiments, the holographic imaging system 100a may include a processor (e.g., a computing device or system) configured to produce a holographic image (not shown).
(19) Now referring to
(20) As further illustrated in
(21) As illustrated in
(22) In one or more embodiments, the holographic imaging system 100b may include a processor (e.g., a computing device or system) configured to produce a holographic image (not shown).
(23) Additional details regarding imaging methodologies of the disclosed holographic imaging systems 100a/100b of
(24) In both holographic imaging systems 100a/100b, a laser or a diode, e.g., a 654-nm laser diode, can be used as a coherent illumination source (120a/120b) to produce divergent light waves or collimated light waves for optical magnification. The lights from the illumination source are scattered due to diffraction by the sample 105a or 105b positioned at a given distance z.sub.1 from the imaging sensor 110a or 110b, respectively. The scattered light can interfere with undisturbed light, forming an interferogram, also known as a hologram, comprising interference patterns, which can be captured by an imaging sensor or imaging module, such as for example, a charge-coupled device (CCD) sensor or a complementary metal-oxide semiconductor (CMOS) sensor. The lens-less nature of the disclosed holographic imaging system can enable the interference patterns to occur directly on the imaging sensor, such that the captured digital hologram may be processed numerically to reconstruct the wavefront, re-focusing it in the computational domain until the object of interest is observed, in accordance with various embodiments herein.
(25) Even though light emitted by the illumination source 120a may be composed predominantly of divergent spherical waves, the sample can be accurately reconstructed at its true depth plane from the imaging sensor by applying a depth calibration to collimated plane-wave reconstruction algorithms, in accordance with one or more embodiments disclosed herein. Furthermore, the techniques and methodologies described herein illustrate how lens-less digital holographic imaging or microscopy can be used to perform spherical wavefront reconstruction with plane-wave approximations. The lens-less (or lens-free) configuration in the holographic imaging device may help avoid the optical distortions and aberrations commonly found in conventional lens-based optical systems while simultaneously preventing glare and flare from various optical surfaces, in accordance with one or more embodiments herein. Thus, the disclosed approach may help reduce, or even eliminate, system components, particularly complex imaging optics, thereby lowering the overall cost of such systems.
(26) In contrast to conventional microscope imaging, which uses a physical imaging or focusing lens to form a two-dimensional image on an imaging sensor with a depth of field fixed at its focal plane, the disclosed holographic imaging device can capture a hologram that is representative of a three-dimensional (3D) volume without any of the conventionally used optical lenses. The disclosed embodiments illustrate various ways this can be achieved by applying a wavefront reconstruction to propagate through the 3D volumetric space and numerically focus the sample at any given depth plane. Furthermore, the disclosed holographic imaging device may produce the holographic image of the object of interest by applying equations governing coherent light propagation for spherical waves, planar waves, or both in the reconstruction. To better illustrate and describe the details of the disclosed device and methodologies for holographic imaging, reference is now made to the following descriptions taken in conjunction with the accompanying figures further below, particularly with respect to coherent wavefront reconstructions of the interference patterns using spherical and/or plane-wave approximations (e.g., coherent spherical wavefront reconstruction using a plane-wave approximation or coherent plane wavefront reconstruction using a spherical wave approximation).
(27) In various embodiments, the disclosed holographic imaging system 100a/100b may include a system configuration where the illumination source, the sample/object, and the imaging sensor are placed on the same optical axis to form an in-line digital holographic imaging setup, as illustrated in the accompanying figures.
(28) t(x, y) = exp(−a(x, y))·exp(iφ(x, y))   (1)
where a(x, y) represents the absorption and φ(x, y) the phase distribution as the wave is scattered. The incident wave remains unchanged where there is either no object or where a(x, y) = φ(x, y) = 0, such that the transmission function t(x, y) = 1. If the object transmission function is described in terms of a perturbation t̃(x, y) imposed on the reference wave such that:
(29) t(x, y) = 1 + t̃(x, y)   (2)
the wavefront distribution beyond the object can be represented mathematically using separate contributions from the reference and object waves as:
(30) U_exit(x, y) = U_incident(x, y)·t(x, y) = U_incident(x, y) + U_incident(x, y)·t̃(x, y)   (3)
where the first term describes the reference, and the second term describes the object wave.
(31) The propagation of the wave towards the detector can be described using the Fresnel-Kirchhoff diffraction formula:
(32) U(X, Y) = −(i/λ) ∬ U_exit(x, y)·exp(ik|r − R|)/|r − R| dx dy   (4)
wherein r = (x, y, z_2), R = (X, Y, z_1 + z_2), and |r − R| = √((x − X)² + (y − Y)² + z_1²) denotes the distance between a point in the object plane and a point in the detector plane. Before reconstruction, the hologram is normalized by a background image B(X, Y) = |R(X, Y)|² that is recorded under the same experimental conditions as the hologram, but without the presence of the object. The distribution of the normalized hologram H_0(X, Y) thus does not depend on the incident or reference wave intensity or on the detector sensitivity. The reconstruction of a digital hologram can then include a multiplication of the normalized hologram with the reference wave R(X, Y), followed by back-propagation to the object plane based on the Fresnel-Kirchhoff diffraction integral:
(33) U(x, y) ∝ ∬ H_0(X, Y)·R(X, Y)·exp(−ik|r − R|)/|r − R| dX dY   (5)
The reconstructed wavefront corresponds to {tilde over (t)}(x, y) that can be added to 1 to obtain the transmission function t(x, y).
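As a concrete illustration of the normalization step described above, the following is a minimal numerical sketch (Python with NumPy; the function and array names are illustrative assumptions, not part of the disclosed system):

```python
import numpy as np

def normalize_hologram(hologram: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Compute the normalized hologram H_0(X, Y) by dividing the recorded
    hologram by a background image B(X, Y) = |R(X, Y)|^2 captured under the
    same experimental conditions but without the object, which removes the
    dependence on the incident wave intensity and the detector sensitivity."""
    eps = 1e-12  # guard against division by zero in dark pixels
    return hologram / np.maximum(background, eps)
```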
(34) In various embodiments described herein, a plane wave can be used in the calculations. For example, a plane wave can be represented by a complex-valued distribution exp(i(k_x·x + k_y·y + k_z·z)), where k_x, k_y and k_z are its wave vector components (k = 2π/λ, where λ denotes the wavelength). Setting k_x = k_y = 0 for an optical axis along the plane-wave propagation direction and choosing z = 0 at the object location leads to U_incident(x, y) = 1. The exit wave using Eq. (3) then follows as U_exit(x, y) = t(x, y). Wave propagation from the object plane (x, y) towards the detector plane (X, Y) can be described by the Fresnel-Kirchhoff diffraction formula as:
(35) U(X, Y) = −(i/λ) ∬ t(x, y)·exp(ik√((X − x)² + (Y − y)² + z_1²))/√((X − x)² + (Y − y)² + z_1²) dx dy   (6)
Wave propagation from the object plane (x,y) towards the detector plane (X,Y) can be described by the angular spectrum method, or in other words, the propagation of its spectrum. The scattering vector components can be given as:
(36) k_x = k·sin θ·cos φ, k_y = k·sin θ·sin φ, k_z = k·cos θ
where φ and θ are the azimuthal and polar angles in the spherical coordinate system, respectively, and correspond to the Fourier domain coordinates (u, v) as cos φ·sin θ = λu and sin φ·sin θ = λv. (λu, λv) are the directional cosines of the vector k and thus satisfy (λu)² + (λv)² ≤ 1. The propagation of the exit wave to the detector plane of the imaging sensor based on the angular spectrum method can transform Eq. (6) into the following equation:
(37) U(X, Y) = F⁻¹{F{t(x, y)}·exp(i(2πz_1/λ)·√(1 − (λu)² − (λv)²))}   (7)
where the term
(38) exp(i(2πz_1/λ)·√(1 − (λu)² − (λv)²))
can be simulated numerically. The reconstruction of the hologram can then be performed by:
(39) t(x, y) = F⁻¹{F{H_0(X, Y)}·exp(−i(2πz_1/λ)·√(1 − (λu)² − (λv)²))}   (8)
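For illustration, the angular spectrum propagation and back-propagation of Eqs. (7) and (8) can be sketched numerically as follows (Python with NumPy; the function name, parameters, and evanescent-wave masking are assumptions of this sketch, not the patent's verbatim implementation):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field over a distance z with the angular spectrum
    method; a negative z back-propagates a normalized hologram toward the
    object plane, as in the plane-wave reconstruction of Eq. (8)."""
    n, m = field.shape
    # Fourier-domain coordinates (u, v) in cycles per unit length
    u = np.fft.fftfreq(m, d=pixel_size)
    v = np.fft.fftfreq(n, d=pixel_size)
    uu, vv = np.meshgrid(u, v)
    # 1 - (lambda*u)^2 - (lambda*v)^2; evanescent components are suppressed
    arg = 1.0 - (wavelength * uu) ** 2 - (wavelength * vv) ** 2
    mask = arg > 0
    root = np.sqrt(np.where(mask, arg, 0.0))
    propagator = np.exp(1j * 2.0 * np.pi * z / wavelength * root) * mask
    return np.fft.ifft2(np.fft.fft2(field) * propagator)

# Hypothetical usage: back-propagate a normalized hologram H0 recorded with
# a 654-nm source, 2-um pixels (assumed), and a 9.4-mm object-sensor distance.
# recon = angular_spectrum_propagate(H0, 654e-9, 2e-6, -9.4e-3)
```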
(40) In various embodiments described herein, a spherical wave can be used in the calculations. For example, performing a similar derivation for spherical waves, the incident wave in the object plane is given by:
(41) U_incident(r) = exp(ik|r|)/|r|   (9)
where r = (x, y, z_2) and the object-source distance z_2 is the distance between the source and the object plane. The exit wave based on Eq. (3) becomes:
(42) U_exit(x, y) = (exp(ik|r|)/|r|)·t(x, y)   (10)
and wave propagation from the object towards the detector plane based on the Fresnel-Kirchhoff diffraction formula can be given as:
(43) U(X, Y) = −(i/λ) ∬ (exp(ik|r|)/|r|)·t(x, y)·exp(ik|r − R|)/|r − R| dx dy   (11)
Using a paraxial approximation where
(44) |r| ≈ z_2 + (x² + y²)/(2z_2) and |r − R| ≈ z_1 + ((X − x)² + (Y − y)²)/(2z_1),
Eq. (11) can be rewritten in the form of a convolution of the transmission function:
(45) U(X, Y) ∝ ∬ t(x, y)·exp(ik(x² + y²)/(2z_2))·exp(ik((X − x)² + (Y − y)²)/(2z_1)) dx dy   (12)
which can be expressed in the form of a wave propagation routine that employs two Fourier transforms:
(46) U(X, Y) = F⁻¹{F{t(x, y)·exp(ik(x² + y²)/(2z_2))}·F{exp(ik(x² + y²)/(2z_1))}}   (13)
Reconstruction of a digital hologram with spherical waves can then be given by:
(47) t(x, y)·exp(ik(x² + y²)/(2z_2)) = F⁻¹{F{H_0(X, Y)}·F{exp(−ik(X² + Y²)/(2z_1))}}   (14)
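A compact numerical sketch of the two-Fourier-transform propagation routine is given below (Python with NumPy). Note that reducing the divergent-illumination problem to an equivalent plane-wave propagation via the geometric magnification m = (z_1 + z_2)/z_2 is a common simplification assumed here for brevity; it is not the patent's verbatim Eqs. (12)-(14):

```python
import numpy as np

def fresnel_propagate(field, wavelength, pixel_size, z):
    """Paraxial (Fresnel) propagation as a convolution evaluated with two
    Fourier transforms, in the spirit of Eq. (13); a constant phase factor
    is omitted. Negative z back-propagates toward the object plane."""
    n, m = field.shape
    u = np.fft.fftfreq(m, d=pixel_size)
    v = np.fft.fftfreq(n, d=pixel_size)
    uu, vv = np.meshgrid(u, v)
    transfer = np.exp(-1j * np.pi * wavelength * z * (uu ** 2 + vv ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

def spherical_reconstruct(h0, wavelength, pixel_size, z1, z2):
    """Sketch of a divergent-illumination reconstruction: rescale the
    geometry by the geometric magnification m and back-propagate over the
    magnification-corrected effective distance z1/m (an assumed shortcut)."""
    m = (z1 + z2) / z2
    return fresnel_propagate(h0, wavelength, pixel_size / m, -z1 / m)
```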
(48) As illustrated in
(49) The achievable lateral resolution imposed by the diffraction limit in digital in-line holography is defined as:
(50) δ = λz_1/S = λz_1/(NΔ)   (15)
where N denotes the number of pixels, Δ is the pixel size, and S = NΔ is the hologram side length. In reality, the achievable resolution is governed by the mechanical stability of the optical setup, which can be evaluated by analyzing the Fourier spectrum of a hologram. If the highest frequency u_max noted in the Fourier spectrum occurs at pixel P from the spectrum center, its coordinate is given by u_max = Δ_f·P, where
(51) Δ_f = 1/S = 1/(NΔ)
is the pixel size in the Fourier domain. Using the classical Abbe criterion given by
(52) δ = λ/(2·sin θ_max)
where θ_max is the maximum possible angle of the scattered wave, and using the expression sin θ_max = λu_max, we get:
(53) δ = 1/(2u_max) = S/(2P)   (16)
Equating the theoretical resolution estimate in Eq. (15) to the practical limit in Eq. (16), we obtain:
(54) z_0 = S²/(2λP)   (17)
as the optical object-sensor distance z_0 at which the lateral resolution intrinsic to the digital hologram is maximized.
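The practical limit in Eq. (16) can be evaluated directly from a recorded hologram; a minimal sketch follows (Python with NumPy; the 1% spectral threshold used to decide which frequencies are "noted" is an illustrative assumption):

```python
import numpy as np

def practical_resolution(hologram, pixel_size):
    """Estimate the practically achievable lateral resolution from the
    Fourier spectrum of a hologram: find the highest visible frequency
    u_max = Delta_f * P at pixel P from the spectrum center, then apply
    Eq. (16), delta = 1 / (2 * u_max)."""
    n = hologram.shape[0]  # assumes a square N x N hologram
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(hologram)))
    yy, xx = np.indices(spectrum.shape)
    r = np.hypot(xx - n // 2, yy - n // 2)      # radial pixel distance from center
    visible = spectrum > 0.01 * spectrum.max()  # hypothetical visibility threshold
    p = r[visible].max()                        # furthest significant pixel P
    delta_f = 1.0 / (n * pixel_size)            # Fourier-domain pixel size 1/(N*Delta)
    u_max = delta_f * p
    return 1.0 / (2.0 * u_max)
```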
(55) In the case of a divergent spherical wave, the optical object-sensor distance z_0 can be determined by equating the physical resolution of the system based on pixel pitch to the resolution based on the numerical aperture of the system as:
(56) σ_p/m = λ/(2·NA)   (18)
where σ_p is the physical pixel pitch,
(57) m = (z_1 + z_2)/z_2
is the geometric magnification, λ is the illumination wavelength, and NA = Nσ_p/[2√((Nσ_p/2)² + z_1²)] is the numerical aperture of the imaging system, in which N is the pixel number of the sensor. Substituting m and NA into Eq. (18) and denoting L = z_1 + z_2, we obtain:
(58)
which can be simplified to get:
(59)
Applying further approximations such that
(60)
and N²σ_p⁴/L² ≪ 4λ², the final solution for z_0 becomes:
(61)
(62) Theoretical considerations for wavefront reconstruction described in the preceding sections assume that the incident light waves emitted by the illumination source are either completely spherical or completely planar. In some instances, the light waves emitted by a divergent light source are not entirely spherical or planar but are characterized by a combination of both qualities. In one or more embodiments, the composition of the light waves can be determined by examining the characteristics of the light source, and the better-suited algorithm is selected for practical wavefront reconstruction. If both algorithms are unable to reconstruct the object at its true depth, a propagation depth calibration that is specific to the device design and illumination source can be developed for accurate wavefront reconstruction depth characterization, in accordance with one or more embodiments. Similarly, theoretical estimates of the reconstruction resolution for holograms created by perfectly planar or spherical waves may not apply accurately to a real wave that is neither completely planar nor spherical. In one or more embodiments, determining the reconstruction resolution specific to an illumination source and device design may include an experimental analysis that entails capturing and reconstructing digital holograms for various illumination source, sample, and imaging sensor distance configurations to investigate which range of distances between the components can provide a resolution that meets the application criteria. In one or more embodiments, the holographic imaging system 100a/100b may be designed and calibrated for its wavefront reconstruction algorithms to accurately obtain the true sample depth from the imaging sensor 110a/110b.
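One way to realize the propagation depth calibration described in this paragraph is to scan the numerical reconstruction depth, autofocus on the object, and compare the numerically focused depth against the physically measured object-sensor distance. A sketch follows (Python with NumPy; the gradient-energy focus metric and the single-factor calibration are illustrative choices, and the propagator is the angular-spectrum sketch shown earlier):

```python
import numpy as np

def focus_metric(field):
    """Gradient energy of the reconstructed amplitude; one of many possible
    autofocus criteria."""
    gy, gx = np.gradient(np.abs(field))
    return np.mean(gx ** 2 + gy ** 2)

def calibrate_depth(h0, wavelength, pixel_size, z_true, z_scan):
    """Scan the reconstruction depth, locate the best numerical focus, and
    return the focused depth together with a calibration factor relating it
    to the measured object-sensor distance z_true for this configuration."""
    z_scan = np.asarray(z_scan)
    scores = [focus_metric(angular_spectrum_propagate(h0, wavelength,
                                                      pixel_size, -z))
              for z in z_scan]
    z_focus = z_scan[int(np.argmax(scores))]
    return z_focus, z_true / z_focus
```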
(63) In accordance with one or more embodiments, the resolution and wavefront reconstruction algorithm can be determined based on 1) increasing distances between the sample and imaging sensor and 2) small and large distances between the source and imaging sensor. Both these elements were also explored from the perspective of developing a more ergonomic design since a larger sample-detector and source-detector distance can allow for more space to insert or retrieve a sample.
(65) In one or more embodiments, the holographic imaging system 200b may include knobs 204 to manipulate the xy-translation mechanism and the wavefront reconstruction resolution with respect to a small source-imaging sensor distance (e.g., the sum of the object-sensor distance z_1 and the object-source distance z_2, as illustrated in the accompanying figures).
(68) An estimate of the optimal object-sensor distance z_0 obtained from Eq. (21) for spherical wavefront reconstruction, at which the resolution is maximized, is found to be z_0 = −0.354 m for prototype 1 and z_0 = −2.487 m for prototype 2. Both estimates have absolute values that are much larger than the source-sensor distance permitted by each prototype design, and the meaning of the negative values is difficult to explain in physical terms. Estimating the optimal object-sensor distance z_0 for simple digital inline holographic (DIH) instruments employing divergent waves thus shows the impracticality of these theories for determining an accurate wavefront reconstruction resolution and its dependence on the source-sensor distance. The impracticality results from assumptions pertaining to ideal incident light wave properties that are not representative of incident wavefronts in real-world scenarios. Various experimental trial-and-error designs provide results that can be used to inform the design of the disclosed holographic imaging system/device, which allows for an object-sensor distance z_1 that is always less than the optimal distance z_0. A larger distance would result in decreasing wavefront reconstruction resolution.
(69) In accordance with one embodiment, the practical relevance of both the planar wave and spherical wave reconstruction theories to real incident waves emitted by a divergent light source, which are neither completely spherical nor completely planar, is investigated. Given that the true magnification of the optical waves at each slot location of the prototype device is known, based on the ratio of the true physical pixel size σ_p and its effective size σ_e in the resolution target reconstruction, the numeric values of magnification used in the spherical and planar wave reconstruction algorithms are adjusted until the sample can be numerically focused at its true depth distance from the imaging sensor.
(72) Since the numeric magnification for both the planar and spherical wave reconstruction algorithms shows an increasing trend with respect to the sample-sensor distance z_1, a correlation between the numeric magnification m_n and z_1 can be used to adapt either of the planar or spherical wave reconstruction algorithms for accurate depth reconstruction, in accordance with various embodiments. Since the incident waves, based on the deviation from the planar and spherical wave reconstruction algorithms, are inferred to be more planar than spherical in physical nature, the planar wave reconstruction algorithm can be adapted with an m_n-z_1 correlation for wavefront reconstruction of holograms produced by a divergent light source. In terms of applied theory, the magnification (m) has direct implications on the effective pixel size of the sensor, which translates to a corresponding change in the Fourier domain coordinates (u, v) used in the simulated term:
(73) exp(i(2πz_1/λ)·√(1 − (λu)² − (λv)²))
of Eq. (8) in the plane wave reconstruction algorithm. Contrary to an m_n-z_1 correlation, a correlation between m_n and the true magnification m_t, based on the ratio of the true physical pixel size σ_p and its effective size σ_e, could also be used. However, in this case, three inputs would be needed to perform an accurate reconstruction: first, using σ_p and σ_e to calculate m_t, and then using a correlation between m_n and m_t to evaluate (u, v). In one embodiment, m_t can be estimated using the geometric magnification
(74) m = (z_1 + z_2)/z_2;
however, since the incident waves are not entirely spherical, the estimate may not be accurate. Hence, using a design- and illumination-specific m_n-z_1 correlation makes accurate reconstruction much simpler.
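A sketch of how such a design- and illumination-specific m_n-z_1 correlation might be fitted and then applied inside the plane-wave algorithm is shown below (Python with NumPy; the calibration magnifications are purely illustrative values, the slot distances are those quoted later in this disclosure, and rescaling the effective pixel size is one plausible way to realize the (u, v) adjustment described above):

```python
import numpy as np

# Calibration data: numeric magnifications m_n that focused a resolution
# target at its true depth for each sample-sensor distance z_1 (meters).
z1_samples = np.array([9.4e-3, 17.9e-3, 26.4e-3])  # slot distances
mn_samples = np.array([1.05, 1.12, 1.21])          # illustrative values only

# Fit a linear m_n-z_1 correlation specific to this device and source.
slope, intercept = np.polyfit(z1_samples, mn_samples, 1)

def numeric_magnification(z1):
    return slope * z1 + intercept

def reconstruct_at_true_depth(h0, wavelength, pixel_size, z1):
    """Adapt the plane-wave algorithm to a divergent source: the numeric
    magnification rescales the effective pixel size, which shifts the
    Fourier coordinates (u, v) entering the Eq. (8) propagator. Uses the
    angular-spectrum sketch shown earlier."""
    m_n = numeric_magnification(z1)
    return angular_spectrum_propagate(h0, wavelength, pixel_size / m_n, -z1)
```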
(78) Accordingly, the lens-less system for holographic imaging, or holographic imaging system/device, is provided. The design, theory, working principle, and resolution limitations are presented for an inline lens-less digital holographic microscope that adapts planar wavefront reconstruction algorithms to operate with a divergent coherent illumination source. The lens-less design helps reduce the number of optical components needed; eliminate optical aberrations, distortions, glare, and flare; and reduce the overall device cost. The device offers three different slots for sample placement having sample-sensor distances of z_1 = 9.4 mm, 17.9 mm, and 26.4 mm, or any other suitable distances. The sample can be placed on a microscope slide or in a cuvette in the form of a solution. Substrates, such as a microscope slide, can be placed in all three slots, while the cuvette can be placed in the middle slot. The total distance between the source and the sensor is z_1 + z_2 = 53.1 mm. The digital holographic imaging system/device for three-dimensional numerical focusing enables the application of planar (or spherical) wavefront reconstruction algorithms to actual wavefronts that are neither completely spherical nor completely planar. The methodology of applying these algorithms to real-life waves using approximations based on numerical magnification and object-sensor distance correlations is a solution for the practical application of these algorithms in all digital holographic devices employing divergent wavefronts for reconstruction purposes. Digital holographic devices employing divergent light rays do not necessarily need to have a lens-less configuration. However, most divergent-ray devices are found to be lens-less, as the divergent property of the rays needed for optical magnification is inherently created by the point nature of the light source employed. Hence, optical lenses are not required to expand the beam, and objective lenses are not required for magnification purposes, if sufficient magnification can be achieved just by geometric dilution of light for the intended imaging application.
(80) In various embodiments of the present teachings, computer system 1000 can include a bus 1002 or other communication mechanism for communicating information and a processor 1004 coupled with bus 1002 for processing information. In various embodiments, computer system 1000 can also include a memory, which can be a random-access memory (RAM) 1006 or other dynamic storage device, coupled to bus 1002 for storing instructions to be executed by processor 1004. The memory can also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1004. In various embodiments, computer system 1000 can further include a read only memory (ROM) 1008 or other static storage device coupled to bus 1002 for storing static information and instructions for processor 1004. A storage device 1010, such as a magnetic disk or optical disk, can be provided and coupled to bus 1002 for storing information and instructions.
(81) In various embodiments, computer system 1000 can be coupled via bus 1002 to a display 1012, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user. An input device 1014, including alphanumeric and other keys, can be coupled to bus 1002 for communication of information and command selections to processor 1004. Another type of user input device is a cursor control 1016, such as a mouse, a trackball or cursor direction keys for communicating direction information and command selections to processor 1004 and for controlling cursor movement on display 1012. This input device 1014 typically has two degrees of freedom in two axes, a first axis (i.e., x) and a second axis (i.e., y), that allows the device to specify positions in a plane. However, it should be understood that input devices 1014 allowing for 3-dimensional (x, y and z) cursor movement are also contemplated herein. In accordance with various embodiments, components 1012/1014/1016, together or individually, can make up a control system that connects the remaining components of the computer system to the systems herein and methods conducted on such systems, and controls execution of the methods and operation of the associated system.
(82) In various embodiments, the computer system 1000 includes an output device 1018. In various embodiments, the output device 1018 can be a wireless device, a computing device, a portable computing device, a communication device, a printer, a graphical user interface (GUI), a gaming controller, a joy-stick controller, an external display, a monitor, a mixed reality device, an artificial reality device, or a virtual reality device.
(83) Consistent with certain implementations of the present teachings, results can be provided by computer system 1000 in response to processor 1004 executing one or more sequences of one or more instructions contained in memory 1006. Such instructions can be read into memory 1006 from another computer-readable medium or computer-readable storage medium, such as storage device 1010. Execution of the sequences of instructions contained in memory 1006 can cause processor 1004 to perform the processes described herein. Alternatively, hard-wired circuitry can be used in place of or in combination with software instructions to implement the present teachings. Thus, implementations of the present teachings are not limited to any specific combination of hardware circuitry and software.
(84) The term computer-readable medium (e.g., data store, data storage, etc.) or computer-readable storage medium as used herein refers to any media that participates in providing instructions to processor 1004 for execution. Such a medium can take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Examples of volatile media can include, but are not limited to, dynamic memory, such as memory 1006; examples of non-volatile media can include, but are not limited to, optical and magnetic disks, such as storage device 1010. Examples of transmission media can include, but are not limited to, coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 1002.
(85) Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other tangible medium from which a computer can read.
(86) In addition to computer-readable medium, instructions or data can be provided as signals on transmission media included in a communications apparatus or system to provide sequences of one or more instructions to processor 1004 of computer system 1000 for execution. For example, a communication apparatus may include a transceiver having signals indicative of instructions and data. The instructions and data are configured to cause one or more processors to implement the functions outlined in the disclosure herein. Representative examples of data communications transmission connections can include, but are not limited to, telephone modem connections, wide area networks (WAN), local area networks (LAN), infrared data connections, NFC connections, etc.
(87) It should be appreciated that the methodologies described herein, flow charts, diagrams and accompanying disclosure can be implemented using computer system 1000 as a standalone device or on a distributed network or shared computer processing resources such as a cloud computing network.
(88) The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware, firmware, software, or any combination thereof. For a hardware implementation, the processing unit may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
(89) In various embodiments, the methods of the present teachings may be implemented as firmware and/or a software program and applications written in conventional programming languages such as C, C++, Python, etc. If implemented as firmware and/or software, the embodiments described herein can be implemented on a non-transitory computer-readable medium in which a program is stored for causing a computer to perform the methods described above. It should be understood that the various engines described herein can be provided on a computer system, such as computer system 1000, whereby processor 1004 would execute the analyses and determinations provided by these engines, subject to instructions provided by any one of, or a combination of, memory components 1006/1008/1010 and user input provided via input device 1014.
(90) While the present teachings are described in conjunction with various embodiments, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives, modifications, and equivalents, as will be appreciated by those of skill in the art. In describing the various embodiments, the specification may have presented a method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the various embodiments.
(92) As illustrated in
(93) Furthermore, the method S100 may optionally include, at step S150, applying the reference lens-less holographic image as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(94) In various embodiments of the method S100, the stationary illumination source may include a coherent light source configured to produce a divergent coherent light, wherein the divergent coherent light is scattered by the object of interest in the sample to produce scattered light, which interferes with the undisturbed (un-scattered) divergent coherent light to produce interference patterns. In one or more embodiments, the lens-less holographic image of the sample captured by the stationary image sensor includes the interference patterns.
(95) In various embodiments of the method S100, performing the wavefront reconstruction of the lens-less holographic image of the sample may further include applying equations governing coherent light propagation for either spherical waves or planar waves in a reconstruction of the interference patterns to obtain three-dimensional details of the object.
(96) In various embodiments of the method S100, performing the wavefront reconstruction of the lens-less holographic image of the sample may further include applying equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns to produce three-dimensional details of the object.
(97) In various embodiments, the object-source distance z.sub.2 is much greater than the object-sensor distance z.sub.1. In various embodiments, a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges between 0.01 and 1. In various embodiments, a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges within 10-30 percent of 0.1, 0.2, 0.3, 0.4, or 0.5.
(98) In various embodiments, performing the wavefront reconstruction of the lens-less holographic image of the sample at step S130 may include applying equations governing coherent light propagation for either spherical waves or planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object. In one or more embodiments, performing the wavefront reconstruction of the lens-less holographic image of the sample at step S130 may include applying equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object. In one or more embodiments, performing the wavefront reconstruction of the lens-less holographic image of the sample at step S130 may further include applying a depth calibration based on a mathematical correlation between the numerical magnification m_n and the object-sensor distance z_1 that is specific to the use of either a spherical wave or a planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
(99) In various embodiments, applications of the calibrated wavefront reconstruction in producing three-dimensional details of the object may include obtaining the three-dimensional edgelines, shape, geometry, morphology, phase, and location of an object within the sample. This makes the method and imaging device relevant to the analyses of object size distribution, dispersion, orientation, structure, and motion within a test sample.
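As a toy illustration of such analyses, the sketch below localizes object pixels in the reconstructed volume (Python with NumPy; the contrast metric, the 0.3 threshold, and the function name are assumptions, and the propagator is the angular-spectrum sketch shown earlier):

```python
import numpy as np

def locate_objects(h0, wavelength, pixel_size, z_scan, threshold=0.3):
    """Reconstruct the hologram over a depth scan, take the per-pixel
    best-focus depth, and segment object pixels by amplitude contrast;
    returns (x, y, z) coordinates in physical units."""
    z_scan = np.asarray(z_scan)
    stack = np.stack([np.abs(angular_spectrum_propagate(h0, wavelength,
                                                        pixel_size, -z))
                      for z in z_scan])
    contrast = np.abs(1.0 - stack)        # deviation from the unit background
    depth_idx = contrast.argmax(axis=0)   # best-focus slice for every pixel
    objects = contrast.max(axis=0) > threshold
    ys, xs = np.nonzero(objects)
    return xs * pixel_size, ys * pixel_size, z_scan[depth_idx[ys, xs]]
```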
EMBODIMENTS
(100) Embodiment 1. A lens-less system for holographic imaging, comprising: a stationary image sensor configured to capture an image of a sample comprising an object of interest, the stationary image sensor being located at an object-sensor distance z.sub.1 from the sample on a first side of the sample; a stationary illumination source configured to illuminate the sample positioned at an object-source distance z.sub.2 from the sample on a second side opposite the first side of the sample, wherein the stationary image sensor captures a lens-less holographic image of the object illuminated with light originating from the stationary illumination source; and a processor configured to perform wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object.
(101) Embodiment 2. The lens-less system of embodiment 1, wherein the sample is disposed on a substrate, and wherein the stationary illumination source is located closer to the substrate than the sample.
(102) Embodiment 3. The lens-less system of embodiments 1 or 2, wherein the stationary image sensor captures a reference lens-less holographic image of the light originating from the stationary illumination source, and wherein the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(103) Embodiment 4. The lens-less system of embodiment 1, wherein the sample is disposed within a container, wherein the stationary image sensor captures a reference lens-less holographic image of the container without the sample, and wherein the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(104) Embodiment 5. The lens-less system of any one of embodiments 1-4, wherein the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, wherein the divergent coherent light is scattered by the object of interest in the sample to produce scattered light, which interferes with the undisturbed (un-scattered) divergent coherent light to produce interference patterns, and wherein the lens-less holographic image of the sample captured by the stationary image sensor comprises the interference patterns.
(105) Embodiment 6. The lens-less system of any one of embodiments 1-5, wherein the object-source distance z.sub.2 is much greater than the object-sensor distance z.sub.1.
(106) Embodiment 7. The lens-less system of any one of embodiments 1-5, wherein a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges between 0.01 and 1 or within 10-30 percent of 0.1, 0.2, 0.3, 0.4, or 0.5.
(107) Embodiment 8. The lens-less system of any one of embodiments 5-7, wherein the processor is further configured to apply equations governing coherent light propagation for either spherical waves or planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
(108) Embodiment 9. The lens-less system of any one of embodiments 5-7, wherein the processor is further configured to apply equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
(109) Embodiment 10. The lens-less system of embodiments 8 or 9, wherein the processor is further configured to apply a depth calibration based on a mathematical correlation between numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to the use of either spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
(110) Embodiment 11. A method of imaging, comprising: illuminating a sample comprising an object via a stationary illumination source positioned at an object-source distance z.sub.2 from the sample on a first side of the sample; capturing a lens-less holographic image of the sample illuminated with light originating from the stationary illumination source via a stationary image sensor positioned at an object-sensor distance z.sub.1 from the sample on a second side opposite the first side of the sample; and performing wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object.
(111) Embodiment 12. The method of embodiment 11, wherein the sample is disposed on a substrate, and wherein the stationary illumination source is located closer to the substrate than the sample.
(112) Embodiment 13. The method of embodiments 11 or 12, further comprising: capturing a reference lens-less holographic image of the light originating from the stationary illumination source; and applying the reference lens-less holographic image as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(113) Embodiment 14. The method of embodiment 11, wherein the sample is disposed within a container, the method further comprising: capturing a reference lens-less holographic image of the container without the sample; and applying the reference lens-less holographic image as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(114) Embodiment 15. The method of any one of embodiments 11-14, wherein the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, wherein the divergent coherent light is scattered by the object of interest in the sample to produce scattered light, which interferes with the undisturbed (un-scattered) divergent coherent light to produce interference patterns, and wherein the lens-less holographic image of the sample captured by the stationary image sensor comprises the interference patterns.
(115) Embodiment 16. The method of any one of embodiments 11-15, wherein the object-source distance z.sub.2 is much greater than the object-sensor distance z.sub.1, or a ratio of the object-sensor distance z.sub.1 to the object-source distance z.sub.2 ranges between 0.01 and 1 or within 10-30 percent of 0.1, 0.2, 0.3, 0.4, or 0.5.
(116) Embodiment 17. The method of embodiments 15 or 16, wherein performing the wavefront reconstruction of the lens-less holographic image of the sample comprises applying equations governing coherent light propagation for either spherical waves or planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
(117) Embodiment 18. The method of embodiments 15 or 16, wherein performing the wavefront reconstruction of the lens-less holographic image of the sample comprises applying equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns in producing the three-dimensional details of the object.
(118) Embodiment 19. The method of embodiments 17 or 18, wherein performing the wavefront reconstruction of the lens-less holographic image of the sample further comprises applying a depth calibration based on a mathematical correlation between numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to the use of either spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.
(119) Embodiment 20. A holographic imaging device, comprising: a stationary illumination source configured to illuminate a sample comprising an object, the stationary illumination source positioned at an object-source distance z.sub.2 from the sample on a first side of the sample; and a stationary imaging module configured to capture a lens-less holographic image of the sample illuminated with light originating from the stationary illumination source, the stationary imaging module being located at an object-sensor distance z.sub.1 from the sample on a second side opposite the first side of the sample, wherein the stationary imaging module is further configured to perform wavefront reconstruction of the lens-less holographic image of the sample to produce three-dimensional details of the object.
(120) Embodiment 21. The holographic imaging device of embodiment 20, wherein: the stationary imaging module is further configured to capture a reference lens-less holographic image of the light originating from the stationary illumination source, and the reference lens-less holographic image is used as a base line to reduce image artifacts and/or remove noise from the lens-less holographic image of the sample.
(121) Embodiment 22. The holographic imaging device of embodiments 20 or 21, wherein: the stationary illumination source comprises a coherent light source configured to produce a divergent coherent light, the divergent coherent light is scattered by the object of interest in the sample to produce scattered light, which interferes with the undisturbed (un-scattered) divergent coherent light to produce interference patterns, and the lens-less holographic image of the sample captured by the stationary imaging module comprises the interference patterns.
(122) Embodiment 23. The holographic imaging device of embodiment 22, wherein: the stationary imaging module is further configured to apply equations governing coherent light propagation for both spherical waves and planar waves in a reconstruction of the interference patterns to produce the three-dimensional details of the object, and the interference patterns in the lens-less holographic image of the sample are processed by using a depth calibration based on a mathematical correlation between numerical magnification m.sub.n and the object-sensor distance z.sub.1 that is specific to the use of either spherical wave or planar wave reconstruction algorithm and a location of the stationary image sensor and a location of the stationary illumination source with respect to the sample.