3D DIFFRACTION TOMOGRAPHY MICROSCOPY IMAGING METHOD BASED ON LED ARRAY CODED ILLUMINATION

20210372916 · 2021-12-02

Assignee

Inventors

Cpc classification

International classification

Abstract

The present invention discloses a three-dimensional diffraction tomography microscopy imaging method based on LED array coded illumination. First, the raw intensity images are acquired: three sets of intensity image stacks are captured at different out-of-focus positions by moving the stage or using an electrically tunable lens. Then, after a fourth intensity image stack of the object under test has been acquired at different out-of-focus positions, the three-dimensional phase transfer function of the microscopy imaging system under illumination of arbitrary shape is derived. From it, the three-dimensional phase transfer functions of the microscopy system under circular and annular illumination with different coherence coefficients are obtained, and the three-dimensional scattering potential function is reconstructed by frequency-domain deconvolution. An inverse Fourier transform of the three-dimensional scattering potential function, followed by conversion of the scattering potential into a refractive index distribution, yields the quantitative three-dimensional refractive index distribution of the test object. The invention realizes high-resolution, high signal-to-noise ratio 3D diffraction tomography microscopic imaging of cells, small biological tissues and other samples.

Claims

1. A method of three-dimensional diffraction tomography microscopy imaging based on LED array coded illumination, characterized by the following steps: step 1: acquisition of the original intensity images: with the thick object under test in focus, changing the LED array coding so that the illumination source is a circle with coherence coefficient S1, S2 or S3, and moving the stage or using the electrically tunable lens to acquire three sets of intensity image stacks at different out-of-focus positions, I.sub.s1.sup.1, I.sub.s1.sup.2, . . . I.sub.s1.sup.i, . . . I.sub.s1.sup.N; I.sub.s2.sup.1, I.sub.s2.sup.2, . . . I.sub.s2.sup.i, . . . I.sub.s2.sup.N; and I.sub.s3.sup.1, I.sub.s3.sup.2, . . . I.sub.s3.sup.i, . . . I.sub.s3.sup.N; step 2: changing the LED array coding so that the illumination pattern is an annulus with coherence factor S4, and then moving the stage or using the electrically tunable lens to capture the intensity image stack of the object under test at different out-of-focus positions, I.sub.s4.sup.1, I.sub.s4.sup.2, . . . I.sub.s4.sup.i, . . . I.sub.s4.sup.N; step 3: deriving the three-dimensional phase transfer function of the microscopy imaging system under illumination of arbitrary shape, extending the three-dimensional transfer function model of a tilted coherent point source to three-dimensional transfer function models of partially coherent circular illumination and annular illumination, and thereby obtaining the three-dimensional phase transfer functions of the microscopy system under circular and annular illumination with different coherence parameters; step 4: 3D diffraction tomography quantitative refractive index deconvolution reconstruction: applying a 3D Fourier transform to the four acquired groups of intensity image stacks to obtain the 3D spectra for the four illumination cases, summing the four 3D spectra, and dividing the sum in the frequency domain by the sum of the absolute values of the four 3D phase transfer functions to obtain the 3D scattering potential function; step 5: quantitative three-dimensional refractive index distribution of the measured object: applying an inverse Fourier transform to the three-dimensional scattering potential function and converting the scattering potential into a refractive index distribution, whereby the quantitative three-dimensional refractive index distribution of the measured object is obtained.

2. The method according to claim 1, characterized in that in step 2 the annular illumination scheme is introduced into a conventional bright-field microscope by LED array coded illumination, and a series of intensity image stacks along the axial direction are captured under the annular illumination pattern: I.sub.s4.sup.1, I.sub.s4.sup.2, . . . I.sub.s4.sup.i, . . . I.sub.s4.sup.N.

3. The method according to claim 2, characterized in that the LED array coding is changed so that the circular illumination pattern grows from small to large, while the fourth group changes the illumination pattern to an annular pattern; then, under each illumination condition, the electrically tunable lens is used to collect the intensity image stack of the object under test at different out-of-focus positions; the first row is the LED array coded illumination pattern, the second row is the intensity image acquired at axial position Z1, the third row is the intensity image acquired at axial position Z2, and so on; the intensity image stack of the object under test at different axial out-of-focus positions is acquired by changing the electrically tunable lens.

4. The method according to claim 1, characterized in that step 3 is realized in the following way: the absorbance n.sub.a(r) and refractive index n.sub.p(r) of the three-dimensional object correspond to the imaginary and real parts of the complex refractive index n(r), respectively; the relationship between the complex refractive index n(r) of the object and the refractive index n.sub.m of the surrounding medium is expressed as the three-dimensional scattering potential V(r)=k.sub.0.sup.2[n.sup.2(r)−n.sub.m.sup.2], where r is the three-dimensional spatial variable, k.sub.0 is the wavenumber corresponding to the vacuum wavelength, and n.sub.m is the refractive index of the medium in which the object is located; in a conventional transmission bright-field microscopy system, the intensity image I(r) measured for a three-dimensional object can be expressed as
I(r)=B+P(r)⊗H.sub.P(r)+A(r)⊗H.sub.A(r), where B is the captured transmitted-light component, A(r) and P(r) are the imaginary and real parts of the three-dimensional scattering potential of the object, respectively, and H.sub.A(r) and H.sub.P(r) are the point spread functions of the imaging system for the absorption and phase parts of the object, respectively; a three-dimensional Fourier transform of the above equation yields the three-dimensional Fourier spectrum of the captured intensity map;
Ĩ(ρ)=Bδ(ρ)+{tilde over (P)}(ρ)T.sub.P(ρ)+Ã(ρ)T.sub.A(ρ), where Bδ(ρ) is the zero-frequency component of the corresponding intensity image, {tilde over (P)}(ρ) is the spectrum of the phase component of the scattering potential and T.sub.P(ρ) is its 3D phase transfer function, while Ã(ρ) is the spectrum of the absorption component and T.sub.A(ρ) is the corresponding 3D absorption transfer function; the 3D transfer function corresponding to the phase component is

$$T_P(u,v,w)=\frac{j\lambda}{4\pi}\iint \tilde{P}\!\left(u'+\frac{u}{2},v'+\frac{v}{2}\right)\tilde{P}^{*}\!\left(u'-\frac{u}{2},v'-\frac{v}{2}\right)\times\left[\tilde{S}\!\left(u'+\frac{u}{2},v'+\frac{v}{2}\right)-\tilde{S}\!\left(u'-\frac{u}{2},v'-\frac{v}{2}\right)\right]\times\delta\!\left[w+\sqrt{\lambda^{-2}-\left(u'-\frac{u}{2}\right)^{2}-\left(v'-\frac{v}{2}\right)^{2}}-\sqrt{\lambda^{-2}-\left(u'+\frac{u}{2}\right)^{2}-\left(v'+\frac{v}{2}\right)^{2}}\right]du'\,dv'$$

where ρ=(u,v,w), λ is the wavelength of the illumination source, {tilde over (S)}(u,v) is the light source distribution function, and {tilde over (P)}(u,v) and {tilde over (P)}*(u,v) are a pair of conjugate pupil functions defined by the microscope objective, whose absolute value can be expressed as

$$|\tilde{P}(u)|=\begin{cases}1, & |u|\le\rho_P\\ 0, & |u|>\rho_P\end{cases}$$

where ρ.sub.P is the normalized cutoff frequency of the pupil of the microscope objective; for a coherent point source at any point on the source plane, i.e.
{tilde over (S)}(u,v)=δ(u−ρ.sub.s,v), the corresponding three-dimensional phase transfer function for the source can be obtained by substituting the source function into the above equation as

$$T_P(u,v,w)=\frac{j\lambda}{4\pi}\tilde{P}^{*}(\rho_s-u,-v)\,\delta\!\left[w-\sqrt{\lambda^{-2}-\rho_s^{2}}+\sqrt{\lambda^{-2}-(\rho_s-u)^{2}-v^{2}}\right]-\frac{j\lambda}{4\pi}\tilde{P}(\rho_s+u,v)\,\delta\!\left[w+\sqrt{\lambda^{-2}-\rho_s^{2}}-\sqrt{\lambda^{-2}-(\rho_s+u)^{2}-v^{2}}\right]$$

the above three-dimensional transfer function splits into the two terms $\delta[w-\sqrt{\lambda^{-2}-\rho_s^{2}}+\sqrt{\lambda^{-2}-(\rho_s-u)^{2}-v^{2}}]$ and $\delta[w+\sqrt{\lambda^{-2}-\rho_s^{2}}-\sqrt{\lambda^{-2}-(\rho_s+u)^{2}-v^{2}}]$, that is, the two spherical shells shifted by the tilted illumination in three-dimensional frequency space, $(w+\sqrt{\lambda^{-2}-\rho_s^{2}})^{2}+(\rho_s+u)^{2}+v^{2}=\lambda^{-2}$ and $(w-\sqrt{\lambda^{-2}-\rho_s^{2}})^{2}+(\rho_s-u)^{2}+v^{2}=\lambda^{-2}$, i.e. the defining equations of the Ewald spherical shell; when the light source is a traditional circular pattern, i.e.

$$S(u)=\begin{cases}1, & |u|\le\rho_s\\ 0, & |u|>\rho_s\end{cases}$$

substituting this expression for the light source into the three-dimensional phase transfer function yields the three-dimensional phase transfer function corresponding to a partially coherent circular light source at different coherence factors ρ.sub.s; when the light source is annular, it is defined as

$$S(u)=\sum_{i=0}^{N}\delta(u-u_i),\quad |u_i|=\rho_p$$

and the form of the transfer function under annular illumination is obtained by the same substitution; by extending the three-dimensional transfer function model of the tilted coherent point source to the circular partially coherent illumination and annular illumination models, the three-dimensional phase transfer functions of the microscope system under circular and annular illumination with different coherence parameters are obtained.

5. The method according to claim 1, characterized in that in step 4: the intensity stacks I.sub.s1(r), I.sub.s2(r) and I.sub.s3(r) captured under the circular light sources with coherence coefficients S1, S2 and S3 are Fourier transformed to obtain their corresponding Fourier spectra Ĩ.sub.s1(ζ), Ĩ.sub.s2(ζ) and Ĩ.sub.s3(ζ); the intensity stack I.sub.s4(r) captured under the annular light source is likewise transformed to obtain its Fourier spectrum Ĩ.sub.s4(ζ); the sum of the Fourier spectra of the four intensity stacks is then divided by the sum of the absolute values of the corresponding four three-dimensional phase transfer functions T.sub.P1, T.sub.P2, T.sub.P3 and T.sub.P4; the sum of the Fourier spectra of the four intensity stacks is
Ĩ(ζ)=Ĩ.sub.s1(ζ)+Ĩ.sub.s2(ζ)+Ĩ.sub.s3(ζ)+Ĩ.sub.s4(ζ) where Ĩ.sub.s1(ζ), Ĩ.sub.s2(ζ), Ĩ.sub.s3(ζ) and Ĩ.sub.s4(ζ) are the Fourier spectra of intensity maps obtained by Fourier transforming the intensity stacks captured under different illuminated light sources with coherence parameters of S1, S2, S3, and S4 respectively; the sum of the absolute values of the four three-dimensional transfer functions is
T.sub.P(ζ)=|T.sub.P1(ζ)|+|T.sub.P2(ζ)|+|T.sub.P3(ζ)|+|T.sub.P4(ζ)| the Fourier spectrum of the three-dimensional scattering potential function is obtained by dividing the sum of the four intensity-stack Fourier spectra by the sum of the absolute values of the four corresponding three-dimensional phase transfer functions T.sub.P1(ζ), T.sub.P2(ζ), T.sub.P3(ζ) and T.sub.P4(ζ).

6. The method according to claim 1, characterized in that step 5 is achieved by performing a three-dimensional inverse Fourier transform of the three-dimensional scattering potential function:

$$P(r)=\mathcal{F}^{-1}\!\left(\frac{\tilde{I}_{s1}(\zeta)+\tilde{I}_{s2}(\zeta)+\tilde{I}_{s3}(\zeta)+\tilde{I}_{s4}(\zeta)}{|T_{P1}(\zeta)|+|T_{P2}(\zeta)|+|T_{P3}(\zeta)|+|T_{P4}(\zeta)|}\right)$$

from P(r), using the scattering potential formula, the quantitative three-dimensional refractive index distribution of the object under test is obtained.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0014] FIG. 1 is a schematic diagram of the three-dimensional diffraction tomography microscopy imaging system of the present invention.

[0015] FIG. 2 is a schematic diagram of the four image stacks of the present invention acquired under four different LED array illumination sources.

[0016] FIG. 3 is a two-dimensional schematic diagram of the corresponding three-dimensional transfer functions under four different LED array lighting sources of the present invention.

[0017] FIG. 4 is a flow chart of the 3D diffraction tomography quantitative refractive index deconvolution reconstruction method of the present invention.

[0018] FIG. 5 is a graph of the 3D imaging results of the present invention on micro-polystyrene beads, unstained Pandorina morum and Hela cells.

DESCRIPTION OF THE PREFERRED EMBODIMENT

[0019] The three-dimensional diffraction tomography microscopy imaging method based on LED array encoded illumination of the present invention proceeds as follows.

[0020] Step 1, build the three-dimensional diffraction tomography microscopy imaging system: with reference to FIG. 1, the microscopy imaging system includes an LED array coded illumination source, a microscope objective, a tube lens, a plane mirror, an electrically tunable lens control module, a camera and a computer, where the computer is connected to the electrically tunable lens control module and the camera via signal lines, respectively. The illumination source is an LED array, and the LED array coding can be changed to achieve different illumination patterns. Image acquisition is performed by a CMOS camera, and the acquired images are passed to the computer for calculation and processing. The axial scan of the sample under test is driven by the electrically tunable lens control module to acquire the image stack, with an axial scan step of 0.1 μm. Four groups of image stacks are measured under different LED array coded illumination (three circular illumination patterns and one annular pattern), each group comprising 100 images at a resolution of 400×400. The x, y and z spatial sampling rates are 0.065 μm, 0.065 μm and 0.1 μm, respectively. Under the circular and annular illumination conditions with different coherence parameters, the image stack acquisition time is 15 ms, the data processing time is 10 ms, and the camera exposure time is 30 ms. The computer is equipped with MATLAB, and after acquisition all image processing is realized by code written in MATLAB.

[0021] Step 2, acquisition of the original intensity images: with the thick object sample under test in focus, the LED array coding is changed so that the illumination source is a circle with coherence parameter S1, S2 or S3, and three sets of intensity image stacks at different out-of-focus positions, I.sub.s1.sup.1, I.sub.s1.sup.2, . . . , I.sub.s1.sup.i, . . . I.sub.s1.sup.N; I.sub.s2.sup.1, I.sub.s2.sup.2, . . . , I.sub.s2.sup.i, . . . I.sub.s2.sup.N; and I.sub.s3.sup.1, I.sub.s3.sup.2, . . . , I.sub.s3.sup.i, . . . I.sub.s3.sup.N, are acquired by moving the stage or using the electrically tunable lens. The intensity image stack of the object under test at different out-of-focus positions under the fourth illumination pattern with coherence parameter S4, I.sub.s4.sup.1, I.sub.s4.sup.2, . . . , I.sub.s4.sup.i, . . . I.sub.s4.sup.N, is acquired in the same way, so that four different image stacks based on LED array coded illumination are obtained by the CMOS camera; that is, the annular illumination scheme is introduced into the conventional bright-field microscope by LED array coded illumination, and a series of intensity image stacks along the axial direction are captured under the annular illumination pattern, I.sub.s4.sup.1, I.sub.s4.sup.2, . . . , I.sub.s4.sup.i, . . . I.sub.s4.sup.N.

[0022] FIG. 2 shows the axial image stacks under four different LED array codings, respectively. The LED array coding is changed so that the circular illumination pattern grows from small to large, and the fourth group changes the illumination pattern to an annular pattern; then, under each illumination condition, the intensity image stack of the object under test at different out-of-focus positions is collected through the electrically tunable lens. The first row is the LED array coded illumination pattern, the second row is the intensity image acquired at axial position Z1, the third row is the intensity image acquired at axial position Z2, and so on; the intensity image stack of the object under test at different axial out-of-focus positions is acquired by changing the electrically tunable lens. For each LED array coded illumination pattern, 100 intensity maps with a resolution of 400×400 are acquired.

[0023] In step 3, the three-dimensional phase transfer function of the microscopy imaging system under illumination of arbitrary shape is derived: by extending the three-dimensional transfer function model of a tilted coherent point source to the three-dimensional transfer function models of partially coherent illumination and annular illumination, the three-dimensional phase transfer functions of the microscopy system under circular and annular illumination with different coherence parameters are obtained. The absorbance n.sub.a(r) and refractive index n.sub.p(r) of the three-dimensional object correspond to the imaginary and real parts of the complex refractive index n(r), respectively, and the relationship between the complex refractive index n(r) of the object and the refractive index n.sub.m of the surrounding medium can be expressed as the three-dimensional scattering potential

[00001] $$V(r)=k_0^{2}\left[n^{2}(r)-n_m^{2}\right]$$

where r is the three-dimensional spatial variable, k.sub.0 is the wavenumber corresponding to the vacuum wavelength, and n.sub.m is the refractive index of the medium in which the object is located.
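The conversion between refractive index and scattering potential used throughout the method can be sketched in a few lines. The patent's processing is done in MATLAB; the following is an illustrative NumPy translation, and the function names and the bead phantom are my own assumptions, not the patent's code.

```python
import numpy as np

def scattering_potential(n, wavelength, n_m):
    """Scattering potential V(r) = k0^2 * (n(r)^2 - n_m^2)."""
    k0 = 2 * np.pi / wavelength          # vacuum wavenumber
    return k0**2 * (n**2 - n_m**2)

def refractive_index(V, wavelength, n_m):
    """Invert V(r) back to n(r); this is the conversion used in step 5."""
    k0 = 2 * np.pi / wavelength
    return np.sqrt(n_m**2 + V / k0**2)

# a 3D refractive-index phantom: a polystyrene-like bead (n = 1.59) in
# a medium of n_m = 1.33, illuminated at an assumed 500 nm wavelength
n_m, wl = 1.33, 0.5e-6
z, y, x = np.mgrid[-16:16, -16:16, -16:16]
n = np.where(x**2 + y**2 + z**2 < 8**2, 1.59, n_m)

V = scattering_potential(n, wl, n_m)
n_back = refractive_index(V, wl, n_m)    # round-trips back to the phantom
```

The round trip V → n recovers the phantom exactly, which is the consistency the deconvolution in steps 4-5 relies on.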

[0024] In a conventional transmission bright-field microscopy system, the intensity image I(r) measured for a three-dimensional object can be expressed as


I(r)=B+P(r)⊗H.sub.P(r)+A(r)⊗H.sub.A(r)

where B is the captured transmitted-light component, A(r) and P(r) are the imaginary and real parts of the object's 3D scattering potential, respectively, and H.sub.A(r) and H.sub.P(r) are the point spread functions of the imaging system for the absorption and phase parts of the object, respectively.
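The real-space image model above can be sketched as a small forward simulator, with the 3D convolutions evaluated as Fourier-domain products (which assumes periodic boundaries). All names and the stand-in data here are illustrative, not from the patent.

```python
import numpy as np

def forward_intensity(B, P, H_P, A, H_A):
    """I(r) = B + P(r) (*) H_P(r) + A(r) (*) H_A(r), where (*) is the 3D
    convolution, computed as a pointwise product in the Fourier domain."""
    conv = lambda a, h: np.real(np.fft.ifftn(np.fft.fftn(a) * np.fft.fftn(h)))
    return B + conv(P, H_P) + conv(A, H_A)

# sanity check: with delta-function PSFs the model reduces to I = B + P + A
rng = np.random.default_rng(0)
P = rng.random((4, 8, 8))                     # stand-in phase part
A = rng.random((4, 8, 8))                     # stand-in absorption part
H = np.zeros((4, 8, 8)); H[0, 0, 0] = 1.0     # discrete Dirac delta
I = forward_intensity(2.0, P, H, A, H)
```

With delta PSFs the output equals B + P + A, which verifies the convolution wiring before real transfer functions are substituted.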

[0025] A three-dimensional Fourier transform of the above equation yields the three-dimensional Fourier spectrum of the captured intensity map;


Ĩ(ρ)=Bδ(ρ)+{tilde over (P)}(ρ)T.sub.P(ρ)+Ã(ρ)T.sub.A(ρ)

where Bδ(ρ) is the zero-frequency component of the corresponding intensity image. {tilde over (P)}(ρ) is the spectrum of the phase component of the scattering potential and T.sub.P(ρ) is its three-dimensional phase transfer function, while Ã(ρ) is the spectrum of the absorption component and T.sub.A(ρ) is the corresponding three-dimensional absorption transfer function. The three-dimensional transfer function corresponding to the phase component is

[00002] $$\tilde{T}_P(u,v,w)=\frac{j\lambda}{4\pi}\iint\tilde{P}\!\left(u'+\frac{u}{2},v'+\frac{v}{2}\right)\tilde{P}^{*}\!\left(u'-\frac{u}{2},v'-\frac{v}{2}\right)\times\left[\tilde{S}\!\left(u'+\frac{u}{2},v'+\frac{v}{2}\right)-\tilde{S}\!\left(u'-\frac{u}{2},v'-\frac{v}{2}\right)\right]\times\delta\!\left[w+\sqrt{\lambda^{-2}-\left(u'-\frac{u}{2}\right)^{2}-\left(v'-\frac{v}{2}\right)^{2}}-\sqrt{\lambda^{-2}-\left(u'+\frac{u}{2}\right)^{2}-\left(v'+\frac{v}{2}\right)^{2}}\right]du'\,dv'$$

where ρ=(u,v,w), λ is the wavelength of the illumination source, {tilde over (S)}(u,v) is the light source distribution function, and {tilde over (P)}(u,v) and {tilde over (P)}*(u,v) are a pair of conjugate pupil functions defined by the microscope objective, whose absolute value can be expressed as

[00003] $$|\tilde{P}(u)|=\begin{cases}1, & |u|\le\rho_P\\ 0, & |u|>\rho_P\end{cases}$$

where ρ.sub.P is the normalized cutoff frequency of the pupil of the microscope objective.
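On a discrete frequency grid the binary pupil can be generated directly. The numerical aperture, wavelength and grid size below are illustrative assumptions; only the 0.065 μm lateral sampling is taken from step 1.

```python
import numpy as np

def pupil(u, v, rho_p):
    """Binary pupil |P(u,v)|: 1 inside the objective cutoff rho_p, else 0."""
    return (np.hypot(u, v) <= rho_p).astype(float)

# normalized cutoff rho_p = NA / wavelength (cycles per micron)
NA, wl = 0.75, 0.5                    # assumed objective NA, wavelength in um
rho_p = NA / wl                       # = 1.5 cycles/um
f = np.fft.fftshift(np.fft.fftfreq(256, d=0.065))   # 0.065 um lateral sampling
u, v = np.meshgrid(f, f)
P = pupil(u, v, rho_p)                # (256, 256) aperture mask
```

The same mask, evaluated at shifted coordinates, gives the P̃(ρ_s ± u, ±v) factors that appear in the point-source transfer function below.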

[0026] For a coherent point source at any point on the source plane, that is {tilde over (S)}(u,v)=δ(u−ρ.sub.s,v). Substituting this light source function into the above equation, the corresponding three-dimensional phase transfer function for this light source is obtained as

[00004] $$\tilde{T}_P(u,v,w)=\frac{j\lambda}{4\pi}\tilde{P}^{*}(\rho_s-u,-v)\,\delta\!\left[w-\sqrt{\lambda^{-2}-\rho_s^{2}}+\sqrt{\lambda^{-2}-(\rho_s-u)^{2}-v^{2}}\right]-\frac{j\lambda}{4\pi}\tilde{P}(\rho_s+u,v)\,\delta\!\left[w+\sqrt{\lambda^{-2}-\rho_s^{2}}-\sqrt{\lambda^{-2}-(\rho_s+u)^{2}-v^{2}}\right]$$

The above three-dimensional transfer function can be divided into

[00005] $$\delta\!\left[w-\sqrt{\lambda^{-2}-\rho_s^{2}}+\sqrt{\lambda^{-2}-(\rho_s-u)^{2}-v^{2}}\right]$$

and

[00006] $$\delta\!\left[w+\sqrt{\lambda^{-2}-\rho_s^{2}}-\sqrt{\lambda^{-2}-(\rho_s+u)^{2}-v^{2}}\right],$$

that is, the two spherical shells shifted by the tilted illumination in three-dimensional frequency space,

[00007] $$\left(w+\sqrt{\lambda^{-2}-\rho_s^{2}}\right)^{2}+(\rho_s+u)^{2}+v^{2}=\lambda^{-2}$$

and

[00008] $$\left(w-\sqrt{\lambda^{-2}-\rho_s^{2}}\right)^{2}+(\rho_s-u)^{2}+v^{2}=\lambda^{-2},$$

that is, the defining equations of the Ewald spherical shell.
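The two Ewald shells can be rendered numerically for a single tilted point source: on a sampled (u, v, w) grid the delta function in w is approximated by depositing each in-pupil (u, v) sample into its nearest axial frequency bin. This binning scheme, the grid, and all parameter values are my own simplifications; they follow the equation above but are not the patent's implementation.

```python
import numpy as np

def point_source_wotf(rho_s, wl, rho_p, f, w):
    """3D phase transfer function of a coherent point source at (rho_s, 0):
    two shifted Ewald shells with amplitudes -/+ j*lambda/(4*pi)."""
    u, v = np.meshgrid(f, f)
    T = np.zeros((len(w), len(f), len(f)), dtype=complex)
    lam2 = wl ** -2
    w0 = np.sqrt(lam2 - rho_s ** 2)            # axial offset of both shells
    for s in (+1, -1):                         # s=+1: P-term, s=-1: P*-term
        rad = lam2 - (rho_s + s * u) ** 2 - v ** 2
        inside = (rad > 0) & (np.hypot(rho_s + s * u, v) <= rho_p)
        w_shell = s * (np.sqrt(rad[inside]) - w0)   # shell height at each (u, v)
        kk = np.abs(w[:, None] - w_shell[None, :]).argmin(axis=0)
        ii, jj = np.nonzero(inside)
        np.add.at(T, (kk, ii, jj), -s * 1j * wl / (4 * np.pi))
    return T

wl, rho_p = 0.5, 1.5                           # wavelength (um), pupil cutoff
f = np.fft.fftshift(np.fft.fftfreq(64, d=0.065))
w = np.fft.fftshift(np.fft.fftfreq(64, d=0.1))
T = point_source_wotf(rho_s=0.75, wl=wl, rho_p=rho_p, f=f, w=w)
```

Because the phase transfer function is built from ±jλ/4π contributions, the resulting array is purely imaginary, matching the j prefactor in equation [00004].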

[0027] When the light source is a traditional circular pattern, i.e.

[00009] $$S(u)=\begin{cases}1, & |u|\le\rho_s\\ 0, & |u|>\rho_s\end{cases}$$

substituting this expression for the light source into the three-dimensional phase transfer function, the corresponding three-dimensional phase transfer function for a partially coherent circular light source with different coherence factors ρ.sub.s can be obtained.

[0028] When the light source is annular, it can be defined as

[00010] $$S(u)=\sum_{i=0}^{N}\delta(u-u_i),\quad |u_i|=\rho_p$$

The form of the transfer function under annular illumination is obtained by substituting this source into the three-dimensional phase transfer function. By extending the 3D transfer function model from the tilted coherent point source to the circular partially coherent illumination and annular illumination models, the 3D phase transfer functions of the microscope system under circular and annular illumination with different coherence parameters are obtained.
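The source codings used in steps 1-2 (three discs with increasing coherence factor plus one annulus) can be written as frequency-plane masks. The specific disc radii and ring width below are illustrative assumptions, not values stated in the patent.

```python
import numpy as np

def led_pattern(kind, rho, f, width=0.1):
    """S(u,v) on the frequency grid: a filled disc of radius rho ('circular')
    or a thin ring with outer radius rho ('annular')."""
    u, v = np.meshgrid(f, f)
    r = np.hypot(u, v)
    if kind == "circular":
        return (r <= rho).astype(float)
    if kind == "annular":
        return ((r >= rho - width) & (r <= rho)).astype(float)
    raise ValueError(f"unknown pattern kind: {kind}")

f = np.fft.fftshift(np.fft.fftfreq(128, d=0.065))
rho_p = 1.5                                   # objective cutoff (cycles/um)
# three circular discs (coherence factors S1 < S2 < S3) and one annulus
sources = [led_pattern("circular", s * rho_p, f) for s in (0.3, 0.6, 0.9)]
sources.append(led_pattern("annular", rho_p, f, width=0.15))
```

Each mask plays the role of S(u,v) in the transfer-function integral; the annulus concentrates the source at the pupil edge, which is what yields the low-frequency enhancement discussed for FIG. 3.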

[0029] FIG. 3 is a two-dimensional schematic of the three-dimensional phase transfer function in the u-w plane under the different LED array coded illuminations. From left to right, the circular illumination pattern under LED array coding increases from small to large, and the last group is annular illumination. As seen in the first three sets of result plots, under circular illumination the high-frequency information in the three-dimensional phase transfer function is compensated as the coherence parameter increases. However, as the illumination numerical aperture increases, the high-frequency information is compensated while the low-frequency information is weakened. The result in panel (d) therefore uses annular illumination to achieve both high-frequency signal compensation and low-frequency signal enhancement, which illustrates that the three-dimensional phase transfer function can be accurately derived under annular illumination and demonstrates the feasibility and accuracy of this annular illumination method.

[0030] Step 4, three-dimensional diffraction tomography quantitative refractive index deconvolution reconstruction: the four acquired groups of intensity image stacks are subjected to a three-dimensional Fourier transform to obtain the three-dimensional spectra under the four illumination cases. The four sets of 3D spectra are summed and then divided by the sum of the absolute values of the four 3D phase transfer functions in the frequency domain to obtain the 3D scattering potential function.

[0031] The intensity stacks I.sub.s1(r), I.sub.s2(r) and I.sub.s3(r), captured under the circular light sources with coherence parameters S1, S2 and S3, and the intensity stack I.sub.s4(r), captured under the annular light source with coherence parameter S4, are Fourier transformed to obtain the Fourier spectra of the intensity maps Ĩ.sub.s1(ζ), Ĩ.sub.s2(ζ), Ĩ.sub.s3(ζ) and Ĩ.sub.s4(ζ). The sum of the Fourier spectra of the four intensity stacks is then divided by the sum of the absolute values of the four 3D phase transfer functions T.sub.P1, T.sub.P2, T.sub.P3 and T.sub.P4. The sum of the four intensity-stack Fourier spectra is


Ĩ(ζ)=Ĩ.sub.s1(ζ)+Ĩ.sub.s2(ζ)+Ĩ.sub.s3(ζ)+Ĩ.sub.s4(ζ)

where Ĩ.sub.s1(ζ), Ĩ.sub.s2(ζ), Ĩ.sub.s3(ζ) and Ĩ.sub.s4(ζ) are the Fourier spectra of intensity maps obtained by Fourier transforming the intensity stacks captured under different illuminated light sources with coherence parameters of S1, S2, S3, and S4 respectively.
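The spectrum summation above is linear, so summing the four 3D FFTs equals the FFT of the summed stacks; a quick sketch with stand-in random stacks (not real data):

```python
import numpy as np

rng = np.random.default_rng(0)
# four stand-in intensity stacks (z, y, x), as captured under S1..S4
stacks = [rng.random((8, 32, 32)) for _ in range(4)]

# 3D Fourier spectrum of each stack, then the sum used in step 4
spectra = [np.fft.fftn(I) for I in stacks]
I_sum = np.sum(spectra, axis=0)
```

By linearity of the Fourier transform, I_sum equals the transform of the summed stacks, so in practice the stacks can be summed first and transformed once.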

[0032] The sum of the absolute values of the four three-dimensional transfer functions is


T.sub.P(ζ)=|T.sub.P1(ζ)|+|T.sub.P2(ζ)|+|T.sub.P3(ζ)|+|T.sub.P4(ζ)|

where T.sub.P1(ζ), T.sub.P2(ζ), T.sub.P3(ζ) and T.sub.P4(ζ) are the three-dimensional phase transfer functions corresponding to the four different illumination coherence parameters. The sum of the four intensity Fourier spectra is divided by the sum of the absolute values of the corresponding four three-dimensional phase transfer functions to obtain the Fourier spectrum of the three-dimensional scattering potential function of the measured object.

[0033] Step 5: Quantitative three-dimensional refractive index distribution of the measured object. The inverse Fourier transform is performed on the three-dimensional scattering potential function of the object under test, and the scattering potential function is converted into the refractive index distribution to obtain the quantitative three-dimensional refractive index distribution of the object under test.

[00011] $$P(r)=\mathcal{F}^{-1}\!\left(\frac{\tilde{I}_{s1}(\zeta)+\tilde{I}_{s2}(\zeta)+\tilde{I}_{s3}(\zeta)+\tilde{I}_{s4}(\zeta)}{|T_{P1}(\zeta)|+|T_{P2}(\zeta)|+|T_{P3}(\zeta)|+|T_{P4}(\zeta)|}\right)$$

From P(r), using the scattering potential formula, the quantitative three-dimensional refractive index distribution of the object under test is obtained.
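Steps 4-5 together amount to one frequency-domain division followed by an inverse FFT and the scattering-potential-to-index conversion. The small eps regularizer below is my addition to avoid dividing by zeros of the transfer functions (the patent does not specify one), and the stand-in data and parameter values are likewise illustrative.

```python
import numpy as np

def reconstruct_ri(stacks, tfs, wavelength, n_m, eps=1e-3):
    """Sum the 3D spectra of the four stacks, divide by the summed |T_P|,
    inverse-transform to the scattering potential P(r), and convert to the
    refractive index via n(r) = sqrt(n_m^2 + P(r) / k0^2)."""
    num = np.sum([np.fft.fftn(I) for I in stacks], axis=0)
    den = np.sum([np.abs(T) for T in tfs], axis=0)
    V = np.real(np.fft.ifftn(num / (den + eps)))      # scattering potential
    k0 = 2 * np.pi / wavelength
    return np.sqrt(np.maximum(n_m ** 2 + V / k0 ** 2, 0.0))

rng = np.random.default_rng(1)
shape = (8, 32, 32)
stacks = [rng.random(shape) for _ in range(4)]        # stand-in image stacks
tfs = [rng.random(shape) + 0.5 for _ in range(4)]     # stand-in transfer functions
n = reconstruct_ri(stacks, tfs, wavelength=0.5, n_m=1.33)
```

With measured stacks and the transfer functions derived in step 3 substituted for the stand-ins, this is the whole reconstruction pipeline of equation [00011] plus the index conversion.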

[0034] FIG. 4 is a block-diagram representation of the three-dimensional diffraction tomography quantitative refractive index deconvolution reconstruction method. The numerator in the dashed box represents the sum of the Fourier transforms of the image stacks acquired under the four different LED array coded illuminations, and the denominator represents the sum of the absolute values of the corresponding three-dimensional phase transfer functions; dividing the two yields the Fourier spectrum of the three-dimensional scattering potential function. A three-dimensional inverse Fourier transform of the three-dimensional scattering potential function then yields the quantitative three-dimensional refractive index distribution of the measured object.

[0035] FIG. 5 shows the 3D diffraction tomography microscopy imaging results based on LED array coded illumination for micro-polystyrene beads, unstained Pandorina morum and HeLa cells, respectively. Panel (a) shows the quantitative 3D refractive index distribution of 6 μm diameter micro-polystyrene beads, where panels (a1) and (a2) are 3D slice views in the X-Y plane and Y-Z plane, respectively; panel (a3) is the 3D Fourier spectrum of the final reconstruction; and panel (a4) shows the axial and transverse 3D refractive index profiles of the beads. Panel (b) shows the 3D diffraction tomography distribution in the X-Y, X-Z and Z-Y planes for unstained Pandorina morum, a green alga composed of 8, 16 or 32 cells whose internal cells form ellipsoids with distinct anterior-posterior polarity, which makes it an ideal object for 3D diffraction tomography. Panel (c) shows the 3D diffraction tomography of HeLa cells in the X-Y, X-Z and Z-Y planes. As shown in FIG. 5, the distribution and morphology of the micro-polystyrene beads and of the individual cells inside the Pandorina morum colony can be clearly seen at each angle, further confirming the feasibility and accuracy of this multi-frequency-synthesis-based 3D diffraction tomography deconvolution.