Method of coherent flow imaging using synthetic transmit focusing and acoustic reciprocity
20170281121 · 2017-10-05
Inventors
CPC classification
A61B8/5223
HUMAN NECESSITIES
G01N29/221
PHYSICS
G01S7/52036
PHYSICS
G16H50/30
PHYSICS
G01S7/52046
PHYSICS
G01S7/52077
PHYSICS
A61B8/5207
HUMAN NECESSITIES
G01N29/44
PHYSICS
International classification
A61B8/00
HUMAN NECESSITIES
Abstract
Acoustic imaging based on angular coherence is provided. The target is insonified with collimated acoustic beams at several different incidence angles. The resulting images are processed to determine angular coherence averaged over angle, and then integration of the angular coherence over relatively small angular differences is used to provide the output angular coherence image. In cases where flow imaging is done, the images are first filtered to suppress signals from stationary features of the target, multiple acquisitions are performed, and the final flow image is computed by summing the squares of the angular coherence images (on a pixel-by-pixel basis).
Claims
1. A method for acoustic imaging, the method comprising: providing an acoustic imaging system including an acoustic transducer array and a processor; emitting collimated acoustic radiation from the acoustic transducer array at a target at three or more distinct incidence angles; receiving scattered acoustic radiation from the target with the acoustic transducer array; determining acoustic images of the target from the scattered acoustic radiation corresponding to each of the three or more incidence angles; computing an angular coherence image with the processor by i) averaging the acoustic images vs. angle to estimate an angular coherence function at each spatial point of the acoustic images; and ii) integrating the angular coherence function over a predetermined angular range to provide the angular coherence image; and providing the angular coherence image as an output.
2. The method of claim 1, wherein the predetermined angular range is less than or equal to 30% of a total angular range of the three or more distinct incidence angles.
3. The method of claim 1, wherein the acoustic images are provided as complex-valued functions of two spatial variables or as real-valued functions of two spatial variables.
4. The method of claim 3, wherein the complex-valued functions are represented as having real and imaginary components in the averaging the acoustic images vs. angle.
5. The method of claim 3, wherein the complex-valued functions are represented as having in-phase and quadrature components in the averaging the acoustic images vs. angle.
6. The method of claim 3, wherein the real-valued functions represent radio-frequency ultrasound signal intensity in the averaging the acoustic images vs. angle.
7. The method of claim 1, wherein the averaging the acoustic images vs. angle further comprises spatial averaging over a predetermined spatial range.
8. The method of claim 7, wherein the acoustic imaging system provides an axial resolution and wherein the spatial averaging is done over an axial range substantially equal to the axial resolution.
9. The method of claim 7, wherein the acoustic imaging system provides a lateral resolution and wherein the spatial averaging is done over a lateral range substantially equal to the lateral resolution.
10. The method of claim 1, wherein the acoustic images are 2-D brightness mode images.
11. The method of claim 1, wherein the three or more distinct incidence angles are seven or more distinct incidence angles.
12. A method of flow acoustic imaging comprising: performing the method of claim 1 for three or more acquisitions, wherein the acoustic images are flow acoustic images that are filtered to suppress signals from stationary parts of the target, and wherein the acquisitions are separated by a predetermined time delay; providing an output flow image by summing squares of the angular coherence image for each acquisition.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0015] In one aspect, a method for acoustic imaging is provided in which collimated acoustic radiation is emitted at a target at three or more distinct incidence angles, scattered acoustic radiation is received with the acoustic transducer array, and an acoustic image of the target is determined for each incidence angle. An angular coherence image is then computed by:
i) averaging the acoustic images vs. angle to estimate an angular coherence function at each spatial point of the acoustic images; and
ii) integrating the angular coherence function over a predetermined angular range to provide the angular coherence image. This angular coherence image can be provided as an output.
[0016] Here a collimated acoustic beam is an acoustic beam with substantially planar wavefronts at all locations within the field of view. The half divergence angle of such a beam in homogeneous media is smaller than or equal to three times the limit imposed by diffraction of the acoustic aperture that is used to generate the beam. For a Gaussian beam with half width w and wavelength λ, the intended half divergence angle θ is roughly θ ≤ 3λ/(πw). With inhomogeneous media, aberration may increase the half divergence angle.
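As a rough numerical sketch of this collimation criterion (the frequency, sound speed, and beam width below are illustrative values, not from the disclosure), the diffraction-limited divergence of a Gaussian beam can be computed directly:

```python
import math

# Hypothetical parameters (not from the patent): a 5 MHz beam in soft
# tissue (c ~ 1540 m/s) with a Gaussian beam half-width w of 5 mm.
c = 1540.0          # speed of sound in soft tissue, m/s
f0 = 5e6            # center frequency, Hz
w = 5e-3            # Gaussian beam half-width, m

lam = c / f0        # wavelength, m (~0.31 mm)

# Diffraction-limited half divergence angle of a Gaussian beam: lambda / (pi * w).
theta_diff = lam / (math.pi * w)

# The collimation criterion stated above: half divergence angle no more
# than three times the diffraction limit.
theta_max = 3 * theta_diff

print(f"diffraction-limited half divergence: {math.degrees(theta_diff):.3f} deg")
print(f"collimation bound (3x): {math.degrees(theta_max):.3f} deg")
```

With these numbers the diffraction limit is about 1.1 degrees, so the beam qualifies as collimated up to roughly a 3.4 degree half divergence.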
[0017] The predetermined angular range is preferably less than or equal to 30% of a total angular range of the three or more distinct incidence angles.
[0018] The acoustic images can be provided as complex-valued functions of two spatial variables or as real-valued functions of two spatial variables. Complex-valued functions can be represented as having real and imaginary components in the averaging the acoustic images vs. angle. Alternatively, complex-valued functions can be represented as having in-phase and quadrature components in the averaging the acoustic images vs. angle. Real-valued functions can represent radio-frequency ultrasound signal intensity in the averaging the acoustic images vs. angle. Radio-frequency signals in the context of ultrasound imaging are the ultrasound echoes recorded by the transducers as a function of time or depth.
[0019] Averaging the acoustic images vs. angle can further include spatial averaging over a predetermined spatial range. For example, if the acoustic imaging system provides an axial resolution the spatial averaging can be done over an axial range substantially equal to the axial resolution. Similarly, if the acoustic imaging system provides a lateral resolution the spatial averaging can be done over a lateral range substantially equal to the lateral resolution. As used herein, “substantially equal” means equal to within +/−10%.
[0020] The three or more distinct incidence angles are preferably seven or more distinct incidence angles. The acoustic images can be 2-D brightness mode images. Alternatively, the acoustic images can be flow acoustic images that are filtered to suppress signals from stationary parts of the target. In such cases, it is preferred to perform angular coherence imaging for three or more acquisitions that are separated by a predetermined time delay, and to provide an output flow image by summing squares of the angular coherence image for each acquisition.
[0021] More specifically, flow imaging according to the present approach can be accomplished as follows:
1) Plane waves with different transmit angles are emitted, each of which produces one acoustic image. The acoustic images produced in this step are denoted as (Angle 1, Acquisition 1), (Angle 2, Acquisition 1), (Angle 3, Acquisition 1), etc.
2) Wait for a fixed amount of time (e.g., 1 ms).
3) Repeat steps 1 and 2 at least two more times (at least three acquisitions in total). The images produced in this step are denoted as (Angle 1, Acquisition 2), (Angle 2, Acquisition 2), (Angle 3, Acquisition 2), and (Angle 1, Acquisition 3), (Angle 2, Acquisition 3), (Angle 3, Acquisition 3), etc.
4) Filter the acoustic images to remove stationary signals. The filtering is conducted among images produced with the same angle index but different acquisition indices. For example, Angle 1 images in all acquisitions, including (Angle 1, Acquisition 1), (Angle 1, Acquisition 2), (Angle 1, Acquisition 3), and so on, are filtered as one ensemble; and then Angle 2 images in all acquisitions; and so on. The result is one filtered flow image corresponding to each of the acquired acoustic images.
5) Produce one angular coherence image from the filtered images in Acquisition 1, including (Angle 1, Acquisition 1), (Angle 2, Acquisition 1), (Angle 3, Acquisition 1), and so on as described above. Then similarly produce one angular coherence image for each of the other acquisitions.
6) Sum the squares of the angular coherence images.
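The six steps above can be sketched end-to-end on synthetic data. The array shapes, the mean-removal clutter filter, and the small-lag coherence average below are illustrative assumptions, not the patent's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

n_angles, n_acq = 7, 4          # M transmit angles, N acquisitions
nz, nx = 64, 48                 # image grid (depth x lateral)

# imgs[m, i] is the complex IQ image from angle m, acquisition i (steps 1-3).
imgs = rng.standard_normal((n_angles, n_acq, nz, nx)) \
     + 1j * rng.standard_normal((n_angles, n_acq, nz, nx))

# Step 4: clutter-filter each angle's ensemble across acquisitions.
# A zeroth-order filter (subtracting the slow-time mean) stands in here
# for a proper wall filter.
filtered = imgs - imgs.mean(axis=1, keepdims=True)

def angular_coherence_image(stack, max_lag=2):
    """Average normalized pairwise coherence over small angular lags.

    stack: (n_angles, nz, nx) complex images from one acquisition."""
    m = stack.shape[0]
    acc = np.zeros(stack.shape[1:])
    count = 0
    for lag in range(1, max_lag + 1):
        for a in range(m - lag):
            s1, s2 = stack[a], stack[a + lag]
            num = (s1 * s2.conj()).real
            den = np.abs(s1) * np.abs(s2) + 1e-12
            acc += num / den
            count += 1
    return acc / count

# Steps 5-6: one coherence image per acquisition, then sum of squares.
g = np.stack([angular_coherence_image(filtered[:, i]) for i in range(n_acq)])
P = (g ** 2).sum(axis=0)       # flow power image
print(P.shape)
```

In practice the mean-removal filter would likely be replaced by a higher-order wall filter (e.g., polynomial regression across acquisitions) to better suppress slowly moving tissue.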
Mathematical Development
[0027] To better appreciate the present invention, the following exemplary mathematical development is provided. The method can be regarded as including 4 major steps.
1. Tissue insonification with a synthetic transmit focusing technique, such as virtual source and plane wave synthetic aperture.
2. Next, the transmitted wave is backscattered by the tissue.
3. The process described in steps 1 and 2 is repeated with plane waves at M different angles into the tissue.
4. For the same point in each of the images produced from different transmit angles α, the normalized coherence (i.e., a function that computes the similarity of the signals) of every pair of signals received at different plane-wave angles is computed as a function of the difference between the angles (i.e., the spatial coherence is computed across the angles of f(x,y,α)),
in which Δα = α_1 − α_2. R(x,y,z,Δα) is then averaged across the angles α to produce an averaged coherence function.
[0028] For the computation of normalized coherence, various techniques can be used to produce similar results. First of all, instead of RF data, the complex IQ (in-phase and quadrature) data can be used as an alternative. Using IQ data, the computation can be represented as

R(x,y,z,Δα) = IQ(x,y,z,α_1)·IQ*(x,y,z,α_2) / (‖IQ(x,y,z,α_1)‖ ‖IQ(x,y,z,α_2)‖),

where IQ(x,y,z,α) represents the complex IQ signal at location (x,y,z) with transmit angle α; IQ*(x,y,z,α) represents the complex conjugate of IQ(x,y,z,α); and ‖·‖ represents the l_2 norm of the IQ signal.
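A minimal sketch of this pointwise IQ coherence (the array shapes and the phase-shifted test signal are assumptions) shows that at a single pixel the l_2 norm reduces to the magnitude, so only the phase difference between the two transmit angles survives:

```python
import numpy as np

def normalized_coherence(iq1, iq2):
    # R = IQ1 * conj(IQ2) / (||IQ1|| ||IQ2||); at a single pixel the
    # l2 norm reduces to the magnitude, so |R| == 1 wherever both
    # signals are nonzero and only the phase difference remains.
    return iq1 * np.conj(iq2) / (np.abs(iq1) * np.abs(iq2))

rng = np.random.default_rng(1)
iq_a1 = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
iq_a2 = iq_a1 * np.exp(1j * 0.1)   # same speckle pattern, small phase shift

R = normalized_coherence(iq_a1, iq_a2)
print(np.allclose(np.abs(R), 1.0))         # True
print(np.allclose(np.angle(R), -0.1))      # True: phase difference survives
```

This also illustrates why the spatial kernels discussed below are useful: averaging the products and the norms over a neighborhood, rather than a single pixel, lets the coherence magnitude carry information.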
[0029] In implementations with discrete-time signals, various techniques can be used; for example, the pairwise normalized coherences can be accumulated over the discrete set of transmit angles, with the angular range running from −α_0 to α_0. The IQ signal IQ(x,y,z,α) can be replaced with the RF signal f(x,y,z,α) according to the previous description.
[0030] Alternatively, the average can be calculated as

R̄(x,y,z,Δα) = (1/N) Σ_{α_1} IQ(x,y,z,α_1)·IQ*(x,y,z,α_1+Δα) / (‖IQ(x,y,z,α_1)‖ ‖IQ(x,y,z,α_1+Δα)‖),

in which N represents the number of angles α_1 between −α_0 and α_0 used in the computation. Additionally, a spatial kernel can be used in any of the implementations above. For example, with an axial kernel in the z dimension the implementation is

R(x,y,z,Δα) = Σ_{z_i=z−z_0}^{z+z_0} IQ(x,y,z_i,α_1)·IQ*(x,y,z_i,α_2) / sqrt( Σ_{z_i=z−z_0}^{z+z_0} |IQ(x,y,z_i,α_1)|² · Σ_{z_i=z−z_0}^{z+z_0} |IQ(x,y,z_i,α_2)|² ),

in which the axial kernel length is 2z_0 and z_i is the summation variable; sqrt( ) represents the square root function. Similar kernels in the x and y dimensions can be used as well.
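A sketch of the axial-kernel variant (function and variable names are illustrative) sums the products and the norms over a depth window of length 2z_0 before normalizing:

```python
import numpy as np

def kernel_coherence(iq1, iq2, z0):
    """Normalized coherence of two IQ lines with an axial kernel.

    iq1, iq2: 1-D complex arrays along depth z; z0: kernel half-length
    in samples. Returns a real coherence profile along z."""
    n = len(iq1)
    out = np.zeros(n)
    for z in range(n):
        lo, hi = max(0, z - z0), min(n, z + z0 + 1)
        # Sum products and energies over the axial window, then normalize.
        num = np.sum(iq1[lo:hi] * np.conj(iq2[lo:hi])).real
        den = np.sqrt(np.sum(np.abs(iq1[lo:hi]) ** 2)
                      * np.sum(np.abs(iq2[lo:hi]) ** 2))
        out[z] = num / den if den > 0 else 0.0
    return out

rng = np.random.default_rng(2)
a = rng.standard_normal(200) + 1j * rng.standard_normal(200)
noise = rng.standard_normal(200) + 1j * rng.standard_normal(200)

print(kernel_coherence(a, a, z0=8)[100])        # fully coherent: 1.0
print(kernel_coherence(a, noise, z0=8)[100])    # incoherent: near zero
```

Identical signals yield a coherence of exactly 1 inside the window, while independent noise averages toward zero as the kernel grows.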
[0031] The pixel value of the resulting image point, g(x,y), is then calculated by integrating or summing the normalized spatial coherence function between 0 and 30% of the maximum difference between the angles:

g(x,y) = ∫_0^ρ R̄(x,y,Δα) d(Δα),

in which ρ ≈ A·Δα_max, where A is a fraction, usually between 0.01 and 0.3, that represents the fraction of the aperture width or of the total angle encompassing all transmits.
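A discrete version of this pixel computation might look as follows; the linearly decaying coherence curve used as input is hypothetical, standing in for the averaged coherence function R̄ at one pixel:

```python
import numpy as np

def coherence_pixel(R_bar, A=0.3):
    """Sum the averaged coherence curve over the first fraction A of lags.

    R_bar: averaged coherence vs. angular lag (lag 0 first). This is a
    discrete analogue of integrating from 0 to rho = A * d_alpha_max."""
    n_lags = len(R_bar)
    cutoff = max(1, int(round(A * n_lags)))
    return np.sum(R_bar[:cutoff])

# Hypothetical coherence curve: high at small angular lags, decaying
# linearly with lag.
lags = np.linspace(0, 1, 11)
R_bar = 1.0 - lags

print(coherence_pixel(R_bar, A=0.3))
```

Summing only the small-lag portion of the curve is what favors targets whose echoes stay coherent across nearby transmit angles.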
[0032] The process is carried out for each pixel (x,y), and a B-mode image can be produced.
[0033] The normalized angular coherence function for plane-wave transmits, A_PWT, can be expressed as

A_PWT(Δp,Δq) ∝ C_rx(kΔp, kΔq),  (8)

[0034] where Δp = p_1 − p_2, Δq = q_1 − q_2, C_rx is the autocorrelation of the receive aperture function, k is the wave number, and p and q are normalized spatial frequencies (i.e., p and q are effectively angles).
[0035] The physical implication of Eq. (8) is that the cross-correlation function of the backscattered signals from plane-wave transmits at different angles and a spatially incoherent homogeneous medium is proportional to the normalized autocorrelation of the receive aperture function. This can be considered as an extension to the van Cittert Zernike theorem.
[0036] The transmit angular spacing (kΔp, kΔq) in Eq. (8) can be expressed as fractions of the maximum angle sampled by the receive aperture, (η_p, η_q) = (kΔp/kp_max, kΔq/kq_max), in which 0 ≤ η_p, η_q ≤ 1. If the transmit angular spacing is greater than the maximum angle sampled by the receive aperture (i.e., η_p or η_q > 1), A_PWT(η_p, η_q) = 0.
[0037] In cases where flow imaging is performed, transmissions from the 17 angles or virtual elements are repeated multiple times, and the images g(x,y,i) from the acquisitions are summed using a power estimator,
P(x,y) = Σ_{i=1}^{N} g²(x,y,i),  (10)
in which g(x,y,i) is the angular coherence image produced from the i-th acquisition, and N is the number of acquisitions. P(x,y) is the flow image.
[0038] In addition, both the B-mode image g(x,y) and the flow image P(x,y) can be computed using a "recursive" method. That is, the signals from the same angle or virtual element in the previous cycle are replaced with the values from the new transmission, and P(x,y) is recalculated, thus improving frame rate and continuity of the image.
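One plausible realization of this recursive update (the ring-buffer bookkeeping is an assumption, not specified in the disclosure) keeps the last N coherence images and recomputes P(x,y) as each new acquisition arrives:

```python
import numpy as np

class RecursiveFlowImager:
    """Rolling buffer of the last N angular coherence images g(x,y,i)."""

    def __init__(self, n_acq, shape):
        self.buf = np.zeros((n_acq,) + shape)
        self.idx = 0

    def update(self, g_new):
        # Overwrite the oldest coherence image in place, then recompute
        # P(x, y) = sum_i g^2(x, y, i) over the current buffer.
        self.buf[self.idx] = g_new
        self.idx = (self.idx + 1) % len(self.buf)
        return (self.buf ** 2).sum(axis=0)

imager = RecursiveFlowImager(n_acq=3, shape=(4, 4))
P = None
for k in range(5):
    # Feed constant images valued 0, 1, 2, 3, 4 as stand-ins for g(x,y,i).
    P = imager.update(np.full((4, 4), float(k)))

print(P[0, 0])   # buffer now holds frames 2, 3, 4 -> 4 + 9 + 16 = 29
```

Because only one buffer slot changes per transmission cycle, a flow image can be emitted after every acquisition rather than after every full ensemble, which is the frame-rate benefit the paragraph describes.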