ULTRASOUND IMAGING SYSTEM USING COHERENCE ESTIMATION OF A BEAMFORMED SIGNAL
20220022848 · 2022-01-27
Inventors
CPC classification
A61B8/4483
HUMAN NECESSITIES
G01S15/8925
PHYSICS
G01S15/8977
PHYSICS
A61B8/5207
HUMAN NECESSITIES
G01S7/52046
PHYSICS
International classification
A61B8/00
HUMAN NECESSITIES
Abstract
Improved ultrasound imaging using coherence estimation of a beamformed signal. Ultrasound imaging using coherence estimation of a beamformed signal as described herein may be performed by applying a plurality of filters to the beamformed signal to generate a plurality of filtered beamformed signals. Normalized cross-correlation may be performed on a plurality of pairs of filtered beamformed signals to determine a coherence coefficient corresponding to each pixel of an ultrasound image, which may be used to construct a coherence estimation ultrasound image.
Claims
1. A method of ultrasound imaging using coherence estimation, comprising: receiving, by a processor, a plurality of beamformed ultrasound signals; assembling, by the processor, the plurality of beamformed ultrasound signals into an RF signal matrix; generating, by the processor, at least two filtered RF signal matrices using a plurality of spatial filters and the RF signal matrix; performing, by the processor, a normalized cross-correlation on at least one pair of the filtered RF signal matrices to determine at least one cross-correlation coefficient corresponding to each pixel in an image of a target; determining, by the processor, a coherence coefficient corresponding to each pixel using the at least one cross-correlation coefficient; constructing, by the processor, a coherence estimation image of the target using the coherence coefficients corresponding to each pixel; and displaying, by the processor and on a display, the coherence estimation image of the target.
2. The method of claim 1, further comprising: transmitting an ultrasound signal towards an imaging target using an array of ultrasound transducer elements; receiving a plurality of reflected ultrasound signals using the array of ultrasound transducer elements; and beamforming the plurality of reflected ultrasound signals using the array of ultrasound transducer elements.
3. The method of claim 1, further comprising transforming the RF signal matrix into a k-space representation of the RF signal matrix using a frequency transform to generate the at least two filtered RF signal matrices.
4. The method of claim 3, wherein the frequency transform is a 2D or 3D Fast Fourier Transform.
5. The method of claim 1, wherein generating the at least two filtered RF signal matrices includes: multiplying the k-space representation of the RF signal matrix by the plurality of spatial filters to generate a plurality of k-space representations of at least two filtered RF signal matrices.
6. The method of claim 5, further comprising transforming the plurality of k-space representations of the at least two filtered RF signal matrices into time domain representations of the at least two filtered RF signal matrices using an inverse frequency transform.
7. A computer readable medium storing program instructions, the program instructions comprising program instructions to configure at least one processor to: receive a plurality of beamformed ultrasound signals and assemble the plurality of beamformed ultrasound signals into an RF signal matrix; generate a plurality of filtered RF signal matrices by applying a plurality of spatial filters to the RF signal matrix; perform normalized cross-correlation on a plurality of pairs of the filtered RF signal matrices to determine a plurality of cross-correlation coefficients corresponding to each pixel in an image of a target; determine a coherence coefficient using the cross-correlation coefficients; construct a coherence estimation image of the target using the coherence coefficients corresponding to each pixel; and display the coherence estimation image of the target on a display.
8. The computer readable medium of claim 7, wherein the program instructions further comprise program instructions to transmit an ultrasound signal towards an imaging target using an array of ultrasound transducer elements and receive a plurality of reflected ultrasound signals using the array of ultrasound transducer elements.
9. The computer readable medium of claim 8, wherein the program instructions further comprise program instructions to beamform the plurality of ultrasound signals.
10. The computer readable medium of claim 7, wherein the program instructions further comprise program instructions to transform the RF signal matrix into a k-space representation of the RF signal matrix using a frequency transform.
11. The computer readable medium of claim 10, wherein the program instructions to generate the plurality of filtered RF signal matrices include generating a plurality of k-space representations of the plurality of filtered RF signal matrices by multiplying the k-space representation of the RF signal matrix by the plurality of spatial filters.
12. The computer readable medium of claim 11, wherein the program instructions further comprise program instructions to transform the plurality of k-space representations of the plurality of filtered RF signal matrices into time domain representations of the plurality of filtered RF signal matrices using an inverse frequency transform.
13. An ultrasound imaging system using coherence estimation, comprising: a display screen; an ultrasound probe having an array of transducer elements and configured to: transmit a beamformed ultrasound signal towards an imaging target using the array of transducer elements, receive a plurality of reflected ultrasound signals using the array of transducer elements, and beamform the plurality of received ultrasound signals using the array of transducer elements; a memory configured to store data; and a processor coupled to the memory, the processor configured to: assemble the plurality of beamformed ultrasound signals into an RF signal matrix, generate a plurality of filtered RF signal matrices using a plurality of spatial filters and the RF signal matrix, perform normalized cross-correlation on a plurality of pairs of the filtered RF signal matrices to determine a plurality of cross-correlation coefficients corresponding to each pixel in an image of the imaging target, determine a coherence coefficient corresponding to each pixel in the ultrasound image of the target using the plurality of cross-correlation coefficients, construct a coherence estimation image of the target using the coherence coefficients corresponding to each pixel, and display the coherence estimation image on the display screen.
14. The ultrasound imaging system of claim 13, wherein the coherence estimation image has a higher contrast-to-noise ratio than an ultrasound image constructed from an amplitude of a received echo.
15. The ultrasound imaging system of claim 13, wherein the coherence estimation image has a higher signal-to-noise ratio than an ultrasound image constructed from an amplitude of a received echo.
16. The ultrasound imaging system of claim 13, wherein performance of the normalized cross-correlation is on segments of data approximately 1-4 wavelengths long in an axial direction for each filtered RF signal matrix in a given pair of filtered RF signal matrices.
17. The ultrasound imaging system of claim 13, wherein forming the coherence estimation image includes using a grayscale with a coherence coefficient of 0 mapped to total black and a coherence coefficient of 1 mapped to total white.
18. The ultrasound imaging system of claim 13, wherein determining the coherence coefficient for a given pixel includes summing the cross-correlation coefficients for the given pixel.
19. The ultrasound imaging system of claim 13, wherein determining the coherence coefficient for a given pixel includes weighting and summing the cross-correlation coefficients for the given pixel.
20. The ultrasound imaging system of claim 13, wherein determining the coherence coefficient for a given pixel includes averaging the cross-correlation coefficients for the given pixel.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0018] Other systems, methods, features, and advantages of the present invention will be apparent to one skilled in the art upon examination of the following figures and detailed description. Component parts shown in the drawings are not necessarily to scale and may be exaggerated to better illustrate the important features of the present invention.
DETAILED DESCRIPTION
[0048] The ultrasound imaging systems, methods of using the same, and the computer readable medium described herein apply multiple filters to a beamformed signal to generate multiple filtered beamformed signals. The beamformed signal is a summation of channel signals, and the filtered beamformed signals approximate the channel signals that were summed together. Beamforming may advantageously reduce the data size of the channel signals and increase imaging efficiency. Normalized cross-correlation may be performed on multiple pairs of filtered beamformed signals to determine a coherence coefficient corresponding to each pixel of an ultrasound image, which may be used to construct a coherence estimation ultrasound image. Coherence is a measurement of the similarity of the received echoes, and an image constructed based on coherence estimation may advantageously produce higher quality and higher contrast images of target regions, especially target regions that return a low amplitude echo (e.g., a heart chamber, or blood vessels near a heart chamber). Coherence based systems produce images that show such target regions as black rather than filled with clutter, and thus a user may better identify the regions and perform more accurate measurements on the images.
[0050] The processor 104 may include a single processor or multiple processors and may be configured to execute machine-readable instructions. The processor 104 may execute instructions to operate the array of transducer elements 102, to control an amount of power delivered to the array of transducer elements 102, and to perform the coherence estimation and/or the reconstruction of the image. The processor 104 may be, for example, a microprocessor or a microcontroller.
[0051] The user interface 105 may be displayed on the display 107. The user interface 105 may be used to input parameters and control the ultrasound imaging system 100. By way of example and not limitation, an input may be received via the display 107 (i.e., a touchscreen), buttons, keys, knobs, one or more cameras, or a microphone. The input may be touch, visual, and/or auditory. The received input may be biometric information, the user's voice, and/or the user's touch.
[0052] The memory 106 may be a non-transitory computer readable storage medium. For example, the memory 106 may be a random-access memory (RAM), a disk, a flash memory, optical disk drives, hybrid memory, or any other storage medium that can store data. The memory 106 may store program code that is executable by the processor 104. The memory 106 may store data in an encrypted or any other suitable secure form.
[0053] The ultrasound imaging system may include the UPS 110 or be coupled to the UPS 110 as shown in
[0054] In some embodiments, the UPS 110 may generate and/or beamform one or more signals that are transmitted towards a target by the array of transducer elements 102. In some embodiments, such beamforming may be performed using various beamforming methods known in the art, including but not limited to analog beamforming, digital beamforming, hybrid beamforming (part analog and part digital), Fresnel-based beamforming, minimum-variance beamforming, Capon beamforming, Wiener beamforming, and delay-and-multiply beamforming. The transmit beamforming applies appropriate time delays and weightings to an ultrasound signal for each transducer element in the array of transducer elements 102 in order to focus the transmitted ultrasound beam at the intended target.
[0055] In some embodiments, the UPS 110 may beamform a reflected signal received by the array of transducer elements 102. In some embodiments, the transmit and/or receive beamforming, or portions thereof, may be performed by components of the probe electronics 103, or other dedicated components of the ultrasound imaging system 100, such as an ASIC or an FPGA. In some embodiments, the UPS 110, by itself or in conjunction with the probe electronics 103, may be configured to populate an RF signal matrix based on the beamformed received signal.
[0056] The array of transducer elements 102 transmits beamformed ultrasonic signals into a target area or the tissue of a patient being examined. The ultrasonic signals reflect off structures in the body, such as blood cells or muscular tissue, to produce echoes that return to the array of transducer elements 102. The echoes are converted into electrical signals, or RF signal data, by the transducer elements, and the resulting RF signal data is received by the processor 104. In some embodiments, the processor 104 assembles the received RF signal data into an RF signal matrix.
[0057] The UPS 110 may apply multiple spatial filters to the RF signal matrix. The spatial filters may be applied either in the time domain or the spatial frequency domain. The spatial frequency domain may also be referred to as k-space. To apply the spatial filters in k-space, a frequency transform (e.g., a 2D FFT) is applied to the RF matrix to produce the k-space representation of the RF matrix. The spatial filters are applied by multiplying the k-space representation of the RF matrix by the k-space representations of the multiple spatial filters, to produce k-space representations of multiple filtered RF matrices. An inverse frequency transform (e.g., a 2D Inverse Fast Fourier transform (2D IFFT)) is applied to the k-space representations of the multiple filtered RF matrices to produce multiple filtered RF matrices. In embodiments in which the spatial filters are applied in the time domain, convolution of the time domain representation of the RF matrix and the time domain representations of the multiple spatial filters is performed to produce multiple filtered RF matrices.
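As a non-limiting illustration, the k-space filtering path described in this paragraph may be sketched as follows. This is a minimal sketch, not the patented implementation: the RF matrix contents, the matrix shape, the number of filters, and the use of random binary masks as k-space filters are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical beamformed RF matrix: axial samples x lateral lines
rf = rng.standard_normal((256, 64))

# Hypothetical spatial filters, already expressed in k-space as binary masks
filters = [(rng.random((256, 64)) < 0.8).astype(float) for _ in range(4)]

# Frequency-domain path: 2D FFT of the RF matrix, multiply by each filter's
# k-space representation, then 2D inverse FFT back to the time domain
rf_k = np.fft.fft2(rf)
filtered = [np.real(np.fft.ifft2(rf_k * h)) for h in filters]

# Each filtered RF matrix has the same shape as the input RF matrix
print(filtered[0].shape)  # (256, 64)
```

The time-domain alternative mentioned above would instead convolve the RF matrix with each filter's time-domain representation; by the convolution theorem the two paths are equivalent.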
[0058] The multiple spatial filters may have varying degrees of overlap in k-space with each other. These varying degrees of overlap among the multiple spatial filters are used to estimate the coherence of the received ultrasound wave. The UPS 110 may estimate such coherence by performing normalized cross-correlation on the outputs of pairs of the spatial filters.
[0059] In some embodiments, the UPS 110 may perform the normalized cross-correlation on pairs of filtered RF matrices generated using spatial filters having between approximately 40% and 99.9% overlap with each other. For example, in various embodiments, the UPS 110 may perform the normalized cross-correlation on pairs of filtered RF matrices generated using spatial filters having 80%, 85%, 90%, or 95% overlap with each other.
[0060] In some embodiments, the UPS 110 may perform the normalized cross-correlation on between 10 and 100 pairs of filtered RF matrices. In some embodiments, the UPS 110 may perform the normalized cross-correlation on less than 10 pairs of filtered RF matrices. In some embodiments, the UPS 110 may perform the normalized cross-correlation on more than 100 pairs of filtered RF matrices.
[0061] By applying the spatial filters to the beamformed signal, instead of processing channel data from individual transducer elements as is done in other coherence estimation techniques, embodiments of the improved ultrasound imaging system described herein are able to improve the contrast-to-noise and signal-to-noise ratios of a standard ultrasound image without requiring the system to process large amounts of data in real time. For example, a 64-channel beamformer with 12-bit A/Ds running at 40 MHz requires transferring data from the probe at a rate of 3.58 gigabytes/second in order to perform coherence calculations on individual channel data. In comparison, an embodiment of the improved ultrasound imaging system 100 described herein, using beamformed data to perform coherence calculations on the received signal, may only require transferring data from the probe at a rate of 57.2 megabytes/second, while still providing improved contrast-to-noise and signal-to-noise ratios of a standard ultrasound image. Such improved image quality may also permit the use of the array of transducer elements 102 with a lower number of elements than would otherwise be required for a desired image quality.
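The data-rate comparison above can be reproduced arithmetically; the figures quoted match a calculation in binary units (gibibytes/mebibytes per second), with the beamformed case assumed to carry a single 12-bit stream at the same 40 MHz sampling rate:

```python
# Data-rate comparison: per-channel coherence processing vs. beamformed data
channels, bits, fs = 64, 12, 40e6  # 64-channel beamformer, 12-bit A/Ds, 40 MHz

# All 64 channels transferred for per-channel coherence calculations
channel_rate_gb = channels * bits * fs / 8 / 1024**3

# A single beamformed (summed) stream at the same bit depth and rate
beamformed_rate_mb = 1 * bits * fs / 8 / 1024**2

print(round(channel_rate_gb, 2))    # 3.58 gigabytes/second
print(round(beamformed_rate_mb, 1))  # 57.2 megabytes/second
```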
[0062] In some embodiments, the user interface 105 may be used to receive user input to control operation of the ultrasound imaging system 100, including to control various parameters of ultrasound imaging system, such as imaging start depth, imaging end depth, number of lines, line spacing, and/or sampling frequency, and to control various parameters of the display 107, such as gain and/or contrast of a displayed image.
[0063] In some embodiments, the UPS 110 may process the received RF data and prepare ultrasound images for display on the display 107. In some embodiments, processing of the received RF data to prepare an ultrasound image may include applying multiple spatial filters to the received RF data to generate filtered RF data. Processing of the received RF data to prepare an ultrasound image may include performing normalized cross-correlation on the filtered RF data to determine a coherence coefficient corresponding to each pixel of an ultrasound image, from which a coherence estimation ultrasound image is constructed. In some embodiments, the multiple spatial filters may be stored on the memory 106. In some embodiments, the prepared ultrasound images may be stored on the memory 106 prior to being displayed on the display 107. The memory 106 may comprise any known data storage medium.
[0064] Some embodiments of the improved ultrasound imaging system 100 described herein may include multiple processors to handle the processing tasks. For example, a first processor may be utilized to demodulate and decimate the ultrasound signal while a second processor may be used to further process the data prior to displaying an image. It should be appreciated that other embodiments may use a different arrangement of processors.
[0065] In some embodiments of the improved ultrasound imaging system 100 described herein, ultrasound signals may be processed in various ways according to program instructions, including beamforming, de-noising, and filtering ultrasound signals, or any portion or combination thereof. The UPS 110 may include program instructions that may be implemented on one or more processors 104, alone or in combination with the probe electronics 103. The program instructions may be stored on memory 106 of the system. In some embodiments, the program instructions correspond to the processes and functions described herein, and may be executed by a processor, such as the processor 104. In some embodiments, the program instructions may be implemented in C, C++, JAVA, or any other suitable programming language. In some embodiments, some or all of the portions of the program instructions may be implemented in application specific circuitry including ASICs and FPGAs, and such application specific circuitry may be part of the probe electronics 103, or another part of the ultrasound imaging system 100.
[0066] For example, in an embodiment, programming instructions may be provided to generate B-mode image frames and corresponding RF matrices based on a reflected and received ultrasound signal, spatial filters and corresponding filtered RF matrices, and final improved coherence estimation image frames based on the filtered RF matrices. The image frames may be stored in the memory 106 along with timing information indicating the time at which each image frame was acquired. Programming instructions may be provided to retrieve stored image frames from the memory 106 and to display the image frames on the display 107.
[0068] The UPS 110 (see
[0069] In some embodiments, the spatial filters may be represented as matrices. In some embodiments, the matrices may be populated with values equal to 0 or 1, or any value in between. In some embodiments, spatial filters could be complex-valued, having real and imaginary components. In some embodiments, each of the multiple spatial filters may be randomly or pseudo-randomly generated. In some embodiments, each of the multiple spatial filters may be randomly or pseudo-randomly generated such that a certain predetermined percentage of the entries in a given filter matrix are equal to 0 and a certain predetermined percentage of the entries in a given filter matrix are equal to 1. For example, in an embodiment, each of the multiple matrices may have 80% of its entries equal to 1, and 20% of its entries equal to 0, with the distribution of such entries within the matrix randomized or pseudo-randomized. In such an example, any two of the multiple filters would be expected to have approximately 64% overlap with one another. In some embodiments, the multiple spatial filters are generated such that the cumulative coverage of the spatial filters in k-space overlaps with all or most of the k-space representation of the ultrasound RF data. In some embodiments, the multiple spatial filters may be generated such that each spatial filter has at least approximately 40% overlap with adjacent spatial filter(s). In some embodiments, spatial filters may overlap with adjacent spatial filters in the axial direction, the lateral direction, or a combination thereof. In some embodiments, the multiple spatial filters may be generated such that each spatial filter has between approximately 40% and approximately 99.9% overlap with at least one other spatial filter. Certain embodiments of k-space representations of spatial filters are shown in
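The randomized filter generation described above, and the expected approximately 64% pairwise overlap for filters with 80% of entries equal to 1, can be sketched numerically. The grid size, fill fraction, number of filters, and the definition of overlap as the fraction of k-space entries where both binary masks equal 1 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (128, 128)   # hypothetical k-space grid
fill = 0.8           # 80% of entries equal to 1, 20% equal to 0

# Pseudo-randomly generated binary spatial filters
filters = [(rng.random(shape) < fill).astype(float) for _ in range(8)]

# Overlap of two filters: fraction of k-space entries where both pass energy.
# For independent 80%-fill masks this is expected to be 0.8 * 0.8 = 0.64.
overlap = np.mean(filters[0] * filters[1])
print(abs(overlap - 0.64) < 0.05)  # True
```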
[0070] The UPS 110 (see
[0071] Next, the UPS 110 (see
[0072] Next, the UPS 110 (see
[0073] In some embodiments, the UPS 110 (see
[0074] In some embodiments, the correlation between two signals may be quantified using the following equation, where s.sub.i is the time domain signal from filter i, and s.sub.j is the time domain signal from filter j:

R.sub.ij=Σ.sub.n s.sub.i(n)s.sub.j(n)

[0075] In some embodiments, the coherence between two signals may be quantified using the normalized cross-correlation:

ρ.sub.ij=Σ.sub.n s.sub.i(n)s.sub.j(n)/√(Σ.sub.n s.sub.i(n).sup.2·Σ.sub.n s.sub.j(n).sup.2)

[0076] In some embodiments, a coherence estimate may be performed using the following equation, where ξ is the set of signals that are cross-correlated with signal i:

C.sub.i=Σ.sub.j∈ξ ρ.sub.ij
[0077] The cross-correlation coefficients for each pair of filtered RF signals are used to determine a coherence coefficient 206 corresponding to each pixel of a coherence estimation ultrasound image. In some embodiments, the cross-correlation coefficients are scan converted. In some embodiments, the cross-correlation coefficients relating to a given pixel are summed to determine a coherence coefficient for that pixel. In some embodiments, the cross-correlation coefficients relating to a given pixel are weighted and then summed to determine a coherence coefficient for that pixel. In other embodiments, the cross-correlation coefficients relating to a given pixel are averaged to determine a coherence coefficient for that pixel.
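A minimal sketch of the per-pixel computation described above follows. The signal length, the number of filter outputs, the synthetic "mostly shared" signals, and the choice of the averaging variant are all illustrative assumptions, not the patented implementation:

```python
import numpy as np

def ncc(si, sj):
    """Zero-lag normalized cross-correlation of two filter-output segments."""
    num = np.sum(si * sj)
    den = np.sqrt(np.sum(si ** 2) * np.sum(sj ** 2))
    return num / den

rng = np.random.default_rng(2)
common = rng.standard_normal(64)  # shared (coherent) component of the echo
signals = [common + 0.1 * rng.standard_normal(64) for _ in range(6)]

# Cross-correlation coefficients over all pairs of filter outputs for a pixel
coeffs = [ncc(signals[i], signals[j])
          for i in range(len(signals)) for j in range(i + 1, len(signals))]

# Coherence coefficient for the pixel (averaging variant described above)
coherence = np.mean(coeffs)
print(coherence > 0.9)  # True: largely shared signals are highly coherent
```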
[0078] The coherence coefficients 206, each corresponding to a pixel of an ultrasound image, are used to generate the coherence estimation ultrasound image 207. In some embodiments, the coherence coefficients 206 are used to generate the coherence estimation ultrasound image 207 using a grayscale conversion in which a coherence coefficient of 0 is mapped to total black, and a coherence coefficient of 1 is mapped to total white. The grayscale may be linear or non-linear. In some embodiments, the coherence estimation ultrasound image 207 may be displayed on a screen for visualization by a user, such as a physician or an ultrasound technician, depending on the application.
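The linear grayscale conversion described above can be sketched as follows; the coefficient map contents and the 8-bit output format are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
coherence = rng.random((128, 128))  # hypothetical per-pixel coefficients in [0, 1]

# Linear grayscale: coefficient 0 -> total black (0), 1 -> total white (255)
gray = np.clip(coherence, 0.0, 1.0)
image = (gray * 255).astype(np.uint8)

print(image.dtype, image.shape)
```

A non-linear grayscale would replace the identity mapping with, e.g., a gamma curve applied to `gray` before scaling.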
[0079] By applying the spatial filters to the beamformed signal, instead of processing channel data from individual transducer elements as is done in other coherence estimation techniques, embodiments of the improved ultrasound imaging systems described herein may be able to improve the contrast-to-noise and signal-to-noise ratios of a standard ultrasound image without requiring the system to process large amounts of data in real time. For example, a 64-channel beamformer with 12-bit A/Ds running at 40 MHz requires transferring data from the probe at a rate of 3.58 gigabytes/second in order to perform coherence calculations on individual channel data. In comparison, an embodiment of the improved ultrasound imaging system described herein, using beamformed data to perform coherence calculations on the received signal, may only require transferring data from the probe at a rate of 57.2 megabytes/second, while still providing improved contrast-to-noise and signal-to-noise ratios of a standard ultrasound image.
[0081] In block 320, the UPS 110 (see
[0082] In some embodiments, the spatial filters may be represented as matrices with dimensions such that the filters are applied by multiplying the k-space representation of each filter by the k-space representation of the RF signal matrix to generate multiple k-space representations of filtered RF signal matrices.
[0083] In some embodiments, the spatial filters may be represented as matrices. In some embodiments, the matrices may be populated with values equal to 0 or 1, or any value in between. In some embodiments, each of the multiple spatial filters may be randomly or pseudo-randomly generated. In some embodiments, each of the multiple spatial filters may be randomly or pseudo-randomly generated such that a certain predetermined percentage of the entries in a given filter matrix are equal to 0 and a certain predetermined percentage of the entries in a given filter matrix are equal to 1. For example, in an embodiment, each of the multiple matrices may have 80% of its entries equal to 1, and 20% of its entries equal to 0, with the distribution of such entries within the matrix randomized or pseudo-randomized. In such an example, any two of the multiple filters would be expected to have approximately 64% overlap with one another. In some embodiments, the multiple spatial filters are generated such that the cumulative coverage of the spatial filters in k-space overlaps with all or most of the k-space representation of the ultrasound RF data. In some embodiments, the multiple spatial filters may be generated such that each spatial filter has at least approximately 40% overlap with adjacent spatial filter(s). In some embodiments, spatial filters may overlap with adjacent spatial filters in the axial direction, the lateral direction, or a combination thereof. In some embodiments, the multiple spatial filters may be generated such that each spatial filter has between approximately 40% and approximately 99.9% overlap with at least one other spatial filter. Certain embodiments of k-space representations of spatial filters are shown in
[0084] The spatial filters may be generated prior to imaging and stored in computer memory accessible by the ultrasound imaging system, or the spatial filters may be generated during the imaging process, or a combination thereof. In some embodiments, between approximately 10 and 100 spatial filters may be used. In some embodiments, less than 10 spatial filters may be used. In some embodiments, more than 100 spatial filters may be used.
[0085] By applying the spatial filters to the beamformed signal, instead of processing channel data from individual transducer elements as is done in other coherence estimation techniques, embodiments of the improved ultrasound imaging methods described herein may be able to improve the contrast-to-noise and signal-to-noise ratios of a standard ultrasound image without requiring the system to process large amounts of data in real time. For example, a 64-channel beamformer with 12-bit A/Ds running at 40 MHz requires transferring data from the probe at a rate of 3.58 gigabytes/second in order to perform coherence calculations on individual channel data. In comparison, an embodiment of the improved ultrasound imaging system described herein, using beamformed data to perform coherence calculations on the received signal, may only require transferring data from the probe at a rate of 57.2 megabytes/second, while still providing improved contrast-to-noise and signal-to-noise ratios of a standard ultrasound image.
[0086] In block 330, the UPS 110 (see
[0087] In some embodiments, the UPS 110 (see
[0088] In some embodiments, the correlation between two signals may be quantified using the following equation, where s.sub.i is the time domain signal from filter i, and s.sub.j is the time domain signal from filter j:

R.sub.ij=Σ.sub.n s.sub.i(n)s.sub.j(n)

[0089] In some embodiments, the coherence between two signals may be quantified using the following normalized cross-correlation:

ρ.sub.ij=Σ.sub.n s.sub.i(n)s.sub.j(n)/√(Σ.sub.n s.sub.i(n).sup.2·Σ.sub.n s.sub.j(n).sup.2)

[0090] In some embodiments, a coherence estimate may be performed using the following equation, where ξ is the set of signals that are cross-correlated with signal i:

C.sub.i=Σ.sub.j∈ξ ρ.sub.ij
[0091] In block 340, the UPS 110 (see
[0092] In block 350, the UPS 110 (see
[0093] At the focal depth, the aperture function A(x.sub.0) and the lateral point spread function psf(x) may form a Fourier Transform pair:

A(x.sub.0)⇄psf(x)

Additionally, x.sub.0 may be the lateral aperture coordinate, x may be the lateral beam coordinate, λ may be the ultrasound wavelength, and z may be the focal depth. The double-sided arrow may indicate that the two expressions are Fourier Transform pairs. The transmit-receive convolutional process may be represented by the below expression.

A.sub.t(x.sub.0)*A.sub.r(x.sub.0)⇄psf.sub.t(x)psf.sub.r(x)
[0094] Due to the multiplicative process of transmit and receive, and assuming the same aperture is used in transmit and receive, the transmit-receive point spread function may be as shown below.

psf.sub.tr(x)=psf.sub.t(x)psf.sub.r(x)
[0095] A Fourier Transform, indicated by F{ }, may be performed on the transmit-receive point spread function to obtain its frequency domain representation, yielding the below equation.

F{psf.sub.tr(x)}=H(u.sub.x)=tri(λzu.sub.x/D)
[0096] In the above expression, tri may be a triangle function defined as

tri(x)=1−|x| for |x|≤1, and tri(x)=0 otherwise,

and u.sub.x may be the lateral spatial frequency. The frequency domain representation of the point spread function may also be referred to as a k-space representation or a transfer function of the ultrasound imaging system 100. Notably, the width of the triangle function may be proportional to twice the width of the aperture D. The same triangle function may be arrived at through convolution of the two rectangular apertures each having a width 401 of D because of the Fourier Transform relationship between the aperture and the point spread function. Asterisk 403 of
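The statement that convolving two rectangular apertures of width D yields a triangle of twice that width can be checked numerically; the discretization below (an aperture of 50 unit samples) is an illustrative assumption:

```python
import numpy as np

D = 50                # aperture width, in arbitrary discrete samples
rect = np.ones(D)     # one-way aperture: rectangle function of width D

# Two-way k-space response: convolution of transmit and receive apertures
two_way = np.convolve(rect, rect)

# The result is a triangle spanning ~2D samples, peaking at D at its center
print(len(two_way))     # 99  (2*D - 1 samples)
print(two_way.max())    # 50.0
```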
[0097] In receive, a single transducer element may be approximated as a point at the element position, and its k-space representation may be a rectangle function of width 401 of D centered at a location corresponding to the element position. This process may be expressed by the below equation, where x.sub.ele is the lateral position of the element and rect(x)=1 for |x|≤½ and 0 otherwise.

H.sub.ele(u.sub.x)=rect((λzu.sub.x−x.sub.ele)/D)
[0098] Based on the processes shown in
[0100] After H.sub.ele(u.sub.x) has been obtained for all elements, estimates of channel data may be obtained for any imaging target when F.sub.sys(u.sub.x) is replaced with the 2D DFT of the beamformed RF matrix. The estimated channel data may be obtained by multiplying the 2D DFT of the beamformed RF matrix with H.sub.ele(u.sub.x) and then taking the inverse 2D DFT. Coherence estimation may be performed using the estimated channel data in the same manner as short-lag spatial coherence (SLSC).
[0101] After filtering, a 2D inverse DFT may be performed to obtain the filter outputs in the space/time domain. The filter outputs may be intended to provide estimates of the RF channel data. Varying degrees of overlap among the filters, combined with normalized cross-correlation of the filter output data, may be used to approximate the coherence of the received ultrasound wave.
[0102] If the cross-correlation coefficient is high, signals may be estimated to be highly coherent. If the cross-correlation coefficient is low, signals may be estimated to have low coherence. If several dozen filters are used, many combinations of pairs of filtered data may be used to produce many cross-correlation coefficients for a single pixel. The coefficients may then be summed to produce a final pixel value in the coherence image using the below equation.

C.sub.i=Σ.sub.j∈ξ ρ.sub.ij
[0105] The contour 504 may indicate the k-space coverage when using all 64 elements in transmit and receive. The k-space coverage may be of a 2.5 MHz, 64-element phased array with 50% −6 dB fractional bandwidth. A contour of the magnitude of the Fourier Transform may appear similar to the contour 504.
TABLE 1

         CNR                       SNR
Figure   DAS    SLSC   LACE        DAS    SLSC   LACE
12A      3.82   5.86   11.96       2.08   7.81   16.79
12B      3.81   4.47    9.33       2.50   8.82   13.38
12C      3.75   5.20   10.65       2.53   7.99   14.52
[0122] Exemplary embodiments of the methods/systems have been disclosed in an illustrative style. Accordingly, the terminology employed throughout should be read in a non-limiting manner. Although minor modifications to the teachings herein will occur to those well versed in the art, it shall be understood that what is intended to be circumscribed within the scope of the patent warranted hereon are all such embodiments that reasonably fall within the scope of the advancement to the art hereby contributed, and that that scope shall not be restricted, except in light of the appended claims and their equivalents.