Ultrasonic imaging device and image processing device
11141137 · 2021-10-12
Assignee
Inventors
CPC classification
A61B8/4494
HUMAN NECESSITIES
G01S15/8945
PHYSICS
G01S7/52077
PHYSICS
A61B8/4477
HUMAN NECESSITIES
G01S15/8995
PHYSICS
G01S15/8913
PHYSICS
G01S7/52038
PHYSICS
A61B8/5207
HUMAN NECESSITIES
A61B8/5253
HUMAN NECESSITIES
International classification
Abstract
An object of the invention is to provide an ultrasonic image in which the tissue structure is clear while speckle noise is reduced. An ultrasonic wave is transmitted from the transducer to the subject, and an echo generated in the subject is received. A first ultrasonic image and a second ultrasonic image are generated using the reception signal. The second ultrasonic image is an image smoother than the first ultrasonic image. The image processing unit calculates filter coefficients using pixel values of corresponding pixels of the first ultrasonic image and the second ultrasonic image, and generates an output image by processing one of the first ultrasonic image and the second ultrasonic image using the filter coefficients.
Claims
1. An ultrasonic imaging device, comprising: a processor coupled to a memory storing instructions that when executed configure the processor to: transmit an ultrasonic wave from one or more transducers to a subject by outputting a transmission signal to the one or more transducers, at the same time receive a reception signal output by the one or more transducers that received an echo generated in the subject and perform a predetermined processing; generate a first ultrasonic image and a second ultrasonic image using the processed reception signal; generate and output an output image using the first ultrasonic image and the second ultrasonic image, generate the second ultrasonic image which is smoother than the first ultrasonic image, generate the output image by calculating filter coefficients using pixel values of corresponding pixels of the first ultrasonic image and the second ultrasonic image, and process one of the first ultrasonic image and the second ultrasonic image by the filter coefficients, wherein the processor is further configured to: use a coefficient a and a constant b as the filter coefficients, calculate the coefficient a and the constant b, with which a difference between a value obtained by multiplying a pixel value of a pixel of the first ultrasonic image by the coefficient a and adding the constant b and the pixel value of the corresponding pixel of the second ultrasonic image is minimum, using the pixel values of a plurality of pixels in a window set for the first ultrasonic image and the second ultrasonic image, and calculate a pixel value of the output image by multiplying the pixel value of the pixel in the window of the first ultrasonic image by the calculated coefficient a and adding the constant b.
2. The ultrasonic imaging device according to claim 1, wherein the processor is further configured to: generate the first ultrasonic image such that a boundary of a tissue structure of the subject is enhanced, and generate the second ultrasonic image such that a speckle noise is reduced.
3. The ultrasonic imaging device according to claim 1, wherein the processor is further configured to process the first ultrasonic image by the filter coefficients.
4. The ultrasonic imaging device according to claim 1, wherein the processor is further configured to set a plurality of windows in the first ultrasonic image and the second ultrasonic image, and calculate the filter coefficients for each window.
5. The ultrasonic imaging device according to claim 4, wherein the processor is further configured to set the plurality of windows so as to partially overlap with each other, and calculate a pixel value of the output image using a value obtained by combining the coefficient a and the constant b obtained for the overlapping windows for the pixels located in the region where the windows overlap.
6. The ultrasonic imaging device according to claim 1, wherein at least one pair of the plurality of transducers are disposed at positions facing each other across the subject, and may receive a transmitted wave of the ultrasonic wave transmitted to the subject, and wherein the processor is further configured to generate a transmitted wave image of the subject using the reception signal of the transmitted wave.
7. The ultrasonic imaging device according to claim 1, wherein the plurality of transducers are arranged in an array, wherein the processor is further configured to generate the first ultrasonic image by delaying and adding the reception signals output from the plurality of transducers in the reception aperture set in advance, and wherein the processor is further configured to divide the reception aperture into a plurality of sub reception apertures, and delay and add the reception signals output from the plurality of transducers in the sub reception aperture, for each sub reception aperture, thereby generating the second ultrasonic image by obtaining ultrasonic images for each sub reception aperture and combining the obtained ultrasonic images for each sub reception aperture.
8. An image processing device, comprising: a processor coupled to a memory storing instructions that when executed configure the processor to: generate a first ultrasonic image and a second ultrasonic image by receiving a reception signal output by a plurality of transducers that received an echo generated in a subject to which an ultrasonic wave is transmitted, or an ultrasonic image generated from the reception signal; generate and output an output image using the first ultrasonic image and the second ultrasonic image, generate the second ultrasonic image which is smoother than the first ultrasonic image, and generate the output image by calculating filter coefficients using pixel values of corresponding pixels of the first ultrasonic image and the second ultrasonic image, and process one of the first ultrasonic image and the second ultrasonic image by the filter coefficients, wherein the processor is further configured to: use a coefficient a and a constant b as the filter coefficients, calculate the coefficient a and the constant b, with which a difference between a value obtained by multiplying a pixel value of a pixel of the first ultrasonic image by the coefficient a and adding the constant b and the pixel value of the corresponding pixel of the second ultrasonic image is minimum, using the pixel values of a plurality of pixels in a window set for the first ultrasonic image and the second ultrasonic image, and calculate a pixel value of the output image by multiplying the pixel value of the pixel in the window of the first ultrasonic image by the calculated coefficient a and adding the constant b.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
(12) The ultrasonic imaging device according to an embodiment of the invention will be described using the drawings.
(14) As shown in
One or more transducers 1 are connected to the transmission/reception unit 10. The transmission/reception unit 10 outputs a transmission signal to the one or more transducers 1. As a result, an ultrasonic wave is transmitted from the transducer 1 to the subject 2, and an ultrasonic echo is generated in the subject 2. The generated echo is received by the transducer 1, and the transducer 1 outputs a reception signal. The transmission/reception unit 10 receives the reception signal output from the transducer 1 and performs predetermined processing such as A/D conversion.
(16) The image generation unit 20 generates the first ultrasonic image and the second ultrasonic image using the reception signal processed by the transmission/reception unit 10. As the second ultrasonic image, the image generation unit 20 generates an image smoother than the first ultrasonic image. An example of the first ultrasonic image and the second ultrasonic image is illustrated in
(17) The image processing unit 30 calculates the filter coefficient using the pixel values of the corresponding pixels of the first ultrasonic image and the second ultrasonic image, and generates the output image by processing one of the first ultrasonic image and the second ultrasonic image according to the filter coefficient.
(18) Since the second ultrasonic image is smoother than the first ultrasonic image, its speckle noise is reduced compared to the first ultrasonic image, but the contour of the tissue structure of the subject 2 tends to be more blurred than in the first ultrasonic image. Conversely, the first ultrasonic image, not being smoothed like the second ultrasonic image, contains more speckle noise, but the contour of the tissue structure of the subject 2 tends to appear clearly with high contrast. By appropriately determining the filter coefficients using these two types of ultrasonic images, in which the speckle noise and the contour of the tissue structure appear with different characteristics, and processing either the first ultrasonic image or the second ultrasonic image with the filter coefficients, it is possible to generate an image in which the contour of the tissue structure is clear while the speckle noise is reduced.
(19) Therefore, it is desirable that the first generation unit 21 of the image generation unit 20 generates the first ultrasonic image such that the boundary of the tissue structure of the subject 2 is enhanced, and the second generation unit 22 generates the second ultrasonic image such that speckle noise is reduced.
(20) When the image processing unit 30 obtains the output image, it is desirable that the image processed according to the filter coefficients is the first ultrasonic image, in which the contour of the tissue structure of the subject 2 appears clearly.
(21) The image processing unit 30 uses, for example, a coefficient a and a constant b as the filter coefficients. Using the pixel values E_i and S_i of a plurality of pixels in a window set for the first ultrasonic image and the second ultrasonic image, the image processing unit 30 calculates the coefficient a and the constant b with which the difference between the value (aE_i + b), obtained by multiplying the pixel value E_i of the i-th pixel of the first ultrasonic image by the coefficient a and adding the constant b, and the pixel value S_i of the corresponding i-th pixel of the second ultrasonic image is as small as possible. The image processing unit 30 then multiplies the pixel value E_i of each pixel in the window of the first ultrasonic image by the calculated coefficient a and adds the constant b to obtain the pixel value O_i of the output image.
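As an illustration, the per-window fit described above has the standard least-squares closed form: a is the covariance of E and S divided by the variance of E, and b follows from the means. The following numpy sketch is illustrative only; the function name and library choice are not from the patent.

```python
import numpy as np

def window_coefficients(E_win, S_win):
    """Least-squares fit of S ~ a*E + b over one window.

    E_win: pixel values of the first (structure enhanced) image in the window
    S_win: pixel values of the second (speckle suppressed) image in the window
    """
    E = np.asarray(E_win, dtype=float).ravel()
    S = np.asarray(S_win, dtype=float).ravel()
    mean_E, mean_S = E.mean(), S.mean()
    var_E = E.var()
    cov_ES = ((E - mean_E) * (S - mean_S)).mean()
    a = cov_ES / var_E if var_E > 0 else 0.0
    b = mean_S - a * mean_E
    return a, b
```

The output pixel values for the window are then obtained as a*E + b.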
(22) It is desirable that the image processing unit 30 sets a plurality of windows in the first ultrasonic image and the second ultrasonic image, and calculates a filter coefficient for each window. For example, as shown in
(23) For example, the image processing unit 30 can use an optimization method such as the least-squares method to obtain the coefficient a and the constant b that minimize the difference between (aE_i+b) and S_i. When this optimization is executed, the solution can be stabilized by adding a penalty term, which makes it possible to reduce false images in the output image.
(24) In addition, when performing the above-described optimization, the image processing unit 30 may calculate the solution that gives the minimum value using a sequential calculation method such as the steepest descent method, or, if an analytical solution that gives the minimum value can be obtained, may calculate it by analytical calculation.
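A minimal sketch of the sequential option: minimizing the penalized squared error by steepest descent. The penalty term lam*a**2 and the step size are illustrative choices, not values specified by the patent; the analytical least-squares solution of paragraph (23) would give the same result.

```python
import numpy as np

def fit_ab_gradient_descent(E, S, lam=0.01, lr=0.001, steps=5000):
    """Minimize sum_i (a*E_i + b - S_i)**2 + lam*a**2 by steepest descent.

    lam is a small penalty term stabilizing the solution (an assumed form of
    the penalty mentioned in the text); lr and steps are illustrative.
    """
    E = np.asarray(E, dtype=float)
    S = np.asarray(S, dtype=float)
    a, b = 0.0, 0.0
    for _ in range(steps):
        r = a * E + b - S                     # residuals a*E_i + b - S_i
        grad_a = 2.0 * (r * E).sum() + 2.0 * lam * a
        grad_b = 2.0 * r.sum()
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b
```

With a quadratic objective like this, a fixed small step converges linearly; the analytical route is cheaper when available.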
(25) In addition, the degree of speckle reduction and the degree of clarification of the contour of the tissue boundary change according to the size of the window. When the window size is too small, the output image becomes nearly equal to the speckle suppressed image; when the window size is too large, the whole output image becomes a smoothed image. Therefore, an appropriate window size may be determined, for example, by setting windows of various sizes in advance for a plurality of ultrasonic images, calculating the filter coefficients, generating the output images, and selecting in advance the window size that gives a large degree of speckle reduction and a large degree of clarification of the contour of the tissue boundary, or by accepting a designation of the window size from a user.
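One way to realize the window-size selection described above is a simple sweep over candidate sizes. The scoring used here (low variance in a homogeneous region for speckle reduction, high mean gradient at a tissue boundary for clarity) and all names are illustrative assumptions; `filter_fn` stands for the whole filter pipeline and is not defined by the patent.

```python
import numpy as np

def choose_window_size(E, S, candidate_sizes, speckle_roi, edge_roi, filter_fn):
    """Pick the window size whose output best balances speckle reduction
    and edge clarity. filter_fn(E, S, size) is assumed to return the
    filtered image for the given window size; rois are 2-D slice tuples.
    """
    best_size, best_score = None, -np.inf
    for size in candidate_sizes:
        out = filter_fn(E, S, size)
        speckle = out[speckle_roi].std()                               # lower is better
        clarity = np.abs(np.gradient(out[edge_roi], axis=1)).mean()    # higher is better
        score = clarity - speckle
        if score > best_score:
            best_size, best_score = size, score
    return best_size
```

In practice the score weighting would be tuned, or the choice left to the user as the text suggests.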
(26) In addition, as shown in
(27) The ultrasonic imaging device 100 of the embodiment may be configured as an ultrasonic CT device having a function of generating a transmitted wave image from the transmitted wave of the ultrasonic wave transmitted through the subject 2 as well as a function of generating the image by an echo of the ultrasonic wave. In this case, as shown in
Specific Embodiments
(28) A specific configuration of the ultrasonic imaging device 100 of the embodiment will be described with reference to
(29) As shown in
(30) In the example in
(31) Imaging conditions of the ultrasonic CT device are set by the user through the touch panel of the input unit 60 or the like.
(32) The transmission/reception unit 10, the image generation unit 20, the image processing unit 30, and the control unit 50 may realize each of their functions by software, or realize a part or all of the functions by hardware. When realized by software, each unit includes a processor (for example, a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU)) and a memory in which a program is stored in advance, and the processor realizes the functions by reading and executing the program. When realized by hardware, a part or the whole of each unit may be constituted using a custom IC such as an Application Specific Integrated Circuit (ASIC) or a programmable IC such as a Field-Programmable Gate Array (FPGA), with the circuit designed to realize the operation.
(33) The operation of the ultrasonic CT device 100 will be described below using the flowchart of
(34) In the ultrasonic CT device of the embodiment, when the power is turned on, the control unit 50 acquires the temperature of the water in the water tank 103 from the thermometer, heats the water with the heating device until it reaches a predetermined temperature (about body temperature), and deaerates the water with the deaerator. Thus, the water tank 103 is filled with deaerated water adjusted to a predetermined temperature. In a state in which the subject 2 is not inserted into the water tank 103, the control unit 50 transmits and receives the ultrasonic wave under predetermined conditions, and acquires in advance the reception data before the subject 2 is inserted.
(35) As illustrated in
(36) Here, the speckle suppressed image is the second ultrasonic image described above: an image generated by an image generation method that reduces speckle, such as spatial compounding, or an image obtained by applying speckle-reducing image processing, such as smoothing, to an image once generated by beamforming such as the delay-and-add method. The structure enhanced image is the first ultrasonic image described above: an image obtained by beamforming such as the delay-and-add method, or a once-generated image further subjected to processing that enhances boundaries and the like. The structure enhanced image may be any image in which the contour of the tissue structure of the subject 2 appears clearly with high contrast, and is not limited to an image subjected to image processing such as boundary enhancement. The filter processed image is the image generated by the image processing unit 30 calculating the filter coefficients using the pixel values of the corresponding pixels of the first ultrasonic image and the second ultrasonic image, and processing one of the two images with the filter coefficients. The filter processed image is thus an image with a clear tissue structure and reduced speckle.
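As a loose illustration only (the patent's speckle suppressed image may come from spatial compounding or sub-aperture processing rather than post-hoc smoothing), a mean filter and an unsharp mask show the two kinds of input the filter expects: a smoothed S and a boundary-enhanced E.

```python
import numpy as np

def box_blur(img, size=5):
    """Simple mean filter; stands in for the smoothing that can yield the
    speckle suppressed image S."""
    img = np.asarray(img, dtype=float)
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(size):          # accumulate all shifted copies in the window
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def unsharp_mask(img, size=5, amount=1.0):
    """Boundary enhancement; stands in for the structure enhanced image E."""
    img = np.asarray(img, dtype=float)
    return img + amount * (img - box_blur(img, size))
```

Any other combination of a smoothing method and an edge-enhancing method would serve the same role.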
(37) When the user selects the type of image to be generated by pressing any one of the buttons 61, 62, and 63, the control unit 50 displays on the image display unit 40 a message prompting the subject 2 to lie on the bed 102 and to insert one breast into the water tank 103. When the control unit 50 confirms, through operation of the input unit 60 by the subject 2, that the breast of the subject 2 is inserted into the water tank 103, it transmits and receives the ultrasonic waves from the transducer array 101 to the subject 2 (step 501). Specifically, under the control of the control unit 50, the transmission/reception unit 10 generates the transmission signal based on a condition input from the input unit 60 or a predetermined imaging condition, and outputs the transmission signal to one or a plurality of transducers 1 constituting the transducer array 101. As a result, as illustrated in
(38) As illustrated in
(39) The transmission/reception unit 10 converts the received signal (RF signal) into a digital signal by sampling.
(40) When the selection button of the image type accepted by the input unit 60 is the button 61 that selects the filter processed image (step 503), the control unit 50 causes image generation unit 20 to generate the structure enhanced image E and the speckle suppressed image S (step 504), and generates a filter processed image using the structure enhanced image E and the speckle suppressed image S (step 505).
(41) First, as illustrated in
(42) The first generation unit 21 will be described in more detail. As illustrated in
(43) On the other hand, as illustrated in
(44) Next, the image processing unit 30 calculates the filter coefficient using the structure enhanced image E and the speckle suppressed image S generated in step 504, and filter processes the structure enhanced image E using the calculated filter coefficient to generate a filter processed image. As a result, it is possible to generate the filter processed image in which the speckle is reduced and the contour of the tissue structure is clear. This process will be described in detail with reference to the flow of
(45) The image processing unit 30 sets a plurality of windows 23 at positions corresponding to the structure enhanced image E and the speckle suppressed image S as shown in
(46) Equation (1), representing the i-th pixel value O_i of the output image, is obtained for all pixels in the k-th window 23 using the pixel value E_i of the i-th pixel in one k-th window 23 of the structure enhanced image E and the filter coefficients, namely the coefficient a_k and the constant b_k. In addition, equation (2), representing the i-th pixel value O_i of the output image, is obtained for all pixels in the k-th window 23 using the pixel value S_i of the i-th pixel in the k-th window 23 of the speckle suppressed image S and an unwanted value n_i, such as noise, for each pixel. As a result, as many instances of equations (1) and (2) as the number of pixels m in the k-th window are obtained.
O_i = a_k E_i + b_k  (1)
O_i = S_i − n_i  (2)
(47) The image processing unit 30 calculates the coefficient a_k and the constant b_k determined for the k-th window, and n_i determined for each pixel, by obtaining the solution in which n_i is minimized using the m instances of equations (1) and (2). As a result, the filter coefficients (coefficient a_k, constant b_k, and n_i) for the window are calculated (step 1002). For example, the image processing unit 30 calculates the filter coefficients using an optimization method such as the least-squares method.
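A sketch of step 1002, assuming the windows are laid out on a regular non-overlapping grid: minimizing the sum of n_i² = (S_i − a_k·E_i − b_k)² per window gives the closed form below. The small `eps` plays the role of a stabilizing penalty on a_k; the grid layout, names, and `eps` value are illustrative choices, not specified by the patent.

```python
import numpy as np

def window_filter_coefficients(E, S, win=8, eps=1e-6):
    """Compute (a_k, b_k) for each non-overlapping win x win window by
    minimizing sum_i n_i**2 with n_i = S_i - (a_k*E_i + b_k)."""
    E = np.asarray(E, dtype=float)
    S = np.asarray(S, dtype=float)
    H, W = E.shape
    a = np.zeros((H // win, W // win))
    b = np.zeros_like(a)
    for r in range(H // win):
        for c in range(W // win):
            Ew = E[r*win:(r+1)*win, c*win:(c+1)*win].ravel()
            Sw = S[r*win:(r+1)*win, c*win:(c+1)*win].ravel()
            cov = np.mean(Ew * Sw) - Ew.mean() * Sw.mean()
            var = Ew.var()
            a[r, c] = cov / (var + eps)          # eps stabilizes flat windows
            b[r, c] = Sw.mean() - a[r, c] * Ew.mean()
    return a, b
```

Each (a_k, b_k) pair is then used in equation (1) for the pixels of its own window.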
(48) This is repeated until the filter coefficients are obtained for all the windows 23 (step 1003). Further, as shown in
(49) The structure enhanced image E is processed with the filter coefficients for each window to generate the filter processed image (output image) O (step 1004). Specifically, the pixel value O_i of the filter processed image (output image) O is calculated by equation (1), using the pixel value E_i of the structure enhanced image E and the filter coefficients (coefficient a_k and constant b_k) of the window 23 to which the pixel belongs.
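Taking paragraph (48)'s overlapping-window variant together with this application step: compute (a, b) for every window, combine (here, average) the coefficients of all windows covering a pixel, and evaluate equation (1) per pixel. A hedged numpy sketch, assuming windows overlap by sliding one pixel at a time; the patent does not fix the overlap pattern or the combination rule.

```python
import numpy as np

def apply_overlapping_filter(E, S, win=8, eps=1e-6):
    """Fit (a, b) in every win x win window (one-pixel stride), average the
    coefficients per pixel over all covering windows, and return a*E + b."""
    E = np.asarray(E, dtype=float)
    S = np.asarray(S, dtype=float)
    H, W = E.shape
    a_sum = np.zeros((H, W)); b_sum = np.zeros((H, W)); cnt = np.zeros((H, W))
    for r in range(H - win + 1):
        for c in range(W - win + 1):
            Ew = E[r:r+win, c:c+win].ravel()
            Sw = S[r:r+win, c:c+win].ravel()
            var = Ew.var()
            cov = np.mean(Ew * Sw) - Ew.mean() * Sw.mean()
            a = cov / (var + eps)
            b = Sw.mean() - a * Ew.mean()
            a_sum[r:r+win, c:c+win] += a         # accumulate per covered pixel
            b_sum[r:r+win, c:c+win] += b
            cnt[r:r+win, c:c+win] += 1
    return (a_sum / cnt) * E + (b_sum / cnt)
```

Averaging the coefficients over overlapping windows suppresses blocking artifacts at window boundaries, which is the stated purpose of the overlap.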
(50) This filter process transfers the salient structure of the structure enhanced image E to the speckle suppressed image S, and has the effect of smoothing the speckle suppressed image in parts where the structure enhanced image has no significant structure. As a result, it is possible to generate a filter processed image (output image) that achieves both speckle reduction and clarification of the contour of the tissue structure.
(51) The image processing unit 30 proceeds to step 506 in
(52) On the other hand, if the selection button of the image type accepted from the user by the input unit 60 is not the button 61 for selecting the filter processed image in step 503, the process proceeds to step 507. If the user presses the button 62 for selecting the structure enhanced image E in step 507, the image generation unit 20 generates the structure enhanced image E in step 508. The generation processing of the structure enhanced image E is as described in step 504. Then, the process proceeds to step 509, in which the image generation unit 20 displays the structure enhanced image E on the image display unit 40.
(53) If the user presses the button 63 for selecting the speckle suppressed image S in step 507, the process proceeds to step 510, and the image generation unit 20 generates the speckle suppressed image by the processing method described in step 504. Then, in step 511, the image generation unit 20 displays the speckle suppressed image S on the image display unit 40.
(54) As described above, according to the user's selection, the ultrasonic CT device 100 of the embodiment can display any one of the filter processed image, which achieves both speckle reduction and clarification of the contour of the tissue structure, the structure enhanced image E, and the speckle suppressed image S.
(55) When the transducer array 101 is moved to a predetermined position (slice) by the driving unit 202 and the reflected wave signal is received, the processing in
(56) The ultrasonic CT device 100 can also generate a transmitted wave image (attenuation rate image, sound velocity image) of the subject 2 using the received signal of the transmitted wave. This will be briefly described below.
(57) The image generation unit 20 obtains, for each view, the amplitude of the transmitted-wave signal received by each transducer 1 in a state in which the subject 2 is inserted. On the other hand, the image generation unit 20 obtains the amplitude of the reception signal of each transducer 1 received without the subject 2 inserted. The image generation unit 20 calculates the difference in the logarithm of the amplitude before and after the insertion of the subject 2 for each view and each reception channel. This collection of data is referred to as a sinogram. The image generation unit 20 reconstructs a tomographic image of the subject 2 by processing the sinogram of the difference in the logarithm of the amplitude with Filtered Back Projection (FBP) or the like, widely used in the field of X-ray CT. Thus, a distribution image of the difference in the attenuation rate before and after insertion of the subject 2 is obtained. Using a predetermined value (estimated value) as the attenuation rate of water, the image generation unit 20 generates, from the distribution image of the difference in the attenuation rate, an image (attenuation image) showing the attenuation rate distribution (unit: dB/MHz/cm) of the subject 2.
(58) The image generation unit 20 performs a Hilbert transform in the time direction on the transmitted-wave signal output from each transducer 1 in each view, and obtains the reception timing of the maximum amplitude of the received wave. The image generation unit 20 similarly obtains the reception timing of the maximum amplitude for the reception signals of each transducer 1 received before the insertion of the subject 2. The image generation unit 20 calculates the difference in reception timing before and after the insertion of the subject 2 for each view and each reception channel to obtain a sinogram. The image generation unit 20 reconstructs a tomographic image by processing the sinogram of the difference in reception timing with filtered back projection or the like. This tomographic image is the distribution image of the difference in “slowness” of the ultrasonic wave before and after insertion of the subject 2, where the slowness is the reciprocal of the sound velocity. The image generation unit 20 generates a distribution image (sound velocity image) of the sound velocity of the subject 2 from the distribution image of the difference in slowness, using the sound velocity value (estimated value) of water.
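The final conversion from the reconstructed slowness-difference image to a sound velocity image is a single per-pixel operation; the water sound velocity of 1500 m/s below is an assumed estimate, not a value from the patent.

```python
import numpy as np

def sound_velocity_image(slowness_diff, c_water=1500.0):
    """Convert a reconstructed image of the slowness difference
    (1/c_subject - 1/c_water, in s/m) into a sound velocity image (m/s),
    using an estimated sound velocity of water c_water."""
    slowness_diff = np.asarray(slowness_diff, dtype=float)
    # c_subject = 1 / (slowness_diff + 1/c_water)
    return 1.0 / (slowness_diff + 1.0 / c_water)
```

Where the subject is absent, the slowness difference is zero and the image correctly reads back the water velocity.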
(59) A three-dimensional attenuation image and/or a sound velocity image can be generated by repeating the generation of the attenuation image and/or the generation of the sound velocity image for each slice in which the transducer array 101 is moved by the driving unit 202.
(60) According to the selection by the user, the ultrasonic imaging device (ultrasonic CT device) of the embodiment can generate one or more of the filter processed image, which achieves both speckle reduction and clarification of the contour of the tissue structure, the structure enhanced image E, and the speckle suppressed image S, as well as the attenuation image and the sound velocity image, and display them on the image display unit. These images can therefore assist a doctor in diagnosing the presence or absence of a tumor in the tissue structure of the subject 2.
(61) In the above-described embodiment, an example in which the ring-shaped transducer array 101 as illustrated in
REFERENCE SIGN LIST
(62) 1: transducer 2: subject 9: input unit 10: transmission/reception unit 20: image generation unit 21: first generation unit 22: second generation unit 30: image processing unit 40: image display unit 50: control unit 60: input unit 100: ultrasonic imaging device (ultrasonic CT device) 101: transducer array 102: bed 103: water tank