Providing user interface in ultrasound system
11406362 · 2022-08-09
Assignee
Inventors
CPC classification
A61B8/463
HUMAN NECESSITIES
G01S7/52073
PHYSICS
G01S7/52071
PHYSICS
A61B8/465
HUMAN NECESSITIES
International classification
A61B8/00
HUMAN NECESSITIES
G10K11/34
PHYSICS
Abstract
Embodiments are provided of a user interface for performing a filtering process upon a vector Doppler image. In one embodiment, by way of non-limiting example, an ultrasound system comprises: a processing unit configured to form vector information of a target object based on ultrasound data corresponding to the target object and to form a user interface for performing the filtering process upon the vector Doppler image based on the vector information.
Claims
1. An ultrasound image processing apparatus comprising: a display configured to display a first user interface, wherein the first user interface comprises a first region corresponding to a vector Doppler image, and a second region corresponding to a graphic user interface for receiving a selection of vector information of a motion of an object, wherein the graphic user interface is circular and of a touch type; an input device comprising a touch screen, and configured to receive a first touch input to select a first region of interest in the first region and to receive second touch inputs in the circular graphic user interface to respectively select sector regions of the circular graphic user interface as second regions of interest within the graphic user interface; and a processor configured to: form vector information of the motion of the object corresponding to the first region of interest based on Doppler mode ultrasound data, wherein the vector information includes a direction and a velocity of the motion of the object, when one of the second touch inputs in the graphic user interface has a first distance to a center of the circular graphic user interface, determine a first center angle of one of the sector regions selected by the one of the second touch inputs, perform a first filtering process upon the vector Doppler image so as to filter out unselected vector information other than vector information, corresponding to the one of the sector regions having the first center angle as one of the second regions of interest, from among the formed vector information, and update the vector Doppler image based on the result of the first filtering process, and when another of the second touch inputs in the graphic user interface has a second distance to the center of the circular graphic user interface that is less than the first distance, determine a second center angle of another of the sector regions selected by the another of the second touch inputs to be greater than the first center angle, perform a second filtering process upon the vector Doppler image so as to filter out unselected vector information other than vector information, corresponding to the another of the sector regions having the second center angle as another of the second regions of interest, from among the formed vector information, and update the vector Doppler image based on the result of the second filtering process.
2. The ultrasound image processing apparatus of claim 1, wherein the second region maps the vector information to colors.
3. The ultrasound image processing apparatus of claim 1, wherein the input device is configured to receive the first input which selects a plurality of first regions of interest within the vector Doppler image.
4. The ultrasound image processing apparatus of claim 1, wherein the processor is configured to display the vector information corresponding to a selected one of the second regions of interest on the vector Doppler image by using at least one arrow and at least one color.
5. The ultrasound image processing apparatus of claim 1, wherein the second region comprises a plurality of regions having colors, and the colors in the plurality of regions correspond to the vector information of the motion of the object.
6. A method of processing an ultrasound image, comprising: displaying a first user interface, wherein the first user interface comprises a first region corresponding to a vector Doppler image, and a second region corresponding to a graphic user interface for receiving a selection of vector information of a motion of an object, wherein the graphic user interface is circular and of a touch type; receiving a first touch input to select a first region of interest in the first region and second touch inputs in the circular graphic user interface to respectively select sector regions of the circular graphic user interface as second regions of interest within the graphic user interface; forming vector information of the motion of the object corresponding to the first region of interest based on Doppler mode ultrasound data, wherein the vector information includes a direction and a velocity of the motion of the object; when one of the second touch inputs in the graphic user interface has a first distance to a center of the circular graphic user interface, determining a first center angle of one of the sector regions selected by the one of the second touch inputs, performing a first filtering process upon the vector Doppler image so as to filter out unselected vector information other than vector information, corresponding to the one of the sector regions having the first center angle as one of the second regions of interest, from among the formed vector information, and updating the vector Doppler image based on the result of the first filtering process; and when another of the second touch inputs in the graphic user interface has a second distance to the center of the circular graphic user interface that is less than the first distance, determining a second center angle of another of the sector regions selected by the another of the second touch inputs to be greater than the first center angle, performing a second filtering process upon the vector Doppler image so as to filter out unselected vector information other than vector information, corresponding to the another of the sector regions having the second center angle as another of the second regions of interest, from among the formed vector information, and updating the vector Doppler image based on the result of the second filtering process.
7. The method of claim 6, wherein the second region maps the vector information to colors.
8. The method of claim 6, wherein the receiving the first input comprises receiving the first input which selects a plurality of first regions of interest within the vector Doppler image.
9. The method of claim 6, wherein the updating the vector Doppler image comprises displaying the vector information corresponding to a selected one of the second regions of interest on the vector Doppler image by using at least one arrow and at least one color.
10. The method of claim 6, wherein the second region comprises a plurality of regions having colors, and the colors in the plurality of regions correspond to the vector information of the motion of the object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(13) A detailed description may be provided with reference to the accompanying drawings. One of ordinary skill in the art may realize that the following description is illustrative only and is not in any way limiting. Other embodiments of the present invention may readily suggest themselves to such skilled persons having the benefit of this disclosure.
(14) Referring to
(15) The user input unit 110 may be configured to receive input information from a user. In one embodiment, the input information may include first input information for setting a first region of interest ROI.sub.1 on a brightness mode image BI, as shown in
(16) The ultrasound system 100 may further include an ultrasound data acquiring unit 120. The ultrasound data acquiring unit 120 may be configured to transmit ultrasound signals to a living body. The living body may include the target object (e.g., blood flow, blood vessel, heart, etc.). The ultrasound data acquiring unit 120 may be further configured to receive ultrasound signals (i.e., ultrasound echo signals) from the living body to acquire ultrasound data corresponding to an ultrasound image.
(17)
(18) The ultrasound probe 310 may include a plurality of elements 311 (see
(19) The ultrasound data acquiring unit 120 may further include a transmitting section 320. The transmitting section 320 may be configured to control the transmission of the ultrasound signals. The transmitting section 320 may be also configured to generate electrical signals (hereinafter referred to as “transmission signals”) in consideration of the elements 311.
(20) In one embodiment, the transmitting section 320 may be configured to generate transmission signals (hereinafter referred to as “brightness mode transmission signals”) for obtaining the brightness mode image BI in consideration of the elements 311. Thus, the ultrasound probe 310 may be configured to convert the brightness mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output reception signals (hereinafter referred to as “brightness mode reception signals”).
(21) The transmitting section 320 may be further configured to generate transmission signals (hereinafter referred to as “Doppler mode transmission signals”) corresponding to an ensemble number in consideration of the elements 311 and at least one transmission direction of the ultrasound signals (i.e., transmission beam). Thus, the ultrasound probe 310 may be configured to convert the Doppler mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body in the at least one transmission direction, and receive the ultrasound echo signals from the living body to output reception signals (hereinafter referred to as “Doppler mode reception signals”). The ensemble number may represent the number of times the ultrasound signals are transmitted to and received from the living body.
(22) As one example, the transmitting section 320 may be configured to generate the Doppler mode transmission signals corresponding to the ensemble number in consideration of a transmission direction Tx and the elements 311, as shown in
(23) As another example, the transmitting section 320 may be configured to generate first Doppler mode transmission signals corresponding to the ensemble number in consideration of a first transmission direction Tx.sub.1 and the elements 311, as shown in
(24) In another embodiment, the transmitting section 320 may be configured to generate the brightness mode transmission signals for obtaining the brightness mode image BI in consideration of the elements 311. Thus, the ultrasound probe 310 may be configured to convert the brightness mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output the brightness mode reception signals.
(25) The transmitting section 320 may be further configured to generate the Doppler mode transmission signals corresponding to the ensemble number in consideration of the at least one transmission direction and the elements 311. Thus, the ultrasound probe 310 may be configured to convert the Doppler mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output the Doppler mode reception signals. The ultrasound signals may be transmitted in an interleaved transmission scheme. The interleaved transmission scheme will be described below in detail.
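One common reading of such an interleaved scheme, namely alternating the transmit directions within each ensemble repetition rather than firing all repetitions of one direction before the next, can be sketched as follows. The direction labels and the ensemble number are illustrative assumptions, not values from the disclosure.

```python
# Sketch of an interleaved Doppler-mode transmit schedule.
# The direction labels ("Tx1", "Tx2") and the ensemble number are
# hypothetical; only the alternation pattern is being illustrated.

def interleaved_schedule(directions, ensemble_number):
    """Build a transmit schedule that fires every direction once per
    pulse-repeat interval, repeated ensemble_number times."""
    schedule = []
    for _ in range(ensemble_number):
        schedule.extend(directions)  # one firing per direction, interleaved
    return schedule

# Two transmit directions with an ensemble number of 3 yield
# Tx1, Tx2, Tx1, Tx2, Tx1, Tx2 rather than Tx1 x3 followed by Tx2 x3.
```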
(26) For example, the transmitting section 320 may be configured to generate the first Doppler mode transmission signals in consideration of the first transmission direction Tx.sub.1 and the elements 311, as shown in
(27) Thereafter, the transmitting section 320 may be configured to generate the first Doppler mode transmission signals based on the pulse repeat interval, as shown in
(28) As described above, the transmitting section 320 may be configured to generate the first Doppler mode transmission signals and the second Doppler mode transmission signals corresponding to the ensemble number.
(29) In yet another embodiment, the transmitting section 320 may be configured to generate the brightness mode transmission signals for obtaining the brightness mode image BI in consideration of the elements 311. Thus, the ultrasound probe 310 may be configured to convert the brightness mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body, and receive the ultrasound echo signals from the living body to output the brightness mode reception signals.
(30) The transmitting section 320 may be further configured to generate the Doppler mode transmission signals corresponding to the ensemble number in consideration of the at least one transmission direction and the elements 311. Thus, the ultrasound probe 310 may be configured to convert the Doppler mode transmission signals provided from the transmitting section 320 into the ultrasound signals, transmit the ultrasound signals to the living body in the at least one transmission direction, and receive the ultrasound echo signals from the living body to output the Doppler mode reception signals. The ultrasound signals may be transmitted according to the pulse repeat interval.
(31) For example, the transmitting section 320 may be configured to generate the first Doppler mode transmission signals in consideration of the first transmission direction Tx.sub.1 and the elements 311 based on the pulse repeat interval, as shown in
(32) As described above, the transmitting section 320 may be configured to generate the first Doppler mode transmission signals and the second Doppler mode transmission signals corresponding to the ensemble number based on the pulse repeat interval.
(33) Referring back to
(34) In one embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the brightness mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “brightness mode sampling data”). The receiving section 330 may be further configured to perform the reception beam-forming upon the brightness mode sampling data to form reception-focused data (hereinafter referred to as “brightness mode reception-focused data”).
(35) The receiving section 330 may be further configured to perform the analog-digital conversion upon the Doppler mode reception signals provided from the ultrasound probe 310 to form sampling data (hereinafter referred to as “Doppler mode sampling data”). The receiving section 330 may be further configured to perform the reception beam-forming upon the Doppler mode sampling data to form reception-focused data (hereinafter referred to as “Doppler mode reception-focused data”) corresponding to at least one reception direction of the ultrasound echo signals (i.e., reception beam).
(36) As one example, the receiving section 330 may be configured to perform the analog-digital conversion upon the Doppler mode reception signals provided from the ultrasound probe 310 to form the Doppler mode sampling data. The receiving section 330 may be further configured to perform the reception beam-forming upon the Doppler mode sampling data to form first Doppler mode reception-focused data corresponding to a first reception direction Rx.sub.1 and second Doppler mode reception-focused data corresponding to a second reception direction Rx.sub.2, as shown in
(37) As another example, the receiving section 330 may be configured to perform the analog-digital conversion upon the first Doppler mode reception signals provided from the ultrasound probe 310 to form first Doppler mode sampling data corresponding to the first transmission direction Tx.sub.1, as shown in
(38) The reception beam-forming may be described with reference to the accompanying drawings.
(39) In one embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through a plurality of channels CH.sub.k, wherein 1≤k≤N, from the ultrasound probe 310 to form sampling data S.sub.i,j, wherein i and j are positive integers, as shown in
(40) For example, the receiving section 330 may be configured to set a curve (hereinafter referred to as “reception beam-forming curve”) CV.sub.6,3 for selecting the pixels for which the sampling data S.sub.6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements 311 and the orientation of the respective pixels of the ultrasound image UI with respect to the elements 311, as shown in
(41) Thereafter, the receiving section 330 may be configured to set a reception beam-forming curve CV.sub.6,4 for selecting the pixels for which the sampling data S.sub.6,4 are used as the pixel data during the reception beam-forming, based on the positions of the elements 311 and the orientation of the respective pixels of the ultrasound image UI with respect to the elements 311, as shown in
(42) The receiving section 330 may be configured to perform the reception beam-forming (i.e., summing) upon the sampling data cumulatively assigned to the respective pixels P.sub.a,b of the ultrasound image UI to form the reception-focused data.
(43) In another embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CH.sub.k from the ultrasound probe 310 to form the sampling data S.sub.i,j, as shown in
(44) For example, the receiving section 330 may be configured to set the reception beam-forming curve CV.sub.6,3 for selecting the pixels for which the sampling data S.sub.6,3 are used as the pixel data during the reception beam-forming, based on the positions of the elements 311 and the orientation of the respective pixels of the ultrasound image UI with respect to the elements 311, as shown in
(45) The receiving section 330 may be configured to perform the reception beam-forming upon the sampling data cumulatively assigned to the respective pixels P.sub.a,b of the ultrasound image UI to form the reception-focused data.
(46) In yet another embodiment, the receiving section 330 may be configured to perform the analog-digital conversion upon the reception signals provided through the plurality of channels CH.sub.k from the ultrasound probe 310 to form the sampling data S.sub.i,j, as shown in
(47) For example, the receiving section 330 may be configured to set the sampling data S.sub.1,1, S.sub.1,4, . . . S.sub.1,t, S.sub.2,1, S.sub.2,4, S.sub.2,t, S.sub.p,t as the sampling data set (denoted by a box) for selecting the pixels for which the sampling data S.sub.i,j are used as the pixel data during the reception beam-forming, as shown in
(48) The receiving section 330 may be further configured to detect the pixels corresponding to the respective sampling data of the sampling data set based on the positions of the elements 311 and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements 311. That is, during the reception beam-forming, the receiving section 330 may select the pixels for which the respective sampling data of the sampling data set are used as the pixel data, based on the positions of the elements 311 and the orientation of the respective pixels of the ultrasound image UI with respect to the elements 311. The receiving section 330 may be further configured to cumulatively assign the sampling data to the selected pixels in the same manner as in the above embodiments. The receiving section 330 may be also configured to perform the reception beam-forming upon the sampling data cumulatively assigned to the respective pixels of the ultrasound image UI to form the reception-focused data.
(49) In yet another embodiment, the receiving section 330 may be configured to perform down-sampling upon the reception signals provided through the plurality of channels CH.sub.k from the ultrasound probe 310 to form down-sampled data. As described above, the receiving section 330 may be further configured to detect the pixels corresponding to the respective sampling data based on the positions of the elements 311 and the positions (orientation) of the respective pixels of the ultrasound image UI with respect to the elements 311. That is, during the reception beam-forming, the receiving section 330 may select the pixels for which the respective sampling data are used as the pixel data, based on the positions of the elements 311 and the orientation of the pixels of the ultrasound image UI with respect to the elements 311. The receiving section 330 may be further configured to cumulatively assign the respective sampling data to the selected pixels in the same manner as in the above embodiments. The receiving section 330 may be further configured to perform the reception beam-forming upon the sampling data cumulatively assigned to the respective pixels of the ultrasound image UI to form the reception-focused data.
(50) However, it should be noted herein that the reception beam-forming may not be limited thereto.
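The cumulative pixel assignment and summing described above amounts to delay-and-sum reception beam-forming. A minimal sketch follows; the element geometry, sound speed, sampling rate, and the straight-down transmit path are illustrative assumptions, not parameters from the disclosure.

```python
import numpy as np

# Minimal delay-and-sum reception beam-forming sketch. For one pixel,
# each channel's sample at the round-trip time of flight is selected
# (cumulative assignment) and the contributions are summed to form the
# reception-focused value. Geometry and rates below are hypothetical.

def delay_and_sum(channel_data, element_x, pixel_xz, c0=1540.0, fs=40e6):
    """channel_data: (n_elements, n_samples) sampled echo signals.
    element_x: lateral element positions in metres.
    pixel_xz: (x, z) pixel position in metres.
    Returns the reception-focused value for that pixel."""
    px, pz = pixel_xz
    focused = 0.0
    for ch, ex in enumerate(element_x):
        # Round trip: straight down to depth pz, then back to element ch.
        dist = pz + np.hypot(px - ex, pz)
        idx = int(round(dist / c0 * fs))  # time of flight -> sample index
        if idx < channel_data.shape[1]:
            focused += channel_data[ch, idx]  # cumulative assignment + sum
    return focused
```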
(51) Referring back to
(52) In one embodiment, the ultrasound data forming section 340 may be configured to form ultrasound data (hereinafter referred to as “brightness mode ultrasound data”) corresponding to the brightness mode image based on the brightness mode reception-focused data provided from the receiving section 330. The brightness mode ultrasound data may include radio frequency data.
(53) The ultrasound data forming section 340 may be further configured to form ultrasound data (hereinafter referred to as “Doppler mode ultrasound data”) corresponding to the first region of interest ROI.sub.1 based on the Doppler mode reception-focused data provided from the receiving section 330. The Doppler mode ultrasound data may include in-phase/quadrature data. However, it should be noted herein that the Doppler mode ultrasound data may not be limited thereto.
(54) For example, the ultrasound data forming section 340 may form first Doppler mode ultrasound data based on the first Doppler mode reception-focused data provided from the receiving section 330. The ultrasound data forming section 340 may further form second Doppler mode ultrasound data based on the second Doppler mode reception-focused data provided from the receiving section 330.
(55) Referring back to
(56)
(57) The processing unit 130 may be configured to set the first region of interest ROI.sub.1 on the brightness mode image BI based on the input information (i.e., first input information) provided from the user input unit 110, at step S1504 in
(58) The processing unit 130 may be configured to form vector information based on the Doppler mode ultrasound data provided from the ultrasound data acquiring unit 120, at step S1506 in
(59) Generally, when the transmission direction of the ultrasound signals is equal to the reception direction of the ultrasound echo signals and a Doppler angle is 0, the following relationship may be established:
(60)
X cos θ=C.sub.0f.sub.d/(2f.sub.0) (1)
(61) In equation 1, X represents a reflector velocity (i.e., the velocity of the target object), C.sub.0 represents the sound speed in the living body, f.sub.d represents a Doppler shift frequency, and f.sub.0 represents an ultrasound frequency.
(62) The Doppler shift frequency f.sub.d may be calculated by the difference between a frequency of the ultrasound signals (i.e., transmission beam) and a frequency of the ultrasound echo signals (i.e., reception beam). Also, the velocity component X cos θ projected to the transmission direction may be calculated by equation 1.
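As a worked example of equation 1, the sketch below computes the projected velocity component from a measured Doppler shift; the probe frequency and shift values are illustrative, not values from the disclosure.

```python
# Velocity component along the beam per equation 1:
#   X cos(theta) = C0 * f_d / (2 * f_0)
# The numeric values below are hypothetical.

def projected_velocity(f_d, f_0, c0=1540.0):
    """Return X*cos(theta), the velocity component projected onto the
    transmission direction, given Doppler shift f_d and ultrasound
    frequency f_0 (c0 is an assumed in-vivo sound speed in m/s)."""
    return c0 * f_d / (2.0 * f_0)

# A 5 MHz transmit frequency with a 1.0 kHz Doppler shift:
# 1540 * 1000 / (2 * 5e6) = 0.154 m/s along the beam.
v = projected_velocity(f_d=1000.0, f_0=5e6)
```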
(63) When the transmission direction of the ultrasound signals (i.e., transmission beam) is different from the reception direction of the ultrasound echo signals (i.e., reception beam), the following relationship may be established:
(64)
X(cos θ.sub.T+cos θ.sub.R)=C.sub.0f.sub.d/f.sub.0 (2)
(65) In equation 2, θ.sub.T represents an angle between the ultrasound signals (i.e., transmission beam) and the blood flow, and θ.sub.R represents an angle between the ultrasound echo signals (i.e., reception beam) and the blood flow.
(66)
α.sub.1·X=α.sub.11x.sub.1+α.sub.12x.sub.2=y.sub.1=X cos θ.sub.1 (3)
(67) In equation 3, α.sub.1=(α.sub.11,α.sub.12) represents a unit vector of the first direction D1, X=(x.sub.1,x.sub.2) represents the variables, and y.sub.1 is calculated by equation 1.
(68) When the ultrasound signals (i.e., transmission beam) are transmitted in a second direction D2 and the ultrasound echo signals (i.e., reception beam) are received in a third direction D3, the following relationship may be established:
(α.sub.21+α.sub.31)x.sub.1+(α.sub.22+α.sub.32)x.sub.2=(y.sub.2+y.sub.3)=X cos θ.sub.2+X cos θ.sub.3 (4)
(69) Equations 3 and 4 assume a two-dimensional environment. However, equations 3 and 4 may be expanded to a three-dimensional environment. That is, when expanding equations 3 and 4 to the three-dimensional environment, the following relationship may be established:
α.sub.11x.sub.1+α.sub.12x.sub.2+α.sub.13x.sub.3=y (5)
(70) In the case of the two-dimensional environment (i.e., two-dimensional vector), at least two equations are required to calculate the variables x.sub.1 and x.sub.2. For example, when the ultrasound signals (i.e., transmission beam) are transmitted in the third direction D3 and the ultrasound echo signals (i.e., reception beam) are received in the second direction D2 and a fourth direction D4 as shown in
(α.sub.31+α.sub.21)x.sub.1+(α.sub.32+α.sub.22)x.sub.2=(y.sub.3+y.sub.2)
(α.sub.31+α.sub.41)x.sub.1+(α.sub.32+α.sub.42)x.sub.2=(y.sub.3+y.sub.4) (6)
(71) The velocity vector X=(x.sub.1,x.sub.2) may be calculated from the two equations of equation 6.
(72) When the reception beam-forming is performed at two or more angles (i.e., in at least two reception directions), at least two equations may be obtained, which may be represented as an over-determined problem, as shown in
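The linear systems of equations 3 through 6 can be solved numerically. The sketch below stacks the summed direction unit vectors into a matrix and solves for the two-dimensional velocity vector in the least-squares sense, which covers both the exactly determined and the over-determined cases; the direction vectors and measured projections are illustrative assumptions.

```python
import numpy as np

# Recover the 2-D velocity vector X = (x1, x2) from beam-direction
# measurements, as in equations 3-6. Each row of `directions` is a
# summed unit vector (transmit + receive), and each projection is the
# corresponding measured y value. With more rows than unknowns the
# system is over-determined and solved in the least-squares sense.

def solve_velocity(directions, projections):
    """directions: (m, 2) rows of summed unit vectors.
    projections: (m,) measured projected-velocity sums.
    Returns the least-squares velocity vector (x1, x2)."""
    A = np.asarray(directions, dtype=float)
    y = np.asarray(projections, dtype=float)
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x
```

With only two independent directions this reduces to solving the 2x2 system of equation 6 exactly.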
(73) The processing unit 130 may be configured to form a vector Doppler image VDI as shown in
(74) Optionally, the processing unit 130 may be configured to compound the brightness mode image BI and the vector Doppler image VDI to form a compound image.
(75) The processing unit 130 may be configured to form the user interface GUI as shown in
(76) The processing unit 130 may be configured to determine a second region of interest in the user interface GUI based on the input information (i.e., second input information), at step S1512.
(77) As an example, as shown in
(78) As another example, as shown in
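The sector-selection behaviour described in the claims, where a second touch nearer the centre of the circular interface selects a sector with a greater centre angle, can be sketched as follows. The two-ring layout and the 30-degree and 90-degree sector sizes are illustrative assumptions, not values from the disclosure.

```python
import math

# Map a touch point inside the circular GUI to a selected sector.
# Hypothetical layout: an outer ring of fine 30-degree sectors and an
# inner ring of coarse 90-degree sectors, so that touches closer to the
# centre select sectors with a greater centre angle, as in the claims.

def select_sector(x, y, radius=1.0):
    """Return (sector_index, center_angle_deg) for a touch at (x, y)
    measured from the centre of the circular graphic user interface."""
    dist = math.hypot(x, y)
    angle = math.degrees(math.atan2(y, x)) % 360.0
    # Outer half of the radius: 30-degree sectors; inner half: 90-degree.
    sector_deg = 30.0 if dist >= radius / 2.0 else 90.0
    return int(angle // sector_deg), sector_deg
```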
(79) The processing unit 130 may be configured to perform a filtering process upon the vector Doppler image VDI based on the second region of interest ROI.sub.2, at step S1514 in
(80) In the above embodiment, the filtering process for representing only the vector information corresponding to the second region of interest ROI.sub.2 is performed upon the vector Doppler image VDI. However, it should be noted herein that the filtering process may not be limited thereto.
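A sketch of such a filtering process, in which only the vector information whose direction falls inside the selected angular sector is retained and the unselected vector information is filtered out, might look as follows; the sector bounds and sample values are illustrative assumptions.

```python
import numpy as np

# Filter a vector Doppler image by an angular sector of interest:
# vectors whose direction lies outside [sector_start, sector_end)
# degrees are zeroed out, keeping only the selected vector information.

def filter_vectors(angles_deg, velocities, sector_start, sector_end):
    """angles_deg, velocities: per-pixel motion direction (degrees)
    and speed. Returns velocities with out-of-sector values zeroed."""
    a = np.asarray(angles_deg) % 360.0
    keep = (a >= sector_start) & (a < sector_end)
    return np.where(keep, velocities, 0.0)
```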
(81) Referring back to
(82) The ultrasound system 100 may further include the display unit 150. The display unit 150 may be configured to display the brightness mode image BI formed by the processing unit 130. The display unit 150 may be also configured to display the vector Doppler image VDI formed by the processing unit 130. The display unit 150 may be additionally configured to display the user interface GUI formed by the processing unit 130.
(83) Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, numerous variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.