Ultrasonic imaging apparatus and control method thereof
11083434 · 2021-08-10
Cpc classification
A61B8/463
HUMAN NECESSITIES
A61B8/5223
HUMAN NECESSITIES
G01S7/52073
PHYSICS
A61B8/523
HUMAN NECESSITIES
A61B8/085
HUMAN NECESSITIES
A61B8/483
HUMAN NECESSITIES
G01S7/52074
PHYSICS
A61B8/5253
HUMAN NECESSITIES
International classification
A61B8/00
HUMAN NECESSITIES
Abstract
Disclosed herein are an ultrasonic imaging apparatus that successively displays a plurality of slice images of an object at a predetermined frame rate, and a control method of the ultrasonic imaging apparatus. According to an embodiment, the ultrasonic imaging apparatus may include: an image processor configured to extract a target in an object based on volume data of the object; a controller configured to determine a region of interest in the object based on the extracted target; and a display unit configured to successively display a plurality of slice images of the object, including the region of interest.
Claims
1. An ultrasonic imaging apparatus comprising: an image processor configured to extract a target in an object based on volume data of the object and generate a plurality of slice images of the object using the volume data; a controller configured to determine a region of interest in the object based on the extracted target; and a display configured to successively display, under control of the controller, the plurality of slice images of the object, including the region of interest, wherein the controller controls the display to: successively display the plurality of slice images of the object based on a predetermined frame rate, successively display a first plurality of slice images of the object, among the plurality of slice images, for an entire section of the object, the entire section of the object being decided with respect to the volume data of the object, in response to detecting a first slice image including the region of interest, stop the successive display of the first plurality of slice images and display the detected first slice image on a first plane, after the successive display of the first plurality of slice images has stopped, while the detected first slice image is being displayed on the first plane, successively display a second plurality of slice images, among the plurality of slice images, which are included in a boundary line of the region of interest, on a second plane, when the display successively displays the first plurality of slice images of the object including the region of interest, mark a position of the first slice image being currently displayed in a second slice image of the object, and display the first plurality of slice images on the first plane perpendicular to a first direction, and display the second plurality of slice images of the object on the second plane perpendicular to a second direction that is perpendicular to the first direction.
2. The ultrasonic imaging apparatus according to claim 1, wherein the display highlights the extracted target in the plurality of slice images of the object.
3. The ultrasonic imaging apparatus according to claim 1, wherein the display displays the region of interest as a section in the plurality of slice images.
4. The ultrasonic imaging apparatus according to claim 1, wherein the controller determines the region of interest based on a size of the extracted target.
5. The ultrasonic imaging apparatus according to claim 1, wherein the controller determines, as the region of interest, a region satisfying a predetermined condition in the remaining area of the object not extracted as the target.
6. The ultrasonic imaging apparatus according to claim 1, wherein the display displays the plurality of slice images of the object, including the region of interest, at the same time.
7. The ultrasonic imaging apparatus according to claim 1, further comprising at least one of a keyboard, a foot switch, and a foot pedal configured to receive at least one of a command for determining the region of interest as the target and a command for cancelling the extracted target.
8. The ultrasonic imaging apparatus according to claim 1, wherein the display marks the position of the first slice image being currently displayed, in a third slice image of the object, wherein the third slice image of the object is perpendicular to a third direction being perpendicular to the first direction and the second direction.
9. The ultrasonic imaging apparatus according to claim 1, wherein the display marks a position of a slice image being currently displayed, in a 3-dimensional (3D) image of the object.
10. A method of controlling an ultrasonic imaging apparatus which comprises a controller configured to perform: transmitting ultrasonic waves to an object, and obtaining volume data based on echo ultrasonic waves reflected from the object; extracting a target in the object based on the volume data of the object; generating a plurality of slice images of the object using the volume data; determining a region of interest in the object, based on the extracted target; controlling a display to successively display the plurality of slice images of the object, including the region of interest, based on a predetermined frame rate; and controlling the display to successively display a first plurality of slice images of the object, among the plurality of slice images, for an entire section of the object, the entire section of the object being decided with respect to the volume data of the object, wherein, in response to detecting a first slice image including the region of interest, the controller controls the display to stop the successive display of the first plurality of slice images and to display the detected first slice image on a first plane, wherein, after the successive display of the first plurality of slice images has stopped, while the detected first slice image is being displayed on the first plane, the controller controls the display to successively display a second plurality of slice images, among the plurality of slice images, which are included in a boundary line of the region of interest, on a second plane, wherein the successively displaying of the first plurality of slice images comprises: successively displaying the first plurality of slice images of the object, including the region of interest; and marking a position of the first slice image being currently displayed, in a second slice image of the object, and wherein the controller controls the display to display the first plurality of slice images on the first plane perpendicular to a predetermined first direction, and to display the second plurality of slice images of the object on the second plane perpendicular to a second direction that is perpendicular to the first direction.
11. The method according to claim 10, wherein the successively displaying of the plurality of slice images of the object, including the region of interest, comprises highlighting the extracted target in the plurality of slice images of the object.
12. The method according to claim 10, wherein the successively displaying of the plurality of slice images of the object, including the region of interest, comprises displaying the region of interest as a section in the plurality of slice images.
13. The method according to claim 10, wherein the determining of the region of interest in the object comprises determining the region of interest based on a size of the extracted target.
14. The method according to claim 10, wherein the determining of the region of interest in the object comprises determining, as the region of interest, a region satisfying a predetermined condition in the remaining area of the object not extracted as the target.
15. The method according to claim 10, further comprising displaying the plurality of slice images of the object, including the region of interest, at the same time, according to an input from a user.
16. The method according to claim 10, further comprising determining the region of interest as the target or cancelling the extracted target, according to an input from a user.
17. The method according to claim 10, wherein the successively displaying of the second plurality of slice images of the object, comprises marking the position of the first slice image being currently displayed, in a third slice image of the object, wherein the third slice image of the object is perpendicular to a third direction being perpendicular to the first direction and the second direction.
18. The method according to claim 10, wherein the successively displaying of the plurality of slice images of the object, including the region of interest, comprises marking a position of a slice image being currently displayed, in a 3-dimensional (3D) image of the object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) These and/or other aspects of the disclosure will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
DETAILED DESCRIPTION
(12) Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
(13) Hereinafter, an ultrasonic imaging apparatus and a control method thereof according to embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
(14) In the following description, an “ultrasound image” means an image of an object acquired using ultrasonic waves. Also, the “object” means a human body, an animal, a metal, a nonmetal, or a part thereof. For example, the object may include vessels or organs, such as a liver, a heart, a uterus, a brain, breasts, an abdomen, etc. Also, the object may include a phantom, which means a material having a volume, a density, and an effective atomic number very close to those of a living body.
(15) Also, in the following description, a “target” may be a part of an object, which a user wants to examine through an ultrasound image.
(16) Also, in the following description, a “user” may be a medical professional, such as a doctor, a nurse, a medical technologist, or a medical imaging professional, or may be a technician that repairs medical equipment, although not limited to them.
(18) In one side of the main body 100, one or more female connectors 145 may be provided. A male connector 140 connected to a cable 130 may be physically coupled with one of the female connectors 145.
(19) Meanwhile, at the bottom of the main body 100, a plurality of castors (not shown) may be provided to move the ultrasound imaging apparatus. The plurality of castors may fix the ultrasound imaging apparatus at a specific location or move the ultrasound imaging apparatus in a specific direction. Such an ultrasound imaging apparatus is called a cart-type ultrasound imaging apparatus.
(20) However, the ultrasound imaging apparatus may be a portable ultrasound imaging apparatus that a user can carry over a long distance. The portable ultrasound imaging apparatus may not include castors. Examples of such a portable ultrasound imaging apparatus include a Picture Archiving and Communication System (PACS) viewer, a smart phone, a laptop computer, a PDA, and a tablet PC, although not limited to these.
(21) The ultrasound probe 200 may contact the surface of an object to transmit and receive ultrasonic waves. More specifically, the ultrasound probe 200 may transmit ultrasonic waves to the inside of an object according to a transmission signal received from the main body 100, receive echo ultrasonic waves reflected from a specific part inside the object, and then transfer the received echo ultrasonic waves to the main body 100.
(22) The ultrasound probe 200 may be connected to one end of the cable 130, and the other end of the cable 130 may be connected to the male connector 140. The male connector 140 connected to the other end of the cable 130 may be physically coupled with the female connector 145 of the main body 100.
(23) Alternatively, the ultrasound probe 200 may be connected to the main body 100 in a wireless fashion. In this case, the ultrasound probe 200 may transmit echo ultrasonic waves received from an object to the main body 100 in the wireless fashion. Also, a plurality of ultrasound probes may be connected to the main body 100.
(24) Meanwhile, an image processor 300 may be provided in the main body 100.
(25) The image processor 300 may perform scan conversion on echo ultrasonic waves to create an ultrasound image. The ultrasound image may be a gray scale image acquired by scanning an object in Amplitude mode (A-mode), Brightness mode (B-mode) or Motion mode (M-mode), or a Doppler image that represents a moving object using the Doppler effect. The Doppler image may include a blood flow Doppler image (or called a color Doppler image) showing flow of blood, a tissue Doppler image showing movement of a tissue, and a spectral Doppler image showing moving speed of an object as a waveform.
(26) The image processor 300 may extract B-mode components from echo ultrasonic waves received by the ultrasound probe 200 in order to create a B-mode image. The image processor 300 may create an ultrasound image in which the intensity of the echo ultrasonic waves is represented as brightness, based on the B-mode components.
(27) Likewise, the image processor 300 may extract Doppler components from echo ultrasonic waves, and create a Doppler image in which movement of an object is represented as a color or waveform, based on the Doppler components.
(28) Also, the image processor 300 may perform volume rendering on volume data acquired through echo ultrasonic waves to create a 3-dimensional (3D) ultrasound image, or create an elastic image resulting from imaging a degree of deformation of an object according to pressure. In addition, the image processor 300 may represent various additional information in the form of text or graphic images on the ultrasound image.
(29) Meanwhile, the ultrasound image may be stored in an internal memory of the main body 100 or in an external memory. Alternatively, the ultrasound image may be stored in a web storage or a cloud server that performs a storage function on the web.
(30) The input unit 150 is used to receive commands related to operations of the ultrasound imaging apparatus. For example, the input unit 150 may receive a mode selection command for selecting a mode, such as an A mode, a B mode, a M mode, or a Doppler mode. Also, the input unit 150 may receive a diagnosis start command.
(31) A command input through the input unit 150 may be transferred to the main body 100 through wired/wireless communication.
(32) The input unit 150 may include at least one of a keyboard, a foot switch, and a foot pedal. The keyboard may be implemented in hardware and disposed on the upper part of the main body 100. The keyboard may include at least one of a switch, keys, a joystick, and a trackball. As another example, the keyboard may be implemented in software as a graphical user interface (GUI). In this case, the keyboard may be displayed through a sub display unit 162 or a main display unit 161. The foot switch or the foot pedal may be provided in the lower part of the main body 100, and a user may control operations of the ultrasonic imaging apparatus using the foot switch or the foot pedal.
(33) The display unit 160 may include the main display unit 161 and the sub display unit 162.
(34) The sub display unit 162 may be mounted on the main body 100.
(35) The main display unit 161 may be also mounted on the main body 100.
(37) Meanwhile, the ultrasound imaging apparatus may further include a communication unit. The communication unit may connect to a network in a wired/wireless fashion to communicate with an external device or a server. The communication unit may receive/transmit data from/to a hospital server or other medical apparatuses in a hospital, connected through Picture Archiving and Communication System (PACS). Also, the communication unit may perform data communication according to a Digital Imaging and Communications in Medicine (DICOM) standard.
(38) The communication unit may transmit/receive data related to diagnosis of an object, such as an ultrasound image, echo ultrasonic waves, and Doppler data of the object, through the network. Also, the communication unit may transmit/receive medical images photographed by another medical apparatus, such as a CT scanner, a MRI apparatus, an X-ray apparatus, etc., through the network. In addition, the communication unit may receive information about a patient's diagnosis history, therapeutic schedule, etc., from a server, and use the information for diagnosis of an object. Furthermore, the communication unit may perform data communication with a doctor's or patient's mobile terminal, as well as a server or a medical apparatus in a hospital.
(39) The communication unit may connect to the network in a wired/wireless fashion to receive/transmit data from/to a server, a medical apparatus, or a mobile terminal. The communication unit may include one or more components to enable communications with external devices, and may include a short-range communication module, a wired communication module, and a mobile communication module.
(40) The short-range communication module may be a module for short-range communication within a predetermined distance. The short-range communication may be Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Bluetooth, Zigbee, Wi-Fi Direct (WFD), Ultra Wideband (UWB), Infrared Data Association (IrDA), Bluetooth Low Energy (BLE), or Near Field Communication (NFC), although the short-range communication is not limited to these.
(41) The wired communication module may be a module for communication based on electrical signals or optical signals, and may be a pair cable, a coaxial cable, an optical fiber cable, or an Ethernet cable.
(42) The mobile communication module may transmit/receive radio signals from/to at least one of a base station, an external terminal, and a server over a mobile communication network. Herein, the radio signals may include voice call signals, video call signals, or various kinds of data according to text/multimedia message transmission/reception.
(45) The ultrasound probe 200 may irradiate ultrasonic signals to an object, and receive echo ultrasonic waves reflected from the object. Since ultrasonic waves are reflected with different degrees of reflectance according to medium, the ultrasound probe 200 may acquire information about the inside of an object by collecting echo ultrasonic waves reflected from the object.
(46) The ultrasound probe 200 may be implemented in various ways within the technical concept of acquiring volume data of an object. For example, if the transducer elements of the ultrasound probe 200 have a 1-dimensional (1D) arrangement, the ultrasound probe 200 may acquire volume data according to a freehand method. Also, the ultrasound probe 200 may acquire volume data according to a mechanical method, without requiring a user's manipulation. If the transducer elements of the ultrasound probe 200 have a 2-dimensional (2D) arrangement, the ultrasound probe 200 may acquire volume data by controlling the transducer elements.
(47) The image processor 300 may create an ultrasound image of the object using the volume data of the object. At this time, the image processor 300 may create 3D ultrasound images of the object, as well as 2D ultrasound images about sections of the object.
(48) In order to create a 3D ultrasound image, the image processor 300 performs volume rendering on the volume data. The image processor 300 may volume-render the 3D volume data using one of volume rendering methods well-known in the art. In detail, volume rendering may be classified into surface rendering and direct volume rendering.
(49) The surface rendering is to extract surface information from volume data based on predetermined scalar values and amounts of spatial changes, to convert the surface information into a geometric factor, such as a polygon or a surface patch, and then to apply a conventional rendering technique to the geometric factor. Examples of the surface rendering are a marching cubes algorithm and a dividing cubes algorithm.
(50) The direct volume rendering is to directly render volume data without converting volume data into a geometric factor. The direct volume rendering is useful to represent a translucent structure since it can visualize the inside of an object as it is. The direct volume rendering may be classified into an object-order method and an image-order method according to a way of approaching volume data.
(51) The image-order method is to sequentially decide pixel values of an image. An example of the image-order method is volume ray casting. The object-order method is to directly project volume data on an image. An example of the object-order method is splatting.
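The image-order approach described above can be illustrated with a minimal orthographic ray-casting sketch: one ray per pixel is marched front to back through the volume with simple emission-absorption (alpha) compositing. This is an illustrative reconstruction, not code from the patent; the function name and the linear opacity transfer function are assumptions.

```python
import numpy as np

def ray_cast_orthographic(volume, opacity_scale=0.1):
    """Image-order direct volume rendering: one orthographic ray per
    (y, x) pixel, marched front to back along the z axis with
    emission-absorption compositing."""
    vmax = float(volume.max())
    vol = volume.astype(float) / (vmax if vmax > 0 else 1.0)
    image = np.zeros(vol.shape[1:])        # accumulated intensity per pixel
    transparency = np.ones(vol.shape[1:])  # remaining transmittance per ray
    for z in range(vol.shape[0]):          # advance every ray one step
        sample = vol[z]
        alpha = opacity_scale * sample     # simple linear opacity transfer
        image += transparency * alpha * sample
        transparency *= (1.0 - alpha)
    return image
```

Front-to-back compositing allows early termination once a ray's transmittance is nearly zero, which is the usual optimization in volume ray casting; it is omitted here for brevity.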
(52) Also, the image processor 300 may extract a target from volume data. For example, if an object is a human's uterus and targets are follicles in the uterus, the image processor 300 may extract follicles using volume data.
(53) The image processor 300 may be implemented in various ways within the technical concept of extracting a target inside an object based on volume data. For example, the image processor 300 may extract, as a target, a volume data region having brightness values that are within a predetermined range. Also, the image processor 300 may extract a target by determining whether or not the size of a volume data region having predetermined brightness values is within a predetermined range.
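The brightness-range and size criteria above can be sketched as a threshold followed by a connected-component size filter. This is a hedged illustration, not the patent's implementation: the function name, the 6-connected flood fill, and the size window are all assumptions.

```python
import numpy as np
from collections import deque

def extract_targets(volume, lo, hi, min_size, max_size):
    """Label 6-connected regions whose voxel intensities fall in [lo, hi]
    and whose voxel counts fall in [min_size, max_size]."""
    mask = (volume >= lo) & (volume <= hi)
    labels = np.zeros(volume.shape, dtype=int)
    next_label = 0
    nbrs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                       # voxel already belongs to a region
        next_label += 1
        labels[seed] = next_label
        component, q = [seed], deque([seed])
        while q:                           # breadth-first flood fill
            v = q.popleft()
            for d in nbrs:
                n = tuple(v[i] + d[i] for i in range(3))
                if all(0 <= n[i] < volume.shape[i] for i in range(3)) \
                        and mask[n] and not labels[n]:
                    labels[n] = next_label
                    component.append(n)
                    q.append(n)
        if not (min_size <= len(component) <= max_size):
            for v in component:            # reject regions outside the size window
                labels[v] = 0              # (their label ids are simply skipped)
    return labels
```

In the follicle example, `lo`/`hi` would bracket the dark fluid-filled voxels and the size window would exclude speckle and large anechoic structures.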
(54) The display unit 160 may display the ultrasound image created by the image processor 300. More specifically, the display unit 160 may display a slice image or a 3D image of the object created by the image processor 300, or may display the slice image and the 3D image of the object together.
(55) At this time, the display unit 160 may highlight a target extracted from the slice image being displayed. For example, the display unit 160 may change the color or shade of an area corresponding to the target, or may change the color or shade of the boundary line of the area corresponding to the target. Alternatively, the display unit 160 may display a marker indicating a location of the area corresponding to the target.
(56) Hereinafter, a method in which the display unit 160 displays ultrasound images under the control of the controller 400 will be described in detail.
(57) The controller 400 may control the display unit 160 to successively display a plurality of slice images of an object for an entire section of the object decided by volume data of the object. The display unit 160 may successively display the plurality of slice images of the object for the entire section of the object, at a predetermined frame rate, under the control of the controller 400.
(58) Herein, the entire section of the object means a section of the object that can be displayed as slice images using the acquired volume data. A distance between two successive slice images of the object and the predetermined frame rate may be decided by a user's input or by the internal computation of the ultrasonic imaging apparatus.
(59) As such, by displaying a plurality of slice images of the object for the entire section of the object, information about the inside of the object can be provided to a user, and also by scanning the inside of the object, the user can determine a region of interest which will be described.
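The successive display described above amounts to a paced cine loop over slices taken at a fixed spacing. The following sketch is illustrative only (the function name, the `show` callback, and pacing via `time.sleep` are assumptions, not the apparatus's actual display pipeline):

```python
import time
import numpy as np

def cine_loop(volume, axis=0, step=1, frame_rate=10.0, show=print):
    """Hand each slice of `volume` along `axis` to `show`, pacing the
    loop at `frame_rate` frames per second. `step` is the distance (in
    voxels) between two successive displayed slices."""
    period = 1.0 / frame_rate
    indices = range(0, volume.shape[axis], step)
    for i in indices:
        show(np.take(volume, i, axis=axis))  # one slice per frame
        time.sleep(period)                   # hold the frame for 1/rate s
    return len(indices)
```

Both `step` and `frame_rate` correspond to the user-settable slice distance and frame rate mentioned above.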
(60) Hereinafter, a method of displaying a plurality of slice images of an object for an entire section of the object will be described.
(62) Hereinafter, for convenience of description, a direction in which ultrasonic waves are irradiated to an object is assumed to be a z-axis direction, and directions which are perpendicular to the z-axis direction and which are perpendicular to each other are assumed to be an x-axis direction and a y-axis direction. Specifically, in a 1D array probe, a direction in which elements are aligned is assumed to be an x-axis direction, and in a 2D array probe, a direction in which elements are arranged is assumed to be an x-axis direction, and another direction in which the elements are arranged is assumed to be a y-axis direction.
(63) Also, the z-axis direction is assumed to be a first direction, the x-axis direction is assumed to be a second direction, and the y-axis direction is assumed to be a third direction.
(66) Meanwhile, the controller 400 may control the display unit 160 to successively mark positions of the plurality of first slice images respectively in a plurality of slice images (also referred to as a plurality of second slice images) of the object, the second slice images being perpendicular to a second direction.
(69) As such, by successively displaying the first slice images and simultaneously providing the second slice images in which the positions of the first slice images are respectively marked, the display unit 160 can help a user accurately recognize a location and shape of the target.
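Marking the current first-slice position in an orthogonal second slice can be as simple as overlaying a line at the corresponding column. This is a minimal illustrative sketch, not the patent's rendering code; the function name and marker value are assumptions.

```python
import numpy as np

def mark_slice_position(second_slice, x_index, value=255):
    """Overlay a vertical marker line on an orthogonal (second) slice at
    the column corresponding to the first slice currently displayed."""
    marked = second_slice.copy()   # leave the original slice untouched
    marked[:, x_index] = value     # draw the marker line
    return marked
```

As the cine loop advances through first slices, `x_index` advances accordingly, so the marker sweeps across the second slice in step with the display.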
(70) In addition, the controller 400 may control the display unit 160 to successively display a plurality of slice images (also referred to as a plurality of third slice images) which are perpendicular to a third direction and in which the positions of the plurality of first slice images are respectively marked.
(75) Also, the controller 400 may control the display unit 160 to display a first slice image or a third slice image in which a marker indicating a position of a second slice image being currently displayed is displayed.
(77) As such, by successively providing a plurality of slice images of a target acquired in various directions, and marking the position of each slice image in the other slice images, the display unit 160 can help a user accurately recognize a shape and location of the target.
(79) The controller 400 may control the display unit 160 to successively display positions of a plurality of slice images of an object respectively in a plurality of 3D images of the object, for the entire section of the object.
(80) For example, the controller 400 may control the display unit 160 to successively display a plurality of first slice images of an object at a predetermined frame rate, while successively displaying positions of the first slice images being currently displayed, respectively, in a plurality of 3D images of the object. More specifically, the controller 400 may display a marker indicating a position of a first slice image of an object being displayed on the display unit 160, in a 3D image of the object.
(81) As a result, the display unit 160 may successively display the plurality of first slice images of the object at the predetermined frame rate, and simultaneously display a plurality of 3D images of the object in which the positions of the first slice images are marked.
(83) As such, by marking a position of a target slice image being displayed in a 3D image of the target, the display unit 160 can help a user accurately recognize a shape and location of the target.
(84) Also, when the display unit 160 displays a plurality of slice images for an entire section of an object, the display unit 160 may detect a region of interest from a slice image. At this time, the controller 400 may control the display unit 160 to stop successively displaying the plurality of slice images for the entire section of the object.
(85) The controller 400 may determine the region of interest based on targets extracted by the image processor 300.
(86) For example, the controller 400 may determine a target satisfying a predetermined condition among the extracted targets, as a region of interest. If the extracted targets are follicles, the controller 400 may determine the largest target among the extracted targets as a region of interest so that a user can examine the largest follicle among the extracted follicles.
(87) As another example, the controller 400 may determine a region satisfying a predetermined condition in the remaining area not extracted as targets, as a region of interest. When follicles are extracted as targets, there may be a follicle not extracted through volume data. The controller 400 may determine a region satisfying a predetermined condition from a slice image, as a region of interest, to allow a user to determine whether or not the region of interest is a follicle.
(88) The predetermined condition for determining a region of interest may have been decided in advance by a user's input or by the internal computation of the ultrasonic imaging apparatus.
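One plausible reading of the "largest target" rule above is to pick the labeled target with the most voxels. The sketch below is illustrative (names and the labeled-array input are assumptions, not the patent's implementation):

```python
import numpy as np

def largest_target(labels):
    """Return (label_id, voxel_count) of the biggest labeled target,
    or (0, 0) when no target was extracted."""
    ids, counts = np.unique(labels[labels > 0], return_counts=True)
    if ids.size == 0:
        return 0, 0                      # nothing was extracted
    k = int(np.argmax(counts))           # index of the largest region
    return int(ids[k]), int(counts[k])
```

The returned label id would then identify the region of interest, e.g. the largest follicle among those extracted.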
(93) Since the display unit 160 keeps displaying the first slice image including the region of interest, a user can determine whether there is a target not extracted by the image processor 300.
(94) After the display unit 160 stops successively displaying the plurality of slice images of the object for the entire section of the object, the controller 400 may control the display unit 160 to successively display a plurality of slice images of the object for a section of interest that is decided by the region of interest.
(95) A distance between two successive slice images of the object and the predetermined frame rate may be decided by a user's input or by the internal computation of the ultrasonic imaging apparatus.
(97) As described above, if a slice image displayed on the display unit 160 includes a region of interest, the controller 400 may control the display unit 160 to stop successively displaying a plurality of slice images. As a result, the display unit 160 may keep displaying the slice image including the region of interest that needs to be examined.
(99) Accordingly, the controller 400 may control the display unit 160 to display a plurality of slice images for a section of interest S that is decided by the region of interest. Herein, the section of interest S may be a section decided in advance to provide slice images of the region of interest. After determining the region of interest, the controller 400 may set a section of interest S based on boundary lines of the region of interest.
(101) Thereby, the display unit 160 may provide the user with information about a shape and size of the region of interest. Even when a region of interest is determined from an area not extracted as a target, if the display unit 160 displays a plurality of slice images including the region of interest, the user can determine whether the region of interest corresponds to a target.
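Setting a section of interest S from the boundary lines of a region of interest can be approximated as a padded bounding box over the region's voxels. A minimal sketch under that assumption (function name and margin parameter are illustrative):

```python
import numpy as np

def section_of_interest(roi_mask, margin=0):
    """Per-axis (start, stop) bounds that enclose the ROI's boundary,
    padded by `margin` voxels and clipped to the volume extent."""
    coords = np.nonzero(roi_mask)            # voxel coordinates of the ROI
    bounds = []
    for axis, c in enumerate(coords):
        lo = max(int(c.min()) - margin, 0)
        hi = min(int(c.max()) + 1 + margin, roi_mask.shape[axis])
        bounds.append((lo, hi))
    return bounds
```

Slicing the volume with these bounds along the display axis yields exactly the slice images that intersect the region of interest, which is what the cine loop over the section of interest would show.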
(105) After the plurality of slice images including the region of interest are provided to the user, as described above, the input unit 150 may receive at least one of a command for determining the region of interest as a target and a command for canceling an extracted target, from the user.
(109) If the user determines that the region of interest is a target, the user may input a command for determining the region of interest as a target through the input unit 150.
(110) According to the command from the user, the image processor 300 may filter and label the slice image including the region of interest to extract the region of interest determined as a target.
(111) As a result, the display unit 160 may highlight the region of interest newly determined as a target, and display the highlighted region of interest in the corresponding slice image. For example, the display unit 160 may highlight the boundary line of the region of interest newly determined as a target.
(112) As such, since a region of interest can be added as a target according to a user's input in addition to targets automatically extracted by the image processor 300, the ultrasonic imaging apparatus can provide an environment for more accurate diagnosis.
(113) Meanwhile, the image processor 300 may delete at least one target from extracted targets according to a user's input. If a user inputs a command for cancelling at least one target among extracted targets, the display unit 160 may remove the highlight of the corresponding target.
(114) The above description is given under the assumption that volume data of an object is acquired in real time by the ultrasound probe 200.
(116) First, ultrasonic waves may be transmitted and received to acquire volume data of an object, in operation 500. At this time, an ultrasound probe may be used to acquire volume data of the object.
(117) Then, one or more targets in the object may be extracted based on the volume data of the object, in operation 510. Herein, the target may be a part of the object, which a user wants to examine through an ultrasound image. For example, if the object is a human's uterus, the target may be a follicle in the uterus.
(118) A target inside an object may be extracted based on volume data of the object using at least one of well-known techniques. For example, a volume data area having brightness values in a predetermined brightness range may be extracted as a target. As another example, a target may be extracted by determining whether the size of a volume data area having predetermined brightness values is within a predetermined range.
(119) Then, a region of interest in the object may be determined based on the extracted targets, in operation 520. The region of interest may be a region that needs to be additionally examined by the user, among the extracted targets or in the remaining area.
(120) For example, the region of interest may be a target satisfying a predetermined condition among the extracted targets. As another example, the region of interest may be a region satisfying a predetermined condition in the remaining area not extracted as the targets.
(121) Finally, a plurality of slice images of the object, including the region of interest, may be successively displayed, in operation 530. The plurality of slice images may be images about a plurality of sections of the object, which are perpendicular to a predetermined direction.
(122) In this way, by extracting a region of interest automatically, and successively displaying a plurality of slice images including the region of interest, the ultrasonic imaging apparatus can help a user diagnose the region of interest.
(123) Therefore, according to an aspect of the ultrasonic imaging apparatus and the control method thereof as described above, a plurality of slice images of an object, acquired in different directions, can be displayed automatically without a user's manipulations.
(124) Also, according to still another aspect of the ultrasonic imaging apparatus and the control method thereof as described above, a position of a slice image being currently displayed can be marked in another slice image acquired in a different direction so that a user can easily recognize a location, shape, and size of a target.
(125) Although a few embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined in the claims and their equivalents.