System and Method for Automatic Association and Display of Video Loop Subject Matter for Enhanced Identification
20230181163 · 2023-06-15
Inventors
CPC classification
A61B8/5223
HUMAN NECESSITIES
A61B8/465
HUMAN NECESSITIES
A61B8/5207
HUMAN NECESSITIES
International classification
Abstract
An imaging system and a method for displaying information regarding the subject matter of an ultrasound image or ultrasound video loop on a display include the steps of detecting one or more organs in the ultrasound image or ultrasound video loop, creating a representative thumbnail image utilizing the ultrasound image or a frame of the ultrasound video loop, selecting an organ icon representing the one or more organs detected in the ultrasound image or ultrasound video loop, and presenting the organ icon in association with the thumbnail image on the display. The system and method can also create search-identifiable information relating to the one or more organs detected in the ultrasound image or ultrasound video loop, and store the search-identifiable information in an electronic memory in association with the image or the image video loop and the thumbnail image with the organ icon.
Claims
1. A method for displaying information regarding the subject matter of an ultrasound image or ultrasound video loop on a display, the method comprising the steps of: detecting one or more organs in the ultrasound image or ultrasound video loop; creating a representative thumbnail image utilizing the ultrasound image or a frame of the ultrasound video loop; selecting an organ icon representing the one or more organs detected in the ultrasound image or ultrasound video loop; and presenting the organ icon in association with the thumbnail image on the display.
2. The method of claim 1, wherein the step of detecting the one or more organs in the ultrasound image or ultrasound video loop comprises analyzing the ultrasound image or individual frames of the ultrasound video loop to locate one or more organs in the ultrasound image or individual frames of the ultrasound video loop.
3. The method of claim 2, wherein the step of analyzing the ultrasound image or individual frames of the ultrasound video loop comprises: determining an area of the ultrasound frame in which the one or more organs are located; and assessing whether the area exceeds a predetermined threshold value for the presence of the one or more organs in the ultrasound image.
4. The method of claim 2, wherein the step of analyzing the ultrasound image or individual frames of the ultrasound video loop comprises: determining a portion of the individual frames of the ultrasound video loop in which the one or more organs are located; and assessing whether the portion exceeds a predetermined threshold value for the presence of the one or more organs in the ultrasound video loop.
5. The method of claim 1, wherein the step of analyzing the ultrasound image or individual frames of the ultrasound video loop comprises employing at least one of an algorithm, an artificial intelligence or a machine learning method to identify one or more organs within the ultrasound image or individual frames of the ultrasound video loop.
6. The method of claim 1, wherein the step of presenting the organ icon in association with the thumbnail image comprises placing the organ icon within the thumbnail image.
7. The method of claim 1, wherein the step of selecting an organ icon comprises selecting a multiple organ icon corresponding to each organ detected in the ultrasound image or ultrasound video loop.
8. The method of claim 1, further comprising the steps of: detecting the view associated with the organ present in the ultrasound image or ultrasound video loop; and providing an indication of the view on the organ icon in association with the thumbnail image.
9. The method of claim 8, wherein the step of detecting the view associated with the organ in the ultrasound image or ultrasound video loop comprises implementing at least one of an algorithm, an artificial intelligence or a machine learning method to detect the view in the ultrasound image or the ultrasound video loop.
10. The method of claim 1, further comprising the steps of: detecting an anomaly in the ultrasound image or ultrasound video loop; and providing an indication of the anomaly on the organ icon in association with the thumbnail image.
11. The method of claim 10, wherein the step of detecting the anomaly in the ultrasound image or ultrasound video loop comprises implementing at least one of an algorithm, an artificial intelligence or a machine learning method to detect the anomaly in the ultrasound image or the ultrasound video loop.
12. The method of claim 1, further comprising the step of storing the thumbnail image with the organ icon in an electronic memory in association with the ultrasound image or ultrasound video loop.
13. The method of claim 12, further comprising the steps of: creating search-identifiable information relating to the one or more organs detected in the ultrasound image or ultrasound video loop; and storing the search-identifiable information in the electronic memory in association with the ultrasound image or ultrasound video loop and the thumbnail image with the organ icon.
14. An imaging system for displaying images obtained by the imaging system on a display, the imaging system comprising: an imaging probe adapted to obtain image data on an object to be imaged; a processor operably connected to the probe to form one of an image or an image video loop from the image data and to form a thumbnail image representative of the image or image video loop; and a display operably connected to the processor for presenting the image or the image video loop on the display, wherein the processor is configured to implement at least one of an algorithm, an artificial intelligence or a machine learning method to detect one or more organs in the image or the image video loop, to select an organ icon representing the one or more organs detected in the ultrasound image or ultrasound video loop, and to present the organ icon in association with the thumbnail image of the image or image video loop on the display.
15. The imaging system of claim 14, wherein the processor is configured to determine if a percentage of a total area of the image in which the one or more organs are detected, or a percentage of a total number of frames in the image video loop in which the one or more organs are detected exceeds a predetermined threshold.
16. The imaging system of claim 14, wherein the processor is configured to select a multiple organ icon if more than one organ is detected in the image or image video loop.
17. The imaging system of claim 14, wherein the processor is configured to implement at least one of an algorithm, an artificial intelligence or a machine learning method to detect a view associated with the organ in the image or the image video loop, and to modify the organ icon with an indication of the view.
18. The imaging system of claim 14, wherein the processor is configured to implement at least one of an algorithm, an artificial intelligence or a machine learning method to detect one or more anomalies in the image or the image video loop, and to modify the organ icon with an indication of the one or more anomalies.
19. The imaging system of claim 14, wherein the imaging system comprises an electronic memory, and wherein the processor is configured to create search-identifiable information relating to the one or more organs detected in the ultrasound image or ultrasound video loop, and to store the search-identifiable information in the electronic memory in association with the image or the image video loop and the thumbnail image with the organ icon.
20. An imaging system for displaying images obtained by the imaging system on a display, the imaging system comprising: an imaging probe adapted to obtain image data on an object to be imaged; a processor operably connected to the probe to form one of an image or an image video loop from the image data and to form a thumbnail image representative of the image or image video loop; an electronic memory operably connected to the processor; and a display operably connected to the processor for presenting the image or the image video loop on the display, wherein the processor is configured to implement at least one of an algorithm, an artificial intelligence or a machine learning method to detect one or more organs in the image or the image video loop, to select an organ icon representing the one or more organs detected in the ultrasound image or ultrasound video loop, to store the thumbnail and organ icon in association with the image or image video loop in the electronic memory, and to present the organ icon in association with the thumbnail image of the image or image video loop on the display.
21. The imaging system of claim 20, wherein the processor is configured to create search-identifiable information relating to the one or more organs detected in the ultrasound image or ultrasound video loop, and to store the search-identifiable information in the electronic memory in association with the image or the image video loop and the thumbnail image with the organ icon.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The present invention will be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings, wherein below:
[0015]
[0016]
[0017]
[0018]
[0019]
[0020]
[0021]
[0022]
[0023]
DETAILED DESCRIPTION
[0024] The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. One or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be stand alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
[0025] As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.
[0026] Although the various embodiments are described with respect to an ultrasound imaging system, the various embodiments may be utilized with any suitable imaging system, for example, X-ray, computed tomography, single photon emission computed tomography, magnetic resonance imaging, or similar imaging systems.
[0027]
[0028] A probe 206 is in communication with the ultrasound imaging system 202. The probe 206 may be mechanically coupled to the ultrasound imaging system 202. Alternatively, the probe 206 may wirelessly communicate with the imaging system 202. The probe 206 includes transducer elements/an array of transducer elements 208 that emit ultrasound pulses to an object 210 to be scanned, for example an organ of a patient. The ultrasound pulses may be back-scattered from structures within the object 210, such as blood cells or muscular tissue, to produce echoes that return to the transducer elements 208. The transducer elements 208 generate ultrasound image data based on the received echoes. The probe 206 transmits the ultrasound image data to the ultrasound imaging system 202 operating the imaging system 200. The image data of the object 210 acquired using the ultrasound imaging system 202 and used to form the image 214 may be two-dimensional or three-dimensional image data, such that the image 214 can be an ultrasound image and/or video loop 214. In another alternative embodiment, the ultrasound imaging system 202 may acquire four-dimensional image data of the object 210. In generating the image/video loop 214, the processor 222 is also configured to automatically identify organs and/or other anatomical structures 224 within the image/video loop 214, and to provide identifications of those organs and/or other anatomical structures 224 within the image/video loop 214.
[0029] The ultrasound imaging system 202 includes a memory 212 that stores the ultrasound image data. The memory 212 may be a database, random access memory, or the like. A processor 222 accesses the ultrasound image data from the memory 212. The processor 222 may be a logic-based device, such as one or more computer processors or microprocessors. The processor 222 generates an image based on the ultrasound image data. After formation by the processor 222, the image/video loop 214 is presented on a display 216 for review, such as on the display screen of a cart-based ultrasound imaging system 202 having an integrated display/monitor 216, or on an integrated display/screen 216 of a laptop-based ultrasound imaging system 200, optionally in real time during the procedure or when accessed after completion of the procedure.
[0030] In one exemplary embodiment, the ultrasound imaging system 202 can present the image/video loop 214 on the associated display/monitor/screen 216 along with a graphical user interface (GUI) or other displayed user interface. The image/video loop 214 may be a software based display that is accessible from multiple locations, such as through a web-based browser, local area network, or the like. In such an embodiment, the image/video loop 214 may be accessible remotely to be displayed on a remote device 230 in the same manner as the image/video loop 214 is presented on the display/monitor/screen 216.
[0031] The ultrasound imaging system 202 also includes a transmitter/receiver 218 that communicates with a transmitter/receiver 220 of the remote device 230. The ultrasound imaging system 202 and the remote device 230 may communicate over a direct peer-to-peer wired/wireless connection or a local area network or over an internet connection, such as through a web-based browser.
[0032] An operator may remotely access imaging data stored on the ultrasound imaging system 202 from the remote device 230. For example, the operator may log onto a virtual desktop or the like provided on the display 204 of the remote device 230. The virtual desktop remotely links to the ultrasound imaging system 202 to access the memory 212 of the ultrasound imaging system 202. Once access to the memory 212 is obtained, the operator may select image data to view. The image data is processed by the processor 222 to generate an image/video loop 214. For example, the processor 222 may generate a DICOM image/video loop 214. The ultrasound imaging system 202 transmits the image/loop 214 to the display 204 of the remote device 230 so that the image/video loop 214 is viewable on the display 204.
[0033] Looking now at
[0034] In either embodiment, referring now to
[0035] With regard to the process performed in block 304 by the processor 222,232 to determine the presence of one or more organs within the image/video loop 214, during the analysis of the image/video loop 214 the processor 222,232 can utilize threshold values stored in memory 212,234. These threshold values, which can be preset and/or modified by the user as desired, are utilized by the processor 222,232 to determine if the image/video loop 214 contains enough of a representation of the organ(s)/anatomical structure(s) for an organ icon 256 to be included with the thumbnail image 250 of the image/video loop 214. While the threshold value can be set as desired in any suitable format, in an exemplary embodiment the threshold value can be based on a percentage of the total area of a single ultrasound image 214, and/or on the total number or percentage of the individual frames forming the video loop 214 that contain at least a portion of the selected organ therein. For example, if an analysis of an image 214 shows an organ present in at least 25% of the total area of the image 214, or if analysis of a video loop 214 shows an organ present in at least 15% of the individual frames of the video loop 214, the organ has exceeded the threshold value for the image/video loop 214, and the processor 222,232 determines that a representative indicator 240 identifying that organ/anatomical structure should be included with the thumbnail image 250 for the image/video loop 214.
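As a hedged illustration only, the thresholding described above can be sketched as follows. The function names are hypothetical, and the default thresholds simply reuse the 25% area and 15% frame-count example values from the paragraph; the claimed implementation is not limited to this sketch.

```python
def organ_exceeds_area_threshold(organ_area, image_area, threshold=0.25):
    """True when a detected organ covers enough of a single image
    (e.g., at least 25% of the total image area)."""
    if image_area <= 0:
        return False
    return (organ_area / image_area) >= threshold


def organ_exceeds_frame_threshold(frame_hits, threshold=0.15):
    """frame_hits holds one boolean per frame of the video loop,
    True when the organ was detected in that frame; the organ passes
    when it appears in enough of the frames (e.g., at least 15%)."""
    if not frame_hits:
        return False
    return (sum(frame_hits) / len(frame_hits)) >= threshold
```

If either check passes, the representative indicator 240 for that organ would be included with the thumbnail image 250.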
[0036] From block 304, when the processor 222,232 has detected a particular organ/anatomical structure within the image/video loop frame 214, the processor 222,232 continues to block 310 where the processor 222,232 selects the representative identifier(s) 240 for use in association with the image/video loop 214. The identifier 240 corresponds to the organ/anatomical structure detected in the image/video loop 214 to provide an indication of the subject matter present in the image/video loop 214.
[0037] Once the representative identifier 240 has been selected, in block 312 the processor 222,232 proceeds to generate custom data/search-identifiable information for the video loop 214 based on the representative identifier 240. In an exemplary embodiment, this process involves the processor 222,232 creating classification or search-identifiable information regarding the detected organ/anatomical structure and associating it with the electronic storage location or file in memory 212,234 where the stored image/video loop 214 is located. This information can be added to the stored image/video loop 214 in any suitable manner in block 314, such as by adding the information in the form of custom meta-data or custom tags to the electronic file or electronic storage location containing the stored image/video loop 214 in memory 212,234. In this manner, the stored image/video loop 214 can be more readily located and accessed in a search for images/video loops 214 relating to the organ/anatomical structure detected by the processor 222,232, such as in a keyword search including terms contained within the meta-data or tags added to the stored image/video loop 214 by the processor 222,232.
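One minimal way to attach such custom tags and support the keyword search described above could look like the following sketch. The dictionary-based metadata store and the helper names are assumptions standing in for whatever tagging mechanism the actual file format or storage system provides.

```python
def tag_stored_loop(metadata, organ_names):
    """Return a copy of a stored loop's metadata dictionary with
    search-identifiable tags added for each detected organ."""
    updated = dict(metadata)
    tags = set(updated.get("tags", []))
    tags.update(name.lower() for name in organ_names)
    updated["tags"] = sorted(tags)
    return updated


def matches_keyword(metadata, keyword):
    """Simple keyword search over the stored tags, as in a search
    for loops relating to a detected organ/anatomical structure."""
    return keyword.lower() in metadata.get("tags", [])
```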
[0038] In addition to the information added to the stored image/video loop 214, in block 316 the processor 222,232 can use the representative identifier 240 to generate information to be added directly to a thumbnail image 250 used as a visual representation of the stored image/video loop 214. The thumbnail image 250 is selected from one of the frames forming the video loop 214 and is utilized as a visual identifier for the stored video loop 214 when presented on a display 216, 204. The thumbnail image 250 includes the selected frame from the video loop 214 as well as a playback icon 252 overlaid onto the center of the thumbnail image 250. The playback icon 252 serves as a direct link to the stored video loop 214 in the memory 212,234 and can be selected by the user in any known manner to initiate the playback of the video loop 214, either within the frame 254 of the thumbnail image 250, or in a separate frame or window (not shown) on the display 216,204 that opens after selection of the playback icon 252.
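The placement of the playback icon 252 overlaid onto the center of the thumbnail image 250, as described above, can be illustrated with a small geometry helper. This is a sketch only; actual compositing would be handled by the display toolkit in use, and the helper name is hypothetical.

```python
def center_overlay(thumb_w, thumb_h, icon_w, icon_h):
    """Top-left (x, y) pixel position that centers the playback
    icon within the frame of the thumbnail image."""
    return ((thumb_w - icon_w) // 2, (thumb_h - icon_h) // 2)
```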
[0039] Within the thumbnail image 250, the representative identifier 240 in one exemplary embodiment illustrated in
[0040] After addition of the organ icon 256 to the thumbnail image 250, the revised thumbnail image 250 is stored in block 318 such that the modified thumbnail image 250 including the organ icon 256 can be displayed when the image/video loop 214 associated with the modified thumbnail image 250 is presented on a display 216,204.
[0041] In addition to the representation of the organ/anatomical structure via the organ icon 256, when the processor 222,232 is operated in block 306 to detect the view/view orientation/view angle associated with organ/anatomical structure from which the image/video loop 214 is formed, in block 316 the processor 222,232 can provide an indication in the organ icon 256 of the detected view for the image/video loop 214. Referring to the exemplary embodiment of
[0042] Referring now to
[0043] In the case of either or both of the inclusion of the view line 258 and the anomaly modification in the organ icon 256, this information is stored along with the thumbnail image 250 as described previously, and can also be added to the custom data, i.e., meta-data and tags, stored in association with the image/video loop 214.
[0044] Looking now at
[0045] Looking now at
[0046] Referring now to
[0047] In addition to the multiple organ icons 256, in a thumbnail image 250 for an image/video loop 214 containing multiple organs/anatomical structures, the icons 256 can be individually displayed with representations of the view line 258 and/or anomaly indication 259 for each of the organ icons 256, if relevant to the particular organ icon 256. Alternatively, for the detection analyses performed in block 308 (anomaly detection), in an exemplary embodiment, if the processor 222,232 detects an anomaly in any of the individual frames of the video loop 214, the processor 222,232 can identify the entire video loop 214 as containing an anomaly.
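The per-loop behavior described above, with one icon 256 per detected organ and the entire video loop 214 flagged as anomalous when any individual frame contains an anomaly, could be sketched as follows. The function name is hypothetical and each icon is represented here as a plain dictionary rather than a rendered graphic.

```python
def build_organ_icons(detected_organs, per_frame_anomalies):
    """One icon descriptor per detected organ/anatomical structure;
    the anomaly flag is set for the whole video loop if any
    individual frame of the loop contains an anomaly."""
    loop_anomaly = any(per_frame_anomalies)
    return [{"organ": organ, "anomaly": loop_anomaly}
            for organ in detected_organs]
```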
[0048] Looking now at
[0049] The written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.