INTUITIVE AUTOMATION IN PATIENT MODELING
20170340901 · 2017-11-30
Inventors
Cpc classification
G06F3/04847
PHYSICS
G16Z99/00
PHYSICS
G06F3/04815
PHYSICS
G06F3/0488
PHYSICS
A61N2005/1074
HUMAN NECESSITIES
International classification
A61N5/10
HUMAN NECESSITIES
G06F3/0483
PHYSICS
G06F3/0484
PHYSICS
Abstract
To overcome the difficulties inherent in conventional treatment planning approaches, new techniques are described herein for providing an intuitive user interface for automatic structure derivation in patient modeling. In an embodiment, a graphical user interface is provided that presents a list of structures of a specified region, using medical terminology instead of mathematical terms. In one or more embodiments, the list of structures may be a pre-defined list of structures that correspond to that region for the purposes of treatment planning. A user is able to actuate a toggle to include and/or exclude each of the structures separately. In one or more embodiments, the user is also able to actuate a toggle to define a perimeter around each included structure, and further define a margin around the perimeter. The user is also able to specify whether the desired output should include a union or an intersection of all included structures.
Claims
1. A method of automatic structure delineation, the method comprising: causing display, by a computer system, of an image from medical image data; causing display, by the computer system, of a graphical user interface, the graphical user interface comprising a list of structures corresponding to an area of interest displayed in the image and one or more operations available to be performed to automatically derive one or more structures of the list of structures; receiving user supplied input through the graphical user interface, the user supplied input corresponding to a selection of one or more structures from the list of structures; determining a combination of Boolean operations based on the user supplied input; generating a graphical delineation of the combination of the one or more selected structures corresponding to the combination of Boolean operations; storing the graphical delineation of the one or more selected structures as one or more corresponding output structures; and updating the display to include a graphical delineation of the one or more output structures.
2. The method according to claim 1, wherein the list of structures comprises a pre-determined list of structures determined to be clinically relevant by analyzing clinical data corresponding to the area of interest.
3. The method according to claim 1, wherein the user supplied input comprises a user actuation of a toggle in the graphical user interface to include the one or more structures from the list of structures.
4. The method according to claim 3, wherein the user supplied input further comprises a user actuation of a toggle in the graphical user interface to define a wall of a structure from the one or more structures.
5. The method according to claim 4, further comprising analyzing clinically relevant data corresponding to the area of interest to reference one or more surface contours corresponding to the structure, wherein the wall of the structure is defined using the one or more surface contours corresponding to the structure.
6. The method according to claim 4, wherein the user supplied input further comprises a user actuation of a set of parameters displayed in the graphical user interface that correspond to a margin around the wall.
7. The method according to claim 1, further comprising: monitoring subsequent user supplied input to detect modifications to the one or more structures comprised in the selection of one or more structures; automatically propagating the graphical delineation of selected structures to correspond to detected modifications to the one or more structures; and displaying the propagated graphical delineation of selected structures.
8. The method according to claim 1, wherein the user supplied input comprises a user selection of at least one of: inclusion of the one or more structures, and an exclusion of the one or more structures, wherein the inclusion of one or more structures comprises a display of a union of the included one or more structures according to the user supplied input, further wherein the exclusion of one or more structures comprises a display that does not include the union of excluded structures according to the user supplied input.
9. The method according to claim 1, wherein the graphical user interface displays the list of structures and the one or more operations in a tabular representation.
10. The method according to claim 1, further comprising referencing the one or more output structures in a patient modeling workspace.
11. A system comprising: a memory device configured to store a set of programmed instructions and an image data; a user input device configured to receive user supplied input; a display device configured to display a graphical user interface; and a processor configured to execute the programmed instructions to generate a display of the graphical user interface comprising a display of an image corresponding to the image data and a display of a list of structures corresponding to an area of interest in the image and one or more operations available to be performed to automatically derive one or more of the list of structures, the processor being further configured to receive user supplied input from the user input device corresponding to a selection of one or more structures from the list of structures, to determine a combination of Boolean operations based on the user supplied input, to generate a graphical delineation of the one or more selected structures, to store the graphical delineation of the one or more selected structures as one or more corresponding output structures, and to update the image to include the graphical delineation in the display device.
12. The system according to claim 11, wherein the list of structures comprises a pre-determined list of structures determined to be clinically relevant by analyzing clinical data corresponding to the area of interest.
13. The system according to claim 11, wherein the user supplied input comprises a user actuation of a toggle displayed in the graphical user interface to include the one or more structures from the list of structures.
14. The system according to claim 13, wherein the user supplied input further comprises a user actuation of a toggle displayed in the graphical user interface to define a perimeter around the one or more structures.
15. The system according to claim 14, wherein the user supplied input further comprises a user actuation of a set of parameters displayed in the graphical user interface that correspond to a margin around the perimeter.
16. The system according to claim 11, wherein the processor is further operable to monitor subsequently received user supplied input to detect modifications to any of the one or more structures comprised in the selection of the one or more structures, automatically propagate the graphical delineation of selected structures to correspond to detected modifications to the one or more structures; and display the propagated graphical delineation of selected structures in the display device.
17. The system according to claim 11, wherein the user supplied input comprises a user selection of at least one of: inclusion of the one or more structures, and an exclusion of the one or more structures, wherein the inclusion of one or more structures comprises a display of a union of the included one or more structures according to the user supplied input, further wherein the exclusion of one or more structures comprises a display that does not include the union of excluded structures according to the user supplied input.
18. A non-transitory computer readable medium comprising a plurality of programmed instructions which, when executed by a processor in a computing system, is operable to implement automatic structure delineation, the computer readable medium comprising: instructions to cause display of an image from medical image data; instructions to cause display of a graphical user interface comprising a list of structures corresponding to an area of interest displayed in the image and one or more operations available to be performed to automatically derive one or more of the list of structures; instructions to receive, as input, user supplied input corresponding to a selection of one or more structures from the list of structures; instructions to determine a combination of Boolean operations based on the user supplied input; instructions to generate a graphical delineation of the one or more selected structures corresponding to the combination of Boolean operations; instructions to store the graphical delineation of the one or more selected structures as one or more corresponding output structures; and instructions to update the image with the graphical delineation of the selected structures.
19. The non-transitory computer readable medium according to claim 18, wherein the list of structures comprises a pre-determined list of structures determined to be clinically relevant by analyzing clinical data corresponding to the area of interest.
20. The non-transitory computer readable medium according to claim 18, wherein the user supplied input comprises a user actuation of at least one of: a toggle displayed in the graphical user interface and operable to include the one or more structures from the list of structures; a toggle displayed in the graphical user interface and operable to define a wall of one or more structures; and a set of parameters displayed in the graphical user interface and corresponding to a margin around a wall of one or more structures.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0015] The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the presently claimed subject matter:
DETAILED DESCRIPTION
[0020] Reference will now be made in detail to several embodiments. While the subject matter will be described in conjunction with the alternative embodiments, it will be understood that they are not intended to limit the claimed subject matter to these embodiments. On the contrary, the claimed subject matter is intended to cover alternatives, modifications, and equivalents, which may be included within the spirit and scope of the claimed subject matter as defined by the appended claims.
[0021] Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. However, it will be recognized by one skilled in the art that embodiments may be practiced without these specific details or with equivalents thereof. In other instances, well-known processes, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects and features of the subject matter.
[0022] Portions of the detailed description that follow are presented and discussed in terms of a process. Although operations and sequencing thereof are disclosed in a figure herein (e.g.,
[0023] Some portions of the detailed description are presented in terms of procedures, operations, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer-executed operation, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of operations or instructions leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0024] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout, discussions utilizing terms such as “accessing,” “writing,” “including,” “storing,” “transmitting,” “traversing,” “associating,” “identifying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0025] While the following example configurations are shown as incorporating specific, enumerated features and elements, it is understood that such depiction is exemplary. Accordingly, embodiments are well suited to applications involving different, additional, or fewer elements, features, or arrangements.
Exemplary Graphical User Interface
[0026]
[0027] Compared to other proposed solutions, the claimed subject matter simplifies the process of structure automation and combination substantially, both in the number of steps required and in the time required for a user to understand and learn how to use the tool properly. According to one or more embodiments, all of the operations needed for structure combination and automation are combined in one tool that is easy to learn and understand due to an intuitive user interface design (exemplified by the graphical user interface 100 depicted in
[0028] In one or more embodiments, in contrast to conventional approaches with interfaces that use advanced mathematical symbols or technical terms, implementations of the graphical user interface are expressed (e.g., displayed to the user) using natural language, medical (anatomical) terminology, and/or easy-to-understand graphical icons to describe and/or label operations. Natural language may include, for example, relative orientations or directions (e.g., left, right, top, bottom, etc.), anatomical designations (e.g., anterior, medial, lateral, cranial, posterior, etc.), and set operations (e.g., union, intersection, includes, excludes).
[0029] Implementations of the user interface also use a simplified functional range that corresponds to what is used by physicians in clinical use cases, such as Boolean structures and skin sparing. Rather than expressing functionality mathematically, or supporting all possible solutions, in one or more embodiments the user interface presents only (pre-determined) clinically relevant input and output options to the user from which the user can select. These options may represent only the clinically relevant operations involving existing structures. As depicted in
[0030] Displays of the list of structures already known to the user from the patient modeling workspace can also be reused, so the claimed subject matter can be completely integrated into contouring screen layouts. This allows for immediate visualization of results, for the automatic execution of sequences assigned to output structures when input structures are ready (as indicated by a special user interface element), and for those sequences to be cleared when input structures are changed.
[0031] As depicted in
[0032] In one or more embodiments, the region itself may alternatively be automatically selected, manually selected, or manually changed based on further user input. Via the graphical user interface, a user (such as a clinician, radiation oncologist, or radiation therapy operator, etc.) can actuate a toggle to include and/or exclude each of the structures separately (e.g., via user-interface elements in columns 103 and 105 of the graphical user interface 100 in
[0033] According to one or more embodiments, via the graphical user interface, multiple structures can be included for one or more operations or displays within a single operation mode. In contrast to typical approaches that often require using an “advanced” mode to include multiple structures to perform structure automation operations (typically by using formulas and parentheses), embodiments of the present claimed invention present a graphical display of a table of clinically relevant inputs, outputs, and operations that represent a subset of possible formulas that can be performed by the structure automation tool. According to various embodiments, the user is also able to actuate a toggle (e.g., via user-interface elements in row 109 of the graphical user interface 100 in
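The toggle-driven selection model described above can be illustrated with a short Python sketch. This is a hypothetical model for illustration only: the class and function names, the column analogies, and the example structure names are assumptions, not elements taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class StructureRow:
    """One row of the hypothetical structure table."""
    name: str               # anatomical label shown to the user, e.g. "Bladder"
    include: bool = False   # "include" toggle (analogous to column 103)
    exclude: bool = False   # "exclude" toggle (analogous to column 105)

def boolean_expression(rows, combine="union"):
    """Render the selected rows as a human-readable Boolean expression,
    of the kind the interface might display to the user as a string."""
    op = " OR " if combine == "union" else " AND "
    included = [r.name for r in rows if r.include]
    excluded = [r.name for r in rows if r.exclude]
    expr = "(" + op.join(included) + ")"
    if excluded:
        expr += " NOT (" + " OR ".join(excluded) + ")"
    return expr

rows = [StructureRow("Bladder", include=True),
        StructureRow("Rectum", include=True),
        StructureRow("FemurL", exclude=True)]
print(boolean_expression(rows))  # (Bladder OR Rectum) NOT (FemurL)
```

Because the table exposes only include/exclude toggles plus a union/intersection choice, the expressible formulas are exactly the clinically relevant subset described above, with no free-form parentheses required of the user.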
[0034] Once the user input phase is completed, the user input received via the graphical user interface is submitted (e.g., to an image rendering application executed by a graphics processor), an input image is processed by a graphics processor, and contouring of the structures in the input image selected by the user is automatically and immediately performed, rendered, and graphically represented directly on a display of the input image, or overlaid visually on a display of the input image. In one or more embodiments, the automatically generated contours include the area contained in the margins specified by the user. In still further embodiments, the input image data continues to be monitored such that even when the input structures are changed, either manually or with an updated structural delineation, the contouring generated from the input supplied by the user persists, and is graphically adapted by the processor to the new image data via automatic propagation for immediate visualization.
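The mask-level processing this paragraph describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the disclosed implementation: structures are assumed to be Boolean voxel masks, and the user-specified margin is applied here by binary dilation; all names are invented.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def combine_structures(included, excluded=(), mode="union", margin=0):
    """Combine Boolean structure masks per the user's toggle selections."""
    acc = included[0].copy()
    for mask in included[1:]:
        acc = (acc | mask) if mode == "union" else (acc & mask)
    if margin:
        # grow the combined structure by `margin` voxels (the specified margin)
        acc = binary_dilation(acc, iterations=margin)
    for mask in excluded:
        acc &= ~mask  # carve excluded structures out of the result
    return acc

# two toy 2-D "structures" on an 8x8 grid
a = np.zeros((8, 8), dtype=bool); a[2:4, 2:4] = True
b = np.zeros((8, 8), dtype=bool); b[3:6, 3:6] = True
union_with_margin = combine_structures([a, b], mode="union", margin=1)
```

Automatic propagation then amounts to re-running `combine_structures` with the same toggle selections whenever an input mask changes, so the displayed result stays current without further user action.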
[0035] Optionally, the user is able to store one or more contoured structures from the selection of one or more toggles for future use or reference. For example, the output from generating contouring based on the user input that includes two or more structures can be stored as a new structure, and the resultant structure can be saved through the user interface at row 111. As depicted in
[0036] The example features shown in
[0037]
[0038] Optionally, structure derivation interface 200 could also include a structure property window (e.g., structure property window 205), that allows a user to configure the parameters applied to the image in image window 207 and displayed in parameter list 203. For example, one or more of the color, size, or volume of a specific structure may be configured in structure property window 205. In one or more examples, the particular structure being configured may be selected using a drop down menu populated by the structures in the parameter list 203. The image window 207 may display one or more structures of a target area imaged using an imaging system and generated by one or more processors in a computing system. In one or more embodiments, the imaging system may be implemented according to one or more of any number of imaging modalities including, but not limited to: radiography, magnetic resonance imaging (MRI), ultrasound, elastography, photoacoustic imaging, thermography, tomography, echocardiography, spectroscopy, and molecular imaging (e.g., positron emission tomography). As depicted in
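A minimal sketch of the kind of per-structure display properties such a structure property window might expose follows; the field names and defaults are illustrative assumptions, not elements of the disclosed interface.

```python
from dataclasses import dataclass

@dataclass
class StructureProperties:
    """Hypothetical display configuration for one structure in the image window."""
    name: str                 # anatomical label, e.g. "Bladder"
    color: str = "#FF0000"    # contour display color
    line_width: int = 1       # rendered contour thickness (a stand-in for "size")
    visible: bool = True      # whether the contour is drawn in the image window

bladder = StructureProperties("Bladder", color="#00FF00")
```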
[0039] Exemplary structure configuration window 209 depicts a graphical user sub-interface with functionality similar to the user-interface described above with respect to
[0040] The example features shown in
Automatic Structure Derivation
[0041]
[0042] The process depicted in
[0043] The process continues at step 303 by receiving user input. User input may be received through, for example, at least a portion of a graphical user interface (e.g., user interface 100, structure configuration window 209). In one or more embodiments, the user input may consist of a user actuation of one or more toggles corresponding to one or more structures in a pre-defined list of structures displayed in the image at step 301. Optionally, the user input may include the actuation of one or more toggles corresponding to a perimeter and/or margins corresponding to the one or more structures. Optionally, the user input may also include the actuation of one or more toggles corresponding to a user selection of the display of an intersection or union of other user-selected elements. In one or more embodiments, each user input received in the graphical user interface causes the performance of steps 303-309, with those steps being repeated each time an additional user input is received. Alternately, user input can be collected (and graphically represented to the user) temporarily until a subsequent user-initiated submission of all user input (e.g., via a submit graphical button), with steps 305-309 being performed in response to the submission.
[0044] The process continues at step 305 with calculating and determining any number of Boolean operations corresponding to the user input received. For example, a Boolean operation may be generated that corresponds to the union of each structure indicated by the user input for inclusion. In one or more embodiments, the generated Boolean operation is automatically determined and calculated without further input from the user. Optionally, the resultant Boolean operation may be graphically displayed to the user in the graphical user interface (e.g., as a string or mathematical expression). The process continues at step 307 with processing the graphical contouring and automatic segmentation according to the Boolean operation(s) generated at step 305. In one or more embodiments, the graphical contouring may be performed by referencing and comparing against previously rendered structures in a knowledge base or repository of treatment planning volumes, mapping the target area in the image window to corresponding areas, extrapolating pre-contoured effects to the structures in the target area, and adapting the effects to correspond specifically to the structures. In one or more embodiments, such adaptation may be performed by, for example, a deformation mechanism or similar mapping and means for automatic propagation.
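The automatic propagation referred to at this step can be illustrated with a small observer-style sketch. This is an assumption about one possible mechanism, with all class and variable names invented for illustration: when an input structure's contour changes, each derived output that references it is recomputed so the displayed delineation stays current.

```python
class DerivedStructure:
    """A derived output structure that records which inputs it depends on."""
    def __init__(self, name, inputs, derive):
        self.name, self.inputs, self.derive = name, inputs, derive
        self.value = derive()                 # initial delineation

class StructureStore:
    """Holds input contours and keeps derived outputs synchronized."""
    def __init__(self, contours=None):
        self.contours = dict(contours or {})  # structure name -> contour data
        self.derived = []                     # derived outputs to keep in sync
    def update(self, name, contour):
        self.contours[name] = contour
        for d in self.derived:                # automatic propagation
            if name in d.inputs:
                d.value = d.derive()

# contours modeled here as sets of voxel indices for brevity
store = StructureStore({"A": {1, 2}, "B": {2, 3}})
union = DerivedStructure("A+B", {"A", "B"},
                         lambda: store.contours["A"] | store.contours["B"])
store.derived.append(union)
store.update("A", {1, 2, 9})      # editing input "A" repropagates the output
```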
[0045] The process continues at step 309 with updating the image displayed in the graphical user interface (e.g., image window 207 of
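One way the graphical delineation overlay might be produced at this step, purely as an illustrative assumption rather than the disclosed rendering path, is to extract the one-voxel boundary of the output mask and draw that outline over the image:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def delineation(mask):
    """Return the one-voxel boundary of a Boolean mask (mask minus its erosion)."""
    return mask & ~binary_erosion(mask)

m = np.zeros((6, 6), dtype=bool); m[1:5, 1:5] = True
edge = delineation(m)   # True only on the outline of the 4x4 block
```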
Exemplary Computer System
[0046] In one or more embodiments, structure derivation may be executed as a series of programmed instructions executed on a computing environment described above with respect to
[0047] The computer system 400 may also comprise an optional graphics subsystem 405 for presenting information to the radiologist or other user, e.g., by displaying information on an attached display device 410, connected by a video cable 411. In one or more embodiments, one or more of the graphical user interfaces described above with respect to
[0048] According to one or more embodiments, one or more of the storage of clinical data and the performance of the automatic structure derivation may be performed in one or more (distributed) computing systems remotely positioned relative to a display device upon which one or more instances of the graphical user interface described above is instantiated. For example, processing may be performed in one or more cloud computing centers using dynamically provisioned resources, with resulting displays presented in a local display and computing device of a user. Likewise, clinical data may be stored in one or more data stores which can be remotely positioned with respect to one or both of the local display and computing device of the user and the computing device performing the bulk of the processing for the automatic structure segmentation and the analysis of the clinical data.
[0049] Additionally, computing system 400 may also have additional features/functionality. For example, computing system 400 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. RAM 402, ROM 403, and external data storage device (not shown) are all examples of computer storage media.
[0050] Computer system 400 also comprises an optional alphanumeric input device 406, an optional cursor control or directing device 407, and one or more signal communication interfaces (input/output devices, e.g., a network interface card) 408. Optional alphanumeric input device 406 can communicate information and command selections to central processor 401. Optional cursor control or directing device 407 is coupled to bus 409 for communicating user input information and command selections to central processor 401. Signal communication interface (input/output device) 408, also coupled to bus 409, can be a serial port. Communication interface 408 may also include wireless communication mechanisms. Using communication interface 408, computer system 400 can be communicatively coupled to other computer systems over a communication network such as the Internet or an intranet (e.g., a local area network).
[0051] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.