Guiding protocol development for magnetic resonance thermometry
11176717 · 2021-11-16
Assignee
- Siemens Healthcare GmbH (Erlangen, DE)
- The United States of America, as Represented by the Secretary, Department of Health and Human Services (Bethesda, MD, US)
Inventors
- Waqas Majeed (Ellicott City, MD, US)
- Sunil Goraksha Patil (Ellicott City, MD, US)
- Rainer Schneider (Erlangen, DE)
- Himanshu Bhat (Newton, MA, US)
- Adrienne Campbell (Bethesda, MD, US)
Cpc classification
G01R33/543
PHYSICS
G01R33/5608
PHYSICS
G06T11/008
PHYSICS
G06T2207/20182
PHYSICS
Abstract
A method for decomposing noise into white and spatially correlated components during MR thermometry imaging includes acquiring a series of MR images of an anatomical object and generating a series of temperature difference maps of the anatomical object. The method further includes receiving a selection of a region of interest (ROI) within the temperature difference map and estimating total noise variance values depicting total noise variance in the temperature difference map. Each total noise variance value is determined using a random sampling of a pre-determined number of voxels from the ROI. A white noise component and a spatially correlated noise component of the total noise variance providing a best fit to the total noise variance values for all of the random samplings are identified. The white noise component and the spatially correlated noise component are displayed on a user interface.
Claims
1. A method for decomposing noise into white and spatially correlated components during MR thermometry imaging, the method comprising: acquiring a series of MR images of an anatomical object; generating a series of temperature difference maps of the anatomical object, wherein the temperature difference map comprises a plurality of voxels depicting temperature values of the anatomical object at a time period; receiving a selection of a region of interest (ROI) within the temperature difference map; estimating a plurality of total noise variance values depicting total noise variance in the temperature difference map, wherein (i) each total noise variance value is determined using a random sampling of a pre-determined number of voxels from the ROI and (ii) the pre-determined number of voxels used for each random sampling are distinct; identifying a white noise component and a spatially correlated noise component of the total noise variance providing a best fit to the total noise variance values for all of the random samplings according to a predetermined relationship between (i) the total noise variance values, (ii) the white noise component and (iii) the spatially correlated component; and displaying the white noise component and the spatially correlated noise component on a user interface.
2. The method of claim 1, wherein the ROI is selected by a user interacting with a user interface displaying the temperature difference map.
3. The method of claim 2, wherein the ROI is selected by the user drawing the ROI on the user interface.
4. The method of claim 1, wherein the ROI is automatically identified based on a user identification of an anatomical object of interest.
5. The method of claim 1, wherein the series of MR images of the anatomical object are acquired using a first set of parameters comprising an echo time parameter, and the method further comprises: receiving a user selection of a second set of parameters, wherein the second set of parameters includes at least one parameter having a value that is different than a corresponding value in the first set of parameters; generating an updated signal-to-noise value for each voxel in the temperature difference map based on the signal-to-noise values and the second set of parameters; generating an updated white noise component of an updated total noise value for each voxel in the temperature difference map using the updated signal-to-noise value for the voxel and an updated echo time; and displaying the updated white noise components on the user interface.
6. The method of claim 5, wherein the first set of parameters and the second set of parameters each comprise T1 and T2* values for the anatomical object.
7. The method of claim 6, further comprising: receiving a user selection of a tissue type associated with the anatomical object; and determining the T1 and T2* values for the anatomical object based on the tissue type.
8. The method of claim 7, wherein the user selection is received via a drop-down menu in the user interface.
9. The method of claim 6, wherein the T1 and T2* values for the anatomical object are input by a user into the user interface.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
(2) The foregoing and other aspects of the present invention are best understood from the following detailed description when read in connection with the accompanying drawings. For the purpose of illustrating the invention, there are shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the specific instrumentalities disclosed. Included in the drawings are the following Figures:
DETAILED DESCRIPTION
(13) The present disclosure describes systems and methods for guiding protocol development for MR thermometry. Briefly, three techniques are described herein. First, a method is provided that decomposes σ.sub.ΔT.sup.2 into its white and spatially correlated components. The second method allows the prediction and visualization of the effect of modification of imaging parameters on σ.sub.ΔT.sup.2. Third, a method is described that allows comparison of σ.sub.W for different choices of imaging parameters. These three techniques may be used alone or in combination to optimize the thermometry imaging pipeline.
(15) Starting at step 105, a series of MR images of an anatomical object are acquired by the MRI scanner. Techniques for acquiring MR images are generally known in the art and, thus, these techniques are not described in detail herein. A brief overview of MR acquisition is provided below with the description of
(16) Once the temperature maps are generated, they are presented to a user in a user interface (e.g., display of the MR scanner) at step 115. Then, at step 120 the computing system executing the method receives a selection of a region of interest (ROI) within the temperature difference map. In some embodiments, the ROI is selected by a user interacting with a user interface displaying the temperature difference map. For example, in one embodiment, the ROI is selected by the user drawing the ROI on the user interface in an area with no true temperature change. During protocol optimization, a series may be acquired without external heating. In other embodiments, the ROI is automatically identified based on a user identification of an anatomical object of interest. For example, the user may click on the object and one or more object segmentation techniques generally known in the art may be applied to identify the voxels that comprise the object; these voxels can then be used to designate the ROI. In still other embodiments, the computer may automatically identify and segment the object based on the study being performed. For example, the user may specify that a brain study is being performed. The computer may then automatically identify the brain in the image (e.g., using a trained machine learning model), and then perform the segmentation to generate the ROI.
(17) Next, at steps 120-125, the white (σ.sub.W.sup.2) and spatially correlated (σ.sub.S.sup.2) components of the total variance are estimated using an average of multiple random samplings of voxels in the ROI. Averaging ΔT across N voxels reduces σ.sub.W.sup.2 by 1/N but has negligible impact on σ.sub.S.sup.2:
(18) σ.sub.ΔT,avg.sup.2(N)=σ.sub.W.sup.2/N+σ.sub.S.sup.2
where σ.sub.ΔT,avg.sup.2(N) is the temporal variance of ΔT averaged across N randomly sampled voxels in the ROI.
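The estimation can be sketched in Python with NumPy. This is a synthetic illustration only; the noise model, variable names, sample sizes, and all numeric values are assumptions for demonstration, not taken from the patent. For each sample size N, average N randomly chosen ROI voxels in each frame, take the temporal variance of that average, and least-squares fit the model σ.sub.ΔT,avg.sup.2(N)=σ.sub.W.sup.2/N+σ.sub.S.sup.2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model: each time frame has independent white noise per
# voxel plus one spatially correlated offset shared across the ROI.
sigma_w2_true, sigma_s2_true = 4.0, 1.0     # white / correlated variances
n_voxels, n_frames = 2000, 200

white = rng.normal(0.0, np.sqrt(sigma_w2_true), (n_frames, n_voxels))
corr = rng.normal(0.0, np.sqrt(sigma_s2_true), (n_frames, 1))
dT = white + corr                           # temperature-difference series

# For several sample sizes N, average N randomly chosen voxels in each
# frame and record the temporal variance of that spatial average.
sample_sizes = np.array([1, 2, 5, 10, 25, 50, 100, 250])
variances = []
for n in sample_sizes:
    idx = rng.choice(n_voxels, size=n, replace=False)
    variances.append(dT[:, idx].mean(axis=1).var(ddof=1))
variances = np.array(variances)

# Least-squares fit of var(N) = sigma_w2 / N + sigma_s2.
A = np.column_stack([1.0 / sample_sizes, np.ones_like(sample_sizes, float)])
sigma_w2_est, sigma_s2_est = np.linalg.lstsq(A, variances, rcond=None)[0]
print(f"white:      {sigma_w2_est:.2f} (true {sigma_w2_true})")
print(f"correlated: {sigma_s2_est:.2f} (true {sigma_s2_true})")
```

With enough frames and voxels, the fitted components recover the simulated white and spatially correlated variances, mirroring the decomposition displayed at step 130.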
(19) Finally at step 130, the final values for σ.sub.W.sup.2 and σ.sub.S.sup.2 may be presented in the user interface. For example, in one embodiment, a σ.sub.ΔT map overlaid with the ROI is presented adjacent to text indicating the final values for σ.sub.W.sup.2 and σ.sub.S.sup.2 for that particular ROI.
(21) Continuing with reference to
(22) Next, at step 220, the computer receives a user selection of a second set of imaging parameters (e.g., via user interaction with a user interface). The second set of imaging parameters includes at least one parameter having a value that is different than a corresponding value in the first set of imaging parameters. In addition, the user provides T1 and T2* values for the target tissue. For example, the user may provide one of the following: 1) T1 and T2* maps, or 2) typical T1 and T2* values for the tissue of interest. Then, at step 225, the computer system generates a relative change in signal-to-noise value for each voxel using both sets of imaging parameters and the T1 and T2* values, along with fundamental MRI signal/noise equations generally known in the art. For example, the following equation defines signal-to-noise in terms of voxel volume, number of measurements, and receiver bandwidth:
(23) SNR=K·V.sub.voxel·{square root over (N.sub.meas/BW)}
The constant K includes factors dependent on the MRI system hardware (e.g., coil, pre-amp and noise power spectrum), pulse sequence parameters (e.g., TR, TE), and tissue-dependent factors (e.g., T1, T2*).
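One possible form of this prediction is sketched below. The spoiled gradient-echo signal model, the function name `relative_snr`, and all parameter values are illustrative assumptions, not taken from the patent; because the hardware-dependent constant K cancels in a ratio, only the relative SNR between the two parameter sets is computed:

```python
import math

def relative_snr(p1, p2, t1, t2_star):
    """Ratio SNR2/SNR1 for two imaging-parameter sets, using a spoiled
    gradient-echo signal model (an illustrative assumption)."""
    def snr(p):
        e1 = math.exp(-p["tr"] / t1)
        # Steady-state spoiled gradient-echo signal with T2* decay at TE.
        signal = (math.sin(p["flip"]) * (1.0 - e1)
                  / (1.0 - math.cos(p["flip"]) * e1)
                  * math.exp(-p["te"] / t2_star))
        # SNR ∝ voxel volume × sqrt(measurements / bandwidth) × signal;
        # the hardware constant K cancels in the ratio below.
        return p["voxel_vol"] * math.sqrt(p["n_meas"] / p["bandwidth"]) * signal
    return snr(p2) / snr(p1)

# Example: halving the receiver bandwidth (other parameters unchanged)
base = dict(tr=0.030, te=0.010, flip=math.radians(15),
            voxel_vol=8.0, n_meas=1, bandwidth=500.0)
low_bw = dict(base, bandwidth=250.0)
print(relative_snr(base, low_bw, t1=1.4, t2_star=0.040))  # sqrt(2) ≈ 1.414
```

The computed ratio could then scale the per-voxel SNR map, and hence the predicted white noise component, as described at step 225.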
(24) Continuing with reference to
(26) Continuing with reference to
(27) The general concept illustrated in
(29) The estimated values of σ.sub.W.sup.2 and σ.sub.S.sup.2 suggest that σ.sub.ΔT.sup.2 is dominated by white noise. This result guides the user to focus on reducing σ.sub.W.sup.2 by optimizing the imaging protocol and applying spatial/temporal filtering or using averaged baseline images. To test this prediction of the method 100, two different modifications were made to the original processing pipeline, as described below.
(30) The same data were processed by modifying the processing pipeline in the following way: an average of two baseline images was used for phase subtraction, instead of a single baseline image. This should reduce σ.sub.W.sup.2 but not σ.sub.S.sup.2. The resultant σ.sub.ΔT map is shown on the left-hand side (labeled “a”) of
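The expected effect of baseline averaging can be checked numerically: subtracting the mean of M independent baseline images from a frame leaves a white-noise variance of σ.sup.2(1+1/M) in the difference, versus 2σ.sup.2 for a single baseline. A synthetic sketch (all values illustrative, not the patent's experimental data):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0                       # per-image white-noise variance
n_voxels = 100_000

# One "heated" frame containing only white noise (no true temperature change).
frame = rng.normal(0.0, np.sqrt(sigma2), n_voxels)

for m in (1, 2, 4):
    # Average m independent baseline images before phase subtraction.
    baselines = rng.normal(0.0, np.sqrt(sigma2), (m, n_voxels))
    diff = frame - baselines.mean(axis=0)
    # Expected white-noise variance of the difference: sigma2 * (1 + 1/m)
    print(m, diff.var(ddof=1), sigma2 * (1 + 1 / m))
```

Averaging two baselines thus reduces the white-noise variance of the difference image from 2σ.sup.2 to 1.5σ.sup.2, while a temporally persistent spatially correlated pattern would be unaffected, consistent with the pipeline modification described above.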
(31) To evaluate the effect of removing the DC phase component, the same data were processed by modifying the original processing pipeline (used for
(33) The method 300 described above in
(35) Further, RF (radio frequency) module 20 provides RF pulse signals to RF coil 18, which in response produces magnetic field pulses which rotate the spins of the protons in the imaged body of the patient 11 by ninety degrees or by one hundred and eighty degrees for so-called “spin echo” imaging, or by angles less than or equal to 90 degrees for so-called “gradient echo” imaging. Gradient and shim coil control module 16 in conjunction with RF module 20, as directed by central control computer 26, control slice-selection, phase-encoding, readout gradient magnetic fields, radio frequency transmission, and magnetic resonance signal detection, to acquire magnetic resonance signals representing planar slices of patient 11.
(36) In response to applied RF pulse signals, the RF coil 18 receives MR signals, i.e., signals from the excited protons within the body as they return to an equilibrium position established by the static and gradient magnetic fields. The MR signals are detected and processed by a detector within RF module 20 and k-space component processor unit 34 to provide an MR dataset to an image data processor for processing into an image. In some embodiments, the image data processor is located in central control computer 26. However, in other embodiments such as the one depicted in
(37) A magnetic field generator (comprising coils 12, 14 and 18) generates a magnetic field for use in acquiring multiple individual frequency components corresponding to individual data elements in the storage array. The individual frequency components are successively acquired in an order in which the radius of respective corresponding individual data elements increases and decreases along a substantially spiral path as the multiple individual frequency components are sequentially acquired during acquisition of an MR dataset representing an MR image. A storage processor in the k-space component processor unit 34 stores individual frequency components acquired using the magnetic field in corresponding individual data elements in the array. The radius of respective corresponding individual data elements alternately increases and decreases as multiple sequential individual frequency components are acquired. The magnetic field acquires individual frequency components in an order corresponding to a sequence of substantially adjacent individual data elements in the array, and the magnetic field gradient change between successively acquired frequency components is substantially minimized.
(38) Central control computer 26 uses information stored in an internal database to process the detected MR signals in a coordinated manner to generate high quality images of a selected slice(s) of the body (e.g., using the image data processor) and adjusts other parameters of system 1000. The stored information comprises predetermined pulse sequence and magnetic field gradient and strength data as well as data indicating timing, orientation and spatial volume of gradient magnetic fields to be applied in imaging. Generated images are presented on display 40 of the operator interface. Computer 28 of the operator interface includes a graphical user interface (GUI) enabling user interaction with central control computer 26 and enables user modification of magnetic resonance imaging signals in substantially real time. Display processor 37 processes the magnetic resonance signals to provide image representative data for display on display 40, for example.
(39) The embodiments of the present disclosure may be implemented with any combination of hardware and software. In addition, the embodiments of the present disclosure may be included in an article of manufacture (e.g., one or more computer program products) having, for example, computer-readable, non-transitory media. The media has embodied therein, for instance, computer readable program code for providing and facilitating the mechanisms of the embodiments of the present disclosure. The article of manufacture can be included as part of a computer system or sold separately.
(40) The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processor for execution. A computer readable medium may take many forms including, but not limited to, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as hard disk or removable media drive. One non-limiting example of volatile media is dynamic memory. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up one or more buses. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
(41) While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
(42) An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
(43) As used herein, the term “user interface” means any device for rendering information to a user and/or requesting information from the user. A user interface includes at least one of textual, graphical, audio, video, animation, and/or haptic elements. A textual element can be provided, for example, by a monitor, display, projector, etc. A graphical element can be provided, for example, via a monitor, display, projector, and/or visual indication device, such as a light, flag, beacon, etc. An audio element can be provided, for example, via a speaker, microphone, and/or other sound generating and/or receiving device. A video element or animation element can be provided, for example, via a monitor, display, projector, and/or other visual device. A haptic element can be provided, for example, via a very low frequency speaker, vibrator, tactile stimulator, tactile pad, simulator, keyboard, keypad, mouse, trackball, joystick, gamepad, wheel, touchpad, touch panel, pointing device, and/or other haptic device, etc.
(44) The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.
(45) The system and processes of the figures are not exclusive. Other systems, processes and menus may be derived in accordance with the principles of the invention to accomplish the same objectives. Although this invention has been described with reference to particular embodiments, it is to be understood that the embodiments and variations shown and described herein are for illustration purposes only. Modifications to the current design may be implemented by those skilled in the art, without departing from the scope of the invention. As described herein, the various systems, subsystems, agents, managers and processes can be implemented using hardware components, software components, and/or combinations thereof. No claim element herein is to be construed under the provisions of 35 U.S.C. 112(f), unless the element is expressly recited using the phrase “means for.”