SELECTIVE RESAMPLING DURING NON-INVASIVE THERAPY
20190351261 · 2019-11-21
CPC classification
G01R33/483
PHYSICS
G01R33/561
PHYSICS
G01R33/5619
PHYSICS
A61B5/055
HUMAN NECESSITIES
A61B90/37
HUMAN NECESSITIES
International classification
A61B5/055
HUMAN NECESSITIES
Abstract
During a focused-ultrasound or other non-invasive procedure, regions of change within a target region are monitored, and images of the target region are updated with partial images encompassing only the regions of change.
Claims
1. A system for imaging a target region comprising a feature therewithin, the system comprising: an imaging apparatus, operable in conjunction with a treatment apparatus, for acquiring images, the imaging apparatus being configured to acquire and computationally store (i) a baseline k-space image of the target region, (ii) a comparative k-space image of the target region during an operational sequence, and (iii) one or more new k-space images each encompassing only a portion of the target region during the operational sequence; and a computation unit configured to (i) computationally compare the comparative k-space image with the baseline k-space image to identify one or more first image regions of k-space associated with a changed characteristic within the target region, the comparative k-space image comprising (a) the one or more first image regions and (b) a remaining image region, and (ii) cause the imaging apparatus to acquire a new k-space image by sampling only in the one or more first image regions.
2. The system of claim 1, wherein the new k-space image comprises pixels corresponding to the newly sampled one or more image regions and information based at least in part on previously sampled pixels corresponding to the remaining image region.
3. The system of claim 1, wherein the computation unit is configured to computationally reconstruct a real-space image from the comparative k-space image.
4. The system of claim 1, wherein the computation unit is configured to computationally reconstruct a real-space image from the new k-space image.
5. The system of claim 1, wherein the imaging apparatus is an MRI apparatus.
6. The system of claim 1, wherein the treatment apparatus comprises one or more ultrasound transducers.
7. The system of claim 1, wherein the changed characteristic within the target region comprises a pixel value.
8. The system of claim 1, wherein the computation unit is configured to steer and/or modulate an energy beam based on the new k-space image and/or a real-space image computationally reconstructed therefrom.
9. The system of claim 8, wherein the energy beam is a focused ultrasound beam.
10. The system of claim 1, wherein the operational sequence comprises exposure of a target other than the feature, the computation unit being configured to shape and/or steer an energy beam onto the target so as to avoid the feature based on the new k-space image and/or a real-space image computationally reconstructed therefrom.
11. The system of claim 10, wherein the energy beam is a focused ultrasound beam.
12. The system of claim 1, wherein the computation unit is configured to (i) identify a plurality of first image regions of k-space associated with the changed characteristic within the target region, and (ii) sample in the first image regions at a frequency, for each first image region, based at least in part on a magnitude of the change in the characteristic therein.
13. The system of claim 1, wherein the one or more first image regions are identified at least in part by estimation based on at least one previous k-space image.
14. The system of claim 1, wherein the computation unit is configured to iteratively repeat the acquisition step by sampling in the one or more first image regions at a first frequency and in the remaining region at a second frequency lower than the first frequency.
15. A method for imaging, during an operational sequence, a target region comprising a feature therewithin, the method comprising: acquiring a baseline k-space image of the target region; and thereafter, during the operational sequence: (a) acquiring a comparative k-space image of the target region; (b) computationally comparing the comparative k-space image with the baseline k-space image to identify one or more first image regions of the comparative k-space image having a changed characteristic, the comparative k-space image comprising (i) the one or more first image regions and (ii) a remaining image region; and (c) subsequently acquiring a new k-space image by sampling only the one or more first image regions, the new k-space image comprising pixels corresponding to the newly sampled one or more first image regions and additional pixel values based at least in part on previously sampled pixels corresponding to the remaining image region.
16. The method of claim 15, further comprising displaying a real-space image computationally reconstructed from the new k-space image.
17. The method of claim 15, wherein step (c) is repeated one or more times.
18. The method of claim 17, further comprising repeating steps (a) and (b) after repeating step (c) one or more times.
19. The method of claim 15, wherein the changed characteristic within the target region is a pixel value.
20. The method of claim 15, wherein the baseline k-space image and the comparative k-space image are full-scan MRI images.
21. The method of claim 15, wherein the new k-space image is a partial-scan MRI image.
22. The method of claim 15, wherein the operational sequence comprises exposure of the feature to an energy beam.
23. The method of claim 15, wherein the operational sequence comprises steering and/or modulating an energy beam based on the new k-space image and/or a real-space image computationally reconstructed therefrom.
24. The method of claim 23, wherein the energy beam is a focused ultrasound beam.
25. The method of claim 15, wherein the operational sequence comprises exposure of a target other than the feature.
26. The method of claim 25, further comprising shaping and/or steering an energy beam onto the target region so as to avoid the feature based on the new k-space image and/or a real-space image computationally reconstructed therefrom.
27. The method of claim 26, wherein the energy beam is a focused ultrasound beam.
28. The method of claim 15, wherein a plurality of first image regions of k-space associated with the changed characteristic within the target region is identified, and further comprising the step of sampling in the first image regions at a frequency, for each first image region, based at least in part on a magnitude of the change in the characteristic therein.
29. The method of claim 15, wherein the one or more first image regions are identified at least in part by estimation based on at least one previous k-space image.
30. The method of claim 15, wherein step (c) is iteratively repeated by sampling in the one or more first image regions at a first frequency and in the remaining region at a second frequency lower than the first frequency.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, with an emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments of the present invention are described with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0030] In various embodiments, the present invention provides systems and methods for monitoring one or more rapidly changing characteristics of a target (e.g., a treatment target) or other objects of interest in a region of interest in real time during an image-guided treatment procedure. The procedure may, for example, involve the application of focused ultrasound to (i.e., the sonication of) a material, a tissue or organ for the purpose of heating it, either to necrose, ablate, or otherwise destroy the tissue if it is, e.g., cancerous, or for non-destructive treatments such as pain amelioration or the controlled inducement of hyperthermia. Ultrasound may also be used for other, nonthermal types of treatment, such as, e.g., neuromodulation. Alternatively, the procedure may use different forms of therapeutic energy, such as, e.g., radio-frequency (RF) radiation, X-rays or gamma rays, or charged particles, or involve other treatment modalities such as cryoablation.
[0031] Tracking of various changing characteristics (e.g. position and/or temperature) in treatment procedures may serve to guide the therapeutic energy beam onto the target and/or around other, non-target tissues and organs, i.e., to adjust the beam focus, profile, and/or direction based on images of the affected anatomical region, which may, in some embodiments, also visualize the beam focus. MRI is a widely used technique for such image-based tracking. However, other imaging techniques, including, e.g., X-ray imaging, X-ray computed tomography (CT), or ultrasound imaging, may also be used and are within the scope of the present invention. In addition, the tracking may be achieved using one or more two-dimensional images and/or three-dimensional images. An exemplary system for implementing methods in accordance with various embodiments is an MRgFUS system, such as the one depicted in
[0033] In step 210, treatment of a target within the imaged treatment area is initiated or altered. As mentioned above, this step may involve either the initiation of a treatment sequence or the alteration of one or more treatment parameters during treatment. In either case, the initiation or alteration of the treatment is expected to result in or be accompanied by the change in at least one parameter (e.g., temperature, size, location, etc.) associated with the target or with another feature in the treatment area. For example, focused-ultrasound ablation of a tumor may be carried out in two or more phases: a first phase during which the central region of the tumor is targeted, and one or more subsequent phases in which the peripheral regions of the tumor are exposed to ultrasound. Since the risk to healthy tissue surrounding the tumor increases as treatment progresses, so may the need for accurate, real-time monitoring of changes within the treatment area.
[0034] In step 215, comparative image information is acquired from the treatment area after the initiation or alteration of the treatment. The comparative image information may be acquired in one or more scans of the treatment area. In various embodiments, the comparative image information corresponds to at least one full k-space scan of the treatment area. In an optional step 220, the comparative image information is computationally reconstructed to form a real-space image (e.g., via inverse Fourier transformation of k-space data).
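For concreteness, the reconstruction of optional step 220 is, for MRI data, an inverse Fourier transform of the k-space array. The following Python/NumPy fragment is an illustrative, non-limiting sketch; the function name and the convention of storing the zero-frequency component at the center of k-space are assumptions, not part of the disclosure:

```python
import numpy as np

def reconstruct_real_space(kspace: np.ndarray) -> np.ndarray:
    """Reconstruct a real-space image from a full 2-D k-space scan.

    MRI data is conventionally stored with the zero-frequency (DC)
    component at the center of k-space, so the array is un-shifted
    before the inverse 2-D FFT and the result is re-centered after.
    """
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace)))
    # The magnitude of the complex image is the conventional
    # anatomical image; the phase is retained separately when
    # needed (e.g., for PRF thermometry).
    return np.abs(img)
```

Applying the forward transform with the same conventions and then this function recovers the original image to within floating-point round-off.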
[0035] In step 225, the comparative image information is compared with the initial baseline image information in order to identify portions thereof corresponding to changes occurring within the treatment area in response to the treatment initiated or altered in step 210. For example, changes in the position, size, or temperature of the target in response to the treatment may be encoded as changes in the acquired image information. That is, pixel values within the image information may change between the baseline scan and the comparative scan; such changing pixel values may correspond to, for example, changes in the location of the target (or a portion thereof) and/or changes in the temperature (and/or another sensed characteristic) of the target (or a portion thereof) and/or of surrounding tissue. In various embodiments, the comparative image information is compared with the baseline image information, data point by data point (e.g., k-space pixel by k-space pixel), and portions of the comparative image information in which the data has changed significantly are identified. For example, the temperature of a target, or a portion thereof, may increase in response to the initiation of treatment (e.g., the application of therapeutic energy), and the localized temperature increase will be reflected in the comparative image information. The identified portions will typically correspond to partial k-space data, e.g., one or more portions of the full k-space scan that are not necessarily contiguous in k-space.
[0036] The comparison in step 225 may be based on k-space data. Typically, the comparison is performed on a pixel-by-pixel basis, where a pixel refers to an element of the k-space data array, which generally stores amplitude and phase values or an equivalent representation (e.g., the real and imaginary parts of a complex number) as a function of k-space coordinates. For example, in various embodiments, the similarity between pixels of the k-space data acquired in steps 205 and 215 may be measured, and the identified portions of the k-space data correspond to those where the data is changing significantly. By "significantly" is meant that the change is sufficient that reliance on previous data would be clinically inappropriate. There are numerous quantitative and heuristic ways in which the pixels that will be updated may be identified. In one approach, the degree of change in a pixel parameter is quantified to determine whether similarity falls beneath a predefined similarity threshold. That is, in some embodiments, the change in the pixel parameter between the comparative image information and the baseline image information is assessed against the similarity threshold, and only if the level of similarity falls below the threshold (which typically means, for metrics that measure the differences, i.e., the dissimilarity, between images, that the value of the metric exceeds the threshold value) is that portion of the comparative image information identified as significantly changing.
Suitable similarity metrics include, for example, pixel intensity, cross-correlation coefficients, the sum of squared intensity differences, mutual information (as the term is used in probability and information theory), ratio-image uniformity (i.e., the normalized standard deviation of the ratio of corresponding pixel values), the mean squared error, the sum of absolute differences, the sum of squared errors, the sum of absolute transformed differences (which uses a Hadamard or other frequency transform of the differences between corresponding pixels in the two images), or complex cross-correlation (for complex images, such as MRI images), and other techniques familiar to those of skill in the art in connection with image comparison and registration.
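As an illustrative, non-limiting sketch of the pixel-by-pixel comparison of step 225, the following Python/NumPy fragment flags complex k-space pixels whose change from baseline exceeds a fixed dissimilarity threshold (here the magnitude of the complex difference, one of many possible metrics) and groups flagged pixels into rows, a natural re-sampling unit when data is acquired line by line. All function names are hypothetical:

```python
import numpy as np

def changed_pixel_mask(baseline: np.ndarray,
                       comparative: np.ndarray,
                       threshold: float) -> np.ndarray:
    """Flag k-space pixels whose change from baseline is significant.

    k-space pixels are complex (amplitude and phase), so the per-pixel
    dissimilarity used here is the magnitude of the complex difference.
    Returns a boolean mask: True marks a significantly changed pixel.
    """
    diff = np.abs(comparative - baseline)
    return diff > threshold

def changed_rows(mask: np.ndarray) -> np.ndarray:
    """Indices of k-space rows containing at least one changed pixel.

    Rows are a convenient re-sampling unit because k-space data is
    commonly acquired line by line.
    """
    return np.flatnonzero(mask.any(axis=1))
```

A single changed pixel thus marks its entire row for re-acquisition in the subsequent partial scan.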
[0037] The value of the threshold, that is, the amount of change sufficient to justify re-scanning at a higher rate, depends on the parameter and the application. In some embodiments, the threshold is fixed, but more typically it is defined dynamically based on the k-space data itself. For example, the threshold can be defined statistically in terms of the mean intensity, e.g., one standard deviation from the mean pixel-intensity value (or other pixel-parameter value). The threshold can also be defined based on the maximum difference in the pixel parameter across the comparative image relative to the baseline image, e.g., 25% of the maximum difference.
[0038] Alternatively, instead of an explicit threshold, the areas that have changed the most relative to a baseline may be selected for update. For example, a defined percentage of the image information exhibiting the greatest degree of change may be identified. In one embodiment, the pixels in the comparative image can be ranked in terms of their difference from the corresponding pixels of the baseline image, and the top 25% or 50% of pixels (i.e., the 25% or 50% whose differences from the baseline image are largest) are selected for update. Statistics can be used to determine the percentage of pixels identified for updating. For example, if the parameter difference considered across all pixels in the new and previous images is bimodal, all pixels within the peak having the higher mean difference may be selected.
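The threshold-free, ranking-based variant described above (selecting a defined fraction of the most-changed pixels) can be sketched as follows; the quantile-based cutoff is one possible implementation, and the names are illustrative:

```python
import numpy as np

def top_fraction_mask(baseline: np.ndarray,
                      comparative: np.ndarray,
                      fraction: float = 0.25) -> np.ndarray:
    """Select the given fraction of pixels showing the largest change,
    with no explicit change threshold.

    Pixels are ranked by the magnitude of their difference from the
    baseline, and only the top `fraction` (e.g., 25%) are marked for
    re-acquisition.
    """
    diff = np.abs(comparative - baseline)
    # The cutoff is the (1 - fraction) quantile of the differences;
    # pixels strictly above it form the update set.
    cutoff = np.quantile(diff, 1.0 - fraction)
    return diff > cutoff
```

The same ranking could instead feed a statistical rule, e.g., selecting all pixels within the higher-mean peak of a bimodal difference distribution.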
[0039] In various embodiments, the comparison in step 225 between the comparative image information and the baseline image information can involve multiple different thresholds or other comparison metrics, each corresponding to a different amount of change (based on, e.g., one or more of the similarity metrics listed above). In this manner, portions of the image information that exhibit different rates of change may be updated at different frequencies. For example, a first portion of the image information exhibiting a lesser amount of change may be updated less frequently (e.g., every 2-10 acquisitions of partial image information) than a second portion of the image information exhibiting a greater amount of change (which may be updated, e.g., during each acquisition of partial image information). Thus, for example, if the parameter difference considered across all pixels in the new and previous images is multimodal, the pixels corresponding to each peak can be updated at a different rate. Similarly, different thresholds can be defined, each corresponding to a different update rate. Moreover, updating can be performed on identified pixel regions (i.e., regions of a given size whose mean parameter difference is considered significant) rather than at the level of specific pixels.
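A minimal sketch of this multi-rate updating, assuming two hypothetical thresholds that map a region's change magnitude to an update frequency (region-level bookkeeping; the thresholds, the default period, and all names are assumptions for illustration):

```python
import numpy as np

def update_due(change: np.ndarray, cycle: int,
               hi_thresh: float, lo_thresh: float,
               slow_period: int = 4) -> np.ndarray:
    """Decide, per k-space region, whether to re-sample this cycle.

    `change` holds one change magnitude per region (e.g., the mean
    pixel difference from baseline). Regions at or above `hi_thresh`
    are re-sampled every cycle; regions between `lo_thresh` and
    `hi_thresh` only every `slow_period` cycles; regions below
    `lo_thresh` are not re-sampled at all.
    """
    fast = change >= hi_thresh
    slow = (change >= lo_thresh) & ~fast & (cycle % slow_period == 0)
    return fast | slow
```

Each imaging cycle, only the regions for which this returns True are included in the partial scan of step 230.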
[0040] In various embodiments of the present invention, if no portion of the comparative image information exhibits a rate of change that exceeds any of the relevant thresholds, then the comparative image information may be utilized as the new baseline image information, and the method may commence with the acquisition of new full-scan comparative image information. This new acquisition of comparative image information may be undertaken after a delay. In various embodiments, the duration of the delay may be related to the difference between a sensed change in the previously compared images and the change threshold; for example, the delay may be decreased as the sensed change level approaches the threshold. Embodiments of the invention tolerate the slower imaging speed involved with the acquisition of new full scans due to the lack of a rapidly changing characteristic in the treatment area. In this manner, embodiments of the invention compare full-scan image information until the magnitude of a sensed change exceeds a threshold and acquisition and use of partial image information commences.
[0041] In various embodiments, k-space data is acquired in a continuous manner, usually row by row or column by column, and thus one or more portions of the image information identified in step 225 may include, consist essentially of, or consist of one or more rows and/or one or more columns of k-space data, or an edge or interior region of the k-space image (containing pixels from multiple contiguous rows and columns, but not necessarily spanning the entirety of any particular row or column). The k-space data may be contiguous or separated into multiple spaced-apart (e.g., by one or more rows or columns) groups. It should be understood that image data can be acquired other than in discrete rows or columns, e.g., in tracks such as spirals or other patterns.
[0042] In step 230, partial image information corresponding to the portions of the image information identified in step 225 is newly acquired. Such partial image information typically corresponds to one or more partial k-space scans. For example, one or more MRI scans may be performed that collect image information corresponding to only a portion of the treatment area, e.g., the part(s) of the treatment area exhibiting the most rapid change. A partial k-space scan typically includes, consists essentially of, or consists of fewer rows and/or columns of k-space data than in a full k-space scan. In step 235, a new image is formed or acquired by replacing the portions identified in step 225 with the partial image information acquired in step 230; step 235 is therefore equivalent to updating the comparative image information (e.g., the most recent full k-space scan) with the newly acquired partial image data. In an optional step 240, the new image information is reconstructed to form an updated real-space image.
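The update of step 235 reduces, in the row-wise case, to overwriting only the re-sampled k-space rows while retaining previously acquired data for the remaining region. An illustrative sketch (names hypothetical):

```python
import numpy as np

def merge_partial_scan(stored_kspace: np.ndarray,
                       partial_rows: np.ndarray,
                       row_indices: np.ndarray) -> np.ndarray:
    """Step 235 in miniature: replace only the re-sampled k-space
    rows, keeping previously acquired data everywhere else.

    `partial_rows` holds the newly acquired data, one row per entry
    of `row_indices`. The stored scan is not modified in place, so
    it remains available as comparison material for later cycles.
    """
    updated = stored_kspace.copy()
    updated[row_indices, :] = partial_rows
    return updated
```

The returned array is the "new image" of step 235 and can be reconstructed to real space exactly as a full scan would be.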
[0043] In some embodiments, the data from the previous image that is not updated is not re-used directly in the new image, but is instead adjusted on a pixel-by-pixel basis. For example, the intensities of pixels from the previous image that are not updated may be increased by a fraction of the mean intensity difference between pixels in the previous and new images that are updated. In another approach, the pixels that are not updated are treated as missing data and assigned values depending on a probability distribution in accordance, for example, with the EM algorithm.
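The first adjustment strategy above, offsetting stale (non-updated) pixels by a fraction of the mean change observed among the updated pixels, might be sketched as follows. The default fraction of 0.5 is an assumed value for illustration, not from the source:

```python
import numpy as np

def adjust_stale_pixels(previous: np.ndarray,
                        new_partial: np.ndarray,
                        updated_mask: np.ndarray,
                        fraction: float = 0.5) -> np.ndarray:
    """Build a new image in which updated pixels take their freshly
    sampled values and non-updated pixels are offset, rather than
    reused verbatim, by a fraction of the mean change seen in the
    updated pixels.
    """
    result = previous.copy()
    result[updated_mask] = new_partial[updated_mask]
    # Mean change observed where fresh data is available:
    mean_shift = np.mean(new_partial[updated_mask] - previous[updated_mask])
    # Apply a damped version of that change to the stale pixels.
    result[~updated_mask] += fraction * mean_shift
    return result
```

A probabilistic alternative, as noted above, would instead treat the stale pixels as missing data and impute them, e.g., via the EM algorithm.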
[0044] In various embodiments, step 230 also includes newly acquiring one or more portions of the comparative image information not identified in step 225 as meeting a threshold of change, but acquiring such portions by sampling at a lower resolution. In this manner, the portions of the image information identified as not changing (at least, not undergoing change sufficiently rapid to warrant an updated scan) may be spot-checked to verify that they are still static. The benefits of a lower sampling rate are thus obtained without re-analyzing full images, since in the static regions a smaller number of pixels may be monitored over time and utilized for comparative purposes. In various embodiments, all or a portion of the lower-resolution sampled image information from the static regions may be incorporated into new images formed in step 235 and/or in images utilized for comparisons and identification of changing regions.
[0045] In various embodiments of the invention, steps 230 and 235 may be repeated one or more times during the course of the treatment (or a particular portion of a treatment sequence), thereby enabling high-speed imaging of the treatment target at a frequency comparable to, or even exceeding, the rate of one or more changes within the treatment area resulting from the treatment. As mentioned above, during a treatment sequence, one or more treatment parameters may be varied (i.e., step 210), and the method 200 may thus proceed from step 210 accordingly. In such cases, the most recently constructed image information (i.e., the new image information assembled in step 235) or the most recently acquired comparative full scan (i.e., in step 215) may be utilized as new baseline image information when the method is repeated, and new regions of change may be identified based thereon.
[0046] In various embodiments, the method 200 may proceed back to step 215, even without alteration of a treatment parameter, in order to acquire new comparative image information (e.g., a full k-space scan). In such embodiments, the new comparative image information may be compared (e.g., in step 225) to the original baseline image information acquired in step 205 and/or to prior comparative image information acquired in a previously completed step 215 and/or to new image information previously assembled in step 235.
[0047] In various embodiments of the invention, the treatment sequence may be altered based on the new image information developed in step 235 and/or on the real-space image reconstructed in step 240. For example, the ultrasound (or other therapeutic energy) beam may be steered during the treatment procedure to compensate for any target motion, or the beam intensity may be modulated in response to temperature changes in the target. Similarly, if changes in non-target organs or tissues are detected, such changes may be utilized to steer, shape, and/or modulate the beam so as to avoid or minimize their exposure to the therapeutic energy. Particularly, organs vulnerable to damage from the therapeutic beam are often of high interest, and any changes within or of such organs may be taken into account during beam-forming and/or steering such that the energy beam is shaped or modulated so as to treat the target while avoiding damaging temperature increases in sensitive adjacent organs.
[0048] As a conceptual example, consider a k-space image divided for update purposes into four regions A, B, C, D, and suppose it is desired to boost the overall imaging speed by a factor of two; accordingly, only two of the four regions are sampled in each cycle. A representative workflow would begin with no prior information, and so sample:
[0049] t=0: A,B,C,D
where t identifies an imaging cycle.
[0050] At this stage the full k-space has been sampled, resulting in a baseline image. (Because this k-space image can be processed into the first real-space image, treatment may begin here.) Because there is no comparative information yet, two arbitrary regions are sampled at each cycle:
[0051] t=1: A,B
[0052] t=2: C,D
[0053] Now changes between pixels can be assessed. Assume, for example, that the pixels in region A are, on average, changing significantly, region B exhibits minor changes on average, and regions C and D change negligibly. A representative sampling strategy is:
[0054] t=3: A,B
[0055] t=4: A,C
[0056] t=5: A,B
[0057] t=6: A,D
[0058] t=7: A,B
[0059] t=8: A,C
[0060] t=9: A,B
[0061] t=10: A,D
[0062] After each cycle the change in each region relative to the last time that region was sampled can be assessed. If, for example, region B is observed to change faster than region A, the sampling strategy can be switched dynamically to:
[0063] t=11: B,A
[0064] t=12: B,C
[0065] t=13: B,A
[0066] t=14: B,D
[0067] In this way, each region is updated at a frequency based on the observed need for updating, and there is no need to perform a full scan of all regions (e.g., in the middle of treatment) to adapt the sampling strategy.
[0068] At each cycle (t>1), the full k-space can be assembled from the recent measurements, e.g., after t=14 the full k-space is assembled from A13, B14, C12, D14 (where A13 refers to region A sampled at cycle 13). Alternatively, in some regions (e.g., region A), extrapolation or another estimation technique (e.g., probabilistic modeling using a Kalman filter) can be used to predict current pixel values at the region (rather than pixel) level, e.g., region A14 can be approximated by A13 + (A13 - A11)/2.
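The region-level linear extrapolation in this example (A14 approximated by A13 + (A13 - A11)/2) generalizes to any pair of recent samples of a region; a sketch, with illustrative names:

```python
import numpy as np

def extrapolate_region(sample_old: np.ndarray, t_old: int,
                       sample_new: np.ndarray, t_new: int,
                       t_target: int) -> np.ndarray:
    """Linearly extrapolate a region's k-space data to the current
    cycle from its two most recent samples.

    With samples at cycles 11 and 13 and a target cycle of 14, this
    reproduces the example above: A14 = A13 + (A13 - A11)/2.
    """
    # Per-cycle rate of change estimated from the two samples:
    rate = (sample_new - sample_old) / (t_new - t_old)
    return sample_new + rate * (t_target - t_new)
```

A Kalman filter, as noted above, would replace this fixed linear model with a probabilistic state estimate updated at each sampling of the region.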
[0069] As described above, in some embodiments, imaging during a treatment procedure is used to quantitatively monitor in vivo temperature changes. This is particularly useful in MR-guided thermal therapy (e.g., MRgFUS treatment), where the temperature of a treatment area (e.g., a tumor to be destroyed by heat) should be continuously monitored in order to assess the progress of treatment and correct for local differences in heat conduction and energy absorption to avoid damage to tissues surrounding the treatment area. The monitoring (e.g., measurement and/or mapping) of temperature with MR imaging is generally referred to as MR thermometry or MR thermal imaging.
[0070] Among various methods available for MR thermometry, the proton resonance frequency (PRF) shift method is often the method of choice due to its excellent linearity with respect to temperature change, near-independence from tissue type, and temperature map acquisition with high spatial and temporal resolution. The PRF shift method is based on the phenomenon that the MR resonance frequency of protons in water molecules changes linearly with temperature (with a constant of proportionality that, advantageously, is relatively constant between tissue types). The frequency change with temperature is small, only 0.01 ppm/°C for bulk water and approximately 0.0096 to 0.013 ppm/°C in tissue, but the PRF shift is straightforwardly detected with a phase-sensitive imaging method in which image information acquired prior to a temperature change is compared to image information acquired after the temperature change, e.g., during or after treatment, thereby capturing a small phase change that is proportional to the change in temperature. A map of temperature changes may then be computed from the images by determining, on a pixel-by-pixel basis, phase differences between the baseline image and the treatment image, and converting the phase differences into temperature differences based on the PRF temperature dependence while taking into account imaging parameters such as the strength of the static magnetic field and echo time (TE) (e.g., of a gradient-recalled echo).
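The final conversion step, from a per-pixel phase difference to a temperature-change map, follows the standard PRF relation Δφ = 2π·γ·α·B0·TE·ΔT. In the sketch below, the gyromagnetic ratio and the PRF coefficient are typical literature values assumed for illustration, not values taken from this disclosure:

```python
import numpy as np

GAMMA_HZ_PER_T = 42.58e6   # proton gyromagnetic ratio, Hz/T (assumed)
PRF_COEFF = -0.01e-6       # PRF shift coefficient, about -0.01 ppm/degC (assumed)

def phase_to_temperature_change(phase_diff: np.ndarray,
                                b0_tesla: float,
                                te_seconds: float) -> np.ndarray:
    """Convert a per-pixel phase difference (radians) between baseline
    and treatment images into a temperature-change map (degrees C),
    using the standard PRF relation and the imaging parameters
    (static field strength B0 and echo time TE).
    """
    return phase_diff / (2 * np.pi * GAMMA_HZ_PER_T * PRF_COEFF
                         * b0_tesla * te_seconds)
```

The resulting map covers only the pixels for which fresh phase data exists, so in the partial-scan scheme described above it can be restricted to the re-sampled regions.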
[0071] If the temperature distribution in the imaged area at the time of acquisition of the baseline image is known, the temperature-difference map may be added to that baseline temperature in order to obtain the absolute temperature distribution corresponding to the comparative image acquired during treatment. In some embodiments, the baseline temperature is simply uniform body temperature throughout the imaging region. More complicated baseline temperature distributions are, in some embodiments, determined prior to treatment by direct temperature-measurements in various locations in combination with interpolation and/or extrapolation based on a mathematical fit (e.g., a smooth, polynomial fit).
[0072] Thus, in various embodiments of the present invention, the identification of changing image portions performed in step 225 may include the processing of the acquired image information to form a temperature map corresponding to the treatment area. Portions of the treatment area experiencing changes in temperature exceeding a threshold may be identified and be the basis for, at least in part, the partial image information acquired in step 230.
[0073] Tracking and imaging methods in accordance herewith may be implemented using an (otherwise conventional) image-guided treatment system, such as the MRgFUS system 100 depicted in
[0074] The image-processing and control facility of systems in accordance with embodiments of the invention may be implemented in any suitable combination of hardware, software, firmware, or hardwiring.
[0075] The system memory 304 contains instructions, conceptually illustrated as a group of modules, that control the operation of CPU 302 and its interaction with the other hardware components. An operating system 318 directs the execution of low-level, basic system functions such as memory allocation, file management and operation of mass storage devices 306. At a higher level, one or more service applications provide the computational functionality required for image-processing, tracking, and (optionally) thermometry. For example, as illustrated, the system may include an image-reconstruction module 320 for reconstructing real-space images from raw image data received from the imaging apparatus 314 and for constructing full images (e.g., in k-space) from portions of baseline and/or comparative image information in combination with partial image information related to changing portions of the treatment area. The system may also include an image comparison module 322 for measuring similarity and/or dissimilarity between baseline and comparative images (whether raw data such as k-space data or reconstructed images). The image analysis module 324 extracts information (e.g., locational and/or temperature information of the target and/or other object(s) of interest) from the images acquired or reconstructed as described above. In addition, the system may include a beam-adjustment module 326 for computing phase shifts or other parameters of the treatment apparatus to compensate for any detected changes in the treatment area, and a thermal-map module 328 that subtracts baseline from comparative treatment images to obtain a temperature difference map and, if the absolute temperature corresponding to the selected baseline image is known, an absolute-temperature map for the comparative treatment image and/or images constructed using partial images.
The various modules may be programmed in any suitable programming language, including, without limitation, high-level languages such as C, C++, C#, Ada, Basic, Cobra, Fortran, Java, Lisp, Perl, Python, Ruby, or Object Pascal, or low-level assembly languages; in some embodiments, different modules are programmed in different languages.
[0077] The terms and expressions employed herein are used as terms and expressions of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described or portions thereof. In addition, having described certain embodiments of the invention, it will be apparent to those of ordinary skill in the art that other embodiments incorporating the concepts disclosed herein may be used without departing from the spirit and scope of the invention. Accordingly, the described embodiments are to be considered in all respects as only illustrative and not restrictive.