DEEP BRAIN PATH PLANNING USING AUGMENTED REALITY

20230404678 · 2023-12-21

    Inventors

    Cpc classification

    International classification

    Abstract

    Disclosed herein is a medical system (200, 300) comprising an augmented reality system (204) configured for rendering virtual objects within a visual three-dimensional field of view of an operator (214). Execution of machine executable instructions (230) causes the computational system to: receive (400) segmented medical imaging data (100, 232) descriptive of a continuous volume of a subject (210) comprising at least a portion of a brain (102) of the subject and a cranial surface (106) of the subject; control (402) the augmented reality system to determine a registration (234) between the subject and the segmented medical imaging data; receive (404) a target location (104) within the at least a portion of the brain; discretize (406) the cranial surface to define multiple entry locations (108); determine (408) a straight path (110) for each of the multiple entry locations that extends to the target location; assign (410) a path score (238) to the straight path for each of the multiple entry locations using the segmented medical imaging data; calculate (412) a three-dimensional path score surface that is defined by the cranial surface and is descriptive of the path score of each of the multiple entry locations; and render (414) the three-dimensional path score surface using the augmented reality system, wherein the rendered three-dimensional path score surface is positioned in the visual three-dimensional field of view of the operator using the registration.

    Claims

    1. A medical system comprising: a memory storing machine executable instructions; an augmented reality system configured for rendering virtual objects within a visual three-dimensional field of view of an operator; a computational system, wherein execution of the machine executable instructions causes the computational system to: receive segmented medical imaging data descriptive of a continuous volume of a subject comprising at least a portion of a brain of the subject and a cranial surface of the subject; control the augmented reality system to determine a registration between the subject and the segmented medical imaging data; receive a target location within the at least a portion of the brain; discretize the cranial surface to define multiple entry locations; determine a straight path for the respective multiple entry locations that extends from the cranial surface to the target location; assign a path score to the straight path for the respective multiple entry locations using the segmented medical imaging data; calculate a three-dimensional path score surface that is defined by the cranial surface and is descriptive of the path score of the respective multiple entry locations; and render the three-dimensional path score surface on the cranial surface of the subject using the augmented reality system, wherein the rendered three-dimensional path score surface is positioned in the visual three-dimensional field of view of the operator using the registration.

    2. The medical system of claim 1, wherein the cranial surface is defined as the exterior surface of the subject.

    3. The medical system of claim 1, wherein execution of the machine executable instructions further causes the computational system to: determine an optimal path selected using the path score of each of the multiple entry locations, and render a three-dimensional medical implement positioning guide in the visual three-dimensional field of view of an operator, wherein the three-dimensional implement positioning guide is aligned with the optimal path.

    4. The medical system of claim 3, wherein execution of the machine executable instructions further causes the computational system to: control the augmented reality system to detect an implement location descriptive of a three-dimensional location of a medical implement; and render a tool aligned indicator in the visual three-dimensional field of view of the operator if the implement location is aligned with the optimal path.

    5. The medical system of claim 3, wherein the medical implement is any one of the following: a biopsy needle, a deep brain stimulation implantable device, a pulse generator insertion needle, a needle, and a surgical needle.

    6. The medical system of claim 1, wherein the memory further comprises an elastic brain model configured for modifying the segmented medical imaging data in response to a puncture of the skull and/or draining of the cerebrospinal fluid of the subject, wherein the three-dimensional path score surface is corrected using the elastic brain model.

    7. The medical system of claim 1, wherein execution of the machine executable instructions further causes the computational system to: receive intermediate magnetic resonance imaging data descriptive of the continuous volume of a subject; register the intermediate magnetic resonance imaging data to the medical imaging data; and modify the segmented medical imaging data using the registration for a deformable brain model, wherein the modified segmented medical imaging data is used to recalculate the three-dimensional path score surface.

    8. The medical system of claim 1, wherein execution of the machine executable instructions further causes the computational system to: receive a selection from the augmented reality system of the one or more of the multiple entry locations; and construct radiotherapy control commands configured to control a radiotherapy system to irradiate the target location along the selection of the one or more of the multiple entry locations.

    9. The medical system of claim 8, wherein the medical system comprises the radiotherapy system, wherein execution of the machine executable instructions further causes the computational system to control the radiotherapy system with the radiotherapy control commands.

    10. The medical system of claim 1, wherein the segmented medical imaging data comprises any one of the following: segmented T2 weighted magnetic resonance imaging data, segmented magnetic resonance imaging data, segmented computed tomography data, segmented functional magnetic resonance imaging data, magnetic resonance angiography data, and combinations thereof.

    11. The medical system of claim 1, wherein the segmented medical imaging data assigns multiple tissue types three-dimensionally within the continuous volume, wherein the respective multiple tissue types are assigned a numerical damage value, wherein the path score is calculated by multiplying the distance traveled through each of the respective multiple tissue types, determined using the segmented medical imaging data, by its numerical damage value.

    12. The medical system of claim 1, wherein the segmented medical imaging data identifies critical anatomical structures, wherein execution of the machine executable instructions further causes the computational system to exclude any straight path through any of the critical anatomical structures.

    13. The medical system of claim 1, wherein the segmented medical imaging data further comprises functional magnetic resonance imaging data descriptive of critical brain function regions, wherein execution of the machine executable instructions further causes the computational system to perform any one of the following: exclude any straight path through any of the critical brain function regions; wherein each of the critical brain function regions is assigned a numerical brain function damage value, wherein the path score is calculated at least partially by multiplying the distance traveled through each of the critical brain function regions by its numerical brain function damage value; and combinations thereof.

    14. The medical system of claim 1, wherein real-time feedback may be received by the user of the system using AR goggles.

    15. The medical system of claim 14, wherein the feedback may reflect an insertion angle of a needle, wherein changes in the angle of the needle insertion may be expressed as a change of color.

    16. The medical system of claim 14, wherein the feedback may reflect a speed of insertion of the needle, wherein the speed of insertion of the needle may be expressed as a change of color.

    17. A non-transitory computer readable medium comprising machine executable instructions for execution by a computational system controlling a medical system, wherein the medical system comprises an augmented reality system configured for rendering virtual objects within a visual three-dimensional field of view of an operator, wherein the execution of the machine executable instructions causes the computational system to: receive segmented medical imaging data descriptive of a continuous volume of a subject comprising at least a portion of a brain of the subject and a cranial surface of the subject; control the augmented reality system to determine a registration between the subject and the segmented medical imaging data; receive a target location within the at least a portion of the brain; discretize the cranial surface to define multiple entry locations; determine a straight path for each of the multiple entry locations that extends from the cranial surface to the target location; assign a path score to the straight path for each of the multiple entry locations using the segmented medical imaging data; calculate a three-dimensional path score surface that is defined by the cranial surface and is descriptive of the path score of each of the multiple entry locations; and render the three-dimensional path score surface on the cranial surface of the subject using the augmented reality system, wherein the rendered three-dimensional path score surface is positioned in the visual three-dimensional field of view of the operator using the registration.

    18. A method of operating a medical system, wherein the medical system comprises an augmented reality system configured for rendering virtual objects within a visual three-dimensional field of view of an operator, wherein the method comprises: receiving segmented medical imaging data descriptive of a continuous volume of a subject comprising at least a portion of a brain of the subject and a cranial surface of the subject; controlling the augmented reality system to determine a registration between the subject and the segmented medical imaging data; receiving a target location within the at least a portion of the brain; discretizing the cranial surface to define multiple entry locations; determining a straight path for the respective multiple entry locations that extends from the cranial surface to the target location; assigning a path score to the straight path for the respective multiple entry locations using the segmented medical imaging data; calculating a three-dimensional path score surface that is defined by the cranial surface and is descriptive of the path score of the respective multiple entry locations; and rendering the three-dimensional path score surface on the cranial surface of the subject using the augmented reality system, wherein the rendered three-dimensional path score surface is positioned in the visual three-dimensional field of view of the operator using the registration.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0066] In the following, preferred embodiments of the invention will be described, by way of example only, and with reference to the drawings in which:

    [0067] FIG. 1 illustrates how segmented medical imaging data can be used for planning paths to a target location within a brain;

    [0068] FIG. 2 illustrates an example of a medical system;

    [0069] FIG. 3 illustrates a further example of a medical system;

    [0070] FIG. 4 shows a flow chart which illustrates a method of using the medical system of FIG. 2 or FIG. 3;

    [0071] FIG. 5 illustrates an example of a rendering of three-dimensional path score surface;

    [0072] FIG. 6 illustrates a further example of a rendering of three-dimensional path score surface; and

    [0073] FIG. 7 illustrates a further example of a rendering of three-dimensional path score surface.

    DETAILED DESCRIPTION OF EMBODIMENTS

    [0074] Like numbered elements in these figures are either equivalent elements or perform the same function. Elements which have been discussed previously will not necessarily be discussed in later figures if the function is equivalent.

    [0075] FIG. 1 illustrates an example of segmented medical imaging data 100 that can be used for planning a path to a target location 104 within a brain 102. The segmented medical imaging data 100 illustrates the brain 102 of a subject. Within the brain there is a target location 104. On the exterior of the subject is a cranial surface 106. The cranial surface can be divided into a number of discrete locations to define multiple entry locations 108. Between each of the entry locations 108 and the target location 104 there is a straight path 110. Each of these paths can be evaluated by assigning a path score to each of the paths 110.

    [0076] The path score may be calculated in different ways. For example, the segmentation of the image 100 may provide different tissue types, and the distance along the paths 110 within the different tissue types can be used to calculate the path score. Additionally, there can be critical anatomical structures 112, which can for example be used to exclude paths. There may also be data from functional magnetic resonance imaging, which provides information on critical brain function regions 114. For example, if there is a desire to minimize the damage to the subject's speech, a functional magnetic resonance image which maps the speech function in the subject's brain may be used to define regions 114 which should be avoided or given a higher score. The path 116 is an exemplary optimal path. Using an augmented reality system, a three-dimensional medical implement positioning guide 118 could be displayed to the surgeon.

    [0077] FIG. 2 illustrates an example of a medical system 200. The medical system 200 comprises a computer 202 and an augmented reality system 204. The augmented reality system 204 comprises augmented reality goggles 206 and a position registration system 208. In some examples the position registration system 208 may be multiple cameras or sensors for detecting the location of people as well as objects within the room. A subject 210 is shown as reposing on a subject support 212. In FIG. 1 the path scores for the various paths 110 were calculated. This is used to calculate a three-dimensional path score surface 218 which the augmented reality system 204 projects onto the cranial surface 106. The augmented reality system 204 also displays the three-dimensional medical implement positioning guide 118. The operator 214 is able to align the medical implement 216 easily with the three-dimensional medical implement positioning guide 118.

    [0078] The computer 202 is shown as comprising a computational system 220 that is connected to a hardware interface 222, an optional user interface 224, and a memory 226. The hardware interface 222 enables the computational system 220 to communicate and control other components of the medical system 200. The user interface 224 may for example enable the operator to modify and control the functionality of the medical system 200. The memory 226 is intended to be any sort of memory which is accessible to the computational system 220; this may include both volatile and non-volatile types of memory.

    [0079] The memory 226 is shown as containing machine-executable instructions 230. The machine-executable instructions 230 enable the computational system 220 to control the medical system 200 as well as perform various types of data and image processing. The memory 226 is shown as containing segmented medical image data 232. The memory 226 is further shown as containing a registration 234 between the subject 210 and the segmented medical image data 232. The segmented medical image data 232 may for example be similar to what is rendered as item 100 in FIG. 1.

    [0080] The memory 226 is shown as storing the position 236 of the target location 104. The memory 226 is further shown as containing path scores 238 that were calculated for the paths 110 in FIG. 1. The path scores 238 were then used to calculate a three-dimensional path score surface 240, which is then rendered as 218 by the augmented reality system 204. The memory 226 is further shown as optionally containing pulse sequence commands 242 for controlling an optional magnetic resonance imaging system 246. The memory 226 is shown as containing initial k-space data 244 that was acquired by controlling the magnetic resonance imaging system 246 with the optional pulse sequence commands 242. To acquire this data the subject 210 was placed in an imaging zone of the magnetic resonance imaging system 246. The initial k-space data 244 may be reconstructed and then segmented to provide the segmented medical image data 232 or 100.

    [0081] After the subject's 210 skull has been punctured, cerebrospinal fluid may drain. This may cause distortions or changes in the location of the subject's brain 102. After puncturing of the skull, the subject 210 may then be placed into the magnetic resonance imaging system 246 to measure intermediate k-space data 248. The same pulse sequence commands 242, or even lower-resolution pulse sequence commands, may be used. The acquired intermediate k-space data 248 may then be reconstructed into intermediate magnetic resonance imaging data 250. This may then be registered to the segmented medical image data 232, and a deformable brain model 252 may be used to correct the segmented medical image data 232. This corrected segmented medical image data 232 may then optionally be used to recalculate the path scores 238 and then update the three-dimensional path score surface 240.

    [0082] In some examples an elastic brain model 260 is used to predict changes in the position and deformation of the brain 102 after a draining of cerebrospinal fluid. The elastic brain model 260 may be used to predict changes in the subject's brain anatomy and calculate more accurate path scores 238.

    [0083] The rendering of the three-dimensional path score surface 218 may additionally comprise further aids to the operator 214 in properly positioning the medical implement 216. For example, the appearance of the rendering of the three-dimensional path score surface 218 may change. For example, it may change in size or color or brightness to indicate how well the medical implement 216 is aligned with the three-dimensional medical implement positioning guide 118. The augmented reality system 204 may also display printed messages or other symbols to further indicate to the operator 214 how well the medical implement 216 is positioned with respect to the three-dimensional medical implement positioning guide 118.

    [0084] FIG. 3 illustrates a further example of a medical system 300. The example illustrated in FIG. 3 is similar to the medical system 200 in FIG. 2 except that it additionally comprises an optional radiotherapy system 302. In this example, the medical system 300 is not being used to choose paths for mechanically inserting a medical implement 216, but instead is being used to select radiotherapy paths 304 for a radiotherapy treatment. The operator 214 is able to inspect the rendering 218 of the three-dimensional path score surface and select a number of entry points for radiotherapy paths 304 to irradiate the target location 104. The operator 214 selects the radiotherapy paths 304, and the system then generates radiotherapy system control commands 306 for irradiating along these paths 304. The subject 210 can for example be placed into the radiotherapy system 302 and the radiotherapy system 302 can be controlled with the radiotherapy system control commands 306.

    [0085] FIG. 4 shows a flowchart which illustrates a method of operating the medical system 200 in FIG. 2 or the medical system 300 in FIG. 3. First, in step 400, the segmented medical image data 100 or 232 is received. This segmented medical image data 100 describes a continuous volume of the subject 210 including at least a portion of a brain 102 and a cranial surface 106. Next, in step 402, the augmented reality system 204 is controlled to determine the registration 234. In this case, the position registration system 208 would be used. This may for example be done by using mechanical fixtures, optical systems or other sensor systems for determining the position of the subject 210 as well as the medical implement 216.

    [0086] Next, in step 404, the position 236 of the target location 104 is received. Next, in step 406, the cranial surface 106 is divided into discrete locations to provide multiple entry locations 108. Next, in step 408, a straight path 110 is determined for each of the entry locations 108. Then, in step 410, a path score 238 is assigned to each of the straight paths 110 using the segmented medical imaging data 232 or 100. This can for example be done by using the tissue types identified by the segmentation and then calculating the distance travelled through each type of tissue. Next, in step 412, a three-dimensional path score surface 240 is determined. Then, in step 414, the three-dimensional path score surface 240 is rendered 218 using the augmented reality system and positioned to coincide with the cranial surface 106.
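    A minimal sketch of steps 406 through 412 is given below. The hemispherical sampling of entry locations and the purely length-based path score are illustrative assumptions; a real system would discretize the segmented cranial surface 106 and use a damage-based score.

```python
import numpy as np

def discretize_hemisphere(center, radius, n_theta=6, n_phi=12):
    """Step 406 (simplified): sample entry locations on the upper half of a
    sphere approximating the cranial surface."""
    entries = []
    for theta in np.linspace(0.05, np.pi / 2, n_theta):        # polar angle
        for phi in np.linspace(0, 2 * np.pi, n_phi, endpoint=False):
            entries.append(center + radius * np.array([
                np.sin(theta) * np.cos(phi),
                np.sin(theta) * np.sin(phi),
                np.cos(theta)]))
    return entries

def score_surface(entries, target, score_fn):
    """Steps 408-412: one straight path per entry location, one score per
    path; the (entry, score) pairs define the path score surface."""
    return [(entry, score_fn(entry, target)) for entry in entries]

# Hypothetical score: the straight-path length stands in for a real
# damage-based path score.
center = np.array([0.0, 0.0, 0.0])
target = np.array([0.0, 0.0, -2.0])        # target location inside the head
entries = discretize_hemisphere(center, radius=8.0)
surface = score_surface(entries, target,
                        lambda e, t: float(np.linalg.norm(e - t)))
best_entry, best_score = min(surface, key=lambda item: item[1])
```

    The lowest-scoring entry on the surface corresponds to the optimal path 116; in a real system the surface would then be color-coded and rendered via the registration 234.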

    [0087] Examples may relate to systems and methods for planning, navigation, and simulation for the selection of the least-damaging straight approach route to a target deep in the brain with the help of Augmented Reality (AR).

    [0088] In some examples color or greyscale may be projected onto the skull and used to indicate a sum score of damage factors (the rendering 218 of the three-dimensional path score surface 240). For example, a gradual red/green coloring indicating the sum score of damage factors (path scores 238) may be projected on the skull via AR. This provides an indicator (the three-dimensional medical implement positioning guide) for the best pathway through the brain to the deep brain target (target location 104). This way, the surgeon or operator is informed about the amount of vulnerable tissue (for instance blood vessels) that will be hit on the straight route starting from that position on the skull to the target deep in the brain. The AR goggle system may show the optimal entry point with, for example, a simple light beam (three-dimensional medical implement positioning guide 118) pointing straight to the target, and assist the surgeon directly with the correct insertion angles. This indication is initially based on pre-registered MR images (segmented medical imaging data 100, 232). With special tissue deformation software (deformable brain model 252) the effects of making a hole in the skull and penetrating the brain with a needle are estimated, and the visualization is adapted accordingly. These estimations can be checked with real-time MR images, based upon 3D structural brain information from pre-registered MR images or upon real-time adapted tissue deformation after making a hole in the skull and penetrating the brain tissue. In this way the surgeon no longer needs a stereotactic frame for setting the correct insertion angles.

    [0089] In daily practice the neurosurgeon plans an approach path for a needle (or a sharp medical device in general) with the help of several MR scans. They then have to check the path in reality with small stiff needles so that they do not damage elastic vessels. Other brain tissue is too soft to give feedback during the insertion of the needle. In other words, the neurosurgeon is blind to the kind of tissue the needle will penetrate during its travel through the brain on its way to the target area. This target area can be the location of the deep brain stimulation electrode position or a brain tumor, which needs extraction, puncture, or visual inspection. Currently a frame, which is fixed to the head of the patient, is used to correctly set the insertion angles for the probing needle. This frame may not be needed anymore in case an AR goggle is used to show the insertion angles.

    [0090] Examples may use one or more pre-registered MR scans (segmented medical imaging data 100, 232) of the brain to investigate possibly multiple straight approach paths to the target area that is indicated by the neurosurgeon. A possible first step is to label all voxels in the MR image based upon tissue type (like brain fluid, blood vessel, white or grey matter) using the segmentation. The neurosurgeon can indicate a damage factor (a bigger number means more damage) for each tissue type (for instance brain fluid 0, white matter 5, grey matter 10 and blood vessel 25). The application may sum up all damage factors of all voxels that will be hit by one single straight path and translate the sum score (path scores 238) into a colored dot on the skull (rendering 218 of the three-dimensional path score surface 240) of the patient at the starting point (cf. FIG. 5 below: darker means less damage in the path to the target).
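    The voxel sum described above can be sketched as follows. The damage factors follow the example in the text (brain fluid 0, white matter 5, grey matter 10, blood vessel 25), while the toy volume, the label encoding, and the sampling step are assumptions for illustration only.

```python
import numpy as np

# Hypothetical label encoding: 0 = brain fluid, 1 = white matter,
# 2 = grey matter, 3 = blood vessel; damage factors as in the text.
DAMAGE = {0: 0.0, 1: 5.0, 2: 10.0, 3: 25.0}

def path_damage_sum(labels, entry, target, step=0.5):
    """Sum the damage factors of all voxels hit by the straight path from
    `entry` to `target` (both given in voxel coordinates)."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    length = np.linalg.norm(target - entry)
    n = max(int(length / step), 1)
    hit = set()                       # count each voxel only once
    for t in np.linspace(0.0, 1.0, n + 1):
        p = entry + t * (target - entry)
        hit.add(tuple(np.round(p).astype(int)))
    return sum(DAMAGE[int(labels[v])] for v in hit)

# Toy 8x8x8 volume of white matter (label 1) with a grey-matter core.
vol = np.ones((8, 8, 8), dtype=int)
vol[3:5, 3:5, 3:5] = 2
score = path_damage_sum(vol, entry=(0, 4, 4), target=(4, 4, 4))
```

    The resulting sum score per entry point would then be mapped to a color (darker meaning less damage) and projected on the skull as the rendering 218.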

    [0091] As an example, AR goggles may be used to receive feedback on the insertion angle. The feedback is preferably in real-time. Color-coding may be utilized to present the real-time feedback to the user of the AR goggles. Real-time feedback of the angle of the needle can for instance be expressed by a change of color of the entry point of the needle displayed through the AR goggles. As an example, three colors may be utilized to mark the change in position. When the needle is inserted at the correct angle, the color of the entry point is green, but when it moves outside an accepted range the entry point turns yellow, and even red when it deviates too much. The angle of the needle is monitored by means of the goggles, which calculate the angle in real-time.
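    A sketch of the three-color angle feedback might look like this; the 2° and 5° tolerance thresholds are hypothetical, since the text only specifies that green, yellow, and red mark increasing deviation.

```python
import numpy as np

# Hypothetical tolerance thresholds in degrees.
GREEN_MAX, YELLOW_MAX = 2.0, 5.0

def needle_angle_color(needle_dir, planned_dir):
    """Map the angular deviation between the tracked needle direction and
    the planned path direction to the three-color entry-point indicator."""
    a = np.asarray(needle_dir, float)
    b = np.asarray(planned_dir, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    deg = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    if deg <= GREEN_MAX:
        return "green"
    return "yellow" if deg <= YELLOW_MAX else "red"

# A needle tilted roughly 2.9 degrees off the planned direction.
color = needle_angle_color([0.0, 0.05, 1.0], [0.0, 0.0, 1.0])
```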

    [0092] FIG. 5 shows an exemplary rendering of the three-dimensional path score surface 218. The three-dimensional medical implement positioning guide 118, also referred to as the insertion angle indicator, is displayed. On this surface, the darker the region, the less damage a path from the cranial surface to the target location 104 causes.

    [0093] FIG. 6 shows a different view of the plot shown in FIG. 5. In this case, we are able to see within the three-dimensional model and we can see how the three-dimensional medical implement positioning guide 118 or the optimal path 116 reaches the target location 104.

    [0094] The rendering of the three-dimensional path score surface makes it easy for the neurosurgeon to select an approach path in the middle of a sweet spot (like the dark green area on top of the skull) that can tolerate small brain shifts, which will happen after a small hole is drilled into the skull to insert the needle(s) and a small fraction of brain fluid escapes. In other words, the neurosurgeon has a quick overview of the damage of each approach path and can try to minimize the brain damage by adjusting the damage factors or by repositioning the target area. The AR goggle will show the optimal insertion orientation or angle as a light beam or positioning guide. The surgeon can use this beam for positioning the needle without using the frame, which will result in a quicker procedure to reach the target. The light beam can also show how far the needle needs to be inserted to reach exactly the target location.

    [0095] As an example, AR goggles may be used to receive feedback on the speed of the needle insertion. The feedback is preferably in real-time. Color-coding may be utilized to present the real-time feedback to the user of the AR goggles. Real-time feedback of the speed of the needle can for instance be expressed by a change of color based on predetermined speed ranges for the needle insertion, depending on e.g. potential risk to the patient. As an example, three colors may be utilized to mark the speed of insertion. For instance, when the needle is at an acceptable speed that will hardly damage the brain, the color of the entry point is green, but when it moves outside an accepted speed range the entry point turns yellow, and even red when the speed is far outside the acceptable range. The speed of the needle is monitored by means of the goggles, which calculate the speed in real-time.
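    The speed feedback can be sketched similarly; the finite-difference speed estimate and the speed thresholds (here in mm/s) are assumptions, since the text only states that the goggles calculate the speed in real-time and map it to three colors.

```python
import numpy as np

# Hypothetical acceptable-speed thresholds in mm/s.
OK_MAX, WARN_MAX = 1.0, 2.0

def insertion_speed(positions, timestamps):
    """Estimate the needle-tip speed by a finite difference over the last
    two tracked samples from the goggles' tracking system."""
    p = np.asarray(positions, float)
    t = np.asarray(timestamps, float)
    return float(np.linalg.norm(p[-1] - p[-2]) / (t[-1] - t[-2]))

def speed_color(v):
    """Map the estimated speed to the three-color entry-point indicator."""
    if v <= OK_MAX:
        return "green"
    return "yellow" if v <= WARN_MAX else "red"

# Tip advanced 0.05 mm in 0.1 s, i.e. 0.5 mm/s: an acceptable speed.
v = insertion_speed([[0, 0, 0], [0, 0, 0.05]], [0.0, 0.1])
```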

    [0096] Examples may use voxel labelling in 3D space and damage categories to estimate the damage of each straight path between scalp entry point and target voxel. By labelling areas in the brain (like motor or visual cortex, speech area, . . . ), an extra damage factor per area can be added.

    [0097] Blood vessels which are smaller than MR voxels can still be labelled as blood vessels and will thereby not be overlooked in the damage overview.

    [0098] Examples may enable neurosurgeons to make better, more robust, and quicker brain surgery plans with as little brain damage as possible, with the capability to avoid important brain regions. The AR goggle is very easy and intuitive to use and can avoid the use of a frame.

    [0099] In one example a sum of damage factors is produced, based upon pre-registered structural MR images and all voxels on one single straight path. This sum score of damage factors is then translated into a colored dot projected artificially on the skull of the patient. FIG. 7 shows the sum score and the optimal path mapped into the real world through the goggle. Because the goggle maps the path into the real world, it is very easy for the surgeon to put the needle in the optimal direction.

    [0100] FIG. 7 illustrates the rendering 218 of the three-dimensional path score surface and the three-dimensional medical implement positioning guide 118 in the real world using an augmented reality display 700 to display the surface 218 on the cranial surface of the subject.

    [0101] In an example, the sum score of damage factors (in fact an indicator for the best pathway through the brain to the deep brain target) is determined continuously in real-time based upon the real-time information available from the MR scanning. An advantage of determining the sum score of damage factors in real-time is that the score can adapt to a change in angle and to a changing brain structure. It is known that both making a hole in the skull and penetration of a medical device into the brain may lead to a changing structure of the sulci of the brain. This real-time method is therefore an adaptive method.

    [0102] Real-time MR imaging can be avoided with special software (deformable brain model 252), which estimates the deformation of the brain tissue after making a hole in the skull or insertion of a needle. The software starts with a pre-recorded MR scan and makes a segmentation to identify different brain tissues (like grey and white matter, fluid, or blood vessels). Each tissue has its own mechanical properties, which enables estimation of the tissue displacement and/or deformation caused by surgeon actions like making a hole in the skull.

    [0103] When the target (target location 104) is selected in the middle of a brain tumor the optimal path can be used for a quick and easy tissue puncture of the tumor.

    [0104] For a brain tumor extraction, the AR system or goggle can show the drilling plan step by step with the light beam, which is adapted to the tissue changes after each extraction.

    [0105] In another example, the structural information will be complemented by functional MRI information. fMRI provides information on functional areas such as speech area, or visual areas.

    [0106] In another example the invention will be applied for a radiation beam.

    [0107] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.

    [0108] Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word comprising does not exclude other elements or steps, and the indefinite article a or an does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

    REFERENCE SIGNS LIST

    [0109] 100 segmented medical imaging data of subject [0110] 102 brain [0111] 104 target location [0112] 106 cranial surface [0113] 108 entry location [0114] 110 straight path [0115] 112 critical anatomical structure [0116] 114 critical brain function region [0117] 116 optimal path [0118] 118 three-dimensional medical implement positioning guide [0119] 200 medical system [0120] 202 computer [0121] 204 augmented reality system [0122] 206 augmented reality goggles [0123] 208 position registration system [0124] 210 subject [0125] 212 subject support [0126] 214 operator [0127] 216 medical implement [0128] 218 rendering of three dimensional path score surface [0129] 220 computational system [0130] 222 hardware interface [0131] 224 user interface [0132] 226 memory [0133] 230 machine executable instructions [0134] 232 segmented medical image data [0135] 234 registration [0136] 236 position of target location [0137] 238 path scores [0138] 240 three-dimensional path score surface [0139] 242 pulse sequence commands [0140] 244 initial k-space data [0141] 246 MRI system [0142] 248 intermediate k-space data [0143] 250 intermediate magnetic resonance imaging data [0144] 252 deformable brain model [0145] 260 elastic brain model [0146] 300 medical system [0147] 302 radiotherapy system [0148] 304 radiotherapy path [0149] 306 radiotherapy system control commands [0150] 700 augmented reality display