Method and system of providing visual information about a location and shape of a tumour under a body surface of a human or animal body
11523869 · 2022-12-13
Assignee
Inventors
- Jasper Albertus Nijkamp (Amsterdam, NL)
- Koert Frans Dirk Kuhlmann (Amsterdam, NL)
- Jan-Jakob Sonke (Amsterdam, NL)
- Theodoor Jacques Marie Ruers (Amsterdam, NL)
CPC classification
- A61B2562/0238
- A61B5/0077
- A61B34/20
- A61B6/5217
- A61B5/065
- A61B2090/364
- A61B6/12
International classification
- A61B34/20
- A61B90/00
- A61B6/12
- A61B5/00
- A61B5/06
Abstract
In a method and system for providing visual information about a tumour location in a human or animal body, an electromagnetic tumour sensor is provided in the tumour and tracked to determine its location in space, which is mapped to a tumour model. A surgical tool sensor is provided on a surgical tool, and tracked to determine its location in space, which is mapped to a surgical tool model. The body is scanned to obtain information about an anatomical structure. A reference sensor is provided on the body, and tracked to determine its location in space, which is mapped to the anatomical structure. A virtual image is displayed showing the tumour model, located with the at least one tumour sensor, in spatial relationship to the surgical tool model, located with the at least one surgical tool sensor, and the anatomical structure, located with the at least one reference sensor.
Claims
1. A system for providing visual information about a location and shape of a tumour under a body surface of a human or animal body, the system comprising: at least one electromagnetic tumour sensor; a tumour sensor tracking system, wherein each tumour sensor is configured to interact with the tumour sensor tracking system to determine an individual location of the tumour sensor in space; a tumour model mapping component configured for mapping each tumour sensor location to a digital volumetric tumour model, wherein the tumour model comprises one or more volumes shaped to represent the shape of the tumour; a surgical tool provided with at least one surgical tool sensor; a surgical tool sensor tracking system, wherein each surgical tool sensor is configured to interact with the surgical tool sensor tracking system to determine an individual location of the surgical tool sensor in space, wherein the surgical tool sensor tracking system is calibrated to provide locations in a first coordinate system, and wherein the tumour sensor tracking system is calibrated to provide locations in a second coordinate system; a surgical tool mapping component configured for mapping each surgical tool sensor location to a digital surgical tool model representative of a shape of the surgical tool; an anatomical structure information input component for receiving information about at least one anatomical structure in the human or animal body different from the tumour; at least one reference sensor; a reference sensor tracking system, wherein each reference sensor is configured to interact with the reference sensor tracking system to determine an individual location of the reference sensor in space, wherein the reference sensor tracking system is calibrated to provide locations in a third coordinate system; an anatomical structure mapping component configured for mapping a position on the surface of the human or animal body determined by each reference sensor location to the at least one anatomical structure in the human or animal body; a tumour sensor locating component configured for determining the location of each tumour sensor in space; a surgical tool sensor locating component configured for determining the location of each surgical tool sensor in space; a reference sensor locating component configured for determining the location of each reference sensor in space; a display device; and a displaying component configured for displaying on the display device, based on the first and second coordinate systems, a virtual image assembled to show the digital volumetric tumour model, comprising said one or more volumes shaped to represent the shape of the tumour, as located with the at least one tumour sensor, in a spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor, wherein the displaying component is configured for assembling, based on the third coordinate system, the virtual image further to show the at least one anatomical structure, as located with the at least one reference sensor, in a spatial relationship to the digital volumetric tumour model, comprising said one or more volumes shaped to represent the shape of the tumour, as located with the at least one tumour sensor, and in a spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor.
2. The system according to claim 1, wherein the at least one tumour sensor is configured to communicate wirelessly or wired with the tumour sensor tracking system.
3. The system according to claim 1, wherein the at least one surgical tool sensor is configured to communicate wirelessly or wired with the surgical tool sensor tracking system.
4. The system according to claim 1, wherein the at least one reference sensor is configured to communicate wirelessly or wired with the reference sensor tracking system.
5. The system according to claim 1, wherein the at least one surgical tool sensor is an electromagnetic surgical tool sensor, and wherein the surgical tool sensor tracking system is the same as the tumour sensor tracking system.
6. The system according to claim 1, wherein the at least one surgical tool sensor is an optical surgical tool sensor, and wherein the surgical tool sensor tracking system is an optical tracking system comprising at least one camera.
7. The system according to claim 1, wherein the at least one reference sensor is an electromagnetic reference sensor, and wherein the reference sensor tracking system is the same as the tumour sensor tracking system.
8. The system according to claim 1, wherein the at least one reference sensor is an optical reference sensor, and wherein the reference sensor tracking system is an optical tracking system comprising at least one camera.
9. The system according to claim 1, further comprising a model updating component configured to receive data of images and update the tumour model or the anatomical structure based on the images obtained.
10. The system according to claim 1, wherein the at least one tumour sensor further is configured to interact with the tumour sensor tracking system to determine an individual orientation of the tumour sensor in space.
11. The system according to claim 1, wherein the at least one surgical tool sensor further is configured to interact with the surgical tool sensor tracking system to determine an individual orientation of the surgical tool sensor in space.
12. The system according to claim 1, wherein the at least one reference sensor further is configured to interact with the reference sensor tracking system to determine an individual orientation of the reference sensor in space.
13. The system according to claim 1, wherein the displaying component is configured to assemble the virtual image further to show a scale of length, or a scaled distance between the tumour model and the surgical tool model.
14. The system according to claim 1, wherein the displaying component is configured to assemble the virtual image further to show a scale of length, or a scaled distance between the tumour model and the anatomical structure.
15. The system according to claim 1, wherein the displaying component is configured to assemble the virtual image further to show a scale of length, or a scaled distance between the surgical tool model and the anatomical structure.
16. The system according to claim 1, wherein the one or more volumes of the tumour model are shaped according to a contour of the tumour.
17. The system according to claim 1, wherein the tumour model further comprises a body tissue layer around the shape of the tumour.
18. A method of providing visual information about a location of a tumour under a body surface of a human or animal body with the system of claim 1, the method comprising: interacting, by the tumour sensor tracking system, with the at least one electromagnetic tumour sensor to determine the individual location of the tumour sensor in space; determining the digital volumetric tumour model representative of the shape of the tumour; mapping each tumour sensor location to the tumour model; providing the at least one surgical tool provided with the at least one surgical tool sensor, the at least one surgical tool sensor interacting with the surgical tool sensor tracking system to determine the individual location of the surgical tool sensor in space; determining the digital surgical tool model representative of the shape of the surgical tool; mapping each surgical tool sensor location to the surgical tool model; scanning the human or animal body to obtain information about at least one anatomical structure in the human or animal body different from the tumour; interacting, by the reference sensor tracking system, with the at least one reference sensor located on the surface of the human or animal body to determine the individual location of the reference sensor in space; mapping each reference sensor location to at least one anatomical structure in the human or animal body; determining the location of each tumour sensor in space; determining the location of each surgical tool sensor in space; determining the location of each reference sensor in space; assembling, based on the first and second coordinate systems, the virtual image showing the digital volumetric tumour model representative of the shape of the tumour, as located with the at least one tumour sensor, in the spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor; and assembling, based on the third coordinate system, the virtual image further to show the at least one anatomical structure, as located with the at least one reference sensor, in the spatial relationship to the digital volumetric tumour model representative of the shape of the tumour, as located with the at least one tumour sensor, and in the spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor.
19. The method according to claim 18, wherein the at least one surgical tool sensor is an electromagnetic surgical tool sensor, and wherein the surgical tool sensor tracking system is the same as the tumour sensor tracking system; or wherein the at least one surgical tool sensor is an optical surgical tool sensor, and wherein the surgical tool sensor tracking system is an optical tracking system comprising at least one camera.
20. The method according to claim 18, wherein the at least one reference sensor is an electromagnetic reference sensor, and wherein the reference sensor tracking system is the same as the tumour sensor tracking system; or wherein the at least one reference sensor is an optical reference sensor, and wherein the reference sensor tracking system is an optical tracking system comprising at least one camera.
21. A non-transitory computer readable medium for providing visual information about a location and shape of a tumour under a body surface of a human or animal body comprising instructions which, when loaded in a processor, cause the processor to operate: a tumour sensor tracking system configured to interact with one or more electromagnetic tumour sensors to determine an individual location of each of the tumour sensors in space; a tumour model mapping component to map each tumour sensor location to a digital volumetric tumour model, wherein the tumour model comprises one or more volumes shaped to represent the shape of the tumour; a surgical tool sensor tracking system configured to interact with at least one surgical tool sensor of a provided surgical tool to determine an individual location of the surgical tool sensor in space, wherein the surgical tool sensor tracking system is calibrated to provide locations in a first coordinate system, and wherein the tumour sensor tracking system is calibrated to provide locations in a second coordinate system; a surgical tool mapping component to map each surgical tool sensor location to a digital surgical tool model representative of a shape of the surgical tool; an anatomical structure information input component to receive information about at least one anatomical structure in the human or animal body different from the tumour; a reference sensor tracking system configured to interact with one or more reference sensors to determine an individual location of each reference sensor in space, wherein the reference sensor tracking system is calibrated to provide locations in a third coordinate system; an anatomical structure mapping component to map a position on the surface of the human or animal body determined by each reference sensor location to the at least one anatomical structure in the human or animal body; a tumour sensor locating component to determine the location of each tumour sensor in space; a surgical tool sensor locating component to determine the location of each surgical tool sensor in space; a reference sensor locating component to determine the location of each reference sensor in space; and a displaying component to display on a display device, based on the first and second coordinate systems, a virtual image assembled to show the digital volumetric tumour model, comprising said one or more volumes shaped to represent the shape of the tumour, as located with the at least one tumour sensor, in a spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor, wherein the displaying component is configured for assembling, based on the third coordinate system, the virtual image further to show the at least one anatomical structure, as located with the at least one reference sensor, in a spatial relationship to the digital volumetric tumour model, comprising said one or more volumes shaped to represent the shape of the tumour, as located with the at least one tumour sensor, and in a spatial relationship to the digital surgical tool model representative of the shape of the surgical tool, as located with the at least one surgical tool sensor.
Description
DETAILED DESCRIPTION OF EMBODIMENTS
(10) Herein, a component is taken to refer to software (a computer program), or a combination of software and hardware, for performing a specific function. The software comprises computer instructions to cause a processor to perform the function by data processing. The software may be stored in a non-transitory storage medium or memory.
(11) According to step 100, information about a digital volumetric tumour model is received, e.g. from a database.
(12) According to step 101, at least one electromagnetic tumour sensor is provided. Previously, the tumour sensor has been placed under a body surface of a human or animal body in the tumour, or in the vicinity of the tumour, or in close proximity to the tumour, to have a defined spatial relationship with the tumour. The process of providing the tumour sensor is not part of the present invention, and may be done by a surgeon or other physician during an invasive operation.
(13) Each tumour sensor is configured to interact with a tumour sensor tracking system, as will be explained in more detail below, to determine an individual location of the tumour sensor in space.
(14) According to step 102, following step 101, each tumour sensor location is mapped to a digital volumetric tumour model representative of the tumour. The tumour model comprises one or more volumes shaped to optimally represent the particular tumour, possibly including a further body tissue layer around the tumour. The mapping of a tumour sensor location entails defining a one-to-one relationship between a location on or in the tumour model and the specific tumour sensor, based on information of the actual location of the specific tumour sensor on the actual tumour.
(15) According to step 103, following step 102, the location of each tumour sensor in space is determined, with the tumour sensor tracking system. This tumour sensor location in space provides the spatial location of a part of the tumour where the tumour sensor is located, and at the same time defines a virtual location of a part of the tumour model in an image to be displayed.
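By way of a non-limiting sketch of steps 102 and 103 (the description does not prescribe any particular algorithm; the function, variable names and coordinate values below are invented for illustration), the pose of the tumour model in tracking space can be estimated from three or more mapped tumour sensor locations with a standard least-squares rigid (Kabsch) fit:

```python
import numpy as np

def rigid_fit(model_pts: np.ndarray, world_pts: np.ndarray):
    """Least-squares rigid transform mapping model_pts onto world_pts.

    model_pts, world_pts: (N, 3) corresponding points, N >= 3.
    Returns (R, t) such that world ~= R @ model + t.
    """
    cm, cw = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - cm).T @ (world_pts - cw)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, cw - R @ cm

# Anchor points of three tumour sensors in model coordinates (from step 102).
anchors_model = np.array([[0.0, 0.0, 0.0], [12.0, 0.0, 3.0], [4.0, 9.0, -2.0]])
# Corresponding locations reported by the tumour sensor tracking system (step 103).
tracked_world = np.array([[105.2, 40.1, 22.0], [116.9, 41.0, 25.4], [108.7, 49.3, 20.3]])

R, t = rigid_fit(anchors_model, tracked_world)
vertex_world = R @ np.array([5.0, 5.0, 0.0]) + t   # place any model vertex in space
```

With fewer than three tumour sensors, orientation information from the sensors themselves (cf. claim 10) can substitute for the missing point correspondences.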
(16) According to step 110, information about a digital surgical tool model is received, e.g. from a database.
(17) According to step 111, at least one surgical tool is provided with at least one surgical tool sensor arranged on or in the surgical tool. Each surgical tool sensor is configured to interact with a surgical tool sensor tracking system, as will be explained in more detail below, to determine an individual location of the surgical tool sensor in space.
(18) According to step 112, following step 111, each surgical tool sensor location is mapped to a digital surgical tool model representative of the surgical tool. The surgical tool model comprises a line (a one-dimensional tool model), a plane having a defined edge shape (a two-dimensional tool model), and/or one or more volumes (a three-dimensional tool model) shaped to optimally represent the particular surgical tool. The mapping of a surgical tool sensor location entails defining a one-to-one relationship between a location on or in the surgical tool model and the specific surgical tool sensor, based on information of the actual location of the specific surgical tool sensor on the actual surgical tool.
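The one-, two- and three-dimensional tool models of paragraph (18) can be held in a small tagged structure; the following is a minimal Python sketch with invented names, not an interface defined by the description:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class SurgicalToolModel:
    dimensionality: int                     # 1 = line, 2 = plane with a defined
                                            # edge shape, 3 = one or more volumes
    geometry_mm: np.ndarray                 # (N, 3) polyline points, outline
                                            # vertices, or mesh vertices
    sensor_anchors_mm: dict = field(default_factory=dict)  # sensor id ->
                                            # anchor point in the model frame

# A needle modelled as a line from hub to tip, with one sensor at the hub.
needle = SurgicalToolModel(
    dimensionality=1,
    geometry_mm=np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 150.0]]),
    sensor_anchors_mm={"tool-1": np.array([0.0, 0.0, 0.0])},
)
```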
(19) According to step 113, following step 112, the location of each surgical tool sensor in space is determined, with the surgical tool sensor tracking system. This surgical tool sensor location in space provides the spatial location of a part of the surgical tool where the surgical tool sensor is located, and at the same time defines a virtual location of a part of the surgical tool model in an image to be displayed.
(20) According to step 121, information about at least one anatomical structure in the human or animal body in a vicinity of the tumour is received, e.g. from a database, or from a scanning device.
(21) According to step 122, at least one reference sensor is provided on the body surface of the human or animal body. Each reference sensor is configured to interact with a reference sensor tracking system, as will be explained in more detail below, to determine an individual location of the reference sensor in space.
(22) According to step 123, following step 122, each reference sensor location is mapped to the at least one anatomical structure in the human or animal body. The anatomical structure comprises one or more volumes shaped to optimally represent the particular anatomical structure. The mapping of a reference sensor location entails defining a one-to-one relationship between a location on or in the anatomical structure and the specific reference sensor, based on information of the actual location of the specific reference sensor in relation to the actual anatomical structure.
(23) According to step 124, following step 123, the location of each reference sensor in space is determined, with the reference sensor tracking system. This reference sensor location in space provides the spatial location of a part of the anatomical structure where the reference sensor is located, and at the same time defines a virtual location of a part of the anatomical structure in an image to be displayed.
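After steps 123 and 124, the quality of the mapping can be checked numerically; a common sanity check (assumed here for illustration, not required by the description) is the root-mean-square fiducial registration error between the reference sensor positions predicted from the scan-to-tracker registration and the positions actually tracked:

```python
import numpy as np

def fiducial_registration_error(predicted: np.ndarray, tracked: np.ndarray) -> float:
    """RMS distance (mm) between predicted and tracked reference sensor positions."""
    return float(np.sqrt(np.mean(np.sum((predicted - tracked) ** 2, axis=1))))

# Illustrative values only: three reference sensors, sub-millimetre residuals.
predicted = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 60.0, 0.0]])
tracked = predicted + np.array([[0.3, -0.2, 0.1], [-0.1, 0.4, 0.0], [0.2, 0.1, -0.3]])
print(f"FRE = {fiducial_registration_error(predicted, tracked):.2f} mm")
```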
(24) The surgical tool sensor tracking system, the tumour sensor tracking system, and the reference sensor tracking system are calibrated to provide locations in a first, second and third coordinate system, respectively.
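One way to combine the first, second and third coordinate systems for display is via calibration transforms into a common display frame; the 4x4 homogeneous-matrix representation and the identity placeholder values below are assumptions of this sketch, not details given in the description:

```python
import numpy as np

def homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def to_display(T_display_from_tracker: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Map a 3-D point from a tracking coordinate system into the display frame."""
    return (T_display_from_tracker @ np.append(p, 1.0))[:3]

# Calibration transforms into a common display frame (identity placeholders).
display_from_first = homogeneous(np.eye(3), np.zeros(3))    # tool sensor tracker
display_from_second = homogeneous(np.eye(3), np.zeros(3))   # tumour sensor tracker
display_from_third = homogeneous(np.eye(3), np.zeros(3))    # reference sensor tracker

tool_tip_display = to_display(display_from_first, np.array([10.0, 0.0, 5.0]))
```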
(25) According to step 130, a virtual image, based on said coordinate systems, is assembled showing the at least one anatomical structure, as located with the at least one reference sensor, in a spatial relationship to the tumour model, as located with the at least one tumour sensor, and/or in a spatial relationship to the surgical tool model, as located with the at least one surgical tool sensor; i.e. data are processed to provide image data to construct the virtual image.
(26) According to step 140, the virtual image may be displayed on a display device to aid or guide a treating physician, e.g. a surgeon, in performing a tumour operation or other local treatment on the human or animal body of a patient. The virtual image may further be assembled to show a scale of length, or a scaled distance between the tumour model and the surgical tool model, a scaled distance between the tumour model and the anatomical structure, and/or a scaled distance between the tool model and the anatomical structure.
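The scaled distances of paragraph (26) follow directly once all models are expressed in a common frame; the vertex-based sketch below assumes, for this example only, that the tumour model is available as a vertex set:

```python
import numpy as np

def min_distance_mm(point: np.ndarray, vertices: np.ndarray) -> float:
    """Shortest distance from `point` to any model vertex, a vertex-based
    approximation of the point-to-surface distance."""
    return float(np.min(np.linalg.norm(vertices - point, axis=1)))

# Tool tip and tumour model vertices, both already in the display frame.
tool_tip = np.array([112.0, 45.0, 23.0])
tumour_vertices = np.array([[110.0, 44.0, 22.0],
                            [114.0, 46.0, 24.0],
                            [111.0, 48.0, 21.0]])
print(f"tool-to-tumour distance: {min_distance_mm(tool_tip, tumour_vertices):.1f} mm")
```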
(27) Herein, a spatial relationship between two elements is understood as a position of the elements relative to each other in space. In some embodiments, a spatial relationship may further involve an orientation of the elements relative to each other in space.
(28) In some applications, only a spatial relationship between the tumour and the surgical tool needs to be visualized on a display device using the tumour model and the surgical tool model, and steps 121 to 124 may be omitted. In other applications, only a spatial relationship between the tumour and the anatomical structure needs to be visualized on a display device using the tumour model and the anatomical structure, and steps 111 to 113 may be omitted. In still other applications spatial relationships between the tumour, the surgical tool and the anatomical structure need to be visualized on a display device using the tumour model, the surgical tool model and the anatomical structure.
(29) Steps 101, 102, 111, 112, 121, 122 and 123 may be performed pre-treatment or pre-operatively, while steps 103, 113, 124 and 130 may be performed during treatment or intra-operatively. In some applications, steps 121 and 123 may also be performed during treatment or intra-operatively, in particular when the body of the patient may be expected to move during treatment or intra-operatively.
(31) In an implementation of the method or system according to FIG. 2, navigation software 202 processes data from a plurality of data sources, as follows.
(32) A first data source for the navigation software 202 is a picture archiving system 204 containing image data of pre-operative images 203. The picture archiving system 204 receives image data representing pre-operative images 203 of anatomical structures from one or more imaging systems 214, such as a computed tomography, CT, imaging system 216, a magnetic resonance imaging, MRI, system 218, a positron emission tomography, PET, system 220, or any other appropriate system.
(33) A second data source for the navigation software 202 is an image segmentation system 206 comprising software providing image segmentation data 208 representing a tumour model, i.e. tumour model data, an anatomy model for a specific patient, i.e. anatomical structure data, and/or a surgical tool model, i.e. surgical tool model data.
(34) A third data source for the navigation software 202 is a sensor tracking system 210, i.e. one or more of the tumour sensor tracking system, the surgical tool sensor tracking system, and the reference sensor tracking system, each tracking system providing sensor location data 212 of its respective sensor(s). The tumour sensor tracking system comprises a tumour sensor locating component configured for determining the location of each tumour sensor in space. The surgical tool sensor tracking system comprises a surgical tool sensor locating component configured for determining the location of each surgical tool sensor in space. The reference sensor tracking system comprises a reference sensor locating component configured for determining the location of each reference sensor in space.
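The sensor location data 212 passed from a tracking system to the navigation software 202 can be pictured as a stream of small records; the following dataclass is purely illustrative (all field names are invented, and the actual data format is not disclosed):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class SensorKind(Enum):
    TUMOUR = "tumour"            # electromagnetic only, see paragraph (35)
    SURGICAL_TOOL = "tool"       # electromagnetic or optical
    REFERENCE = "reference"      # electromagnetic or optical

@dataclass
class SensorSample:
    sensor_id: str               # individualized signal -> unique identifier
    kind: SensorKind
    position_mm: Tuple[float, float, float]  # location in the tracker's own
                                             # coordinate system
    orientation_quat: Optional[Tuple[float, float, float, float]]  # optional,
                                             # for orientation-capable sensors
    timestamp_s: float           # acquisition time, for real-time display
```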
(35) It is noted here that a tumour sensor is an electromagnetic tumour sensor. A surgical tool sensor may be an electromagnetic surgical tool sensor or an optical surgical tool sensor. A reference sensor may be an electromagnetic reference sensor or an optical reference sensor.
(36) An electromagnetic sensor may be a passive electromagnetic sensor, accommodating no power source and providing an individualized electromagnetic locating signal to a tracking system when interrogated by the tracking system, or an active electromagnetic sensor, accommodating a power source and actively providing an individualized electromagnetic locating signal to a tracking system without being interrogated by the tracking system.
(37) An optical sensor may be a passive optical sensor, accommodating no power source and providing an individualized optical locating signal to a tracking system when receiving light, or an active optical sensor, accommodating a power source and actively providing an individualized optical locating signal irrespective of whether light is received.
(38) If the at least one surgical tool sensor is an electromagnetic surgical tool sensor, the surgical tool sensor tracking system may be the same as the tumour sensor tracking system. If the at least one surgical tool sensor is an optical surgical tool sensor, the surgical tool sensor tracking system may be an optical tracking system comprising at least one camera, and is different from the tumour sensor tracking system.
(39) If the at least one reference sensor is an electromagnetic reference sensor, the reference sensor tracking system may be the same as the tumour sensor tracking system. If the at least one reference sensor is an optical reference sensor, the reference sensor tracking system is an optical tracking system comprising at least one camera, and is different from the tumour sensor tracking system.
(40) The navigation software 202 comprises an image registration component 222 for registering pre-operative images from the picture archiving system 204. A displaying component 224 processes data for visualization of anatomical structures, a tumour model representative of a tumour, and a surgical tool model representative of a surgical tool. A tumour model mapping component 226 is configured for mapping each tumour sensor location to a tumour model. A surgical tool mapping component 228 is configured for mapping each surgical tool sensor location to a surgical tool model. An anatomical structure mapping component 230 is configured for mapping each reference sensor location to at least one anatomical structure in the body of a patient.
(41) With the navigation software 202, a real-time visualization 232 of a tumour (as represented by the tumour model), a surgical tool (as represented by the surgical tool model) and an anatomical structure (as obtained by one or more of the imaging systems 214) may be performed on a display device.
(42) In the implementation of
(44) In an implementation of the method or system according to FIG. 3, navigation software 202 processes data from a plurality of data sources, as follows.
(45) A first data source for the navigation software 202 is a picture archiving system 204 containing image data of pre-operative images 203. The picture archiving system 204 receives image data representing pre-operative images 203 of anatomical structures from one or more imaging systems 214, such as a computed tomography, CT, imaging system 216, a magnetic resonance imaging, MRI, system 218, a positron emission tomography, PET, system 220, or any other appropriate system.
(46) A second data source for the navigation software 202 is an image segmentation system 206 comprising software providing image segmentation data 208 representing a tumour model, i.e. tumour model data, an anatomy model for a specific patient, i.e. anatomical structure data, and/or a surgical tool model, i.e. surgical tool model data.
(47) A third data source for the navigation software 202 is a sensor tracking system 210, i.e. one or more of the tumour sensor tracking system, the surgical tool sensor tracking system, and the reference sensor tracking system, each tracking system providing sensor location data 212 of its respective sensor(s). The tumour sensor tracking system comprises a tumour sensor locating component configured for determining the location of each tumour sensor in space. The surgical tool sensor tracking system comprises a surgical tool sensor locating component configured for determining the location of each surgical tool sensor in space. The reference sensor tracking system comprises a reference sensor locating component configured for determining the location of each reference sensor in space.
(48) A fourth data source for the navigation software 202 is image data representing intra-operative images 308 of anatomical structures from one or more Operation Room, OR, imaging systems 302, such as a (for example cone beam, CB) computed tomography, CT, imaging system 304, a magnetic resonance imaging, MRI, system 306, or any other appropriate system. The MRI system 306 may be the same as the MRI system 218.
(49) It is noted here that a tumour sensor is an electromagnetic tumour sensor. A surgical tool sensor may be an electromagnetic surgical tool sensor or an optical surgical tool sensor. A reference sensor may be an electromagnetic reference sensor or an optical reference sensor.
(50) An electromagnetic sensor may be a passive electromagnetic sensor, accommodating no power source and providing an individualized electromagnetic locating signal to a tracking system when interrogated by the tracking system, or an active electromagnetic sensor, accommodating a power source and actively providing an individualized electromagnetic locating signal to a tracking system without being interrogated by the tracking system.
(51) An optical sensor may be a passive optical sensor, accommodating no power source and providing an individualized optical locating signal to a tracking system when receiving light, or an active optical sensor, accommodating a power source and actively providing an individualized optical locating signal irrespective of whether light is received.
(52) If the at least one surgical tool sensor is an electromagnetic surgical tool sensor, the surgical tool sensor tracking system may be the same as the tumour sensor tracking system. If the at least one surgical tool sensor is an optical surgical tool sensor, the surgical tool sensor tracking system may be an optical tracking system comprising at least one camera, and is different from the tumour sensor tracking system.
(53) If the at least one reference sensor is an electromagnetic reference sensor, the reference sensor tracking system may be the same as the tumour sensor tracking system. If the at least one reference sensor is an optical reference sensor, the reference sensor tracking system is an optical tracking system comprising at least one camera, and is different from the tumour sensor tracking system.
(54) The navigation software 202 comprises an image registration component 222 for registering pre-operative images from the picture archiving system 204. A displaying component 224 processes data for visualization of anatomical structures, a tumour model representative of a tumour, and a surgical tool model representative of a surgical tool. A tumour model mapping component 226 is configured for mapping each tumour sensor location to a tumour model. A surgical tool mapping component 228 is configured for mapping each surgical tool sensor location to a surgical tool model. An anatomical structure mapping component 230 is configured for mapping each reference sensor location to at least one anatomical structure in the body of a patient. A model updating component 310 receives data of intra-operative images 308 taken with one or more OR imaging systems 302. The model updating component 310 is configured to intra-operatively update the tumour model and/or the anatomical structure based on the intra-operative images 308 obtained.
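The intra-operative update performed by the model updating component 310 can be sketched as adopting a newly segmented geometry and re-expressing the sensor anchors in its frame. This is a minimal illustration only; the function names are invented, and the actual update procedure is not disclosed at this level of detail:

```python
import numpy as np

def update_tumour_model(new_vertices: np.ndarray, sensor_world: dict,
                        R: np.ndarray, t: np.ndarray):
    """Adopt intra-operatively segmented tumour vertices and re-map sensors.

    new_vertices: (N, 3) vertices segmented from an intra-operative image 308,
                  already registered to tracking space via (R, t).
    sensor_world: sensor id -> tracked position (3,) in tracking space.
    (R, t):       tracking-from-model rotation and translation.
    """
    R_inv = R.T                               # rotation inverse == transpose
    anchors = {sid: R_inv @ (p - t) for sid, p in sensor_world.items()}
    return new_vertices, anchors

# Illustrative call: identity registration, one tracked tumour sensor.
verts, anchors = update_tumour_model(
    np.zeros((4, 3)), {"tumour-1": np.array([101.0, 42.5, 19.0])},
    np.eye(3), np.array([100.0, 40.0, 20.0]))
```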
(55) With the navigation software 202, a real-time visualization 232 of a tumour (as represented by the tumour model), a surgical tool (as represented by the surgical tool model) and an anatomical structure (as obtained by one or more of the imaging systems 214) may be performed on a display device.
(56) In the implementation of
(59) At least one electromagnetic, EM, tumour sensor 440 has been provided on or in the tumour 430. Alternatively, the tumour sensor 440 may be provided in the vicinity of, or in close proximity to, the tumour 430. At least one EM reference sensor 450 (and, optionally, a further EM reference sensor 452) is provided on the body 406 of the patient 404, such that each reference sensor 450, 452 has a spatial relationship to an internal anatomical structure of the patient 404. The surgical tool 434 is provided with at least one EM surgical tool sensor 436.
(60) Below the patient 404, an electromagnetic, EM, field generator 442 is provided in the operation table 400. The EM generator 442 is configured to generate an electromagnetic field to cause each tumour sensor 440, reference sensor 450, 452, and surgical tool sensor 436 to provide a signal indicating its respective location in space. The signal from each sensor 436, 440, 450, 452 is individualized to be able to discriminate between the sensors.
(61) The respective location signals from each tumour sensor 440, reference sensor 450, 452, and surgical tool sensor 436 are tracked by a tracking system 472, such that the location and/or orientation of each sensor in space is available real-time, or quasi real-time. The location signals may be sent to the tracking system 472 wirelessly or through wires, as indicated in FIG. 4.
(62) As indicated by dash-dotted line 474, sensor location data 212 (see FIG. 2) are provided to the navigation software 202.
(65) At least one electromagnetic, EM, tumour sensor 440 is provided on or in the tumour 430. Alternatively, the tumour sensor 440 may be provided in the vicinity of, or in close proximity to, the tumour 430. At least one optical reference sensor 451 is provided on the body 406 of the patient 404, such that each reference sensor 451 has as well-defined a spatial relationship as possible to an internal anatomical structure of the patient 404. The at least one reference sensor 451 may be placed on the skin of the body 406 near the hip joints of the patient 404, or in the body 406 of the patient 404. The surgical tool 434 is provided with at least one optical surgical tool sensor 437.
(66) Above the patient 404, an optical radiation transmitter/receiver 500 is provided, covering an area indicated by dashed lines 502. The optical reference sensor 451 and the optical surgical tool sensor 437 both comprise optical reflectors and/or optical emitters, indicated by circles, so that the optical radiation transmitter/receiver 500 may provide location signals indicating the location of the sensors 451, 437 in space. The optical radiation transmitter/receiver 500 comprises a camera.
(67) Below the patient 404, an electromagnetic, EM, field generator 442 is provided in the operation table 400. The EM generator 442 is configured to generate an electromagnetic field to cause each tumour sensor 440 to provide a signal indicating its respective location in space.
(68) The location signals from each tumour sensor 440 are tracked by an EM tracking system 472, while the respective location signals from each reference sensor 451 and surgical tool sensor 437 are tracked by an optical tracking system 473, such that the location and/or orientation of each sensor in space is available real-time. The location signals may be sent to the EM tracking system 472 and the optical tracking system 473 wirelessly or through wires, as indicated in FIG. 5.
(69) As indicated by dash-dotted line 474, sensor location data 212 (see FIG. 2) are provided to the navigation software 202.
(71) The box is provided with a reference sensor 608 fixed to a side wall 606 at the outside thereof. The body part 602, arranged on a support 604 in the box, is provided with a tumour sensor 706 (see FIG. 7).
(72) In the experimental arrangement, the box may be considered to simulate a human or animal body, in that the body part 602 is an anatomical structure taking a first spatial location and orientation relative to the box, and the tumour is “under the body surface”. The reference sensor 608 is provided on a surface of the box, thus simulating placement thereof on a surface of the human or animal body.
(73) Referring to FIG. 7, a CT scan of the body part 602 was acquired, from which a first contour 704a and a second contour 704b of the tumour were determined from different points of view.
(74) The location of the reference sensor 608 in space was determined by interaction of the reference sensor 608 with a reference sensor tracking system (not shown), and the reference sensor 608 was mapped to the body part 602.
(75) The location of the tumour sensor 706 in space was determined by interaction of the tumour sensor 706 with a tumour sensor tracking system (not shown).
(76) Based on the first and second contours 704a and 704b, and possibly other contours determined from other points of view, a digital volumetric tumour model, representative of the tumour, was determined, and the tumour sensor 706 was mapped to the tumour model.
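One way (among several; the description does not fix the reconstruction algorithm) to obtain a digital volumetric tumour model from stacked contour points is a convex hull, as sketched below with illustrative coordinates:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Contour points determined from two points of view, stacked into one cloud.
contour_a = np.array([[0, 0, 0], [10, 0, 0], [10, 8, 0], [0, 8, 0]], float)
contour_b = np.array([[2, 2, 6], [8, 2, 6], [8, 6, 6], [2, 6, 6]], float)
cloud = np.vstack([contour_a, contour_b])

hull = ConvexHull(cloud)            # triangulated outer surface of the model
volume_mm3 = hull.volume            # enclosed volume
surface_vertices = cloud[hull.vertices]
```

A convex hull overestimates concave tumours; contour interpolation or surface meshing could equally serve, depending on the imaging available.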
(77) Then, the body part 602 was moved in space, in particular shifted over several centimeters on the support 604, and rotated around a vertical axis.
(78) Then, the location of the reference sensor 608 in space was determined, and the location of the tumour sensor 706 in space was determined by the tumour sensor tracking system.
(79) A new CT scan of the body part 602 was acquired to confirm the accuracy of a projection of the tumour model.
(81) Comparison of actual tumour contours, as determined with the new CT scan, with the first and second contours 804a, 804b, as determined from the tumour model, shows the tumour model and its location to be correct to within 1.5 millimeters, thereby demonstrating the applicability of the present invention to provide accurate visual information about a location of a tumour under a body surface of a human or animal body.
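The 1.5 millimetre agreement can be quantified with a standard contour-to-contour metric; the experiment does not state which metric was used, so the symmetric Hausdorff distance below is an assumption for illustration, with synthetic contour points:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def symmetric_hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Largest deviation between two point sets (same units as the input)."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# predicted: contour points projected from the tracked tumour model;
# measured: tumour contour points segmented from the confirming CT scan.
predicted = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
measured = predicted + np.random.default_rng(1).normal(0.0, 0.4, predicted.shape)
print(f"max deviation: {symmetric_hausdorff(predicted, measured):.2f} mm")
```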
(82) As explained in detail above, in a method and system for providing visual information about a tumour location in a human or animal body, an electromagnetic tumour sensor is provided in the tumour and tracked to determine its location in space, which is mapped to a tumour model. A surgical tool sensor is provided on a surgical tool, and tracked to determine its location in space, which is mapped to a surgical tool model. The body is scanned to obtain information about an anatomical structure. A reference sensor is provided on the body, and tracked to determine its location in space, which is mapped to the anatomical structure. A virtual image is displayed showing the tumour model, located with the at least one tumour sensor, in spatial relationship to the surgical tool model, located with the at least one surgical tool sensor, and the anatomical structure, located with the at least one reference sensor.
(83) As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting, but rather, to provide an understandable description of the invention.
(84) The terms “a”/“an”, as used herein, are defined as one or more than one. The term plurality, as used herein, is defined as two or more than two. The term another, as used herein, is defined as at least a second or more. The terms including and/or having, as used herein, are defined as comprising (i.e., open language, not excluding other elements or steps). Any reference signs in the claims should not be construed as limiting the scope of the claims or the invention.
(85) The fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
(86) The term coupled, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.
(87) A single processor or other unit may fulfil the functions of several items recited in the claims.
(88) The terms computer program, software, and the like as used herein, are defined as a sequence of instructions designed for execution in a processor of a computer system. A program, computer program, or software application may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
(89) A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.