Surgical instrument with real time navigation assistance
11547498 · 2023-01-10
Assignee
Inventors
- Joseph C. McGinley (Casper, WY, US)
- Michael Andrew Swartz (San Jose, CA, US)
- Thomas Edward Stout (San Jose, CA, US)
- Mike Zelina (Lakewood, OH, US)
- Collin T. Stoner (San Leandro, CA, US)
CPC classification (all HUMAN NECESSITIES)
- A61B2090/365
- A61B90/06
- A61B34/20
- A61B5/061
- A61B90/37
- A61B2034/2063
- A61B2034/107
- A61B2034/254
- A61B2090/367
- A61B2090/064
- A61B2090/3966
International classification (all HUMAN NECESSITIES)
- A61B5/05
- A61B34/00
- A61B34/20
Abstract
Navigation assistance systems and methods for use with a surgical instrument to assist in navigation of the surgical instrument during an operation. The system may include sensors that may observe the patient to generate positioning data regarding the relative position of the surgical instrument and the patient. The system may retrieve imaging data regarding the patient and correlate the imaging data to the positioning data. In turn, the position of the surgical instrument relative to the imaging data may be provided and used to generate navigation data (e.g., position, orientation, trajectory, or the like) regarding the surgical instrument.
Claims
1. A method of providing navigation assistance data for use in a surgical operation, comprising: continually detecting at least one reference anatomical feature of a patient in a surgical site adjacent to a surgical instrument using a plurality of three-dimensional imaging sensors comprising time of flight sensors located on the surgical instrument that is disposed relative to the patient on which an operation is to be performed, wherein the plurality of three-dimensional imaging sensors are disposed on the surgical instrument to have an observable field of the patient at the surgical site that extends in at least two directions relative to the surgical instrument; generating positioning data in response to the detecting, wherein the positioning data comprises a three-dimensional representation of the surgical site adjacent to the surgical instrument in a virtual reference space representing a location of the surgical instrument relative to the at least one reference anatomical feature of the patient, and wherein the at least one reference anatomical feature comprises visible anatomy of the patient relative to the location of the surgical instrument; retrieving imaging data corresponding to an imaging study of the patient, wherein the imaging data includes patient anatomy including at least one imaged anatomical feature of the patient; and continually correlating the imaging data with the positioning data to align the three-dimensional representation of the at least one reference anatomical feature to the at least one imaged anatomical feature in the virtual reference space to continually generate navigation data corresponding to the relative position of the surgical instrument with respect to the imaged anatomical feature of the patient for navigation assistance of the surgical instrument in real-time during the operation.
2. The method according to claim 1, wherein the plurality of sensors comprise a collective observable field that extends entirely about the surgical instrument.
3. The method according to claim 1, wherein the plurality of sensors further comprise at least one of an ultrasound sensor, a proximity sensor, an infrared sensor, a laser sensor, or a contact sensor.
4. The method according to claim 1, wherein the imaging data comprises three-dimensional data.
5. The method according to claim 4, wherein the imaging study comprises a computed tomography (CT) scan.
6. The method according to claim 4, wherein the imaging study comprises a magnetic resonance imaging (MRI) scan.
7. The method according to claim 1, wherein the at least one reference anatomical feature comprises a dimensionally stable structure.
8. The method according to claim 7, wherein the at least one reference anatomical feature comprises an internal anatomical feature.
9. The method according to claim 8, wherein the at least one reference anatomical feature comprises a bone.
10. The method of claim 1, wherein the at least one reference anatomical feature comprises an external anatomical feature.
11. The method according to claim 10, wherein the at least one reference anatomical feature comprises a contour of skin of the patient.
12. The method according to claim 1, wherein the at least one reference anatomical feature comprises one or more of an arm, a leg, a hand, a foot, a finger, a toe, a head, a torso, a spine, a pelvis, or other dimensionally stable anatomical landmark detectable by at least one of the plurality of sensors.
13. The method according to claim 1, wherein the at least one imaged anatomical feature comprises at least one subcutaneous structure.
14. The method according to claim 13, wherein the at least one imaged anatomical feature comprises at least one of a bone, a blood vessel, or a nerve.
15. The method according to claim 1, wherein the navigation module comprises a machine vision system.
16. The method according to claim 1, wherein the navigation data is at least in part based on a known relative position between at least one of the plurality of sensors and the surgical instrument.
17. The method according to claim 16, further comprising: displaying the navigation data in relation to the imaging data.
18. The method according to claim 17, further comprising: displaying the navigation data in an augmented reality display positioned relative to a user.
19. The method according to claim 18, wherein the navigation data is at least partially based on a position of the augmented reality display relative to the patient.
20. The method according to claim 17, further comprising: presenting the navigation data as trajectory information regarding the surgical instrument relative to the patient.
21. The method according to claim 20, further comprising: providing the navigation data in real time relative to movements of the surgical instrument relative to the patient.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(9) While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that it is not intended to limit the invention to the particular form disclosed, but rather, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the invention as defined by the claims.
(11) The navigation module 110 may comprise any appropriate hardware or software components to perform as recited herein. In this regard, the navigation module 110 may include one or more hardware components including, for example, a field programmable gate array, an application specific integrated circuit, or other hardware component. Additionally or alternatively, the navigation module 110 may be implemented using software. As such, reference to the navigation module 110 may include corresponding computer hardware for execution of the module including one or more processors that may be in operative communication with a physical memory device. Specifically, the one or more processors may retrieve instructions comprising non-transitory machine readable instructions that may be stored digitally on the physical memory device. In turn, the instructions, when executed by the processor, may configure the processor to perform the functionality described herein. Additional computer hardware may be provided to facilitate operation of the processor including busses, networking components, or the like, and may be included as part of the navigation module 110.
(12) The navigation module 110 may also be in operative communication with an image repository 150. The image repository 150 may include a physical memory device that may be operative to store imaging data digitally on the physical memory device. Specifically, the image repository 150 may include imaging data that may be obtained by way of an imaging study conducted using a medical imager 152 that may be operative to image the patient 140. As referenced above, the medical imager 152 may comprise hardware corresponding to any appropriate medical imaging technology including, but not limited to, x-ray, magnetic resonance imaging (MRI), computed tomography (CT), positron emission tomography (PET), or any other appropriate medical imaging technology. The medical imager 152 may provide digital imaging data regarding the patient 140 resulting from an imaging study directly to the image repository 150. Alternatively, the imaging data may be digitized for storage in the image repository 150. Further still, the imaging data may be transferred from the imager 152 to the image repository 150 using any appropriate means.
(13) The navigation module 110 may also be in operative communication with the display 160. The display 160 may present the navigation data generated by the navigation module 110 to a user of the system 100. As will be described in greater detail below, the display 160 may include any one or more of a variety of display technologies that may allow a user to perceive the navigation data as generated by the navigation module 110.
(14) The navigation module 110 may also be in operative communication with the surgical instrument 130. The surgical instrument 130 may have a tool portion 132 that acts upon the patient 140 during the operation of the surgical instrument 130. The surgical instrument may be, but is not limited to, a drill, a saw, a reamer, a grinder, or another surgical tool. In this regard, the tool portion 132 may comprise a drill bit, a reamer, a grinding tool, or other appropriate tool.
(15) The surgical instrument 130 may also include a measurement system 134 such as those described above. As may be appreciated, the surgical instrument 130 may include on-board sensors 120 such that the sensors 120 may be disposed at or integrated with the surgical instrument 130. In this regard, bi-directional communication may be established between the navigation module 110 and the instrument 130. As such, data from the instrument 130 may be provided to the navigation module 110 such as outputs from the sensors 120 and/or measurement system 134. Further still, the navigation module 110 may provide data to the instrument 130 (e.g., including a control signal that controls operation of the surgical instrument 130 as described above).
(16) As described above and shown schematically in
(17) In some embodiments, a wide-field sensor 122 may be provided separate from the surgical instrument 130. The wide-field sensor 122 may include within its field of observation both the surgical instrument 130 and the patient 140. In this regard, the wide-field sensor 122 may observe the surgical instrument 130 in relation to one or more reference anatomical features 142. This may be used to generate positioning data that is provided to the navigation module 110 for generation of navigation data as will be described in greater detail below.
(18) Additionally or alternatively, the surgical instrument 130 may include on-board sensors 120 that are provided at or integrated with the instrument 130. With further reference to
(19) A number of potential sensors and sensor configurations are contemplated as potentially applicable in various embodiments of the present disclosure. For instance, the sensors 120 and/or sensors 122 may include ultrasonic, infrared, laser, and/or optical sensors. A number of potential technologies for interpreting sensor inputs may be provided without limitation. For instance, a time-of-flight sensor may be employed as any one of an ultrasonic, infrared, laser, or optical sensor. Moreover, in relation to optical sensors, the sensors may include stereo-optic sensors, time-of-flight sensors, structured light sensors, and/or light field sensors. In connection with any of the foregoing sensors, the surgical instrument may include appropriate emitters and receptors for use in connection with any of the sensors described herein, as will be described in greater detail below. Additionally, it is contemplated that the sensors 120 and/or sensors 122 may include inertial measurement sensors and/or mechanical sensors (e.g., including an articulated arm).
(20) In this regard, specific combinations of the foregoing sensors have been contemplated. Accordingly, certain combinations of sensors are described herein, yet it should be understood that such descriptions are not limiting, and other combinations not specifically addressed herein are contemplated. Moreover, description of a sensor configuration as comprising a combination of sensor types is intended to convey a specific combination of sensor technologies without necessarily implying a quantity of sensors used. As such, description of a sensor configuration as including an optic sensor and an ultrasound sensor does not necessarily limit such a configuration to a single optic sensor and a single ultrasound sensor.
(21) In an embodiment, the sensor configuration may include a multi-imager optical sensor array. In this embodiment, a plurality of sensors (e.g., three or more optical sensors) may be arrayed to observe the surgical site. In turn, the imaging data obtained by the optical sensor array may be provided to a controller that stitches the images together to create depth and/or orientation information.
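By way of a non-limiting illustration (this sketch is not part of the original disclosure; the focal length, baseline, and matched pixel coordinates are assumed values), depth recovery from a calibrated pair of imagers in such an array may reduce to triangulation from disparity:

```python
import numpy as np

def depth_from_disparity(focal_px: float, baseline_m: float,
                         x_left: np.ndarray, x_right: np.ndarray) -> np.ndarray:
    """Triangulate depth (in meters) for features matched between two
    horizontally offset imagers: depth = focal_length * baseline / disparity."""
    disparity = x_left - x_right                              # pixel shift between views
    disparity = np.where(disparity == 0, np.nan, disparity)   # avoid division by zero
    return focal_px * baseline_m / disparity

# Assumed values: 1000 px focal length, 25 mm baseline, three matched features.
print(depth_from_disparity(1000.0, 0.025,
                           np.array([640.0, 512.0, 700.0]),
                           np.array([615.0, 500.0, 660.0])))  # ~[1.0, 2.08, 0.625] m
```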
(22) In another embodiment, the sensor configuration may include a three-dimensional imager and infrared time-of-flight sensor. This may include a single imager with an infrared light source and a time of flight controller hardware. This approach may utilize infrared wavelength input of an imager to measure the time of flight of infrared light from the emitter to determine image field depth and to capture a surgical site image.
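As a minimal sketch of the underlying computation (not taken from the disclosure), the field depth reported by such a sensor follows from halving the round-trip travel of the emitted infrared light:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_s: float) -> float:
    """Light travels to the surface and back, so the one-way
    depth is half the measured round-trip distance."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# An approximately 0.33 ns round trip corresponds to ~5 cm of depth.
print(tof_depth_m(0.33e-9))  # ~0.0495 m
```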
(23) In another embodiment, the sensor configuration may include a structured light imager. This approach may include a single imager plus a micromirror array controller. This may include use of a micromirror array (e.g., utilizing digital light processing technology) to create depth information for an image.
(24) In another embodiment, the sensor configuration may include a light field (plenoptic) lens and imager. This approach may employ a single imager that utilizes a special light field lens, called a plenoptic lens, that captures information about the light field emanating from a scene, including the intensity of light in the scene and the direction that the light rays are traveling in space. In turn, software may process the captured light field information to create a three-dimensional image.
(25) In another embodiment, the sensor configuration may include an imager and an articulating arm. In this approach, the instrument may be engaged with an articulating arm that may measure the position and/or orientation of the instrument in the field. This may be coupled with an imager to capture information regarding the surgical site.
(26) In still another embodiment, the sensor configuration may include an imager, an inertial measurement unit, and a laser time of flight sensor. In this approach, the inertial measurement unit may measure the orientation of the drill and may utilize the laser time of flight sensor to determine the depth of the instrument relative to the patient.
(27) In another embodiment, the sensor configuration may include an imager, an inertial measurement unit, and an infrared time of flight sensor. In this approach, a single imager may be provided in combination with an inertial measurement unit to generate orientation information regarding the instrument. Additionally, a time of flight infrared sensor may provide a depth measurement of the instrument relative to the patient. This may utilize an image to locate the surgical site and then use the inertial measurement unit data to orient the instrument.
(28) In an embodiment, the sensor configuration may include an imager, an inertial measurement unit, and an ultrasound sensor. In this approach, a single imager may be provided in combination with an inertial measurement unit to generate orientation information regarding the instrument. Additionally, an ultrasound sensor may provide a depth measurement of the instrument relative to the patient. This may utilize an image to locate the surgical site and then use the inertial measurement unit data to orient the instrument.
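For any of the imager-plus-inertial-measurement-unit configurations of paragraphs (26)-(28), the orientation and depth measurements may be fused into an estimate of the tip position. The following sketch is illustrative only; the instrument-axis convention and the sensor values are assumptions rather than details from the disclosure:

```python
import numpy as np

def quat_to_matrix(q: np.ndarray) -> np.ndarray:
    """Convert a unit quaternion (w, x, y, z) from the IMU into a rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def instrument_tip(origin: np.ndarray, imu_quat: np.ndarray, depth_m: float) -> np.ndarray:
    """Estimate the tip position: start at the instrument origin located by the
    imager, point along the instrument axis as oriented by the IMU, and advance
    by the depth reported by the ultrasound or time-of-flight sensor."""
    axis_body = np.array([0.0, 0.0, 1.0])            # instrument axis in its own frame (assumed)
    axis_world = quat_to_matrix(imu_quat) @ axis_body
    return origin + depth_m * axis_world

# Assumed values: instrument 10 cm above the site, tilted 30 degrees about x,
# with 4 cm of measured depth along its axis.
q = np.array([np.cos(np.pi/12), np.sin(np.pi/12), 0.0, 0.0])  # 30-degree rotation about x
print(instrument_tip(np.array([0.0, 0.0, 0.10]), q, 0.04))
```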
(29) In yet another embodiment, the sensor configuration may include a multi-imager optic sensor and an infrared time of flight sensor. In this embodiment, a plurality of sensors (e.g., three or more optical sensors) may be arrayed to observe the surgical site. In turn, the imaging data obtained by the optical sensor array may be provided to a controller that stitches the images together to create depth and/or orientation information. Additionally, the time of flight infrared sensor may be utilized to determine a depth of the instrument relative to a starting datum.
(30) Accordingly, the navigation module 110 may be operative to retrieve imaging data from the image repository 150. The navigation module 110 may be operative to identify a corresponding reference feature from the imaging data corresponding to the reference anatomical features 142 of the positioning data generated by the sensors 120. In turn, the imaging data may be correlated to the positioning data. As such, the navigation data generated by the navigation module 110 may describe the position of the surgical instrument 130 relative to the imaging data including imaged anatomical features 144.
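The disclosure does not tie the correlating step to a particular algorithm; as one hedged illustration, a rigid point-set alignment such as the Kabsch method could recover the rotation and translation that best align corresponding feature points from the positioning data and the imaging data:

```python
import numpy as np

def kabsch_align(sensed: np.ndarray, imaged: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ sensed_i + t - imaged_i||.

    `sensed` and `imaged` are (N, 3) arrays of corresponding feature points from
    the positioning data and the imaging study, respectively.
    """
    sensed_c = sensed - sensed.mean(axis=0)   # center both point sets on their centroids
    imaged_c = imaged - imaged.mean(axis=0)
    H = sensed_c.T @ imaged_c                 # cross-covariance of the point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = imaged.mean(axis=0) - R @ sensed.mean(axis=0)
    return R, t

# Once R and t are known, any sensed instrument position p maps into the
# coordinate frame of the imaging data as R @ p + t.
```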
(31) In this regard,
(32) By way of illustration, the imaged anatomical features 144 may include a number of different structures. For example, the skin layer 200 may be represented in the imaging data corresponding to the imaged anatomical features 144. Moreover, muscle tissue 202 may be represented. Further still, features such as blood vessels 204 and/or nerves 206 may also be represented in the imaging data and correspond to imaged anatomical features 144. A bone of the patient 140 may be represented in the imaged anatomical features 144. Specifically, a hard outer cortex 208 of a bone may be represented as well as an inner medullary layer 210. While a number of anatomical features have been illustrated in
(34) Another representation of a potential display 160 is shown in
(35) The navigation data generated by the navigation module 110 may be presented to a user in any appropriate manner. For example, the display 160 may be positioned near the patient 140 to present navigation data to the user who is positioning the surgical instrument 130 adjacent to the patient 140. As described above, the position of the surgical instrument 130 relative to the patient 140 may be obtained in substantially real time (e.g., with a delay that does not impact surgical operations, such as less than about one second) such that the manipulation of the surgical instrument 130 relative to the patient 140 may be reflected in the navigation data presented to the user on the display 160 in corresponding real time. The display 160 may be any appropriate display that may include monitors or the like positioned near the patient 140.
(36) In an alternative embodiment, the display 160 may correspond to an augmented reality display 170 that may include projection of images within the field of view of the user. One such embodiment is shown in
(37) Turning to
(38) As briefly described above, the on-board sensors 120 and/or wide-field sensor 122 may comprise one or more of any appropriate type of sensor that may be used to identify reference anatomical features 142. For example, the on-board sensors 120 and/or wide-field sensors 122 may comprise optical sensors that obtain imaging data for use in identification of the reference anatomical features 142 adjacent to the surgical instrument 130. Further still, proximity sensors, laser sensors, or any other appropriate type of sensor that may be operative to observe and/or identify reference anatomical features 142 may be utilized. Moreover, combinations of different kinds of sensors may be provided without limitation. As may be appreciated, the observed reference anatomical feature 142 may simply correspond to a curvature or contour of the skin of the patient adjacent to the surgical instrument 130.
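As a toy, hedged example of how such a skin surface might be isolated from depth-sensor output (the band width and the nearest-surface threshold are assumptions, not details from the disclosure):

```python
import numpy as np

def skin_surface_mask(depth_map: np.ndarray, band_m: float = 0.005) -> np.ndarray:
    """Flag pixels lying within a narrow band of the nearest observed surface,
    a crude proxy for the skin contour adjacent to the instrument; the contour
    itself is the boundary of the returned mask."""
    nearest = np.nanmin(depth_map)   # closest point seen by the sensor
    return (depth_map >= nearest) & (depth_map <= nearest + band_m)
```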
(39) Another embodiment of an instrument 180 is shown in
(40) With further reference to
(41) The method 300 may also include retrieving 306 imaging data from an image repository. The method 300 may also include identifying 308 common reference features from the positioning data and the imaging data. For instance, the reference anatomical features observed 304 by the sensors may be provided in both the positioning data and the imaging data. Also as described above, the identifying 308 may include manipulation of the imaging data or analysis of different portions of the imaging data to identify the common features or feature portions.
(42) The method 300 may also include correlating 310 the common features to align the imaging data to the positioning data. The correlating 310 may include manipulating the imaging data in a virtual reference space to align the imaging data to the positioning data. In any regard, upon the correlating 310, the positioning data including information about the location of the surgical instrument may be related to the imaging data. Accordingly, the method 300 may include generating 312 navigation data based on this known relation between the surgical instrument and the features described in the imaging data. Further still, the method 300 may include presenting 314 the navigation data to a user of the system (e.g., in any of the manners described above).
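A minimal sketch of how the generating 312 step might consume the correlation result (frame conventions and names are assumptions; the disclosure does not prescribe an implementation): the alignment transform maps the tracked instrument pose into the coordinate frame of the imaging data, and the trajectory may be sampled along the instrument axis for display.

```python
import numpy as np

def navigation_data(R: np.ndarray, t: np.ndarray,
                    tip_sensed: np.ndarray, axis_sensed: np.ndarray,
                    depths_m: np.ndarray = np.linspace(0.0, 0.05, 6)):
    """Map the instrument tip and axis from sensor space into image space using
    the rotation R and translation t found during the correlating 310 step,
    then sample points along the projected trajectory for overlay on the images."""
    tip_img = R @ tip_sensed + t                         # instrument tip in image space
    axis_img = R @ axis_sensed
    axis_img = axis_img / np.linalg.norm(axis_img)       # unit instrument axis in image space
    trajectory = tip_img + depths_m[:, None] * axis_img  # sampled path ahead of the tip
    return tip_img, axis_img, trajectory
```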
(43) While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description is to be considered as exemplary and not restrictive in character. For example, certain embodiments described hereinabove may be combinable with other described embodiments and/or arranged in other ways (e.g., process elements may be performed in other sequences). Accordingly, it should be understood that only the preferred embodiment and variants thereof have been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected.