AUTONOMOUS ROBOT TRACKING
20220168048 · 2022-06-02
Inventors
CPC classification
A61B17/17
HUMAN NECESSITIES
A61B2090/367
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
A61B2034/105
HUMAN NECESSITIES
International classification
A61B34/20
HUMAN NECESSITIES
A61B17/16
HUMAN NECESSITIES
A61B17/17
HUMAN NECESSITIES
Abstract
A system for tracking the position of a surgical tool manipulated by a surgical robotic system, to determine that the tool is correctly positioned and oriented. Miniature 3-D tracking cameras are mounted on the end effector of the robotic system, one viewing markers on the surgical tool, and the other viewing markers attached to the anatomy part being operated on. The initial spatial position and orientation of the surgical tool on the surface of the anatomy part is measured, and the progress of the surgical tool into the anatomic body part is tracked using one of the miniature cameras. The cameras or sensors are close to the surgical region of interest, and all the mechanical and sensor elements necessary for the system's operation are mounted on the robot itself. The system thus avoids interruption of communication between a remotely positioned navigation camera and the robot or patient.
Claims
1. A robotic system comprising: a first tracking camera fixedly attached to a part of a robot, in a position that enables the first tracking camera to capture images of a first three-dimensional reference frame attached to a surgical tool; a second tracking camera fixedly attached to a part of the robot, in a position that enables the second tracking camera to capture images of a second three-dimensional reference frame attached to an anatomical part of a subject; and a control system adapted to determine at least a position of the surgical tool in a co-ordinate system, based on images of the first three-dimensional reference frame from the first tracking camera, and to determine a pose of the anatomical part of the subject in the co-ordinate system, based on images of the second three-dimensional reference frame from the second tracking camera.
2. The robotic system according to claim 1, wherein the control system is further adapted to determine an orientation of the surgical tool.
3. The robotic system according to claim 1, wherein the part of the robot is an end effector.
4. The robotic system according to claim 3, wherein the control system is adapted to take into account the position of an operating tip of the surgical tool relative to the position of the surgical tool determined by the first three-dimensional reference frame, such that a longitudinal position of the operating tip of the surgical tool is known relative to the end effector.
5. The robotic system according to claim 4, wherein the anatomical part of the subject is a bone, and the surgical tool is a drill carrying a drill bit adapted to move within a guide sleeve carried by the end effector, and wherein the control system is adapted to track an insertion depth of the drill bit into the bone.
6. The robotic system according to claim 5, wherein the control system is further adapted to use a registration procedure of the co-ordinate system with a set of preoperative three-dimensional images of at least the bone, such that the drill bit is oriented and positioned relative to the bone according to a surgical plan generated from the preoperative three-dimensional set of images.
7. The robotic system according to claim 6, wherein the control system is further adapted to compare the pose of the anatomical part of the subject determined by an image registration procedure with the pose of the anatomical part of the subject determined from the images from the second tracking camera, such that the pose of the anatomical part is intraoperatively verifiable.
8. The robotic system according to claim 1, wherein at least one of the first tracking camera and the second tracking camera enables generation of three-dimensional images.
9. The robotic system according to claim 1, wherein at least one of the first and the second tracking cameras are adapted to be fixedly attached to an end effector in a position no further than 100 cm from the first three-dimensional reference frame at which the first tracking camera is directed.
10. The robotic system according to claim 9, wherein at least one of the first and the second tracking cameras are adapted to be fixedly attached to the end effector in a position no further than 50 cm from the first three-dimensional reference frame at which the first tracking camera is directed.
11. The robotic system according to claim 10, wherein at least one of the first and the second tracking cameras are adapted to be fixedly attached to the end effector in a position no further than 20 cm from the first three-dimensional reference frame at which the first tracking camera is directed.
12. The robotic system according to claim 11, further comprising a third reference frame fixedly attached to a touch probe, such that at least one of the first and second tracking cameras attached to the end effector of the robot can determine the position of the touch probe in a robot co-ordinate frame, enabling points of the anatomical part of the subject touched by the touch probe to be registered in the robot co-ordinate frame.
13. The robotic system according to claim 12, wherein the control system is further adapted to detect in an image of the anatomical part of the subject obtained by at least one of the first and second tracking cameras attached to the end effector of the robot, an impingement of a laser beam on a predetermined point of the anatomical part of the subject, such that the predetermined point of the anatomical part of the subject illuminated by the laser beam is known in the robot co-ordinate frame.
14. The robotic system according to claim 1, wherein the control system is further adapted to determine a position of a surgical tool tip relative to the anatomical part of the subject, without need of registration with preoperative images of the subject.
15. The robotic system according to claim 1, wherein the surgical tool is hand-held.
16. A robotic system comprising: a first imaging camera fixedly attached to a first robotic arm, and directed to generate first two-dimensional images of a three-dimensional reference frame attached to a hand-held surgical tool; a second imaging camera fixedly attached to a second robot arm and directed to generate second two-dimensional images of the three-dimensional reference frame attached to the hand-held surgical tool; and a control system adapted to determine, from a first two-dimensional image from the first imaging camera and from a second two-dimensional image from the second imaging camera, a three-dimensional pose of the hand-held surgical tool in a coordinate system of at least one of the first and the second robot arms, wherein positions of the first and second robotic arms are adapted to be held at a distance apart that provides a desired accuracy of the three-dimensional pose of the hand-held surgical tool.
17. The robotic system according to claim 16, further comprising a third three-dimensional reference frame attached to an anatomical part of a subject on which the hand-held surgical tool is to perform a surgical procedure, such that the first and the second imaging cameras can provide information to enable the three-dimensional pose of the anatomical part of the subject to be determined in the coordinate system of at least one of the first and the second robot arms.
18. The robotic system according to claim 17, wherein the first and second robotic arms are adapted to be held at a distance sufficiently close to the three-dimensional reference frame attached to the anatomical part of the subject, to achieve a desired accuracy of a determined three-dimensional pose of the anatomical part of the subject.
19. A method of tracking a surgical tool manipulated by a robot, the method comprising: generating at least one image of a first three-dimensional referencing frame fixedly mounted to a surgical tool, the first three-dimensional referencing frame being disposed within a field of view of a first camera mounted on an end effector of a robot; generating at least one image of a second three-dimensional referencing frame fixedly attached to an anatomical body part of a subject, the second three-dimensional referencing frame being disposed within the field of view of a second camera mounted on the end effector of the robot; performing image analysis on at least one image generated by the first camera to determine a pose of the surgical tool relative to the end effector of the robot; performing image analysis on at least one image generated by the second camera to determine a pose of the anatomical body part relative to the end effector of the robot; and correlating the determined pose of the surgical tool with the determined pose of the anatomical body part so that the surgical tool can be tracked relative to the anatomical body part while executing a surgical procedure.
20. The method of claim 19, wherein correlating the determined pose of the surgical tool with the determined pose of the anatomical body part of the subject enables verification of the accuracy of performance of the surgical procedure on the anatomic body part of the subject by the surgical tool, the method further comprising at least one of: (i) performing registration of a co-ordinate system of the robot relative to preoperative three-dimensional images of a region of the anatomic body part, according to which the surgical procedure has been planned, and manipulating the surgical tool using the robot to execute the surgical procedure; (ii) determining a spatial position of a tip of the surgical tool relative to the surgical tool, such that a longitudinal position of the surgical tool is determined relative to the anatomic body part of the subject; and (iii) determining a spatial position of a surgical tool tip relative to a point of contact of the surgical tool tip with a surface of the anatomical body part, such that the longitudinal position of the surgical tool can be tracked relative to the anatomic body part of the subject.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0040] The present invention will be understood and appreciated more fully from the following detailed description, taken in conjunction with the drawings in which:
DETAILED DESCRIPTION
[0044] In order to determine the position and orientation of the remote object, a reference frame having a known three-dimensional arrangement of marker elements is attached rigidly to the remote object, and the camera is directed at that remote object to generate an image of the reference frame. Using the knowledge of the three-dimensional structure of the reference frame, the position and orientation of the reference frame relative to the camera can be determined by analysis of that image. Since the reference frame is rigidly attached to the remote object, this analysis enables the establishment of the position and orientation of the remote object relative to the camera, and hence also relative to the end effector.
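The pose determination described in this paragraph can be sketched as a rigid-body fit. The following is a minimal illustration, not part of the patent: the function name, marker layout, and dimensions are hypothetical, and it assumes the tracking camera reports the three-dimensional positions of the marker elements in its own frame, so that the classical SVD-based Kabsch/Horn method recovers the rotation and translation of the reference frame relative to the camera.

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Rigid pose (R, t) mapping the known marker model points to the
    camera-frame observations, via the SVD-based Kabsch/Horn method."""
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Hypothetical 4-marker reference frame geometry (coordinates in mm).
model = np.array([[0, 0, 0], [40, 0, 0], [0, 30, 0], [0, 0, 25]], float)
```

Because the reference frame is rigidly attached to the remote object, the recovered (R, t) directly gives the pose of that object relative to the camera, and hence relative to the end effector.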
[0045] The robotic system, in some embodiments, can operate in different modes, depending on the extent to which the registration techniques performed in robotic surgical procedures are used to define the location of the robotic system relative to the position of the anatomic body part on which the surgical plan is to be implemented. According to a first implementation, the position of the robot base, or the co-ordinate frame of reference of the robot, and hence the position and angular orientation (jointly known as the pose) of the end effector, is known relative to the anatomical body part on which the surgical procedure is to be performed. This is achieved by determining the intraoperative position of the robot relative to the anatomical body part, for example, by use of intraoperative fluoroscope imaging that shows both the body part and either a part of the robot base itself or, for a remotely mounted robot, an element, such as a target held by the robotic end effector, that can define the position of the robotic end effector, and hence the robotic co-ordinate frame. The intraoperative fluoroscope images are used in conjunction with image registration between the intraoperative images showing the body part and preoperative three-dimensional image sets of the same region of the subject, which also show the anatomical body part on which the surgical plan is to be executed. Once that procedure has been performed, the system knows the pose of the surgical tool, since the surgical tool bit is held, generally within a tool guide, in a known position relative to the end effector, and relative to the position of the body part on which the surgical tool is to perform its intended task.
[0046] However, if the surgical tool is to be moved longitudinally relative to the guide tube held by the end effector, the registration procedure described above is unable to determine that longitudinal position, and accurate knowledge thereof can be critical in order to ensure that the tool is not inserted too far into the body part, thereby possibly causing damage to the subject. Embodiments of the present disclosure provide the longitudinal position by analyzing the images of a reference frame attached to the surgical tool, such that the longitudinal position of the surgical tool is directly known relative to the end effector, without the need for remotely located scanning and navigation devices. In order for the position of the surgical tool tip to be known relative to the scanned position of the surgical tool itself, it is then necessary to perform a calibration procedure relating the position of the tool bit tip to the position of the surgical tool body itself. Alternatively, the surgical tool bit and the surgical tool body should have some correlating feature, such as by ensuring that the tool bit is always locked into the surgical tool body at a predetermined depth, for example by use of a mechanical registration stop, a visual insertion marker, or a similar arrangement. The tool bit is then registered on insertion into the driver part of the complete surgical tool, such that it always protrudes by a known measure from the surgical tool body.
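Once the protrusion of the tool bit from the tool body has been calibrated as described above, relating the tracked tool-body pose to the tip position is a single transform of the calibrated offset. A minimal sketch follows; the function name and all numerical values are hypothetical illustrations, not taken from the patent.

```python
import numpy as np

def tip_position(T_ee_tool, tip_offset_mm):
    """Tool-tip position in the end-effector frame, given the 4x4 pose of the
    tool body (as measured by the tracking camera) and the calibrated offset
    of the tip from the tool-body origin, expressed in the tool frame."""
    tip_h = np.append(tip_offset_mm, 1.0)        # homogeneous point
    return (T_ee_tool @ tip_h)[:3]

# Illustrative values: tool body 100 mm from the end-effector origin along z,
# with the bit protruding a calibrated 130 mm along the tool's own z axis.
T = np.eye(4)
T[:3, 3] = [0.0, 0.0, 100.0]
tip = tip_position(T, np.array([0.0, 0.0, 130.0]))   # -> [0, 0, 230]
```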
[0047] Reference is now made to
[0048] According to the implementation shown in
[0049] The first camera 16 is directed towards the registration frame 13 on the surgical tool 11, such that analysis of an image generated by that camera enables determination of the pose of the tool relative to the camera, and hence relative to the end effector 12, and thus to the coordinate frame of the robot. Similarly, the second camera 15 is directed towards the registration frame 14 attached to the anatomy part 19 of the subject on which the surgical tool is to operate, such that analysis of the image of the anatomy part registration frame 14 generated by that camera enables determination of the pose of the anatomy part relative to the camera, and hence also relative to the end effector 12 and thus to the coordinate frame of the robot. This assumes that the pose of the registration frame 14 relative to the pose of the anatomical body part 19 is known, such as from a previous imaging calibration procedure.
[0050] The cameras are mounted on the end effector in mutually fixed positions. Consequently, according to one exemplary method of this disclosure, comparison of the analyses of the images generated by the two cameras enables the position and orientation of the surgical tool relative to the anatomic body part to be determined, without the need for any external or remote tracking system. This comparison may be performed by a control system which performs image processing on the images obtained from the two cameras. This control system can be either a separate module, or can constitute a part of the robotic control system itself, as shown in
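The comparison of the two camera measurements reduces to composing homogeneous transforms through the common end-effector base. The sketch below assumes the fixed camera mounting transforms have been pre-calibrated; all function and variable names are hypothetical.

```python
import numpy as np

def pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tool_in_anatomy_frame(T_ee_cam1, T_cam1_tool, T_ee_cam2, T_cam2_anat):
    """Pose of the surgical tool in the anatomy frame, composed through the
    common end-effector base on which both cameras are rigidly mounted."""
    T_ee_tool = T_ee_cam1 @ T_cam1_tool   # tool pose via the first camera
    T_ee_anat = T_ee_cam2 @ T_cam2_anat   # anatomy pose via the second camera
    return np.linalg.inv(T_ee_anat) @ T_ee_tool

# Illustrative check: if the tool happens to coincide with the anatomy
# reference frame, the composed relative pose is the identity.
X = pose(np.eye(3), np.array([1.0, 2.0, 3.0]))
T_rel = tool_in_anatomy_frame(X, np.eye(4), np.eye(4), X)
```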
[0051] Reference is now made to
[0052] In use, a number of different procedures can be adopted to use the registration abilities afforded by embodiments of the present disclosure. In the first place, for all surgical procedures, a registration procedure relating the robot to the target bone can be performed, independently of whatever system is used to track the positions of the elements. This robot-to-bone registration can be achieved by any of various registration methods, such as, for example, the use of fiducial markers, often percutaneously mounted on the subject, which are then very clearly imaged in the preoperative three-dimensional image set, typically a CT or MRI image set. The exemplary fiducial markers can then be related to the robot position intraoperatively, either by a physical connection thereto of a reference plane of the robot, such as the base, or by intraoperative imaging which includes both the fiducial markers and an indication of the position of the robot base or the pose of a target held by the end effector, or by use of a touch probe whose position may be tracked by a navigation system, thus providing digitized location information of the fiducial points and of the robot. Alternative registration methods may be based on comparison of the shape, outline or intensity of anatomical features as viewed in the three-dimensional preoperative image set, with the same features as appearing in fluoroscope images obtained intraoperatively. Matching these features enables the position of the robot, as observed in the intraoperative fluoroscope images, to be registered to the anatomy of the subject as imaged preoperatively, on the basis of which the surgical plan has been generated.
[0053] Once the registration of the robotic co-ordinate system to the co-ordinate system of the preoperative imaging set has been accomplished, the robotic end effector can then be programmed to perform the motions required in execution of the surgical plan, which is based on the preoperative images. For instance, in the case of a robotic alignment for the drilling of a hole for the insertion of a pedicle screw, the robot can direct the pose of the drill guide such that it is aimed at the insertion point both spatially and in orientation angle. However, the insertion depth of the drill part of the surgical tool can be more problematic to determine since the surgical tool is often inserted manually by the surgeon holding the surgical tool handle. Therefore, even though the robotic system has aligned the surgical tool for accurate entry position and path orientation, the insertion depth also needs to be accurately controlled in order to ensure that no damage ensues from excess insertion. Even if the drill travel is controlled by a separate motorized actuator, the origin position must be accurately determined, and the insertion depth monitored in order to ensure conformity of the controlled insertion.
[0054] In some previous systems, this has often been achieved by use of a navigation system located remotely from the region of the surgical operation, incorporating a tracking camera typically mounted above the surgical scene, and tracking the position of a marker mounted on the surgical tool. Such a navigation system can be blocked by the insertion of the surgeon's arm or another part of the body of one of the operating room staff, into the line of sight between the tracking camera and the markers on the surgical tool being tracked. In addition, such systems may generally be costly.
[0055] Embodiments of the present disclosure use the end effector tool camera, which can provide continuous real-time images of the tool registration frame to the control system module. This module analyzes the images and provides therefrom an accurate measurement of the longitudinal position of the surgical tool relative to the camera, and since the camera is rigidly mounted on the end effector, the surgical tool position is known in the robotic coordinate system. Since, from the initial robotic position registration, the pose of the end effector is known relative to the bone on which the surgical tool is operating, the pose of the surgical tool is also known relative to the bone into which the surgical tool is being inserted. However, in order to be able to track the insertion depth of the surgical tool into the bone, it is necessary to relate the tracked position of the surgical tool to the position of the tool drill tip. This can be performed, for instance, by a calibration measurement of the position of the surgical tool tip relative to the reference frame. Alternatively, the operating procedure may include the step of using the tool camera 16 to measure and define a base reference position of the surgical tool tip, with the surgical tool tip just touching the surface of the anatomy part 19, such that the depth of insertion into the anatomic part 19 can be measured from that point in real time using the camera tracking procedure.
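The surface-contact baseline method described above can be sketched as a scalar projection along the drill axis. The names and values below are illustrative, not from the patent.

```python
import numpy as np

def insertion_depth(tip_now, tip_at_contact, drill_axis):
    """Depth of the tool tip past the recorded surface-contact baseline,
    measured along the (normalized) drill axis; a negative value means the
    tip has not yet passed the contact point."""
    axis = drill_axis / np.linalg.norm(drill_axis)
    return float(np.dot(tip_now - tip_at_contact, axis))

# Illustrative: contact recorded at the bone surface, drilling along -z.
contact = np.array([0.0, 0.0, 0.0])
now = np.array([0.0, 0.0, -12.5])
depth = insertion_depth(now, contact, np.array([0.0, 0.0, -1.0]))  # 12.5 mm
```

A control system could compare `depth` against the planned depth from the surgical plan on every camera frame, halting or warning before excess insertion.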
[0056] In an additional mode of operation, embodiments of the present disclosure may be used for verifying the accuracy of the intended positioning and insertion of the surgical tool, as determined by the preliminary registration of the robotic co-ordinate system with the preoperative image set on which the surgical plan was based, by using the second camera mounted on the end effector and viewing the reference frame mounted in a fixed position relative to the bone on which the surgical procedure is being performed. Although the previously described method using the surgical-tool-directed camera should, together with the robot registration, provide an accurate pose measurement of the surgical tool and its insertion depth, a verifying measurement can be performed in surgical procedures in order to ensure that an unexpected error in the previously described primary referencing method does not result in bodily damage to the patient. The camera directed to the reference frame on the anatomy part represents a measurement system, independent of any previous registrations, enabling the system to ascertain the position and orientation, relative to the robotic co-ordinate system, of the body part on which the operation is being performed. The main measurement method involves image registration of the actual intraoperative position of the body part being operated on with the robotic co-ordinate system, by one of the registration methods described hereinabove.
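The verification step amounts to comparing two independent estimates of the same pose, one from the image registration and one from the second camera. A sketch of such a discrepancy check follows; the function name is hypothetical, and acceptable error thresholds would be set by the surgical application.

```python
import numpy as np

def pose_discrepancy(T_a, T_b):
    """Translation error (same units as the poses) and rotation error
    (degrees) between two 4x4 estimates of the same pose, e.g. a
    registration-derived pose versus a camera-derived pose."""
    d = np.linalg.inv(T_a) @ T_b
    trans_err = np.linalg.norm(d[:3, 3])
    # Rotation angle from the trace of the residual rotation matrix.
    cos_ang = (np.trace(d[:3, :3]) - 1.0) / 2.0
    rot_err = np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))
    return trans_err, rot_err
```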
[0057] In situations in which the intraoperative imaging method is of sufficiently good resolution, this implementation therefore provides a method of relating the surgical tool pose and the tool tip position to the anatomical body part on which the tool is to perform the surgical process, without the need for preoperative imaging. The controller uses the commonly defined base, the end effector, to directly relate the positions of the surgical tool tip and of the bone structure to that common base. So long as the reference frame can be accurately related to the bone shape and structure of the anatomical part, the system according to this implementation views both the tool tip and the anatomical body part directly and simultaneously, and can measure both of their positions, and hence their relative position, without any other preoperative registration.
[0058] In yet another mode of operation, a system implementation using only a single camera directed at the surgical tool reference frame is proposed. Since the position and orientation of the body part on which the procedure is to be performed may be known to the robotic system controller by an image registration procedure, and since the surgical tool pose may also be known from the robotic end effector pose as determined either from a full image registration procedure or from a separate optical tracking measurement of the robot end effector pose, the only unknown information for carrying out the surgical procedure is the longitudinal position of the surgical tool. Therefore, according to this additional method, the system only needs a single robot-mounted camera directed at the surgical tool in order to ascertain the longitudinal position of the surgical tool, since its initial pose relative to the body part is determined by the image registration procedure or by a separate optical tracking measurement of the robotic pose. This method and system are feasible because the robot pose can be determined by a remote tracking system with less risk of obstruction than the surgical tool position, for which a closely disposed scanning camera is an advantage.
[0059] Reference is now made to
[0060] The co-ordinate system of the C-arm is thus known in the reference co-ordinate system of the robot. Additionally, since the C-arm imaging system now generates a set of images of the anatomical part of the subject, the features of that anatomical part are known in the C-arm co-ordinate system, and consequently also in the robotic co-ordinate system. There is thus achieved a registration of the anatomical body part, and hence its pose, relative to the robotic end effector, without the need to attach a registration frame to the anatomic part, and without a camera to image that registration frame. This should increase positional accuracy, since a registration element attached to, or part of, a C-arm component will generally have a more definable placement than a registration frame attached to a body part on the end of a long K-wire. Once the registration of the body part to the robotic end effector is known, the various implementations of the system regarding the surgical tool pose relative to the body part can be performed, as previously described in this disclosure.
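The registration through the C-arm described above is a chain of two transforms: the C-arm pose in the robot frame, and the anatomy pose in the C-arm frame. A minimal illustration with hypothetical names and values:

```python
import numpy as np

def anatomy_in_robot_frame(T_robot_carm, T_carm_anat):
    """Anatomy pose in the robot co-ordinate system, obtained by chaining
    through the C-arm co-ordinate system as intermediary."""
    return T_robot_carm @ T_carm_anat

# Illustrative values: C-arm registration element 500 mm from the robot
# origin; anatomy located 200 mm from the C-arm origin in its own frame.
T_robot_carm = np.eye(4)
T_robot_carm[:3, 3] = [500.0, 0.0, 0.0]
T_carm_anat = np.eye(4)
T_carm_anat[:3, 3] = [0.0, 200.0, 0.0]
T_robot_anat = anatomy_in_robot_frame(T_robot_carm, T_carm_anat)
```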
[0061] Finally, reference is made to
[0062] Reference is now made to
[0063] It is appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather the scope of the present invention includes both combinations and subcombinations of various features described hereinabove as well as variations and modifications thereto which would occur to a person of skill in the art upon reading the above description and which are not in the prior art.