DETERMINING RELATIVE ROBOT BASE POSITIONS USING EXTERNALLY POSITIONED IMAGERS
20230080041 · 2023-03-16
Inventors
CPC classification
A61B2034/305
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
A61B90/37
HUMAN NECESSITIES
A61B2090/3945
HUMAN NECESSITIES
International classification
A61B34/20
HUMAN NECESSITIES
Abstract
A system for determining relative positions of subsystems of a robot-assisted surgical system includes subsystem components, including a plurality of manipulator arms and a surgeon console. Each subsystem component includes at least one of an optical tracker and a light emitter. Image data from the optical trackers is analyzed to determine the relative positions of the subsystems.
Claims
1. A robotic surgical system, comprising: a plurality of robotic manipulators, each including at least one of: an optical tracker; and an optical emitter; wherein the system is configured to receive image data obtained using the optical trackers and to determine using the image data the relative positions of the respective bases of the robotic manipulators.
2. The robotic surgical system of claim 1, wherein at least one of the manipulators includes a plurality of optical emitters.
3. The robotic surgical system of claim 2, wherein the plurality of optical emitters includes at least one first emitter positioned on a portion of the manipulator that remains fixed during surgery, and at least one second emitter positioned on a portion of the manipulator that is moveable during surgery.
4. A method of determining the relative positions of robotic manipulators, comprising: positioning a first robotic manipulator in an operating room, the first robotic manipulator having at least a pair of first emitters thereon; positioning a second robotic manipulator in an operating room, the second robotic manipulator having at least one tracker positioned thereon, and capturing an image of the pair of first emitters using the tracker, and using triangulation to determine, based on the known distance between the first emitters in the pair, the relative positions between the first robotic manipulator and the second robotic manipulator.
5. A method of determining the relative positions of robotic manipulators, comprising: positioning a first robotic manipulator in an operating room, the first robotic manipulator having at least a pair of first emitters and at least one first tracker positioned thereon, positioning a second robotic manipulator in an operating room, the second robotic manipulator having a plurality of second emitters and at least one second tracker positioned thereon, positioning at least one third tracker at a fixed location in the operating room, each of the first and second robotic manipulators being independently positionable relative to the other of the first and second robotic manipulators and relative to the third tracker, capturing an image of a first one of the first emitters using the second tracker, capturing an image of a second one of the first emitters using the third tracker, and based on the known relative positions between the fixed location and the second robotic manipulator, determining the position of the first robotic manipulator.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0019] Concepts described in this application allow the relative positions of bases of robotic arms and optionally other robotic system components within an operating room (e.g., the patient bed and/or surgeon console) to be determined. This information is useful for certain operations of the robotic system, including coordinated motion between the manipulator arms, automatic movements, and collision avoidance interventions. This is particularly beneficial where components of the system are not physically linked, such as where the robotic manipulator arms and patient bed are independently positionable (e.g., their bases are independently moveable between different positions along the floor of the operating room).
[0020] Referring to
[0021] Optical tracking is used to determine the relative positions of the subsystems. Each subsystem will have at least one light emitter or set of light emitters, or at least one camera (which may be alternatively referred to as a tracking sensor, imager or tracker), or both light emitter(s) and a camera, in order to allow determination of the relative position of each subsystem relative to all the others. In the
[0022] If the distance between two emitters fixedly mounted to a single subsystem component is known a priori, then the distance from a camera viewing those emitters may be calculated using triangulation.
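The triangulation described above can be sketched as follows; the pinhole-camera model and the focal length expressed in pixels are illustrative assumptions not specified in this disclosure:

```python
import math

def distance_from_emitter_pair(baseline_m, px1, px2, focal_px):
    """Estimate camera-to-emitter-pair distance by similar triangles.

    baseline_m: known a priori separation between the two emitters (meters).
    px1, px2:   pixel coordinates at which the emitters are imaged.
    focal_px:   camera focal length in pixels (pinhole-model assumption).

    Assumes the emitter pair is roughly perpendicular to the camera's
    optical axis, so Z ~= f * B / (pixel separation).
    """
    pixel_separation = math.dist(px1, px2)
    if pixel_separation == 0:
        raise ValueError("emitters project to the same pixel")
    return focal_px * baseline_m / pixel_separation

# Two emitters 0.25 m apart, imaged 50 px apart with f = 800 px -> 4.0 m
```

In practice a calibrated camera model would replace the fronto-parallel simplification, but the principle is the same: a known physical baseline plus its apparent image size yields range.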
[0023] The camera field-of-view (FoV) of each camera is identified in
[0024] The processing of this relative positioning data may be accomplished in a few different ways:
[0025] In one embodiment, each subsystem may include a processor having a memory storing a computer program that includes instructions executable by the processor to receive image data corresponding to images captured by that subsystem's camera, to execute an algorithm to detect light from emitters of neighboring subsystems in the captured images, and to perform the relative position calculation of all subsystems that it can see in a coordinate system relative to itself. In this embodiment, the processor then publishes that data onto the network for other subsystems to digest, or publishes that data to some other processing unit that aggregates this information into an overall positioning calculation.
[0026] In other cases, the data from each camera may be sent (as raw image data, or as digitally processed data) to a central processing unit having a memory storing a computer program that includes instructions executable to receive the image data, aggregate the data from each subsystem, and then calculate the relative positioning of each base in an overall global coordinate system.
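The central aggregation step can be sketched as a breadth-first composition of pairwise pose measurements into one global coordinate system; the use of planar (x, y, heading) poses and this particular graph representation are simplifying assumptions for illustration:

```python
from collections import deque
import math

def compose(a, b):
    """Compose two planar poses a∘b, each (x, y, theta in radians)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def globalize(edges, root):
    """edges: {(i, j): measured pose of subsystem j in i's frame}.
    Returns the pose of each reachable subsystem base in the root
    subsystem's coordinate system."""
    # Make the measurement graph bidirectional by inverting each edge.
    adj = {}
    for (i, j), p in edges.items():
        adj.setdefault(i, []).append((j, p))
        x, y, t = p
        inv = (-x * math.cos(t) - y * math.sin(t),
               x * math.sin(t) - y * math.cos(t),
               -t)
        adj.setdefault(j, []).append((i, inv))
    poses = {root: (0.0, 0.0, 0.0)}
    q = deque([root])
    while q:
        i = q.popleft()
        for j, p in adj.get(i, []):
            if j not in poses:
                poses[j] = compose(poses[i], p)
                q.append(j)
    return poses
```

A real implementation would likely operate on full 3-D rigid transforms and reconcile redundant measurements (e.g., by least-squares), but chaining pairwise camera-to-emitter measurements through a common root is the core idea.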
[0027] The emitters may be configured to allow the system to determine which subsystem a given emitter or emitter set or array is positioned on based on the received image data. Memory associated with the processor that determines the relative positioning stores information correlating the identifying characteristics of an emitter or emitter array with its corresponding subsystem component. There are many different ways in which the emitters or emitter arrays can be uniquely identifiable by the system. For example, relative distance between multiple emitters on a single subsystem may be used to differentiate between different subsystems or subsystem types. For instance, each subsystem might have a different pattern of emitter arrays. One such example is shown in
[0028] In other embodiments, blink sequences may serve as differentiators. In
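A blink-sequence differentiator might be decoded as sketched below; the code period, binary encoding, and majority vote are illustrative assumptions, as the disclosure does not specify a particular coding scheme:

```python
def decode_blink_id(frames, period=8):
    """Recover an emitter ID from a repeating on/off blink pattern.

    frames: per-frame on/off samples for one tracked blob.
    period: length of the repeating code in frames (assumed value).

    Majority-votes each code position across repetitions so that an
    occasional corrupted frame does not change the decoded ID.
    """
    votes = [0] * period
    for k, sample in enumerate(frames):
        votes[k % period] += 1 if sample else -1
    bits = [1 if v > 0 else 0 for v in votes]
    return int("".join(map(str, bits)), 2)
```

The decoded integer would then be looked up in the memory described in paragraph [0027] to identify the corresponding subsystem component.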
[0029] As discussed with respect to
[0030] The described embodiment lends itself well to subsystem arrangements lacking comprehensive visibility, i.e., where it may not be possible for a single camera/tracker's view to have overall coverage and be able to see every other subsystem's emitters as discussed with respect to
[0031] Although the
[0036] Moreover, emitters and/or tracking cameras may be positioned on any portion of the system or other features within the operating room, including the manipulator arms, surgeon console, boom lights, laparoscopic towers, carts, the ceiling, a floor mounted structure, the operating table, anesthesia equipment, IV stands, etc.
[0037] Emitter/Tracker Positioning
[0038] In some embodiments, some or all of the various subsystems of the surgical system may be equipped with multiple emitters (and/or trackers) to enhance visibility and enable robust tracking of the respective positions of the robotic system components.
[0039] To permit differentiation between different ones of the emitters on a single subsystem, each emitter on the subsystem may have differentiating features of the type described above, such as different blink sequences and/or colors or color patterns.
[0040] Using a stereo pair of trackers/cameras with a known distance between them viewing a single emitter, it is possible to determine the relative distance from the cameras to any emitter.
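The stereo arrangement described above reduces to the standard disparity relation; the rectified-camera model and pixel focal length are assumptions for illustration:

```python
def stereo_depth(baseline_m, focal_px, u_left, u_right):
    """Depth of a single emitter seen by a stereo tracker pair.

    baseline_m: known distance between the two cameras (meters).
    focal_px:   shared focal length in pixels (assumes rectified,
                identical cameras, an illustrative simplification).
    u_left, u_right: horizontal pixel coordinate of the emitter in
                each image; their difference is the disparity.

    Z = f * B / disparity.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("non-positive disparity: emitter not in front of the pair")
    return focal_px * baseline_m / disparity
```

Note the symmetry with the single-camera case of paragraph [0022]: there the known baseline is between two emitters, here it is between two cameras.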
[0041] Using a single camera, viewing at least two emitters of known separation is also able to provide triangulation information.
[0042] If multiple emitters are installed around a robotic manipulator base, a tracker/camera/set of cameras is then able to determine which side of the manipulator arm is in view simply from the IDs of the emitters that are visible.
[0043] Emitters may be placed on either or both of a fixed portion of the manipulator arm, or a moveable portion of the manipulator arm. In the example shown in
[0044] A second emitter or collection of emitters 106 may be positioned on a moveable part of the manipulator arm. In the drawings, the emitters 106 are shown on boom 102, but in other embodiments they might be on another moveable part of the manipulator arm. It should be understood that while
[0045] Where one or more emitters 104 are on a fixed portion of the manipulator, and one or more emitters 106 are on a portion of the manipulator that moves during surgery, the separation between the emitters 104 and the emitters 106 may be determined using the kinematic data from the relevant manipulator joints and the associated transformations.
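The kinematic determination of the separation between a fixed emitter 104 and a moving emitter 106 can be sketched with a toy planar chain; the two-joint geometry and parameter names are illustrative stand-ins for the manipulator's actual kinematic model:

```python
import math

def emitter_separation(joint_angles, link_lengths,
                       fixed_emitter_xy, boom_emitter_offset):
    """Separation between a base-mounted emitter and a boom-mounted one.

    Runs planar forward kinematics over the joints (a simplified
    stand-in for the manipulator's real kinematic transformations),
    places the boom emitter at a fixed offset in the last link's
    frame, and returns its Euclidean distance to the fixed emitter.
    """
    x, y, theta = 0.0, 0.0, 0.0
    for q, l in zip(joint_angles, link_lengths):
        theta += q
        x += l * math.cos(theta)
        y += l * math.sin(theta)
    # Boom emitter sits at a fixed offset in the last link's frame.
    ox, oy = boom_emitter_offset
    bx = x + ox * math.cos(theta) - oy * math.sin(theta)
    by = y + ox * math.sin(theta) + oy * math.cos(theta)
    return math.dist((bx, by), fixed_emitter_xy)
```

With the separation known at every joint configuration, the moving emitter can serve as one end of the triangulation baseline just as a fixed emitter would.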
[0046] To enhance visibility and to minimize line-of-sight occlusions, it may be advantageous to mount a tracker/camera to the moveable portion of the manipulator arm, such as on the boom as shown. However, where the camera is mounted to a moving portion of the manipulator, it is necessary to account for the movements of the camera (i.e., using kinematic data from the relevant manipulator joints) when performing the triangulation calculations.
[0047] The use of multiple emitters on the various subsystems lends itself well to understanding the relative positions of the subsystems even in occluding environments. If only one emitter of a subsystem (e.g., arm 14) is visible to a camera of a second subsystem (e.g., arm 15), but a second emitter of arm 14 is visible to a camera of a third subsystem (arm 13), the position of arm 14 relative to arms 15 and 13 can be determined. Since the spatial relationship between the two emitters on arm 14 is known (because they are at a fixed distance from one another, or because their spatial relationship can be determined using the known kinematics of arm 14), then the location of arm 14 can be determined using the known relative positions of arms 15 and 13.
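The occlusion scenario above can be sketched as fusing two single-emitter fixes; the planar coordinates, the base-relative emitter offsets expressed in the global frame, and the simple averaging step are all illustrative assumptions:

```python
def locate_arm(emitter_a_in_global, emitter_b_in_global, a_to_base, b_to_base):
    """Fuse two single-emitter fixes into one base-position estimate.

    Cameras on two other arms each localize a different emitter of the
    occluded arm. Known base-relative emitter offsets (from the arm's
    fixed geometry or its kinematics, here pre-rotated into the global
    frame for simplicity) map each fix to a base estimate, and the two
    estimates are averaged.
    """
    est_a = (emitter_a_in_global[0] + a_to_base[0],
             emitter_a_in_global[1] + a_to_base[1])
    est_b = (emitter_b_in_global[0] + b_to_base[0],
             emitter_b_in_global[1] + b_to_base[1])
    return ((est_a[0] + est_b[0]) / 2, (est_a[1] + est_b[1]) / 2)
```

A fuller treatment would solve for the arm's orientation as well, but the sketch shows how two partial views, neither sufficient alone, jointly localize the occluded base.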
[0048] With the disclosed system, the relative base positions of the subsystems may be determined using emitters on fixed portions of the tracked subsystems or on tracked moving portions. Kinematic data from each subsystem may be used to determine movements of that subsystem's joints and their impacts on the cameras or emitters. An internal transformation between the base and the moving camera(s)/emitter(s) on that subsystem is performed. Knowledge of the positions of the emitters/cameras for each subsystem then allows the relative positions of each subsystem base to be determined using triangulation as described above.
[0049] In the disclosed embodiments, the camera may comprise a tracking camera unit (TCU), which may be a single camera/imager or a set of cameras/imagers. Referring to the top and front views of
[0050] All patents and applications referred to herein, including for purposes of priority, are incorporated herein by reference.