Patent classifications
G05B2219/45123
Rendering tool information as graphic overlays on displayed images of tools
An operator telerobotically controls tools to perform a procedure on an object at a work site while viewing real-time images of the work site on a display. Tool information is provided in the operator's current gaze area on the display by rendering it over the tool, so that it neither obscures the objects currently being worked on by the tool nor requires the user's eyes to refocus between the tool information and the image of the tool on a stereo viewer.
Techniques for detecting errors or loss of accuracy in a surgical robotic system
Systems and methods for detecting an error in a surgical system. The surgical system includes a manipulator with a base and a plurality of links and the manipulator supports a surgical tool. The system includes a navigation system with a tracker and a localizer to monitor a state of the tracker. Controller(s) determine values of a first transform between a state of the base of the manipulator and a state of one or both of the localizer and the tracker of the navigation system. The controller(s) determine values of a second transform between the state of the localizer and the state of the tracker. The controller(s) combine values of the first transform and the second transform to determine whether an error has occurred relating to one or both of the manipulator and the localizer.
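The abstract above describes combining two measured transforms (base-to-localizer and localizer-to-tracker) and checking the result for consistency. A minimal sketch of that idea, assuming homogeneous 4x4 transforms and hypothetical tolerance values (the function name and thresholds are illustrative, not from the patent):

```python
import numpy as np

def detect_transform_error(T_base_localizer, T_localizer_tracker,
                           T_base_tracker_expected,
                           pos_tol=0.001, ang_tol=0.01):
    """Flag an error when the combined measured transforms disagree
    with the kinematically expected base-to-tracker transform."""
    # Combine the first transform (base -> localizer) with the second
    # (localizer -> tracker) to get a measured base -> tracker transform.
    T_measured = T_base_localizer @ T_localizer_tracker
    # Residual transform: identity when measurement matches expectation.
    T_res = np.linalg.inv(T_base_tracker_expected) @ T_measured
    pos_err = np.linalg.norm(T_res[:3, 3])
    # Rotation angle of the residual, recovered from its trace.
    cos_ang = (np.trace(T_res[:3, :3]) - 1.0) / 2.0
    ang_err = np.arccos(np.clip(cos_ang, -1.0, 1.0))
    return bool(pos_err > pos_tol or ang_err > ang_tol)
```

Composing the two transforms this way lets the check span both the manipulator and the localizer: a fault in either shows up as a nonzero residual.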
Synthetic representation of a surgical robot
A system comprises a first robotic arm adapted to support and move a tool and a second robotic arm adapted to support and move a camera. The system also comprises an input device, a display, and a processor. The processor is configured to, in a first mode, command the second robotic arm to move the camera in response to a first input received from the input device to capture an image of the tool and present the image as a displayed image on the display. The processor is configured to, in a second mode, display a synthetic image of the first robotic arm in a boundary area around the captured image on the display, and in response to a second input, change a size of the boundary area relative to a size of the displayed image.
Tool position and identification indicator displayed in a boundary area of a computer display screen
An apparatus comprises a memory device and a processor coupled to a display device, an image capture device, and the memory device. The processor is configured to: cause images captured by the image capture device to be displayed in a viewing area on the display device; determine a position of a tool in a reference frame of the image capture device; determine a position to display a non-depictive symbol for the tool in a boundary area circumscribing the viewing area to indicate a direction of the determined position of the tool relative to a field of view of the image capture device, by determining a trajectory of the tool; and cause the non-depictive symbol to be displayed at the determined position in the boundary area while images that were captured by the image capture device are restricted to being displayed in the viewing area.
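The placement step above — positioning a symbol on the border of the viewing area to indicate the direction of an out-of-view tool — can be sketched as follows. This is a simplified 2D illustration assuming image-plane coordinates centered on the viewing area; the function name and geometry are assumptions, not the patent's implementation:

```python
def boundary_symbol_position(tool_x, tool_y, view_w, view_h):
    """Return the point on the viewing-area border in the direction of
    the tool from the view center, or None if the tool is in view."""
    half_w, half_h = view_w / 2.0, view_h / 2.0
    if abs(tool_x) <= half_w and abs(tool_y) <= half_h:
        return None  # tool is within the field of view; no border symbol
    # Scale the center-to-tool vector so it lands exactly on the border.
    scale = min(half_w / abs(tool_x) if tool_x else float("inf"),
                half_h / abs(tool_y) if tool_y else float("inf"))
    return (tool_x * scale, tool_y * scale)
```

A tool far to the right of a 100x100 viewing area, for example, maps to a symbol on the right edge, directly in line with the tool's direction.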
Tool position and identification indicator displayed in a boundary area of a computer display screen
An endoscope captures images of a surgical site for display in a viewing area of a monitor. When a tool is outside the viewing area, a GUI indicates the position of the tool by positioning a symbol in a boundary area around the viewing area so as to indicate the tool position. The distance of the out-of-view tool from the viewing area may be indicated by the size, color, brightness, or blinking or oscillation frequency of the symbol. A distance number may also be displayed on the symbol. The orientation of the shaft or end effector of the tool may be indicated by an orientation indicator superimposed over the symbol, or by the orientation of the symbol itself. When the tool is inside the viewing area, but occluded by an object, the GUI superimposes a ghost tool at its current position and orientation over the occluding object.
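The abstract lists several visual encodings of the out-of-view distance (size, color, brightness, blink frequency). A minimal sketch of one such mapping, with assumed pixel sizes, blink rates, and a hypothetical maximum distance (none of these values come from the patent):

```python
def symbol_attributes(distance, max_distance=0.2):
    """Encode an out-of-view tool's distance in the symbol's size and
    blink rate: nearby tools get a large, slowly blinking symbol."""
    frac = min(distance / max_distance, 1.0)
    size_px = 10 + 30 * (1.0 - frac)   # 40 px when adjacent, 10 px when far
    blink_hz = 1.0 + 4.0 * frac        # blink faster as the tool recedes
    return size_px, blink_hz
```

Any monotonic mapping works; the point is that the operator can judge distance at a glance without the tool being visible.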
Multi-degrees-of-freedom hand controller
Disclosed is a controller including a first control member, a second control member that extends from a portion of the first control member, and a controller processor that is operable to produce a rotational movement output signal in response to movement of the first control member, and a translational movement output signal in response to movement of the second control member relative to the first control member. The rotational movement output signal may be any of a pitch movement output signal, a yaw movement output signal, and a roll movement output signal, and the translational movement output signal may be any of an x-axis movement output signal, a y-axis movement output signal, and a z-axis movement output signal. In exemplary embodiments, the first control member may be gripped and moved using a single hand, and the second control member may be moved using one or more digits of the same hand, thus permitting highly intuitive, single-handed control of multiple degrees of freedom, up to and including all six degrees of rotational and translational freedom, without any inadvertent cross-coupling of inputs.
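The core of the design above is that rotation and translation come from physically separate members, so the two channels never cross-couple. A minimal sketch of that signal mapping, with hypothetical names (the patent does not specify a software interface):

```python
from dataclasses import dataclass

@dataclass
class HandControllerOutput:
    pitch: float; yaw: float; roll: float   # from the first control member
    x: float; y: float; z: float            # from the second control member

def process_controller(first_member_rotation, second_member_displacement):
    """Keep the channels decoupled: the first member's rotation drives
    pitch/yaw/roll, while the second member's displacement relative to
    the first drives x/y/z. Neither input can affect the other channel."""
    pitch, yaw, roll = first_member_rotation
    x, y, z = second_member_displacement
    return HandControllerOutput(pitch, yaw, roll, x, y, z)
```

Because each output field depends on exactly one input channel, moving the second member can never produce a rotational signal, and vice versa.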
Synthetic representation of a surgical robot
A synthetic representation of a robot tool for display on a user interface of a robotic system. The synthetic representation may be used to show the position of a view volume of an image capture device with respect to the robot. The synthetic representation may also be used to find a tool that is outside of the field of view, to display range of motion limits for a tool, to remotely communicate information about the robot, and to detect collisions.
Techniques for detecting errors or loss of accuracy in a surgical robotic system
Systems and methods for operating a robotic surgical system are provided. The system includes a surgical tool, a manipulator comprising links for controlling the tool, and a navigation system comprising a tracker coupled to the manipulator or the tool and a localizer to monitor a state of the tracker. Controller(s) determine a raw (or lightly filtered) relationship between one or more components of the manipulator and one or more components of the navigation system by utilizing one or more of raw kinematic measurement data from the manipulator and raw navigation data from the navigation system. The controller(s) utilize the raw (or lightly filtered) relationship to determine whether an error has occurred relating to at least one of the manipulator and the navigation system.
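The "raw (or lightly filtered)" wording above suggests comparing kinematic and navigation measurements with minimal smoothing, so transient faults are not masked by heavy filtering. A sketch of one way to realize that, using an exponential filter whose high weight on the newest sample keeps the estimate close to the raw data (class name, weight, and threshold are illustrative assumptions):

```python
import math

class ResidualMonitor:
    """Track the residual between raw kinematic poses and raw navigation
    poses with only light exponential smoothing, flagging an error when
    the residual exceeds a tolerance."""
    def __init__(self, alpha=0.8, threshold=0.002):
        self.alpha = alpha          # weight on the newest raw residual
        self.threshold = threshold  # tolerance in the same units as input
        self.residual = 0.0

    def update(self, kinematic_pos, navigation_pos):
        raw = math.dist(kinematic_pos, navigation_pos)
        self.residual = self.alpha * raw + (1 - self.alpha) * self.residual
        return self.residual > self.threshold
```

With alpha near 1, a single divergent sample dominates the estimate, so the monitor reacts within one or two updates instead of averaging the discrepancy away.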