A61B34/76

Robotic surgical pedal with integrated foot sensor
11547500 · 2023-01-10

A foot pedal assembly for controlling a robotic surgical system. The foot pedal assembly includes a foot pedal base, a foot pedal, and a sensor. The foot pedal moves relative to the foot pedal base and has a contact surface extending from a distal end to a proximal end of the foot pedal. The contact surface is configured to come into contact with a foot of a user during use of the foot pedal assembly for controlling the robotic surgical system, and the distal end is farther from a heel of the foot than the proximal end during such use. The sensor is coupled to the contact surface of the foot pedal at a position closer to the proximal end than the distal end, and the sensor is operable to sense a target object positioned a distance over the contact surface.
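
As one illustration of the sensing idea, the following Python sketch gates pedal output on whether a foot is detected over the contact surface; the distance threshold, sensor reading, and pedal-to-command mapping are hypothetical and are not taken from the abstract.

FOOT_PRESENT_MAX_MM = 60.0  # assumed maximum hover distance that still counts as a foot

def pedal_command(pedal_depression: float, sensed_distance_mm: float) -> float:
    """Return a scaled control command, or 0.0 when no foot is sensed over
    the proximal end of the contact surface."""
    if sensed_distance_mm > FOOT_PRESENT_MAX_MM:
        return 0.0  # ignore depression not caused by the user's foot
    return max(0.0, min(1.0, pedal_depression))

# Pedal pressed 40% with a foot hovering 25 mm over the sensor vs. no foot present.
print(pedal_command(0.40, 25.0))   # -> 0.4
print(pedal_command(0.40, 200.0))  # -> 0.0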

Fluid collecting sheaths for endoscopic devices and systems
11547782 · 2023-01-10

A system includes an endoscope defining a proximal end, a distal end and an elongated shaft extending between the proximal end and the distal end of the endoscope. A fluid collecting sheath defines a proximal end and a distal end. The fluid collecting sheath is configured for insertion into a vaginal opening. The fluid collecting sheath includes fluid collecting apertures defined at the distal end of the fluid collecting sheath. A fluid line is in fluid communication with the fluid collecting apertures. A channel is formed in the fluid collecting sheath. The channel extends between the proximal end and the distal end of the fluid collecting sheath. The channel defines an opening therein. The channel of the fluid collecting sheath is configured to operably engage the elongated shaft of the endoscope by passing the elongated shaft of the endoscope through the opening of the channel.

Methods for performing medical procedures using a surgical robot
11690687 · 2023-07-04

Embodiments are directed to a medical robot system including a robot coupled to an end-effectuator element, with the robot configured to control movement and positioning of the end-effectuator in relation to a patient. One embodiment is a method for removing bone with a robot system, comprising: taking a two-dimensional slice through a computed tomography scan volume of target anatomy; placing a perimeter on a pathway to the target anatomy; and controlling a drill assembly with the robot system to remove bone along the pathway within the intersection of the perimeter and the two-dimensional slice.
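
The slice-and-perimeter step can be pictured with a short Python sketch; the circular perimeter, the toy CT slice, and the bone intensity threshold are assumptions made for illustration, not the patented method.

import numpy as np

def removal_mask(ct_slice: np.ndarray, center: tuple, radius_px: float,
                 bone_threshold_hu: float = 300.0) -> np.ndarray:
    """Return a boolean mask of voxels the drill may remove: inside the
    planned perimeter AND above the bone intensity threshold."""
    rows, cols = np.indices(ct_slice.shape)
    inside_perimeter = (rows - center[0])**2 + (cols - center[1])**2 <= radius_px**2
    is_bone = ct_slice >= bone_threshold_hu
    return inside_perimeter & is_bone

# Toy 100x100 slice with a bony region in Hounsfield-like units.
slice_hu = np.zeros((100, 100))
slice_hu[30:70, 30:70] = 800.0
mask = removal_mask(slice_hu, center=(50, 50), radius_px=15)
print(mask.sum(), "voxels fall inside the planned removal region")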

User interface devices for use in robotic surgery

A handheld user interface device for controlling a robotic system may include a member, a housing at least partially disposed around the member and configured to be held in the hand of a user, and a tracking sensor system disposed on the member and configured to detect at least one of position and orientation of at least a portion of the device. At least one of the detected position of the portion of the device and detected orientation of the portion of the device is correlatable to a control of the robotic system.
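
A minimal Python sketch of one way the detected pose could be correlated to a control of the robotic system, assuming a simple clutched position-delta mapping; the scaling factor and the 3-vector position representation are illustrative assumptions, not the claimed design.

SCALE = 0.25  # assumed motion scaling between hand motion and tool motion

def tool_increment(prev_device_pos, curr_device_pos, clutch_engaged: bool):
    """Correlate the change in detected device position to a tool-tip increment."""
    if not clutch_engaged:
        return (0.0, 0.0, 0.0)  # device motion is ignored while unclutched
    return tuple(SCALE * (c - p) for p, c in zip(prev_device_pos, curr_device_pos))

# 2 cm of hand travel along x maps to 5 mm of tool travel.
print(tool_increment((0.0, 0.0, 0.0), (0.02, 0.0, -0.01), clutch_engaged=True))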

Use of eye tracking for tool identification and assignment in a robotic surgical system
11690677 · 2023-07-04

A robotic surgical system includes an eye gaze sensing system in conjunction with a visual display of a camera image from a surgical work site. Detected gaze of a surgeon towards the display is used as input to the system. This input may be used by the system to assign an instrument to a control input device (when the user is prompted to look at the instrument), or it may be used as seeding input to a computer vision algorithm to aid in object differentiation, facilitating identification/differentiation of instruments, anatomical features, or regions.
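
One possible form of the gaze-based assignment step is sketched below in Python; the screen-space instrument regions, the hit test, and the controller naming are hypothetical details introduced for illustration, not the claimed algorithm.

INSTRUMENT_REGIONS = {              # assumed screen-space bounding boxes (x0, y0, x1, y1)
    "needle_driver": (0.05, 0.20, 0.30, 0.60),
    "grasper":       (0.65, 0.25, 0.95, 0.70),
}

def assign_instrument(gaze_xy, controller_id, assignments):
    """Assign the instrument whose display region contains the gaze point."""
    gx, gy = gaze_xy
    for name, (x0, y0, x1, y1) in INSTRUMENT_REGIONS.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            assignments[controller_id] = name
            return name
    return None  # gaze did not land on any instrument; keep the prior assignment

assignments = {}
assign_instrument((0.72, 0.40), "left_hand_uid", assignments)
print(assignments)  # {'left_hand_uid': 'grasper'}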

Surgical arm system with internally driven gear assemblies

Example embodiments relate to robotic arm assemblies. The robotic arm assembly includes forearm and upper arm segments. The upper arm segment includes a distal motor. The robotic arm assembly includes an elbow coupling joint assembly connecting the distal end of the upper arm segment to the proximal end of the forearm segment via a serial arrangement of proximal and distal elbow joints. The proximal elbow joint is located between the upper arm segment and the distal elbow joint. The distal elbow joint is located between the proximal elbow joint and the forearm segment. The proximal elbow joint forms a proximal main elbow axis, and the distal elbow joint forms a distal main elbow axis. The elbow coupling joint assembly includes a distal elbow joint subassembly connected to the forearm segment and a proximal elbow joint subassembly connecting the upper arm segment to the distal elbow joint subassembly. The proximal elbow joint subassembly is configured to be driven to rotate the forearm segment relative to the proximal main elbow axis.

Mobile virtual reality system for surgical robotic systems

A mobile virtual reality system for simulation, training, or demonstration of a surgical robotic system can include a virtual reality processor. The processor can generate a virtual surgical robot and render the virtual surgical robot on a display. The virtual surgical robot can include one or more virtual surgical tools. One or more handheld user input devices (UIDs) can sense a hand input from a hand. A foot input device can sense a foot input from a foot. The virtual reality processor can be configured to control a movement or action of the virtual surgical robot based on the hand input, and to change which of the virtual surgical tools is controlled by the one or more handheld UIDs based on the foot input. Other embodiments and aspects are disclosed and claimed.
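
The foot-driven switch between virtual tools could look roughly like the following Python sketch; the class, tool names, and the cycling behavior are invented for illustration and are not taken from the disclosure.

class VirtualRobotSim:
    def __init__(self, tools):
        self.tools = tools           # e.g. ["scissors", "grasper"]
        self.active = 0              # index of the tool driven by the handheld UID

    def on_foot_input(self, pressed: bool):
        if pressed:                  # a foot press swaps which tool the hand controls
            self.active = (self.active + 1) % len(self.tools)

    def on_hand_input(self, delta_pose):
        return f"move {self.tools[self.active]} by {delta_pose}"

sim = VirtualRobotSim(["scissors", "grasper"])
print(sim.on_hand_input((1, 0, 0)))   # scissors move
sim.on_foot_input(True)
print(sim.on_hand_input((1, 0, 0)))   # grasper moves after the foot input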

GRASPING WORK DETERMINATION AND INDICATIONS THEREOF

A surgical system is disclosed. The surgical system comprises an end effector configured to move through a grasping motion, a motor configured to drive the grasping motion, an encoder configured to detect rotary positions, a load sensor configured to detect loads delivered, a position sensor configured to detect three-dimensional positions of the end effector, and a control circuit configured to receive a position parameter, a rotary parameter, and a load parameter, store the position parameter at the outset of the grasping motion, calculate an amount of work performed during the grasping motion while the position sensor detects the position of the end effector within a three-dimensional zone around the stored position parameter, transmit a work signal indicative of the amount of work performed, and reset the calculation of the amount of work performed when the position sensor detects a displacement of the end effector out of the three-dimensional zone.
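
A rough Python sketch of the work calculation and zone-based reset described above; the zone radius, the sampling scheme, and the units are assumptions rather than details from the disclosure.

import math

ZONE_RADIUS_MM = 10.0  # assumed size of the three-dimensional zone

class GraspWorkMonitor:
    def start_grasp(self, tip_position_mm):
        self.anchor = tip_position_mm   # position parameter stored at the outset
        self.work_joules = 0.0
        self.prev_travel_m = 0.0

    def update(self, tip_position_mm, jaw_travel_m, load_newtons):
        # Reset the calculation if the tip has left the zone around the stored position.
        if math.dist(tip_position_mm, self.anchor) > ZONE_RADIUS_MM:
            self.start_grasp(tip_position_mm)
            return self.work_joules
        # Incremental work = load * incremental jaw displacement.
        self.work_joules += load_newtons * (jaw_travel_m - self.prev_travel_m)
        self.prev_travel_m = jaw_travel_m
        return self.work_joules          # "work signal" to transmit

mon = GraspWorkMonitor()
mon.start_grasp((0.0, 0.0, 0.0))
print(mon.update((1.0, 0.0, 0.0), jaw_travel_m=0.002, load_newtons=5.0))  # 0.01 J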

METHOD, APPARATUS AND SYSTEM FOR CONTROLLING AN IMAGE CAPTURE DEVICE DURING SURGERY

A system for controlling a medical image capture device during surgery is provided, the system including circuitry configured to acquire first image data from the medical image capture device, the first image data being of an appearance of a surgical scene at a first instance of time; determine, based on a predicted appearance of the surgical scene based on the first image data at a second instance of time after the first instance of time, one or more desired image capture properties of the medical image capture device; and control the medical image capture device at a third instance of time, the third instance of time being between the first instance of time and the second instance of time, in accordance with the one or more desired image capture properties of the medical image capture device.
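
As a loose illustration of predicting the scene's appearance at a later instant and acting on it earlier, the Python sketch below extrapolates mean image brightness and picks an exposure gain; the linear prediction and the gain formula are assumptions, not the claimed control scheme.

TARGET_BRIGHTNESS = 0.5  # assumed desired mean image brightness (0..1)

def predict_brightness(b_t0, b_t1, t0, t1, t2):
    """Linearly extrapolate mean brightness from two observed frames to time t2."""
    rate = (b_t1 - b_t0) / (t1 - t0)
    return b_t1 + rate * (t2 - t1)

def exposure_gain_for(predicted_brightness, current_gain):
    """Pick the gain to apply at t3 so the frame near t2 lands at the target brightness."""
    predicted_brightness = max(predicted_brightness, 1e-3)
    return current_gain * TARGET_BRIGHTNESS / predicted_brightness

b_pred = predict_brightness(b_t0=0.50, b_t1=0.40, t0=0.0, t1=0.1, t2=0.3)
print(round(b_pred, 2), round(exposure_gain_for(b_pred, current_gain=1.0), 2))
# -> 0.2 predicted brightness, so the gain is raised to 2.5 before the scene darkens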

SYSTEMS AND METHODS FOR IDENTIFYING AND FACILITATING AN INTENDED INTERACTION WITH A TARGET OBJECT IN A SURGICAL SPACE

An exemplary system includes a memory storing instructions and a processor communicatively coupled to the memory. The processor may be configured to execute the instructions to: detect an intent of a user of a computer-assisted surgical system to use a robotic instrument attached to the computer-assisted surgical system to interact with a target object while the target object is located in a surgical space; determine a pose of the target object in the surgical space; and perform, based on the detected intent of the user to interact with the target object and the determined pose of the target object in the surgical space, an operation with respect to the target object.
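
A hypothetical Python sketch of the detect-intent / determine-pose / perform-operation sequence; the dwell threshold, the distance test, and the chosen operation (highlighting the target) are invented for illustration and do not describe the system's actual behavior.

APPROACH_MM = 20.0   # assumed instrument-to-target distance implying intent
DWELL_S = 1.5        # assumed gaze dwell time implying intent

def detect_intent(gaze_dwell_s, instrument_to_target_mm):
    """Infer intent to interact from gaze dwell and instrument proximity."""
    return gaze_dwell_s >= DWELL_S and instrument_to_target_mm <= APPROACH_MM

def perform_operation(target_pose, intent_detected):
    """Perform an example operation (highlighting) based on intent and pose."""
    if not intent_detected:
        return "no action"
    position, orientation = target_pose
    return f"highlight target at {position}, oriented {orientation}"

pose = ((12.0, -3.5, 88.0), (0.0, 0.0, 0.707, 0.707))   # toy pose (mm, quaternion)
print(perform_operation(pose, detect_intent(gaze_dwell_s=2.0, instrument_to_target_mm=15.0)))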