A61B2034/254

CONTROL ACCESS VERIFICATION OF A HEALTH CARE PROFESSIONAL

A computing system may identify a surgical instrument for a surgical procedure in an operating room (OR). The computing system may detect a control input by a health care professional (HCP) to control the surgical instrument. The computing system may determine the HCP's access control level associated with the surgical instrument and whether the HCP has authorization to control the surgical instrument. If the computing system determines that the HCP is unauthorized to control the surgical instrument based on the access control level associated with the HCP, the computing system may block the control input by the HCP. If the computing system determines that the HCP is authorized to control the surgical instrument based on the access control level associated with the HCP, the computing system may effectuate the control input by the HCP to control the surgical instrument.
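The authorize-or-block flow described above can be sketched as a small lookup against per-HCP access levels and per-instrument requirements. This is an illustrative sketch only; the class name, method names, and the numeric-level scheme are assumptions, not details taken from the abstract.

```python
# Hypothetical sketch of the access-control check described above.
# Access levels and required levels are illustrative assumptions.
class SurgicalAccessController:
    def __init__(self, access_levels, required_levels):
        # access_levels: HCP id -> access control level (higher = broader access)
        # required_levels: instrument id -> minimum level needed to control it
        self.access_levels = access_levels
        self.required_levels = required_levels

    def handle_control_input(self, hcp_id, instrument_id):
        """Return 'effectuate' if the HCP is authorized, else 'block'."""
        level = self.access_levels.get(hcp_id, 0)
        required = self.required_levels.get(instrument_id)
        if required is not None and level >= required:
            return "effectuate"
        return "block"
```

An unknown HCP defaults to level 0 here, so unrecognized control inputs are blocked rather than passed through; that fail-closed choice is an assumption consistent with the abstract's blocking behavior.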

System and method of utilizing computer-aided identification with medical procedures
11510742 · 2022-11-29

The disclosure provides a system that may receive an identification of a first patient and a first template that includes multiple locations associated with the face of the first patient and associated with that identification. The system may determine multiple locations associated with the face of a current patient, determine a second template of the current patient's face based at least on those locations, and determine whether the first template matches the second template. If the first template matches the second template, the system may provide an indication that the current patient has been correctly identified as the first patient; if the templates do not match, the system may provide an indication that the current patient has not been identified.
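The template comparison described above can be sketched as a mean point-to-point distance test over corresponding facial landmark locations. The distance metric, tolerance value, and function names are assumptions for illustration; the patent does not specify the matching criterion.

```python
import math

# Illustrative sketch: match two facial-landmark templates by the mean
# distance between corresponding points. Tolerance is an assumed value.
def templates_match(first_template, second_template, tolerance=5.0):
    """Each template is a list of (x, y, z) landmark locations in a common
    coordinate frame; templates match when the mean distance between
    corresponding landmarks is within the tolerance."""
    if len(first_template) != len(second_template):
        return False
    distances = [math.dist(a, b) for a, b in zip(first_template, second_template)]
    return sum(distances) / len(distances) <= tolerance

def identify_patient(first_template, second_template):
    if templates_match(first_template, second_template):
        return "current patient correctly identified as the first patient"
    return "current patient has not been identified"
```

A real implementation would also need to register the two templates into a common frame before comparing; that step is omitted from this sketch.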

Surgical system with combination of sensor-based navigation and endoscopy

A set of pre-operative images may be captured of an anatomical structure using an endoscopic camera. Each captured image is associated with a position and orientation of the camera at the moment of capture using image guided surgery (IGS) techniques. This image data and position data may be used to create a navigation map of captured images. During a surgical procedure on the anatomical structure, a real-time endoscopic view may be captured and displayed to a surgeon. The IGS navigation system may determine the position and orientation of the real-time image and select an appropriate pre-operative image from the navigation map to display to the surgeon in addition to the real-time image.
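The image-selection step above amounts to a nearest-pose lookup over the navigation map. The sketch below assumes a simplified pose of (position, view angle) and a weighted distance; the actual pose representation and selection criterion are not specified in the abstract.

```python
import math

# Hedged sketch: select the pre-operative image whose recorded camera pose
# is closest to the current real-time pose. The pose representation and
# angle weighting are illustrative assumptions.
def select_preoperative_image(navigation_map, current_pose, angle_weight=0.1):
    """navigation_map: list of (image_id, position, orientation_deg) entries,
    where position is an (x, y, z) tuple and orientation_deg is a single
    view angle. current_pose is a (position, orientation_deg) pair.
    Returns the image_id with the smallest combined pose distance."""
    current_position, current_orientation = current_pose

    def pose_distance(entry):
        _, position, orientation = entry
        return (math.dist(position, current_position)
                + angle_weight * abs(orientation - current_orientation))

    return min(navigation_map, key=pose_distance)[0]
```

A production system would use a full 6-DOF pose and likely a spatial index instead of a linear scan, but the selection logic has the same shape.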

Rotary motion passive end effector for surgical robots in orthopedic surgeries
11510684 · 2022-11-29

A passive end effector of a surgical system includes a base connected to a rotational disk, and a saw attachment connected to the rotational disk. The base is attached to an end effector coupler of a robot arm positioned by a surgical robot, and includes a base arm extending away from the end effector coupler. The rotational disk is rotatably connected to the base arm and rotates about a first location on the rotational disk relative to the base arm. The saw attachment is rotatably connected to the rotational disk and rotates about a second location on the rotational disk. The first location on the rotational disk is spaced apart from the second location on the rotational disk. The saw attachment is configured to connect to a surgical saw including a saw blade configured to oscillate for cutting. The saw attachment rotates about the rotational disk and the rotational disk rotates about the base arm to constrain cutting of the saw blade to a range of movement along arcuate paths within a cutting plane.
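The two-pivot geometry above (disk rotating about the base arm at one location, saw attachment rotating about a second, offset location) can be modeled as a planar two-link rotation. The link lengths and angle conventions below are assumptions for the sketch, not dimensions from the patent.

```python
import math

# Illustrative planar model of the two-pivot linkage: the disk turns about
# the base-arm pivot (origin), and the saw attachment turns about a second
# location spaced apart on the disk. Lengths are assumed values in meters.
def saw_tip_position(disk_angle, attachment_angle,
                     pivot_spacing=0.05, blade_length=0.12):
    """Return the (x, y) position of the blade tip in the cutting plane,
    with the base-arm pivot at the origin. Angles are in radians."""
    # Second location on the disk, reached by rotating about the first pivot.
    second_x = pivot_spacing * math.cos(disk_angle)
    second_y = pivot_spacing * math.sin(disk_angle)
    # Blade tip, reached by rotating the attachment about the second pivot.
    tip_x = second_x + blade_length * math.cos(disk_angle + attachment_angle)
    tip_y = second_y + blade_length * math.sin(disk_angle + attachment_angle)
    return (tip_x, tip_y)
```

In this model every reachable tip position lies within an annulus of radii |blade_length − pivot_spacing| and blade_length + pivot_spacing about the base pivot, which is one way to see how the linkage constrains cutting to arcuate paths within the plane.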

Surgical Simulation System With Coordinated Imagining

An interactive and dynamic surgical simulation system may be used in the context of a computer-implemented interactive surgical system. The surgical simulation system may provide coordinated surgical imagining. A processor may be configured to execute a simulation of a surgical procedure. The surgical procedure may be simulated in a simulated surgical environment. The processor may generate a first visual representation and a second visual representation. The first visual representation may be of a first portion of the simulated surgical environment. The second visual representation may also be of the first portion of the simulated surgical environment. The processor may coordinate generation of the first visual representation and the second visual representation such that the first visual representation and the second visual representation correspond to a common event in the surgical procedure. And the processor may present the first visual representation and the second visual representation for user interaction within the simulated surgical environment.

Systems and methods for treating tissue with radiofrequency energy

A system controls operation of a radiofrequency treatment device that applies radiofrequency energy to tissue, heating the tissue to create lesions without ablating it. The system includes a first treatment device having at least one electrode for applying radiofrequency energy to tissue, a controller including a connector to which the first treatment device is coupled for use, and a generator for applying radiofrequency energy to the electrodes. The controller controls the application of energy so that the tissue is thermally treated to create lesions while preventing thermal treatment beyond a threshold that would ablate the tissue.
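The sub-ablative control behavior can be sketched as a simple thresholded power-update rule: raise energy toward a lesion-forming target, and cut power before the tissue crosses the ablation threshold. The temperature values and update rule below are illustrative assumptions, not parameters from the patent.

```python
# Hedged sketch of thresholded RF power control. Both temperature constants
# are assumed values for illustration only.
ABLATION_THRESHOLD_C = 60.0   # assumed temperature beyond which tissue ablates
TARGET_LESION_TEMP_C = 50.0   # assumed sub-ablative treatment target

def next_power_setting(measured_temp_c, current_power_w, step_w=1.0):
    """Raise power toward the lesion-forming target, hold or lower it near
    the target, and cut power to zero at or above the ablation threshold."""
    if measured_temp_c >= ABLATION_THRESHOLD_C:
        return 0.0
    if measured_temp_c < TARGET_LESION_TEMP_C:
        return current_power_w + step_w
    return max(current_power_w - step_w, 0.0)
```

A real controller would run this in a closed loop against a temperature sensor at the electrode site; the sketch shows only a single update step.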

OPTICAL CAMERA POSITIONING TOOL

A system and method may be used to position or orient a camera within a surgical field. A method may include generating a graphical user interface including a first set of instructions to reposition the camera, and determining whether the camera is within a target volume location. The method may include automatically outputting an indication when the camera is within the target volume location. The method may include outputting a second set of instructions for display on the graphical user interface to align a laser, coupled to or integrated into the camera, to the tracker by changing an angle of the camera.
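The "within a target volume" determination above can be sketched as a containment test. Modeling the target volume as an axis-aligned box is an assumption for the sketch; the patent does not specify the volume's shape.

```python
# Illustrative check for whether the camera sits inside the target volume;
# the axis-aligned box representation is an assumption.
def camera_in_target_volume(camera_position, volume_min, volume_max):
    """camera_position, volume_min, volume_max are (x, y, z) tuples;
    returns True when the camera lies within the axis-aligned box."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(camera_position, volume_min, volume_max))

def positioning_feedback(camera_position, volume_min, volume_max):
    if camera_in_target_volume(camera_position, volume_min, volume_max):
        return "camera within target volume"   # the automatic indication
    return "reposition the camera"             # keep showing the instructions
```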

CO-MANIPULATION SURGICAL SYSTEM HAVING MULTIPLE OPERATIONAL MODES FOR USE WITH SURGICAL INSTRUMENTS FOR PERFORMING LAPAROSCOPIC SURGERY

Co-manipulation robotic systems are described herein that may be used for assisting with laparoscopic surgical procedures. The co-manipulation robotic systems allow a surgeon to use commercially-available surgical tools while providing benefits associated with surgical robotics. Advantageously, the surgical tools may be seamlessly coupled to the robot arms using a disposable coupler while the reusable portions of the robot arm remain in a sterile drape. Further, the co-manipulation robotic system may operate in multiple modes to enhance usability and safety, while allowing the surgeon to position the instrument directly with the instrument handle and further maintain the desired position of the instrument using the robot arm.

Methods and systems for controlled deployment of needle structures in tissue

A system for deploying needles in tissue includes a controller and a visual display. A treatment probe has both a needle and tines deployable from the needle which may be advanced into the tissue. The treatment probe also has adjustable stops which control the deployed positions of both the needle and the tines. The adjustable stops are coupled to the controller so that the virtual treatment and safety boundaries resulting from the treatment can be presented on the visual display prior to actual deployment of the system.
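The pre-deployment preview above can be sketched as a check of the stop-limited treatment extent against a safety boundary. Modeling the treatment boundary as needle depth plus radial tine reach, compared to a single safety radius, is a simplifying assumption, not the patent's actual boundary computation.

```python
# Hedged sketch: preview whether the stop-limited deployment stays within a
# safety boundary. The geometry is an illustrative simplification.
def deployment_within_safety_boundary(needle_stop_mm, tine_stop_mm,
                                      safety_radius_mm):
    """The adjustable stops limit needle advance and tine extension; here
    the virtual treatment boundary is approximated as needle depth plus
    tine reach, and must not exceed the safety radius."""
    treatment_extent_mm = needle_stop_mm + tine_stop_mm
    return treatment_extent_mm <= safety_radius_mm
```

In the described system this kind of result would drive the visual display before any actual deployment, letting the stops be adjusted until the preview passes.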

Systems, devices, and methods for lymph specimen tracking, drainage determination, visualization, and treatment
11583222 · 2023-02-21

Disclosed are systems and methods of lymphatic specimen tracking, visualization, and lymph node drainage pathway determination. An exemplary method includes receiving computed tomographic (CT) image data corresponding to a CT scan, generating a three-dimensional (3D) model of at least a portion of a patient's body based on the CT image data, identifying one or more lymph nodes in the 3D model, performing a registration of the 3D model with one or more physical locations in the patient's body, determining an expected lymph node drainage pathway away from a region of interest through one or more lymph nodes, and displaying the 3D model and the expected lymph node drainage pathway.
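Determining an expected drainage pathway away from a region of interest can be sketched as a path search over a directed graph of lymph nodes. Representing drainage as an adjacency map and using breadth-first search are assumptions for illustration; the abstract does not state how the pathway is computed.

```python
from collections import deque

# Hedged sketch: find an expected drainage pathway as the shortest node
# sequence in a directed drainage graph (breadth-first search).
def expected_drainage_pathway(drainage_graph, region_of_interest, terminal_node):
    """drainage_graph maps each node to the nodes it drains toward.
    Returns the shortest node sequence from the region of interest to the
    terminal node, or None if no pathway exists."""
    queue = deque([[region_of_interest]])
    visited = {region_of_interest}
    while queue:
        path = queue.popleft()
        if path[-1] == terminal_node:
            return path
        for nxt in drainage_graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```

The returned node sequence could then be overlaid on the registered 3D model for display, as the abstract describes.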