Patent classifications
A61B90/37
Optimizing treatment using TTFields by changing the frequency during the course of long term tumor treatment
Tumors can be treated with an alternating electric field. The size of cells in the tumor is determined prior to the start of treatment by, for example, biopsy or by inverse electric impedance tomography. A treatment frequency is chosen based on the determined cell size. The cell size can be determined during the course of treatment and the treatment frequency is adjusted to reflect changes in the cell size. A suitable apparatus for this purpose includes a device for measuring the tumor impedance, an AC signal generator with a controllable output frequency, a processor for estimating the size of tumor cells and setting the frequency of the AC signal generator based thereon, and at least one pair of electrodes operatively connected to the AC signal generator such that an alternating electric field is applied to the tumor.
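The abstract describes choosing and re-tuning a treatment frequency from an estimated cell size, but does not give the mapping. A minimal sketch, assuming a simple inverse relationship between cell diameter and optimal frequency (all constants and function names here are hypothetical, not from the patent):

```python
# Illustrative only: the abstract does not specify the cell-size-to-frequency
# mapping. This assumes smaller cells call for a higher field frequency,
# with made-up reference constants.

REFERENCE_DIAMETER_UM = 10.0   # hypothetical reference cell diameter (micrometers)
REFERENCE_FREQ_KHZ = 200.0     # hypothetical optimal frequency at that diameter

def select_treatment_frequency(cell_diameter_um: float) -> float:
    """Map an estimated tumor-cell diameter to an AC field frequency (kHz)."""
    if cell_diameter_um <= 0:
        raise ValueError("cell diameter must be positive")
    return REFERENCE_FREQ_KHZ * (REFERENCE_DIAMETER_UM / cell_diameter_um)

def adjust_during_treatment(new_diameter_um: float) -> float:
    """Re-derive the generator frequency when the measured cell size changes
    over the course of long-term treatment."""
    return select_treatment_frequency(new_diameter_um)
```

In this sketch, the processor would periodically call `adjust_during_treatment` with the latest impedance-derived size estimate and push the result to the AC signal generator's controllable output frequency.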
Multi-fiber optical probe and optical coherence tomography system
Multichannel optical coherence systems and methods are provided in which optical coherence tomography (OCT) subsystems are operably and respectively connected to the optical fibers of a multichannel optical probe, such that each optical fiber forms at least a distal portion of the sample beam path of a respective OCT subsystem. The optical fibers are in optical communication with distal optical elements such that the external beam paths associated therewith are directed towards a common spatial region external to the housing. Image processing computer hardware processes OCT signals obtained from the plurality of OCT subsystems to generate an OCT image dataset comprising a plurality of OCT A-scans, and processes that dataset to generate volumetric image data based on the known positions and orientations of the external beam paths associated with the OCT subsystems.
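The volumetric reconstruction step can be pictured geometrically: each A-scan is a line of samples along a beam whose origin and direction are known. A minimal sketch under that assumption (the geometry representation and function names are illustrative, not the patent's):

```python
import numpy as np

# Hypothetical beam geometry: each (origin, direction) pair stands for the
# known pose of one OCT subsystem's external beam path.

def ascan_to_points(origin, direction, depths):
    """Place A-scan samples along a beam: point = origin + depth * unit_dir."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    return origin[None, :] + np.outer(np.asarray(depths, dtype=float), direction)

def merge_ascans(beams, depths):
    """Combine A-scans from multiple subsystems into one 3-D point cloud."""
    return np.vstack([ascan_to_points(o, d, depths) for o, d in beams])
```

Because the beams converge on a common spatial region, the merged point cloud densely samples that region and can then be resampled onto a regular voxel grid.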
Method of robotic hub communication, detection, and control
Various surgical systems are disclosed. A surgical system can include a surgical robot and a surgical hub. The surgical robot can include a control unit in signal communication with a control console and a robotic tool. The surgical hub can include a display. The surgical hub can be in signal communication with the control unit. A facility can include a plurality of surgical hubs that communicate data from the surgical robots to a primary server. To alleviate bandwidth competition among the surgical hubs, the surgical hubs can include prioritization protocols for collecting, storing, and/or communicating data to the primary server.
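The abstract names "prioritization protocols" for easing bandwidth competition but leaves their design open. One plausible sketch is a priority queue on the hub side, with assumed traffic classes (the class names and ordering below are illustrative assumptions):

```python
import heapq

class HubUploadQueue:
    """Order queued data so higher-priority traffic reaches the primary
    server first when bandwidth is contested. Traffic classes are assumed."""

    PRIORITY = {"alarm": 0, "telemetry": 1, "video": 2, "log": 3}

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker keeps FIFO order within a class

    def enqueue(self, kind, payload):
        heapq.heappush(self._heap, (self.PRIORITY[kind], self._seq, payload))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]
```

A hub collecting data from several surgical robots would drain this queue toward the primary server, so bulk logs never starve time-critical messages.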
System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
A method and system for detecting a catheter in fluoroscopic data and updating the displayed electromagnetic position of the catheter on a 3D rendering are provided. The method includes navigating a catheter to a target area and acquiring fluoroscopic data from a fluoroscopic sweep of the target area. An initial catheter detection is performed to detect catheter tip candidates in each 2D frame of the fluoroscopic data using a shallow neural network. A secondary catheter detection is then performed on the candidates in each 2D frame using a deep neural network. False-positive catheter tip candidates are removed by reconstructing a 3D position of the catheter tip and finding the intersecting point of the rays corresponding to each 2D frame.
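The false-positive removal step amounts to ray geometry: each frame's candidate defines a back-projected ray, the rays should meet near the true tip, and candidates whose rays pass far from that point are discarded. A minimal least-squares sketch of that idea (the tolerance and function names are assumptions, not the patented method):

```python
import numpy as np

def rays_intersection(origins, directions):
    """Least-squares 3-D point closest to a set of rays (one per 2-D frame)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector onto plane normal to the ray
        A += P
        b += P @ np.asarray(o, float)
    return np.linalg.solve(A, b)

def filter_candidates(origins, directions, tol=2.0):
    """Keep only candidates whose ray passes within `tol` of the
    reconstructed tip position; `tol` is an assumed threshold."""
    p = rays_intersection(origins, directions)
    keep = []
    for i, (o, d) in enumerate(zip(origins, directions)):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        v = p - np.asarray(o, float)
        dist = np.linalg.norm(v - (v @ d) * d)  # point-to-ray distance
        if dist <= tol:
            keep.append(i)
    return p, keep
```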
Selectable variable response of shaft motion of surgical robotic systems
A robotic surgical system for treating a patient is disclosed, including a surgical tool movable relative to the patient and a user input device including a base and a space joint with a central portion movable relative to the base to effect a motion. The robotic surgical system further includes a control circuit configured to: receive a user selection signal indicative of a selection between a first motion scaling profile and a second motion scaling profile for the motion of the surgical tool; receive a motion control signal from the user input device indicative of a user input force; and cause the surgical tool to be moved in response to the motion control signal in accordance with the first or second motion scaling profile, based on the user selection signal. The first motion scaling profile is different from the second motion scaling profile.
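The core of the control logic is selecting a gain between input force and tool motion. A minimal sketch, assuming linear scaling profiles with made-up gains (the abstract does not define the profiles):

```python
# Illustrative scaling profiles; the gain values are assumptions.
PROFILES = {
    "fine":   {"gain": 0.2},   # small tool motion per unit input force
    "coarse": {"gain": 1.0},   # direct, unscaled response
}

def tool_displacement(user_force: float, selected_profile: str) -> float:
    """Scale the user-input force signal by the selected profile's gain."""
    return PROFILES[selected_profile]["gain"] * user_force
```

Switching the user selection signal between `"fine"` and `"coarse"` changes how aggressively the same hand force moves the shaft, which is the selectable variable response the title refers to.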
Systems, instruments and methods for surgical navigation with verification feedback
Systems, instruments, and methods for surgical navigation with verification feedback are provided. The systems, instruments, and methods may be used to verify a trajectory of a surgical tool during a procedure. The systems, instruments, and methods may receive one or more captured images of an anatomical portion of a patient; execute a surgical plan to insert the surgical tool into the anatomical portion; receive sensor data collected from one or more sensors being inserted into the anatomical portion; determine whether the sensor data corresponds to the surgical plan; and send, in response to determining that the sensor data does not correspond to the surgical plan, an alert indicating that the surgical tool is not being inserted according to the surgical plan. The one or more sensors may be attached to the surgical tool.
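The verification step reduces to comparing sensed tool behavior against the planned trajectory and alerting on mismatch. A minimal sketch, assuming the comparison is an angular deviation between planned and sensed insertion directions with an assumed tolerance (the abstract leaves both open):

```python
import math

def deviation_deg(planned_dir, sensed_dir):
    """Angle in degrees between the planned and sensed insertion directions."""
    dot = sum(a * b for a, b in zip(planned_dir, sensed_dir))
    norm = (math.sqrt(sum(a * a for a in planned_dir))
            * math.sqrt(sum(b * b for b in sensed_dir)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def verify_trajectory(planned_dir, sensed_dir, tol_deg=5.0):
    """Return an alert when sensor data deviates from the surgical plan."""
    if deviation_deg(planned_dir, sensed_dir) > tol_deg:
        return "ALERT: tool is not being inserted according to the surgical plan"
    return None
```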
METHOD, APPARATUS AND SYSTEM FOR CONTROLLING AN IMAGE CAPTURE DEVICE DURING SURGERY
A system for controlling a medical image capture device during surgery, the system including: circuitry configured to receive a first image of the surgical scene, captured by the medical image capture device from a first viewpoint, and additional information of the scene; determine, for the medical image capture device, in accordance with the additional information and previous viewpoint information of surgical scenes, one or more candidate viewpoints from which to obtain an image of the surgical scene; provide, in accordance with the first image of the surgical scene, for each of the one or more candidate viewpoints, a simulated image of the surgical scene from the candidate viewpoint; and control the medical image capture device to obtain an image of the surgical scene from the candidate viewpoint corresponding to a selection of one of the one or more simulated images of the surgical scene.
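The candidate-viewpoint step uses "previous viewpoint information" to propose viewpoints, and the final step maps a user's selection of a simulated image back to a viewpoint. A minimal sketch of both, assuming viewpoints are hashable identifiers and history is a plain list of past choices (both representations are assumptions):

```python
from collections import Counter

def rank_candidates(candidates, previous_viewpoints):
    """Order candidate viewpoints by how often similar viewpoints were
    chosen in past surgical scenes (hypothetical ranking criterion)."""
    usage = Counter(previous_viewpoints)
    return sorted(candidates, key=lambda v: usage[v], reverse=True)

def select_viewpoint(candidates, chosen_index):
    """Drive the capture device to the viewpoint whose simulated image
    the operator selected."""
    return candidates[chosen_index]
```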
System and methods for planning and performing three-dimensional holographic interventional procedures with three-dimensional tomographic and live imaging
A method and a system for image-guided intervention such as a percutaneous treatment or diagnosis of a patient may include at least one of a pre-registration method and a re-registration method. The pre-registration method is configured to permit an efficient virtual representation of a planned trajectory to target tissue during the intervention, for example, as a holographic light ray shown through an augmented reality system. In turn, this allows the operator to align a physical instrument such as a medical probe for the intervention. The re-registration method is configured to adjust for inaccuracy in the virtual representation generated by the pre-registration method, as determined by live imaging of the patient during the intervention. The re-registration method may employ the use of intersectional contour lines to define the target tissue as viewed through the augmented reality system, which permits an unobstructed view of the target tissue for the intervention.
METHOD OF ALERTING A USER TO OFF-SCREEN EVENTS DURING SURGERY
An image of a field of view is captured by an endoscope and a portion of the image is displayed on a display. A popup image is generated and displayed when image analysis identifies an occurrence of bleeding, or motion of the edges of an incision, that is within the field of view but outside the portion of the field of view shown on the display.
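The trigger condition above is a simple containment test: the event must lie inside the camera's field of view but outside the displayed crop. A minimal sketch, assuming axis-aligned rectangles and a point event location (both are simplifying assumptions):

```python
def contains(rect, pt):
    """True if point (x, y) lies inside rect = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = rect
    x, y = pt
    return x0 <= x <= x1 and y0 <= y <= y1

def should_popup(event_xy, fov_rect, displayed_rect):
    """True when a detected event (e.g. bleeding) is visible to the
    endoscope but falls outside the portion shown on the display."""
    return contains(fov_rect, event_xy) and not contains(displayed_rect, event_xy)
```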
SYSTEMS AND METHODS FOR VISUAL SENSING OF AND DOCKING WITH A TROCAR
A surgical robotic system has a tool drive coupled to a distal end of a robotic arm that has a plurality of actuators. The tool drive has a docking interface to receive a trocar. The system also includes one or more sensors that are operable to visually sense a surface feature of the trocar. One or more processors determine a position and orientation of the trocar, based on the visually sensed surface feature. In response, the processor controls the actuators to orient the docking interface to the determined orientation of the trocar and to guide the robotic arm toward the determined position of the trocar. Other aspects are also described and claimed.
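The guidance step, once the trocar's pose has been estimated from the sensed surface feature, can be pictured as stepping the docking interface toward that pose. A toy sketch, assuming a pose is a coordinate tuple and a simple bounded-step control law (the actual actuator control is not described in the abstract):

```python
def step_toward(current, target, max_step):
    """Move each pose coordinate at most `max_step` toward the target pose.
    A stand-in for commanding the arm's actuators; purely illustrative."""
    return tuple(
        c + max(-max_step, min(max_step, t - c))
        for c, t in zip(current, target)
    )

def docked(current, target, tol=1e-6):
    """True once the docking interface has reached the trocar pose."""
    return all(abs(t - c) <= tol for c, t in zip(current, target))
```

Iterating `step_toward` until `docked` returns true mimics guiding the robotic arm toward the determined trocar position and orientation.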