Patent classifications
A61B2090/3735
GRAPHICAL USER INTERFACE FOR DISPLAYING GUIDANCE INFORMATION IN A PLURALITY OF MODES DURING AN IMAGE-GUIDED PROCEDURE
A method for displaying guidance information using a graphical user interface during a medical procedure comprises displaying, in a first mode of the graphical user interface, first image data from a perspective corresponding to a distal end of an elongate device. The first image data includes a virtual roadmap. The method also comprises transitioning from the first mode of the graphical user interface to a second mode of the graphical user interface. The transition is based on an occurrence of a triggering condition. The method also comprises displaying, in the second mode of the graphical user interface, second image data from a perspective corresponding to the distal end of the elongate device. The second image data includes a target indicator corresponding to a target location and an alignment indicator corresponding to an expected location of the medical procedure at the target location.
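The mode transition driven by a triggering condition can be sketched as a small state machine. This is purely illustrative: the mode names, the proximity-based trigger, and the 10 mm threshold are assumptions, not details from the abstract.

```python
from enum import Enum, auto

class GuidanceMode(Enum):
    VIRTUAL_ROADMAP = auto()   # first mode: roadmap view from the distal end
    TARGET_ALIGNMENT = auto()  # second mode: target and alignment indicators

def next_mode(current, distance_to_target_mm, threshold_mm=10.0):
    """Transition from the roadmap mode to the alignment mode once the
    triggering condition (here, hypothetically: proximity to target) is met."""
    if current is GuidanceMode.VIRTUAL_ROADMAP and distance_to_target_mm <= threshold_mm:
        return GuidanceMode.TARGET_ALIGNMENT
    return current
```

Modeling the transition as one-way (roadmap to alignment, never back) mirrors the abstract's single described transition; a real system might add further modes and triggers.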
Intelligent Surgical Marker
Surgical marker systems and methods for delineating a lesion margin of a subject are provided. An example system includes a handheld probe device configured to capture an optical coherence tomography (OCT) image and a processor coupled to a memory. The handheld probe device includes a handheld probe including a fiber-optic probe assembly and a marker assembly. The processor is configured to: segment, by a neural network, each pixel of the OCT image into different tissue-type categories; generate one or more feature vectors based at least in part on the segmented pixels; determine, by a one-class classifier, a boundary location in the OCT image between normal tissue and abnormal tissue; and control the marker assembly to selectively create a visible label on a tissue location of the subject, the tissue location corresponding to the boundary location.
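The segment-then-locate pipeline could be sketched as below. Everything here is a stand-in: an intensity threshold substitutes for the neural-network segmenter, a majority-vote column search substitutes for the one-class classifier, and the synthetic B-scan and all thresholds are invented for illustration.

```python
import numpy as np

# Toy OCT B-scan: columns 0-59 are "normal" (low intensity), 60-99 "abnormal".
rng = np.random.default_rng(0)
bscan = rng.normal(0.2, 0.05, (64, 100))
bscan[:, 60:] += 0.5

def segment_pixels(img, threshold=0.45):
    """Stand-in for the neural-network segmenter: label each pixel
    0 (normal) or 1 (abnormal) by intensity."""
    return (img > threshold).astype(int)

def boundary_column(labels, min_fraction=0.5):
    """Stand-in for the one-class classifier: take the lateral boundary as
    the first column where a majority of pixels are labelled abnormal."""
    abnormal_fraction = labels.mean(axis=0)
    cols = np.flatnonzero(abnormal_fraction >= min_fraction)
    return int(cols[0]) if cols.size else None

labels = segment_pixels(bscan)
print(boundary_column(labels))  # → 60 for this toy scan
```

The returned column index is what would be handed to the marker assembly as the tissue location to label.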
Synchronized placement of surgical implant hardware
Methods, apparatuses, and systems for robotic insertion of a screw, a rod, or another component of a surgical implant into a patient are disclosed. Synchronous insertion of screws is performed by multiple surgical robots or by a single surgical robot having multiple arms and end effectors. The movements of each robotic arm are coordinated into position in preparation for the insertion of multiple surgical implant components at the same time or in the same surgical step. The insertion of the surgical implant components is performed while monitoring the insertion progress. The insertion is completed autonomously or in coordination with a surgeon.
VISUALIZATION DEVICES FOR USE DURING PERCUTANEOUS TISSUE DISSECTION AND ASSOCIATED SYSTEMS AND METHODS
A device and method for visualization of the intravascular creation of autologous valves, and particularly venous valves, is disclosed herein. One aspect of the present technology, for example, is directed toward a delivery catheter that can include a lumen configured to receive a dissection assembly and a trough having a plurality of transducers electrically coupled to a proximal portion of the delivery catheter. At least one of the transducers can be configured to emit a signal toward a portion of a blood vessel adjacent the trough, and at least one of the transducers can be configured to receive a reflection of the emitted signal.
SYSTEMS AND METHODS FOR INTELLIGENTLY SEEDING REGISTRATION
A method of registering sets of anatomical data for use during a medical procedure is provided herein. The method may include accessing a first set of model points of a patient anatomy of interest and intra-operatively acquiring a second set of model points by visualizing a portion of the anatomical surface in the patient with a vision probe. The method may further include extracting system information, including kinematic information from a robotic arm of a medical system and/or setup information, and generating an initial seed transformation based on the extracted system information. Thereafter, the method may include applying the initial seed transformation to the first set of model points and generating a first registration between the first set of model points and the second set of model points to permit model and actual information to be viewed and used together by an operator.
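The seeding-then-registration idea could be sketched as follows. This is a toy translation-only example: the function names, the use of a centroid match as the refinement step, and all point values are illustrative assumptions, not the patent's algorithm.

```python
import numpy as np

def seed_from_kinematics(base_T_probe):
    """Use the robot's forward kinematics (a 4x4 base-to-probe transform)
    directly as the initial seed aligning model points to probe points."""
    return base_T_probe

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) point set."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]

def refine_translation(model_pts, probe_pts):
    """One coarse refinement step after seeding: correct residual
    translation by matching point-cloud centroids (an ICP-like step)."""
    return probe_pts.mean(axis=0) - model_pts.mean(axis=0)

# Model points and intra-operative probe points related by a known offset.
model = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.]])
seed = np.eye(4); seed[:3, 3] = [5., 0., 0.]   # kinematics say: shift +5 mm in x
probe = model + np.array([5., 2., 0.])         # true offset also has +2 mm in y

seeded = apply_transform(seed_from_kinematics(seed), model)
residual = refine_translation(seeded, probe)
print(residual)  # → [0. 2. 0.]: the seed absorbed x, refinement finds y
```

The point of the seed is visible in the residual: the kinematic information removes most of the misalignment before any iterative registration begins.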
SYSTEMS AND METHODS FOR 3D DATA DRIVEN LASER ORIENTATION PLANNING
The present disclosure describes a method comprising ablating a substrate with a laser at an orientation to create a cavity in the substrate, scanning the cavity, and creating a three-dimensional surface for the cavity. The method further includes storing the three-dimensional surface in a dataset. The dataset includes a laser projected distance as an independent variable and a depth of cut as a dependent variable. The method further includes fitting parameters of a Gaussian-based model for the laser and the substrate based on the dataset. The present disclosure also describes a method comprising providing a pre-ablation surface, labeling a three-dimensional obstacle boundary that separates material to be removed by a laser from material to remain, and determining an orientation of the laser that results in a predicted post-ablation surface that does not intersect the three-dimensional obstacle boundary.
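Fitting a Gaussian-based depth-of-cut model to such a dataset can be sketched with a standard trick: a Gaussian becomes a quadratic after taking logarithms. The specific model form, parameter values, and synthetic data below are assumptions for illustration; the patent does not specify this fitting procedure.

```python
import numpy as np

# Synthetic dataset: laser projected distance (mm) vs. depth of cut (mm),
# generated from an assumed Gaussian-shaped model.
A_true, mu_true, sigma_true = 2.0, 1.5, 0.4
d = np.linspace(0.5, 2.5, 50)
depth = A_true * np.exp(-(d - mu_true) ** 2 / (2 * sigma_true ** 2))

def fit_gaussian(distance, depth_of_cut):
    """Fit depth = A * exp(-(d - mu)^2 / (2 sigma^2)) by fitting a quadratic
    to log(depth), which is exact for noiseless Gaussian data."""
    a, b, c = np.polyfit(distance, np.log(depth_of_cut), 2)
    sigma = np.sqrt(-1.0 / (2.0 * a))   # a = -1 / (2 sigma^2)
    mu = -b / (2.0 * a)                 # b = mu / sigma^2
    A = np.exp(c - a * mu ** 2)         # c = ln A + a * mu^2
    return A, mu, sigma

print(fit_gaussian(d, depth))  # recovers ~(2.0, 1.5, 0.4)
```

With noisy measurements a weighted or nonlinear least-squares fit would be preferable, since the log transform amplifies noise at small depths.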
ASSESSMENT OF SUTURE OR STAPLE LINE INTEGRITY AND LOCALIZATION OF POTENTIAL TISSUE DEFECTS ALONG THE SUTURE OR STAPLE LINE
A method for assessing suture line integrity includes loading a navigation plan into a navigation system, the navigation plan including a planned pathway shown in a 3D model, inserting a probe into a patient's airways, the probe including a location sensor in operative communication with the navigation system, registering a sensed location of the probe with the planned pathway, and selecting a target in the navigation plan, the target including a proposed suture line. The method further includes presenting a view of the 3D model showing the planned pathway and indicating the sensed location of the probe, navigating the probe through the airways of the patient's lungs toward the target, and imaging the proposed suture line of the target, via the probe, to determine tissue integrity surrounding the proposed suture line.
SYNCHRONIZING AN OPTICAL COHERENCE TOMOGRAPHY SYSTEM
Methods for synchronizing an Optical Coherence Tomography (OCT) system include detecting when a plurality of A-line scans, obtained from light reflected by a cantilevered scanning fiber within a probe oscillating along a scanning path that increases in amplitude over time, are no longer being obtained at the point along the oscillating scanning path where the scanning fiber reaches a minimum speed; determining a value by which a phase angle of the oscillating scanning path is out of synchronization with the plurality of A-line scans; and adjusting a trigger clock for obtaining the plurality of A-line scans based on the value.
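One generic way to obtain such a phase value is a least-squares fit of the sampled scan path against a sin/cos basis at the known drive frequency, then converting the phase error into a trigger delay. This is a minimal sketch under that assumption; the patent's actual detection and determination steps are not specified here.

```python
import numpy as np

def phase_offset(positions, times, freq_hz):
    """Estimate the phase angle of the oscillating scan path from sampled
    fiber positions via least squares on a sin/cos basis."""
    w = 2 * np.pi * freq_hz
    basis = np.column_stack([np.sin(w * times), np.cos(w * times)])
    (a, b), *_ = np.linalg.lstsq(basis, positions, rcond=None)
    return np.arctan2(b, a)  # radians by which the path leads the trigger clock

def trigger_delay(phase_rad, freq_hz):
    """Convert the phase error into a trigger-clock adjustment (seconds)."""
    return phase_rad / (2 * np.pi * freq_hz)

# Toy scan path oscillating at 1 kHz with a 0.3 rad phase error.
t = np.linspace(0, 0.01, 2000)
x = np.sin(2 * np.pi * 1000 * t + 0.3)
err = phase_offset(x, t, 1000.0)
print(err, trigger_delay(err, 1000.0))  # ≈ 0.3 rad, ≈ 4.77e-5 s
```

Adding the returned delay to the trigger clock would re-align the A-line acquisition with the minimum-speed point of the scan path.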
Methods and systems for providing depth information
Methods and systems for outputting depth data during a medical procedure on a patient. Depth data is outputted, representing at least one of relative depth data and general depth data. Tracking information about the position and orientation of a medical instrument and depth information about variations in depth over a site of interest are used. Relative depth data represents the depth information relative to the position and orientation of the instrument. General depth data represents the depth information over the site of interest independently of the position and orientation of the instrument.
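The distinction between the two depth outputs can be made concrete with a small sketch: general depth depends only on the site, while relative depth uses the instrument's tracked position and orientation. The reference conventions (shallowest point as zero, depth measured along the tip axis) and all numbers are illustrative assumptions.

```python
import numpy as np

def general_depth(points_mm):
    """General depth: variation over the site of interest, independent of
    the instrument (here, z relative to the shallowest surface point)."""
    z = points_mm[:, 2]
    return z - z.min()

def relative_depth(points_mm, tip_position_mm, tip_axis):
    """Relative depth: distance of each surface point from the instrument
    tip, measured along the tip's pointing axis (uses pose, not just position)."""
    axis = np.asarray(tip_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return (points_mm - tip_position_mm) @ axis

site = np.array([[0., 0., 10.], [1., 0., 12.], [2., 0., 15.]])
tip = np.array([0., 0., 0.])
print(general_depth(site))                      # → [0. 2. 5.]
print(relative_depth(site, tip, [0., 0., 1.]))  # → [10. 12. 15.]
```

Re-orienting `tip_axis` changes the relative output but leaves the general output untouched, which is exactly the independence the abstract describes.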
Optical coherence tomography guided robotic ophthalmic procedures
The systems and methods described herein provide improved techniques for OCT guided robotic ophthalmic procedures. A method includes receiving, during OCT scanning of an eye, position data of a plurality of galvanometer scanners from a plurality of absolute and incremental encoders coupled to the corresponding galvanometer scanners. The method further includes receiving scan data related to one or more tissues of the eye. The method further includes determining, a set of first positions of the one or more tissues of the eye in a first 3D coordinate system. The method further includes determining, based on the set of first positions and a mapping between the first and a second 3D coordinate systems, a position in the second 3D coordinate system for a surgical instrument coupled to a robotic device. The method includes causing the robotic device to move the surgical instrument to the position in the second 3D coordinate system.