Patent classifications
A61B2034/2051
Method of using lung airway carina locations to improve ENB registration
Disclosed are systems, devices, and methods for registering a luminal network to a 3D model of the luminal network. An example method comprises generating a 3D model of a luminal network, identifying a target within the 3D model, determining locations of a plurality of carinas in the luminal network proximate the target, displaying guidance for navigating a location sensor within the luminal network, tracking the location of the location sensor, comparing the tracked locations of the location sensor and the portions of the 3D model representative of open space, displaying guidance for navigating the location sensor a predetermined distance into each lumen originating at the plurality of carinas proximate the target, tracking the location of the location sensor while the location sensor is navigated into each lumen, and updating the registration of the 3D model with the luminal network based on the tracked locations of the location sensor.
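The registration update described above amounts to fitting a rigid transform between landmark locations in the 3D model (e.g. the carinas near the target) and the corresponding positions reported by the location sensor. As a minimal illustrative sketch (not the patent's actual algorithm), the classic Kabsch/SVD method estimates such a transform from paired points; all point values below are hypothetical:

```python
import numpy as np

def rigid_registration(model_pts, sensor_pts):
    """Estimate the rigid transform (R, t) mapping model points onto
    tracked sensor points via the Kabsch/SVD method.

    model_pts, sensor_pts: (N, 3) arrays of corresponding positions,
    e.g. carina locations in the 3D model and the matching positions
    recorded by the location sensor.
    """
    mc = model_pts.mean(axis=0)                 # centroid of model points
    sc = sensor_pts.mean(axis=0)                # centroid of sensor points
    H = (model_pts - mc).T @ (sensor_pts - sc)  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t

# Hypothetical carina locations in model coordinates, and the positions
# the sensor reported at those same carinas (here a pure translation).
model = np.array([[0., 0., 0.], [10., 0., 0.], [0., 15., 5.], [5., 5., 20.]])
t_true = np.array([2.0, -3.0, 1.0])
sensed = model + t_true

R, t = rigid_registration(model, sensed)
print(np.allclose(model @ R.T + t, sensed))  # True: mapping is recovered
```

In practice the sensor positions would come from navigating a predetermined distance into each lumen originating at the carinas, and the fitted transform would refresh the model-to-patient registration.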
System and method for navigating within the lung
Methods and systems for navigating to a target through a patient's bronchial tree are disclosed including a bronchoscope, a probe insertable into a working channel of the bronchoscope and including a location sensor, and a workstation in operative communication with the probe and the bronchoscope, the workstation including a user interface that guides a user through a navigation plan and is configured to present a central navigation view including a plurality of views configured for assisting the user in navigating the bronchoscope through central airways of the patient's bronchial tree toward the target, a peripheral navigation view including a plurality of views configured for assisting the user in navigating the probe through peripheral airways of the patient's bronchial tree to the target, and a target alignment view including a plurality of views configured for assisting the user in aligning a distal tip of the probe with the target.
SYSTEMS AND METHODS FOR ROBOTICALLY-ASSISTED HISTOTRIPSY TARGETING BASED ON MRI/CT SCANS TAKEN PRIOR TO TREATMENT
Methods and devices for producing cavitation in tissue are provided. Methods and devices are also provided for surgical navigation, including defining a target treatment zone and navigating a focus of a therapy transducer to the target treatment zone. Embodiments are provided for co-registering a plurality of surgical imaging and navigation systems. Systems for performing Histotripsy therapy are also discussed.
Technique for transferring a registration of image data of a surgical object from one surgical navigation system to another surgical navigation system
A method, a controller, and a surgical hybrid navigation system for transferring a registration of three dimensional image data of a surgical object from a first to a second surgical navigation system are described. A first tracker that is detectable by a first detector of the first surgical navigation system is arranged in a fixed spatial relationship with the surgical object and a second tracker that is detectable by a second detector of the second surgical navigation system is arranged in a fixed spatial relationship with the surgical object. The method includes registering the three dimensional image data of the surgical object in a first coordinate system of the first surgical navigation system and determining a first position and orientation of the first tracker in the first coordinate system and a second position and orientation of the second tracker in a second coordinate system of the second surgical navigation system.
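Transferring a registration between two navigation systems as described above is, at its core, a composition of homogeneous transforms: the image-to-CS1 registration is re-expressed in CS2 by going through the two trackers fixed to the object. The sketch below assumes the fixed tracker1-to-tracker2 relation is known from calibration; all pose values and names are hypothetical, not taken from the patent:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

# Hypothetical poses (illustrative only):
T_cs1_image    = make_transform(rot_z(0.3), [1., 2., 3.])   # registration: image -> CS1
T_cs1_tracker1 = make_transform(rot_z(-0.5), [0., 1., 0.])  # tracker 1 pose in CS1
T_t2_t1        = make_transform(rot_z(0.2), [0.1, 0., 0.])  # fixed tracker1 -> tracker2 relation
T_cs2_tracker2 = make_transform(rot_z(1.0), [5., 0., 1.])   # tracker 2 pose in CS2

# Transferred registration: image -> CS2, chained through both trackers.
T_cs2_image = T_cs2_tracker2 @ T_t2_t1 @ np.linalg.inv(T_cs1_tracker1) @ T_cs1_image

# Verify on one point: composing the chain step by step gives the same result.
p_image = np.array([0.5, -0.2, 1.0, 1.0])      # a point in image coordinates
p_cs1 = T_cs1_image @ p_image                  # into CS1
p_t1 = np.linalg.inv(T_cs1_tracker1) @ p_cs1   # into tracker-1 coordinates
p_cs2 = T_cs2_tracker2 @ T_t2_t1 @ p_t1        # via tracker 2 into CS2
print(np.allclose(T_cs2_image @ p_image, p_cs2))  # True
```

Because both trackers are rigidly fixed to the same object, the chain holds for every point, so the image data need not be re-registered from scratch in the second system.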
METHOD FOR AUTOMATICALLY PLANNING A TRAJECTORY FOR A MEDICAL INTERVENTION
The invention relates to a method for automatically planning a trajectory to be followed during a medical intervention by a medical instrument targeting an anatomy of interest of a patient, said automatic planning method comprising the steps of: acquiring at least one medical image of the anatomy of interest; determining a target point on the previously acquired image; generating a set of trajectory planning parameters from the medical image of the anatomy of interest and the previously determined target point, the set of planning parameters comprising coordinates of an entry point on the medical image. The set of parameters is generated using a neural-network-based machine learning method. The invention also relates to a guiding device implementing the set of planning parameters obtained.
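Once the entry point has been determined on the image (in the patent's scheme, predicted by a neural network), the trajectory itself reduces to simple geometry: a unit insertion direction and an insertion depth from entry point to target point. A minimal sketch, with hypothetical coordinates:

```python
import numpy as np

def trajectory_from_points(entry, target):
    """Given an entry point and a target point in image coordinates,
    return the unit insertion direction and the insertion depth.

    Here the entry point is supplied directly; in the patent's method
    it would come from the neural-network planning step.
    """
    entry = np.asarray(entry, dtype=float)
    target = np.asarray(target, dtype=float)
    v = target - entry
    depth = np.linalg.norm(v)      # straight-line insertion depth
    return v / depth, depth

direction, depth = trajectory_from_points([0., 0., 0.], [30., 40., 0.])
print(direction, depth)  # unit vector [0.6 0.8 0.] and depth 50.0
```

A guiding device would then constrain the instrument to this line, with the direction vector setting its orientation and the depth bounding its advance.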
ULTRASONIC ROBOTIC SURGICAL NAVIGATION
Surgical robot systems, anatomical structure tracker apparatuses, and US transducer apparatuses are disclosed. A surgical robot system includes a robot, a US transducer, and at least one processor. The robot includes a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector. The at least one processor is operative to obtain an image volume for the patient and to track pose of the end-effector relative to anatomical structure captured in the image volume based on the US imaging data.
Surgical instrument with real time navigation assistance
Navigation assistance systems and methods for use with a surgical instrument to assist in navigation of the surgical instrument during an operation. The system may include sensors that may observe the patient to generate positioning data regarding the relative position of the surgical instrument and the patient. The system may retrieve imaging data regarding the patient and correlate the imaging data to the positioning data. In turn, the position of the surgical instrument relative to the imaging data may be provided and used to generate navigation data (e.g., position, orientation, trajectory, or the like) regarding the surgical instrument.
Mixed-reality surgical system with physical markers for registration of virtual models
An example method includes obtaining a virtual model of a portion of a patient's anatomy from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
System and method for navigation
Disclosed is a system for assisting in guiding and performing a procedure on a subject. The subject may be any appropriate subject, such as an inanimate object and/or an animate object. The guide and system may include various manipulable or movable members, such as robotic systems, and may be registered to selected coordinate systems.
Surgical robotic system and method for transitioning control to a secondary robot controller
A robotic surgical system and method are disclosed for transitioning control to a secondary robotic arm controller. In one embodiment, a robotic surgical system comprises a user console comprising a display device and a user input device; a robotic arm configured to be coupled to an operating table; a primary robotic arm controller configured to move the robotic arm in response to a signal received from the user input device at the user console; and a secondary robotic arm controller configured to move the robotic arm in response to a signal received from a user input device remote from the user console. Control over movement of the robotic arm is transitioned from the primary robotic arm controller to the secondary robotic arm controller in response to a failure in the primary robotic arm controller. Other embodiments are provided.
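The control transition described above is essentially a failover pattern: commands flow through the primary controller until it fails, after which they are routed to the secondary. The following is a minimal sketch of that pattern only; the class and method names are illustrative, not the patent's implementation:

```python
class FakeController:
    """Stand-in for a robotic arm controller; fails when unhealthy."""
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy

    def move(self, command):
        if not self.healthy:
            raise RuntimeError(f"{self.name} failed")
        return f"{self.name} executed {command}"

class ArmControlSupervisor:
    """Route motion commands through the primary controller and switch
    to the secondary controller when the primary reports a failure."""
    def __init__(self, primary, secondary):
        self.primary, self.secondary = primary, secondary
        self.active = primary

    def move(self, command):
        try:
            return self.active.move(command)
        except RuntimeError:
            if self.active is self.primary:   # fail over exactly once
                self.active = self.secondary
                return self.active.move(command)
            raise                             # secondary failed too

sup = ArmControlSupervisor(FakeController("primary", healthy=False),
                           FakeController("secondary"))
print(sup.move("jog +x"))  # secondary executed jog +x
```

In the patented system the secondary controller additionally takes its input from a user input device remote from the user console, so the failover also changes who is commanding the arm.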