Patent classifications
A61B2034/2055
Ultrasonic robotic surgical navigation
Surgical robot systems, anatomical structure tracker apparatuses, and ultrasound (US) transducer apparatuses are disclosed. A surgical robot system includes a robot, a US transducer, and at least one processor. The robot includes a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector. The at least one processor is operative to obtain an image volume for the patient and to track pose of the end-effector relative to anatomical structure captured in the image volume based on the US imaging data.
Mixed-reality surgical system with physical markers for registration of virtual models
An example method includes obtaining a virtual model of a portion of an anatomy of a patient from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
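Registration of a virtual model to observed marker positions, as described above, is commonly solved as a rigid point-set alignment problem. The sketch below is not the patent's method; it is a minimal illustration using the standard Kabsch/Procrustes approach, assuming the virtual and observed marker correspondences are already known (`register_markers` and its example data are hypothetical names for illustration).

```python
import numpy as np

def register_markers(model_pts, observed_pts):
    """Estimate the rigid transform (R, t) mapping marker positions in the
    virtual model onto sensor-observed positions (Kabsch/Procrustes)."""
    model_pts = np.asarray(model_pts, dtype=float)
    observed_pts = np.asarray(observed_pts, dtype=float)
    # Center both point sets on their centroids.
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Example: three non-collinear markers rotated 90 degrees about z and shifted.
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], dtype=float)
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
observed = (R_true @ model.T).T + np.array([5.0, 2.0, 1.0])
R, t = register_markers(model, observed)
```

At least three non-collinear markers are needed to make the rotation unambiguous; in practice the estimate would be computed from noisy sensor data, where this least-squares formulation remains optimal.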
System and method for navigation
Disclosed is a system for assisting in guiding and performing a procedure on a subject. The subject may be any appropriate subject, such as an inanimate object and/or an animate object. The guide and system may include various manipulable or movable members, such as robotic systems, and may be registered to selected coordinate systems.
Surgical robotic system and method for transitioning control to a secondary robot controller
A robotic surgical system and method are disclosed for transitioning control to a secondary robotic arm controller. In one embodiment, a robotic surgical system comprises a user console comprising a display device and a user input device; a robotic arm configured to be coupled to an operating table; a primary robotic arm controller configured to move the robotic arm in response to a signal received from the user input device at the user console; and a secondary robotic arm controller configured to move the robotic arm in response to a signal received from a user input device remote from the user console. Control over movement of the robotic arm is transitioned from the primary robotic arm controller to the secondary robotic arm controller in response to a failure in the primary robotic arm controller. Other embodiments are provided.
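The control-transition behavior described above can be sketched as a small supervisor that routes arm commands to the active controller and hands over to the secondary controller when a primary failure is reported. This is a hypothetical illustration of the failover logic only, not the patent's implementation; all names (`ArmControlSupervisor`, `route_command`) are invented for the sketch.

```python
from enum import Enum

class Controller(Enum):
    PRIMARY = "primary"
    SECONDARY = "secondary"

class ArmControlSupervisor:
    """Routes motion commands to the primary controller until a failure
    in the primary is reported, then transitions control to the secondary."""

    def __init__(self):
        self.active = Controller.PRIMARY

    def report_failure(self, controller):
        # Only a failure in the currently active primary triggers handover.
        if controller is Controller.PRIMARY and self.active is Controller.PRIMARY:
            self.active = Controller.SECONDARY

    def route_command(self, command):
        # Returns which controller would execute the command.
        return (self.active, command)

sup = ArmControlSupervisor()
before = sup.route_command("move_joint_1")
sup.report_failure(Controller.PRIMARY)
after = sup.route_command("move_joint_1")
```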
Methods and apparatus for treating disorders of the ear nose and throat
Methods and apparatus for treating disorders of the ear, nose, throat, or paranasal sinuses, including methods and apparatus for dilating ostia, passageways, and other anatomical structures; endoscopic methods and apparatus for visualization of structures within the ear, nose, throat, or paranasal sinuses; navigation devices for use in conjunction with image guidance or navigation systems; and handheld devices having pistol-type grips and other handpieces.
Method for recording probe movement and determining an extent of matter removed
A method and system for determining an extent of matter removed from a targeted anatomical structure are disclosed. The method includes acquiring an initial representation of a targeted anatomical structure and then removing matter from the targeted anatomical structure. An instrument is then navigated within the targeted anatomical structure. The instrument includes a tracking array, and a relative position of the instrument within the targeted anatomical structure is determined by the tracking array. The method includes recording the relative position of the instrument within the targeted anatomical structure to determine a final representation of the targeted anatomical structure. Finally, the method includes determining an extent of matter removed from the targeted anatomical structure by comparing the initial representation of the targeted anatomical structure with the final representation of the targeted anatomical structure. Indicators are provided to convey the extent of matter remaining within the targeted anatomical structure.
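Comparing the initial and final representations, as the abstract describes, amounts to a set difference between two models of the structure. As a minimal sketch under the assumption that both representations are voxel occupancy masks (the function name and voxel-grid framing are illustrative, not from the patent):

```python
import numpy as np

def removal_extent(initial_mask, final_mask, voxel_volume_mm3=1.0):
    """Compare initial and final voxel representations of the targeted
    structure; report the volume of matter removed and the fraction of
    the initial matter that remains."""
    initial_mask = np.asarray(initial_mask, dtype=bool)
    final_mask = np.asarray(final_mask, dtype=bool)
    # Voxels occupied initially but no longer occupied were removed.
    removed = initial_mask & ~final_mask
    removed_volume = removed.sum() * voxel_volume_mm3
    remaining_fraction = final_mask.sum() / max(initial_mask.sum(), 1)
    return removed_volume, remaining_fraction

initial = np.ones((4, 4, 4), dtype=bool)   # 64 voxels of matter
final = initial.copy()
final[:2] = False                          # half of the matter removed
vol, frac = removal_extent(initial, final, voxel_volume_mm3=0.5)
```

The `remaining_fraction` value is the kind of quantity an indicator could convey to the surgeon about matter still present in the targeted structure.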
Systems and methods to compute a subluxation between two bones
Systems, methods and a sensor alignment mechanism are disclosed for medical navigational guidance systems. In one example, a system to make sterile a non-sterile optical sensor for use in navigational guidance during surgery includes a sterile drape having an optically transparent window to drape the optical sensor in a sterile barrier and a sensor alignment mechanism. The alignment mechanism secures the sensor through the drape in alignment with the window without breaching the sterile barrier and facilitates adjustment of the orientation of the optical sensor. The optical sensor may be aligned to view a surgical site when the alignment mechanism, assembled with the sterile drape and optical sensor, is attached to a bone. The alignment mechanism may be a lockable ball joint and facilitate orientation of the sensor in at least two degrees of freedom. A quick connect mechanism may couple the alignment mechanism to the bone.
Methods and apparatus for intraoperative assessment of parathyroid gland vascularity using laser speckle contrast imaging and applications of same
One aspect of the invention relates to a method for intraoperative assessment of parathyroid gland viability in a surgery. The method includes diffusing a beam of light onto a tissue surface of a parathyroid gland of a patient to illuminate the tissue surface; acquiring images of the illuminated tissue surface, where each of the acquired images includes a speckle pattern; and processing the acquired images to obtain speckle contrast images for the intraoperative assessment of parathyroid gland viability.
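In laser speckle contrast imaging, the standard processing step is to compute a local contrast K = sigma / mean over a sliding window of each raw speckle image: regions with motion (e.g., blood flow) blur the speckles and show low K, while static tissue shows high K. The sketch below illustrates that standard computation, not the patent's specific pipeline; the function name and window size are assumptions.

```python
import numpy as np

def speckle_contrast(image, window=7):
    """Local speckle contrast K = std / mean over a sliding window,
    computed for each pixel of a 2D raw speckle image."""
    img = np.asarray(image, dtype=float)
    pad = window // 2
    padded = np.pad(img, pad, mode="reflect")
    # A direct loop keeps the sketch readable; real pipelines would use
    # cumulative sums or a uniform filter for speed.
    K = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + window, j:j + window]
            mu = patch.mean()
            K[i, j] = patch.std() / mu if mu > 0 else 0.0
    return K
```

A perfectly uniform image yields K = 0 everywhere; a well-perfused gland would show a lower-contrast speckle pattern than a devascularized one.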
Surgical navigation system and method
The present disclosure relates to a surgical navigation system for the alignment of a surgical instrument and methods for its use, wherein the surgical navigation system may comprise a head-mounted display comprising a lens. The surgical navigation system may further comprise a tracking unit, wherein the tracking unit may be configured to track a patient tracker and/or a surgical instrument. Patient data may be registered to the patient tracker. The surgical instrument may define an instrument axis. The surgical navigation system may be configured to plan one or more trajectories based on the patient data. The head-mounted display may be configured to display an augmented reality visualization, including an augmented reality position alignment visualization and/or an augmented reality angular alignment visualization related to the surgical instrument, on the lens of the head-mounted display.
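An angular alignment visualization of the kind described would be driven by the angle between the tracked instrument axis and the planned trajectory. A minimal sketch of that underlying computation (the function name is hypothetical; the patent does not specify this formula):

```python
import numpy as np

def angular_alignment_deg(instrument_axis, planned_trajectory):
    """Angle in degrees between the instrument axis and the planned
    trajectory, both given as 3D direction vectors."""
    a = np.asarray(instrument_axis, dtype=float)
    b = np.asarray(planned_trajectory, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip to guard against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))
```

A display could then color-code the visualization, e.g., green when the angle falls below a planning tolerance.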
Visualization systems using structured light
A visualization system including multiple light sources, an image sensor configured to detect imaging data from the multiple light sources, and a control circuit is disclosed. At least one of the light sources is configured to emit a pattern of structured light. The control circuit is configured to receive the imaging data from the image sensor, generate a three-dimensional digital representation of an anatomical structure from the pattern of structured light detected in the imaging data, obtain metadata from the imaging data, overlay the metadata on the three-dimensional digital representation, receive updated imaging data from the image sensor, and generate an updated three-dimensional digital representation of the anatomical structure based on the updated imaging data. The visualization system can be communicatively coupled to a situational awareness module configured to determine a surgical scenario based on input signals from multiple surgical devices.
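Recovering a three-dimensional representation from structured light rests on triangulation: once the projected pattern identifies which projector column illuminated a camera pixel, depth follows from the camera/projector geometry. The sketch below shows the textbook rectified-pair case (depth = focal length x baseline / disparity); it is a generic illustration, not the patent's reconstruction method, and all parameter names are assumptions.

```python
def depth_from_structured_light(f_px, baseline_mm, x_cam_px, x_proj_px):
    """Triangulate depth for one surface point in a rectified
    camera/projector pair.

    f_px        -- camera focal length in pixels
    baseline_mm -- camera-to-projector baseline in millimeters
    x_cam_px    -- column where the point appears in the camera image
    x_proj_px   -- projector column decoded from the structured-light
                   pattern (e.g., Gray-code or phase-shift patterns)
    """
    disparity = x_cam_px - x_proj_px
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity or decode error")
    return f_px * baseline_mm / disparity

# A point decoded 50 pixels of disparity away lies at 2 m for this geometry.
depth_mm = depth_from_structured_light(1000.0, 100.0, 450.0, 400.0)
```

Repeating this per decoded pixel yields the point cloud behind the three-dimensional digital representation; updated imaging data simply re-runs the same decode-and-triangulate step.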