
FIRING MEMBER TRACKING FEATURE FOR SURGICAL STAPLER

An apparatus includes a shaft assembly, an end effector, and a drive member visualization assembly. The end effector includes a first jaw, a second jaw, a staple cartridge, and a drive member capable of actuating along a firing stroke to fire a plurality of staples out of the staple cartridge and to sever tissue. The drive member visualization assembly provides an electronic indication linked to the physical location of the drive member within the first jaw and the second jaw during the firing stroke.

Ophthalmic microsurgical instrument
11576816 · 2023-02-14 ·

In some embodiments, a microsurgical instrument includes a trocar having a rigid, hollow shaft formed with a lumen extending from a proximal end to a distal end of the shaft. The distal end of the shaft may be shaped for tissue penetration. The instrument may further include a composite microcannula slidably engaged with the trocar in the lumen. The microcannula includes a light guide and a flexible hollow tube having an outer diameter less than an inner diameter of the lumen in the trocar. Other embodiments include placing the microcannula in the lumen of the trocar, illuminating the end of the trocar by illuminating the end of the microcannula, advancing the trocar from a selected entry point on an eye into a selected structure in the eye, and extending the illuminated end of the microcannula from the trocar into the selected structure.

Technique for transferring a registration of image data of a surgical object from one surgical navigation system to another surgical navigation system

A method, a controller, and a surgical hybrid navigation system for transferring a registration of three-dimensional image data of a surgical object from a first to a second surgical navigation system are described. A first tracker that is detectable by a first detector of the first surgical navigation system is arranged in a fixed spatial relationship with the surgical object and a second tracker that is detectable by a second detector of the second surgical navigation system is arranged in a fixed spatial relationship with the surgical object. The method includes registering the three-dimensional image data of the surgical object in a first coordinate system of the first surgical navigation system and determining a first position and orientation of the first tracker in the first coordinate system and a second position and orientation of the second tracker in a second coordinate system of the second surgical navigation system.
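The coordinate chain behind this kind of registration transfer can be sketched with homogeneous transforms. The sketch below is a generic illustration, not code from the patent: the frame names and the assumption of a calibrated fixed transform between the two trackers are mine.

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transfer_registration(T_img_to_s1, T_tracker1_in_s1, T_tracker2_in_s2, T_t2_in_t1):
    """Re-express an image-to-system-1 registration in system 2's coordinates.

    T_t2_in_t1 is the fixed (calibrated) pose of tracker 2 in tracker 1's
    frame; both trackers are rigidly attached to the surgical object, so
    this transform does not change during the procedure.
    """
    # system-2 -> system-1 mapping via the shared rigid body:
    T_s2_to_s1 = T_tracker1_in_s1 @ T_t2_in_t1 @ np.linalg.inv(T_tracker2_in_s2)
    # image -> system-2 = (system-1 -> system-2) composed with (image -> system-1)
    return np.linalg.inv(T_s2_to_s1) @ T_img_to_s1
```

Once `T_img_to_s2` is known, image data registered in the first system can be displayed and tracked by the second system without repeating the registration step.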

Mixed-reality surgical system with physical markers for registration of virtual models

An example method includes obtaining a virtual model of a portion of an anatomy of a patient from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
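Registering a virtual model to observed marker positions is commonly done with paired-point rigid registration. The following is a minimal sketch of the SVD-based Kabsch method, a standard technique for this problem; it is illustrative only and not taken from the patent.

```python
import numpy as np

def register_points(model_pts, observed_pts):
    """Find rotation R and translation t such that
    observed_pts[i] ~= R @ model_pts[i] + t, via the Kabsch method.

    model_pts, observed_pts: (N, 3) arrays of paired marker positions.
    """
    cm = model_pts.mean(axis=0)
    co = observed_pts.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (model_pts - cm).T @ (observed_pts - co)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = co - R @ cm
    return R, t
```

In practice the marker positions come from the sensors and the model positions from the surgical plan; the recovered (R, t) places the virtual model over the observed anatomy.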

Method for recording probe movement and determining an extent of matter removed

A method and system for determining an extent of matter removed from a targeted anatomical structure are disclosed. The method includes acquiring an initial representation of a targeted anatomical structure and then removing matter from the targeted anatomical structure. An instrument is then navigated within the targeted anatomical structure. The instrument includes a tracking array, and a relative position of the instrument within the targeted anatomical structure is determined by the tracking array. The method includes recording the relative position of the instrument within the targeted anatomical structure to determine a final representation of the targeted anatomical structure. Finally, the method includes determining an extent of matter removed from the targeted anatomical structure by comparing the initial representation of the targeted anatomical structure with the final representation of the targeted anatomical structure. Indicators are provided to convey the extent of matter remaining within the targeted anatomical structure.
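Comparing the initial and final representations reduces, in the simplest discretized form, to differencing occupancy on a common voxel grid. This toy sketch is my own illustration of that comparison, not the patent's method.

```python
import numpy as np

def fraction_removed(initial_mask, final_mask):
    """Estimate the fraction of targeted matter removed by comparing
    occupancy masks (True = matter present) defined on the same voxel grid."""
    initial = np.count_nonzero(initial_mask)
    remaining = np.count_nonzero(initial_mask & final_mask)
    return (initial - remaining) / initial if initial else 0.0
```

The complementary quantity, `1 - fraction_removed(...)`, is what an indicator of matter remaining would convey.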

SURGICAL ROBOT PLATFORM
20180000546 · 2018-01-04 ·

A medical robot system includes a robot coupled to an effectuator element, with the robot configured for controlled movement and positioning. The system may include a transmitter configured to emit one or more signals, and the transmitter is coupled to an instrument coupled to the effectuator element. The system may further include a motor assembly coupled to the robot and a plurality of receivers configured to receive the one or more signals emitted by the transmitter. A control unit is coupled to the motor assembly and the plurality of receivers, and the control unit is configured to supply one or more instruction signals to the motor assembly. The instruction signals can be configured to cause the motor assembly to selectively move the effectuator element.

GUIDE CATHETER AND METHOD OF USE

A system for manipulating a guide catheter within a patient's nasal passages or sinus cavities includes a guide catheter formed from an elongate flexible member having a lumen passing therethrough. A wire guide is slidably disposed within the lumen of the guide catheter. The system further includes a steering member fixedly secured to a proximal end of the wire guide and a proximal hub secured to a proximal end of the guide catheter. The system further includes a recessed handle having a first recess for fixedly receiving the proximal hub of the guide catheter and a second recess for receiving the steering member, the second recess being dimensioned to permit axial and rotational movement of the steering member while disposed in the second recess.

Method and system for hand tracking in a robotic system

A method and system for hand tracking in a robotic system includes a hand tracking system and a controller coupled to the hand tracking system. The controller is configured to receive, from the hand tracking system, a plurality of locations of a hand; determine whether the hand is in a first hand pose based on the plurality of locations; and, in response to determining that the hand is in the first hand pose, switch the robotic system to a hand trajectory detection mode. While in the hand trajectory detection mode, the controller is configured to detect, based on hand tracking information from the hand tracking system, that the hand has performed a first hand trajectory of a plurality of known hand trajectories; and, in response to detecting the first hand trajectory, change a mode of operation of the robotic system.
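The two-stage behavior described here, an arming pose followed by trajectory recognition, can be sketched as a small state machine. Everything concrete below (the pinch test as the "first hand pose", the net-displacement swipe classifier, the thresholds) is a hypothetical stand-in, not the patent's classifier.

```python
import numpy as np

IDLE, TRAJECTORY_DETECTION = "idle", "trajectory_detection"

class HandModeController:
    """Toy two-stage controller: a trigger pose arms trajectory detection;
    a recognized trajectory then yields a mode-change command."""

    def __init__(self, pinch_threshold=0.03, path_len=10):
        self.state = IDLE
        self.pinch_threshold = pinch_threshold
        self.path_len = path_len
        self.path = []

    def update(self, thumb_tip, index_tip):
        """Feed one frame of tracked fingertip locations; returns a
        recognized trajectory name, or None."""
        if self.state == IDLE:
            # "first hand pose": thumb and index tips close together (pinch)
            gap = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
            if gap < self.pinch_threshold:
                self.state = TRAJECTORY_DETECTION
                self.path = []
            return None
        # trajectory detection mode: accumulate the fingertip path
        self.path.append(np.asarray(index_tip, dtype=float))
        if len(self.path) >= self.path_len:
            d = self.path[-1] - self.path[0]
            self.state = IDLE
            # classify net displacement as one of the "known trajectories"
            return "swipe_right" if d[0] > 0 else "swipe_left"
        return None
```

A real system would match the path against a library of known trajectories (e.g., by dynamic time warping or a learned classifier) rather than thresholding net displacement.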

System and method for tracking completeness of co-registered medical image data
11707256 · 2023-07-25 ·

A system and method for tracking completeness of co-registered medical image data is disclosed herein. The system and method tracks the position of an anatomical reference marker positionable on a patient and an ultrasound probe during an imaging session and co-registers medical images based on positional data received from the anatomical reference marker and the ultrasound probe. Using the co-registered image data, the system and method generates a surface contour of a region of interest (ROI) of the patient, such as a breast. The surface contour is defined to represent an interface between a chest wall structure and tissue of the ROI in a plurality of co-registered medical images. A completeness map of the image data within the defined surface contour during the imaging session is generated and overlaid on a graphic representation of the ROI.

METHODS FOR OPTICAL TRACKING AND SURFACE ACQUISITION IN SURGICAL ENVIRONMENTS AND DEVICES THEREOF

A computer assisted system is disclosed that includes an optical tracking system and one or more computing devices. The optical tracking system includes an RGB sensor and is configured to capture color images of an environment in the visible light spectrum and tracking images of fiducials in the environment in a near-infrared spectrum. The computer assisted system is configured to generate a color image of the environment using the color images, identify fiducial locations using the tracking images, generate depth maps from the color images, reconstruct three-dimensional surfaces of structures based on the depth maps, and output a display comprising the reconstructed three-dimensional surfaces and one or more surgical objects that are associated with the tracked fiducials. The computer assisted system can further include a monitor or a head-mounted display (HMD) configured to present augmented reality (AR) images during a procedure.
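The surface-reconstruction step, turning a depth map into 3D points, is conventionally done by back-projecting pixels through a pinhole camera model. This is a minimal generic sketch of that step (the intrinsics fx, fy, cx, cy are assumed known from camera calibration), not code from the patent.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into a camera-frame point
    cloud using pinhole intrinsics; one 3D point per valid pixel."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                        # drop pixels with no depth
```

A surface mesh can then be fit to the point cloud, and the tracked fiducial poses place that surface and the surgical objects in a common frame for the AR overlay.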