Patent classifications
A61B2034/2057
ULTRASONIC ROBOTIC SURGICAL NAVIGATION
Surgical robot systems, anatomical structure tracker apparatuses, and US transducer apparatuses are disclosed. A surgical robot system includes a robot, a US transducer, and at least one processor. The robot includes a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector. The at least one processor is operative to obtain an image volume for the patient and to track pose of the end-effector relative to anatomical structure captured in the image volume based on the US imaging data.
Systems and methods for navigation in image-guided medical procedures
Medical imaging systems and methods are provided herein that provide for navigation and procedure planning without segmentation. A method comprises receiving, by a medical imaging system having at least one processing device, three-dimensional image data of a patient anatomy. The method also comprises filtering the three-dimensional image data to display a portion of the three-dimensional image data that is associated with the patient anatomy and receiving, at the processing device, input from an operator input device. The input comprises navigational directions for virtual movement within a space defined by the three-dimensional image data. The method also includes tracking the virtual movement, defining a tracked pathway based on the tracked virtual movement, and generating a model of the patient anatomy based on the tracked pathway. The model of the patient anatomy is a line model including one or more lines based on the tracked pathway.
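The line-model generation described above can be illustrated with a minimal sketch: sample the tracked virtual-movement positions and keep a new vertex only after the viewpoint has moved some minimum distance, so consecutive vertex pairs form the lines of the model. The function name, the distance-threshold simplification, and the units are assumptions for illustration, not the patented method.

```python
import numpy as np

def build_line_model(tracked_points, min_segment_length=2.0):
    """Reduce a tracked virtual-movement pathway to a line model.

    A new vertex is kept only once the pathway has moved at least
    min_segment_length (same units as the image data) from the last
    kept vertex; consecutive vertices define the lines of the model.
    """
    points = np.asarray(tracked_points, dtype=float)
    vertices = [points[0]]
    for p in points[1:]:
        if np.linalg.norm(p - vertices[-1]) >= min_segment_length:
            vertices.append(p)
    # Each consecutive vertex pair is one line segment of the model.
    segments = list(zip(vertices[:-1], vertices[1:]))
    return np.asarray(vertices), segments
```

In practice the tracked points would come from the operator's virtual camera pose as they "fly" through the filtered image volume.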
Hip replacement navigation systems and methods
Hip joint navigation systems and methods are provided. In some embodiments, the systems and methods described herein determine a table reference plane that approximates the Anterior Pelvic Plane. In some embodiments, the systems and methods described herein measure a pre-operative and post-operative point. In some embodiments, the comparison of the pre-operative and post-operative point corresponds to changes in leg length and joint offset. In some embodiments, the systems and methods described herein determine an Adjusted Plane. In some embodiments, the Adjusted Plane adjusts for tilt by rotating the Anterior Pelvic Plane about the inter-ASIS line. In some embodiments, the Adjusted Plane improves correlation between navigated cup angles and post-operative images.
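The tilt adjustment described above, rotating the Anterior Pelvic Plane about the inter-ASIS line, amounts to rotating the plane's normal vector about that axis. A minimal sketch using Rodrigues' rotation formula follows; the function name and sign convention for the tilt angle are assumptions for illustration.

```python
import numpy as np

def adjusted_plane_normal(app_normal, inter_asis_axis, tilt_deg):
    """Rotate the Anterior Pelvic Plane normal about the inter-ASIS
    axis by a pelvic-tilt angle, using Rodrigues' rotation formula."""
    n = np.asarray(app_normal, dtype=float)
    k = np.asarray(inter_asis_axis, dtype=float)
    k = k / np.linalg.norm(k)          # rotation axis must be a unit vector
    t = np.radians(tilt_deg)
    return (n * np.cos(t)
            + np.cross(k, n) * np.sin(t)
            + k * np.dot(k, n) * (1.0 - np.cos(t)))
```

The rotated normal defines the Adjusted Plane, against which navigated cup angles would then be measured.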
Trackable protective packaging for tools and methods for calibrating tool installation using the same
Protective packaging, surgical kits, systems, and methods are described herein for assisting in determining whether a tool is properly installed on a surgical device. The protective packaging retains the tool and has trackable features defined relative to a tool center point of the tool. The trackable features have a predetermined state defined relative to the tool center point and the trackable features are configured to be detectable by a localizer to locate the tool center point. One or more controllers can compare the actual state of the tool center point with an expected state of the tool center point, which is based on an expected condition in which the tool is properly mounted to the surgical device. Based on the comparison, the one or more controllers can determine whether the tool is properly mounted to the surgical device.
Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
A virtual model of a planned instrument attachment can be provided to ensure correct selection of a physical instrument attachment. An XR headset controller can generate a shape and a pose of the virtual model of the planned instrument attachment based on predetermined information associated with the planned instrument attachment and based on a pose of an instrument relative to the XR headset. An XR headset can display the virtual model on a see-through display screen of the XR headset that is configured to allow at least a portion of a real-world scene to pass therethrough.
Intraoperative alignment assessment system and method
Some embodiments provide systems, assemblies, and methods of analyzing patient anatomy, including providing an analysis of a patient's spine. The systems, assemblies, and/or methods can include obtaining initial patient data and acquiring spinal alignment contour information. Further, the systems, assemblies, and/or methods can assess localized anatomical features of the patient and obtain anatomical region data. The systems, assemblies, and/or methods can analyze the localized anatomy and therapeutic device location and contouring. Further, the systems, assemblies, and/or methods can output localized anatomical analyses and therapeutic device contouring data and/or imagery on a display.
Technique for providing user guidance in surgical navigation
A technique of providing user guidance for surgical navigation is provided. A method implementation of the technique includes obtaining a predetermined spatial relationship between an optical tracking pattern and a through-hole extending through an implant, obtaining image data of the optical tracking pattern acquired by an imaging unit attached to a surgical instrument, obtaining a spatial relationship between the surgical instrument and the imaging unit at a point in time when the image data have been acquired, determining a spatial relationship between the surgical instrument and the through-hole, obtaining a plurality of predefined spatial relationships between the surgical instrument and the through-hole, and triggering simultaneous display of an indication of the plurality of predefined spatial relationships and an indication of the spatial relationship between the surgical instrument and the through-hole.
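The core of the method above is chaining known spatial relationships: given the instrument-to-imaging-unit transform, the imaging-unit-to-pattern transform from the acquired image data, and the predetermined pattern-to-through-hole transform, their composition yields the instrument-to-through-hole relationship. A sketch using 4x4 homogeneous transforms follows; the function names and frame-ordering convention are assumptions for illustration.

```python
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms, applied left to right."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def instrument_to_hole(T_instrument_imaging, T_imaging_pattern, T_pattern_hole):
    """Chain the three known spatial relationships (instrument->imaging
    unit, imaging unit->optical tracking pattern, pattern->through-hole)
    into the instrument-to-through-hole transform."""
    return compose(T_instrument_imaging, T_imaging_pattern, T_pattern_hole)
```

The resulting transform is what would be compared against the plurality of predefined instrument-to-hole relationships to drive the on-screen guidance.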
HYBRID MULTI-CAMERA TRACKING FOR COMPUTER-GUIDED SURGICAL NAVIGATION
The invention relates to a camera system for surgical navigation systems including a plurality of cameras mounted in a room. At least three cameras mounted in the room are operated in at least two different modes. In a first mode, at least a subset of the cameras is operated to determine the position of markers, and in a second mode at least a subset of the cameras is operated to determine the position of surfaces of the room.
SYSTEMS AND METHODS FOR VENTRICLE PROCEDURES
A computing device having a first processor configured to receive three-dimensional imaging data acquired by an imaging system, the three-dimensional imaging data being of a head of a subject. The first processor can be configured to determine vascular structures from the three-dimensional imaging data; generate a three-dimensional model of the head of the subject from the three-dimensional imaging data, the three-dimensional model of the head including a three-dimensional model of the vascular structures and a three-dimensional model of a portion of a frontal horn of the subject; determine an entry point on the three-dimensional model of the head of the subject; determine a plurality of trajectories, each trajectory being linear, intersecting the three-dimensional model of the portion of the frontal horn, and not intersecting the three-dimensional model of the vascular structures; and select a final trajectory from the plurality of trajectories.
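The trajectory-selection step above can be sketched with heavily simplified geometry: model the frontal-horn target and the vessels as spheres, test each straight entry-to-target segment for vessel collisions, and keep the first collision-free candidate. The function names, the sphere approximation, and the first-hit selection rule are illustrative assumptions, not the patented method.

```python
import numpy as np

def segment_hits_sphere(p0, p1, center, radius):
    """True if the segment p0->p1 passes within `radius` of `center`."""
    p0, p1, c = (np.asarray(v, dtype=float) for v in (p0, p1, center))
    d = p1 - p0
    # Parameter of the closest point on the segment to the sphere center.
    t = np.clip(np.dot(c - p0, d) / np.dot(d, d), 0.0, 1.0)
    return np.linalg.norm(p0 + t * d - c) <= radius

def select_trajectory(entry, targets, vessels):
    """Return (entry, target_center) for the first linear trajectory that
    reaches a target sphere without passing through any vessel sphere.

    targets, vessels: lists of (center, radius) tuples.
    Returns None if no collision-free trajectory exists.
    """
    for target_center, _target_radius in targets:
        if any(segment_hits_sphere(entry, target_center, vc, vr)
               for vc, vr in vessels):
            continue  # this candidate would pierce a vessel
        return entry, target_center
    return None
```

A real planner would rank the collision-free candidates (e.g., by vessel clearance) rather than taking the first, and would use the actual segmented meshes instead of spheres.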
MEDICAL ROBOTIC SYSTEMS, OPERATION METHODS AND APPLICATIONS OF SAME
A robotic system for treating a patient includes a robotic arm with an end effector for treating the patient, wherein the robotic arm is configured to be movable in a space surrounding the patient, and the end effector is configured to be movable individually and co-movable with the robotic arm in said space; a sensing device for acquiring data associated with coordinates and images of the end effector and the patient; and a controller in communication with the robotic arm and the sensing device for controlling movements of the robotic arm with the end effector and treatments of the patient with the end effector based on the acquired data and a treatment plan.