Patent classifications
A61B2034/2068
ANATOMICAL SCANNING, TARGETING, AND VISUALIZATION
A method for visualizing and targeting anatomical structures inside a patient utilizing a handheld screen device may include grasping the handheld screen device and manipulating a position of the handheld screen device relative to the patient. The handheld screen device may include a camera and a display. The method may also include orienting the camera on the handheld screen device relative to an anatomical feature of the patient by manipulating the position of the handheld screen device relative to the patient, capturing first image data of light reflecting from a surface of the anatomical feature with the camera on the handheld screen device, and comparing the first image data with a pre-operative 3-D image of the patient to determine a location of an anatomical structure located inside the patient and positioned relative to the anatomical feature of the patient.
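The comparison step described above amounts to rigidly registering surface points seen by the handheld camera to the pre-operative 3-D image, then mapping the internal structure's known model-space location through that transform. The sketch below illustrates this with the Kabsch algorithm on point correspondences; the landmark values and the use of Kabsch specifically are illustrative assumptions, not details from the patent.

```python
import numpy as np

def kabsch(src, dst):
    """Best-fit rigid transform (R, t) mapping src points onto dst points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Surface landmarks from the pre-operative 3-D image (model space) ...
model_surface = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])

# ... and the same landmarks as reconstructed from the camera's image data
# (here synthesized with a known ground-truth pose for the demonstration).
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([5.0, -2.0, 1.0])
camera_surface = model_surface @ R_true.T + t_true

R, t = kabsch(model_surface, camera_surface)

# An internal structure known only in model space maps into camera space:
internal_target_model = np.array([3.0, 4.0, -6.0])
internal_target_camera = R @ internal_target_model + t
```

With the transform recovered, any anatomical structure segmented in the pre-operative image can be positioned relative to the visible anatomical feature.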
METHODS AND SYSTEMS FOR DISPLAYING PREOPERATIVE AND INTRAOPERATIVE IMAGE DATA OF A SCENE
Mediated-reality imaging systems, methods, and devices are disclosed herein. In some embodiments, an imaging system includes a camera array configured to (i) capture intraoperative image data of a surgical scene in substantially real-time and (ii) track a tool through the scene. The imaging system is further configured to receive and/or store preoperative image data, such as medical scan data corresponding to a portion of a patient in the scene. The imaging system can register the preoperative image data to the intraoperative image data, and display the preoperative image data and a representation of the tool on a user interface, such as a head-mounted display.
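Once the registration transform is known, overlaying the tracked tool on the preoperative data reduces to mapping the tool's coordinates through that transform and into scan voxel indices. A minimal sketch, assuming a translation-only 4×4 registration and made-up voxel spacing (neither is specified by the patent):

```python
import numpy as np

# Hypothetical 4x4 rigid registration produced by the system: it maps
# intraoperative (camera-array) coordinates into preoperative CT coordinates.
T_ct_from_cam = np.eye(4)
T_ct_from_cam[:3, 3] = [12.0, -4.0, 30.0]      # translation only, for brevity (mm)

# Tracked tool tip in the camera-array frame (homogeneous coordinates, mm).
tool_tip_cam = np.array([1.0, 2.0, 3.0, 1.0])
tool_tip_ct = (T_ct_from_cam @ tool_tip_cam)[:3]

# Convert the CT-space position to a voxel index so the tip can be drawn
# on the matching scan slice (spacing and origin are assumed values).
spacing = np.array([0.5, 0.5, 1.0])            # mm per voxel
origin = np.array([0.0, 0.0, 0.0])             # CT volume origin (mm)
voxel_index = np.round((tool_tip_ct - origin) / spacing).astype(int)
```

The user interface can then render the tool representation at `voxel_index` on the corresponding preoperative slice.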
Virtual 6-DOF Tracker for Surgical Navigation
Disclosed is a method for use in surgical navigation, the method being performed by a computing system. A first pose of a first tracker is determined in exactly four degrees of freedom, DOF, and a second pose of a second tracker is determined in exactly four DOF. Based on the first pose and the second pose, a third pose of a virtual tracker may be determined in six DOF. The virtual tracker has a fixed spatial relationship relative to an anatomical object in the six DOF of the third pose. A transformation between the third pose and a fourth pose of the anatomical object in six DOF in an image coordinate system of image data of the anatomical object may be determined or obtained. The present disclosure further relates to a computing system, a surgical navigation system and a computer program product.
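One way to picture how two lower-DOF measurements can yield a full 6-DOF pose: suppose each tracker reports its 3-D position plus a roll angle about the baseline between them (a simplifying assumption for illustration, not the patent's construction). The two positions fix the baseline direction, and the roll resolves the remaining rotational freedom:

```python
import numpy as np

def virtual_tracker_pose(p1, p2, roll):
    """Fuse two tracker positions and a shared roll angle into a 6-DOF pose:
    origin at the midpoint, x-axis along the baseline, roll resolving the
    rotation about that axis."""
    x = (p2 - p1) / np.linalg.norm(p2 - p1)
    # Any reference vector not parallel to x seeds the remaining axes.
    ref = np.array([0.0, 0.0, 1.0]) if abs(x[2]) < 0.9 else np.array([0.0, 1.0, 0.0])
    y0 = np.cross(ref, x)
    y0 /= np.linalg.norm(y0)
    z0 = np.cross(x, y0)
    # Apply the roll about the baseline axis.
    y = np.cos(roll) * y0 + np.sin(roll) * z0
    z = np.cross(x, y)
    R = np.column_stack([x, y, z])      # rotation part of the 6-DOF pose
    origin = (p1 + p2) / 2.0            # translation part
    return R, origin

R, origin = virtual_tracker_pose(np.array([0.0, 0.0, 0.0]),
                                 np.array([10.0, 0.0, 0.0]),
                                 roll=0.0)
```

Because the virtual tracker is rigidly related to the anatomical object, composing this pose with a fixed offset transform yields the object's pose in six DOF.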
Modular Orthopedic Implants, Instruments, and Navigation Methods
Modular orthopedic implants, associated instruments, and navigation methods. The modular orthopedic fixation assembly may include a modular bone fastener and a modular tulip head configured to be installed separately. The modular bone fastener may be installed and tracked with a screw extender instrument having an outer sleeve and an inner shaft coupled to the bone fastener. The screw extender instrument may continue to track the location and orientation of the bone throughout the surgical procedure for navigational integrity. The modular tulip head may be assembled to the bone fastener with a head inserter instrument, which ensures the modular head is properly seated on the installed bone fastener.
ULTRASONIC ROBOTIC SURGICAL NAVIGATION
Surgical robot systems, anatomical structure tracker apparatuses, and US transducer apparatuses are disclosed. A surgical robot system includes a robot, a US transducer, and at least one processor. The robot includes a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector. The at least one processor is operative to obtain an image volume for a patient and to track a pose of the end-effector relative to anatomical structure captured in the image volume based on the US imaging data.
Robotic surgical system with virtual control panel for tool actuation
A surgical system includes a detector, comprising an array of pixels configured to detect light reflected by a surgical instrument and generate a first signal comprising a first dataset representative of a visible image of the surgical instrument. The surgical system also includes a processor configured to receive the first signal and generate a modified image of the surgical instrument that includes a control panel. The control panel includes one or more control elements representative of one or more operating parameters of the surgical instrument. The processor is further configured to receive an input to the control panel from a user, the input being effective to change one of the operating parameters. The processor is also configured to generate a command signal based on the input to change the one of the operating parameters.
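The input-to-command path can be sketched as a panel of named control elements, each mapped to an operating parameter, where user input is validated and turned into a command signal. The element names, ranges, and clamping behavior below are illustrative assumptions, not details from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ControlElement:
    """One control element tied to an instrument operating parameter."""
    name: str
    value: float
    min_value: float
    max_value: float

@dataclass
class VirtualControlPanel:
    """Overlay panel mapping user input to instrument command signals."""
    elements: dict = field(default_factory=dict)

    def handle_input(self, name, new_value):
        e = self.elements[name]
        # Clamp to the parameter's permitted operating range before commanding.
        e.value = max(e.min_value, min(e.max_value, new_value))
        return {"command": "set_parameter", "parameter": name, "value": e.value}

panel = VirtualControlPanel()
panel.elements["energy_level"] = ControlElement("energy_level", 30.0, 0.0, 100.0)
cmd = panel.handle_input("energy_level", 150.0)   # out-of-range input is clamped
```

The returned dictionary stands in for the command signal the processor would send to the surgical instrument.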
Hip replacement navigation systems and methods
Hip joint navigation systems and methods are provided. In some embodiments, the systems and methods described herein determine a table reference plane that approximates the Anterior Pelvic Plane. In some embodiments, the systems and methods described herein measure a pre-operative and post-operative point. In some embodiments, the comparison of the pre-operative and post-operative point corresponds to changes in leg length and joint offset. In some embodiments, the systems and methods described herein determine an Adjusted Plane. In some embodiments, the Adjusted Plane adjusts for tilt by rotating the Anterior Pelvic Plane about the inter-ASIS line. In some embodiments, the Adjusted Plane improves correlation between navigated cup angles and post-operative images.
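The two computations named above, rotating the Anterior Pelvic Plane about the inter-ASIS line and differencing pre-/post-operative points, can be sketched directly. The rotation is Rodrigues' formula; the landmark coordinates, tilt angle, and axis conventions for leg length and offset are assumptions for illustration:

```python
import numpy as np

def rotate_about_axis(v, axis, angle):
    """Rodrigues' formula: rotate v about a unit axis by angle (radians)."""
    k = axis / np.linalg.norm(axis)
    return (v * np.cos(angle)
            + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1.0 - np.cos(angle)))

# Anterior Pelvic Plane normal and inter-ASIS axis (illustrative, tracker frame).
app_normal = np.array([0.0, 0.0, 1.0])
inter_asis = np.array([1.0, 0.0, 0.0])

tilt = np.deg2rad(7.0)                        # assumed measured pelvic tilt
adjusted_normal = rotate_about_axis(app_normal, inter_asis, tilt)

# Pre-/post-operative femoral reference point (mm); component-wise differences
# give the leg-length and offset changes under the assumed axis conventions.
pre_point  = np.array([40.0, 10.0, -5.0])
post_point = np.array([43.0, 10.0, -9.0])
leg_length_change = post_point[2] - pre_point[2]   # superior-inferior axis
offset_change     = post_point[0] - pre_point[0]   # medial-lateral axis
```

The `adjusted_normal` defines the Adjusted Plane against which navigated cup angles would be reported.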
Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electromagnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric, perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
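Rendering "from the point of view of the surgical instrument" amounts to building a view matrix from the tracked tip position and pointing direction, then rendering the volume through it. A minimal look-at construction, with made-up tip pose values (the patent does not specify the rendering math):

```python
import numpy as np

def look_at(eye, forward, up):
    """Build a 4x4 view matrix placing the virtual camera at the instrument
    tip, looking along its tracked pointing direction."""
    f = forward / np.linalg.norm(forward)
    s = np.cross(f, up)
    s /= np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye     # move the world into camera space
    return view

tip = np.array([0.0, 0.0, 0.0])           # tracked instrument tip (mm)
direction = np.array([0.0, 0.0, 1.0])     # tracked pointing direction
V = look_at(tip, direction, np.array([0.0, 1.0, 0.0]))

# A point 10 mm ahead of the tip lands on the camera's forward (-z) axis:
p = V @ np.array([0.0, 0.0, 10.0, 1.0])
```

Updating `eye` and `forward` from the electromagnetic tracking data each frame keeps the volumetric perspective image locked to the instrument.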
Method for bone registration and surgical robot
The present disclosure provides a surgical robot including a control system, a force identification system, a robotic arm system and a navigation system, the robotic arm system including a robotic arm and a robotic arm terminal detachably connected to a trackable element. The navigation system acquires and provides a registration point of interest on an object to the robotic arm system. The robotic arm system controls movements of the robotic arm to drive the trackable element to move to the registration point of interest. The force identification system detects and provides a force applied to the robotic arm terminal to the control system. The control system determines whether the trackable element has moved to the registration point of interest on the object. The present disclosure also provides a method for bone registration of the surgical robot.
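The control system's decision, using the detected force to judge whether the trackable element has reached the registration point, can be pictured as a simple predicate combining a contact-force threshold with positional convergence. The thresholds and the AND logic are illustrative assumptions, not the patent's criterion:

```python
def contact_confirmed(force_newtons, position_error_mm,
                      force_threshold=2.0, error_tolerance=1.0):
    """Treat the trackable element as seated on the registration point when
    the measured contact force exceeds a threshold while the commanded
    position has converged (both thresholds are illustrative values)."""
    return force_newtons >= force_threshold and position_error_mm <= error_tolerance
```

For example, a firm touch near the target (`contact_confirmed(3.0, 0.5)`) confirms the point, while free-space motion (`contact_confirmed(0.1, 0.5)`) does not.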
Surgical navigation system providing attachment metrics
A system and method for providing enhanced information to a surgeon is described. A three-dimensional reconstruction of a patient's anatomical structure selected for surgery and a representation of a surgical treatment apparatus are rendered on a display device. At least one attachment metric for a proposed attachment between the surgical treatment apparatus and the patient's anatomical structure is calculated using the three-dimensional position of the surgical treatment apparatus relative to the patient's anatomical structure. An indication of the attachment metric is then rendered on the display device.
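The abstract does not say which attachment metrics are computed, but two plausible examples derivable from the relative 3-D position are the tip-to-entry-point distance and the angle between the apparatus axis and the bone surface normal. A sketch under those assumptions:

```python
import numpy as np

def attachment_metrics(tool_tip, tool_axis, entry_point, surface_normal):
    """Two illustrative attachment metrics: distance from the apparatus tip to
    the planned entry point (mm) and the angle between the apparatus axis and
    the bone surface normal (degrees)."""
    distance = np.linalg.norm(tool_tip - entry_point)
    a = tool_axis / np.linalg.norm(tool_axis)
    n = surface_normal / np.linalg.norm(surface_normal)
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(a, n)), 0.0, 1.0)))
    return distance, angle

dist, angle = attachment_metrics(np.array([3.0, 4.0, 0.0]),   # tracked tip
                                 np.array([0.0, 0.0, 1.0]),   # apparatus axis
                                 np.array([0.0, 0.0, 0.0]),   # planned entry
                                 np.array([0.0, 0.0, 1.0]))   # surface normal
```

Values like these would be rendered alongside the three-dimensional reconstruction as the indication of the attachment metric.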