Patent classifications
A61B2090/367
THREE-DIMENSIONAL INSTRUMENT POSE ESTIMATION
The present disclosure relates to systems, devices, and methods for augmenting a two-dimensional image with three-dimensional pose information of instruments shown in the two-dimensional image.
Surgical navigation with stereovision and associated methods
A surgical guidance system has two cameras that provide a stereo image stream of a surgical field, and a stereo viewer. The system has a 3D surface extraction module that generates a first 3D model of the surgical field from the stereo image stream; a registration module for co-registering annotating data with the first 3D model; and a stereo image enhancer for graphically overlaying at least part of the annotating data onto the stereo image stream to form an enhanced stereo image stream for display, where the enhanced stereo stream enhances a surgeon's perception of the surgical field. The registration module has an alignment refiner that adjusts registration of the annotating data with the 3D model based upon matching of features within the 3D model and features within the annotating data, and, in an embodiment, a deformation modeler that deforms the annotating data based upon a determined tissue deformation.
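The patent does not disclose a specific algorithm for the alignment refiner, but a common way to adjust a rigid registration from matched feature pairs is the Kabsch (orthogonal Procrustes) method. The sketch below is illustrative only; the function name and inputs are assumptions, not taken from the patent.

```python
import numpy as np

def refine_alignment(model_pts, annot_pts):
    """Estimate the rigid transform (R, t) that best maps annot_pts onto
    model_pts in the least-squares sense (Kabsch algorithm).

    Both inputs are (N, 3) arrays of matched feature points: one set from
    the extracted 3D surface model, one from the annotating data.
    """
    mu_m = model_pts.mean(axis=0)
    mu_a = annot_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (annot_pts - mu_a).T @ (model_pts - mu_m)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_m - R @ mu_a
    return R, t
```

After refinement, the annotating data would be mapped through `R @ p + t` before being overlaid onto the stereo stream; a deformation modeler, as in the patent's embodiment, would apply a non-rigid correction on top of this rigid estimate.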
SYSTEMS AND METHODS OF USING THREE-DIMENSIONAL IMAGE RECONSTRUCTION TO AID IN ASSESSING BONE OR SOFT TISSUE ABERRATIONS FOR ORTHOPEDIC SURGERY
Systems and methods for calculating external bone loss for alignment of pre-diseased joints comprising: generating a three-dimensional (“3D”) computer model of an operative area from at least two two-dimensional (“2D”) radiographic images, wherein at least a first radiographic image is captured at a first position, and wherein at least a second radiographic image is captured at a second position, and wherein the first position is different than the second position; identifying an area of bone loss on the 3D computer model; and applying a surface adjustment algorithm to calculate an external missing bone surface fitting the area of bone loss.
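Generating a 3D model from two radiographic images captured at different positions typically rests on two-view triangulation. The patent does not specify its reconstruction method; the following is a minimal linear (DLT) triangulation sketch under the assumption that calibrated projection matrices are available for both acquisition positions.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D landmark from two views.

    P1, P2   : (3, 4) projection matrices for the two radiographic poses.
    uv1, uv2 : (u, v) image coordinates of the same landmark in each view.
    Returns the landmark position in the common world frame.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Triangulating a set of bone landmarks this way yields the point cloud onto which a surface adjustment algorithm, such as the one claimed, could fit the external missing bone surface.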
ULTRASOUND SLICE ENHANCEMENT
In one embodiment a system includes an ultrasound probe to capture 2D ultrasonic images of a body part of a living subject, and a processor to generate a 3D anatomical map of the body part, the 3D anatomical map and the 2D ultrasonic images being registered with a 3D coordinate space. The processor adds a 3D indication of an anatomical structure to the 3D anatomical map, renders to a display the 3D anatomical map including the 3D indication of the anatomical structure, and renders to the display a given one of the 2D ultrasonic images with a 2D indication of the anatomical structure on the given 2D ultrasonic image, responsively to the 3D indication of the anatomical structure.
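Deriving the 2D indication from the 3D indication amounts to expressing a 3D point in the coordinate frame of the registered ultrasound slice. The patent does not give the computation; a minimal sketch, assuming the slice plane is described by an origin and two orthonormal in-plane axes in the shared 3D space, might look like this (all names are illustrative):

```python
import numpy as np

def point_to_slice_2d(p_3d, slice_origin, u_axis, v_axis):
    """Map a 3D anatomical point into the 2D coordinates of an
    ultrasound slice.

    slice_origin : 3D position of the slice's image origin.
    u_axis, v_axis : orthonormal in-plane direction vectors of the slice.
    Returns (u, v) in-plane coordinates plus the signed out-of-plane
    distance, which indicates how far the structure lies from this slice.
    """
    normal = np.cross(u_axis, v_axis)
    d = p_3d - slice_origin
    return float(d @ u_axis), float(d @ v_axis), float(d @ normal)
```

A renderer could draw the 2D indication at (u, v) and, for example, fade or annotate it based on the out-of-plane distance when the structure does not intersect the slice exactly.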
System and methods for correcting image data of distinct images and generating stereoscopic three-dimensional images
An optical imaging system for imaging a target during a medical procedure, the optical imaging system comprising a first camera for capturing a first image of the target, a second wide-field camera for capturing a second image of the target, at least one optional path folding mirror disposed in an optical path between the target and a lens of the second camera, and a processor for receiving the first image and the second image, the processor configured to apply an image transform to one of the first image and the second image and to combine the transformed image with the other one of the images to produce a stereoscopic image of the target.
HYBRID MULTI-CAMERA TRACKING FOR COMPUTER-GUIDED SURGICAL NAVIGATION
The invention relates to a camera system for surgical navigation systems including a plurality of cameras mounted in a room. At least three cameras are mounted in the room and are operated in at least two different modes: in the first mode at least a subset of the cameras is operated to determine the position of markers, and in the second mode at least a subset of the cameras is operated to determine the position of surfaces of the room.
METHOD AND SYSTEM FOR REPRODUCING AN INSERTION POINT FOR A MEDICAL INSTRUMENT
The invention relates to a method for displaying an insertion point for a medical instrument. The method comprises the following steps: Providing at least one marker on a surface of an object, with such marker exhibiting the property that it can be recorded both tomographically, in particular fluoroscopically, and also optically; Generating tomographic image data that can be used to reconstruct a fluoroscopic image of the at least one marker, located on the surface of the object, together with the object; Determining the insertion point for the medical instrument on the surface of the object relative to the at least one marker in the coordinate system of the tomographic image data; Generating visual image data that can be used to reconstruct a visual image of the at least one marker, located on the surface of the object, together with the object; Transforming the coordinates of the insertion point from the coordinate system of the tomographic image data into the coordinate system of the visual image data using the relative position of the insertion point to the at least one marker; and Displaying the insertion point for the medical instrument in real time in a view of the object.
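The transforming step uses the insertion point's position relative to the markers, which appear in both the tomographic and the visual data. One illustrative way to realize this (assuming at least three non-collinear markers; the helper names below are hypothetical, not from the patent) is to build a local frame from the markers in each modality, express the insertion point in that frame, and reconstruct it in the other modality:

```python
import numpy as np

def marker_frame(m0, m1, m2):
    """Build an orthonormal frame (origin, 3x3 basis) from three
    non-collinear 3D marker positions."""
    x = m1 - m0
    x = x / np.linalg.norm(x)
    z = np.cross(x, m2 - m0)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return m0, np.column_stack([x, y, z])

def transfer_point(p_tomo, markers_tomo, markers_visual):
    """Carry a point from tomographic coordinates to visual coordinates
    via markers visible in both modalities (each markers_* argument is a
    sequence of three 3D marker positions)."""
    o_t, B_t = marker_frame(*markers_tomo)
    o_v, B_v = marker_frame(*markers_visual)
    local = B_t.T @ (p_tomo - o_t)   # coordinates relative to the markers
    return o_v + B_v @ local
```

Because the markers define the same physical frame in both data sets, the insertion point keeps its marker-relative position and lands at the corresponding location in the visual image, ready for real-time display.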
SYSTEMS AND METHODS FOR IDENTIFYING AND FACILITATING AN INTENDED INTERACTION WITH A TARGET OBJECT IN A SURGICAL SPACE
An exemplary system includes a memory storing instructions and a processor communicatively coupled to the memory. The processor may be configured to execute the instructions to: detect an intent of a user of a computer-assisted surgical system to use a robotic instrument attached to the computer-assisted surgical system to interact with a target object while the target object is located in a surgical space; determine a pose of the target object in the surgical space; and perform, based on the detected intent of the user to interact with the target object and the determined pose of the target object in the surgical space, an operation with respect to the target object.
SYSTEMS AND METHODS FOR PLANNING A PATIENT-SPECIFIC SPINAL CORRECTION
Systems and methods are provided to plan a spinal correction surgery. The method includes measuring parameters of a spine in a two-dimensional (2D) spinal image, including a thoracic Cobb angle and a thoracic kyphosis (TK), and transforming the 2D image to a three-dimensional (3D) spinal image representation. The transforming includes performing segmentation of spine elements in the 2D image, and applying a formula based on the thoracic Cobb angle and the TK to the spine elements. The method includes identifying a TK goal having a post-operative TK value for selected spine elements, transforming a gap of the spine elements representative of a difference between the pre-operative TK in the 3D spinal image representation and the TK goal to create a 3D post-operative spinal image representation, and determining a first rod design based on the 3D post-operative spinal image representation to achieve the post-operative TK value in the spine elements.
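A Cobb angle is conventionally measured as the angle between two vertebral endplate lines on a frontal radiograph. The patent does not spell out the computation; a minimal sketch, assuming each endplate is given as a pair of 2D landmark points (the function name is illustrative):

```python
import math

def cobb_angle(endplate_top, endplate_bottom):
    """Cobb angle in degrees between two endplate lines, each given as a
    pair of 2D landmark points picked on a radiograph.

    The result is folded into [0, 90] degrees, matching the clinical
    convention of reporting the acute angle between the two lines.
    """
    (x1, y1), (x2, y2) = endplate_top
    (x3, y3), (x4, y4) = endplate_bottom
    a1 = math.atan2(y2 - y1, x2 - x1)
    a2 = math.atan2(y4 - y3, x4 - x3)
    ang = abs(math.degrees(a1 - a2)) % 180.0
    return min(ang, 180.0 - ang)
```

The thoracic kyphosis angle can be measured the same way on the lateral view, giving the two parameters the claimed 2D-to-3D transformation formula depends on.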
SYSTEM FOR PLANNING THE INTRODUCTION OF A NEEDLE IN A PATIENT'S BODY
The invention relates to a system for planning introduction of a needle in a patient's body, comprising: a needle guide configured to be coupled to a needle; a localization system configured for tracking the needle guide with respect to the patient's body, the localization system being coupled to a needle tracker attached to the needle guide and a reference marker adapted to be attached to the patient's body to determine a spatial position and orientation of the needle tracker relative to the reference marker; a processor configured for determining a virtual position and orientation of the needle with respect to a 3D medical image of the patient's body using localization data of the needle guide, thereby defining a virtual needle having said virtual position and orientation, detecting a part of the needle that has already been inserted into the patient's body as a trace in the 3D medical image, computing a distance between the virtual needle and the detected needle, and determining a representation of the computed distance; and a display coupled to the processor for displaying a representation of the virtual needle and the representation of the computed distance between the virtual needle and the detected part of the needle.
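The distance between the virtual needle and the detected needle trace can be computed in several ways; one natural choice, sketched below under the assumption that the virtual needle is modeled as an axis (a point plus a unit direction) and the trace as segmented 3D points, is the mean perpendicular deviation of the trace from that axis. The function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def needle_deviation(tip, direction, trace_pts):
    """Mean perpendicular distance between the detected needle trace and
    the virtual needle axis.

    tip       : a 3D point on the virtual needle axis.
    direction : unit vector along the virtual needle.
    trace_pts : (N, 3) points segmented from the 3D medical image.
    """
    d = trace_pts - tip
    # Subtract each point's component along the axis; the remainder is
    # the perpendicular offset from the virtual needle.
    perp = d - np.outer(d @ direction, direction)
    return float(np.linalg.norm(perp, axis=1).mean())
```

A display could then render this scalar next to the virtual needle, giving the clinician a direct readout of how far the inserted needle has strayed from the planned trajectory.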