Patent classifications
A61B2034/2065
AUGMENTED/MIXED REALITY SYSTEM AND METHOD FOR ORTHOPAEDIC ARTHROPLASTY
Augmented and/or mixed reality systems for performing various types of arthroplasty are provided, along with methods of performing various types of arthroplasty using such augmented reality systems. More particularly, the augmented and/or mixed reality system and method is used to achieve accurate bone preparation, implant placement and orientation, and biomechanical restoration in orthopaedic arthroplasty procedures. Preparation and adjustment of arthroplasty surgical sites, implantation of prosthetic components, and tailoring and positioning of installed prosthetic components can be guided using augmented reality overlays, projections, or combined imaging of the surgeon's real-world view.
ANATOMICAL SCANNING, TARGETING, AND VISUALIZATION
A method for visualizing and targeting anatomical structures inside a patient utilizing a handheld screen device may include grasping the handheld screen device and manipulating a position of the handheld screen device relative to the patient. The handheld screen device may include a camera and a display. The method may also include orienting the camera on the handheld screen device relative to an anatomical feature of the patient by manipulating the position of the handheld screen device relative to the patient, capturing first image data of light reflecting from a surface of the anatomical feature with the camera on the handheld screen device, and comparing the first image data with a pre-operative 3-D image of the patient to determine a location of an anatomical structure located inside the patient and positioned relative to the anatomical feature of the patient.
REGISTRATION AND/OR TRACKING OF A PATIENT'S BONE EMPLOYING A PATIENT SPECIFIC BONE JIG
A method includes obtaining, via one or more processors, three-dimensional data representing a patient's bone, obtaining, via the one or more processors, three-dimensional data representing at least portions of a patient specific bone jig, the patient specific bone jig having an inner surface portion matched to an outer surface portion of the patient's bone, obtaining, via the one or more processors, image data representing the at least portions of the patient specific bone jig registered to the patient's bone, and generating, via the one or more processors, data representing a location and an orientation of the patient's bone based on the obtained image data, the obtained three-dimensional data representing the patient specific bone jig, and the obtained three-dimensional data representing the patient's bone. In another embodiment, a patient specific bone jig with predetermined spatial indicia registered to a portion of the patient's bone may be employed with point sampling.
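Generating the location and orientation of the bone from the jig's matched surfaces reduces, at its core, to estimating a rigid transform between corresponding point sets. A minimal sketch of one standard way to do this, the Kabsch/Procrustes method (the function name and NumPy formulation are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def estimate_pose(jig_points, observed_points):
    """Estimate the rigid transform (R, t) that maps jig-model points onto
    their observed counterparts, via the Kabsch/Procrustes method."""
    src = np.asarray(jig_points, dtype=float)
    dst = np.asarray(observed_points, dtype=float)
    # Center both point clouds on their centroids.
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - sc, dst - dc
    # SVD of the cross-covariance matrix gives the optimal rotation.
    U, _, Vt = np.linalg.svd(A.T @ B)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t
```

The returned pair (R, t) plays the role of the "data representing a location and an orientation of the patient's bone" in the claim: any point p on the bone model maps to R @ p + t in the observed frame.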
ROBOTIC NAVIGATION AND GUIDANCE SYSTEM FOR IMPLANTING A NEUROMODULATION DEVICE
The present invention provides a robotic navigation system for identifying a target nerve and for guiding and/or performing the implantation of a neuromodulation device at the target nerve, wherein the neuromodulation device includes a pulse generator and at least one lead in electrical or operative connection with the pulse generator. In some embodiments, the location of the robotically advanced lead and electrode may be imaged and displayed on a display and/or may be visually annunciated using one or more lights to indicate whether the placement location of the lead or electrode is within or outside of a predetermined distance of the target nerve.
METHODS AND SYSTEMS FOR DISPLAYING PREOPERATIVE AND INTRAOPERATIVE IMAGE DATA OF A SCENE
Mediated-reality imaging systems, methods, and devices are disclosed herein. In some embodiments, an imaging system includes a camera array configured to (i) capture intraoperative image data of a surgical scene in substantially real-time and (ii) track a tool through the scene. The imaging system is further configured to receive and/or store preoperative image data, such as medical scan data corresponding to a portion of a patient in the scene. The imaging system can register the preoperative image data to the intraoperative image data, and display the preoperative image data and a representation of the tool on a user interface, such as a head-mounted display.
Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
A method of registering a real-time image feed from an imaging device inserted into a steerable catheter using a navigation system is provided. The method includes inserting the imaging device into a working channel of the steerable catheter and generating a real-time image feed of one or more reference points, wherein the orientation of the reference points is known. The method further includes orienting a handle of the steerable catheter to a neutral position, displaying the real-time image feed on a display of the navigation system, and registering the real-time image feed to the steerable catheter by rotating the displayed image so that the reference points in the real-time image feed are matched to the known orientation of the reference points.
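The final step, rotating the displayed image until the reference points match their known orientation, amounts to computing a single rotation angle about the image center. A minimal sketch assuming one reference point suffices (the function name and the 2-D angle formulation are illustrative assumptions):

```python
import numpy as np

def registration_angle(observed_pt, expected_pt, center):
    """Angle (radians) to rotate the displayed image about `center` so the
    observed reference point lands at its known (expected) orientation."""
    v_obs = np.asarray(observed_pt, float) - np.asarray(center, float)
    v_exp = np.asarray(expected_pt, float) - np.asarray(center, float)
    a_obs = np.arctan2(v_obs[1], v_obs[0])
    a_exp = np.arctan2(v_exp[1], v_exp[0])
    # Wrap the difference into (-pi, pi] so we always take the shorter turn.
    return (a_exp - a_obs + np.pi) % (2 * np.pi) - np.pi
```

With multiple reference points, the same idea extends to averaging the per-point angles, or to a full 2-D Procrustes fit.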
SYSTEMS AND METHODS FOR PROJECTING AN ENDOSCOPIC IMAGE TO A THREE-DIMENSIONAL VOLUME
A method comprises obtaining an endoscopic image dataset of a patient anatomy from an endoscopic imaging system and retrieving an anatomic model dataset of the patient anatomy obtained by an anatomic imaging system. The method also comprises mapping the endoscopic image dataset to the anatomic model dataset and displaying a first vantage point image using the mapped endoscopic image dataset. The first vantage point image is presented from a first vantage point at a distal end of the endoscopic imaging system. The method also comprises displaying a second vantage point image using at least a portion of the mapped endoscopic image dataset. The second vantage point image is presented from a second vantage point, different from the first vantage point.
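Once the endoscopic image data is mapped onto the 3-D anatomic model, rendering the second vantage point is, in essence, re-projecting the model's textured points through a camera placed at the new pose. A minimal pinhole-projection sketch (the function name, pose convention, and focal-length parameter are illustrative assumptions, not the patent's method):

```python
import numpy as np

def project(points_world, R, t, f):
    """Project 3-D model points into a pinhole camera at pose (R, t),
    where R's columns are the camera axes in world coordinates and t is
    the camera position; f is the focal length. Returns 2-D image points."""
    pts = np.asarray(points_world, dtype=float)
    cam = (pts - t) @ R          # world frame -> camera frame
    # Perspective divide by depth (camera z) and scale by focal length.
    return f * cam[:, :2] / cam[:, 2:3]
```

Rendering the first vantage point uses the pose of the endoscope's distal tip; rendering the second simply supplies a different (R, t) against the same mapped dataset.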
System and method for real-time magnetic resonance imaging data visualization in three or four dimensions
A system for displaying and interacting with magnetic resonance imaging (MRI) data acquired using an MRI system includes an image reconstruction module configured to receive the MRI data and to reconstruct a plurality of images using the MRI data, an image rendering module coupled to the image reconstruction module and configured to generate at least one multidimensional image based on the plurality of images and a user interface device coupled to the image rendering module and located proximate to a workstation of the MRI system. The user interface device is configured to display the at least one multidimensional image in real-time and to facilitate interaction by a user with the multidimensional image in a virtual reality or augmented reality environment.
System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures
Aspects of the present disclosure relate to systems, devices, and methods for performing a surgical step or surgical procedure, for example with visual guidance using a head mounted display, with a surgical navigation system, or with a surgical robot. A computer processor can be configured to determine the pose of a first vertebra with an attached first marker and a second vertebra with an attached second marker. The computer processor can be configured to determine the pose of at least one vertebra interposed between, or adjacent to, the first and second vertebrae with attached markers, e.g. fiducial markers.
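One common way to estimate the pose of a vertebra interposed between two tracked vertebrae is to interpolate between the two marker poses: linear blending for position and spherical linear interpolation (slerp) for orientation. A minimal quaternion-based sketch (function names, the fraction parameter u, and the quaternion convention w-x-y-z are illustrative assumptions, not necessarily the disclosed method):

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # flip sign to take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: linear blend is stable
        q = q0 + u * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

def interpolate_pose(p0, q0, p1, q1, u):
    """Pose of an interposed vertebra at fraction u (0..1) along the path
    between two tracked vertebrae: blended position plus slerped rotation."""
    p = (1 - u) * np.asarray(p0, float) + u * np.asarray(p1, float)
    return p, slerp(q0, q1, u)
```

In practice the fraction u would come from the vertebra's known anatomical position between the two instrumented levels, and the interpolated pose would be refined against imaging when available.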
GUIDED ANATOMICAL MANIPULATION FOR ENDOSCOPIC PROCEDURES
Various embodiments of the present disclosure encompass a manipulative endoscopic guidance device employing an endoscopic viewing controller (20) for controlling a display of an endoscopic view (11) of an anatomical structure, and a manipulative guidance controller (30) for controlling a display of one or more guided manipulation anchors (50-52) within the display of the endoscopic view (11) of the anatomical structure. A guided manipulation anchor (50-52) is representative of a location marking and/or a motion directive of a guided manipulation of the anatomical structure (e.g., grasping, pulling, pushing, sliding, reorienting, tilting, removing, or repositioning of the anatomical structure). The manipulative guidance controller (30) may generate an anchor by analyzing a correlation of the endoscopic view (11) of the anatomical structure with a knowledge base of image(s), model(s), and/or details corresponding to the anatomical structure, and by deriving the anchor based on a degree of correlation of the endoscopic view (11) of the anatomical structure with the knowledge base.
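A simple way to score the "degree of correlation" between an endoscopic-view patch and a knowledge-base template is normalized cross-correlation, which is invariant to brightness and contrast offsets. A minimal sketch, assuming same-shaped patches (the function name and NCC choice are illustrative assumptions; the patent does not specify its correlation measure):

```python
import numpy as np

def ncc(view_patch, template):
    """Normalized cross-correlation between an endoscopic-view patch and a
    knowledge-base template of the same shape; 1.0 indicates a perfect
    match, -1.0 a perfect inversion, 0.0 no linear correlation."""
    a = np.asarray(view_patch, float).ravel()
    b = np.asarray(template, float).ravel()
    a, b = a - a.mean(), b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0
```

An anchor could then be derived only when the score for the best-matching template exceeds a chosen threshold, with the anchor's location and motion directive taken from metadata stored alongside that template.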