Patent classifications
A61B90/36
System and methods for planning and performing three-dimensional holographic interventional procedures with three-dimensional tomographic and live imaging
A method and a system for image-guided intervention, such as percutaneous treatment or diagnosis of a patient, may include at least one of a pre-registration method and a re-registration method. The pre-registration method is configured to permit an efficient virtual representation of a planned trajectory to target tissue during the intervention, for example, as a holographic light ray shown through an augmented reality system. This, in turn, allows the operator to align a physical instrument, such as a medical probe, for the intervention. The re-registration method is configured to adjust for inaccuracy in the virtual representation generated by the pre-registration method, as determined by live imaging of the patient during the intervention. The re-registration method may employ intersectional contour lines to define the target tissue as viewed through the augmented reality system, which permits an unobstructed view of the target tissue for the intervention.
ANATOMICAL FEATURE EXTRACTION AND PRESENTATION USING AUGMENTED REALITY
An ultrasound probe captures real-time images of patient anatomy, which are analyzed by a processor to extract salient features pertaining to an anatomical structure. By tracking the location and orientation of the ultrasound probe, a model of that anatomical structure can be created. A visual indication of the position of segments of the anatomical structure can be presented holographically to a user of an augmented reality headset to convey information extracted from the ultrasound imaging, such as a holographic display of a model of the anatomical structure at the approximate location in the headset's visual field corresponding to the physical location of the actual anatomy being viewed, without presenting the entirety of the ultrasound image to the user.
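As a minimal sketch of the tracking step described above, features extracted from each 2-D ultrasound frame can be placed into a common world frame using the tracked probe pose, so that successive frames accumulate into one model. The function name and the assumption that image points lie in the probe's z = 0 imaging plane are illustrative, not taken from the patent:

```python
import numpy as np

def image_to_world(points_img: np.ndarray, R_probe: np.ndarray,
                   t_probe: np.ndarray) -> np.ndarray:
    """Map 2-D ultrasound image points (assumed to lie in the probe's
    imaging plane, in millimetres) into the tracker's world frame using
    the tracked probe pose (rotation R_probe, translation t_probe)."""
    # Lift image-plane points to 3-D by appending z = 0.
    pts3 = np.column_stack([points_img, np.zeros(len(points_img))])
    # Apply the rigid probe pose to every point.
    return pts3 @ R_probe.T + t_probe
```

Features mapped this way from many frames can then be merged (for example, by clustering or surface fitting) into the anatomical model that the headset renders.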
DISCHARGE RISK AND MANAGEMENT
A method comprising receiving an input indicating intake information associated with a patient. Based on the input, the method further includes determining an initial discharge date and receiving mobility information associated with the patient. Based in part on the mobility information, the method further includes determining an estimated discharge date and a confidence metric associated with the estimated discharge date, determining that the estimated discharge date is later than the initial discharge date by more than a threshold period of time, and determining that the confidence metric is greater than a threshold metric. Based in part on the estimated discharge date being later than the initial discharge date by more than the threshold period of time and the confidence metric being greater than the threshold metric, the method further includes generating an alert.
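The alert logic described above reduces to two threshold checks on the estimated discharge date and its confidence metric. A minimal sketch in Python, where the specific threshold values and function name are illustrative placeholders rather than values from the patent:

```python
from datetime import date, timedelta

def should_alert(initial: date, estimated: date, confidence: float,
                 threshold_days: int = 2, threshold_conf: float = 0.8) -> bool:
    """Generate an alert when the estimated discharge date is later than
    the initial date by more than the threshold period AND the confidence
    metric exceeds the threshold metric."""
    delay = estimated - initial
    return delay > timedelta(days=threshold_days) and confidence > threshold_conf
```

Requiring both conditions avoids alerting on slips that are small or that rest on a low-confidence estimate.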
AUGMENTED/MIXED REALITY SYSTEM AND METHOD FOR ORTHOPAEDIC ARTHROPLASTY
Augmented and/or mixed reality systems for performing various types of arthroplasty are provided, along with methods of performing various types of arthroplasty using such augmented reality systems. More particularly, the augmented and/or mixed reality system and method is used to achieve accurate bone preparation, implant placement and orientation, and biomechanical restoration in orthopaedic arthroplasty procedures. Preparation, implantation, and adjustment of arthroplasty surgical sites, prosthetic components, and tailoring and positioning of installed prosthetic components can be guided using augmented reality overlays, projections, or combined imaging of a surgeon's real-world view.
SYSTEMS AND METHODS FOR PROJECTING AN ENDOSCOPIC IMAGE TO A THREE-DIMENSIONAL VOLUME
A method comprises obtaining an endoscopic image dataset of a patient anatomy from an endoscopic imaging system and retrieving an anatomic model dataset of the patient anatomy obtained by an anatomic imaging system. The method also comprises mapping the endoscopic image dataset to the anatomic model dataset and displaying a first vantage point image using the mapped endoscopic image dataset. The first vantage point image is presented from a first vantage point at a distal end of the endoscopic imaging system. The method also comprises displaying a second vantage point image using at least a portion of the mapped endoscopic image dataset. The second vantage point image is presented from a second vantage point, different from the first vantage point.
Projection Scanning System
Imaging systems that project augmented information onto a physical object and that include, at a minimum, a processor, a memory device operably connected to the processor, a projector operably coupled to the processor, and a distance-measuring device operably connected to the processor. The memory device stores augmented image information, and the processor is configured to project the augmented image information onto the physical object. The distance-measuring device is configured to measure the distance to the physical object. The processor uses distance measurement information from the distance-measuring device to adjust the scaling of the augmented image information and provides the scale-adjusted augmented image information to the projector. The system can also be used for fluorescence imaging during open surgery, for endoscopic fluorescence imaging, and for registration of surgical instruments.
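Assuming the common model in which a projected pattern grows linearly with throw distance, the distance-based scaling adjustment might be sketched as follows; the reference distance and function name are hypothetical, as the patent does not give a formula:

```python
def scale_factor(measured_mm: float, reference_mm: float = 500.0) -> float:
    """Scale to apply to the source image so the projected overlay keeps a
    constant physical size: since projected size grows linearly with
    distance, the image is shrunk in proportion to the measured distance."""
    if measured_mm <= 0:
        raise ValueError("measured distance must be positive")
    return reference_mm / measured_mm
```

At the calibration distance the factor is 1.0; at twice that distance the image is halved so the overlay stays registered to the object.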
LIGHTING ARRANGEMENT FOR A MEDICAL IMAGING SYSTEM
A lighting arrangement for a medical imaging system having a cylindrical wall that forms a tunnel that receives a patient to be scanned. The lighting arrangement includes a transparent wall section formed in the wall, wherein the transparent wall section extends along a transparent portion of the wall circumference. The imaging system also includes a lighting device located adjacent to an outer surface of the transparent wall section. The lighting device extends along a device portion of the wall circumference corresponding to the transparent portion, wherein light emitted by the lighting device is transmitted through the transparent wall section in a direction orthogonal to a longitudinal axis of the tunnel to circumferentially illuminate the tunnel. In addition, a system status is indicated by a color of the light emitted by LEDs of the lighting device. Further, the light emitted by the lighting device varies in intensity to indicate a changing count rate.
SYSTEMS AND METHODS FOR USING REGISTERED FLUOROSCOPIC IMAGES IN IMAGE-GUIDED SURGERY
A method performed by a computing system comprises receiving a fluoroscopic image of a patient anatomy while a portion of a medical instrument is positioned within the patient anatomy. The fluoroscopic image has a fluoroscopic frame of reference. The portion has a sensed position in an anatomic model frame of reference. The method further comprises identifying the portion in the fluoroscopic image and identifying an extracted position of the portion in the fluoroscopic frame of reference using the identified portion in the fluoroscopic image. The method further comprises registering the fluoroscopic frame of reference to the anatomic model frame of reference based on the sensed position of the portion and the extracted position of the portion.
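One standard way to register the fluoroscopic frame to the anatomic model frame from paired points (the extracted positions and the corresponding sensed positions) is a least-squares rigid fit such as the Kabsch algorithm. The patent does not name a solver, so this is only an illustrative sketch of that registration step:

```python
import numpy as np

def register_frames(extracted: np.ndarray, sensed: np.ndarray):
    """Least-squares rigid transform (Kabsch) mapping points in the
    fluoroscopic frame onto the anatomic model frame. Returns (R, t)
    such that sensed ≈ extracted @ R.T + t."""
    ce, cs = extracted.mean(axis=0), sensed.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (extracted - ce).T @ (sensed - cs)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ ce
    return R, t
```

With three or more non-collinear correspondences the fit is unique, and the residual gives a check on registration accuracy.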
METHOD AND DEVICE FOR DOCUMENTING THE USE OF AT LEAST ONE IMPLANT WHICH IS USED IN A SURGERY AND/OR THE LOCALIZATION THEREOF
A method and device for documenting use of at least one implant used in a surgery and/or for the localization thereof. The implant can be provided for a surgery and used in the surgery. The method includes: a) providing a surgical set having a plurality of implants; b) capturing a first sequence of images of the plurality of implants of the surgical set using a device; c) analyzing the sequence of images of the plurality of implants in order to identify each individual implant; d) optionally outputting a signal when one and/or each implant has been identified; e) capturing a second sequence of images of the plurality of implants of the surgical set using the device after a surgery in order to ascertain missing implants; f) classifying a missing implant as used in surgery.
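Once steps (c) and (e) have identified the individual implants in each image sequence, step (f) amounts to a set difference between the implants present before and after surgery. A minimal sketch, with hypothetical identifier strings:

```python
def missing_implants(before: set[str], after: set[str]) -> set[str]:
    """Implants identified in the first (pre-surgery) image sequence but
    absent from the second (post-surgery) sequence are classified as
    used in surgery."""
    return before - after
```

For example, `missing_implants({"screw-a", "screw-b", "plate-1"}, {"plate-1"})` classifies the two screws as used.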
SURGICAL SKILL TRAINING SYSTEM AND MACHINE LEARNING-BASED SURGICAL GUIDE SYSTEM USING THREE-DIMENSIONAL IMAGING
A surgical skill training system includes: a data collecting unit configured to collect actual surgical skill data on a patient of an operating surgeon; an image providing server configured to generate a 3-dimensional (3D) surgical image for surgical skill training, based on the actual surgical skill data; and a user device configured to display the 3D surgical image, wherein the image providing server includes: a patient image generating unit configured to generate a patient image, based on patient information of the patient; a surgical stage classifying unit configured to classify the actual surgical skill data into actual surgical skill data for each surgical stage performed by the operating surgeon; and a 3D image generating unit configured to generate the 3D surgical image by using the patient image, and feature information detected from the actual surgical skill data.