A61B2090/367

ANATOMICAL FEATURE EXTRACTION AND PRESENTATION USING AUGMENTED REALITY
20230019543 · 2023-01-19 ·

An ultrasound probe captures real-time images of patient anatomy, which are analyzed by a processor to extract salient features pertaining to an anatomical structure. By tracking the location and orientation of the ultrasound probe, a model of that anatomical structure can be created. A visual indication of the position of segments of the anatomical structure can be presented holographically to a user of an augmented reality headset to convey information extracted from the ultrasound imaging, without presenting the entirety of the ultrasound image to the user. For example, a model of the anatomical structure can be displayed holographically at the location in the headset's visual field that corresponds to the physical location of the actual anatomy being viewed.
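The pose-based placement described above can be illustrated with a minimal sketch: a feature point extracted from the ultrasound image, expressed in the probe's coordinate frame, is mapped into the tracked world frame shared with the headset. The frames, transform, and values here are illustrative assumptions, not the patented method.

```python
import numpy as np

# Illustrative sketch only: map a feature point from the probe frame into
# the world frame shared with the AR headset, using the tracked probe pose.
# T_world_probe is an assumed-known 4x4 homogeneous transform.

def feature_to_world(p_probe, T_world_probe):
    """Map a 3D feature point from the probe frame into the world frame."""
    p_h = np.append(p_probe, 1.0)          # homogeneous coordinates
    return (T_world_probe @ p_h)[:3]

# Example pose: probe translated 10 cm along world x, no rotation.
T_world_probe = np.eye(4)
T_world_probe[0, 3] = 0.10
p_world = feature_to_world(np.array([0.0, 0.0, 0.05]), T_world_probe)
```

A real system would obtain `T_world_probe` from an external tracker at each frame and compose it with the headset's own pose.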

PROGRAM, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING APPARATUS, AND MODEL GENERATION METHOD

A non-transitory computer-readable medium (CRM) storing computer program code that, when executed by a computer processor, performs a process; an information processing apparatus; and a model generation method that output complication information for a medical treatment. The process includes acquiring a medical image obtained by imaging a lumen organ of a patient before treatment, inputting the acquired medical image into a model trained to output, when a medical image is received, complication information on a complication that is likely to occur after the treatment, and outputting the complication information. Preferably, the complication information includes a type of complication that is likely to occur and a probability value indicating the occurrence probability of that type.
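The inference step this abstract describes (image in, complication type plus occurrence probability out) can be sketched as follows. The linear "model", complication labels, and array shapes are stand-in assumptions; the patent presumes a trained model, not this toy.

```python
import numpy as np

# Stand-in for the trained model: a linear layer plus softmax over assumed
# complication classes. Labels, weights, and image size are illustrative.
COMPLICATIONS = ["type A", "type B", "no complication"]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_complications(image, weights, bias):
    """Return (complication type, probability) pairs, most likely first."""
    logits = weights @ image.ravel() + bias
    probs = softmax(logits)
    order = np.argsort(probs)[::-1]
    return [(COMPLICATIONS[i], float(probs[i])) for i in order]

rng = np.random.default_rng(0)
weights = rng.normal(size=(3, 16))   # 3 classes, flattened 4x4 "image"
ranked = predict_complications(rng.normal(size=(4, 4)), weights, np.zeros(3))
```

The ranked output matches the abstract's preferred form: each complication type paired with a probability value.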

AUTOMATIC PLANNING METHOD AND DEVICE FOR TISSUE ABLATION

Disclosed are an automatic planning method and device for tissue ablation. The method includes: obtaining a three-dimensional (3D) model of a to-be-ablated tissue through a 3D reconstruction technique; marking a cylindrical ablation point on the 3D model through ablation planning, where the axial direction of the ablation point is the same as the radio-frequency direction of thermal ablation; and displaying an ablated area on the 3D model. The 3D reconstruction technique includes: obtaining slice images of the to-be-ablated tissue in a plurality of directions, where the slice images in each direction include a plurality of two-dimensional (2D) images; depicting, by a primitive, the to-be-ablated tissue on the 2D images in one direction; and constructing the 3D model of the to-be-ablated tissue based on the original 2D images.
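The reconstruction and marking steps can be sketched as stacking per-slice 2D masks into a voxel volume and flagging a cylindrical region whose axis follows the RF direction (taken here, as an assumption, to be the z axis). Shapes and coordinates are illustrative.

```python
import numpy as np

# Minimal sketch: stack per-slice 2D masks of the to-be-ablated tissue into
# a 3D voxel volume, then mark a z-aligned cylindrical ablation zone.

def stack_slices(masks):
    """masks: list of equal-shaped 2D boolean arrays -> 3D boolean volume."""
    return np.stack(masks, axis=0)

def mark_cylinder(volume, center_xy, radius, z_range):
    """Flag voxels inside a z-axis-aligned cylinder as the planned zone."""
    zdim, ydim, xdim = volume.shape
    yy, xx = np.mgrid[0:ydim, 0:xdim]
    disk = (xx - center_xy[0]) ** 2 + (yy - center_xy[1]) ** 2 <= radius ** 2
    zone = np.zeros_like(volume)
    for z in range(*z_range):
        zone[z] = disk          # same cross-section on every covered slice
    return zone

vol = stack_slices([np.ones((8, 8), bool)] * 4)
zone = mark_cylinder(vol, center_xy=(4, 4), radius=2, z_range=(1, 3))
```

A planner would then intersect `zone` with the tissue volume to display the ablated area on the 3D model.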

METHODS AND SYSTEMS FOR DISPLAYING PREOPERATIVE AND INTRAOPERATIVE IMAGE DATA OF A SCENE

Mediated-reality imaging systems, methods, and devices are disclosed herein. In some embodiments, an imaging system includes a camera array configured to (i) capture intraoperative image data of a surgical scene in substantially real-time and (ii) track a tool through the scene. The imaging system is further configured to receive and/or store preoperative image data, such as medical scan data corresponding to a portion of a patient in the scene. The imaging device can register the preoperative image data to the intraoperative image data, and display the preoperative image data and a representation of the tool on a user interface, such as a head-mounted display.
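The registration step above, under the simplifying assumption of known point correspondences between the preoperative scan and the intraoperative scene, reduces to a rigid Procrustes/Kabsch fit; real systems add feature matching and robust or deformable refinement.

```python
import numpy as np

# Hedged sketch of rigid registration with known correspondences (Kabsch).
# pre, intra: (N, 3) arrays of corresponding points, N >= 3, non-collinear.

def rigid_register(pre, intra):
    """Return R (3x3) and t (3,) such that intra ~= pre @ R.T + t."""
    mu_p, mu_i = pre.mean(axis=0), intra.mean(axis=0)
    H = (pre - mu_p).T @ (intra - mu_i)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_i - R @ mu_p
    return R, t
```

Once `R` and `t` are known, the preoperative scan data can be rendered in the intraoperative camera frame alongside the tracked tool.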

Stereotactic Computer Assisted Surgery Method and System
20230218323 · 2023-07-13 ·

A computer assisted surgical system that includes an apparatus for imaging a region of interest of a portion of an anatomy of a subject; a memory containing executable instructions; and a processor programmed using the instructions to receive two or more two-dimensional images of the region of interest taken at different angles from the apparatus and process the two or more two-dimensional images to produce three dimensional information associated with the region of interest.
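Producing three-dimensional information from two 2D images taken at different angles is classically done by linear triangulation (DLT); a minimal sketch, assuming known 3x4 projection matrices for the two views:

```python
import numpy as np

# Illustrative DLT triangulation: recover a 3D point from its pixel
# coordinates in two views. P1, P2 are assumed-known projection matrices.

def triangulate(P1, P2, x1, x2):
    """x1, x2: (u, v) coordinates of the same point in each view."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector = homogeneous 3D point
    X = Vt[-1]
    return X[:3] / X[3]
```

With calibrated imaging geometry, triangulating many such points yields the 3D information associated with the region of interest.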

Surgical display
11696813 · 2023-07-11 ·

Disclosed herein are visualization systems, methods, devices, and database configurations related to the real-time depiction, in 2D and 3D on monitor panels as well as via 3D holographic visualization, of the internal workings of patient surgery, such as patient intervention site posture and the positioning, in some cases real-time positioning, of an object foreign to the patient.

REAL TIME IMAGE GUIDED PORTABLE ROBOTIC INTERVENTION SYSTEM

An image-guided robotic intervention system (“IGRIS”) may be used to perform medical procedures on patients. IGRIS provides a real-time view of patient anatomy as well as an intended target or targets for the procedure, software that allows a user to plan an approach or trajectory path using either the image or the robotic device, software that allows a user to convert a series of 2D images into a 3D volume, and software that localizes the 3D volume with respect to real-time images during the procedure. IGRIS may include sensors to estimate the pose of the imaging device relative to the patient to improve the performance of that software with respect to runtime, robustness, and accuracy.

System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction

A system and method for navigating to a target using fluoroscopic-based three dimensional volumetric data generated from two dimensional fluoroscopic images, including a catheter guide assembly including a sensor, an electromagnetic field generator, a fluoroscopic imaging device to acquire a fluoroscopic video of a target area about a plurality of angles relative to the target area, and a computing device. The computing device is configured to receive previously acquired CT data, determine the location of the sensor based on the electromagnetic field generated by the electromagnetic field generator, generate a three dimensional rendering of the target area based on the acquired fluoroscopic video, receive a selection of the catheter guide assembly in the generated three dimensional rendering, and register the generated three dimensional rendering of the target area with the previously acquired CT data to correct the position of the catheter guide assembly.

Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display

A system includes a robotic arm, an autostereoscopic display, a user image capture device, an image processor, and a controller. The robotic arm is coupled to a patient image capture device. The autostereoscopic display is configured to display an image of a surgical site obtained from the patient image capture device. The image processor is configured to identify a location of at least part of a user in an image obtained from the user image capture device. The controller is configured to, in a first mode, adjust a three dimensional aspect of the image displayed on the autostereoscopic display based on the identified location, and, in a second mode, move the robotic arm or instrument based on a relationship between the identified location and the surgical site image.
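The first mode, adjusting the stereoscopic rendering to the identified viewer location, can be caricatured as shifting the two eye views by a head-position-dependent parallax. The gain and interpupillary distance below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch only: per-eye horizontal image offsets (meters) that
# follow the tracked viewer. `gain` and `ipd` (interpupillary distance,
# here a typical 64 mm) are assumed parameters.

def stereo_offsets(user_x, screen_center_x, gain=0.05, ipd=0.064):
    """Return (left_eye_offset, right_eye_offset) for a tracked user."""
    parallax = gain * (user_x - screen_center_x)
    return parallax - ipd / 2, parallax + ipd / 2
```

A centered viewer gets symmetric offsets; as the tracked head moves laterally, both views shift so the perceived 3D scene stays aligned with the viewer.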

CLINICAL DIAGNOSTIC AND PATIENT INFORMATION SYSTEMS AND METHODS
20230210473 · 2023-07-06 ·

A patient information system is described. A virtual representation of a patient is generated using a captured image of the patient. A graphical user interface is displayed on the display, the graphical user interface including a first screen portion configured to display the virtual representation of the patient and a second screen portion configured to display information. A selection of an area of interest is received on the virtual representation of the patient. Clinical and diagnostic information associated with the selected area of interest is received and displayed in the second screen portion of the graphical user interface. The graphical user interface also includes a timeline that displays events of interest. A selection of an event of interest causes the second screen portion of the graphical user interface to display clinical and diagnostic information associated with the selected event of interest.