Surgical system with combination of sensor-based navigation and endoscopy

A set of pre-operative images may be captured of an anatomical structure using an endoscopic camera. Each captured image is associated with a position and orientation of the camera at the moment of capture using image guided surgery (IGS) techniques. This image data and position data may be used to create a navigation map of captured images. During a surgical procedure on the anatomical structure, a real-time endoscopic view may be captured and displayed to a surgeon. The IGS navigation system may determine the position and orientation of the real-time image and select an appropriate pre-operative image from the navigation map to display to the surgeon in addition to the real-time image.
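The image-selection step can be sketched as a nearest-pose lookup over the navigation map. The pose representation (position plus unit view-direction vector) and the orientation weight below are illustrative assumptions, not details from the abstract:

```python
import math

def pose_distance(a, b, w_orient=10.0):
    """Distance between two camera poses: positional gap plus a weighted
    orientation gap (orientations as unit view-direction vectors).
    The weight w_orient is an illustrative choice."""
    dp = math.dist(a["pos"], b["pos"])
    do = 1.0 - sum(x * y for x, y in zip(a["dir"], b["dir"]))
    return dp + w_orient * do

def select_preop_image(nav_map, live_pose):
    """Pick the pre-operative capture whose recorded pose is closest to the
    tracked pose of the real-time endoscopic view."""
    return min(nav_map, key=lambda entry: pose_distance(entry["pose"], live_pose))

nav_map = [
    {"image": "preop_01", "pose": {"pos": (0.0, 0.0, 0.0), "dir": (0.0, 0.0, 1.0)}},
    {"image": "preop_02", "pose": {"pos": (5.0, 0.0, 0.0), "dir": (0.0, 0.0, 1.0)}},
]
live = {"pos": (4.2, 0.3, 0.0), "dir": (0.0, 0.0, 1.0)}
print(select_preop_image(nav_map, live)["image"])  # preop_02
```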

Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications

A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
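The per-eye generation step can be sketched by offsetting the headset pose along its right vector and selecting a 2D image per eye viewpoint. The 0.063 m interpupillary distance and the nearest-viewpoint selector are stand-in assumptions for a real pose-dependent DICOM reslice:

```python
import math

def eye_poses(head_pos, right_dir, ipd=0.063):
    """Split the headset pose into per-eye viewpoints by offsetting half the
    interpupillary distance along the head's right vector (0.063 m is a
    typical assumed IPD, not a value from the abstract)."""
    h = ipd / 2.0
    left = tuple(p - h * r for p, r in zip(head_pos, right_dir))
    right = tuple(p + h * r for p, r in zip(head_pos, right_dir))
    return left, right

def image_for_eye(image_stack, eye_pos):
    """Hypothetical selector: return the 2D image whose stored acquisition
    viewpoint is closest to the eye position."""
    return min(image_stack, key=lambda s: math.dist(s["viewpoint"], eye_pos))["pixels"]

stack = [
    {"viewpoint": (-0.05, 0.0, 0.0), "pixels": "slice_left"},
    {"viewpoint": (0.05, 0.0, 0.0), "pixels": "slice_right"},
]
left_eye, right_eye = eye_poses((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
print(image_for_eye(stack, left_eye), image_for_eye(stack, right_eye))
```

Displaying the two selected images to the corresponding eyes yields the stereoscopic XR image the abstract describes.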

Technique of Providing User Guidance For Obtaining A Registration Between Patient Image Data And A Surgical Tracking System

A method of providing user guidance. First patient image data of a patient's body is obtained. A registration instruction indicative of where to acquire a registration point relative to a surface of the body is determined. Second patient image data of the body, having been acquired by an augmented reality device, is obtained. A transformation between coordinate systems of the first and the second patient image data is determined. Based on the transformation, display of the registration instruction on a display of the AR device is triggered such that a user of the AR device is presented an augmented view with the registration instruction being overlaid onto the patient's body. The augmented view guides the user where to acquire the registration point. Also disclosed are a computing system, a surgical navigation system, and a computer program product.
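Once the transformation between the two coordinate systems is known, overlaying the registration instruction reduces to mapping its anchor point into the AR device's frame. A minimal sketch with a 4x4 homogeneous transform (the translation values are illustrative):

```python
def apply_transform(T, p):
    """Map a registration-instruction point p = (x, y, z) from the first
    image data's coordinate system into the AR device's coordinate system
    using a 4x4 homogeneous transform T (row-major nested lists)."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Illustrative transform: pure translation by (10, -5, 2) mm.
T = [
    [1.0, 0.0, 0.0, 10.0],
    [0.0, 1.0, 0.0, -5.0],
    [0.0, 0.0, 1.0, 2.0],
    [0.0, 0.0, 0.0, 1.0],
]
print(apply_transform(T, (0.0, 0.0, 0.0)))  # (10.0, -5.0, 2.0)
```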

SURGICAL SYSTEM AND INFORMATION PROCESSING METHOD
20220370155 · 2022-11-24

A surgical system includes: an endoscope capable of acquiring an endoscopic image of a surface of target tissue; an ultrasonic probe capable of acquiring an ultrasonic tomographic image of the target tissue; a treatment instrument; a display; and a controller including a memory and a processor. In response to the ultrasonic probe being inserted into the body cavity and the ultrasonic tomographic image being acquired, the processor is configured to: detect a position of the ultrasonic probe with respect to the endoscope; store the ultrasonic tomographic image associated with the position of the ultrasonic probe; and in a state in which the treatment instrument remains inserted in the body cavity, detect a position of the treatment instrument, read out the stored ultrasonic tomographic image on a basis of the detected position of the treatment instrument, and command the display to display the read-out ultrasonic tomographic image.
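The store/read-out cycle can be sketched as a position-keyed archive: each tomogram is recorded with the probe position at acquisition, and the nearest one is recalled for the treatment instrument's tracked position. The class and nearest-neighbour recall are assumptions for illustration:

```python
import math

class TomogramStore:
    """Associate each ultrasonic tomographic image with the probe position
    at which it was acquired, then recall the nearest stored image for a
    treatment instrument's tracked position."""

    def __init__(self):
        self._entries = []

    def record(self, probe_pos, image):
        self._entries.append((probe_pos, image))

    def recall(self, tool_pos):
        return min(self._entries, key=lambda e: math.dist(e[0], tool_pos))[1]

store = TomogramStore()
store.record((0.0, 0.0, 0.0), "tomo_A")
store.record((20.0, 0.0, 0.0), "tomo_B")
print(store.recall((18.0, 1.0, 0.0)))  # tomo_B
```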

Surgical Simulation System With Coordinated Imaging

An interactive and dynamic surgical simulation system may be used in the context of a computer-implemented interactive surgical system. The surgical simulation system may provide coordinated surgical imaging. A processor may be configured to execute a simulation of a surgical procedure. The surgical procedure may be simulated in a simulated surgical environment. The processor may generate a first visual representation and a second visual representation. The first visual representation may be of a first portion of the simulated surgical environment. The second visual representation may also be of the first portion of the simulated surgical environment. The processor may coordinate generation of the first visual representation and the second visual representation such that they correspond to a common event in the surgical procedure. The processor may present the first visual representation and the second visual representation for user interaction within the simulated surgical environment.
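The coordination guarantee can be sketched by driving every renderer from the same simulation event, so both representations necessarily depict a common moment. The renderer callables and event fields are illustrative assumptions:

```python
def coordinated_views(event, renderers):
    """Render every visual representation from the same simulation event,
    guaranteeing they correspond to a common moment in the procedure."""
    return [render(event) for render in renderers]

def scope_view(e):
    return f"scope view @ t={e['t']}: {e['what']}"

def fluoro_view(e):
    return f"fluoro view @ t={e['t']}: {e['what']}"

views = coordinated_views({"t": 12.5, "what": "vessel clipped"},
                          [scope_view, fluoro_view])
print(views[0])  # scope view @ t=12.5: vessel clipped
```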

Transperineal imaging-guided prostate needle placement

Prostate biopsy systems are provided that include a 3D ultrasound probe support configured to receive an ultrasound probe for transperineal imaging. One or more template grids can have a plurality of apertures extending therethrough to receive and guide a biopsy needle along a trajectory associated with respective apertures when the template grid is fixed to the support and the biopsy system is positioned in the perineal area of a patient. Patient-specific template grids can also be developed and produced. This system enables fully transperineal prostate biopsy (i.e., both imaging and needle placement are perineal) and eliminates the need for an external tracking device for image fusion as well as for needle tracking. In addition, it reduces the infection risk associated with the transrectal approach.
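The aperture-to-trajectory association can be sketched as a simple grid mapping: each aperture's row/column fixes a needle entry point, and the guided trajectory runs along the grid normal. The 5 mm pitch, 60 mm depth, and +z travel direction are illustrative assumptions:

```python
def aperture_trajectory(row, col, pitch=5.0, depth=60.0):
    """Entry point and straight-line target for one template-grid aperture,
    with needle travel along +z (pitch and depth in mm, illustrative
    values, not from the abstract)."""
    entry = (col * pitch, row * pitch, 0.0)
    target = (entry[0], entry[1], depth)
    return entry, target

entry, target = aperture_trajectory(row=2, col=3)
print(entry, target)  # (15.0, 10.0, 0.0) (15.0, 10.0, 60.0)
```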

Three-dimensional imaging and modeling of ultrasound image data

The position and orientation of an ultrasound probe are tracked in three dimensions to provide highly accurate three-dimensional bone surface images that can be used for anatomical assessment and/or procedure guidance. The position and orientation of a therapy applicator can be tracked in three dimensions to provide feedback to align the projected path of the therapy applicator with a desired path for the therapy applicator, or to provide feedback to align the potential therapy field of a therapy applicator with a target anatomical site. The three-dimensional bone surface images can be fit to a three-dimensional model of the anatomical site to provide or display additional information to the user to improve the accuracy of the anatomical assessment and/or procedure guidance.
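The surface-to-model fitting step can be sketched in its simplest form: align centroids and report the residual. This is a translation-only sketch; a full fit would also solve for rotation (e.g. a Kabsch/ICP solve), which is not shown here:

```python
import math

def translation_fit(surface_pts, model_pts):
    """Align tracked bone-surface points to corresponding model points by
    matching centroids, and report the RMS residual of the fit.
    Translation-only sketch; rotation is deliberately omitted."""
    n = len(surface_pts)
    cs = [sum(p[i] for p in surface_pts) / n for i in range(3)]
    cm = [sum(p[i] for p in model_pts) / n for i in range(3)]
    t = tuple(cm[i] - cs[i] for i in range(3))
    rms = math.sqrt(sum(
        math.dist(tuple(p[i] + t[i] for i in range(3)), q) ** 2
        for p, q in zip(surface_pts, model_pts)) / n)
    return t, rms

surface = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
model = [(2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
print(translation_fit(surface, model))  # ((2.0, 0.0, 0.0), 0.0)
```

A low residual indicates the tracked surface agrees with the model; a high residual flags a poor fit that should not be used for guidance.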

Probe with radiopaque tag

A medical procedure system, including a medical instrument to be inserted into a body part, and including position-tracking transducers to provide position signals, a distal end, and at least one radiopaque marker, a position tracking sub-system to compute a position including at least one location and orientation of the distal end in a position-tracking sub-system coordinate frame responsively to the position signals, a fluoroscope to capture fluoroscopic images of an interior of the body part and the radiopaque marker(s), and a registration sub-system to render, to a display, the captured fluoroscopic images including at least one marker-image of the radiopaque marker(s), and at least one graphical representation indicative of the computed position of the distal end, receive user-alignment input aligning the graphical representation(s) with the marker-image(s), and register the position-tracking sub-system coordinate frame with a coordinate frame of the fluoroscope responsively to the received user-alignment input.
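The registration from user-alignment input can be sketched as recovering the offset the user applied when dragging the graphical representations onto the marker images, then reusing that offset to map later tracked positions into fluoroscope image coordinates. This is a translation-only 2D sketch; a clinical system would solve a full rigid or projective registration:

```python
def register_offset(graphic_pts, marker_pts):
    """Average 2D offset between the graphical representations and the
    user-aligned marker-image positions (translation-only sketch)."""
    n = len(graphic_pts)
    return tuple(sum(m[i] - g[i] for g, m in zip(graphic_pts, marker_pts)) / n
                 for i in range(2))

def to_fluoro(tracked_xy, offset):
    """Map a subsequently tracked position into fluoroscope image coordinates."""
    return (tracked_xy[0] + offset[0], tracked_xy[1] + offset[1])

off = register_offset([(100.0, 100.0), (120.0, 100.0)],
                      [(104.0, 97.0), (124.0, 97.0)])
print(off, to_fluoro((110.0, 110.0), off))  # (4.0, -3.0) (114.0, 107.0)
```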

Robotic systems and methods for navigation of luminal network that detect physiological noise

Provided are robotic systems and methods for navigation of a luminal network that detect physiological noise. In one aspect, the system includes a set of one or more processors configured to receive first and second image data from an image sensor located on an instrument, detect a set of one or more points of interest in the first image data, and identify a set of first locations and a set of second locations respectively corresponding to the set of points in the first and second image data. The set of processors is further configured to, based on the set of first locations and the set of second locations, detect a change of location of the instrument within the luminal network caused by movement of the luminal network relative to the instrument.
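The detection step can be sketched as follows: if the tracked points of interest shift coherently between the two images, the anatomy (rather than the scope) likely moved, e.g. from breathing or heartbeat. The pixel threshold and the coherence test below are illustrative choices, not the patent's method:

```python
import math

def luminal_motion_detected(first_locs, second_locs, threshold=2.0):
    """Return True when the tracked points of interest shift coherently
    between frames by more than `threshold` pixels, suggesting the luminal
    network moved relative to the instrument (physiological motion)."""
    disps = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(first_locs, second_locs)]
    n = len(disps)
    mean = (sum(d[0] for d in disps) / n, sum(d[1] for d in disps) / n)
    mag = math.hypot(*mean)
    coherent = all(math.hypot(d[0] - mean[0], d[1] - mean[1]) <= 0.5 * mag
                   for d in disps)
    return mag > threshold and coherent

first = [(10.0, 10.0), (50.0, 40.0), (80.0, 20.0)]
shifted = [(x + 3.0, y) for x, y in first]      # coherent drift of all points
print(luminal_motion_detected(first, shifted))  # True
```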

System and Method for Computation of Coordinate System Transformations
20220361959 · 2022-11-17

The present invention relates to a medical system (100) for determining a coordinate transformation between a coordinate system (INN.sub.1) of an internal structure (I.sub.1) inside a physical object (1) and a coordinate system (IMA) of a 3D image or model thereof.
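Determining such a transformation typically involves chaining intermediate frames, which can be sketched as a composition of 4x4 homogeneous transforms. The frame names and translation values below are illustrative assumptions:

```python
def matmul4(A, B):
    """Compose two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """4x4 homogeneous transform for a pure translation."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# Illustrative chain: internal structure -> physical object -> 3D image/model.
T_obj_from_int = translation(1.0, 0.0, 0.0)
T_ima_from_obj = translation(0.0, 2.0, 0.0)
T_ima_from_int = matmul4(T_ima_from_obj, T_obj_from_int)
print([row[3] for row in T_ima_from_int])  # [1.0, 2.0, 0.0, 1.0]
```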