A61B2034/2063

Surgical systems and methods for identifying tools guided by surgical robots
11534242 · 2022-12-27 ·

A surgical system for assisting a user in performing a surgical procedure at a surgical site, comprising a tool having a checkpoint, a pointer having a tip, and a localizer to determine a position of the pointer within a field of view. A memory comprises identification data associated with a plurality of tools. A controller is configured to prompt the user to position the tip of the pointer at the checkpoint, to receive position data from the localizer associated with the pointer within the field of view, to compare position data associated with the pointer against the identification data of the memory to determine an identity of the tool, and to present the user with the identity of the tool.

Three-dimensional segmentation from two-dimensional intracardiac echocardiography imaging

For three-dimensional segmentation from two-dimensional intracardiac echocardiography imaging, the three-dimensional segmentation is output by a machine-learnt multi-task generator. The machine-learnt multi-task generator is trained from 3D information, such as a sparse ICE volume assembled from the 2D ICE images. The machine-learnt multi-task generator is trained to output both the 3D segmentation and a complete volume. The 3D segmentation may be projected to 2D and used as an input, together with an ICE image, to another network trained to output a 2D segmentation for the ICE image. Display of the 3D segmentation and/or 2D segmentation may guide ablation of tissue in the patient.
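The abstract does not specify how the 3D segmentation is projected to 2D before being fed to the second network; a minimal sketch, assuming a simple maximum-intensity projection along one volume axis (the function name and axis choice are illustrative, not from the patent):

```python
import numpy as np

def project_segmentation_to_2d(seg_3d: np.ndarray, axis: int = 0) -> np.ndarray:
    """Collapse a binary 3D segmentation volume into a 2D mask by
    maximum-intensity projection along the chosen axis, so it can be
    paired with a 2D ICE frame as network input."""
    return seg_3d.max(axis=axis)

# toy 4x4x4 volume with a single segmented voxel block
vol = np.zeros((4, 4, 4), dtype=np.uint8)
vol[1:3, 1:3, 1:3] = 1
mask = project_segmentation_to_2d(vol, axis=0)
```

In practice the projection would follow the imaging geometry of the ICE plane rather than a volume axis, but the max-projection conveys the dimensionality reduction involved.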

SYSTEMS AND METHODS UTILIZING MACHINE-LEARNING FOR IN VIVO NAVIGATION

A method of providing in vivo navigation of a medical device includes: receiving input medical imaging data of a patient's anatomy; receiving input non-optical in vivo image data from a sensor on a distal end of the device in the anatomy; and using a trained model to locate the distal end in the input imaging data. The model is trained, based on (i) training medical imaging data and training non-optical in vivo image data of one or more individuals' anatomy and (ii) registration data associating the training in vivo image data with locations in the training imaging data as ground truth, to learn associations between the training in vivo image data and the training imaging data. The method further includes determining an output location of the medical device using the learned associations and the input data, modifying the input imaging data to depict the determined location, and causing a display to output the modified input imaging data.
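The patent does not specify the model architecture; as a toy stand-in for "learned associations" between in vivo image data and locations in the imaging data, a nearest-neighbour lookup over training feature vectors illustrates the idea (all names and the feature representation are assumptions):

```python
import numpy as np

def locate_distal_end(input_feature, training_features, training_locations):
    """Toy stand-in for the trained model: return the location whose
    associated training image feature is closest to the input feature."""
    dists = np.linalg.norm(training_features - input_feature, axis=1)
    return training_locations[int(np.argmin(dists))]

# hypothetical training set: feature vectors paired (via registration
# data) with locations in the medical imaging data
feats = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
locs  = np.array([[10, 10], [20, 25], [30, 5]])
loc = locate_distal_end(np.array([0.9, 1.1]), feats, locs)
# loc → [20, 25]
```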

PATH PREPARATION SYSTEM FOR PREPARING A PATH FOR A DEVICE
20220401155 · 2022-12-22 ·

A path preparation system for preparing a path for a device. The path preparation system includes an ultrasound transmitter and a tracking system. The tracking system is configured to determine a current position of the device on the path. The ultrasound transmitter is configured to focus an ultrasound wave onto a focus position that lies in front of the device in the direction of the path, in spatial relation to the current position of the device.
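The geometric relation described (a focus position a fixed distance ahead of the device along the path) can be sketched as simple vector arithmetic; the lead distance and unit-direction model are assumptions for illustration:

```python
import numpy as np

def focus_position(current_pos, path_direction, lead_distance):
    """Place the ultrasound focus a fixed distance ahead of the device,
    along the (normalized) direction of the path, relative to the
    current position reported by the tracking system."""
    d = np.asarray(path_direction, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(current_pos, dtype=float) + lead_distance * d

fp = focus_position([0.0, 0.0, 0.0], [0.0, 0.0, 2.0], 5.0)
# fp → [0., 0., 5.]
```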

Computer Vision Systems and Methods for Time-Aware Needle Tip Localization in 2D Ultrasound Images

Computer vision systems and methods for time-aware needle tip localization in two-dimensional (2D) ultrasound images are provided. A consecutive fused image sequence, derived by fusing enhanced frames with their corresponding B-mode frames, is processed by a time-aware neural network which includes a unified convolutional neural network (CNN) and a long short-term memory (LSTM) recurrent neural network. The CNN acts as a feature extractor, with stacked convolutional layers which progressively create a hierarchy of more abstract features. The LSTM models temporal dependencies in time-series data. The system learns spatiotemporal features associated with needle tip movement, for example, needle tip appearance and trajectory information, and successfully localizes the needle tip in the presence of abrupt intensity changes and motion artifacts.
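The fusion step that produces the input sequence for the CNN-LSTM is not detailed in the abstract; a minimal sketch, assuming a simple per-pixel weighted blend of each enhanced frame with its B-mode frame (the weighting scheme and function names are assumptions):

```python
import numpy as np

def fuse_frames(enhanced, bmode, alpha=0.5):
    """Per-pixel weighted fusion of an enhanced frame with its
    corresponding B-mode frame."""
    return alpha * enhanced + (1.0 - alpha) * bmode

def fused_sequence(enhanced_frames, bmode_frames, alpha=0.5):
    """Stack fused frames into the consecutive sequence that would be
    fed to a time-aware (CNN + LSTM) network."""
    return np.stack([fuse_frames(e, b, alpha)
                     for e, b in zip(enhanced_frames, bmode_frames)])

enh = [np.full((2, 2), 1.0), np.full((2, 2), 0.0)]
bm  = [np.full((2, 2), 0.0), np.full((2, 2), 1.0)]
seq = fused_sequence(enh, bm)
# seq.shape → (2, 2, 2): two fused 2x2 frames, each uniformly 0.5
```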

DEVICES, SYSTEMS, AND METHODS FOR TRANS-VAGINAL, ULTRASOUND-GUIDED HYSTEROSCOPIC SURGICAL PROCEDURES
20220401071 · 2022-12-22 ·

An ultrasound device includes an ultrasound body having a shaft and an ultrasound sensor assembly disposed at a distal end portion of the shaft. The ultrasound sensor assembly is configured to enable ultrasound imaging. A clip is configured for positioning about a portion of a surgical tool. The clip is configured to releasably engage the ultrasound body to thereby releasably couple the surgical tool with the ultrasound body. A surgical system includes the ultrasound device and the surgical tool.

Wireless position determination

The present invention relates to a system SY for determining a position of an RF transponder circuit RTC relative to an ultrasound emitter unit UEU. The RF transponder circuit RTC emits RF signals that are modulated based on received ultrasound signals emitted or reflected by the ultrasound emitter unit UEU. The position of the RF transponder circuit RTC relative to the ultrasound emitter unit UEU is determined based on a time difference ΔT1 between the emission of an ultrasound signal by the ultrasound emitter unit UEU and the detection, by an RF detector unit RFD, of the corresponding modulation in the RF signal emitted or reflected by the RF transponder circuit RTC.
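Because the RF signal travels at the speed of light, the time difference ΔT1 is dominated by the ultrasound propagation time, so the emitter-to-transponder distance follows from a time-of-flight calculation. A sketch under that assumption (the soft-tissue sound speed of 1540 m/s is a typical value, not taken from the patent):

```python
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical soft-tissue value (assumption)

def transponder_distance(delta_t1: float, c: float = SPEED_OF_SOUND_TISSUE) -> float:
    """Distance between the ultrasound emitter unit and the RF
    transponder circuit, derived from the time difference ΔT1 between
    ultrasound emission and detection of the corresponding RF
    modulation (RF propagation delay treated as negligible)."""
    return c * delta_t1

d = transponder_distance(65e-6)  # ΔT1 = 65 µs
# d ≈ 0.1001 m, i.e. about 10 cm
```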

Instrument guiding device

The invention relates to a medical instrument guiding device comprising the medical instrument; a monitoring device (2) comprising a support (3) and a medical imaging probe (4) arranged on the support; a screen (10); and a control unit (11) connected to the screen and the probe for generating at least one three-dimensional image. The control unit is configured to generate on the screen, from at least the three-dimensional image, at least one two-dimensional image showing a deformation of the instrument. The control unit is further configured to estimate, from the deformation of the instrument, a virtual path of the instrument if its insertion were extended toward a target, and to deduce therefrom at least one distance between the virtual path and the target.
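If the virtual path is modeled as a straight line extension of the instrument (a simplifying assumption; the patent allows for deformation-aware paths), the distance to the target reduces to a point-to-line computation:

```python
import numpy as np

def distance_path_to_target(path_point, path_direction, target):
    """Shortest distance from the target to the instrument's virtual
    path, modeled here as a straight line through path_point along
    path_direction."""
    p = np.asarray(path_point, dtype=float)
    d = np.asarray(path_direction, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(target, dtype=float) - p
    # remove the component of v along the path; the remainder is the
    # perpendicular offset from the path to the target
    perp = v - np.dot(v, d) * d
    return float(np.linalg.norm(perp))

dist = distance_path_to_target([0, 0, 0], [0, 0, 1], [3, 4, 10])
# dist → 5.0
```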

Biopsy apparatus and system

Certain aspects relate to biopsy apparatuses, systems and techniques for biopsy using a biopsy pattern. Some aspects relate to moving a distal portion of a medical instrument to one or more sample locations of the biopsy pattern and guiding the instrument to obtain tissue samples from the sample locations within the biopsy pattern. Some aspects relate to obtaining the biopsy pattern and adjusting the sample locations within the biopsy pattern based on various factors such as anatomical features.
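The abstract leaves the shape of the biopsy pattern open; one possible pattern, shown purely for illustration, is a regular grid of sample locations around a center point (spacing, grid size, and names are assumptions):

```python
import numpy as np

def grid_biopsy_pattern(center, spacing, n=3):
    """Illustrative n x n grid of sample locations centered on a point;
    an actual system would adjust these locations based on factors such
    as anatomical features."""
    offsets = (np.arange(n) - (n - 1) / 2) * spacing
    return np.array([[center[0] + dx, center[1] + dy]
                     for dx in offsets for dy in offsets])

pattern = grid_biopsy_pattern((0.0, 0.0), spacing=2.0)
# 9 sample locations; the middle location coincides with the center
```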

MULTI-ARM ROBOTIC SYSTEMS AND METHODS FOR MONITORING A TARGET OR PERFORMING A SURGICAL PROCEDURE
20220395342 · 2022-12-15 ·

Multi-arm robotic systems and methods for monitoring a target are provided. The system may include a first robotic arm configured to orient a first component and a second robotic arm configured to orient a second component. The first robotic arm and the second robotic arm may be co-registered. The first robotic arm may be caused to orient the first component at a first pose. The second robotic arm may be caused to orient the second component at a second pose. At least one image may be received from the first component and the second component.