Patent classifications
A61B2090/367
Three-dimensional segmentation from two-dimensional intracardiac echocardiography imaging
For three-dimensional segmentation from two-dimensional intracardiac echocardiography (ICE) imaging, the three-dimensional segmentation is output by a machine-learnt multi-task generator. The generator is trained from 3D information, such as a sparse ICE volume assembled from the 2D ICE images, and is trained to output both the 3D segmentation and a complete volume. The 3D segmentation may be projected to 2D and input, together with an ICE image, to another network trained to output a 2D segmentation for that ICE image. Display of the 3D segmentation and/or 2D segmentation may guide ablation of tissue in the patient.
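The projection of the 3D segmentation to 2D can be sketched minimally as collapsing a binary mask along one axis; the function name and axis choice are illustrative assumptions, not details from the abstract:

```python
import numpy as np

def project_mask_to_2d(seg_3d, axis=0):
    """Collapse a binary 3D segmentation to a 2D mask by projecting
    along one axis (a simple stand-in for the projection step)."""
    return np.asarray(seg_3d).any(axis=axis).astype(np.uint8)

# Tiny 2x2x2 segmentation with a single foreground voxel at (1, 0, 1).
seg = np.zeros((2, 2, 2), dtype=np.uint8)
seg[1, 0, 1] = 1
print(project_mask_to_2d(seg))
```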
POSITIONING AND TRACKING MEMBER, METHOD FOR RECOGNIZING MARKER, STORAGE MEDIUM, AND ELECTRONIC DEVICE
A positioning and tracking member, a method for recognizing a marker (20), a storage medium, and an electronic device. Because the positioning and tracking member is stuck directly onto the patient's body, no rigid connection between the member and the human body is required, thereby avoiding damage to the human body. Furthermore, in combination with a recognition algorithm for the marker (20), the marker (20) is quickly recognized in image space by comparing the actual size of each candidate connected region in a three-dimensional medical model with that of the marker (20), achieving both high recognition speed and high recognition accuracy.
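The size-comparison step can be sketched minimally, assuming candidate connected regions have already been extracted from the 3D model and summarized by voxel count; the function name, tolerance, and example numbers are hypothetical, not from the abstract:

```python
def match_marker_regions(region_voxel_counts, voxel_volume_mm3,
                         marker_volume_mm3, tolerance=0.2):
    """Return indices of candidate connected regions whose physical
    volume is within a relative tolerance of the known marker volume."""
    matches = []
    for idx, count in enumerate(region_voxel_counts):
        physical = count * voxel_volume_mm3
        if abs(physical - marker_volume_mm3) <= tolerance * marker_volume_mm3:
            matches.append(idx)
    return matches

# Example: 0.5 mm isotropic voxels (0.125 mm^3 each), marker of ~65 mm^3.
print(match_marker_regions([520, 5000, 40], 0.125, 65.0))  # → [0]
```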
MEDICAL OBSERVATION SYSTEM, CONTROL DEVICE, AND CONTROL METHOD
A medical observation system includes: a plurality of types of sensor units that measure information regarding an internal environment; an acquisition unit (131) that acquires individual sensor values of the plurality of types of sensor units; a comparison unit (132) that compares the individual sensor values of the plurality of types of sensor units acquired by the acquisition unit (131); and a determination unit (134) that determines a sensor unit to be used for observing the internal environment among the plurality of types of sensor units based on a comparison result obtained by the comparison unit (132).
DEVICES, SYSTEMS, AND METHODS FOR TRANS-VAGINAL, ULTRASOUND-GUIDED HYSTEROSCOPIC SURGICAL PROCEDURES
An ultrasound device includes an ultrasound body having a shaft and an ultrasound sensor assembly disposed at a distal end portion of the shaft. The ultrasound sensor assembly is configured to enable ultrasound imaging. A clip is configured for positioning about a portion of a surgical tool. The clip is configured to releasably engage the ultrasound body to thereby releasably couple the surgical tool with the ultrasound body. A surgical system includes the ultrasound device and the surgical tool.
Device and method for tracking the position of an endoscope within a patient's body
Systems and methods of tracking the position of an endoscope within a patient's body during an endoscopic procedure are disclosed. The devices and methods include determining a position of the endoscope within the patient in the endoscope's coordinate system, capturing in an image fiducial markers attached to the endoscope by an external optical tracker, transforming the captured fiducial markers from the endoscope's coordinate system to the optical tracker's coordinate system, projecting a virtual image of the endoscope on a model of the patient's organ, and projecting or displaying the combined image.
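In the rigid case, the transformation between the endoscope's and the optical tracker's coordinate systems is a rotation plus a translation; the following sketch assumes such a rigid calibration, with R and t purely illustrative:

```python
import numpy as np

def to_tracker_frame(points_endo, R, t):
    """Map points from the endoscope frame to the tracker frame via the
    rigid transform p_tracker = R @ p_endo + t."""
    return (R @ np.asarray(points_endo).T).T + t

# Illustrative calibration: 90° rotation about z plus a 10 mm x-offset.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
t = np.array([10., 0., 0.])
print(to_tracker_frame([[1., 0., 0.]], R, t))
```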
DEVICE AND METHOD FOR ASSISTING LAPAROSCOPIC SURGERY - DIRECTING AND MANEUVERING ARTICULATING TOOL
A surgical controlling system for controlling the 3D spatial position of at least one articulating surgical tool includes a controller configured to provide instructions to control movement of the surgical tool. The controller comprises a processor to determine a location of the surgical tool using at least one location estimating feature and to determine movement of the surgical tool using a database in communication with at least one movement detection feature. The location estimating feature is configured to real-time locate the 3D spatial position of the surgical tool at any given time t, and the movement detection feature is configured to detect movement of the surgical tool.
AUGMENTED REALITY-ASSISTED METHOD FOR PERFORMING SURGERY
An augmented reality-assisted method for performing surgery comprises: disposing a position sensing element at a facial positioning point of a patient before craniotomy to obtain skull space and intracranial space information for defining a coordinate space; obtaining a brain anatomical image for constructing a three-dimensional graphic, the graphic comprising a graphic positioning point and a feature associated with a gyrus feature; defining a relative positional relationship between the graphic and the space, aligning the facial positioning point with the graphic positioning point; using a probe to obtain a spatial position of the gyrus feature after craniotomy, using the gyrus feature as a calibration reference point; generating a displacement and rotation parameter based on a coordinate difference of the feature relative to the reference point; adjusting a position and/or an angle of the graphic on a display according to the parameter, and the display displaying the calibrated three-dimensional graphic.
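The displacement parameter derived from the coordinate difference can be sketched as a simple translation; recovering the rotation parameter would require more than one point pair, so only the displacement is shown, and all names and coordinates are illustrative:

```python
import numpy as np

def displacement_parameter(feature_xyz, reference_xyz):
    """Translation that moves the graphic's feature point onto the
    probed calibration reference point (names are hypothetical)."""
    return np.asarray(reference_xyz) - np.asarray(feature_xyz)

# Graphic feature at (12, 40, 7) mm, probed gyrus reference nearby.
shift = displacement_parameter([12.0, 40.0, 7.0], [12.5, 39.0, 7.2])
print(shift)
```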
ARTICULATED STRUCTURED LIGHT BASED-LAPAROSCOPE
In a method of using a structured-light based system, real-time 2D images of a portion of a field of view are captured using an endoscope. A portion of an object in the field of view is illuminated with a structured light pattern, and light reflected from the field of view is detected. From the reflected light, a 3D image of the field of view is constructed, and 3D locations of points on a surface of the object are determined. The real-time 3D spatial position of the endoscope and/or a surgical tool is determined. If the distance between the surface and the endoscope and/or surgical tool, as determined using the 3D spatial position, falls below a predetermined distance, an alert is generated to notify a user.
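The proximity check that triggers the alert can be sketched as a nearest-point distance test against the reconstructed surface points; the threshold value and function names are assumptions:

```python
import numpy as np

def proximity_alert(tool_pos, surface_points, min_distance_mm=5.0):
    """Return True if the tool/endoscope tip is closer than
    min_distance_mm to any reconstructed surface point."""
    d = np.linalg.norm(np.asarray(surface_points) - np.asarray(tool_pos),
                       axis=1)
    return bool(d.min() < min_distance_mm)

# Illustrative reconstructed surface points (mm).
surface = [[0., 0., 0.], [10., 0., 0.], [20., 0., 0.]]
print(proximity_alert([3., 0., 0.], surface))  # → True  (3 mm < 5 mm)
print(proximity_alert([3., 8., 0.], surface))  # → False (min ≈ 8.5 mm)
```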
INSTRUMENTS FOR ROBOTIC KNEE REVISION
A device for registering a bone for a robotic knee arthroplasty with a surgical robot can include a plate and a registration device. The plate can be engageable with the bone and can include a lateral portion, a medial portion, and a hinge. The registration device can be connected to the plate and can be configured to interface with the surgical robot for registration of the plate and the bone.
System for navigating a surgical instrument
The invention relates to a system for navigating a surgical instrument (1), comprising a processor configured for: obtaining a first 3D medical image of a first volume (V1) of a patient's body, said first volume (V1) comprising a reference marker (M), registering the first 3D image with said reference marker (M), obtaining a second 3D medical image of a second volume (V2) of the patient's body, said second volume (V2) being different from the first volume (V1) and not containing the reference marker (M) in its entirety, said first and second 3D images being obtained by a single imaging device, registering the second 3D medical image with the first 3D medical image, obtaining a virtual position of the surgical instrument (1) with respect to the reference marker (M) from a tracking system, and determining a virtual position of the surgical instrument (1) with respect to the second 3D medical image.
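The chained registrations can be sketched as a composition of homogeneous transforms, assuming all registrations are rigid: a point known in the reference-marker frame is carried through the first image's frame into the second. All matrices and coordinates here are illustrative, not from the claim:

```python
import numpy as np

def instrument_in_second_image(T_marker_to_img1, T_img1_to_img2, p_marker):
    """Map an instrument point from the reference-marker frame into the
    second 3D image by composing the two registrations (4x4 matrices)."""
    p_h = np.append(np.asarray(p_marker, float), 1.0)  # homogeneous point
    return (T_img1_to_img2 @ T_marker_to_img1 @ p_h)[:3]

# Illustrative rigid registrations: marker frame offset +5 mm in x
# within image 1; image 2 shifted -2 mm in z relative to image 1.
T_m1 = np.eye(4); T_m1[0, 3] = 5.0
T_12 = np.eye(4); T_12[2, 3] = -2.0
print(instrument_in_second_image(T_m1, T_12, [1.0, 0.0, 0.0]))
```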