Patent classifications
A61B2090/364
Visualization system for visualizing a three-dimensional target region of an object
A system for visualizing a three-dimensional target area of an object comprises a measuring device that determines the distance of a surgical instrument in the target area from a predetermined structure in that area, a display unit for presenting the views, and a control unit. The control unit keeps the display unit in a first display mode while the determined distance is greater than a predetermined first limit value, and switches from the first display mode into a second display mode when the determined distance drops below a predetermined second limit value, which is smaller than or equal to the predetermined first limit value.
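The two-limit scheme described in this abstract is a form of hysteresis: the display leaves the first mode only once the distance falls below the lower, second limit, which prevents rapid mode toggling when the instrument hovers near a single threshold. A minimal sketch of that control logic (class name, mode numbering, and limit values are illustrative, not taken from the patent):

```python
class DisplayController:
    """Switch display modes with hysteresis around two distance limits."""

    def __init__(self, first_limit: float, second_limit: float):
        assert second_limit <= first_limit
        self.first_limit = first_limit    # stay in first mode above this
        self.second_limit = second_limit  # enter second mode below this
        self.mode = 1                     # start in the first display mode

    def update(self, distance: float) -> int:
        if distance > self.first_limit:
            self.mode = 1                 # far from the structure: overview mode
        elif distance < self.second_limit:
            self.mode = 2                 # close to the structure: detail mode
        # between the two limits the current mode is kept (hysteresis band)
        return self.mode


ctrl = DisplayController(first_limit=10.0, second_limit=5.0)
assert ctrl.update(12.0) == 1   # above first limit: first mode
assert ctrl.update(7.0) == 1    # inside the band: mode unchanged
assert ctrl.update(4.0) == 2    # below second limit: second mode
assert ctrl.update(7.0) == 2    # back in the band: still second mode
```

The band between the two limits is what makes the second limit "smaller than or equal to" the first meaningful: with equal limits the behavior degenerates to a single threshold.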
Surgical evacuation sensor arrangements
Surgical systems can include evacuation systems for evacuating smoke, fluid, and/or particulates from a surgical site. A surgical evacuation system can be intelligent and may include one or more sensors for detecting one or more properties of the surgical system, evacuation system, surgical procedure, surgical site, and/or patient tissue, for example.
Method and system of providing visual information about a location and shape of a tumour under a body surface of a human or animal body
In a method and system for providing visual information about a tumour location in a human or animal body, an electromagnetic tumour sensor is provided in the tumour and tracked to determine its location in space, which is mapped to a tumour model. A surgical tool sensor is provided on a surgical tool and tracked to determine its location in space, which is mapped to a surgical tool model. The body is scanned to obtain information about an anatomical structure. A reference sensor is provided on the body and tracked to determine its location in space, which is mapped to the anatomical structure. A virtual image is displayed showing the tumour model, located with the tumour sensor, in spatial relationship to the surgical tool model, located with the surgical tool sensor, and the anatomical structure, located with the reference sensor.
Systems and methods for registering images obtained using various imaging modalities and verifying image registration
Embodiments of the present invention provide systems and methods to detect a moving anatomic feature during a treatment sequence based on a computed and/or a measured shortest distance between the anatomic feature and at least a portion of an imaging system.
System and method for RF ablation with generated images of ablated tissue lesions
The invention includes a system for generating virtual images of proposed and designated areas on a patient's anatomy that are to be treated in an RFA procedure. The images include the size, shape, and location of lesion/ablation patterns. The virtual images include dynamic (developing) or static (developed) lesions selected for the RFA procedure. The images are provided on at least one user interface that superimposes or overlays the lesion pattern(s) on an image of the patient's anatomy that undergoes the procedure. The images can be used to accurately and efficiently conduct RFA procedures and to record the procedures with enhanced visual data that confirms treated tissue areas. The invention further includes a diagnostic method of generating images in preparation for an RFA procedure, and a method of conducting the RFA procedure in which measured parameters determine the size and shape of the ablated areas achieved in the procedure.
Bone registration methods for robotic surgical procedures
A computer-implemented method to improve the point collection process during registration of a bone for a computer-assisted surgical procedure is provided. Based on bone digitization data, a simulation is performed to confirm the accuracy of the registration for different digitization regions, and the results are tested to identify which digitization regions meet a predefined accuracy requirement. The resulting information is used to perform a computer-assisted surgical procedure. A computerized simulation method for bone registration is also provided, in which a processor randomly strokes the expected exposed surface of a bone model with multiple stroke curves so that most of the model surface is covered, uniform noise is added to the sampled points, and a random sample consensus (RANSAC) step removes outlying points to find the best-overlapping subset and yield the best registration result. A method to perform computer-assisted surgery is also provided.
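The RANSAC outlier-removal step mentioned above can be illustrated with a toy consensus loop that estimates a 2D translation between digitized points and a model, keeping only the points that agree with it. This is a simplified sketch under strong assumptions (paired points, translation-only motion); a real bone registration would fit a full 6-DOF rigid transform, typically with ICP refinement:

```python
import random


def ransac_translation(src, dst, iters=200, tol=0.5, seed=0):
    """Estimate a 2D translation mapping src -> dst, robust to outliers.

    src, dst: paired lists of (x, y) points; some pairs may be outliers.
    Returns (translation, inlier_indices) for the largest consensus set.
    """
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), []
    for _ in range(iters):
        i = rng.randrange(len(src))  # minimal sample: one correspondence
        t = (dst[i][0] - src[i][0], dst[i][1] - src[i][1])
        inliers = [
            j for j in range(len(src))
            if abs(src[j][0] + t[0] - dst[j][0]) < tol
            and abs(src[j][1] + t[1] - dst[j][1]) < tol
        ]
        if len(inliers) > len(best_inliers):
            best_t, best_inliers = t, inliers
    return best_t, best_inliers


# Digitized points shifted by (2, 1), plus one gross outlier at index 3.
src = [(0, 0), (1, 0), (0, 1), (5, 5)]
dst = [(2, 1), (3, 1), (2, 2), (9, 9)]
t, inliers = ransac_translation(src, dst)
assert t == (2, 1)
assert inliers == [0, 1, 2]   # the outlier pair (index 3) is rejected
```

The "top subset" in the abstract corresponds to the largest inlier set found across the sampling iterations.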
Enhanced catheter navigation methods and apparatus
Methods, apparatus, and systems are provided for facilitating the navigation of a catheter between first and second locations within a subject based on display of serial images corresponding to positions of the catheter at successive incremental times. Image production includes sensing catheter positions to produce location data for each time increment. For each position P_i, the corresponding location data is processed to produce an image I_i reflecting the position of the catheter at a time T_i. Each image I_i is successively displayed at a time equal to T_i + d, where d is an image processing visualization delay. Upon a condition that the catheter is displaced to a selected interim location between the first and second locations, the processing of the location data is switched from a first process associated with a first visualization delay to a second process associated with a second, different visualization delay.
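The timing scheme above pairs each sensed time T_i with a display time T_i + d, where d changes when the processing pipeline is switched at the interim location. A minimal simulation of that schedule (function and variable names are illustrative, not from the patent):

```python
def display_schedule(sample_times, delays):
    """Pair each sensed time T_i with its display time T_i + d_i.

    sample_times: times T_i at which catheter positions are sensed.
    delays: per-sample visualization delay d_i (d changes when the
            processing pipeline is switched mid-navigation).
    """
    return [(t, t + d) for t, d in zip(sample_times, delays)]


# The first process has a 0.2 s delay; once the interim location is
# reached (sample index 2 onward), a faster 0.05 s process takes over.
times = [0.0, 0.1, 0.2, 0.3]
delays = [0.2, 0.2, 0.05, 0.05]
sched = display_schedule(times, delays)
assert sched[0] == (0.0, 0.2)    # first process: shown 0.2 s after sensing
assert sched[2] == (0.2, 0.25)   # second process: shown 0.05 s after sensing
```

Switching to a process with a shorter delay trades image fidelity for responsiveness near the target, which is the navigational point of the claimed method.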
Scene perception systems and methods
Scene perception systems and methods are described herein. In certain illustrative examples, a system combines data sets associated with imaging devices included in a dynamic multi-device architecture and uses the combined data sets to perceive a scene (e.g., a surgical scene) imaged by the imaging devices. To illustrate, the system may access tracking data for imaging devices capturing images of a scene and fuse, based on the tracking data, data sets respectively associated with the imaging devices to generate fused sets of data for the scene. The tracking data may represent a change in a pose of at least one of the imaging devices that occurs while the imaging devices capture images of the scene. The fused sets of data may represent or be used to generate perceptions of the scene. In certain illustrative examples, scene perception is dynamically optimized using a feedback control loop.
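Fusing data sets based on device poses amounts to transforming each device's measurements into a common scene frame before combining them. A minimal 2D sketch of that idea (hypothetical names; real systems track full 6-DOF poses and calibrated camera intrinsics):

```python
import math


def to_scene_frame(points, pose):
    """Transform 2D points from a device frame into the scene frame.

    pose: (x, y, theta) of the device in the scene frame.
    """
    x0, y0, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(x0 + c * px - s * py, y0 + s * px + c * py) for px, py in points]


def fuse(device_data):
    """Concatenate per-device point sets after mapping them to the scene."""
    fused = []
    for points, pose in device_data:
        fused.extend(to_scene_frame(points, pose))
    return fused


# Two devices observe the same scene point (1, 0) from different poses.
a = ([(1.0, 0.0)], (0.0, 0.0, 0.0))             # device at the origin
b = ([(-1.0, 0.0)], (1.0, 1.0, math.pi / 2))    # translated and rotated
fused = fuse([a, b])
assert abs(fused[1][0] - 1.0) < 1e-9 and abs(fused[1][1]) < 1e-9
```

Updating the pose used in `to_scene_frame` as the tracking data changes is what keeps the fused data consistent while a device moves during capture.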
In-scale tablet display for medical device position guidance
An in-scale display device is provided. The in-scale display device includes a tablet or mobile device having an electronic display screen that is configured to display at least one reference image in-scale with a subject. A medical device position guidance system including the in-scale display device and an invasive medical device system, and a method of using the medical device position guidance system, are also provided.
Medical apparatus with optical sensing, and related devices and methods
A medical apparatus can include an instrument comprising a shaft and a jaw assembly coupled to an end of the shaft; an image capture device; and a controller operably coupled to the image capture device to receive image data from the image capture device. The image data is from images of material gripped between jaw members of the jaw assembly and captured by the image capture device, with the controller programmed to process the received image data using at least one of optical flow and digital image correlation. A medical apparatus can include an instrument comprising a shaft, and a jaw assembly coupled to an end of the shaft, the jaw assembly comprising a pair of jaw members having opposing surfaces configured to grasp material between the opposing surfaces, wherein at least a portion of the opposing surface of a first jaw member of the pair of jaw members is transparent.