Patent classifications
A61B2090/363
Three-dimensional segmentation from two-dimensional intracardiac echocardiography imaging
For three-dimensional segmentation from two-dimensional intracardiac echocardiography (ICE) imaging, the three-dimensional segmentation is output by a machine-learnt multi-task generator. The generator is trained from 3D information, such as a sparse ICE volume assembled from the 2D ICE images, and is trained to output both the 3D segmentation and a complete volume. The 3D segmentation may be projected to 2D and input, together with an ICE image, to another network trained to output a 2D segmentation for that ICE image. Display of the 3D segmentation and/or 2D segmentation may guide ablation of tissue in the patient.
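The projection step described above can be sketched as follows. This is only an illustrative choice: the abstract does not specify the projection operator, so a simple max projection of a binary 3D mask along one volume axis is assumed here.

```python
import numpy as np

def project_segmentation_to_2d(seg_3d: np.ndarray, axis: int = 0) -> np.ndarray:
    """Project a binary 3D segmentation mask to a 2D mask.

    A max projection along one axis: a 2D pixel is foreground if any
    voxel along the projection ray is segmented. The patent does not
    name the projection, so this operator is assumed for illustration.
    """
    return seg_3d.max(axis=axis)

# Toy example: a 4x4x4 volume containing a segmented 2x2x2 block.
vol = np.zeros((4, 4, 4), dtype=np.uint8)
vol[1:3, 1:3, 1:3] = 1
proj = project_segmentation_to_2d(vol, axis=0)
print(proj.shape, int(proj.sum()))  # (4, 4) 4
```

The resulting 2D mask could then be stacked with the ICE image as an extra input channel to the 2D segmentation network.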
APPARATUS AND METHOD FOR SENSING AND ANALYZING SKIN CONDITION
A skin imaging and diagnostic method and apparatus comprising a frame configured to circumscribe a target tissue on the skin of a patient. An electro-optics unit of the apparatus comprises: an illuminator assembly comprising illuminating elements configured to provide illumination light on the target tissue; an imaging optics assembly; and an image sensor assembly comprising an image sensor. The imaging optics assembly is configured to collect backscattered illumination light from the target tissue and focus the collected light on the image sensor, which is disposed to sense an image of the target tissue. A controller is configured to activate the illuminating elements and to capture each image from the image sensor.
Robotic navigation of robotic surgical systems
In certain embodiments, the systems, apparatus, and methods disclosed herein relate to robotic surgical systems with built-in navigation capability for patient position tracking and surgical instrument guidance during a surgical procedure, without the need for a separate navigation system. Robot-based navigation of surgical instruments during surgical procedures allows for easy registration and operative volume identification and tracking. The systems, apparatus, and methods herein allow re-registration, model updates, and operative-volume definition to be performed intra-operatively with minimal disruption to the surgical workflow. In certain embodiments, navigational assistance can be provided to a surgeon by displaying a surgical instrument's position relative to a patient's anatomy. Additionally, by revising pre-operatively defined data such as operative volumes, patient-robot orientation relationships, and anatomical models of the patient, a higher degree of precision and a lower risk of complications and serious medical error can be achieved.
In-situ additive implants
An in-situ additive-manufacturing system for growing an implant in-situ for a patient. The system has a multi-nozzle dispensing subsystem and a distal control arm. The multi-nozzle dispensing subsystem in one embodiment includes first and second dispensing nozzles. The first and second nozzles include first and second printing-material delivery channels, respectively. In another embodiment, the in-situ additive-manufacturing system includes a multi-material subsystem having a dispensing nozzle including first and second printing material delivery channels. Controlling computing and robotics componentry are provided. In various aspects, respective storage for first and second printing materials, and one or more pumping structures, are provided.
SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR IMPROVED MINI-SURGERY USE CASES
An imaging system (aka a 3D camera) operative in conjunction with a tube having two open ends, the system comprising active portions small enough to fit into the tube and an electronic subsystem including a hardware processor operative to receive image(s) from the active portions and to generate therefrom at least one 3D image of a scene visible via one of the tube's open ends. The system may comprise a tracker configured to be secured to the tube, and a method for monitoring the location, e.g. absolute location, of the tube accordingly.
Surgical Simulation System With Coordinated Imaging
An interactive and dynamic surgical simulation system may be used in the context of a computer-implemented interactive surgical system. The surgical simulation system may provide coordinated surgical imaging. A processor may be configured to execute a simulation of a surgical procedure. The surgical procedure may be simulated in a simulated surgical environment. The processor may generate a first visual representation and a second visual representation, both of a first portion of the simulated surgical environment. The processor may coordinate generation of the first visual representation and the second visual representation such that they correspond to a common event in the surgical procedure. The processor may present the first visual representation and the second visual representation for user interaction within the simulated surgical environment.
Mobile surgical tracking system with an integrated fiducial marker for image guided interventions
A mobile surgical tracking system comprises a mobile surgical tracking device comprising an integrated fiducial marker and an imaging device. The imaging device is configured to generate an image of a patient's anatomical structure. The mobile surgical tracking system comprises a tracking system coordinate frame. The integrated fiducial marker has a position with a known relation to the tracking system coordinate frame, for direct registration of the image to the coordinate system of the mobile surgical tracking device.
System and Method for Computation of Coordinate System Transformations
The present invention relates to a medical system (100) for determining a coordinate transformation between a coordinate system (INN.sub.1) of an internal structure (I.sub.1) inside a physical object (1) and a coordinate system (IMA) of a 3D image or model thereof.
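A coordinate transformation of this kind is commonly computed from corresponding point pairs in the two coordinate systems. As a minimal sketch (not the patented method), the least-squares rigid transform between matched 3D point sets can be found with the Kabsch algorithm:

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (Kabsch) mapping src points to dst.

    Both inputs are (N, 3) arrays of corresponding points; returns a
    3x3 rotation R and translation t such that dst ~= src @ R.T + t.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Recover a known rotation and translation from synthetic points.
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
```

Here `src` could hold landmark positions in the internal-structure coordinate system (INN.sub.1) and `dst` the corresponding positions in the image coordinate system (IMA); the point correspondences themselves are assumed given.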
REGISTRATION AND ALIGNMENT OF IMPLANTABLE SONIC WINDOWS
A medical device and a method of use thereof for frameless stereotaxy guided intracranial surgery. The medical device includes a central section made from a material that is transparent to ultrasound, providing a sonic window, and an ultrasound reflective frame surrounding the central section. The method includes the step of registering the ultrasound reflective frame with the frameless stereotaxy system for localization of the medical device during surgery. The medical device allows use of ultrasound imaging, wherein the output of ultrasound imaging can be computationally combined with MRI or CT imaging data to compensate for anatomical changes in the brain during surgery and to enhance localization and navigation to the surgery target.
SYSTEMS AND METHODS FOR GENERATING MULTIPLE REGISTRATIONS
Systems and methods for generating multiple registrations are provided. An image depicting a portion of a patient's anatomy and a tracking device affixed to an accurate robot may be received. A first registration of a patient coordinate space to a robotic coordinate space may be generated based on the image. A second registration of the patient coordinate space to a navigation coordinate space may be generated based at least in part on a position of second markers on the tracking device, as detected by a navigation system. The first registration and the second registration may be independent of each other.
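Registrations like these are typically represented as 4x4 homogeneous transforms. A minimal sketch, with made-up identity rotations and offsets purely for illustration, shows how the two independent registrations can nevertheless be chained through the shared patient coordinate space:

```python
import numpy as np

def make_transform(R: np.ndarray, t) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical registrations (identity rotations, offsets only):
T_patient_to_robot = make_transform(np.eye(3), [100.0, 0.0, 50.0])
T_patient_to_nav = make_transform(np.eye(3), [-20.0, 5.0, 0.0])

# Although generated independently, the two registrations can be
# composed: robot coordinates -> patient coordinates -> navigation.
T_robot_to_nav = T_patient_to_nav @ np.linalg.inv(T_patient_to_robot)

p_robot = np.array([100.0, 0.0, 50.0, 1.0])  # a point in robot space
p_nav = T_robot_to_nav @ p_robot
print(p_nav[:3])  # [-20.  5.  0.]
```

Keeping the two registrations independent, as the abstract describes, means an error in one (e.g. the image-based robot registration) does not corrupt the other (the marker-based navigation registration).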