Patent classifications
A61B2034/2048
AUGMENTED REALITY SYSTEM AND METHODS FOR STEREOSCOPIC PROJECTION AND CROSS-REFERENCING OF LIVE X-RAY FLUOROSCOPIC AND COMPUTED TOMOGRAPHIC C-ARM IMAGING DURING SURGERY
A method for performing a procedure on a patient includes acquiring a three-dimensional image of a location of interest on the patient and acquiring a two-dimensional image of the location of interest. A computer system can relate the three-dimensional image with the two-dimensional image to form a holographic image dataset. The computer system can register the holographic image dataset with the patient. An augmented reality system can render a hologram, based on the holographic image dataset, registered to the patient. The hologram can include a projection of the three-dimensional image and a projection of the two-dimensional image. The practitioner can view the hologram with the augmented reality system and perform the procedure on the patient. The practitioner can employ the augmented reality system to visualize a point on the projection of the three-dimensional image and a corresponding point on the projection of the two-dimensional image during the procedure.
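The cross-referencing of a point on the three-dimensional projection with its counterpart on the two-dimensional fluoroscopic projection can be sketched, assuming a known 3x4 projection matrix relating the CT volume frame to the fluoroscopic image plane; the matrix values and function name below are illustrative, not taken from the patent:

```python
import numpy as np

def project_to_fluoro(point_3d, P):
    """Project a 3-D point (CT/volume frame) onto the 2-D fluoroscopic
    image plane using a 3x4 projection matrix P (homogeneous pinhole model)."""
    p = np.append(np.asarray(point_3d, dtype=float), 1.0)  # homogeneous coords
    u, v, w = P @ p
    return np.array([u / w, v / w])  # corresponding 2-D image point

# Hypothetical projection matrix: identity rotation, unit focal length,
# principal point at (256, 256) pixels.
P = np.array([[1.0, 0.0, 256.0, 0.0],
              [0.0, 1.0, 256.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
pt2d = project_to_fluoro([10.0, -5.0, 100.0], P)
```

In practice the matrix would come from registering the C-arm geometry to the patient, which is what the holographic image dataset encodes.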
PLANNING AND NAVIGATION IN SUPERSELECTIVE DRUG DELIVERY VIA THE TRACHEOBRONCHIAL AIRWAY
Devices, systems, and methods for localized delivery of a chemotherapy, hormonal therapy, or targeted drug/biologic therapy to a target tissue area of an internal body organ of a patient. Computer systems may be used for planning and navigation in superselective drug delivery via a tracheobronchial airway. A catheter may be used to form a sealed treatment chamber in a natural lumen extending through the target tissue area. Air is purged from the chamber, which is then filled with a liquid drug solution for a treatment session of adequate time, solution volume, and drug concentration to saturate the target tissue area, thereby providing the treatment. The liquid drug solution may be circulated or recirculated through the chamber or maintained stationary therewithin to saturate the target tissue area. The chamber is evacuated at the end of the treatment session.
MULTIPLE-INPUT INSTRUMENT POSITION DETERMINATION
A robotic system includes an instrument including an elongate shaft, a robotic manipulator configured to manipulate the elongate shaft of the instrument, and control circuitry communicatively coupled to the robotic manipulator. The control circuitry is configured to: determine a first estimated position of at least a portion of the elongate shaft of the instrument based at least in part on robotic command data; determine a second estimated position of the at least a portion of the elongate shaft based at least in part on position sensor data; compare the first estimated position and the second estimated position; and generate a third estimated position based at least in part on the comparison of the first estimated position to the second estimated position.
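The comparison-and-fusion step can be illustrated with an inverse-variance weighted average of the command-based and sensor-based estimates, falling back to the sensor estimate when they disagree strongly; the variances, discrepancy threshold, and function name are assumptions for illustration, not the patented method:

```python
import numpy as np

def fuse_positions(pos_cmd, var_cmd, pos_sensor, var_sensor, max_discrepancy=10.0):
    """Fuse a kinematics/command-based position estimate with a sensor-based
    one via inverse-variance weighting; fall back to the sensor estimate if
    the two disagree by more than max_discrepancy (units assumed to be mm)."""
    pos_cmd = np.asarray(pos_cmd, dtype=float)
    pos_sensor = np.asarray(pos_sensor, dtype=float)
    if np.linalg.norm(pos_cmd - pos_sensor) > max_discrepancy:
        return pos_sensor  # command data deemed unreliable
    w_cmd = 1.0 / var_cmd
    w_sensor = 1.0 / var_sensor
    return (w_cmd * pos_cmd + w_sensor * pos_sensor) / (w_cmd + w_sensor)

# Sensor estimate has lower variance, so the fused point leans toward it.
fused = fuse_positions([10.0, 0.0, 0.0], 4.0, [12.0, 0.0, 0.0], 1.0)
```

A weighted combination of this kind is one standard way to realize a "third estimated position" from two independent estimates.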
METHOD OF LOCATING A MOBILE PERCEPTION DEVICE INTENDED TO BE WORN OR CARRIED BY A USER IN A SURGICAL SCENE
A method of locating at least one mobile perception device of a navigation platform, the mobile perception device intended to be worn or carried by a user in a surgical scene, the navigation platform including at least one perception sensor, comprises: acquiring, by the at least one perception sensor, a plurality of successive images of the scene including the portion of the patient's body intended to be subjected to the surgical operation; and processing the plurality of successive images to evaluate a relative position of the mobile perception device and that portion of the body, wherein the relative position takes into account a movement of the mobile perception device.
Virtual reality training, simulation, and collaboration in a robotic surgical system
A virtual reality system providing a virtual robotic surgical environment, and methods for using the virtual reality system, are described herein. Within the virtual reality system, various user modes enable different kinds of interactions between a user and the virtual robotic surgical environment. For example, one variation of a method for facilitating navigation of a virtual robotic surgical environment includes displaying a first-person perspective view of the virtual robotic surgical environment from a first vantage point, displaying a first window view of the virtual robotic surgical environment from a second vantage point, and displaying a second window view of the virtual robotic surgical environment from a third vantage point. Additionally, in response to a user input associating the first and second window views, a trajectory between the second and third vantage points can be generated, sequentially linking the first and second window views.
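The trajectory linking two vantage points can be sketched under the simplifying assumption of linear interpolation between camera positions (the function name and step count are illustrative; a real fly-through would also interpolate orientation):

```python
import numpy as np

def vantage_trajectory(p_start, p_end, n_steps=10):
    """Generate intermediate camera positions linearly interpolated between
    two vantage points, approximating a fly-through that links two window
    views of the virtual environment."""
    t = np.linspace(0.0, 1.0, n_steps)[:, None]  # interpolation parameter per step
    return (1.0 - t) * np.asarray(p_start, dtype=float) + t * np.asarray(p_end, dtype=float)

# Four camera positions from the second vantage point toward the third.
path = vantage_trajectory([0.0, 0.0, 0.0], [9.0, 0.0, 0.0], n_steps=4)
```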
ALGORITHM-BASED METHODS FOR PREDICTING AND/OR DETECTING A CLINICAL CONDITION RELATED TO INSERTION OF A MEDICAL INSTRUMENT TOWARD AN INTERNAL TARGET
Provided are computer-implemented methods and systems for generating and/or utilizing data analysis algorithm(s) for predicting and/or detecting a clinical condition related to insertion of a medical instrument toward a target in a body of a patient based, inter alia, on data related to an automated medical device and/or to operation thereof.
Medical holding apparatus and medical observation system
A medical holding apparatus includes: a support including a plurality of arms, and a plurality of joints configured to connect the plurality of arms, the support being configured to support an imaging unit at a distal end thereof; a load applying mechanism arranged in at least one of the joints and configured to apply a resistance load against operation of the at least one of the joints to the support; and a processor comprising hardware, the processor being configured to: set torque to be applied by the load applying mechanism based on an operating state of the imaging unit; and apply a load corresponding to the set torque to the load applying mechanism when a rotation inhibit state of each of the arms of the support is released.
Surgical instruments, control assemblies, and surgical systems facilitating manipulation and visualization
A surgical instrument includes a housing, a shaft extending from the housing, an end effector assembly extending from the shaft and configured to rotate and/or articulate relative to the housing, a motor disposed within the housing and operably coupled to the end effector assembly to rotate and/or articulate the end effector assembly relative to the housing, and a sensing assembly configured to sense movement of the housing relative to a reference position and to drive the motor to rotate and/or articulate the end effector assembly relative to the housing based upon the sensed movement. The sensing assembly is configured to operate in each of a standard mode, wherein movement of the housing effects rotation and/or articulation of the end effector assembly in a similar direction, and a reversed mode, wherein movement of the housing effects rotation and/or articulation of the end effector assembly in an opposite direction.
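The standard versus reversed modes can be illustrated as a sign convention on the mapping from sensed housing motion to the end effector command (a hypothetical sketch, not the patented control law):

```python
def effector_command(housing_delta, mode="standard"):
    """Map a sensed housing motion increment to an end-effector rotation or
    articulation command. Standard mode mirrors the motion in the same
    direction; reversed mode inverts it (sign convention is illustrative)."""
    if mode == "reversed":
        return -housing_delta
    return housing_delta

same = effector_command(5.0)                     # standard mode
opposite = effector_command(5.0, mode="reversed")  # reversed mode
```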
ULTRASONIC ROBOTIC SURGICAL NAVIGATION
Surgical robot systems, anatomical structure tracker apparatuses, and ultrasound (US) transducer apparatuses are disclosed. A surgical robot system includes a robot, a US transducer, and at least one processor. The robot includes a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector. The at least one processor is operative to obtain an image volume for the patient and to track pose of the end-effector relative to anatomical structure captured in the image volume based on the US imaging data.
CAPSULE ENDOSCOPE APPARATUS AND METHOD OF SUPPORTING LESION DIAGNOSIS
Provided are a capsule endoscope apparatus for supporting a lesion diagnosis and a lesion diagnosis supporting method using the same. The capsule endoscope apparatus for supporting lesion diagnosis includes an imaging unit configured to capture one or more images of an inside of a body, a control unit configured to detect a suspected lesion region in the images and perform a precision diagnosis procedure when a suspected lesion region corresponding to a value equal to or greater than a certain threshold is detected, an image processing unit configured to process the images in the precision diagnosis procedure, and a communication module configured to transmit processed images to, and receive images from, another capsule endoscope apparatus or a terminal by using a wireless communication method.
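The threshold-triggered precision-diagnosis decision can be sketched as a simple per-frame triage; the score scale and threshold value are illustrative assumptions, not values from the patent:

```python
def triage_frame(lesion_score, threshold=0.8):
    """Decide whether a captured frame triggers the precision-diagnosis
    procedure: suspected-lesion scores at or above the threshold are
    flagged (threshold value is illustrative)."""
    return lesion_score >= threshold

# Only the middle frame's score clears the assumed threshold.
flags = [triage_frame(s) for s in (0.3, 0.85, 0.79)]
```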