Patent classifications
A61B2017/00203
SURGERY SYSTEM, CONTROL METHOD, SURGICAL APPARATUS, AND PROGRAM
The present technology relates to a surgery system, a control method, a surgical apparatus, and a program that allow easy updating of a function of a medical apparatus. A configuration includes an information processor, and a surgical apparatus. The information processor includes a storage section storing an application, and a transmission section transmitting, in response to a request from the surgical apparatus, the application stored in the storage section. The surgical apparatus includes a reception section receiving the application transmitted by the transmission section, and an execution section executing the application received by the reception section.
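The store/transmit/receive/execute flow described above can be sketched as a minimal client-server exchange. All class and method names here are illustrative, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    code: str  # application body to be executed by the apparatus


class InformationProcessor:
    """Holds a storage section and a transmission section (names assumed)."""

    def __init__(self):
        self._storage = {}

    def store(self, app: Application) -> None:
        # Storage section: keep the application until requested.
        self._storage[app.name] = app

    def transmit(self, requested_name: str) -> Application:
        # Transmission section: respond to a request from the surgical apparatus.
        return self._storage[requested_name]


class SurgicalApparatus:
    """Holds a reception section and an execution section (names assumed)."""

    def __init__(self, processor: InformationProcessor):
        self._processor = processor

    def request_and_execute(self, name: str):
        # Reception section: receive the application sent by the processor.
        app = self._processor.transmit(name)
        # Execution section: run the received application.
        namespace = {}
        exec(app.code, namespace)
        return namespace.get("result")


processor = InformationProcessor()
processor.store(Application("white_balance", "result = 'white balance updated'"))
apparatus = SurgicalApparatus(processor)
outcome = apparatus.request_and_execute("white_balance")
print(outcome)
```

The point of this shape is that the apparatus never stores applications itself; a function update only requires replacing the stored `Application` on the processor side.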
SYSTEM AND METHOD TO DETECT AND TRACK SURGICAL INSTRUMENTS AND/OR SURGICAL MATERIAL
Certain embodiments may relate to apparatuses and methods for performing surgical procedures. For example, a method may comprise initiating detection and tracking of at least one surgical instrument (including associated surgical material) within a surgical area. The method may further comprise performing a surgical procedure with the at least one surgical instrument and material and ending detection and tracking of the at least one surgical instrument and material within the surgical area. The method may further comprise displaying at least one indication of location status of the at least one surgical instrument.
SURGICAL SKILL TRAINING SYSTEM AND MACHINE LEARNING-BASED SURGICAL GUIDE SYSTEM USING THREE-DIMENSIONAL IMAGING
A surgical skill training system includes: a data collecting unit configured to collect actual surgical skill data on a patient of an operating surgeon; an image providing server configured to generate a 3-dimensional (3D) surgical image for surgical skill training, based on the actual surgical skill data; and a user device configured to display the 3D surgical image, wherein the image providing server includes: a patient image generating unit configured to generate a patient image, based on patient information of the patient; a surgical stage classifying unit configured to classify the actual surgical skill data into actual surgical skill data for each surgical stage performed by the operating surgeon; and a 3D image generating unit configured to generate the 3D surgical image by using the patient image, and feature information detected from the actual surgical skill data.
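The stage-classification step described above can be sketched as grouping recorded skill samples by surgical stage and deriving feature information per stage. The sample fields (`stage`, `tool_speed_mm_s`) are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical skill data collected during an operation.
samples = [
    {"stage": "incision", "tool_speed_mm_s": 12.0},
    {"stage": "suturing", "tool_speed_mm_s": 4.5},
    {"stage": "incision", "tool_speed_mm_s": 10.0},
]


def classify_by_stage(samples):
    # Surgical stage classifying unit: split data by the stage it belongs to.
    stages = defaultdict(list)
    for s in samples:
        stages[s["stage"]].append(s)
    return stages


def stage_features(stages):
    # Feature information: mean tool speed per stage, which a 3D image
    # generating unit could use to annotate the surgical image.
    return {
        stage: sum(s["tool_speed_mm_s"] for s in ss) / len(ss)
        for stage, ss in stages.items()
    }


stages = classify_by_stage(samples)
features = stage_features(stages)
print(features)
```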
METHODS AND SYSTEMS FOR USING VOICE INPUT TO CONTROL A SURGICAL ROBOT
Methods, apparatuses, and systems for using speech input to control a surgical robot are disclosed. A surgical robot is disclosed that can be controlled by a surgeon using speech input in a conversational manner. The surgical robot receives either general commands or specific instructions, assesses whether the instructions can be completed within the capabilities of the available hardware and resources, and seeks approval from the surgeon prior to executing the instructions. Alternatively, the disclosed embodiments allow the surgeon to perform an action that cannot be safely completed by the surgical robot.
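The assess-then-approve loop described above can be sketched as follows. The capability set and the approval callback are illustrative assumptions, not details from the patent:

```python
# Assumed set of actions within the robot's hardware capabilities.
CAPABILITIES = {"hold_retractor", "adjust_camera"}


def handle_instruction(instruction: str, approve) -> str:
    """Assess feasibility, then seek surgeon approval before executing.

    `approve` stands in for the conversational confirmation step: it is
    called with the instruction and returns True if the surgeon approves.
    """
    if instruction not in CAPABILITIES:
        # Outside robot capability: the surgeon performs the action instead.
        return "deferred to surgeon"
    if not approve(instruction):
        # Feasible, but the surgeon withheld approval.
        return "cancelled"
    return f"executing {instruction}"


print(handle_instruction("adjust_camera", approve=lambda i: True))
print(handle_instruction("resect_tissue", approve=lambda i: True))
```

The key ordering is that the feasibility check happens before approval is ever requested, so the surgeon is only asked to confirm actions the robot can actually carry out.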
Methods for performing medical procedures using a surgical robot
Embodiments are directed to a medical robot system including a robot coupled to an end-effectuator element, with the robot configured to control movement and positioning of the end-effectuator in relation to a patient. One embodiment is a method for removing bone with a robot system comprising: taking a two-dimensional slice through a computed tomography scan volume of target anatomy; placing a perimeter on a pathway to the target anatomy; and controlling a drill assembly with the robot system to remove bone along the pathway in the intersection of the perimeter and the two-dimensional slice.
Medical voice command integration
Systems and methods for controlling healthcare devices and systems using voice commands are presented. In some aspects, a listening device may receive a voice command from a person. The voice command may be translated into human-readable or machine-readable text via a speech-to-text service. A control component may receive the text and send device-specific instructions to a medical device associated with a patient based on the translated voice command. In response to the instructions, the medical device may take an action on the patient. Some examples of actions taken may include setting an alarm limit on a monitor actively monitoring a patient and adjusting the amount of medication delivered by an infusion pump. Because these devices may be controlled using a voice command, in some cases no physical or manual interaction with the device is needed. As such, multiple devices may be controlled hands-free from any location.
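The text-to-instruction step described above can be sketched as a small parser that maps translated command text to a device-specific instruction. The device names, action names, and phrasing rules here are illustrative assumptions:

```python
def parse_command(text: str):
    """Map translated voice-command text to a device-specific instruction.

    Returns a dict describing the target device, the action, and a value,
    or None if the command is not recognized (illustrative schema).
    """
    words = text.lower().split()
    if "alarm" in words:
        # e.g. "set alarm limit to 120" -> instruct the patient monitor
        return {"device": "monitor", "action": "set_alarm_limit",
                "value": int(words[-1])}
    if "infusion" in words or "medication" in words:
        # e.g. "increase infusion rate to 5" -> instruct the infusion pump
        return {"device": "infusion_pump", "action": "set_rate",
                "value": int(words[-1])}
    return None


instr = parse_command("set alarm limit to 120")
print(instr)
```

A production system would sit behind a speech-to-text service and validate values against per-device safety ranges before sending anything to the device; this sketch only shows the routing.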
SURGICAL VIEWING SYSTEM
A surgical viewing system including an X-ray source, a surgical tool and an actuator. The X-ray source creates a beam of radiation used in an image creating process. The surgical tool has the X-ray source coupled thereto, and the surgical tool has an axis of rotation. The actuator is coupled to the surgical tool causing the beam of radiation to be shifted relative to the axis of rotation.
METHOD, APPARATUS AND SYSTEM FOR CONTROLLING AN IMAGE CAPTURE DEVICE DURING SURGERY
A system for controlling a medical image capture device during surgery is provided, the system including circuitry configured to acquire first image data from the medical image capture device, the first image data being of an appearance of a surgical scene at a first instance of time; determine, based on a predicted appearance of the surgical scene based on the first image data at a second instance of time after the first instance of time, one or more desired image capture properties of the medical image capture device; and control the medical image capture device at a third instance of time, the third instance of time being between the first instance of time and the second instance of time, in accordance with the one or more desired image capture properties of the medical image capture device.
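The timing relationship above (acquire at a first instant, predict the scene at a second instant, apply control at a third instant between the two) can be sketched with a toy brightness predictor. The linear extrapolation and the exposure rule are stand-ins, not the patent's actual prediction method:

```python
def predict_brightness(b_t1: float, trend: float, dt: float) -> float:
    # Stand-in prediction step: linearly extrapolate scene brightness
    # from the first image data to the second instance of time.
    return b_t1 + trend * dt


def desired_exposure(predicted_brightness: float, target: float = 0.5) -> float:
    # Illustrative control rule: brighter predicted scene -> shorter
    # exposure, clamped to a plausible range.
    return max(0.1, min(2.0, target / predicted_brightness))


t1, t2 = 0.0, 1.0                 # first and second instances of time
b_t1, trend = 0.4, 0.2            # observed brightness, rising over time
b_t2 = predict_brightness(b_t1, trend, t2 - t1)
exposure = desired_exposure(b_t2)
t3 = (t1 + t2) / 2                # third instance: control applied early
assert t1 < t3 < t2               # the claimed ordering of the three instants
print(b_t2, exposure)
```

The design point is that the capture device is already adjusted *before* the predicted scene change arrives, rather than reacting after it.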
SYSTEMS AND METHODS FOR IDENTIFYING AND FACILITATING AN INTENDED INTERACTION WITH A TARGET OBJECT IN A SURGICAL SPACE
An exemplary system includes a memory storing instructions and a processor communicatively coupled to the memory. The processor may be configured to execute the instructions to: detect an intent of a user of a computer-assisted surgical system to use a robotic instrument attached to the computer-assisted surgical system to interact with a target object while the target object is located in a surgical space; determine a pose of the target object in the surgical space; and perform, based on the detected intent of the user to interact with the target object and the determined pose of the target object in the surgical space, an operation with respect to the target object.
ACCURACY SYSTEM
An accuracy system configured to determine the accuracy of a stereotactic system. The accuracy system is configured to determine a displacement between a pointer tip positioned by the stereotactic system and a target point defined by a phantom base. The accuracy system is configured to mechanically engage the phantom base when the phantom base mechanically engages the stereotactic system to determine the displacement. In examples, a gauge support mechanically engages a pin of the phantom base and determines the displacement using one or more visible indicia. In examples, a gauge frame supports one or more cameras and determines the displacement using a first image and a second image obtained by the one or more cameras. The accuracy system provides an output viewable by a practitioner to indicate the determined displacement.
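The displacement at the heart of the abstract above is a Euclidean distance between two 3D points. A minimal sketch, with illustrative coordinates:

```python
import math


def displacement(pointer_tip, target_point) -> float:
    """Euclidean displacement between the pointer tip positioned by the
    stereotactic system and the target point defined by the phantom base.

    Both arguments are (x, y, z) tuples in a shared frame; the values
    below are illustrative, not from the patent.
    """
    return math.dist(pointer_tip, target_point)


tip = (10.2, 5.1, 3.0)      # where the stereotactic system placed the tip
target = (10.0, 5.0, 3.0)   # where the phantom base says the tip should be
d = displacement(tip, target)
print(round(d, 3))          # a magnitude the practitioner can compare to a tolerance
```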