Patent classifications
G06V2201/034
SURGICAL NAVIGATION SYSTEM, AND IMAGING METHOD OF THE SAME
A surgical navigation system includes a first tracking unit, a second tracking unit and a processing unit. The first tracking unit captures a first infrared image of a position identification unit that includes a reference target fixed on a patient and an instrument target disposed on a surgical instrument. The second tracking unit captures a second infrared image of the position identification unit. The processing unit performs image recognition on the first and second infrared images with respect to the position identification unit, and uses, based on a result of the image recognition, a pathological image and one of the first and second infrared images to generate an augmented reality image. When both the first and second infrared images contain both the reference target and the instrument target, whichever of the two images has the higher accuracy is used to generate the augmented reality image.
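The selection rule in this abstract can be sketched as a small decision function. This is an illustrative interpretation only: the `TrackerFrame` structure, its fields, and the single `accuracy` metric are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TrackerFrame:
    """One infrared frame plus its recognition results (hypothetical structure)."""
    has_reference_target: bool
    has_instrument_target: bool
    accuracy: float  # assumed scalar quality metric for the recognition result


def select_frame(first: TrackerFrame, second: TrackerFrame) -> Optional[TrackerFrame]:
    """Pick the frame used to build the augmented reality image."""
    def complete(f: TrackerFrame) -> bool:
        # A frame is usable only if it sees both the reference and instrument targets.
        return f.has_reference_target and f.has_instrument_target

    if complete(first) and complete(second):
        # Both frames see both targets: prefer the higher-accuracy one.
        return first if first.accuracy >= second.accuracy else second
    if complete(first):
        return first
    if complete(second):
        return second
    return None  # neither frame sees both targets
```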
SURGICAL KIT INSPECTION SYSTEMS AND METHODS FOR INSPECTING SURGICAL KITS HAVING PARTS OF DIFFERENT TYPES
Surgical kit inspection systems and methods are provided for inspecting surgical kits having parts of different types. The surgical kit inspection system comprises a vision unit including a first camera unit and a second camera unit to capture images of parts of a first type and a second type in each kit and to capture images of loose parts from each kit that are placed on a light surface. A robot supports the vision unit to move the first and second camera units relative to the parts in each surgical kit. One or more controllers obtain unique inspection instructions for each of the surgical kits to control inspection of each of the surgical kits and control movement of the robot and the vision unit accordingly to provide output indicating inspection results for each of the surgical kits.
IMAGE-BASED PAIRING AND CONTROLLING OF DEVICES IN A CLINICAL ENVIRONMENT
An example method includes capturing images using a camera and detecting a medical device in a first image among the images. A request is transmitted to the medical device. Based on transmitting the request, the example method includes determining that the medical device has output a chirp signal in a second image among the images. Based on the chirp signal, the method includes causing the medical device to perform an action by transmitting a control message to the medical device.
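The pairing flow described here (detect device, request a chirp, confirm the chirp in a later frame, then send a control message) can be sketched as a loop over camera frames. All the callables below are assumed interfaces for illustration; the patent does not specify an API, and the `"perform_action"` message is a placeholder.

```python
def pair_and_control(frames, detect_device, send_request, detect_chirp, send_control):
    """Sketch of image-based device pairing; every callable is a hypothetical stub.

    frames: iterable of camera images.
    Returns the paired device, or None if pairing never completes.
    """
    frames = iter(frames)
    for frame in frames:
        device = detect_device(frame)  # find a medical device in this frame
        if device is None:
            continue
        send_request(device)            # ask the device to emit a chirp signal
        confirm = next(frames, None)    # look for the chirp in a subsequent frame
        if confirm is not None and detect_chirp(confirm, device):
            # Chirp confirmed: the device in the image is the one we addressed.
            send_control(device, "perform_action")  # placeholder control message
            return device
    return None
```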
MEDICAL IMAGE PROCESSING APPARATUS, MEDICAL IMAGE PROCESSING METHOD, ENDOSCOPE SYSTEM, AND MEDICAL IMAGE PROCESSING PROGRAM
An object of the present invention is to provide a medical image processing apparatus, an endoscope system, a medical image processing method, and a medical image processing program that are capable of displaying a region of interest with appropriate distinguishability. A medical image processing apparatus according to an aspect of the present invention is a medical image processing apparatus including a processor. The processor is configured to perform an image acquisition process of acquiring an observation image of a subject; a region-of-interest recognition process of recognizing a region of interest from the observation image; a tool information recognition process of recognizing tool information from the observation image, the tool information being information on a tool used for treatment of the subject; and a display control process of causing a display device to distinguishably display the observation image in a manner in which the region of interest has distinguishability based on a recognition result of the tool information.
SEGMENTATION GENERATION FOR IMAGE-RELATED FUNCTIONS
A computer system may perform an image-related function using a segmentation (e.g., an image mask) that has been generated by a custom segmentation machine learning model. To begin, the system may receive image data corresponding to a surgical scene including a background that includes an anatomical feature and at least one surgical tool. The system may also generate segmentation data by inputting the image data to the custom segmentation machine learning model. The system may then generate a segmentation of the at least one surgical tool using the segmentation data. Once the segmentation has been generated, the system may perform an image-related function using the segmentation.
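The pipeline this abstract describes (image in, per-pixel segmentation data out, then an image-related function applied through the mask) can be sketched as follows. The per-pixel-logits model interface and the choice of class 1 as "tool" are assumptions for illustration; the patent's custom model is not specified.

```python
import numpy as np


def segment_tools(image, model):
    """Run a (hypothetical) segmentation model and return a boolean tool mask.

    `model` is assumed to map an HxWxC image to per-pixel class logits of
    shape (H, W, num_classes), with class 1 meaning "surgical tool".
    """
    logits = model(image)                 # segmentation data: (H, W, num_classes)
    labels = np.argmax(logits, axis=-1)   # per-pixel class labels
    return labels == 1                    # boolean mask of tool pixels


def apply_mask(image, mask):
    """Example image-related function: keep only the tool pixels."""
    out = np.zeros_like(image)
    out[mask] = image[mask]
    return out
```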
On-board tool tracking system and methods of computer assisted surgery
A number of improvements are provided relating to computer assisted surgery (CAS) utilizing an on-tool tracking (OTT) system. The various improvements relate generally both to the methods used during computer assisted surgery and to the devices used during such procedures. Other improvements relate to the structure of the tools used during a procedure and to how the tools can be controlled using the OTT device. Still other improvements relate to methods of providing feedback during a procedure to improve the efficiency or quality, or both, of a procedure, including the rate and type of data processed depending upon a CAS mode.
INDICATION OF THE COUPLE PAIR OF REMOTE CONTROLS WITH REMOTE DEVICES FUNCTIONS
A method of assessing inter-device communication pairing in a surgical setting may include transmitting, by a first intelligent medical device, wireless communication data within the surgical setting, receiving, by a second intelligent medical device, the wireless communication data from the first intelligent medical device, determining, by the second intelligent medical device, communication pairing data indicative of an inter-device communication pairing of the second intelligent medical device with the first intelligent medical device, transmitting, by the second intelligent medical device, the communication pairing data to a modular control tower, and displaying, by the modular control tower on a display device, an augmented reality display comprising one or more virtual objects indicative of the inter-device communication pairing. An interactive surgical system may include multiple intelligent medical devices and displays which can form communication pairs in this manner.
METHOD FOR THE RECOMPOSITION OF A KIT OF SURGICAL INSTRUMENTS AND CORRESPONDING APPARATUS
A method and an apparatus are described for the recomposition of a kit (11) of surgical instruments (12), in which it is provided to dispose a plurality of surgical instruments (12) on a support plane (21) comprised in a support device (13); acquire an image (26) of the surgical instruments (12) by means of at least one optical detection device (14); and recognize each surgical instrument (12) by processing the acquired images (26).
Automatic ablation antenna segmentation from CT image
Provided in accordance with the present disclosure are systems and methods for identifying a percutaneous tool in image data. An exemplary method includes receiving image data of at least a portion of a patient's body, identifying an entry point of a percutaneous tool through the patient's skin in the image data, analyzing a portion of the image data including the entry point of the percutaneous tool through the patient's skin to identify a portion of the percutaneous tool inserted through the patient's skin, determining a trajectory of the percutaneous tool based on the identified portion of the percutaneous tool inserted through the patient's skin, identifying a remaining portion of the percutaneous tool in the image data based on the identified entry point and the determined trajectory of the percutaneous tool, and displaying the identified portions of the percutaneous tool on the image data.
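The trajectory step (fit an axis through the entry point and the detected inserted portion, then extrapolate to find the remaining portion) can be sketched with a least-squares line fit. The PCA-style fit and the coordinate conventions are assumptions; the patent does not name a specific fitting method.

```python
import numpy as np


def tool_trajectory(entry_point, shaft_points):
    """Estimate the tool axis as a unit direction from the skin entry point.

    Uses a least-squares line fit (principal axis via SVD) over the entry
    point and the detected inserted-shaft points, all in image coordinates.
    """
    pts = np.vstack([entry_point, shaft_points])
    centroid = pts.mean(axis=0)
    # The dominant right singular vector is the principal axis of the points.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    # Orient the axis from the entry point toward the inserted shaft.
    into_body = np.asarray(shaft_points).mean(axis=0) - np.asarray(entry_point)
    if np.dot(into_body, direction) < 0:
        direction = -direction
    return direction / np.linalg.norm(direction)


def extrapolate(entry_point, direction, depth):
    """Predict a point on the remaining tool portion along the trajectory."""
    return np.asarray(entry_point, dtype=float) + depth * np.asarray(direction)
```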
METHOD FOR MONITORING OBJECT FLOW WITHIN A SURGICAL SPACE DURING A SURGERY
One variation of a method for tracking objects within a surgical space during a surgery includes: based on a first image depicting the surgical space at a first time, detecting a first object and a constellation of objects in the surgical space, estimating distances from each object—in the constellation of objects—to the first object, and calculating a contamination risk of the first object based on contamination scores and distances to the first object for each object in the constellation of objects; calculating a contamination score of the first object based on a combination of the contamination risks of the first object during the surgery; and, in response to the contamination score of the first object exceeding a threshold contamination score prior to contact between the first object and a patient, serving a prompt within the surgical space to address sterility of the first object.
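The scoring logic above (per-frame risk from nearby objects' scores and distances, a running combined score, and a threshold-triggered prompt) can be sketched numerically. The inverse-distance weighting and the simple summation are assumptions made for illustration; the abstract states only that risk depends on contamination scores and distances.

```python
def contamination_risk(distances, contamination_scores, falloff=1.0):
    """Risk accrued by an object in one frame.

    Each object in the constellation contributes its contamination score,
    attenuated by its distance to the object of interest (assumed
    inverse-distance weighting with a tunable falloff).
    """
    return sum(
        score / (1.0 + falloff * dist)
        for score, dist in zip(contamination_scores, distances)
    )


def update_score(per_frame_risks):
    """Combine per-frame risks into a running contamination score.

    A plain sum is used here; the combination rule is unspecified in the abstract.
    """
    return sum(per_frame_risks)


def should_prompt(score, threshold):
    """Serve a sterility prompt when the score exceeds the threshold."""
    return score > threshold
```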