Patent classifications
A61B90/37
Systems and methods for visualizing navigation of medical devices relative to targets
Systems and methods for visualizing navigation of a medical device with respect to a target using a live fluoroscopic view. The methods include displaying, on a screen, a three-dimensional (3D) view of a 3D model of a target from the perspective of a medical device tip. The methods also include displaying, on the screen, a live two-dimensional (2D) fluoroscopic view showing a medical device, and displaying a target mark, which corresponds to the 3D model of the target, overlaid on the live 2D fluoroscopic view. The methods may include determining whether the medical device tip is aligned with the target, displaying the target mark in a first color if the medical device tip is aligned with the target, and displaying the target mark in a second color, different from the first color, if the medical device tip is not aligned with the target.
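The color-switching logic described above can be sketched as follows; the distance-threshold alignment test, pixel tolerance, and the specific colors are illustrative assumptions, not details from the abstract.

```python
import math

ALIGNED_COLOR = "green"    # first color (assumed)
MISALIGNED_COLOR = "red"   # second color (assumed)

def target_mark_color(tip_xy, target_xy, tolerance_px=10.0):
    """Return the overlay color for the target mark on the live 2D view.

    Alignment is approximated here as the projected tip falling within a
    pixel tolerance of the projected target center.
    """
    dx = tip_xy[0] - target_xy[0]
    dy = tip_xy[1] - target_xy[1]
    aligned = math.hypot(dx, dy) <= tolerance_px
    return ALIGNED_COLOR if aligned else MISALIGNED_COLOR
```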
Using pulmonary vein isolation for patients with atrial fibrillation
A method for ablating tissue of a patient, consisting of ascertaining a CHA₂DS₂-VASc score for the patient and inserting a probe into the patient so as to contact a pulmonary vein of the patient. The method further includes applying energy via the probe so as to ablate the pulmonary vein until pulmonary vein isolation (PVI) is achieved. When PVI is achieved and the CHA₂DS₂-VASc score is less than a preset value, ablation of the pulmonary vein is ceased. When PVI is achieved and the CHA₂DS₂-VASc score is greater than or equal to the preset value, energy is applied to perform a further ablation.
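The claimed decision rule reduces to a simple branch on PVI status and the CHA₂DS₂-VASc score. A minimal sketch, assuming a preset value of 2 (the abstract does not specify one):

```python
def next_action(pvi_achieved: bool, cha2ds2_vasc: int, preset: int = 2) -> str:
    """Decide the next step of the ablation procedure per the claimed logic."""
    if not pvi_achieved:
        # PVI not yet achieved: keep applying energy via the probe.
        return "continue ablating pulmonary vein"
    if cha2ds2_vasc < preset:
        # PVI achieved and score below the preset value: stop.
        return "cease ablation"
    # PVI achieved and score at or above the preset value: ablate further.
    return "perform further ablation"
```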
Position detection based on tissue discrimination
A system is proposed comprising an optical sensing means and a processing unit. The optical sensing means may include an optical guide with a distal end, wherein the optical guide may be configured to be arranged in a device to be inserted into tissue in a region of interest. The processing unit may be configured to receive information about a region of interest including different tissue types, as well as about a path through the tissues, and to determine a sequence of tissue types along the path. The processing unit may further be configured to determine a tissue type at the distal end of the optical guide based on information received from the optical sensing means, to compare the determined tissue type with the tissue types on the path, to determine possible positions of the distal end of the optical guide on the path based on the comparison of tissue types, and to generate a signal indicative of the possible positions.
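The core matching step can be sketched as comparing the sensed tissue type against the known sequence of tissue types along the planned path and reporting every consistent position. The tissue labels below are illustrative assumptions:

```python
def possible_positions(path_tissues, sensed_tissue):
    """Indices along the planned path where the sensed tissue type occurs.

    Each index is a candidate position of the optical guide's distal end.
    """
    return [i for i, t in enumerate(path_tissues) if t == sensed_tissue]

# Example path through the region of interest (assumed labels):
path = ["skin", "fat", "muscle", "fat", "tumor"]
candidates = possible_positions(path, "fat")
# → [1, 3]: the distal end could be in either fat layer
```

Matching a short history of sensed tissues, rather than a single reading, would narrow the candidates further; the single-reading version keeps the sketch minimal.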
Control apparatus and medical observation system
A control apparatus includes: an acquisition unit configured to acquire an operation instruction made by a voice input to an imaging device including an optical system with a focus lens and an image sensor; and a controller configured to, when the operation instruction is an instruction to stop an operation of the focus lens, control the focus lens moving at a first velocity to move at a second velocity lower than the first velocity and then stop movement.
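A sketch of the stop behavior: on a voice "stop" instruction, the controller first drops the focus lens from its current (first) velocity to a lower second velocity rather than halting abruptly. The deceleration factor and command string are assumptions:

```python
def focus_velocity_on_command(current_velocity: float, command: str,
                              slow_factor: float = 0.25) -> float:
    """Return the new focus-lens velocity after a voice command.

    On "stop", move at a second velocity lower than the first before the
    lens actually stops; other commands leave the velocity unchanged.
    """
    if command == "stop":
        return current_velocity * slow_factor
    return current_velocity
```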
MEDICAL INSTRUMENT DISPLAYS AND MEDICAL INSTRUMENT DISPLAY PROGRAMS
A medical instrument display includes an image storage that stores a plurality of image data of a medical instrument, a data analyzer that searches the plurality of image data stored in the image storage under a certain condition, and a display controller that causes a display to display, in an orderly arrangement, a first medical instrument image that has been selected from thumbnail images, the thumbnail images being based on a plurality of the image data obtained by the search, and a second medical instrument image different from the first medical instrument image.
Magnetic Resonance Breast Support
Disclosed herein is a medical instrument (100, 200, 300, 400, 500, 600, 900) comprising a subject support (102) configured for supporting a subject (110) in a Fowler's position during a magnetic resonance imaging examination. The subject support comprises a leg support region (104) configured for supporting a leg region of the subject horizontally. The subject support further comprises a thoracic support (106) configured for supporting an upper body region of the subject. The subject support is configured such that the thoracic support is inclined (108) with respect to the leg support region to hold the subject in the Fowler's position. The medical instrument further comprises a breast support (114). The breast support comprises a planar support surface (116) configured for supporting breasts of the subject. The breast support is connected to the subject support. The support surface is configured for being horizontal during the magnetic resonance imaging examination.
COMPUTER ASSISTED SURGERY SYSTEM, SURGICAL CONTROL APPARATUS AND SURGICAL CONTROL METHOD
A computer assisted surgery system comprising: a computerised surgical apparatus; and a control apparatus; wherein the control apparatus comprises circuitry configured to: receive information indicating a first region of a surgical scene from which information is obtained by the computerised surgical apparatus to make a decision; receive information indicating a second region of the surgical scene from which information is obtained by a medical professional to make a decision; determine if there is a discrepancy between the first and second regions of the surgical scene; and if there is a discrepancy between the first and second regions of the surgical scene: perform a predetermined process based on the discrepancy.
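One plausible discrepancy test between the region the apparatus used and the region the medical professional attended to is intersection-over-union of two axis-aligned boxes against a threshold. The IoU metric and the threshold value are assumptions, not specified by the abstract:

```python
def iou(a, b):
    """Intersection-over-union of boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def regions_discrepant(system_region, human_region, threshold=0.5):
    """True when the two regions overlap too little, triggering the
    predetermined process described in the abstract."""
    return iou(system_region, human_region) < threshold
```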
SYSTEM FOR CHECKING INSTRUMENT STATE OF A SURGICAL ROBOTIC ARM
A surgical robotic system includes: a surgical console having a display and a user input device configured to generate a user input; a surgical robotic arm having a surgical instrument configured to treat tissue and being actuatable in response to the user input; and a video camera configured to capture video data that is displayed on the display. The system also includes a control tower coupled to the surgical console and the surgical robotic arm. The control tower is configured to: process the user input to control the surgical instrument and to record the user input as input data; train a machine learning system using the input data and the video data; and execute the machine learning system to determine a probability of failure of the surgical instrument.
METHODS AND ARRANGEMENTS TO DESCRIBE DEFORMITY OF A BONE
Logic may determine how to reduce bone segments. Logic may communicate one or more images to display with at least two bone segments. Logic may identify a first reduction point and a third point on a first bone segment and identify a second reduction point and a fourth point on a second bone segment. Logic may identify a fifth point on the first bone segment and a sixth point on the second bone segment. Logic may also divide the one or more images along a line or plane between the bone segments, bring the second reduction point and the associated image segment to the first reduction point, and align the line or plane of the second bone segment with a line or plane of the first bone segment. Furthermore, logic may adjust alignment and record the movement of the image segments, or compare original and final positions, to determine deformity parameters.
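Comparing original and final positions to derive deformity parameters can be sketched in 2D as the translation of a reduction point plus the rotation of the segment's alignment line. The 2D simplification and the parameter set (translation, rotation) are assumptions for illustration:

```python
import math

def deformity_parameters(reduction_before, reduction_after,
                         axis_before, axis_after):
    """Translation (dx, dy) and rotation in degrees between positions.

    Each axis is a pair of (x, y) points defining the bone segment's
    alignment line before and after reduction.
    """
    dx = reduction_after[0] - reduction_before[0]
    dy = reduction_after[1] - reduction_before[1]
    angle = lambda p, q: math.atan2(q[1] - p[1], q[0] - p[0])
    rotation = math.degrees(angle(*axis_after) - angle(*axis_before))
    return (dx, dy), rotation
```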
Robotic catheter system including imaging system control
A robotic catheter procedure system includes a bedside system and a workstation. The bedside system includes an actuating mechanism configured to engage and to impart movement to a percutaneous device. The workstation includes a user interface and a control system configured to be operatively coupled to the user interface, the bedside system, and a medical imaging system. The control system is responsive to a first input and to a second input, and the user interface receives the second input from a user. The control system is configured to generate a first control signal to the medical imaging system based on the first input, and the medical imaging system captures at least one image in response to the first control signal. The control system is configured to generate a second control signal to the actuating mechanism based on the second input, and the actuating mechanism causes movement of the percutaneous device in response to the second control signal. The first input is indicative of upcoming percutaneous device movement.
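The workstation's dual-path control flow can be sketched as dispatching the two inputs to two control signals: the first (indicating upcoming percutaneous device movement) drives image capture, and the second (from the user) drives the actuating mechanism. The signal tuples and names below are illustrative placeholders:

```python
def control_signals(first_input=None, second_input=None):
    """Map the two inputs to (target, command, payload) control signals."""
    signals = []
    if first_input is not None:
        # First control signal: the imaging system captures at least one
        # image ahead of the indicated percutaneous device movement.
        signals.append(("imaging_system", "capture_image", first_input))
    if second_input is not None:
        # Second control signal: the actuating mechanism moves the
        # percutaneous device per the user input.
        signals.append(("actuating_mechanism", "move_device", second_input))
    return signals
```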