
Control device and master slave system

Provided is a control device including a control unit that calculates a first positional relationship between an eye of an observer observing an object displayed on a display unit and a first point in a master-side three-dimensional coordinate system, and controls an imaging unit that images the object so that a second positional relationship between the imaging unit and a second point corresponding to the first point in a slave-side three-dimensional coordinate system corresponds to the first positional relationship.
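The core idea of the abstract, reproducing the observer-to-point relationship on the camera side, can be sketched as a simple offset transfer. This is a minimal illustration only, assuming the master- and slave-side coordinate systems share scale and orientation; the function and variable names are hypothetical and not from the patent:

```python
import numpy as np

def target_camera_pose(eye_pos, first_point, second_point):
    """Place the imaging unit relative to the slave-side point with the
    same offset the observer's eye has from the master-side point."""
    offset = np.asarray(eye_pos, float) - np.asarray(first_point, float)
    return np.asarray(second_point, float) + offset

# Eye 10 units in front of the master-side point; the camera is placed
# with the same offset from the slave-side point.
cam = target_camera_pose([0.0, 0.0, 10.0], [0.0, 0.0, 0.0], [5.0, 5.0, 5.0])
```

In practice each side would carry its own rotation as well, so the offset would be expressed and re-applied in each frame's orientation rather than copied directly.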

Mixed-reality surgical system with physical markers for registration of virtual models

An example method includes obtaining a virtual model of a portion of an anatomy of a patient from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.

Surgical navigation system and method

The present disclosure relates to a surgical navigation system for the alignment of a surgical instrument and methods for its use, wherein the surgical navigation system may comprise a head-mounted display comprising a lens. The surgical navigation system may further comprise a tracking unit, wherein the tracking unit may be configured to track a patient tracker and/or a surgical instrument. Patient data may be registered to the patient tracker. The surgical instrument may define an instrument axis. The surgical navigation system may be configured to plan one or more trajectories based on the patient data. The head-mounted display may be configured to display an augmented reality visualization, including an augmented reality position alignment visualization and/or an augmented reality angular alignment visualization related to the surgical instrument, on the lens of the head-mounted display.

SURGEON HEAD-MOUNTED DISPLAY APPARATUSES

An augmented reality surgical system includes a head mounted display (HMD) with a see-through display screen, a motion sensor, a camera, and computer equipment. The motion sensor outputs a head motion signal indicating measured movement of the HMD. The computer equipment computes the relative location and orientation of reference markers connected to the HMD and to the patient based on processing a video signal from the camera. The computer equipment generates a three dimensional anatomical model using patient data created by medical imaging equipment, rotates and scales at least a portion of the three dimensional anatomical model based on the relative location and orientation of the reference markers, and further rotates at least a portion of the three dimensional anatomical model based on the head motion signal to track measured movement of the HMD. The rotated and scaled three dimensional anatomical model is displayed on the display screen.
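The rotate-and-scale step described above amounts to applying a rotation matrix and a scale factor to the model's vertices. A minimal sketch, using a fixed 90° rotation about the z-axis in place of real head-tracking output (all names are illustrative, not from the patent):

```python
import numpy as np

def apply_view_transform(vertices, rotation, scale):
    """Rotate and uniformly scale model vertices (N x 3 array).

    Each vertex v is mapped to scale * (rotation @ v); the row-vector
    form below (v @ R.T) is equivalent.
    """
    return (np.asarray(vertices, float) @ rotation.T) * scale

# 90-degree rotation about z (stand-in for a tracked head pose).
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
pts = np.array([[1.0, 0.0, 0.0]])
out = apply_view_transform(pts, Rz, 2.0)
```

A real system would recompute the rotation every frame from the head motion signal and the tracked marker poses.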

Systems and methods for surgical navigation

Disclosed are systems, methods, and techniques for registering a HMD coordinate system of a head-mounted display (HMD) and a localizer coordinate system of a surgical navigation localizer. A camera of the HMD captures at least one image of a registration device having a registration coordinate system and a plurality of registration markers. The registration markers are analyzed in the at least one image to determine a pose of the HMD coordinate system relative to the registration coordinate system. One or more position sensors comprised in the localizer detect a plurality of tracking markers comprised in the registration device to determine a pose of the registration coordinate system relative to the localizer coordinate system. The HMD coordinate system and the localizer coordinate system are registered using the registration device, wherein positions of the registration markers are known with respect to positions of the tracking markers in the registration coordinate system.
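The registration chain described above can be illustrated with 4x4 homogeneous transforms: the HMD-to-registration-device pose (from the camera image) and the localizer-to-registration-device pose (from the position sensors) combine to relate the two coordinate systems. This is a sketch under the assumption of identity poses, with hypothetical variable names following the T_frameA_frameB convention:

```python
import numpy as np

def compose(pose_a_b, pose_b_c):
    """Chain two 4x4 homogeneous poses: T_a_c = T_a_b @ T_b_c."""
    return pose_a_b @ pose_b_c

# Pose of the registration device in the HMD frame, as would be
# estimated from the registration markers in the camera image.
T_hmd_reg = np.eye(4)

# Pose of the registration device in the localizer frame, as would be
# measured by the localizer's position sensors via the tracking markers.
T_localizer_reg = np.eye(4)

# Registration result: the localizer frame expressed in the HMD frame.
T_hmd_localizer = compose(T_hmd_reg, np.linalg.inv(T_localizer_reg))
```

The known rigid offset between the registration markers and the tracking markers on the device is what makes the two independently measured poses refer to the same frame.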

METHOD, APPARATUS AND SYSTEM FOR CONTROLLING AN IMAGE CAPTURE DEVICE DURING SURGERY

A system for controlling a medical image capture device during surgery, the system including: circuitry configured to receive a first image of the surgical scene, captured by the medical image capture device from a first viewpoint, and additional information of the scene; determine, for the medical image capture device, in accordance with the additional information and previous viewpoint information of surgical scenes, one or more candidate viewpoints from which to obtain an image of the surgical scene; provide, in accordance with the first image of the surgical scene, for each of the one or more candidate viewpoints, a simulated image of the surgical scene from the candidate viewpoint; control the medical image capture device to obtain an image of the surgical scene from the candidate viewpoint corresponding to a selection of one of the one or more simulated images of the surgical scene.

SYSTEMS AND METHODS FOR SURGICAL NAVIGATION
20230008222 · 2023-01-12

Imaging systems and methods may facilitate positioning an imaging device in a procedure room. A 3D image of a subject may be obtained, where the subject is to have a procedure performed thereon. A view of the 3D image of the subject may be adjusted to a desired view and an associated 2D image reconstruction at the desired view may be obtained. A position for the imaging device that is associated with the desired view of the 3D image of the subject may be identified. Adjusting a view of the 3D image to a desired view and obtaining a 2D image reconstruction may be performed pre-procedure, such that a user may be able to create a list of desired views pre-procedure. A user may adjust a physical position of the imaging device to obtain reconstructed 2D preview images at the adjusted physical position of the imaging device prior to capturing an image.

Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display

A system includes a robotic arm, an autostereoscopic display, a user image capture device, an image processor, and a controller. The robotic arm is coupled to a patient image capture device. The autostereoscopic display is configured to display an image of a surgical site obtained from the patient image capture device. The image processor is configured to identify a location of at least part of a user in an image obtained from the user image capture device. The controller is configured to, in a first mode, adjust a three dimensional aspect of the image displayed on the autostereoscopic display based on the identified location, and, in a second mode, move the robotic arm or instrument based on a relationship between the identified location and the surgical site image.

Methods and systems for touchless control of surgical environment

A method facilitates touchless control of medical equipment devices in an OR. The method involves: providing a three-dimensional control menu, which comprises a plurality of menu items selectable by a practitioner by one or more gestures made in a volumetric spatial region corresponding to the menu item; displaying an interaction display unit (IDU) image corresponding to the three-dimensional control menu to provide indicia of any selected menu items; estimating a line of sight of the practitioner; and, when the estimated line of sight is directed within a first spatial range around a first medical equipment device, determining that the practitioner is looking at the first medical equipment device. The method then involves providing a first device-specific three-dimensional control menu and displaying a first device-specific IDU image.
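The line-of-sight test above reduces to checking whether a device lies within some angular range of the estimated gaze direction. A minimal sketch, where the angular threshold and all names are illustrative assumptions rather than values from the patent:

```python
import numpy as np

def is_looking_at(gaze_origin, gaze_dir, device_pos, angular_range_deg=10.0):
    """True if device_pos lies within angular_range_deg of the line of sight
    starting at gaze_origin along gaze_dir."""
    gaze_dir = np.asarray(gaze_dir, float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    to_device = np.asarray(device_pos, float) - np.asarray(gaze_origin, float)
    to_device = to_device / np.linalg.norm(to_device)
    # Angle between the gaze ray and the direction to the device.
    angle = np.degrees(np.arccos(np.clip(gaze_dir @ to_device, -1.0, 1.0)))
    return angle <= angular_range_deg

# Device nearly straight ahead along +z: within range.
looking = is_looking_at([0, 0, 0], [0, 0, 1], [0.05, 0.0, 2.0])
# Device far off to the side: outside range.
not_looking = is_looking_at([0, 0, 0], [0, 0, 1], [2.0, 0.0, 0.1])
```

A deployed system would additionally smooth the gaze estimate over time and require a dwell period before switching device menus, to avoid spurious selections.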

Medical manipulator system and image display method therefor
11534241 · 2022-12-27

A medical manipulator system includes: an endoscope; a first manipulator equipped with a first treatment tool at a distal end thereof; a second manipulator equipped with a second treatment tool at a distal end thereof; a display for a user to view; and a controller configured to generate an image to be displayed on the display. The controller is configured to: acquire a first image taken by the endoscope, the first image containing the first treatment tool; and, in response to determining that the second treatment tool does not exist in the first image: calculate a relative distance and a relative direction between the first treatment tool and the second treatment tool; generate a second image showing the relative distance and the relative direction between the first treatment tool and the second treatment tool; and send the first image and the second image to the display.
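The relative distance and direction between the two treatment tools, given their tracked tip positions, is straightforward vector arithmetic. A minimal sketch with hypothetical inputs (real tip positions would come from the manipulators' kinematics, not hard-coded coordinates):

```python
import numpy as np

def relative_distance_direction(p_first, p_second):
    """Return (distance, unit direction) from the first tool tip to the second."""
    p_first = np.asarray(p_first, dtype=float)
    p_second = np.asarray(p_second, dtype=float)
    delta = p_second - p_first
    dist = float(np.linalg.norm(delta))
    direction = delta / dist if dist > 0 else np.zeros(3)
    return dist, direction

# Second tool 5 units away, up-and-forward from the first.
d, u = relative_distance_direction([0.0, 0.0, 0.0], [3.0, 0.0, 4.0])
```

The second image in the abstract would render these two quantities, for example as an arrow of length proportional to `d` pointing along `u`, overlaid beside the endoscope view.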