Patent classifications
A61B2090/368
SURGICAL VIRTUAL REALITY USER INTERFACE
A surgical virtual reality user interface generating system comprising a sensor and tracking unit for sensing and tracking a position of a user and generating position data based on movement of the user, and a computing unit for receiving and processing the position data and generating control signals. The system also includes a surgical robot system for receiving the control signals and having a camera assembly for generating image data, and a virtual reality computing unit for generating a virtual reality world. The virtual reality computing unit includes a virtual reality rendering unit for generating an output rendering signal for rendering the image data for display, and a virtual reality object generating unit for generating virtual reality informational objects and emplacing them in the virtual reality world. A display unit is provided for displaying the virtual reality world and the informational objects to the user.
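The data flow described above can be illustrated with a minimal sketch: tracked user positions become control signals for the robot, and the virtual reality computing unit overlays informational objects on the camera's image data. All class and field names here are hypothetical, chosen only to mirror the units named in the abstract, not taken from the patent itself.

```python
from dataclasses import dataclass, field

@dataclass
class SensorTrackingUnit:
    # Illustrative: tracks the last known user position and reports motion.
    last_position: tuple = (0.0, 0.0, 0.0)

    def sense(self, position):
        delta = tuple(p - q for p, q in zip(position, self.last_position))
        self.last_position = position
        return delta  # position data derived from user movement

@dataclass
class ComputingUnit:
    gain: float = 0.5  # assumed scaling of user motion to robot motion

    def to_control_signal(self, delta):
        return tuple(self.gain * d for d in delta)

@dataclass
class VirtualRealityComputingUnit:
    objects: list = field(default_factory=list)

    def add_informational_object(self, label, anchor):
        # Emplace an informational object at an anchor in the VR world.
        self.objects.append({"label": label, "anchor": anchor})

    def render(self, image_data):
        # Output rendering signal: camera image plus emplaced objects.
        return {"image": image_data, "overlays": list(self.objects)}

tracker = SensorTrackingUnit()
computer = ComputingUnit()
vr = VirtualRealityComputingUnit()
vr.add_informational_object("vitals", anchor=(0.1, 0.2, 0.0))

delta = tracker.sense((0.2, 0.0, 0.0))      # user moved along x
signal = computer.to_control_signal(delta)  # scaled robot command
frame = vr.render(image_data="camera_frame_0")
```

In a real system the control signals would drive the surgical robot and the render output would go to the headset; here they are plain values so the flow is easy to follow.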
Control device, control method, and surgical system
The present technology relates to a control device, a control method, and a surgical system that enable an operator to perform operations without burden. A control unit controls a plurality of patterns of operations of a surgical instrument, and an acquisition unit obtains motion information indicating a motion of a user. The control unit then controls the operations of the respective patterns corresponding to the obtained motion information in parallel, using only a single operation performed by the user as a trigger. The present technology can be applied to a control device of a surgical system.
Stereo microscope for use in microsurgical operations on a patient and method for controlling the stereo microscope
A stereo microscope includes a stand; two optical image acquisition units configured to connect to the stand to capture a stereoscopic image, whose two optical axes define an imaging plane; a pair of video glasses including two optical image reproduction units, each having an optical axis and a display for reproducing an image, which together define an image plane, wherein the optical image reproduction units are arranged to produce a stereoscopic image impression and their two optical axes define an image reproduction plane; a detection device configured to determine the spatial orientation of the video glasses, the image reproduction plane, the image plane, and the imaging plane; and a control unit configured to pivot the stand so that the intersection lines of the image plane and the imaging plane on the image reproduction plane are made parallel. Methods are also disclosed.
Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
A surgical system includes an XR headset and an XR headset controller. The XR headset is configured to be worn by a user during a surgical procedure and includes a see-through display screen configured to display an XR image for viewing by the user. The XR headset controller is configured to receive a plurality of two-dimensional (“2D”) image data associated with an anatomical structure of a patient. The XR headset controller is further configured to generate a first 2D image from the plurality of 2D image data based on a pose of the XR headset. The XR headset controller is further configured to generate a second 2D image from the plurality of 2D image data based on the pose of the XR headset. The XR headset controller is further configured to generate the XR image by displaying the first 2D image in a field of view of a first eye of the user and displaying the second 2D image in a field of view of a second eye of the user.
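A minimal sketch of the per-eye image selection described above, under stated assumptions: the 2D image data is a stack of parallel slices, the headset pose reduces to a depth along the stack, and the second eye's view is approximated by a one-slice offset. Every function name and parameter here is an illustration, not the patent's actual method.

```python
def slice_for_pose(image_stack, pose_depth_mm, slice_spacing_mm, eye_offset=0):
    """Pick the slice nearest the pose depth, shifted per eye (assumed model)."""
    index = round(pose_depth_mm / slice_spacing_mm) + eye_offset
    index = max(0, min(len(image_stack) - 1, index))  # clamp to valid slices
    return image_stack[index]

def xr_image_for_pose(image_stack, pose_depth_mm, slice_spacing_mm=1.0):
    # Two slightly different 2D views, one per eye, yield a stereoscopic
    # impression when shown on the see-through display.
    return {
        "left_eye": slice_for_pose(image_stack, pose_depth_mm,
                                   slice_spacing_mm, eye_offset=0),
        "right_eye": slice_for_pose(image_stack, pose_depth_mm,
                                    slice_spacing_mm, eye_offset=1),
    }

stack = [f"slice_{i}" for i in range(10)]
xr = xr_image_for_pose(stack, pose_depth_mm=4.2)
```

A production system would derive the eye views from the full 6-DoF headset pose rather than a single depth value; the sketch only shows that both eye images come from the same 2D data, keyed by pose.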
Technique of Providing User Guidance For Obtaining A Registration Between Patient Image Data And A Surgical Tracking System
A method of providing user guidance. First patient image data of a patient's body is obtained. A registration instruction indicative of where to acquire a registration point relative to a surface of the body is determined. Second patient image data of the body, having been acquired by an augmented reality device, is obtained. A transformation between coordinate systems of the first and the second patient image data is determined. Based on the transformation, display of the registration instruction on a display of the AR device is triggered such that a user of the AR device is presented with an augmented view in which the registration instruction is overlaid onto the patient's body. The augmented view guides the user where to acquire the registration point. Also disclosed are a computing system, a surgical navigation system, and a computer program product.
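The core geometric step above can be sketched as applying a rigid transformation to carry a point from the first image's coordinate system into the AR device's coordinate system, where it can be drawn over the live view. This is a simplified illustration with made-up values; the patent does not specify how the transformation is computed.

```python
def apply_transform(point, rotation, translation):
    """Apply a rigid transform (3x3 rotation matrix + translation vector)
    to a 3D point, mapping it between coordinate systems."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Registration instruction defined in the first (preoperative) image's
# coordinate system; the transform maps it into the AR device's system
# so the "acquire a registration point here" marker lands on the body.
instruction_point = (10.0, 0.0, 0.0)
rotation = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity for the example
translation = (0.0, 5.0, 0.0)
overlay_point = apply_transform(instruction_point, rotation, translation)
```

With the identity rotation the mapping is a pure shift; a real registration would estimate both rotation and translation from the two patient image data sets.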
Augmented Reality Display Systems for Fitting, Sizing, Trialing and Balancing of Virtual Implant Components on the Physical Joint of the Patient
Devices and methods for performing a surgical step or surgical procedure with visual guidance using an optical head mounted display are disclosed.
Virtual reality surgical system including a surgical tool assembly with haptic feedback
Virtual reality (VR) surgical systems with haptic feedback are described herein that can be used to simulate several instruments and surgeries. The instruments can include a tool assembly; first and second brackets, each having first shafts rotatably coupled at first and second end regions of the tool assembly and second shafts rotatably coupled to first and second robotic arms; a surgical tool assembly coupled to an end portion of the tool assembly and having an elongated member that extends within the tool assembly; an elongated member position sensor assembly configured to provide position information on the elongated member to a computing unit; and an elongated member force feedback assembly housed within a cavity of the tool assembly and coupled to the elongated member. The elongated member force feedback assembly is configured to provide haptic feedback to the user of the instrument.
METHOD AND SYSTEM FOR FACILITATING REMOTE PRESENTATION OR INTERACTION
A facilitation system for facilitating remote presentation of a physical world includes a first object and an operating environment of the first object. The facilitation system includes a processing system configured to obtain an image frame depicting the physical world, identify a depiction of the first object in the image frame, and obtain a first spatial registration registering an object model with the first object in the physical world. The object model is of the first object. The processing system is further configured to obtain an updated object model corresponding to the object model updated with a current state of the first object, and generate a hybrid frame using the image frame, the first spatial registration, and the updated object model. The hybrid frame includes the image frame with the depiction of the first object replaced by a depiction of the updated object model.
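The hybrid-frame generation described above can be sketched as a per-pixel composite: wherever the image frame depicts the first object, substitute pixels from a rendering of the updated object model, positioned via the spatial registration. The mask-based approach and all names below are illustrative assumptions, reduced to tiny label grids so the substitution is visible.

```python
def generate_hybrid_frame(image_frame, object_mask, rendered_model):
    """Compose a hybrid frame: image_frame and rendered_model are 2D grids
    of pixel labels; object_mask is True where the object is depicted.
    The depiction of the object is replaced by the updated model's pixels."""
    rows = len(image_frame)
    cols = len(image_frame[0])
    return [
        [rendered_model[r][c] if object_mask[r][c] else image_frame[r][c]
         for c in range(cols)]
        for r in range(rows)
    ]

frame = [["bg", "obj"], ["obj", "bg"]]   # captured image of the physical world
mask = [[False, True], [True, False]]    # identified depiction of the object
model = [["m", "m"], ["m", "m"]]         # rendering of the updated object model
hybrid = generate_hybrid_frame(frame, mask, model)
```

In practice the mask would come from identifying the object in the image frame, and the rendering from projecting the registered, state-updated model into the camera's viewpoint.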
Device For Navigating A Medical Instrument Relative To A Patient Anatomy
The present invention relates to a device for navigating a medical instrument relative to a patient anatomy, a method for navigating a medical instrument relative to a patient anatomy, and a program element which, when executed by a computer, carries out this method. The device comprises a position determination unit, a computing unit and a navigation display. The position determination unit comprises a sensor module configured to acquire current 3D data of the patient anatomy. The position determination unit further comprises a position sensor which is configured to acquire current movement data of the medical instrument. The computing unit is configured to match the current 3D data of the patient anatomy and the current movement data of the medical instrument with preoperative image data of the patient anatomy and, on this basis, to calculate navigation information for the medical instrument. The navigation display is configured to show the calculated navigation information.
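A minimal sketch of the matching-and-navigation step described above, under strong simplifying assumptions: the registration between intraoperative and preoperative data is reduced to a pure translation, and the navigation information is the remaining vector from the instrument tip to a planned target. All names and values are hypothetical.

```python
def navigation_info(instrument_tip, target, registration_offset):
    """Express the instrument tip in the preoperative image frame via a
    translational registration (assumed), then report the remaining
    direction and distance to a target planned in that frame."""
    tip_in_preop = tuple(t + o for t, o in zip(instrument_tip, registration_offset))
    delta = tuple(g - t for g, t in zip(target, tip_in_preop))
    distance = sum(d * d for d in delta) ** 0.5
    return {"direction": delta, "distance_mm": distance}

info = navigation_info(
    instrument_tip=(0.0, 0.0, 0.0),       # from the position sensor
    registration_offset=(1.0, 0.0, 0.0),  # from matching 3D data to preop data
    target=(4.0, 4.0, 0.0),               # planned in the preoperative image
)
```

The returned direction and distance are the kind of navigation information the display would show; a real system would use a full rigid (or deformable) registration rather than a single offset.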