
Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display

A system includes a robotic arm, an autostereoscopic display, a user image capture device, an image processor, and a controller. The robotic arm is coupled to a patient image capture device. The autostereoscopic display is configured to display an image of a surgical site obtained from the patient image capture device. The image processor is configured to identify a location of at least part of a user in an image obtained from the user image capture device. The controller is configured to, in a first mode, adjust a three-dimensional aspect of the image displayed on the autostereoscopic display based on the identified location, and, in a second mode, move the robotic arm or instrument based on a relationship between the identified location and the surgical site image.
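
The abstract does not specify how the two modes consume the tracked location. A minimal sketch of the dispatch, assuming a 2D head offset, a `parallax_gain` for the display mode, and a `motion_gain` for the arm mode (all names and gains hypothetical):

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    x: float  # lateral offset of the tracked user from display centre, metres
    y: float  # vertical offset, metres

def control_step(mode: str, head: HeadPose,
                 parallax_gain: float = 0.05, motion_gain: float = 0.5):
    """Route the identified user location to the active mode.

    "display": shift the autostereoscopic parallax so the 3D effect
    follows the viewer.  "motion": derive an arm velocity command from
    the user's offset relative to the displayed surgical-site image.
    """
    if mode == "display":
        return ("parallax_shift", head.x * parallax_gain, head.y * parallax_gain)
    if mode == "motion":
        return ("arm_velocity", head.x * motion_gain, head.y * motion_gain)
    raise ValueError(f"unknown mode: {mode}")
```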

DETECTION OF SURGICAL TABLE MOVEMENT FOR COORDINATING MOTION WITH ROBOTIC MANIPULATORS

A position sensor, such as an IMU, is removably positioned on a patient bed used to support a patient during a robotic surgical procedure in which a robotic manipulator is used to manipulate a surgical instrument. When the bed is moved during the course of surgery, signals corresponding to a sensed change in the bed's position are received by a processor, which causes a corresponding repositioning of the robotic manipulator.
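
The compensation itself reduces to composing the sensed bed displacement into the manipulator's base pose so the instrument holds station relative to the patient. A minimal sketch, assuming pure translation in metres (the patent would also cover rotation, omitted here):

```python
def reposition_manipulator(bed_delta, base_pose):
    """Apply the bed translation sensed by the IMU to the manipulator
    base pose so the instrument keeps its position relative to the
    patient.  Both arguments are (x, y, z) tuples in metres."""
    return tuple(p + d for p, d in zip(base_pose, bed_delta))
```

A full implementation would integrate IMU acceleration and angular rate into a rigid-body transform rather than a bare offset.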

Methods and systems for touchless control of surgical environment

A method facilitates touchless control of medical equipment devices in an operating room (OR). The method involves: providing a three-dimensional control menu comprising a plurality of menu items, each selectable by the practitioner via one or more gestures made in a volumetric spatial region corresponding to that menu item; displaying an interaction display unit (IDU) image corresponding to the three-dimensional control menu to provide indicia of any selected menu items; estimating a line of sight of the practitioner; and, when the estimated line of sight is directed within a first spatial range around a first medical equipment device, determining that the practitioner is looking at the first medical equipment device. The method then involves providing a first device-specific three-dimensional control menu and displaying a first device-specific IDU image.
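
The gaze test amounts to checking whether the estimated line of sight falls within an angular range around a device. A minimal 2D sketch, with device positions, the angular threshold, and the menu-naming convention all hypothetical:

```python
import math

def device_in_sight(gaze_origin, gaze_dir, device_pos, angular_range_deg):
    """True if the line of sight from gaze_origin along gaze_dir passes
    within angular_range_deg of the device position."""
    vx = device_pos[0] - gaze_origin[0]
    vy = device_pos[1] - gaze_origin[1]
    cos_a = (vx * gaze_dir[0] + vy * gaze_dir[1]) / (
        math.hypot(vx, vy) * math.hypot(*gaze_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))) <= angular_range_deg

def select_menu(gaze_origin, gaze_dir, devices, angular_range_deg=10.0):
    """Return the device-specific menu for the first device the
    practitioner is looking at, or None."""
    for name, pos in devices.items():
        if device_in_sight(gaze_origin, gaze_dir, pos, angular_range_deg):
            return f"{name}-menu"
    return None
```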

Use of eye tracking for tool identification and assignment in a robotic surgical system
11690677 · 2023-07-04

A robotic surgical system includes an eye gaze sensing system in conjunction with a visual display of a camera image from a surgical work site. Detected gaze of a surgeon towards the display is used as input to the system. This input may be used by the system to assign an instrument to a control input device (when the user is prompted to look at the instrument), or it may be used as input to a computer vision algorithm to aid in object differentiation and to seed information, facilitating identification/differentiation of instruments, anatomical features, or regions.
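
The assignment step can be read as a nearest-neighbour lookup: the instrument whose on-screen position is closest to the gaze point gets bound to the control input. A minimal sketch, with the pixel threshold and instrument map hypothetical:

```python
def assign_instrument(gaze_px, instruments, max_dist_px=80.0):
    """Return the id of the on-screen instrument nearest the surgeon's
    gaze point, or None if no instrument lies within max_dist_px.

    gaze_px: (x, y) gaze point in display pixels.
    instruments: {instrument_id: (x, y)} on-screen positions.
    """
    best, best_d = None, max_dist_px
    for inst_id, (x, y) in instruments.items():
        d = ((x - gaze_px[0]) ** 2 + (y - gaze_px[1]) ** 2) ** 0.5
        if d <= best_d:
            best, best_d = inst_id, d
    return best
```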

Extended reality AR/VR system
11691073 · 2023-07-04

A system includes a mobile device having one or more cameras to take images; a sensor detecting reflected light from one or more lasers and a diffuser to detect object range or dimension; code for motion tracking, environmental understanding by detecting planes in an environment, and estimating light and dimensions of the surroundings based on the one or more lasers; code to estimate a three-dimensional (3D) volume of an object from multiple perspectives and from projected laser beams, to measure size or scale, and to determine locations of points on the object's surface in a plane or slice using time-of-flight, wherein positions and cross-sections for different slices are correlated to construct a 3D model of the object, including object position and shape; and the device receiving a user request to select content from one or more augmented, virtual, or extended reality contents and rendering a reality view of the environment.
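
The slice-correlation step is essentially a stacking of per-plane cross-sections into one point cloud. A minimal sketch of that idea, assuming evenly spaced slices and 2D contour points from the time-of-flight measurement (the real system would register slices against each other rather than assume fixed spacing):

```python
def stack_slices(slices, slice_spacing):
    """Build a crude 3D point-cloud model from per-slice contours.

    slices: list of contours, each a list of (x, y) surface points
    measured by time-of-flight in one plane.
    slice_spacing: distance between adjacent planes, metres.
    """
    points = []
    for i, contour in enumerate(slices):
        z = i * slice_spacing  # assign depth from slice index
        points.extend((x, y, z) for x, y in contour)
    return points
```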

SYSTEMS AND METHODS FOR IDENTIFYING AND FACILITATING AN INTENDED INTERACTION WITH A TARGET OBJECT IN A SURGICAL SPACE

An exemplary system includes a memory storing instructions and a processor communicatively coupled to the memory. The processor may be configured to execute the instructions to: detect an intent of a user of a computer-assisted surgical system to use a robotic instrument attached to the computer-assisted surgical system to interact with a target object while the target object is located in a surgical space; determine a pose of the target object in the surgical space; and perform, based on the detected intent of the user to interact with the target object and the determined pose of the target object in the surgical space, an operation with respect to the target object.

SURGICAL VIRTUAL REALITY USER INTERFACE
20220387128 · 2022-12-08

A surgical virtual reality user interface generating system comprises a sensor and tracking unit for sensing and tracking a position of a user and generating position data based on movement of the user, and a computing unit for receiving and processing the position data and generating control signals. The system also includes a surgical robot system for receiving the control signals and having a camera assembly for generating image data, and a virtual reality computing unit for generating a virtual reality world. The virtual reality computing unit includes a virtual reality rendering unit for generating an output rendering signal for rendering the image data for display, and a virtual reality object generating unit for generating virtual reality informational objects and emplacing them in the virtual reality world. A display unit is provided for displaying the virtual reality world and the informational objects to the user.

DEVICE AND METHOD OF PREDICTING USE INSTRUMENT, AND SURGERY ASSISTING ROBOT
20220387116 · 2022-12-08

A use-instrument predicting device includes a motion recognizing module that recognizes a motion of a surgeon during a surgical operation based on motion detection data obtained by detecting the surgeon's motion; a situation recognizing module that recognizes a surgery situation based on the motion recognized by the motion recognizing module; and a predicting module that predicts, based on the situation recognized by the situation recognizing module, at least one kind of surgical instrument to be used next by the surgeon out of a plurality of kinds of surgical instruments given beforehand.
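
The abstract leaves the prediction model open. A minimal sketch of one plausible baseline, a per-situation frequency table (the situation labels and instrument names are hypothetical):

```python
from collections import Counter, defaultdict

class InstrumentPredictor:
    """Count which instrument followed each recognized surgical
    situation, and predict the most frequent one."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def observe(self, situation, next_instrument):
        """Record that next_instrument was used after this situation."""
        self.counts[situation][next_instrument] += 1

    def predict(self, situation):
        """Most frequently observed next instrument, or None if the
        situation has never been seen."""
        if not self.counts[situation]:
            return None
        return self.counts[situation].most_common(1)[0][0]
```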

Control device, control method, and surgical system
11517395 · 2022-12-06

The present technology relates to a control device, a control method, and a surgical system that enable an operator to operate without burden. A control unit controls a plurality of patterns of operations of a surgical instrument, and an acquisition unit obtains motion information indicating a motion of a user. The control unit then controls the operations of the respective patterns corresponding to the obtained motion information in parallel, using only a single operation performed by the user as a trigger. The present technology can be applied to a control device of a surgical system.
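
The core idea, one user action fanning out to several operation patterns at once, can be sketched as a single dispatch over registered pattern handlers (the pattern names and handlers are hypothetical):

```python
def on_trigger(motion_info, patterns):
    """Fan a single user motion out to every registered operation
    pattern and collect each pattern's command.

    patterns: {pattern_name: handler}, where each handler maps the
    motion information to a command for the surgical instrument.
    """
    return {name: handler(motion_info) for name, handler in patterns.items()}
```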

Interchangeable input handles for a surgeon console of a robotic surgical system

A user interface for a surgical robotic system includes a plurality of handles, each removably attachable to a user interface assembly by a quick-release connector. The selection of handles can include handles of varying size and degree of complexity, handles adapted for laparoscopic motion, handles adapted for true Cartesian motion, or handles customized to surgeon anthropometric data.