A61B34/37

Machine-learning-based visual-haptic system for robotic surgical platforms

Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
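The annotation step above can be sketched as a small data-preparation routine: each frame of an extracted video segment is paired with one of the predefined force levels before training. This is a minimal illustrative sketch; the names `FORCE_LEVELS`, `VideoSegment`, and `annotate_segment`, and the particular force levels, are assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

# Assumed set of predefined force levels for one tool-tissue interaction.
FORCE_LEVELS = ["none", "low", "medium", "high"]

@dataclass
class AnnotatedFrame:
    frame_id: int
    force_level: str

@dataclass
class VideoSegment:
    interaction: str                      # e.g. "grasp-and-retract" (illustrative)
    frames: list = field(default_factory=list)

def annotate_segment(segment, labels):
    """Attach one predefined force level to each frame of a segment."""
    annotated = []
    for frame_id, level in zip(segment.frames, labels):
        if level not in FORCE_LEVELS:
            raise ValueError(f"unknown force level: {level}")
        annotated.append(AnnotatedFrame(frame_id, level))
    return annotated

seg = VideoSegment("grasp-and-retract", frames=[0, 1, 2])
data = annotate_segment(seg, ["low", "medium", "high"])
```

The resulting `(frame, force level)` pairs would then feed a supervised learning step for that interaction.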

Method of hub communication, processing, display, and cloud analytics

A method of displaying an operational parameter of a surgical system is disclosed. The method includes receiving, by a cloud computing system of the surgical system, first usage data, from a first subset of surgical hubs of the surgical system; receiving, by the cloud computing system, second usage data, from a second subset of surgical hubs of the surgical system; analyzing, by the cloud computing system, the first and the second usage data to correlate the first and the second usage data with surgical outcome data; determining, by the cloud computing system, based on the correlation, a recommended medical resource usage configuration; and displaying, on respective displays on the first and the second subset of surgical hubs, indications of the recommended medical resource usage configuration.
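The analytics step can be sketched as pooling usage records from both hub subsets, scoring each resource-usage configuration by its average surgical outcome, and recommending the best-scoring one. The function name, record shape, and scoring scheme below are illustrative assumptions, not the patent's actual method.

```python
from collections import defaultdict

def recommend_configuration(usage_records):
    """usage_records: iterable of (configuration, outcome_score) pairs.

    Correlates each configuration with its mean outcome score and
    returns the configuration with the highest mean.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for config, score in usage_records:
        totals[config][0] += score
        totals[config][1] += 1
    averages = {c: s / n for c, (s, n) in totals.items()}
    return max(averages, key=averages.get)

# Illustrative usage data from two subsets of surgical hubs.
first_subset = [("config-A", 0.82), ("config-B", 0.71)]
second_subset = [("config-A", 0.88), ("config-B", 0.69)]
best = recommend_configuration(first_subset + second_subset)
```

The recommended configuration would then be displayed on the hubs in both subsets.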

Hand controller for robotic surgery system
11576736 · 2023-02-14

A robotic control system has a wand that emits multiple narrow beams of light. The beams fall on a light sensor array (or, with a camera, on a surface), defining the wand's changing position and attitude, which a computer uses to direct relative motion of robotic tools or remote processes, much as a mouse does but in three dimensions. The system further includes motion compensation means and means for reducing latency.

Control apparatus

A control apparatus detects a misalignment between an irradiation position of a transdermal treatment device and a target irradiation position. When the misalignment is detected, the control unit stops irradiation of the transdermal treatment device or moves the irradiation position closer to the target irradiation position.
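The control rule above can be sketched as one cycle of a simple loop: if the irradiation position drifts beyond a tolerance from the target, either stop irradiation or step the position back toward the target. The tolerance, step size, and one-dimensional model are illustrative assumptions.

```python
TOLERANCE = 1.0   # mm, assumed alignment tolerance
STEP = 0.5        # mm per correction cycle, assumed

def control_step(position, target, allow_reposition=True):
    """Return (new_position, irradiating) after one control cycle."""
    error = target - position
    if abs(error) <= TOLERANCE:
        return position, True                  # aligned: keep irradiating
    if not allow_reposition:
        return position, False                 # misaligned: stop irradiation
    correction = max(-STEP, min(STEP, error))  # move toward the target
    return position + correction, True

# Misalignment with repositioning disabled: irradiation stops.
pos, on = control_step(position=3.0, target=0.0, allow_reposition=False)
```

A repositioning variant, `control_step(3.0, 0.0)`, would instead nudge the position toward the target while irradiation continues.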

Virtual reality training, simulation, and collaboration in a robotic surgical system

A virtual reality system providing a virtual robotic surgical environment, and methods for using the virtual reality system, are described herein. Within the virtual reality system, various user modes enable different kinds of interactions between a user and the virtual robotic surgical environment. For example, one variation of a method for facilitating navigation of a virtual robotic surgical environment includes displaying a first-person perspective view of the virtual robotic surgical environment from a first vantage point, displaying a first window view of the virtual robotic surgical environment from a second vantage point, and displaying a second window view of the virtual robotic surgical environment from a third vantage point. Additionally, in response to a user input associating the first and second window views, a trajectory between the second and third vantage points can be generated, sequentially linking the first and second window views.
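One simple way to generate a trajectory linking two window-view vantage points is linear interpolation between their 3-D positions. This is a sketch under that assumption; the patent does not specify the interpolation scheme, and all coordinates below are illustrative.

```python
def link_vantage_points(a, b, steps):
    """Return a list of waypoints from vantage point a to b, inclusive."""
    return [
        tuple(a[i] + (b[i] - a[i]) * t / steps for i in range(3))
        for t in range(steps + 1)
    ]

# Illustrative second and third vantage points in the virtual environment.
second_vantage = (0.0, 0.0, 0.0)
third_vantage = (2.0, 4.0, 0.0)
path = link_vantage_points(second_vantage, third_vantage, steps=4)
```

Stepping a first-person camera along `path` would carry the user from the first window view's vantage point to the second's.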

Control device and master slave system

Provided is a control device including a control unit that calculates a first positional relationship between an eye of an observer observing an object displayed on a display unit and a first point in a master-side three-dimensional coordinate system, and controls an imaging unit that images the object so that a second positional relationship between the imaging unit and a second point corresponding to the first point in a slave-side three-dimensional coordinate system corresponds to the first positional relationship.
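The mirrored positional relationship can be sketched in vector form: measure the offset from the master-side reference point to the observer's eye, then command the imaging unit to hold the same offset from the corresponding slave-side point. The function name and coordinates are illustrative assumptions; the patent's actual control law may differ.

```python
def target_camera_position(eye, master_point, slave_point):
    """Place the camera so camera-to-slave_point mirrors eye-to-master_point."""
    # First positional relationship: eye relative to the master-side point.
    offset = tuple(e - m for e, m in zip(eye, master_point))
    # Second positional relationship: apply the same offset on the slave side.
    return tuple(s + o for s, o in zip(slave_point, offset))

cam = target_camera_position(eye=(0.1, 0.0, 0.5),
                             master_point=(0.0, 0.0, 0.0),
                             slave_point=(1.0, 1.0, 1.0))
```

Driving the imaging unit toward `cam` keeps the camera's view of the object consistent with how the observer views the display.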

Systems and instruments for tissue sealing

Provided is a robotic system that includes a surgical instrument having an elongate shaft extending between a proximal end and a distal end, a wrist extending from the distal end of the elongate shaft, and an end effector extending from the wrist. The end effector may include a first jaw and a second jaw, the first and second jaws being movable between an open position, in which ends of the jaws are separated from each other, and a closed position, in which the ends of the jaws are closer to each other than in the open position. The surgical instrument may also include at least one rotary cutter extending from the wrist and positioned at least partially within a recess formed in a face of the first jaw.

Controlling a surgical instrument

A control system for regulating operative control of a surgical instrument by a remote surgeon input device. The surgical instrument is supported by an articulated robot arm, and comprises an end effector connected to a shaft by an articulated coupling. The remote surgeon input device is capable of operatively controlling the surgical instrument by controlling articulation of the end effector, and controlling articulation of the robot arm and coupling. On receiving a request to engage operative control of the surgical instrument by the surgeon input device, the control system initially engages operative control of articulation of the robot arm and coupling by the surgeon input device, whilst maintaining disengagement of operative control of articulation of the end effector by the surgeon input device. Subsequently, the control system engages operative control of articulation of the end effector by the surgeon input device following a manipulation of the surgeon input device.
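The staged engagement described above can be sketched as a small state machine: an engage request enables arm-and-coupling articulation control while end-effector control stays disengaged, and a subsequent manipulation of the input device engages end-effector control. Class and method names are illustrative assumptions.

```python
class EngagementController:
    """Two-stage engagement of operative control by the surgeon input device."""

    def __init__(self):
        self.arm_engaged = False            # robot arm and coupling articulation
        self.end_effector_engaged = False   # end effector articulation

    def request_engage(self):
        # Stage 1: engage arm/coupling control only; keep the
        # end effector disengaged.
        self.arm_engaged = True
        self.end_effector_engaged = False

    def on_input_manipulation(self):
        # Stage 2: a manipulation of the input device after engagement
        # engages end-effector control as well.
        if self.arm_engaged:
            self.end_effector_engaged = True

ctrl = EngagementController()
ctrl.request_engage()
stage1 = (ctrl.arm_engaged, ctrl.end_effector_engaged)
ctrl.on_input_manipulation()
stage2 = (ctrl.arm_engaged, ctrl.end_effector_engaged)
```

The two snapshots capture the claimed sequence: arm and coupling first, end effector only after the surgeon manipulates the input device.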