G05B2219/36432

Neural monitor-based dynamic haptics

A computer-assisted surgery system may have a robotic arm including a surgical tool and a processor communicatively connected to the robotic arm. The processor may be configured to receive, from a neural monitor, a signal indicative of a distance between the surgical tool and a portion of a patient's anatomy including nervous tissue. The processor may be further configured to generate a command for altering a degree to which the robotic arm resists movement based on the signal received from the neural monitor, and to send the command to the robotic arm.
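The distance-dependent resistance described above can be sketched as a mapping from the neural monitor's distance signal to a normalized damping command for the robotic arm. The function name, thresholds, and linear ramp below are illustrative assumptions, not the patent's actual implementation:

```python
def resistance_command(distance_mm: float,
                       min_dist_mm: float = 2.0,
                       max_dist_mm: float = 20.0,
                       max_damping: float = 1.0) -> float:
    """Map tool-to-nerve distance to a normalized damping level.

    Closer than min_dist_mm -> full resistance (max_damping);
    farther than max_dist_mm -> no added resistance (0.0);
    linear ramp in between.
    """
    if distance_mm <= min_dist_mm:
        return max_damping
    if distance_mm >= max_dist_mm:
        return 0.0
    # Resistance grows linearly as the tool approaches the nerve.
    frac = (max_dist_mm - distance_mm) / (max_dist_mm - min_dist_mm)
    return max_damping * frac
```

A real controller would convert this normalized level into joint-space damping gains; the clamped linear ramp simply keeps the command bounded and monotone in distance.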

Device for assisting with the handling of an instrument or tool

A device for assisting with the handling of an instrument or tool. The device comprises a jointed mechanical structure on a support, to which an instrument or tool may be attached; motor drives configured to actuate the jointed mechanical structure according to a number of degrees of freedom less than that which the structure provides to the instrument or tool; and an automatic control that drives the motor drives to help meet a constraint on position and/or velocity parameters of the instrument or tool, a constraint that the motor drives by themselves, independently of handling by an operator, cannot entirely meet.

SYNTHETIC REPRESENTATION OF A SURGICAL ROBOT

A synthetic representation of a robot tool for display on a user interface of a robotic system. The synthetic representation may be used to show the position of a view volume of an image capture device with respect to the robot. The synthetic representation may also be used to find a tool that is outside of the field of view, to display range of motion limits for a tool, to remotely communicate information about the robot, and to detect collisions.

Surgical guidance system and method with acoustic feedback

A surgical system includes a surgical tool, a tracking system configured to obtain tracking data indicative of positions of the surgical tool relative to an anatomical feature, an acoustic device, and a computer system programmed to control the acoustic device to provide acoustic feedback to a user based on the tracking data.
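One common way to turn tracking distance into acoustic feedback is a proximity-paced beep, with beeps arriving faster as the tool nears the anatomical feature. The mapping below is a hypothetical sketch (function name, ranges, and linearity are all assumptions, not the patent's method):

```python
def beep_interval_s(distance_mm: float,
                    near_mm: float = 1.0,
                    far_mm: float = 30.0,
                    fastest_s: float = 0.1,
                    slowest_s: float = 1.0) -> float:
    """Return the pause between beeps: shorter (faster beeping) as the
    tracked tool approaches the target anatomy."""
    # Clamp the measured distance into the feedback range.
    d = min(max(distance_mm, near_mm), far_mm)
    frac = (d - near_mm) / (far_mm - near_mm)
    return fastest_s + frac * (slowest_s - fastest_s)
```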

Method for modifying the rendering of a region of a 3D scene in an immersive environment
11430173 · 2022-08-30

A computer-implemented method for modifying the rendering of a region of a 3D scene in an immersive environment, the region being computed based on a 3D position of a head tracking device and a 3D position of at least one hand tracking device.
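A minimal sketch of a region computed from the two tracked positions: here the region is a sphere centered on the hand tracker, with a radius that scales with the head-to-hand distance. This heuristic is purely illustrative; the patent does not disclose this particular computation:

```python
import math

def region_of_interest(head_pos, hand_pos, scale: float = 0.25):
    """Return (center, radius) of a spherical region in the 3D scene,
    derived from head and hand tracker positions.

    Hypothetical heuristic: center the region on the hand, and let the
    radius grow with the head-to-hand distance.
    """
    radius = scale * math.dist(head_pos, hand_pos)
    return hand_pos, radius
```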

Neural monitor-based dynamic haptics

A surgical system includes a robotic device, and a surgical tool coupled to the robotic device and comprising a distal end. The system further includes a neural monitor configured to generate an electrical signal and apply the electrical signal to the distal end of the surgical tool, wherein the electrical signal causes innervation of a first portion of a patient's anatomy which generates an electromyographic signal, and a sensor configured to measure the electromyographic signal. The neural monitor is configured to determine a distance between the distal end of the surgical tool and a portion of nervous tissue based on the electrical signal and the electromyographic signal, and cause feedback to be provided to a user based on the distance.
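In nerve-monitoring practice, the stimulation current needed to evoke an EMG response tends to grow with tool-to-nerve distance, so distance can be estimated by inverting a threshold model. The quadratic model and constants below are illustrative assumptions only; the abstract does not specify a model:

```python
import math

def estimate_nerve_distance_mm(threshold_ma: float,
                               i0_ma: float = 2.0,
                               k: float = 0.5) -> float:
    """Estimate tool-to-nerve distance from the stimulation threshold
    current by inverting a simple quadratic model I = i0 + k * d**2.

    threshold_ma: lowest stimulation current (mA) that evoked an EMG
    response at the tool's distal end. Model and constants are
    hypothetical placeholders.
    """
    if threshold_ma <= i0_ma:
        return 0.0  # at or below the contact-level threshold
    return math.sqrt((threshold_ma - i0_ma) / k)
```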

SURGICAL SYSTEM WITH PASSIVE AND MOTORIZED JOINTS
20210379773 · 2021-12-09

A method includes obtaining an implant plan, defining a range of motion for a surgical tool based on the implant plan, adjusting, by an actuator and based on the range of motion, a passive joint coupled between the actuator and the surgical tool, and allowing manual movement of the surgical tool through the range of motion via rotation at the passive joint.

3D POSITION AND ORIENTATION CALCULATION AND ROBOTIC APPLICATION STRUCTURE USING INERTIAL MEASURING UNIT (IMU) AND STRING-ENCODER POSITION SENSORS
20220193919 · 2022-06-23

A 3D position and orientation calculation and robotic application structure using an inertial measuring unit (IMU) and string-encoder position sensors.

Method for controlling an industrial robot during lead-through programming of the robot and an industrial robot

An industrial robot having a manipulator and a robot controller configured to control the motions of the manipulator. The robot controller is configured, during lead-through programming of the robot, to compare a robot position or a robot orientation (TCP) with at least one virtual position or virtual orientation defined in space, and to actively control the motions of the robot in relation to the at least one virtual position or virtual orientation when the difference between the robot position or robot orientation and the at least one virtual position or virtual orientation is smaller than an offset value.
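The activation logic above, free hand-guiding until the TCP comes within an offset of a virtual target, can be sketched as a corrective-velocity rule. Function name, gain, and the linear pull are assumptions for illustration, not the patent's controller:

```python
import math

def leadthrough_correction(tcp, virtual_pos, offset: float, gain: float = 0.5):
    """Return a corrective velocity (vx, vy, vz) pulling the TCP toward
    the virtual position, engaged only once the TCP is within `offset`
    of it. Outside that zone the operator guides the robot freely."""
    d = math.dist(tcp, virtual_pos)
    if d >= offset or d == 0.0:
        return (0.0, 0.0, 0.0)  # no active control outside the zone
    # Simple proportional pull toward the virtual position.
    return tuple(gain * (v - t) for v, t in zip(virtual_pos, tcp))
```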

SYNTHETIC REPRESENTATION OF A SURGICAL ROBOT

A system comprises a first robotic arm adapted to support and move a tool and a second robotic arm adapted to support and move a camera configured to capture an image of a camera field of view. The system further comprises an input device, a display, and a processor. The processor is configured to display a first synthetic image including a first synthetic image of the tool. The first synthetic image of the tool includes a portion of the tool outside of the camera field of view. The processor is also configured to receive a user input at the input device and responsive to the user input, change the display of the first synthetic image to a display of a second synthetic image including a second synthetic image of the tool that is different from the first synthetic image of the tool.
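Displaying the portion of a tool that lies outside the camera field of view presupposes a containment test against the camera's view volume. A minimal sketch, modeling the view volume as a cone (a simplification; real systems use a full frustum, and every name here is an assumption):

```python
import math

def in_view(point, cam_pos, cam_dir, half_angle_rad: float) -> bool:
    """True if `point` lies inside a cone-shaped camera view volume.

    cam_dir is assumed to be a unit vector along the camera axis.
    """
    v = tuple(p - c for p, c in zip(point, cam_pos))
    n = math.sqrt(sum(x * x for x in v))
    if n == 0.0:
        return True  # point coincides with the camera origin
    # Compare the angle off the camera axis with the cone half-angle.
    cos_angle = sum(a * b for a, b in zip(v, cam_dir)) / n
    return cos_angle >= math.cos(half_angle_rad)
```

Sampling points along a tool model with such a test identifies which parts to render only synthetically because the camera cannot see them.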