
ROBOT SYSTEM

A robot system (100) of the present disclosure includes a robot (101) installed in a work area (201), an interface (102), a display (105), and a control device (111). When operating the robot (101) to perform a predefined kind of work on a workpiece (300) based on manipulation command information for the robot (101) input from the interface (102), the control device (111) displays on the display (105) the spatial relationship between the workpiece (300) and the robot (101) as seen from a direction different from the direction in which an operator looks at the robot (101) from a manipulation area (202), which is a space separate from the work area (201), based on three-dimensional model information on the workpiece (300), three-dimensional model information on the robot (101), and the manipulation command information.
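The abstract's core idea, rendering the robot and workpiece models from a viewpoint other than the operator's line of sight, might be sketched as follows. This is an illustrative sketch only; the camera placement, focal length, and model points are hypothetical, not taken from the patent:

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 0.0, 1.0])):
    """Build a 3x4 view matrix for a camera at `eye` looking toward `target`."""
    f = target - eye
    f = f / np.linalg.norm(f)                        # forward
    r = np.cross(f, up); r = r / np.linalg.norm(r)   # right
    u = np.cross(r, f)                               # true up
    R = np.stack([r, u, -f])                         # world -> camera rotation
    t = -R @ eye
    return np.hstack([R, t[:, None]])

def project_points(points, view, focal=500.0):
    """Project world-space 3D model points into 2D display coordinates."""
    cam = (view[:, :3] @ points.T).T + view[:, 3]
    z = -cam[:, 2]                                   # depth along the view axis
    return focal * cam[:, :2] / z[:, None]

# Hypothetical model points: robot tool tip and workpiece position.
model_pts = np.array([[0.0, 0.0, 0.5],
                      [0.5, 0.0, 0.0]])
# A side view, deliberately different from the operator's own line of sight.
side_view = look_at(eye=np.array([0.0, 2.0, 0.5]),
                    target=np.array([0.25, 0.0, 0.25]))
px = project_points(model_pts, side_view)
```

The same model data can be re-projected from any viewpoint, which is what lets the display show depth relationships the operator cannot see directly from the manipulation area.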

ROBOT SYSTEM AND METHOD OF FORMING THREE-DIMENSIONAL MODEL OF WORKPIECE

A robot system includes a robot installed in a work area and controlled by a second control device, a 3D camera operated by an operator, a sensor that is disposed in a manipulation area that is a space different from the work area, and wirelessly detects position information and posture information on the 3D camera, a display, and a first control device. The first control device acquires image information on a workpiece imaged by the 3D camera, acquires, from the sensor, the position information and the posture information when the workpiece is imaged by the 3D camera, displays the acquired image information on the display, forms a three-dimensional model of the workpiece based on the image information, and the acquired position information and posture information, displays the formed three-dimensional model on the display, and outputs first data that is data of the formed three-dimensional model to the second control device.
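The model-forming step, combining camera-frame measurements with the position and posture reported by the sensor, amounts to transforming each partial scan into a common work-area frame. A minimal sketch, with entirely hypothetical scan points and poses:

```python
import numpy as np

def camera_points_to_world(points_cam, position, rotation):
    """Transform 3D points measured in the camera frame into the common
    frame, using the camera position and posture from the sensor."""
    return (rotation @ points_cam.T).T + position

# Two hypothetical partial scans of the workpiece from different camera poses.
scan_a = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])   # camera-frame points
pose_a = (np.array([0.0, 0.0, 0.0]), np.eye(3))

# Second pose: camera translated and yawed 180 degrees about z.
Rz180 = np.array([[-1.0, 0.0, 0.0],
                  [0.0, -1.0, 0.0],
                  [0.0,  0.0, 1.0]])
scan_b = np.array([[0.0, 0.0, 1.0]])
pose_b = (np.array([0.0, 0.0, 2.0]), Rz180)

# Fuse both scans into one point cloud, the raw material of the 3D model.
cloud = np.vstack([camera_points_to_world(s, p, R)
                   for s, (p, R) in [(scan_a, pose_a), (scan_b, pose_b)]])
```

Because each scan carries its own pose, points captured from different positions land consistently in one cloud, which is why the sensor's position and posture information is acquired at the moment of imaging.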

Teleoperated Robotic System with Impact Responsive Force Feedback
20220410367 · 2022-12-29 ·

A teleoperated robotic system that includes master control arms, slave arms, and a mobile platform. In use, a user manipulates the master control arms to control movement of the slave arms. The teleoperated robotic system can include two master control arms and two slave arms. The master control arms and the slave arms can be mounted on the platform. The platform can provide support for the master control arms and for a teleoperator, or user, of the robotic system. Thus, a mobile platform can allow the robotic system to be moved from place to place to locate the slave arms in a position for use. Additionally, the user can be positioned on the platform, such that the user can see and hear, directly, the slave arms and the workspace in which the slave arms operate.

SURGICAL ROBOT AND CONTROL METHOD OF SURGICAL ROBOT

A surgical robot includes a controller configured or programmed to control a display to superimpose, on an image captured by an imaging device, guide information indicating a moving direction of a medical cart based on a steering angle of a steering device, and to display the guide information.

Surgical equipment control input visualization field

A control console includes a first input control operable by an operator and outside of a field of view of the operator, a display within the field of view of the operator, and a first image capture device. The first image capture device acquires a first image of a physical environment surrounding the first input control and at least part of the first input control. The first image is output on the display. In some embodiments, a second image of a work site may also be displayed, so that the operator can locate and operate the first input control while continuously viewing the second image of the work site.

Remotely operated pneumatic manipulator based on kinect
11491657 · 2022-11-08 ·

The invention discloses a remotely operated pneumatic manipulator based on Kinect, comprising a Kinect sensor, a computer, a D/A embedded board, a PWM piezoelectric pneumatic ratio valve, a pneumatic triad, an air compressor, artificial muscles, springs, and finger joints, wherein the Kinect sensor is provided on one side of the finger joints and a camera module of the Kinect sensor faces the finger joints. The pneumatic humanoid manipulator of the invention has essentially the same dimensions as a human hand and achieves human-computer interaction and remote operation. Its transmission structure is novel, simple, and compact; the fingers are easy to control and flexible to move, with a large range of motion for wide application. Moreover, the PWM piezoelectric pneumatic ratio valve offers fast dynamic response, low cost, and strong noise resistance, and the Kinect sensor provides high detection accuracy.

HAPTIC USER INTERFACE FOR ROBOTICALLY CONTROLLED SURGICAL INSTRUMENTS

A powered user interface for a robotic surgical system operates in a mode in which the actuators permit pitch and yaw motion of the handle constrained with respect to a virtual fulcrum in a work space of the user interface, while insertion motion is constrained along an axis passing through the virtual fulcrum. In a virtual-fulcrum setting mode, the user is prompted to give input to the system selecting a desired point in space for the virtual fulcrum. The selected point in space is then set as the virtual fulcrum.
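The fulcrum constraint can be illustrated by decomposing a raw handle displacement into the two permitted components: motion along the axis through the fulcrum (insertion) and motion tangential to it (pitch/yaw). A simplified sketch, not the patent's implementation, with hypothetical positions:

```python
import numpy as np

def insertion_component(handle_pos, delta, fulcrum):
    """Keep only the part of a raw displacement that lies along the
    axis passing through the virtual fulcrum (insertion motion)."""
    axis = handle_pos - fulcrum
    axis = axis / np.linalg.norm(axis)
    return np.dot(delta, axis) * axis

def pitch_yaw_component(handle_pos, delta, fulcrum):
    """Remove the insertion (radial) part, leaving only pitch/yaw
    motion about the virtual fulcrum."""
    return delta - insertion_component(handle_pos, delta, fulcrum)

# Handle 0.2 m above a fulcrum at the origin; raw commanded displacement.
handle = np.array([0.0, 0.0, 0.2])
fulcrum = np.zeros(3)
raw = np.array([0.01, 0.0, 0.03])
ins = insertion_component(handle, raw, fulcrum)   # -> [0, 0, 0.03]
yaw = pitch_yaw_component(handle, raw, fulcrum)   # -> [0.01, 0, 0]
```

Depending on the active mode, the actuators would pass through one component and resist the other, which is how the handle motion stays consistent with a fixed remote center of motion.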

SYSTEMS, DEVICES, AND METHODS FOR DEVELOPING ROBOT AUTONOMY
20230122611 · 2023-04-20 ·

A method of operation of a robot includes determining a set of candidate actions to be performed by the robot based on an objective. A level of autonomy of the robot is determined from a control model associated with the robot. A subset of candidate actions for which the level of autonomy of the robot is below a threshold level of autonomy is determined from the set of candidate actions. The robot receives a set of instructions for at least one candidate action in the subset of candidate actions from a tele-operation system. The robot executes the set of instructions and updates the control model based on a result of executing the set of instructions.
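The filtering and model-update steps described above might be sketched as follows. The action names, autonomy scores, and learning-rate update are hypothetical placeholders for whatever control model the system actually maintains:

```python
def select_actions_needing_teleop(candidates, control_model, threshold):
    """Return the subset of candidate actions for which the robot's
    level of autonomy (per the control model) is below the threshold."""
    return [a for a in candidates if control_model.get(a, 0.0) < threshold]

# Hypothetical per-action autonomy scores in [0, 1].
control_model = {"grasp": 0.9, "insert": 0.4, "align": 0.2}
needs_help = select_actions_needing_teleop(
    ["grasp", "insert", "align"], control_model, threshold=0.5)
# needs_help -> ["insert", "align"]: instructions for these are requested
# from the tele-operation system.

def update_control_model(control_model, action, success, lr=0.1):
    """Nudge the autonomy score for an action toward 1.0 (success) or
    0.0 (failure) after executing the received instructions."""
    target = 1.0 if success else 0.0
    control_model[action] += lr * (target - control_model[action])
    return control_model
```

Repeated successful executions push an action's score above the threshold, so the robot gradually stops requesting tele-operation for it, which is the autonomy-development loop the abstract describes.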

SYSTEMS, DEVICES, AND METHODS FOR DEVELOPING ROBOT AUTONOMY
20220324115 · 2022-10-13 ·

In a method of operation of a robot, the robot identifies a set of candidate actions that may be performed by the robot, and collects, for each candidate action of the set of candidate actions, a respective set of ancillary data. The robot transmits a request for instructions to a tele-operation system that is communicatively coupled to the robot. The request for instructions includes each candidate action and each respective set of ancillary data. The robot receives, and executes, the instructions from the tele-operation system. The robot updates a control model, based at least in part on each candidate action, each respective set of ancillary data, and the instructions, to increase a level of autonomy of the robot. The robot may transmit the request for instructions to the tele-operation system in response to determining the robot is unable to select a candidate action to perform in furtherance of an objective.
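The request for instructions, each candidate action bundled with its ancillary data, could take a shape like the following. The field names and example payload are illustrative assumptions, not from the patent:

```python
import json

def build_instruction_request(robot_id, candidates):
    """Assemble the request sent to the tele-operation system: every
    candidate action together with its ancillary data (sensor context,
    confidence, reasons the robot could not decide on its own)."""
    return json.dumps({
        "robot_id": robot_id,
        "candidates": [
            {"action": name, "ancillary": data} for name, data in candidates
        ],
    })

# Hypothetical situation: the robot cannot choose between two actions.
request = build_instruction_request(
    "robot-7",
    [("pick_part", {"grip_confidence": 0.31}),
     ("wait",      {"reason": "occluded workpiece"})],
)
```

Shipping the ancillary data with each candidate lets the human tele-operator decide with the same context the robot had, and the same records later feed the control-model update that raises the robot's level of autonomy.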

TELE-MANUFACTURING SYSTEM
20230112463 · 2023-04-13 ·

A tele-manufacturing system comprising a manufacturing environment containing equipment used for a manufacturing process; a plurality of sensors positioned within the manufacturing environment in proximity to the manufacturing equipment, wherein each sensor is configured to gather data from the manufacturing environment; at least one digitizer in communication with the sensors for receiving data from sensors and converting the data into one or more three-dimensional digital maps or point clouds; at least one processor in communication with the at least one digitizer, wherein the processor includes software for receiving and analyzing the digital maps or point clouds; and at least one manual controller in communication with the processor, wherein the manual controller receives motion input from a user, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the manufacturing equipment by the processor, and wherein the manufacturing equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process.
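The mathematical transform from operator motion input to equipment motion commands could, in its simplest form, be a frame rotation plus a motion-scaling factor. A minimal sketch, with a hypothetical frame offset and scale, not the patent's actual transform:

```python
import numpy as np

def transform_motion_input(delta_controller, rotation, scale=0.5):
    """Map an incremental controller motion (in the operator's frame)
    to a motion command in the equipment's frame: rotate into the
    machine frame, then scale for finer remote work."""
    return scale * (rotation @ delta_controller)

# Hypothetical case: controller and machine frames differ by a 90-degree yaw.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
cmd = transform_motion_input(np.array([10.0, 0.0, 0.0]), R)
# cmd -> [0., 5., 0.]: the command streamed to the remote equipment.
```

In the described system this command stream runs alongside the sensor-to-digitizer path, so the operator sees an up-to-date 3D map of the manufacturing environment while the equipment executes the transformed motions in real time.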