B25J13/02

Robotic end effector interface systems
11707837 · 2023-07-25

Embodiments of the present disclosure are directed to methods, computer program products, and computer systems of a robotic apparatus with robotic instructions replicating a food preparation recipe. In one embodiment, a robotic control platform comprises: one or more sensors; a mechanical robotic structure including one or more end effectors and one or more robotic arms; an electronic library database of minimanipulations; a robotic planning module configured for real-time planning and adjustment, based at least in part on sensor data received from the one or more sensors, of an electronic multi-stage process recipe file including a sequence of minimanipulations and associated timing data; a robotic interpreter module configured for reading the minimanipulation steps from the minimanipulation library and converting them to machine code; and a robotic execution module configured for executing the minimanipulation steps by the robotic platform to accomplish a functional result.
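
The interpret-and-execute flow this abstract describes (recipe file of minimanipulation steps with timing, a library lookup, conversion to machine code, then execution) can be sketched roughly as below; every name, the library contents, and the machine-code strings are hypothetical illustrations, not the patent's actual design.

```python
# Hypothetical sketch: a multi-stage recipe file lists minimanipulation
# IDs with start times, an interpreter resolves each ID against a
# library of minimanipulations, and an executor runs the resulting
# steps in time order. All identifiers here are illustrative.
from dataclasses import dataclass

@dataclass
class Minimanipulation:
    name: str
    machine_code: str  # placeholder for robot-level commands

LIBRARY = {
    "crack_egg": Minimanipulation("crack_egg", "MOVE;GRIP;TAP;OPEN"),
    "whisk": Minimanipulation("whisk", "GRIP;ROTATE*20"),
}

def interpret(recipe):
    """Resolve (step_id, start_time) pairs into (machine_code, start_time)."""
    return [(LIBRARY[step].machine_code, t) for step, t in recipe]

def execute(plan):
    """Run the interpreted steps in order of their timing data."""
    for code, t in sorted(plan, key=lambda p: p[1]):
        print(f"t={t}s -> {code}")

execute(interpret([("crack_egg", 0), ("whisk", 5)]))
```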

FOLDABLE REACHING AND GRASPING TOOL
20230234213 · 2023-07-27

A reaching and grabbing tool is provided. The tool includes a trigger assembly having at least one trigger and a handle. A jaw assembly is provided having at least one jaw operably connected to the at least one trigger to move the at least one jaw. A hinge assembly is disposed between the trigger assembly and the jaw assembly, the hinge assembly being selectively lockable between an operating position and a folded position.

Collaborative device with optimised control
20230027368 · 2023-01-26

A collaborative device includes: a robotic arm including at least one motor; a tool secured to a free end of the robotic arm; a computer unit connected to the robotic arm to transmit instructions for controlling the robotic arm; and a joint having a flexible connection. The device integrates at least one sensor parameterised to detect forces exerted on the flexible connection. The computer unit is configured to: receive data from the sensor; translate the data into torques applied at the motor(s) of the robotic arm; generate instructions for attenuating the applied torques; and control the motor(s) of the robotic arm with the attenuation instructions.
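
The sensing-to-attenuation chain described above (forces at the flexible connection, translated into joint torques, then countered by motor commands) can be sketched as follows, assuming a standard Jacobian-transpose mapping; the Jacobian entries and gain are invented for illustration.

```python
# Minimal sketch (assumed, not from the patent): forces sensed at the
# flexible connection map to joint torques via the transpose of the arm
# Jacobian, and an opposing attenuation command is sent to the motors.
def forces_to_torques(jacobian, force):
    """tau = J^T f: map an external force (size m) to n joint torques."""
    n = len(jacobian[0])
    return [sum(jacobian[i][j] * force[i] for i in range(len(force)))
            for j in range(n)]

def attenuation_command(torques, gain=0.8):
    """Oppose a fraction of the externally applied torques."""
    return [-gain * t for t in torques]

J = [[1.0, 0.5, 0.2],   # toy 2x3 Jacobian: planar force, 3 joints
     [0.0, 1.0, 0.1]]
f = [2.0, -1.0]          # force sensed at the flexible connection
print(attenuation_command(forces_to_torques(J, f)))
```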

Method for Estimating Intention Using Unsupervised Learning
20230021447 · 2023-01-26

This document provides a robot hand control scheme that estimates a user's myoelectric intention using the kernel principal component analysis (kPCA) algorithm. The robot hand system includes a biometric EMG sensor system, a robot hand with multiple fingers, and a controller connected to the biometric EMG sensor system and the robot hand. The controller acquires the biometric EMG signal by means of the biometric sensor system, estimates myoelectric motion intention by applying the kPCA algorithm with a kernel function, and delivers a control command corresponding to the estimated motion intention of the user to the robot hand.
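
The estimate-then-command pipeline above can be sketched with a kPCA-style projection: an RBF kernel scores a new EMG feature vector against training samples, and the projected score is mapped to a hand command. The support samples, coefficients, and threshold below are toy values, not learned from data.

```python
# Illustrative pipeline only: an EMG feature vector is projected onto a
# kernel principal component (coefficients hard-coded here) and the
# score is mapped to a grasp command. All values are hypothetical.
import math

def rbf_kernel(x, y, gamma=0.5):
    """RBF kernel used by kPCA: k(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def project(sample, support, alphas):
    """kPCA projection: score = sum_i alpha_i * k(x_i, sample)."""
    return sum(a * rbf_kernel(s, sample) for a, s in zip(alphas, support))

def intention(score, threshold=0.5):
    """Map the projected score to a hand command."""
    return "close_hand" if score > threshold else "open_hand"

support = [[0.1, 0.2], [0.9, 0.8]]   # training EMG features (toy)
alphas = [0.3, 0.7]                  # component coefficients (toy)
print(intention(project([0.85, 0.75], support, alphas)))
```

In a real system the coefficients would come from an eigendecomposition of the centered kernel matrix (e.g., scikit-learn's `KernelPCA`); here they are hard-coded to keep the sketch self-contained.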

CONTROL SYSTEM
20230226700 · 2023-07-20

A control system (10) includes: a manual operating device (1) that generates an operation signal in accordance with rotation of a dial (11) by an operator; a machine tool controller (4) and a robot controller (5) that are communicably connected to each other and that control a machine tool (6) and a robot (7), respectively, based on the operation signal; and an operational-target setting unit (2, 3) that sets an operational target for the manual operating device (1) selectively between the machine tool (6) and the robot (7). The manual operating device (1) is connected to one of the controllers (4, 5) and inputs the operation signal to that controller. When the operational-target setting unit (2, 3) sets the operational target to the device (6 or 7) controlled by the other of the controllers (4, 5), the one controller transmits the operation signal, or a signal based on it, to the other controller.
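
The signal-routing behavior described above, where the controller that receives the dial signal forwards it when the selected target belongs to its peer, can be sketched as below; the class and method names are illustrative, not from the patent.

```python
# Hedged sketch: the manual operating device feeds one controller; if
# the selected operational target is managed by the other controller,
# the operation signal is forwarded over their communication link.
class Controller:
    def __init__(self, name, targets):
        self.name = name
        self.targets = targets  # devices this controller drives
        self.peer = None        # the other, communicably connected controller

    def handle(self, signal, target):
        if target in self.targets:
            return f"{self.name} drives {target} with {signal}"
        return self.peer.handle(signal, target)  # forward to the peer

mt = Controller("machine_tool_controller", {"machine_tool"})
rb = Controller("robot_controller", {"robot"})
mt.peer, rb.peer = rb, mt

# Dial connected to the machine-tool controller; target set to the robot:
print(mt.handle("dial+3", "robot"))
```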

Detection of user touch on controller handle
11559364 · 2023-01-24

Implementations relate to detecting user touch on a controller handle. In some implementations, a non-controlling mode of a control system is activated, and in the non-controlling mode, one or more actuators are controlled to cause a vibration to be provided on a handle of a controller. The vibration is sensed with one or more sensors, and a difference in the vibration relative to a reference vibration is determined to have occurred, where the difference satisfies one or more predetermined thresholds. A controlling mode of the system is activated in response to determining the difference in the vibration, and the vibration on the handle is modified in response to detecting that difference.
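
The threshold logic above can be sketched as follows: a grip on the handle damps the driven vibration, so a sufficient deviation from the reference amplitude switches the system from non-controlling to controlling mode. The amplitudes and threshold here are invented for illustration.

```python
# Plausible sketch of the touch-detection logic: compare the sensed
# vibration amplitude against a reference; a deviation beyond the
# threshold indicates a hand on the handle. Values are hypothetical.
def touch_detected(sensed_amplitude, reference_amplitude, threshold=0.3):
    """Flag touch when the vibration deviates enough from the reference."""
    return abs(reference_amplitude - sensed_amplitude) >= threshold

def update_mode(mode, sensed, reference):
    """Switch to controlling mode once a touch difference is detected."""
    if mode == "non-controlling" and touch_detected(sensed, reference):
        return "controlling"  # hand detected on the handle
    return mode

print(update_mode("non-controlling", sensed=0.4, reference=1.0))
```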

Teleoperation system, method, apparatus, and computer-readable medium
11559898 · 2023-01-24

Embodiments of the present disclosure provide a system, method, apparatus, and computer-readable medium for teleoperation. An exemplary system includes a robot machine having a machine body, at least one sensor, and at least one robot processor, as well as at least one user processor operable to maintain a user simulation model of the robot machine and its surrounding environment, the at least one user processor being remote from the robot machine. The system further includes at least one user interface comprising a haptic user interface operable to receive user commands and transmit them to the user simulation model, and a display operable to display a virtual representation of the user simulation model.
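
A common pattern behind such a user-side simulation model, sketched below under stated assumptions, is to apply user commands to the local model immediately (masking link latency) and blend in delayed robot sensor readings as corrections; the one-dimensional state and gain are assumptions, not the patent's design.

```python
# Rough sketch of a user-side simulation loop: optimistic local update
# from user commands, plus a gain-weighted correction toward the real
# (possibly delayed) robot state. Structure and gain are hypothetical.
class UserSimulationModel:
    def __init__(self):
        self.position = 0.0  # simplified 1-D model state

    def apply_command(self, delta):
        """Apply a haptic-interface command to the local model at once."""
        self.position += delta

    def correct(self, measured, gain=0.5):
        """Blend in a delayed measurement from the remote robot."""
        self.position += gain * (measured - self.position)

model = UserSimulationModel()
model.apply_command(1.0)        # user commands a move
model.correct(measured=0.8)     # delayed robot reading arrives
print(model.position)           # ~0.9
```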