Patent classifications
G05B2219/39466
THREE-FINGER MECHANICAL GRIPPER SYSTEM AND TRAINING METHOD THEREOF
A three-finger mechanical gripper system is provided, which includes a torque sensor, a three-finger mechanical gripper, an image capturing module and a controller. The three-finger mechanical gripper is connected to the torque sensor. The controller is connected to the torque sensor, the three-finger mechanical gripper and the image capturing module. The image capturing module captures an image of a training object. The controller controls the three-finger mechanical gripper to grip the training object with a plurality of gripper postures and calculates the torque information of each gripper posture from the measured values of the torque sensor. The controller then performs a training process on the image of the training object and the torque information of the gripper postures to obtain a training result for the training object.
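The training step described in this abstract could be sketched as follows. This is a minimal illustration, not the patented method: the image is reduced to an identifier, the torque sensor is represented by pre-recorded samples, and all function and variable names (`torque_profile`, `train`, `training_object_01`) are invented for the example.

```python
# Hypothetical sketch: pair an object's image identifier with the torque
# information gathered over several gripper postures, and store the result
# as the "training result" for that object.

def torque_profile(readings_per_posture):
    """Average the torque-sensor samples recorded for each gripper posture."""
    return [sum(samples) / len(samples) for samples in readings_per_posture]

def train(object_image_id, readings_per_posture, database=None):
    """Associate the object's image with its per-posture torque information."""
    database = {} if database is None else database
    database[object_image_id] = torque_profile(readings_per_posture)
    return database

# Three postures, two torque samples each (units arbitrary).
db = train("training_object_01", [[1, 3], [2, 4], [3, 5]])
```

A real system would replace the identifier with image features and the averaging with a learned model; the structure of the loop over postures is the same.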
ROBOT HAND
A robot hand is provided. The robot hand includes first and second drive gears rotated by first and second actuators, respectively; a first interlocked gear interlocked with the second drive gear so that the two rotate in opposite directions; a second interlocked gear interlocked with the first drive gear so that the two rotate in opposite directions; a first inner link engaged with rotation of the first drive gear; a first outer link engaged with rotation of the first interlocked gear; a first end link connected to the first inner link and the first outer link opposite the first actuator; a second inner link engaged with rotation of the second interlocked gear; a second outer link engaged with rotation of the second drive gear; and a second end link connected to the second inner link and the second outer link opposite the second actuator.
REGION-BASED GRASP GENERATION
A region-based robotic grasp generation technique is provided for machine tending or bin picking applications. Part and gripper geometry are provided as inputs, typically from CAD files, along with gripper kinematics. A human user defines one or more target grasp regions on the part, using a graphical user interface displaying the part geometry. The target grasp regions are identified by the user based on the user's knowledge of how the part may be grasped to ensure that the part can subsequently be placed in a proper destination pose. For each of the target grasp regions, an optimization solver is used to compute a plurality of high-quality grasps with stable surface contact between the part and the gripper and no part-gripper interference. The computed grasps for each target grasp region are placed in a grasp database which is used by a robot in actual bin picking operations.
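The grasp-database step described above could be sketched roughly as follows, assuming the optimization solver is stubbed out and each candidate grasp is a plain record carrying the two feasibility flags the abstract names (stable contact, no interference). All names here are illustrative, not from the patent.

```python
# Hypothetical sketch: for each user-defined target grasp region, keep only
# the candidate grasps with stable surface contact and no part-gripper
# interference, and index them by region in a grasp database.

def filter_grasps(candidates):
    """Keep candidate grasps that are stable and interference-free."""
    return [g for g in candidates
            if g["stable_contact"] and not g["interference"]]

def build_grasp_database(regions_to_candidates):
    """Map each target grasp region to its list of quality grasps."""
    return {region: filter_grasps(cands)
            for region, cands in regions_to_candidates.items()}

# Example: one region with two solver outputs, one of which collides.
candidates = {
    "region_A": [
        {"pose": (0, 0, 0), "stable_contact": True, "interference": False},
        {"pose": (0, 0, 90), "stable_contact": True, "interference": True},
    ],
}
grasp_db = build_grasp_database(candidates)
```

At pick time, the robot would look up `grasp_db[region]` and try the stored grasps against the observed bin contents.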
Robotic system with handling mechanism and method of operation thereof
A gripper including: an orientation sensor configured to generate an orientation reading for a target object; a first grasping blade; a second grasping blade configured to secure the target object in conjunction with the first grasping blade, positioned at an opposite end of the target object relative to the first grasping blade; a first position sensor, of the first grasping blade, configured to generate a first position reading of the first grasping blade relative to the target object; a second position sensor, of the second grasping blade, configured to generate a second position reading of the second grasping blade relative to the target object; and a blade actuator configured to secure the target object with the first grasping blade and the second grasping blade based on a valid orientation of the orientation reading and based on the first position reading and the second position reading indicating a stable condition.
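The gating condition for the blade actuator described above can be expressed compactly. This is a hedged sketch under invented assumptions: "stable condition" is modeled as the two position readings agreeing within a tolerance, which is one plausible reading of the abstract, not its defined meaning.

```python
# Hypothetical sketch: close the blades only when the orientation reading is
# valid AND the two position readings indicate a stable condition (modeled
# here as near-symmetric blade positions relative to the target object).

def should_secure(orientation_valid, first_pos, second_pos, tolerance=0.05):
    """Decide whether the blade actuator may secure the target object."""
    stable = abs(first_pos - second_pos) <= tolerance
    return bool(orientation_valid and stable)
```

In the actual system both inputs come from the sensors named in the abstract; here they are bare values for illustration.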
User-assisted robotic control systems
Exemplary embodiments relate to user-assisted robotic control systems, user interfaces for remote control of robotic systems, vision systems in robotic control systems, and modular grippers for use by robotic systems. Systems, methods, apparatuses and computer-readable media instructions are disclosed for interactions with and control of robotic systems, in particular, pick and place systems using soft robotic actuators to grasp, move and release target objects.
USER-ASSISTED ROBOTIC CONTROL SYSTEMS
Exemplary embodiments relate to user-assisted robotic control systems, user interfaces for remote control of robotic systems, vision systems in robotic control systems, and modular grippers for use by robotic systems. The systems, methods, apparatuses and computer-readable media instructions described interact with and control robotic systems, in particular pick and place systems using soft robotic actuators to grasp, move and release target objects.
Robot and method for controlling a robot
A robot having a robot manipulator with an effector, wherein the robot manipulator is designed and constructed for picking up, handling, and releasing an object and is controlled by a control unit. The robot includes a first sensor means designed and constructed to determine a persisting adherence of the object to the effector after a release of the object by the effector and, where such adherence persists, to generate a signal S. When the signal S is present, the control unit controls the robot manipulator to execute a predefined movement B, in which the effector with the persistently adhering object is passed by a wiping object in such a manner that the adhering object is wiped off the effector on a surface or an edge of the wiping object.
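The control logic in this abstract reduces to a small decision step. The sketch below is illustrative only: it models the first sensor means as a scalar adhesion reading with an assumed threshold, and the movement B as an opaque command string; none of these representations come from the patent.

```python
# Hypothetical sketch of the post-release check: a reading above the
# threshold plays the role of signal S (object still adheres), in which
# case the predefined wiping movement B is commanded.

def release_step(adhesion_sensor_reading, threshold=0.5):
    """Return the controller's next action after a release attempt."""
    signal_s = adhesion_sensor_reading > threshold
    return "movement_B_wipe" if signal_s else "proceed"
```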
Work machine
A work machine including: a work head having a holding tool to pick up and hold a component, a rotating device to rotate the holding tool about an axis of the holding tool, and a pivoting device to pivot the holding tool between a first attitude, in which a distal end portion of the holding tool faces downward, and a second attitude, in which the distal end portion faces sideways; a moving device to move the work head; and a control device to control operation of the work head and the moving device, the control device including an operation control section to cause the holding tool to pivot from the first attitude to the second attitude and to rotate from a holding angle, which is the rotation angle at which the component was picked up, to a target angle.
Drive unit of an automation component, in particular a gripping, clamping, and changing or pivoting unit
Drive unit of an automation component, in particular a gripping, clamping, changing, linear, or pivoting unit, wherein the drive unit includes a drive for driving the movable parts of the automation component and a control unit which controls the drive, the control unit including at least one computing device, and the drive unit, together with the drive, control unit, and computing device, being arranged in or on a base housing of the automation component.
Machine learning driven computer numerical control of a robotic machine tool
A modular robotic apparatus includes one or more sensors configured to generate sensor signals representing a manufacturing environment in which the modular robotic apparatus is located. A machine learning module is communicably coupled to the one or more sensors and includes a computer processor. The computer processor generates, by a machine learning model trained based on one or more manufacturing parameters, a computer numerical control (CNC) configuration. The one or more manufacturing parameters define a manufacturing task to be performed by the modular robotic apparatus. The machine learning model adjusts the CNC configuration based on the sensor signals. A robotic machine tool is communicably coupled to the machine learning module and includes an end effector. The robotic machine tool is configured to operate the end effector in accordance with the adjusted CNC configuration.
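The feedback step described above, in which the machine learning model adjusts the CNC configuration based on sensor signals, could look roughly like this. It is a minimal sketch: the trained model is reduced to a single correction rule over one parameter (`feed_rate`), and the gain and configuration keys are invented for the example.

```python
# Hypothetical sketch: nudge a CNC configuration parameter from averaged
# sensor feedback, standing in for the trained model's adjustment step.

def adjust_cnc_config(base_config, sensor_signals, gain=0.1):
    """Return a copy of the CNC configuration with the feed rate corrected
    in proportion to the mean of the incoming sensor signals."""
    correction = gain * sum(sensor_signals) / len(sensor_signals)
    adjusted = dict(base_config)
    adjusted["feed_rate"] -= correction
    return adjusted

# Example: high sensor readings (e.g. vibration) reduce the feed rate.
new_config = adjust_cnc_config({"feed_rate": 100.0}, [10.0, 30.0])
```

The robotic machine tool would then operate its end effector in accordance with `new_config`; in the patented system this whole step is performed by the machine learning model, not a fixed rule.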