G05B19/423

Teaching system of dual-arm robot and method of teaching dual-arm robot

The purpose is to enable teaching of a dual-arm robot to be performed intuitively and easily. A dual-arm robot includes: two arms, each made up of a plurality of links coupled to each other by joint shafts; two instructing parts provided at the tip ends of the two arms, respectively, configured to indicate coordinate points in space and to be grippable by a teacher; and a control device configured to acquire, as teaching points, the coordinate points indicated by the teacher moving the two instructing parts directly and simultaneously with both hands, and to teach the dual-arm robot an operation corresponding to the acquired teaching points.
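The acquisition step described above can be sketched in a few lines: the control device samples the coordinate points indicated by the two instructing parts at the same instant and stores them as paired teaching points. This is an illustrative sketch only; the class and method names below are assumptions, not terms from the patent.

```python
# Hypothetical sketch of dual-arm teaching-point acquisition: each sample
# pairs the left and right instructing-part positions taken simultaneously.
class DualArmTeachingRecorder:
    def __init__(self):
        self.teaching_points = []  # list of (left_point, right_point) pairs

    def record(self, left_point, right_point):
        """Store one simultaneous sample from both instructing parts."""
        self.teaching_points.append((tuple(left_point), tuple(right_point)))

    def playback(self):
        """Return the taught operation as the ordered sequence of point pairs."""
        return list(self.teaching_points)


recorder = DualArmTeachingRecorder()
# The teacher grips both instructing parts and moves them directly;
# each call represents one sampled pose of the two arm tips.
recorder.record((0.0, 0.0, 0.5), (0.4, 0.0, 0.5))
recorder.record((0.1, 0.2, 0.5), (0.5, 0.2, 0.5))
trajectory = recorder.playback()
```

Keeping the two samples as an atomic pair preserves the coordination between the arms, which is the point of teaching both simultaneously rather than one arm at a time.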

Method for teaching a robotic arm to pick or place an object
10059005 · 2018-08-28

A method for teaching a robotic arm to pick or place an object includes the following steps. Firstly, the robotic arm is pushed until a target appears within its field of vision. Then, the appearance position of the target is set as a visual point. Then, a first image is captured. Then, the robotic arm is pushed from the visual point to a target position. Then, the target position is set as a pick-and-place point. Then, automatic movement control of the robotic arm is activated. Then, the robotic arm automatically picks and places the object and returns from the pick-and-place point to the visual point. Then, a second image is captured. Finally, a differential image is formed by subtracting the second image from the first image, the target image is set according to the differential image, and image characteristics of the target are automatically learned.
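The final learning step can be illustrated concretely: subtract the two images taken from the same visual point, threshold the per-pixel difference, and take the changed region as the target image. This is a minimal sketch under assumed conventions (grayscale images as nested lists, a simple bounding box as the learned region); the patent's actual image processing may differ.

```python
# Sketch of target learning by image differencing (assumed implementation).
def differential_image(first, second, threshold=10):
    """Per-pixel absolute difference; pixels above the threshold mark the target."""
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(row1, row2)]
            for row1, row2 in zip(first, second)]

def bounding_box(mask):
    """Bounding box (r0, c0, r1, c1) of changed pixels: the learned target region."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

# First image: a bright target object occupies rows 1-2, cols 1-2.
first  = [[0, 0, 0, 0],
          [0, 200, 200, 0],
          [0, 200, 200, 0],
          [0, 0, 0, 0]]
# Second image, after the pick: same scene with the object removed.
second = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]

mask = differential_image(first, second)
box = bounding_box(mask)  # the region that changed between the two images
```

Because both images are taken from the same visual point, everything except the target cancels in the subtraction, which is what lets the target's appearance be learned automatically.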

ROBOT TEACHING DEVICE, AND ROBOT TEACHING METHOD

To enhance the productivity of offline teaching work by visualizing the working position, path, and the like of an end effector during offline teaching. A robot teaching device includes: a teaching point marker display unit configured to display, on a GUI, teaching point markers marking teaching points in a three-dimensional modeling space in which at least a three-dimensional model of a robot including an end effector and a three-dimensional model of a workpiece are arranged, the teaching points serving as target passing points of the end effector; a joined path display unit configured to display, on the GUI, a joined path, which is a path connecting successive teaching points to each other; and a changing point marker display unit configured to display, on the GUI, a changing point marker marking a point at which the working state of the end effector changes.
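The display data that the two path-related units would compute can be sketched simply: the joined path is the list of segments between successive teaching points, and the changing points are those teaching points where the end-effector working state differs from the previous point. All names and the on/off state model below are illustrative assumptions, not details from the patent.

```python
# Sketch of the joined-path and changing-point computations (assumed names).
def joined_path(points):
    """Segments connecting successive teaching points, in order."""
    return list(zip(points, points[1:]))

def changing_points(points, states):
    """Teaching points at which the end-effector working state changes."""
    return [p for p, prev, cur in zip(points[1:], states, states[1:])
            if cur != prev]

points = [(0, 0), (1, 0), (2, 0), (3, 0)]
states = ["off", "on", "on", "off"]  # e.g. a welding torch active or not

segments = joined_path(points)             # segments to draw on the GUI
markers = changing_points(points, states)  # where to place changing-point markers
```

Marking only the points where the state flips, rather than every point, is what makes the working portion of the path stand out at a glance in the 3D view.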

GENERATING A ROBOT CONTROL POLICY FROM DEMONSTRATIONS COLLECTED VIA KINESTHETIC TEACHING OF A ROBOT
20180222045 · 2018-08-09

Generating a robot control policy that regulates both motion control and interaction with an environment and/or includes a learned potential function and/or dissipative field. Some implementations relate to resampling temporally distributed data points to generate spatially distributed data points, and generating the control policy using the spatially distributed data points. Some implementations additionally or alternatively relate to automatically determining a potential gradient for data points, and generating the control policy using the automatically determined potential gradient. Some implementations additionally or alternatively relate to determining and assigning a prior weight to each of the data points of multiple groups, and generating the control policy using the weights. Some implementations additionally or alternatively relate to defining and using non-uniform smoothness parameters at each data point, defining and using stiffness and/or damping parameters at each data point, and/or obviating the need to utilize virtual data points in generating the control policy.
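The first idea mentioned, resampling temporally distributed data points into spatially distributed ones, can be sketched with a simple distance-threshold filter: a kinesthetic demonstration recorded at a fixed time step oversamples slow segments, so points closer than a minimum spacing to the last kept point are dropped. This is a hedged sketch of the general idea; the patent's actual resampling method may be more sophisticated.

```python
# Sketch of spatial resampling of a time-sampled demonstration (assumed method).
import math

def spatial_resample(points, min_spacing):
    """Keep a point only if it is at least `min_spacing` from the last kept point."""
    kept = [points[0]]
    for p in points[1:]:
        if math.dist(p, kept[-1]) >= min_spacing:
            kept.append(p)
    return kept

# Demonstration: the teacher dwells near the start (dense samples in time),
# then moves the arm steadily.
demo = [(0.00, 0), (0.01, 0), (0.02, 0), (0.03, 0),  # near-stationary dwell
        (0.5, 0), (1.0, 0), (1.5, 0)]                # steady motion
resampled = spatial_resample(demo, min_spacing=0.4)
```

After resampling, the dwell no longer dominates the data set, so a policy fit to the points reflects the demonstrated path rather than the demonstration's timing.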

ROBOT SYSTEM INCLUDING FORCE-CONTROLLED PUSHING DEVICE
20180210434 · 2018-07-26

A robot system including a force-controlled pushing device which, when the robot is guided and moved, brings a first object provided at the tip end of the robot into appropriate contact with a second object. The robot system includes the robot, the force-controlled pushing device, a robot operation input measuring part, a robot movement command calculating part, a pushing direction setting part, a target pushing force setting part, a force measuring part, and a force-controlled pushing device movement command calculating part. The pushing direction setting part sets the pushing direction of the force-controlled pushing device based on at least one of: the position/orientation of the first object; a force-controlled pushing device movement command for moving the first object; the position/orientation of the movement mechanism part of the force-controlled pushing device; the position/orientation of the robot; and a robot movement command for moving the robot.
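The control loop implied by the target-force and force-measuring parts can be illustrated with a minimal proportional update: the movement command nudges the pushing device along the set pushing direction in proportion to the error between the target pushing force and the measured force. The gain, names, and one-step form below are illustrative assumptions, not details from the patent.

```python
# Sketch of a force-controlled pushing update (assumed proportional law).
def pushing_command(direction, target_force, measured_force, gain=0.5):
    """Displacement command along the pushing direction (a unit vector)."""
    error = target_force - measured_force  # positive: not pushing hard enough
    return tuple(gain * error * d for d in direction)

direction = (0.0, 0.0, -1.0)  # pushing direction set by the setting part: downward
cmd = pushing_command(direction, target_force=10.0, measured_force=4.0)
# The command moves the device further along -z because the measured force
# is below the target pushing force.
```

Running this update at each control cycle drives the measured contact force toward the target, which is what keeps the first object in "appropriate contact" with the second object while the teacher guides the robot.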

Ascertaining An Input Command For A Robot, Said Input Command Being Entered By Manually Exerting A Force Onto The Robot

A method for automatically ascertaining an input command for a robot, wherein the input command is entered by manually exerting an external force on the robot. The input command is ascertained on the basis of the joint-force component that attempts to cause a movement of the robot in only one robot-joint-coordinate subspace specific to that input command. The joint forces are imparted by the external force.
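The subspace-based recognition can be sketched as follows: each input command is associated with a specific subset of joint coordinates, the measured joint-force vector is projected onto each command's subspace, and the command whose subspace contains nearly all of the force is recognized. The commands, joint indices, and threshold below are hypothetical; this is a simplified illustration of the idea, not the patented method.

```python
# Sketch of ascertaining an input command from joint forces (assumed scheme).
def subspace_share(torques, joint_indices):
    """Fraction of the squared joint-force magnitude lying in the subspace."""
    total = sum(t * t for t in torques)
    inside = sum(torques[i] ** 2 for i in joint_indices)
    return inside / total if total else 0.0

def ascertain_command(torques, command_subspaces, threshold=0.9):
    """Return the command whose subspace holds most of the joint force, if any."""
    for command, joints in command_subspaces.items():
        if subspace_share(torques, joints) >= threshold:
            return command
    return None

# Hypothetical commands: tapping the flange loads joint 0; pressing down
# on the arm mainly loads joints 1 and 2.
commands = {"confirm": [0], "jog_vertical": [1, 2]}
torques = [0.1, 4.0, 3.0, 0.05, 0.0, 0.0]  # measured joint forces
cmd = ascertain_command(torques, commands)
```

Requiring the force to lie almost entirely in one command-specific subspace is what distinguishes a deliberate input gesture from an ordinary guiding force, which would spread across many joints.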