INPUT SHAPING CONTROL OF A ROBOT ARM IN DIFFERENT REFERENCE SPACES
20230191603 · 2023-06-22
Assignee
Inventors
- Dan Kielsholm THOMSEN (Hinnerup, DK)
- Rune SØE-KNUDSEN (Årslev, DK)
- Jeppe Barsøe JESSEN (Odense S, DK)
- Christian Valdemar LORENZEN (Odense S, DK)
CPC classification
G05B2219/41217
PHYSICS
B25J9/1641
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1664
PERFORMING OPERATIONS; TRANSPORTING
B25J9/0009
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/39195
PHYSICS
International classification
Abstract
A robot controller for controlling a robot arm comprising: —a first space shaping module configured to provide a shaped first space target motion by convolving a first space target motion with an impulse train, where the first space target motion defines a target motion in a first reference space; —a second space shaping module configured to provide a shaped second space target motion by convolving a second space target motion with the impulse train, where the second space target motion defines the target motion in a second reference space; and —a motor controller module configured to generate motor control signals for the joint motors based on the shaped first space target motion and the shaped second space target motion. This makes it possible to dynamically adjust in which reference space the input shaping shall be performed, whereby vibrations and deviations in one reference space caused by input shaping in another reference space can be reduced.
Claims
1. A robot controller for controlling a robotic arm, where the robotic arm comprises joints connecting a base of the robotic arm and a tool flange, and where at least one of the joints comprises an output flange that is movable relative to a body of a joint and a motor configured to move the output flange relative to the body, the robot controller comprising: a shaping module configured to shape a target motion of the robotic arm; and a motor controller module configured to generate at least one motor control signal for the motor, the shaping module comprising: a first space shaping module configured to produce a shaped first space target motion by convolving a first space target motion with an impulse train, where the first space target motion defines the target motion in a first reference space; and a second space shaping module configured to produce a shaped second space target motion by convolving a second space target motion with an impulse train, where the second space target motion defines the target motion in a second reference space; and wherein the motor controller module is configured to generate the at least one motor control signal based on at least one of the shaped first space target motion or the shaped second space target motion.
2. The robot controller of claim 1, wherein the motor controller module is configured to generate the at least one motor control signal based on both the shaped first space target motion and the shaped second space target motion.
3. The robot controller of claim 1, wherein the target motion comprises a continuous motion of at least a part of the robotic arm; and wherein for at least a part of the target motion the robot controller is configured to generate the at least one motor control signal based on the shaped first space target motion and based on the shaped second space target motion.
4. The robot controller of claim 1, wherein the target motion comprises a continuous motion of at least a part of the robotic arm; and wherein the robot controller is configured to perform operations comprising: in a first part of the target motion, generating the at least one motor control signal based on the shaped first space target motion and not based on the shaped second space target motion; and in a second part of the target motion, generating the at least one motor control signal based on the shaped second space target motion and not based on the shaped first space target motion.
5. The robot controller of claim 1, further comprising: a combining module configured to combine the shaped first space target motion and the shaped second space target motion into a combined shaped target motion; wherein the motor controller module is configured to generate the at least one motor control signal based on the combined shaped target motion.
6. The robot controller of claim 1, wherein the motor controller module comprises: a first space motor control module configured to generate a first motor control signal based on the shaped first space target motion and a first dynamic model of the robotic arm, where the first dynamic model is defined in the first reference space; a second space motor control module configured to generate a second motor control signal based on the shaped second space target motion and a second dynamic model of the robotic arm, where the second dynamic model is defined in the second reference space; and a motor control signal combining module configured to generate the at least one motor control signal based on the first motor control signal and the second motor control signal.
7. The robot controller of claim 1, wherein the first reference space and the second reference space are different.
8. The robot controller of claim 1, wherein the first reference space is a joint reference space in which kinematics of at least a part of the robotic arm are based on joint parameters, where the joint parameters indicate the kinematics of at least one joint motor and at least one output flange.
9. The robot controller of claim 1, wherein the second reference space is a coordinate space in which kinematics of at least a part of the robotic arm are defined relative to a reference point.
10. The robot controller of claim 1, further comprising: at least one space transformation module configured to transform the target motion into at least one of the first space target motion in the first reference space or the second space target motion in the second reference space.
11. The robot controller of claim 1, further comprising: at least one space transformation module configured to transform at least one of: the shaped first space target motion into a shaped first target motion in at least one of a target reference space of the target motion or the second reference space; or the shaped second space target motion into a shaped second target motion in at least one of a target reference space of the target motion or the first reference space.
12. The robot controller of claim 1, further comprising: at least one scaling module configured to scale at least one of: the shaped first space target motion according to a first space scaling parameter to produce a scaled shaped first space target motion; and the shaped second space target motion according to a second space scaling parameter to produce a scaled shaped second space target motion.
13. The robot controller of claim 12, further comprising: a shaped target motion combining module configured to combine the scaled shaped first space target motion and the scaled shaped second space target motion into a combined shaped target motion.
14. The robot controller of claim 13, further comprising: at least one scaling module configured to scale the target motion based on a target motion scaling parameter to produce a scaled target motion.
15. The robot controller of claim 14, further comprising: a shaped target motion combining module configured to combine the scaled shaped first space target motion, the scaled shaped second space target motion, and the scaled target motion into a combined shaped target motion.
16. A method of controlling a robotic arm, where the robotic arm comprises joints connecting a base of the robotic arm and a tool flange, and where at least one of the joints comprises an output flange that is movable relative to a body of a joint and a motor configured to move the output flange relative to the body, the method comprising: generating a target motion of the robotic arm; generating a shaped first space target motion by convolving a first space target motion with an impulse train, where the first space target motion defines the target motion of the robotic arm in a first reference space; generating a shaped second space target motion by convolving a second space target motion with an impulse train, where the second space target motion defines the target motion of the robotic arm in a second reference space; and generating at least one motor control signal for at least one joint motor of the robotic arm based on at least one of the shaped first space target motion or the shaped second space target motion.
17. The method of claim 16, wherein the at least one motor control signal is based on both the shaped first space target motion and the shaped second space target motion.
18. The method of claim 16, wherein the target motion comprises a continuous motion of at least a part of the robotic arm, and wherein the method further comprises, for at least a part of the target motion, generating said at least one motor control signal based on the shaped first space target motion and based on the shaped second space target motion.
19. The method of claim 16, wherein the target motion comprises a continuous motion of at least a part of the robotic arm; and wherein the method comprises: for a first part of the target motion, generating the at least one motor control signal based on the shaped first space target motion and not based on the shaped second space target motion; and for a second part of the target motion, generating the at least one motor control signal based on the shaped second space target motion and not based on the shaped first space target motion.
20. The method of claim 16, further comprising: combining the shaped first space target motion and the shaped second space target motion into a combined shaped target motion, where generating the at least one motor control signal is based on the combined shaped target motion.
21. The method of claim 16, further comprising: generating a first motor control signal based on the shaped first space target motion and a first dynamic model of the robotic arm, where the first dynamic model is defined in the first reference space; generating a second motor control signal based on the shaped second space target motion and a second dynamic model of the robotic arm, where the second dynamic model is defined in the second reference space; and combining the first motor control signal and the second motor control signal to generate the at least one motor control signal.
22. The method of claim 16, wherein the first reference space and the second reference space are different.
23. The method of claim 16, wherein the first reference space is a joint reference space in which kinematics of at least a part of the robotic arm are based on robot joint parameters, where the robot joint parameters correspond to kinematics of at least one joint motor and kinematics of at least one output flange.
24. The method of claim 16, wherein the second reference space is a coordinate space in which kinematics of at least a part of the robotic arm are relative to a reference point.
25. The method of claim 16, further comprising: transforming the target motion into the first space target motion in the first reference space; and transforming the target motion into the second space target motion in the second reference space.
26. The method of claim 16, further comprising: transforming the shaped first space target motion into a shaped target motion in at least one of a target reference space of the target motion and the second reference space; and transforming the shaped second space target motion into a shaped target motion in at least one of a target reference space of the target motion and the first reference space.
27. The method of claim 16, further comprising: scaling the shaped first space target motion based on a first space scaling parameter to produce a scaled shaped first space target motion; and scaling the shaped second space target motion based on a second space scaling parameter to produce a scaled shaped second space target motion.
28. The method of claim 27, further comprising: combining the scaled shaped first space target motion and the scaled shaped second space target motion to produce a combined shaped target motion.
29. The method of claim 28, further comprising: scaling the target motion based on a target space scaling parameter to produce a scaled target motion.
30. The method of claim 29, further comprising: combining the scaled target motion with the scaled shaped first space target motion and the scaled shaped second space target motion to produce the combined shaped target motion.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
[0043] The present invention is described in view of exemplary embodiments only intended to illustrate the principles of the present invention. The skilled person will be able to provide several embodiments within the scope of the claims. Throughout the description, the reference numbers of similar elements providing similar effects have been given the same last two digits. Further, it is to be understood that where an embodiment comprises a plurality of the same feature, only some of those features may be labeled with a reference number.
[0045] The robot arm 101 comprises a plurality of robot joints 102a, 102b, 102c, 102d, 102e, 102f connecting a robot base 103 and a robot tool flange 104. A base joint 102a is configured to rotate the robot arm around a base axis 105a (illustrated by a dashed dotted line) as illustrated by rotation arrow 106a; a shoulder joint 102b is configured to rotate the robot arm around a shoulder axis 105b (illustrated by a cross indicating the axis) as illustrated by rotation arrow 106b; an elbow joint 102c is configured to rotate the robot arm around an elbow axis 105c (illustrated by a cross indicating the axis) as illustrated by rotation arrow 106c; a first wrist joint 102d is configured to rotate the robot arm around a first wrist axis 105d (illustrated by a cross indicating the axis) as illustrated by rotation arrow 106d and a second wrist joint 102e is configured to rotate the robot arm around a second wrist axis 105e (illustrated by a dashed dotted line) as illustrated by rotation arrow 106e. Robot joint 102f is a robot tool joint comprising the robot tool flange 104, which is rotatable around a tool axis 105f (illustrated by a dashed dotted line) as illustrated by rotation arrow 106f. The illustrated robot arm is thus a six-axis robot arm with six degrees of freedom and six rotational robot joints; however, it is noted that the present invention can be provided in robot arms comprising fewer or more robot joints, and also other types of robot joints, such as prismatic robot joints providing a translation of parts of the robot arm, for instance a linear translation.
[0046] The robot joints may comprise a robot joint body and an output flange rotatable or translatable in relation to the robot joint body, and the output flange is connected to a neighbor robot joint either directly or via an arm section as known in the art. The robot joint comprises a joint motor configured to rotate or translate the output flange in relation to the robot joint body, for instance via a gearing or directly connected to the motor shaft. The robot joint body can for instance be formed as a joint housing, the joint motor can be arranged inside the joint housing, and the output flange can extend out of the joint housing. Additionally, the robot joints can comprise at least one joint sensor providing a sensor signal, for instance indicative of at least one of the following parameters: an angular and/or linear position of the output flange, an angular and/or linear position of the motor shaft of the joint motor, a motor current of the joint motor, or an external force and/or torque trying to rotate the output flange or motor shaft. For instance, the angular position of the output flange can be indicated by an output encoder, such as an optical or magnetic encoder, which can indicate the angular position of the output flange in relation to the robot joint. Similarly, the angular position of the joint motor shaft can be provided by an input encoder, such as an optical or magnetic encoder, which can indicate the angular position of the motor shaft in relation to the robot joint. It is noted that both output encoders indicating the angular position of the output flange and input encoders indicating the angular position of the motor shaft can be provided, which in embodiments where a gearing has been provided makes it possible to determine a relationship between the input and output side of the gearing.
The joint sensor can also be provided as a current sensor indicating the current through the joint motor and thus be used to obtain the torque provided by the motor. For instance, in connection with a multiphase motor, a plurality of current sensors can be provided in order to obtain the current through each of the phases of the multiphase motor. It is also noted that some of the robot joints may comprise a plurality of output flanges rotatable and/or translatable by joint actuators, for instance one of the robot joints may comprise a first output flange rotating/translating a first part of the robot arm in relation to the robot joint and a second output flange rotating/translating a second part of the robot arm in relation to the robot joint. The joint sensor can also be provided as a force-torque sensor or an acceleration sensor. For instance, a force and/or torque sensor may be provided at the tool joint and configured to indicate force and/or torque provided to the tool flange and an acceleration sensor may also be provided at the tool joint and configured to indicate the acceleration of the tool joint. However, the other parts of the robot arm may also comprise force-torque sensors or acceleration sensors.
[0047] A robot tool flange reference point 107, also known as a TCP (Tool Center Point), is indicated at the robot tool flange and defines the origin of a tool flange coordinate system defining three coordinate axes x.sub.flange, y.sub.flange, z.sub.flange. In the illustrated embodiment the origin of the robot tool flange coordinate system has been arranged on the tool flange axis 105f with one axis (z.sub.flange) parallel with the tool flange axis and with the other axes x.sub.flange, y.sub.flange parallel with the outer surface of the robot tool flange 104. Further, a base reference point 108 is coincident with the origin of a robot base coordinate system defining three coordinate axes x.sub.base, y.sub.base, z.sub.base. In the illustrated embodiment the origin of the robot base coordinate system has been arranged on the base axis 105a with one axis (z.sub.base) parallel with the base axis 105a and with the other axes x.sub.base, y.sub.base parallel with the bottom surface of the robot base. The direction of gravity 109 in relation to the robot arm is also indicated by an arrow, and it is to be understood that the robot arm can be arranged at any position and orientation in relation to gravity.
[0048] The robot system comprises at least one robot controller 110 configured to control the robot arm 101. The robot controller is configured to control the motions of the parts of the robot arm and the robot joints for instance by controlling the motor torque provided to the joint motors based on a dynamic model of the robot arm, the direction of gravity acting and the joint sensor signal. Further the robot controller may control the motions of the robot arm based on a robot program stored in a memory of the robot controller. The controller can be provided as an external device as illustrated in
[0049] The robot controller can comprise an interface device 111 enabling a user to control and program the robot arm. The interface device can for instance be provided as a teach pendant as known from the field of industrial robots, which can communicate with the controller via wired or wireless communication protocols. The interface device can for instance comprise a display 112 and a number of input devices 113 such as buttons, sliders, touchpads, joysticks, track balls, gesture recognition devices, keyboards, microphones etc. The display may be provided as a touch screen acting both as display and input device. The interface device can also be provided as an external device configured to communicate with the robot controller, for instance in form of smart phones, tablets, PCs, laptops etc.
[0050] The robot system may also comprise an end effector 126 (illustrated in dotted lines) attached to the robot tool flange, illustrated in the form of a gripper; however, it is to be understood that the end effector can be any kind of end effector, such as grippers, vacuum grippers, magnetic grippers, screwing machines, welding equipment, gluing equipment, dispensing systems, painting equipment, visual systems, cameras etc.
[0052] The end effector 126 connected to the robot tool flange 104 may be connected to the robot controller and the robot controller may be configured to control the end effector via an end effector control signal 228. Further the end effector may provide an effector feedback signal 229 to the robot controller for instance in order to indicate the status of the end effector, status of various end effector sensors etc.
[0053] The robot controller 110 comprises a processor 221, a memory 222 and communication interfaces for communicating with external devices such as the user interface, the robot joints, the end effector etc. The processor comprises a motion planner module 230, a shaping module 231, an impulse generation module 237, a combining module 238 and a motor controller module 232. The motion planner module 230, the shaping module 231, the impulse generation module 237, the combining module 238 and the motor controller module 232 can for instance be provided as processes executed by the processor 221; however, it is noted that they can also be provided and executed on separate processor units.
[0054] The motion planner module 230 is configured to provide target motions of the robot arm, for instance by generating trajectories of parts of the robot arm. The trajectories can for instance be generated based on a robot program stored in the memory 222, based on an external control signal 224 and/or user inputs provided via an interface device 111. In the illustrated embodiment the motion planner module provides a target motion M.sub.t of parts of the robot arm. The target motion may indicate the kinematics of at least a part of the robot arm, for instance a path along which a part of the robot arm shall move, the speed of a part of the robot arm, the acceleration of a part of the robot arm, a waypoint to which a part of the robot arm shall move, or a force/torque to be generated by a part of the robot arm. The target motion can for instance be indicated in a target reference space, such as a cartesian space in reference to the robot base coordinate system, the tool flange coordinate system or any other reference coordinate system, such as a polar coordinate system. Also, the target motion can be indicated in joint space, where the kinematics of the robot joints are indicated, e.g. as a desired angular position q.sub.t of the output axles of the joint transmissions, a desired angular velocity {dot over (q)}.sub.t of the output axles of the joint transmissions, or a desired angular acceleration {umlaut over (q)}.sub.t of the output axles of the joint transmissions.
[0055] The shaping module 231 is configured to provide at least one shaped target motion based on the target motion M.sub.t and the impulse train ({right arrow over (A)},{right arrow over (Δ)}), in order to utilize input shaping to reduce the vibrations of the robot arm. The impulse train comprises a number of impulses {right arrow over (A)} separated by a time distance {right arrow over (Δ)}. In the illustrated embodiment the impulse train is generated by the impulse generation module 237, which is configured to generate the impulse train based on the vibrational properties of the robot arm as known in the art of input shaping, for instance based on the configuration/pose of the robot arm. For instance, the configuration/pose of the robot arm can be obtained based on the target motion or the joint sensor parameters, such as the angular positions of the output flanges of the robot joints. The impulse train can also be obtained from the memory 222.
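As an illustration of how such an impulse train can be derived from the vibrational properties of the arm, the following sketch computes a two-impulse Zero-Vibration (ZV) shaper, a well-known design from the input shaping literature. This is a minimal example under the assumption of a single dominant vibration mode; the frequency and damping values used in practice would come from a model or identification of the robot arm, and the sketch is not the disclosed implementation.

```python
import math

def zv_impulse_train(omega_n: float, zeta: float):
    """Return amplitudes A and time offsets D of a two-impulse
    Zero-Vibration (ZV) shaper for a vibration mode with natural
    frequency omega_n [rad/s] and damping ratio zeta (0 <= zeta < 1)."""
    omega_d = omega_n * math.sqrt(1.0 - zeta ** 2)          # damped frequency
    K = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta ** 2))
    A = [1.0 / (1.0 + K), K / (1.0 + K)]                    # amplitudes sum to 1
    D = [0.0, math.pi / omega_d]                            # half a damped period apart
    return A, D
```

Because the amplitudes sum to one, convolving a target motion with this train delays and reshapes the motion without changing its endpoint.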
[0056] According to the present invention, the shaping module 231 comprises a first space shaping module 233 and a second space shaping module 234. The first space shaping module 233 is configured to provide a shaped first space target motion Q.sub.t* by convolving a first space target motion Q.sub.t with the impulse train ({right arrow over (A)},{right arrow over (Δ)}), where the first space target motion Q.sub.t defines the target motion in a first reference space. The second space shaping module 234 is configured to provide a shaped second space target motion X.sub.t* by convolving a second space target motion X.sub.t with the impulse train ({right arrow over (A)},{right arrow over (Δ)}), where the second space target motion defines the target motion in a second reference space.
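The convolution performed by the first and second space shaping modules can be sketched as follows for a sampled one-dimensional target motion. This is an illustrative sketch rather than the disclosed implementation: it assumes a uniform sample time, holds the input at its first and last samples outside its support, and assumes the impulse amplitudes sum to one so that the shaped motion starts and ends at the same values as the input.

```python
def shape_trajectory(q, dt, A, D):
    """Convolve a sampled target motion q (list of floats, sample time dt)
    with an impulse train of amplitudes A and time offsets D.
    Shaping lengthens the motion by the duration of the impulse train."""
    n_extra = int(round(max(D) / dt))
    shaped = []
    for j in range(len(q) + n_extra):
        s = 0.0
        for a, d in zip(A, D):
            i = j - int(round(d / dt))      # sample index delayed by offset d
            i = min(max(i, 0), len(q) - 1)  # hold first/last sample
            s += a * q[i]
        shaped.append(s)
    return shaped
```

The same routine would be applied per coordinate, to Q.sub.t in the first reference space and to X.sub.t in the second reference space.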
[0057] The shaping module may optionally comprise a target space to first space transformation module 235 configured to transform the target motion M.sub.t into the first space target motion Q.sub.t in the first reference space. This can be achieved by utilizing a mapping function transforming the target motion into the first reference space. For instance, the target motion M.sub.t may define the kinematics of a part of the robot arm in relation to a reference point in a coordinate space, and the target space to first space transformation module 235 can be configured to utilize inverse kinematics, as known from the field of robotics, to transform the target motion into, for instance, a joint reference space, where the kinematics of at least a part of the robot arm are indicated based on robot joint parameters such as the kinematics of the joint motors or the kinematics of the output flanges. It is to be understood that the target space to first space transformation module 235 may be omitted in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the first reference space, as the first space shaping module 233 can then provide the shaped first space target motion by convolving the target motion M.sub.t with the impulse train.
[0058] The shaping module may optionally comprise a target space to second space transformation module 236 configured to transform the target motion M.sub.t into the second space target motion X.sub.t in the second reference space. This can be achieved by utilizing a mapping function transforming the target motion into the second reference space. For instance, the target motion M.sub.t may define the kinematics of a part of the robot arm in a joint reference space, where the kinematics of at least a part of the robot arm are indicated based on robot joint parameters such as the kinematics of the joint motors or the kinematics of the output flanges, and the target space to second space transformation module 236 can be configured to utilize forward kinematics, as known from the field of robotics, to transform the target motion into a coordinate space where the kinematics of the robot arm are indicated in relation to a reference point. It is to be understood that the target space to second space transformation module 236 may be omitted in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the second reference space, as the second space shaping module 234 can then provide the shaped second space target motion by convolving the target motion M.sub.t with the impulse train.
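As a concrete illustration of such a forward-kinematics mapping from joint space to a coordinate space, consider a planar two-link arm, a deliberate simplification of the six-axis arm described above; the link lengths below are hypothetical values chosen for illustration only.

```python
import math

def forward_kinematics_2link(q1, q2, l1=0.4, l2=0.3):
    """Tool position (x, y) of a planar two-link arm with joint angles
    q1, q2 [rad] and link lengths l1, l2 [m] (illustrative values).
    Applying this sample-by-sample to a joint-space target motion yields
    the corresponding coordinate-space target motion."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```

For a real six-axis arm the same role is played by the full kinematic chain of the robot, and the inverse mapping (module 235) is the corresponding inverse kinematics.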
[0059] The combining module 238 is configured to combine the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t* into a combined shaped target motion M.sub.t*, based on which the motor controller module generates the motor control signals. Consequently, the motor controller module can be provided as known in the art of robot control, as the motor controller module receives a shaped target motion which is of the same kind as an ordinary target motion. The combining module can for instance be configured to transform the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t* into the reference space of the target motion M.sub.t and then add the two shaped target motions. In one embodiment the two shaped target motions can be scaled in relation to each other.
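Assuming both shaped motions have already been transformed into the reference space of the target motion and sampled on a common time grid, the combination with relative scaling can be sketched as below; the single weight parameter w is an assumption made for illustration and is not a parameter named in the disclosure.

```python
def combine_shaped_motions(m_q, m_x, w=0.5):
    """Blend two shaped target motions (already expressed in the same
    reference space, sample-aligned) with weight w on the first-space
    result. Using complementary weights w and (1 - w) keeps the combined
    motion an affine blend of the two shaped motions."""
    return [w * a + (1.0 - w) * b for a, b in zip(m_q, m_x)]
```

Sweeping w from 1 to 0 over a part of the motion would realize a gradual hand-over from shaping in the first reference space to shaping in the second.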
[0060] The motor controller module 232 is configured to generate the at least one motor control signal 223a-223f for the joint motors based on at least one of the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t*, which in the illustrated embodiment is provided as the combined shaped target motion M.sub.t* provided by the combining module. The motor controller module 232 is configured to generate at least one motor control signal to the joint motors, for instance in form of motor control signals 223a, 223b, 223f indicating control parameters for the joint motors, which can be used to control the joint motors as desired. For instance, the control parameters can indicate the motor torque T.sub.motor,a, T.sub.motor,b, and T.sub.motor,f that each joint motor shall provide to the output flanges, and the robot controller is configured to determine the motor torque based on a dynamic model of the robot arm as known in the prior art. The motor controller module 232 is configured to generate the motor control signals 223a, 223b, 223f based on the combined shaped target motion M.sub.t* and a dynamic model of the robot arm D.sub.robot. The dynamic model of the robot arm D.sub.robot can for instance be stored in the memory 222. The dynamic model makes it possible for the controller to calculate which torque each of the joint motors shall provide in order to make the robot arm perform a target motion, where a target motion indicates a motion of at least a part of the robot arm. The motor controller module may additionally, as illustrated by the dotted line, be configured to generate the motor control signals 223a, 223b, 223f based on at least one sensor signal 220a, 220b, 220f indicative of at least one joint sensor parameter J.sub.sensor,a, J.sub.sensor,b, J.sub.sensor,f and/or other sensor signals indicating other robot parameters.
The sensor signal can for instance indicate the angular position q of the output flange; the angular position θ of the motor axle; the motor torque T.sub.motor provided to the motor axle by the joint motor. For instance, the joint motors can be provided as multiphase electromotors and the robot controller can be configured to adjust the motor torque provided by the joint motors by regulating the current through the phases of the multiphase motors as known in the art of motor regulation.
It is noted that the motor controller module 232 can also be configured to generate the at least one motor control signal 223a-223f to the joint motors directly based on at least one of the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t*. The shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t* can thus be provided directly to the motor controller module 232, and the combining module 238 can thus be omitted. Such an embodiment is illustrated in
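The dynamic-model-based torque computation described above can be illustrated with a minimal single-joint feedforward sketch. The inertia, damping, and gravity parameters below are illustrative placeholders, not values from the disclosure, and a full implementation would use the multi-joint dynamic model D.sub.robot of the entire arm.

```python
import math

def joint_motor_torque(q, qd, qdd, inertia=0.12, damping=0.02, mgl=1.5):
    """Feedforward torque for a single rotary joint from a simple
    dynamic model: tau = I*qdd + b*qd + m*g*l*sin(q).
    q, qd, qdd are the (shaped) target position [rad], velocity [rad/s]
    and acceleration [rad/s^2]; inertia [kg*m^2], damping [N*m*s/rad]
    and mgl [N*m] are hypothetical model parameters."""
    return inertia * qdd + damping * qd + mgl * math.sin(q)
```

Evaluating this along the combined shaped target motion M.sub.t* yields a torque profile of the kind carried by the motor control signals 223a-223f.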
[0061] Providing both a shaped first space target motion Q.sub.t* and a shaped second space target motion X.sub.t* makes it possible to utilize input shaping in two different reference spaces. Consequently, the user of the robot arm can choose in which reference space he/she wants to reduce vibrations of the robot arm, and can also switch online between the reference spaces in which the input shaping shall be implemented. This is beneficial during a target motion defining a continuous motion, where at least a part of the robot arm constantly moves, meaning that the part of the robot arm during the continuous motion does not experience a standstill where the speed of the part is zero.
[0062] In one embodiment the robot controller is configured to generate the at least one motor control signal (223a-223f) based on the shaped first space target motion (Q.sub.t*) and the shaped second space target motion (X.sub.t*). This is useful in connection with parts of a continuous motion where the target motion changes from moving in relation to a first reference space to moving in relation to a second reference space, or in connection with blend parts of a continuous motion.
[0063] In one embodiment the robot controller is configured to: [0064] in a first part of the continuous motion, generate the motor control signals (223a-223f) based on the shaped first space target motion (Q.sub.t*) and not based on the shaped second space target motion (X.sub.t*); and [0065] in a second part of the continuous motion, generate the motor control signals (223a-223f) based on the shaped second space target motion (X.sub.t*) and not based on the shaped first space target motion (Q.sub.t*).
This is useful in connection with continuous motions where the target motion has parts moving in relation to a first reference space and other parts moving in relation to a second reference space.
[0066]
[0067] In step 362 the shaped first space target motion Q.sub.t* is generated by convolving a first space target motion Q.sub.t with an impulse train ({right arrow over (A)},{right arrow over (Δ)}), where the first space target motion defines the target motion in a first reference space. In case the original reference space of the target motion M.sub.t is the same as the first reference space, the target motion and the first space target motion are the same and the shaped first space target motion can be provided by convolving the target motion M.sub.t with the impulse train. In case the original reference space of the target motion is different from the first reference space, the method can comprise an optional step 361 of transforming the target motion into a first space target motion indicating the target motion in the first reference space, and the shaped first space target motion can then be generated by convolving the transformed target motion with the impulse train. The impulse train comprises a number of impulses {right arrow over (A)} separated by time distances {right arrow over (Δ)} and is provided based on the vibrational properties of the robot arm as known in the art of input shaping, for instance based on the configuration/pose of the robot arm.
[0068] In step 364 the shaped second space target motion X.sub.t* is generated by convolving a second space target motion with an impulse train ({right arrow over (A)},{right arrow over (Δ)}), where the second space target motion defines the target motion in a second reference space. In case the original reference space of the target motion M.sub.t is the same as the second reference space, the target motion and the second space target motion are the same and the shaped second space target motion can be provided by convolving the target motion M.sub.t with the impulse train. In case the original reference space of the target motion is different from the second reference space, the method can comprise a step 363 of transforming the target motion into a second space target motion X.sub.t indicating the target motion in the second reference space, and the shaped second space target motion can be generated by convolving the transformed target motion with the impulse train.
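The convolution of steps 362 and 364 can be sketched in code. The following is a minimal illustrative Python sketch, not part of the disclosed controller: it assumes a single vibration mode shaped with a classic two-impulse zero-vibration (ZV) shaper and a sampled one-dimensional target motion, and the function names are hypothetical.

```python
import math

def zv_impulse_train(freq_hz, damping, dt):
    """Two-impulse zero-vibration (ZV) shaper for a single vibration mode.

    Returns impulse amplitudes A and their delays Δ in samples, derived
    from the damped natural frequency as in classic input-shaping theory.
    """
    K = math.exp(-damping * math.pi / math.sqrt(1.0 - damping**2))
    amplitudes = [1.0 / (1.0 + K), K / (1.0 + K)]   # sum to 1: no net scaling
    wd = 2.0 * math.pi * freq_hz * math.sqrt(1.0 - damping**2)
    delays = [0, round((math.pi / wd) / dt)]        # second impulse, in samples
    return amplitudes, delays

def shape_motion(q_t, amplitudes, delays):
    """Convolve a sampled target motion q_t with the impulse train (A, Δ).

    Each impulse contributes a scaled, time-shifted copy of the motion.
    Before/after the motion the first/last sample is held, so the shaped
    motion starts and ends at the same positions as the unshaped one.
    """
    n = len(q_t) + delays[-1]
    shaped = [0.0] * n
    for a, d in zip(amplitudes, delays):
        for i in range(n):
            j = min(max(i - d, 0), len(q_t) - 1)    # clamp: hold end samples
            shaped[i] += a * q_t[j]
    return shaped
```

Because the amplitudes sum to one, the shaped motion reaches the same final position as the unshaped target motion, only delayed by the duration of the impulse train.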
[0069] In the illustrated embodiment the method comprises a step 370 of combining the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t* into a combined shaped target motion M.sub.t*. This can for instance be achieved by transforming the shaped first space target motion and the shaped second space target motion into the same reference space, for instance the reference space of the target motion, and then adding the transformed shaped first space target motion and the transformed shaped second space target motion together.
[0070] Step 380 of generating at least one motor control signal (223a-223f) for the joint motors based on the combined shaped target motion can be performed as known in the art of robot motor control, where a target motion is converted into motor control signals such as motor torques and/or motor currents, and is performed based on a dynamic model of the robot arm. For instance, the motor control signal(s) may be generated based on the combined shaped target motion M.sub.t*, whereby the motor control signal(s) will be generated based on input shaping in two different reference spaces. Consequently, the user of the robot arm can control the robot arm by choosing in which reference space he/she wants to reduce vibrations of the robot arm and can also switch online between the reference spaces in which the input shaping shall be implemented. This is beneficial during a target motion defining a continuous motion, where at least a part of the robot arm moves constantly, meaning that the part of the robot arm does not experience a standstill where its speed is zero during the continuous motion.
[0071] In one embodiment the method comprises a step of generating the at least one motor control signal (223a-223f) based on the shaped first space target motion (Q.sub.t*) and the shaped second space target motion (X.sub.t*). This is useful in connection with parts of a continuous motion where the target motion changes from moving in relation to a first reference space to moving in relation to a second reference space, or in connection with blend parts of a continuous motion.
[0072] In one embodiment the method comprises steps of: [0073] in a first part of the continuous motion, generating the motor control signals (223a-223f) based on the shaped first space target motion (Q.sub.t*) and not based on the shaped second space target motion (X.sub.t*); and [0074] in a second part of the continuous motion, generating the motor control signals (223a-223f) based on the shaped second space target motion (X.sub.t*) and not based on the shaped first space target motion (Q.sub.t*).
This is useful in connection with continuous motions where the target motion has parts moving in relation to a first reference space and other parts moving in relation to a second reference space.
[0075] In an embodiment of the robot controller/method, the first reference space and the second reference space are different and can be any combination of two of: [0076] a joint reference space wherein the kinematics of at least a part of the robot arm are indicated based on robot joint parameters, where the robot joint parameters indicate the kinematics of at least one of the joint motors and the kinematics of the output flanges, whereby the first or second space target motion indicates the target motion in terms of robot joint parameters; [0077] a cartesian coordinate space wherein the kinematics of at least a part of the robot arm in relation to a reference point are indicated in terms of cartesian coordinates, whereby the first or second space target motion indicates the target motion in terms of cartesian coordinates; [0078] a polar coordinate space wherein the kinematics of at least a part of the robot arm in relation to a reference point are indicated in terms of polar coordinates, whereby the first or second space target motion indicates the target motion in terms of polar coordinates; [0079] a cylindrical coordinate system wherein the kinematics of at least a part of the robot arm in relation to a reference point are indicated in terms of cylindrical coordinates, whereby the first or second space target motion indicates the target motion in terms of cylindrical coordinates; [0080] a spherical coordinate system wherein the kinematics of at least a part of the robot arm in relation to a reference point are indicated in terms of spherical coordinates, whereby the first or second space target motion indicates the target motion in terms of spherical coordinates.
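As an illustration of how the same tool-flange position can be expressed in the listed coordinate spaces, the following Python helpers (hypothetical names, not from the disclosure) convert a Cartesian position into cylindrical and spherical coordinates:

```python
import math

def cartesian_to_cylindrical(x, y, z):
    # (x, y, z) -> (r, phi, z): radius in the xy-plane, azimuth angle, height
    return (math.hypot(x, y), math.atan2(y, x), z)

def cartesian_to_spherical(x, y, z):
    # (x, y, z) -> (rho, theta, phi): radius, polar angle from the z-axis, azimuth
    rho = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / rho) if rho > 0.0 else 0.0
    return (rho, theta, math.atan2(y, x))
```

A target motion expressed as a sequence of Cartesian samples could be mapped sample by sample through such functions before shaping in the chosen coordinate space.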
[0081] In an embodiment of the robot controller/method, the first reference space and the second reference space are different, and the first reference space is a joint reference space while the second reference space is a coordinate space. In the joint reference space, the kinematics of at least a part of the robot arm are indicated based on robot joint parameters, where the robot joint parameters indicate the kinematics of at least one of the joint motors and the kinematics of the output flanges, whereby the first space target motion indicates the target motion in terms of robot joint parameters. In the coordinate space, the kinematics of at least a part of the robot arm are indicated in relation to a reference point, whereby the second space target motion indicates the target motion in terms of coordinates of the coordinate space. The coordinate space may for instance be any one of: [0082] a cartesian coordinate space wherein the kinematics of at least a part of the robot arm in relation to the reference point are indicated in terms of cartesian coordinates, whereby the second space target motion indicates the target motion in terms of cartesian coordinates; [0083] a polar coordinate space wherein the kinematics of at least a part of the robot arm in relation to the reference point are indicated in terms of polar coordinates, whereby the second space target motion indicates the target motion in terms of polar coordinates; [0084] a cylindrical coordinate system wherein the kinematics of at least a part of the robot arm in relation to the reference point are indicated in terms of cylindrical coordinates, whereby the second space target motion indicates the target motion in terms of cylindrical coordinates; [0085] a spherical coordinate system wherein the kinematics of at least a part of the robot arm in relation to the reference point are indicated in terms of spherical coordinates, whereby the second space target motion indicates the target motion in terms of 
spherical coordinates.
[0086] In an embodiment of the robot controller/method, the first reference space and the second reference space are different in that: [0087] the first reference space is a coordinate reference space wherein the kinematics of at least a part of the robot arm are indicated in relation to a first reference point, and in that [0088] the second reference space is a coordinate reference space wherein the kinematics of at least a part of the robot arm are indicated in relation to a second reference point; and in that [0089] the first reference point and the second reference point are different.
The coordinate space may for instance be any one of the coordinate spaces listed in paragraph [0045]. This makes it possible to utilize input shaping in two reference spaces of the same kind having different reference points, for instance in an embodiment where the robot arm is mounted on a moving support such as a vehicle. Here the first reference point can be defined as a fixed point on the moving support and the second reference point as a point that is fixed in relation to the surroundings of the moving support. Also, the first reference point can be defined as a fixed point defined in relation to a fixed part of the robot arm such as the robot base, and the second reference point can be defined as a moving point defined in relation to a moving part of the robot arm such as the tool flange. This makes it possible for the user to choose in relation to which reference point the vibrations shall be reduced utilizing input shaping.
[0090]
[0091] In the illustrated embodiment a first space to target space transformation module 439 is configured to transform the shaped first space target motion Q.sub.t* into the reference space of the target motion M.sub.t and thereby provide a shaped first space target motion in the target space M.sub.t1. This can be achieved by utilizing a mapping function transforming the shaped first space target motion Q.sub.t* into the target reference space. For instance, if the target space defines the kinematics of a part of the robot arm in relation to a reference point in a coordinate space and the first reference space defines the kinematics of a part of the robot arm in a joint space, then the first space to target space transformation module 439 can be configured to utilize forward kinematics as known from the field of robotics to transform the shaped first space target motion Q.sub.t* into a shaped first space target motion in the target space M.sub.t1.
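As an illustration of the forward-kinematics mapping performed by the transformation module 439, the following Python sketch uses a hypothetical planar two-link arm; a real six-axis arm would use its full kinematic chain, and the link lengths and function names here are assumptions for illustration only.

```python
import math

def forward_kinematics_2link(q1, q2, l1=0.4, l2=0.3):
    """Forward kinematics of a planar two-link arm (link lengths in meters).

    Maps joint angles (q1, q2) to the Cartesian position (x, y) of the
    tool flange, i.e. from joint reference space into a coordinate space.
    """
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# A shaped joint-space motion Q_t* (here just two samples) can be mapped
# sample by sample into the target space, as module 439 would do:
shaped_joint_motion = [(0.0, 0.0), (math.pi / 2, 0.0)]
shaped_in_target_space = [forward_kinematics_2link(q1, q2)
                          for q1, q2 in shaped_joint_motion]
```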
[0092] It is to be understood that the first space to target space transformation module 439 may be omitted in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the first reference space. Also, in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the second reference space, the first space to target space transformation module 439 can be configured to transform the shaped first space target motion Q.sub.t* into the second reference space.
[0093] In the illustrated embodiment a second space to target space transformation module 440 is configured to transform the shaped second space target motion X.sub.t* into the reference space of the target motion M.sub.t and thereby provide a shaped second space target motion in the target space M.sub.t2. This can be achieved by utilizing a mapping function transforming the shaped second space target motion X.sub.t* into the target reference space. For instance, if the target space defines the kinematics of a part of the robot arm in a joint space and the second reference space defines the kinematics of a part of the robot arm in a coordinate space, then the second space to target space transformation module 440 can be configured to utilize inverse kinematics as known from the field of robotics to transform the shaped second space target motion X.sub.t* into a shaped second space target motion in the target space M.sub.t2.
[0094] It is to be understood that the second space to target space transformation module 440 may be omitted in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the second reference space. Also, in embodiments where the target motion M.sub.t indicates the target motion of the robot arm in the first reference space, the second space to target space transformation module 440 can be configured to transform the shaped second space target motion X.sub.t* into the first reference space.
[0095] Transforming the shaped first space target motion and the shaped second space target motion into the same reference space makes it possible to combine the two signals in an addition module 443 configured to add the two signals together.
[0096] In the illustrated embodiment a first space scaling module 441 is configured to scale the shaped first space target motion according to a first space scaling parameter K.sub.1. This can be achieved by multiplying the shaped first space target motion with the first space scaling parameter, where the multiplication is performed in the target space. Similarly, a second space scaling module 442 is configured to scale the shaped second space target motion according to a second space scaling parameter K.sub.2. This can be achieved by multiplying the shaped second space target motion with the second space scaling parameter, where the multiplication is performed in the target space.
[0097] Scaling the shaped first space target motion and the shaped second space target motion makes it possible to adjust the effect of the input shaping performed in the first space and in the second space in relation to each other. This can for instance be useful in connection with a robot arm movement where the robot arm is controlled in joint space in one part of the movement and in a coordinate space in another part of the movement. A gradual scaling of the shaped first space target motion and the shaped second space target motion can then be applied in a part of the motion where the movements blend together.
[0098] Consequently, the combined shaped target motion M.sub.t* provided to the motor controller will be a linear combination of the shaped first space target motion and the shaped second space target motion in the target space where:
M.sub.t*=K.sub.1M.sub.t1*+K.sub.2M.sub.t2* eq. 1
0≤K.sub.1≤1 eq. 2
0≤K.sub.2≤1 eq. 3
K.sub.1+K.sub.2=1 eq. 4
Scaling parameters fulfilling eq. 2-eq. 4 will result in a position of the robot arm defined between the position indicated by the shaped first space target motion M.sub.t1* and the position indicated by the shaped second space target motion M.sub.t2*. By varying K.sub.1 and K.sub.2 over time, it is possible to move gradually from M.sub.t1* towards M.sub.t2* (or vice versa), effectively fading the input shaping between the first reference space and the second reference space. The restriction provided by eq. 4 ensures that the combined shaped target motion is not scaled and thus the robot arm will end up at the positions planned by the motion planner module.
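The linear combination of eq. 1-eq. 4 can be sketched as a minimal Python function, applied sample by sample with a time-varying K.sub.1 (illustrative names, not from the disclosure):

```python
def combine_shaped_motions(m_t1, m_t2, k1):
    """Eq. 1 with K2 = 1 - K1 (eq. 4), applied sample by sample.

    m_t1, m_t2: shaped motions already expressed in the same reference space.
    k1: per-sample first space scaling parameter, each value in [0, 1] (eq. 2).
    """
    combined = []
    for x1, x2, k in zip(m_t1, m_t2, k1):
        combined.append(k * x1 + (1.0 - k) * x2)   # K1*Mt1* + K2*Mt2*
    return combined
```

Because K.sub.2 is derived as 1 - K.sub.1, the constraint of eq. 4 holds at every sample, so the combined motion is never scaled away from the planned target.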
[0099] The ability to change from applying input shaping in a first reference space to applying input shaping in a second reference space without a need for a standstill of the robot arm is beneficial. For example, robot arms can perform linear motions in joint space or in Cartesian space. This invention makes it possible to make a soft transition from a joint space trajectory into a Cartesian trajectory, and vice versa, i.e. without a standstill. In connection with robot arms provided by the applicant Universal Robots A/S, the feature is called blending in the programming terms of Universal Robots. The concept of blending is illustrated in
[0100] However, the blend can be between any two types of motion, e.g. a linear Cartesian, circular Cartesian, or linear joint space trajectory. When the end-effector distance to the next waypoint becomes lower than a defined blend radius, the soft transition will start until the distance becomes larger than the blend radius. The present invention makes blending between two kinds of movements possible. For example, the input shaping could be moved from joint space to Cartesian space over 1/10 of a second by:

K.sub.1(t)=1−10t eq. 5

K.sub.2(t)=10t eq. 6

where t is the time in seconds, starting from the initialization of the transition, with 0≤t≤0.1.
[0101] It is noted that the linear interpolation suggested by the transition functions of eq. 5 and eq. 6 is only intended as an illustrative example and that other kinds of transition functions may be provided. For instance, an S-shaped transition function may be provided in order to reduce position derivatives, i.e. velocity and acceleration.
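An S-shaped transition of the kind mentioned above could, as one assumed example not specified in the disclosure, be a smoothstep polynomial:

```python
def s_shaped_transition(t, duration=0.1):
    """Smoothstep transition for K1: falls from 1 to 0 over `duration` seconds.

    Compared with the linear fade of eq. 5 and eq. 6, the first derivative
    is zero at both ends of the transition, which reduces velocity and
    acceleration steps in the blended motion. K2 is then 1 - K1 (eq. 4).
    """
    s = min(max(t / duration, 0.0), 1.0)   # normalized, clamped time
    k2 = 3.0 * s * s - 2.0 * s ** 3        # smoothstep rises from 0 to 1
    return 1.0 - k2                        # K1 falls from 1 to 0
```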
[0102] The first space to target space transformation module 439, the second space to target space transformation module 440, the first space scaling module 441 and the second space scaling module 442 are illustrated as parts of the combining module 438; however, it is to be understood that they can be provided as separate modules.
[0103]
[0104] In this embodiment the method comprises at least one of the steps of: [0105] transforming 571 the shaped first space target motion Q.sub.t* into a shaped target motion M.sub.t1 in at least one of a target reference space of the target motion and the second reference space; [0106] transforming 573 the shaped second space target motion X.sub.t* into a shaped target motion M.sub.t2 in at least one of a target reference space of the target motion and the first reference space.
The steps of transforming the shaped first space target motion Q.sub.t* and the shaped second space target motion X.sub.t* can be performed as described in connection with the first space to target space transformation module 439 and the second space to target space transformation module 440 in paragraphs [0048]-[0052].
[0107] Further the method comprises at least one of the steps: [0108] scaling 572 the shaped first space target motion according to a first space scaling parameter K.sub.1; [0109] scaling 574 the shaped second space target motion according to a second space scaling parameter K.sub.2.
The step 572 of scaling the shaped first space target motion and the step 574 of scaling the shaped second space target motion can be performed as described in connection with the first space scaling module 441 and the second space scaling module 442 in paragraphs [0053]-[0054].
[0110] The method comprises a step 575 of combining the scaled shaped first space target motion M.sub.t1* and the scaled shaped second space target motion M.sub.t2* into the combined shaped target motion M.sub.t*. As described in paragraph [0055], the combined shaped target motion M.sub.t* can be provided as a linear combination of the shaped first space target motion and the shaped second space target motion as defined by eq. 1-eq. 4.
[0111] The method illustrated in
[0112] It is noted that the steps 571, 572, 573, and 574 are illustrated as parts of step 570 of combining the shaped first space target motion and the shaped second space target motion; however, it is to be understood that they can be provided as separate method steps.
[0113]
[0114] The robot controller 610 comprises a target motion scaling module 644 configured to scale the target motion M.sub.t according to a target motion scaling parameter K.sub.0. This can be achieved by multiplying the target motion with the target motion scaling parameter and the multiplication is performed in the target space.
[0115] The robot controller comprises an addition module 643 configured to provide the combined shaped target motion M.sub.t* by adding the scaled shaped first space target motion K.sub.1M.sub.t1*, the scaled shaped second space target motion K.sub.2M.sub.t2* and the scaled target motion K.sub.0M.sub.t together in the same reference space.
[0116] This is beneficial in connection with robot arms which can alternate between different types of motion, for example linear joint motion, linear Cartesian motion, servo mode motion, and force mode motion. Different motion strategies require different control strategies and vibration suppression strategies. In some applications or motion strategies, it might be advantageous to disable vibration suppression. This would be applications where fast response is important and vibrations are unimportant.
[0117] Normally, a robot standstill is required in order to enable or disable input shaping filters. Otherwise, discontinuities will appear in the target position reference, which would lead to errors or larger vibrations. However, the functionality of the robot controller 410 can be extended as illustrated by the robot controller 610 of
M.sub.t*=K.sub.0M.sub.t+K.sub.1M.sub.t1*+K.sub.2M.sub.t2* eq. 7
0≤K.sub.0≤1 eq. 8
0≤K.sub.1≤1 eq. 9
0≤K.sub.2≤1 eq. 10
K.sub.0+K.sub.1+K.sub.2=1 eq. 11
Thereby, input shaping can be enabled or disabled gradually over time without discontinuities in the reference positions, e.g. joint angles. As described in connection with
which introduces a gradual blending of an unshaped target motion into a shaped first space target motion.
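The three-term combination of eq. 7-eq. 11 can be sketched as follows (hypothetical Python, one sample at a time; the assertion mirrors the constraints of eq. 8-eq. 11):

```python
def combine_with_unshaped(m_t, m_t1, m_t2, k0, k1):
    """Eq. 7 with the constraint of eq. 11: K2 = 1 - K0 - K1.

    Ramping k0 from 1 to 0 fades input shaping in without a standstill;
    ramping it back to 1 disables shaping again, all without
    discontinuities in the combined reference position.
    """
    k2 = 1.0 - k0 - k1                  # eq. 11 keeps the sum equal to 1
    assert 0.0 <= k2 <= 1.0, "scaling parameters must satisfy eq. 8-eq. 11"
    return k0 * m_t + k1 * m_t1 + k2 * m_t2
```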
[0118]
[0119] The method comprises a step of scaling 776 the target motion M.sub.t according to a target motion scaling parameter K.sub.0. This can be performed as described in connection with the target motion scaling module 644 in paragraph [0067]. The method illustrated in
[0120]
[0121]
[0125]
[0126] In the illustrated embodiment a first space scaling module 1041 is configured to scale the shaped first space target motion Q.sub.t* according to a first space scaling parameter K.sub.1. This can be achieved by multiplying the shaped first space target motion with the first space scaling parameter. The scaled first space target motion K.sub.1Q.sub.t* is then provided to the first space motor control module 945, which is configured to generate the first motor control signal T.sub.motor,Q based on the scaled shaped first space target motion K.sub.1Q.sub.t* and a first dynamic model of the robot arm, where the first dynamic model is defined in the first reference space.
[0127] Similarly, a second space scaling module 1042 is configured to scale the shaped second space target motion according to a second space scaling parameter K.sub.2. This can be achieved by multiplying the shaped second space target motion X.sub.t* with the second space scaling parameter K.sub.2. The scaled second space target motion K.sub.2X.sub.t* is then provided to the second space motor control module 946, which is configured to generate the second motor control signal T.sub.motor,X based on the scaled shaped second space target motion K.sub.2X.sub.t* and a second dynamic model of the robot arm, where the second dynamic model is defined in the second reference space.
[0128] The first motor control signal T.sub.motor,Q and the second motor control signal T.sub.motor,X can then be combined into the motor control signals 223a, 223b, 223f indicating control parameters for the joint motors by the motor control signal combining module 947.
[0129] Scaling the shaped first space target motion and the shaped second space target motion makes it possible to adjust the effect of the input shaping performed in the first space and in the second space in relation to each other, and blending between movements in different reference spaces can hereby be achieved. The blending can for instance be performed similarly to the description in paragraphs [0054]-[0058], where the first and second scaling parameters fulfill eq. 2, eq. 3 and eq. 4 and as an example are varied according to eq. 5 and eq. 6.
[0130]
[0131] The robot controller 1110 comprises a target motion scaling module 1144 configured to scale the target motion M.sub.t according to a target motion scaling parameter K.sub.0. This can be achieved by multiplying the target motion with the target motion scaling parameter, where the multiplication is performed in the target space. Further, the motor controller module 1132 comprises a target space motor control module 1149 configured to generate a target motor control signal T.sub.motor,M based on the scaled target motion K.sub.0M.sub.t and a target dynamic model of the robot arm, where the target dynamic model is defined in the target reference space. The target motor control signal T.sub.motor,M is a vector indicating motor control signals for the joint motors.
[0132] The first motor control signal T.sub.motor,Q, the second motor control signal T.sub.motor,X and the target motor control signal T.sub.motor,M can then be combined into the motor control signals 223a, 223b, 223f indicating control parameters for the joint motors by the motor control signal combining module 1147.
[0133] This makes it possible to control the robot arm as a combination of the unshaped motion and shaped motions in different reference spaces. This provides similar advantages and can be performed similarly to the description in paragraphs [0069]-[0070], where the target, first and second scaling parameters fulfill eq. 8, eq. 9, eq. 10 and eq. 11 and as an example are varied according to eq. 12, eq. 13 and eq. 14.
[0134] Summarizing, the present invention makes it possible to reduce the Cartesian path deviation caused by joint space input shaping and discloses a method to implement Cartesian input shaping to handle the limitations of joint space shaping. The presented invention allows the robot programmer to change the filtering space during motion, without additional delay.
The modules of the robot controller can for instance be configured to carry out the described functions and tasks by programming these as steps in a software program executed by a processor. Likewise, the method according to the present invention can be implemented as method steps carried out by processors of a robot controller.
TABLE-US-00001 BRIEF DESCRIPTION OF FIGURE REFERENCES 100 robot system 101 robot arm 102a-102f robot joint 103 robot base 104 robot tool flange 105a-105f robot joints axis 106a-106f rotation arrow of robot joints 107 robot tool flange reference point 108 base reference point 109 direction of gravity 110, 410, 610, robot controller 910, 1010, 1110 111 interface device 112 display 113 input devices 216a; 216b; 216f output flange 217a; 217b; 217f joint motors 218a; 218B, 218f output axle 219a; 219b; 219f joint sensor 220a, 220b, 220f joint sensor signal 221, 421, 621 processor 222 memory 223a, 223b, 223f motor control signals 224 external control signal 126 end effector 228 end effector control signal 229 effector feedback signal 230 motion planning module 231 shaping module 232; 932; 1132 motor controller module 233 first space shaping module 234 second space shaping module 235 target space to first space transformation module 236 target space to second space transformation module 237 impulse generator 238; 438; 638 shaped target motion combining module 439 first space to target space transformation module 440 second space to target space transformation module 441; 1041 first space scaling module 442; 1042 second space scaling module 443; 643 addition module 644 target space scaling module 945 first space motor control module 946 second space motor control module 947; 1147 motor control signals combining module 1048 scaling module 1149 target space motor control module 350 generate target motion 360 generate shaped target motions 361 transform target motion into first space target motion 362 generate shaped first space target motion 363 transform target motion into second space target motion 364 generate shaped second space target motion 370, 570, 770 combine shaped target motions 571 transform shaped first space target motion into target space 572 scale shaped first space target motion 573 transform shaped second space target motion into target space 574 scale 
shaped second space target motion 575, 775 combine shaped target motions 776 scale target motion 380 generate control signal for robot arm