Method of Controlling Industrial Actuator, Control System and Actuator System

20220410393 · 2022-12-29

    Abstract

    A method of controlling an industrial actuator, the method including providing a plurality of consecutive input target points, of which at least one is an intermediate input target point; for one or more of the at least one intermediate input target point, defining at least one virtual target point associated with the intermediate input target point; for one or more of the at least one virtual target point, defining a blending zone associated with the virtual target point; and defining a movement path on the basis of the at least one virtual target point and the at least one blending zone. A control system is also provided.

    Claims

    1. A method of controlling an industrial actuator, the method comprising: providing a plurality of consecutive input target points, of which at least one is an intermediate input target point; for one or more of the at least one intermediate input target point, defining at least one virtual target point associated with the intermediate input target point; for one or more of the at least one virtual target point, defining a blending zone associated with the virtual target point; and defining a movement path on the basis of the at least one virtual target point and the at least one blending zone.

    2. The method according to claim 1, wherein the movement path does not include the at least one intermediate input target point, with which at least one virtual target point is associated.

    3. The method according to claim 1, wherein each virtual target point is defined between a preceding input target point and a succeeding input target point with respect to the input target point with which the respective virtual target point is associated.

    4. The method according to claim 1, wherein the definition of at least one virtual target point includes, for at least one intermediate input target point, defining a preceding virtual target point and a succeeding virtual target point associated with the intermediate input target point.

    5. The method according to claim 4, wherein for each intermediate input target point with which a preceding virtual target point and a succeeding virtual target point are associated, the preceding virtual target point is defined by a preceding virtual target vector from the input target point and the succeeding virtual target point is defined by a succeeding virtual target vector from the input target point, inverse to the preceding virtual target vector.

    6. The method according to claim 1, wherein the definition of at least one virtual target point includes, for at least two intermediate input target points, defining a preceding virtual target point and a succeeding virtual target point associated with the intermediate input target point.

    7. The method according to claim 6, wherein for each intermediate input target point with which a preceding virtual target point and a succeeding virtual target point are associated, the preceding virtual target point is defined by a preceding virtual target vector from the input target point and the succeeding virtual target point is defined by a succeeding virtual target vector from the input target point, inverse to the preceding virtual target vector.

    8. The method according to claim 7, wherein a sum of a length of a projection of the preceding virtual target vector from a succeeding input target point on a straight line between the succeeding input target point and a preceding input target point, and a length of a projection of the succeeding virtual target vector, from the preceding input target point on the straight line is equal to or less than a length of the straight line.

    9. The method according to claim 6, wherein the virtual target points are defined such that a sum of each distance between each pair of a succeeding virtual target point of a preceding input target point and a preceding virtual target point of a succeeding input target point is minimized.

    10. The method according to claim 6, wherein an inclination of an intermediate vector, between a succeeding virtual target point associated with a preceding input target point and a preceding virtual target point associated with a succeeding input target point, lies between an inclination of a succeeding virtual target vector between the preceding input target point and the succeeding virtual target point, and an inclination of a preceding virtual target vector between the preceding virtual target point and the succeeding input target point.

    11. The method according to claim 6, wherein a succeeding virtual target point associated with a preceding input target point and a preceding virtual target point associated with a succeeding input target point are replaced by a single virtual target point if a distance between the succeeding virtual target point and the preceding virtual target point is below a threshold value.

    12. The method according to claim 4, wherein the preceding virtual target point is defined between a preceding input target point and the input target point with which the respective virtual target point is associated, and wherein the succeeding virtual target point is defined between a succeeding input target point and the input target point with which the respective virtual target point is associated.

    13. The method according to claim 4, further comprising for each preceding virtual target point limiting a distance between the preceding virtual target point, and a straight line between a preceding input target point and the input target point with which the preceding virtual target point is associated.

    14. The method according to claim 4, wherein a maximum distance between an input target point and a preceding virtual target point associated with the input target point is limited based on a distance between the input target point and a preceding input target point.

    15. The method according to claim 1, wherein the blending zone associated with one or more of the at least one virtual target point is asymmetric.

    16. The method according to claim 1, wherein for two or more of the at least one intermediate input target point, at least one virtual target point associated with the intermediate input target point is defined, wherein a blending zone is associated with each of two consecutive virtual target points of the at least two virtual target points, and wherein a distance between the blending zones associated with the two consecutive virtual target points is less than 25% of a distance between the two consecutive virtual target points.

    17. The method according to claim 1, wherein the industrial actuator is an industrial robot.

    18. A control system for controlling an industrial actuator, the control system including a data processing device and a memory having a computer program stored thereon, the computer program comprising program code which, when executed by the data processing device, causes the data processing device to perform the steps of: providing a plurality of consecutive input target points, of which at least one is an intermediate input target point; for one or more of the at least one intermediate input target point, defining at least one virtual target point associated with the intermediate input target point; for one or more of the at least one virtual target point, defining a blending zone associated with the virtual target point; and defining a movement path on the basis of the at least one virtual target point and the at least one blending zone.

    19. An actuator system comprising an industrial actuator and a control system for controlling the industrial actuator; the control system including a data processing device and a memory having a computer program stored thereon, the computer program comprising program code which, when executed by the data processing device, causes the data processing device to perform the steps of: providing a plurality of consecutive input target points, of which at least one is an intermediate input target point; for one or more of the at least one intermediate input target point, defining at least one virtual target point associated with the intermediate input target point; for one or more of the at least one virtual target point, defining a blending zone associated with the virtual target point; and defining a movement path on the basis of the at least one virtual target point and the at least one blending zone.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0054] Further details, advantages and aspects of the present disclosure will become apparent from the following embodiments taken in conjunction with the drawings, wherein:

    [0055] FIG. 1: schematically represents an actuator system comprising an industrial robot and a control system;

    [0056] FIG. 2: schematically represents a plurality of input target points according to the prior art;

    [0057] FIG. 3: schematically represents a movement path defined on the basis of blending zones associated with the input target points according to the prior art;

    [0058] FIG. 4: schematically represents an alternative movement path comprising alternative blending zones associated with the input target points according to the prior art;

    [0059] FIG. 5: schematically represents the input target points and examples of virtual target points;

    [0060] FIG. 6: schematically represents a movement path;

    [0061] FIG. 7: schematically represents blending zones associated with the virtual target points;

    [0062] FIG. 8: schematically represents alternative blending zones associated with the virtual target points;

    [0063] FIG. 9: schematically represents one example of a limitation of the virtual target points;

    [0064] FIG. 10: schematically represents a further example of a limitation of the virtual target points;

    [0065] FIG. 11: schematically represents a further example of virtual target points;

    [0066] FIG. 12: schematically represents a further example of virtual target points;

    [0067] FIG. 13: schematically represents an intermediate vector between two virtual target points; and

    [0068] FIG. 14: schematically represents an intermediate vector and a cone formed by two virtual target vectors.

    DETAILED DESCRIPTION

    [0069] In the following, a method of controlling an industrial actuator, a control system for controlling an industrial actuator, and an actuator system comprising a control system and an industrial actuator, will be described. The same or similar reference numerals will be used to denote the same or similar structural features.

    [0070] FIG. 1 schematically represents an actuator system 10 comprising an industrial actuator, here exemplified as an industrial robot 12, and a control system 14. The industrial robot 12 is exemplified as a seven axis industrial robot but the present disclosure is not limited to this type of industrial robot or industrial actuator. An industrial robot 12 according to the present disclosure may comprise at least three axes. The control system 14 is here exemplified as a robot controller.

    [0071] The industrial robot 12 of this example comprises a base member 16 and a tool 18. The industrial robot 12 further comprises seven link members 20. Each link member 20 is rotationally or translationally movable at a joint 22.

    [0072] The control system 14 is configured to control the industrial robot 12. The control system 14 comprises a data processing device 24 (e.g. a central processing unit, CPU) and a memory 26. A computer program is stored in the memory 26. The computer program comprises program code which, when executed by the data processing device 24, causes the data processing device 24 to perform the steps, or to command performance of the steps, as described herein.

    [0073] In the example of FIG. 1, the control system 14 is in communication with the industrial robot 12 by means of a signal line 28. The control system 14 may however alternatively be integrated inside the industrial robot 12.

    [0074] FIG. 2 schematically represents a plurality of input target points 30-0, 30-1, 30-2, 30-3 and 30-4 according to the prior art. The input target points 30-0, 30-1, 30-2, 30-3 and 30-4 may alternatively be referred to with reference numeral “30”. The input target points 30 may for example be generated by means of a software tool using the geometry of an application as input. As a further example, the input target points 30 may be manually programmed by means of lead through programming.

    [0075] In FIG. 2, the input target point 30-0 is a starting input target point, the input target point 30-4 is an end input target point and each of the input target points 30-1, 30-2 and 30-3 is an intermediate input target point. The input target points 30 are here illustrated in a single plane. However, the input target points 30 do not have to lie in a single plane. The input target points are used as input for creation of a movement path for the industrial robot 12.

    [0076] The input target points 30 are interconnected by a plurality of movement segments 32-1, 32-2, 32-3 and 32-4. The movement segments 32-1, 32-2, 32-3 and 32-4 may alternatively be referred to with reference numeral “32”. Each movement segment 32 is defined between two input target points 30 such that each intermediate input target point 30-1, 30-2 and 30-3 is between two associated movement segments 32. The movement segments 32 of this example are linear interpolations between the two respective input target points 30.
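    The linear interpolation between two input target points can be sketched as follows. This is an illustrative sketch, not part of the patent; the function name and parameterization are assumptions.

```python
import numpy as np

def interpolate_segment(p_start, p_end, s):
    """Linearly interpolate along a movement segment between two
    input target points; s = 0 gives p_start and s = 1 gives p_end."""
    p_start = np.asarray(p_start, dtype=float)
    p_end = np.asarray(p_end, dtype=float)
    return (1.0 - s) * p_start + s * p_end
```

    For example, interpolate_segment([0, 0], [2, 4], 0.5) yields the segment midpoint [1, 2].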

    [0077] FIG. 3 schematically represents a movement path 34 defined on the basis of the movement segments 32 and blending zones 36-1, 36-2 and 36-3 associated with the input target points 30 in FIG. 2. The blending zones 36-1, 36-2 and 36-3 may alternatively be referred to with reference numeral “36”. The movement path 34 defined on the basis of the blending zones 36 according to FIG. 3 also belongs to the prior art. The movement path 34 in FIG. 3 is two-dimensional but may alternatively be three-dimensional.

    [0078] The blending zone 36-1 is associated with the intermediate input target point 30-1, the blending zone 36-2 is associated with the intermediate input target point 30-2, and the blending zone 36-3 is associated with the intermediate input target point 30-3. Each blending zone 36 may be either two-dimensional or three-dimensional depending on the characteristics of the associated movement segments 32. The blending zones 36 in FIG. 3 are symmetric, i.e. circles or spheres.

    [0079] A fine point (not illustrated) is associated with each of the starting input target point 30-0 and the end input target point 30-4. Fine points may alternatively be referred to as zero zones. Fine points are one type of stop points, meaning that the industrial robot 12 makes a full stop at these points. A stop point means that the industrial robot 12 must reach the specified position (stand still) before program execution continues with the next instruction.

    [0080] During execution of the movement path 34 by the industrial robot 12 along a movement segment 32, when entering a blending zone 36, the movement path 34 will start to approach the succeeding movement segment 32. When leaving the blending zone 36, the movement path 34 will be along the succeeding movement segment 32. Thus, the industrial robot 12 (e.g. the TCP of the tool 18 thereof) will travel from the starting input target point 30-0 and along the movement segment 32-1 until the blending zone 36-1 is reached.

    [0081] Within the blending zone 36-1, the movement segments 32-1 and 32-2 will be executed simultaneously (i.e. blended). When the industrial robot 12 leaves the blending zone 36-1, the industrial robot 12 will travel along the movement segment 32-2 until the blending zone 36-2 is reached. Within the blending zone 36-2, the movement segments 32-2 and 32-3 will be executed simultaneously. When the industrial robot 12 leaves the blending zone 36-2, the industrial robot 12 will travel along the movement segment 32-3 until the blending zone 36-3 is reached. Within the blending zone 36-3, the movement segments 32-3 and 32-4 will be executed simultaneously. When the industrial robot 12 leaves the blending zone 36-3, the industrial robot 12 will travel along the movement segment 32-4 until the end input target point 30-4 is reached.
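    The simultaneous execution (blending) of two movement segments inside a blending zone, as described above, can be illustrated with a simple cross-fade between the incoming and outgoing segments. This is a minimal sketch under the assumptions of a symmetric zone and a linear blend weight; the actual blending law of a robot controller is typically more sophisticated, and all names are illustrative.

```python
import numpy as np

def blend_segments(corner, p_prev, p_next, zone, n=50):
    """Cross-fade sketch of blending inside a symmetric blending zone.

    The path enters the zone at distance `zone` before the fly-by
    corner point and leaves it at distance `zone` after; the corner
    itself is not attained.
    """
    corner = np.asarray(corner, dtype=float)
    d_in = corner - np.asarray(p_prev, dtype=float)
    d_out = np.asarray(p_next, dtype=float) - corner
    a = corner - zone * d_in / np.linalg.norm(d_in)    # zone entry point
    b = corner + zone * d_out / np.linalg.norm(d_out)  # zone exit point
    w = np.linspace(0.0, 1.0, n)[:, None]
    on_seg1 = a + w * (corner - a)       # continuing along segment 1
    on_seg2 = corner + w * (b - corner)  # progressing along segment 2
    # Linear cross-fade: both segments are "executed" at once in the zone
    return (1.0 - w) * on_seg1 + w * on_seg2
```

    The returned path starts exactly at the zone entry, ends exactly at the zone exit, and passes near, but not through, the corner point.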

    [0082] In the example in FIG. 2, the intermediate input target points 30-1, 30-2 and 30-3 are fly-by points, meaning that these points are not attained when executing the movement path 34 by the industrial robot 12. Instead, the direction of motion is changed before any of the intermediate input target points 30-1, 30-2 and 30-3 is reached.

    [0083] The smoothness of the resulting movement path 34 is limited by the distance between the input target points 30 and the sizes of the blending zones 36. As shown in FIG. 3, the movement path 34 does not pass through the intermediate input target points 30-1, 30-2, 30-3. The movement path 34 is somewhat smooth, but not very accurate, since the distances between the movement path 34 and the intermediate input target points 30-1, 30-2, 30-3 are quite large. With the approach in FIG. 3, the movement path 34 is thus guaranteed not to pass through the intermediate input target points 30 (except for an intermediate input target point positioned on a straight line between two input target points).

    [0084] FIG. 4 schematically represents an alternative movement path 38 defined on the basis of the movement segments 32 and blending zones 40-1, 40-2 and 40-3 associated with the input target points 30 in FIG. 2. The blending zones 40-1, 40-2 and 40-3 may alternatively be referred to with reference numeral “40”. The movement path 38 defined on the basis of the blending zones 40 according to FIG. 4 also belongs to the prior art. Mainly differences with respect to FIG. 3 will be described.

    [0085] In FIG. 4, the sizes of the blending zones 40 are reduced to reduce deviations between the movement path 38 and the input target points 30. However, the movement path 38 still does not pass through the intermediate input target points 30.

    [0086] The small blending zones 40 in FIG. 4 increase accelerations along the movement path 38, resulting in increased wear and tear on the industrial robot 12. The small blending zones 40 also cause speed reductions, which for example decrease processing quality in a processing operation. Increased accelerations occur because the industrial robot 12 needs to change moving directions in the blending zone 40. If the size of the blending zone 40 is small, the movement change needs to be more abrupt.

    [0087] If a dynamic optimization of a trajectory along the movement path 38 is performed to get the shortest cycle time, and there are limitations on the acceleration, torque or other acceleration dependent parameters, a higher acceleration could cause a speed reduction in the blending zones 40. The speed reduction increases the cycle time and reduces the quality for applications requiring constant speed.

    [0088] As shown in FIG. 4, the movement path 38 is more accurate in comparison with the movement path 34 in FIG. 3. That is, the distances between the movement path 38 and the intermediate input target points 30-1, 30-2, 30-3 are smaller. However, the movement path 38 is not smooth since the blending zones 40 are rather small. Thus, there are quite long distances between neighboring blending zones 40 where the movement path 38 has to follow the movement segments 32.

    [0089] Thus, by making the blending zones larger, smoothness of the movement path is increased at the cost of accuracy of the movement path. By making the blending zones smaller, accuracy of the movement path is increased at the cost of smoothness of the movement path.

    [0090] FIG. 5 schematically represents the input target points 30 and examples of virtual target points 42-1,1, 42-2,1, 42-1,2, 42-2,2, 42-1,3 and 42-2,3 according to the present disclosure. The virtual target points 42-1,1, 42-2,1, 42-1,2, 42-2,2, 42-1,3 and 42-2,3 may alternatively be referred to with reference numeral “42”. Also in FIG. 5, the input target points 30 are used as input for a movement path. However, instead of interpolating movement segments between the input target points 30, the virtual target points 42 are defined.

    [0091] In FIG. 5, the input target points 30 are illustrated as interconnected by a plurality of straight lines 44-1, 44-2, 44-3 and 44-4. The straight lines 44-1, 44-2, 44-3 and 44-4 may alternatively be referred to with reference numeral “44”. Each straight line 44 is defined between two input target points 30 such that each intermediate input target point 30-1, 30-2 and 30-3 is between two straight lines 44. The straight lines 44 may alternatively be referred to as virtual movement segments. In some examples, the straight lines 44 are used to define the virtual target points 42 and/or blending zones. In some examples, the straight lines 44 are not needed.

    [0092] The virtual target points 42-1,1 and 42-2,1 are associated with the input target point 30-1, the virtual target points 42-1,2 and 42-2,2 are associated with the input target point 30-2, and the virtual target points 42-1,3 and 42-2,3 are associated with the input target point 30-3. The virtual target points 42-1,1 and 42-2,1 lie between the input target points 30-0 and 30-2, the virtual target points 42-1,2 and 42-2,2 lie between the input target points 30-1 and 30-3, and the virtual target points 42-1,3 and 42-2,3 lie between the input target points 30-2 and 30-4.

    [0093] The virtual target points 42-1,1, 42-1,2 and 42-1,3 are preceding virtual target points to the input target points 30-1, 30-2 and 30-3, respectively. The virtual target points 42-2,1, 42-2,2 and 42-2,3 are succeeding virtual target points to the input target points 30-1, 30-2 and 30-3, respectively.

    [0094] The preceding virtual target point 42-1,1 is defined between the input target points 30-0 and 30-1, the preceding virtual target point 42-1,2 is defined between the input target points 30-1 and 30-2, and the preceding virtual target point 42-1,3 is defined between the input target points 30-2 and 30-3. The succeeding virtual target point 42-2,1 is defined between the input target points 30-1 and 30-2, the succeeding virtual target point 42-2,2 is defined between the input target points 30-2 and 30-3, and the succeeding virtual target point 42-2,3 is defined between the input target points 30-3 and 30-4.

    [0095] To this end, a maximum distance between the input target point 30-1 and the preceding virtual target point 42-1,1 may be limited to not exceed a length of the straight line 44-1, and a maximum distance between the input target point 30-1 and the succeeding virtual target point 42-2,1 may be limited to not exceed a length of the straight line 44-2. A maximum distance between the input target point 30-2 and the preceding virtual target point 42-1,2 may be limited to not exceed a length of the straight line 44-2, and a maximum distance between the input target point 30-2 and the succeeding virtual target point 42-2,2 may be limited to not exceed a length of the straight line 44-3. A maximum distance between the input target point 30-3 and the preceding virtual target point 42-1,3 may be limited to not exceed a length of the straight line 44-3, and a maximum distance between the input target point 30-3 and the succeeding virtual target point 42-2,3 may be limited to not exceed a length of the straight line 44-4.
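    The distance limitation described above can be sketched as a simple clamp on the length of a virtual target vector; the function name and interface are assumptions for illustration.

```python
import numpy as np

def clamp_virtual_vector(v, max_len):
    """Scale the virtual target vector down if it would place the
    virtual target point farther from the input target point than
    the length of the adjacent straight line (`max_len`)."""
    v = np.asarray(v, dtype=float)
    norm = np.linalg.norm(v)
    return v if norm <= max_len else v * (max_len / norm)
```

    For example, a vector of length 5 clamped to a maximum length of 2.5 is halved, while a vector already within the limit is returned unchanged.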

    [0096] FIG. 5 further shows a plurality of virtual target vectors 46-1,1, 46-2,1, 46-1,2, 46-2,2, 46-1,3 and 46-2,3. The virtual target vectors 46-1,1, 46-2,1, 46-1,2, 46-2,2, 46-1,3 and 46-2,3 may alternatively be referred to with reference numeral “46”.

    [0097] The preceding virtual target point 42-1,1 is defined by a preceding virtual target vector 46-1,1 from the input target point 30-1, the succeeding virtual target point 42-2,1 is defined by a succeeding virtual target vector 46-2,1 from the input target point 30-1, the preceding virtual target point 42-1,2 is defined by a preceding virtual target vector 46-1,2 from the input target point 30-2, the succeeding virtual target point 42-2,2 is defined by a succeeding virtual target vector 46-2,2 from the input target point 30-2, and the preceding virtual target point 42-1,3 is defined by a preceding virtual target vector 46-1,3 from the input target point 30-3, and the succeeding virtual target point 42-2,3 is defined by a succeeding virtual target vector 46-2,3 from the input target point 30-3. The preceding virtual target vector 46-1,1 is inverse to the succeeding virtual target vector 46-2,1, the preceding virtual target vector 46-1,2 is inverse to the succeeding virtual target vector 46-2,2, and the preceding virtual target vector 46-1,3 is inverse to the succeeding virtual target vector 46-2,3.
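    The definition of the two virtual target points from one virtual target vector and its inverse can be sketched as follows; the sign convention (which of the two points is the preceding one) depends on the chosen direction of the vector, and the names are illustrative.

```python
import numpy as np

def virtual_points(p, v):
    """Given an intermediate input target point p and its virtual
    target vector v, return the pair of virtual target points
    (p + v, p - v); the two defining vectors are inverse to each
    other, so p is the midpoint of the pair."""
    p = np.asarray(p, dtype=float)
    v = np.asarray(v, dtype=float)
    return p + v, p - v
```
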

    [0098] A sum of a length of a projection of the succeeding virtual target vector 46-2,1 on the straight line 44-2 and a projection of the preceding virtual target vector 46-1,2 on the straight line 44-2 is less than a length of the straight line 44-2. A sum of a length of a projection of the succeeding virtual target vector 46-2,2 on the straight line 44-3 and a projection of the preceding virtual target vector 46-1,3 on the straight line 44-3 is equal to the length of the straight line 44-3.
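    The projection condition above (compare claim 8) can be checked numerically with dot products. The sketch below is an illustrative simplification of the condition; the function name and interface are assumptions.

```python
import numpy as np

def projections_fit(p_prev, p_next, v_succ, v_prec):
    """Check that the projections of the succeeding virtual target
    vector of the preceding point and the preceding virtual target
    vector of the succeeding point, onto the straight line between
    the two input target points, together do not exceed the length
    of that line."""
    line = np.asarray(p_next, dtype=float) - np.asarray(p_prev, dtype=float)
    length = np.linalg.norm(line)
    u = line / length  # unit vector along the straight line
    proj_sum = abs(np.dot(v_succ, u)) + abs(np.dot(v_prec, u))
    return proj_sum <= length
```

    When the condition holds, the two virtual target points cannot "overshoot" each other along the straight line.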

    [0099] The succeeding virtual target point 42-2,1 is defined between the input target point 30-1 and the preceding virtual target point 42-1,2, or at the preceding virtual target point 42-1,2, and the preceding virtual target point 42-1,2 is defined between the succeeding virtual target point 42-2,1 and the input target point 30-2, or at the succeeding virtual target point 42-2,1. The succeeding virtual target point 42-2,2 is defined between the input target point 30-2 and the preceding virtual target point 42-1,3, or at the preceding virtual target point 42-1,3 (which is the case in FIG. 5), and the preceding virtual target point 42-1,3 is defined between the succeeding virtual target point 42-2,2 and the input target point 30-3, or at the succeeding virtual target point 42-2,2.

    [0100] The method may employ an algorithm where the input target points 30 are input to the algorithm. Based on the input target points 30, the algorithm may define the at least one virtual target point 42.

    [0101] In the following, one example of an algorithm for the method will be described. The algorithm may be implemented in the computer program in the control system 14. The algorithm uses the input target points 30 as input. A first step 1.1 of the algorithm of this example may be formulated as:


    Providing a plurality of input target points p_i, i = 0, . . . , N, where N is a positive natural number of at least 2  (1.1)

    [0102] A subsequent step 1.2 of the algorithm of this example may be formulated as:


    For each input target point p_i, i = 1, . . . , N−1, introduce two virtual target points p_v1,i = p_i + v_i and p_v2,i = p_i − v_i, where v_i are the virtual target vectors 46.  (1.2)

    [0103] A subsequent step 1.3 of the algorithm of this example may be formulated as:


    Find v_i in step 1.2 such that Σ_{i=1}^{N−2} ‖p_v2,i − p_v1,i+1‖² is minimized  (1.3)

    [0104] In this way, a sum of distances between adjacent virtual target points 42 associated with different input target points 30 can be minimized. With reference to FIG. 5, the sum of a distance between the succeeding virtual target point 42-2,1 and the preceding virtual target point 42-1,2 and a distance between the succeeding virtual target point 42-2,2 and the preceding virtual target point 42-1,3 is minimized.

    [0105] In this example, the sum in step 1.3 is the objective function of the optimization problem, and the virtual target vectors v_i are its optimization variables. A parametrization of v_i may be used to solve the optimization problem.

    [0106] In order to understand step 1.3, one may think of an analogy in which rubber bands are positioned around pairs of adjacent virtual target points 42, pulling them together. For example, one rubber band may be thought of as pulling the virtual target points 42-2,1 and 42-1,2 together, and another as pulling the virtual target points 42-2,2 and 42-1,3 together. Further rubber bands may be thought of as pulling the input target point 30-0 and the virtual target point 42-1,1 together, and the input target point 30-4 and the virtual target point 42-2,3 together. The objective is then to minimize the tension in the rubber bands.

    [0107] Step 1.3 constitutes one example of defining a plurality of virtual target points 42 such as to reduce a deviation between the industrial robot 12 and the intermediate input target points 30-1, 30-2 and 30-3 when the industrial robot 12 executes a movement path.
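    Because the objective in step 1.3 is quadratic in the virtual target vectors, steps 1.2 and 1.3 can be sketched as a linear least-squares problem. The sketch below additionally includes the two boundary "rubber bands" to the starting and end input target points described in the analogy of paragraph [0106]; this inclusion, the omission of the distance limitations, and all names are assumptions for illustration.

```python
import numpy as np

def solve_virtual_vectors(points):
    """Least-squares sketch of steps 1.2-1.3.  `points` holds the
    input target points p_0 .. p_N (one per row); returned are the
    virtual target vectors v_i for the intermediate points
    p_1 .. p_{N-1}, minimizing the squared gaps between consecutive
    virtual target points p_v2,i = p_i - v_i and p_v1,i+1 = p_{i+1}
    + v_{i+1}, plus the two boundary terms to p_0 and p_N."""
    p = np.asarray(points, dtype=float)
    n, d = p.shape        # n input target points, dimension d
    m = n - 2             # one virtual target vector per intermediate point
    eye = np.eye(d)
    rows, rhs = [], []
    # Rubber band from p_0 to the preceding virtual point p_1 + v_1
    a = np.zeros((d, m * d)); a[:, :d] = eye
    rows.append(a); rhs.append(p[0] - p[1])
    # Gaps between the succeeding virtual point of p_i and the
    # preceding virtual point of p_{i+1}
    for i in range(1, m):
        a = np.zeros((d, m * d))
        a[:, (i - 1) * d:i * d] = -eye
        a[:, i * d:(i + 1) * d] = -eye
        rows.append(a); rhs.append(p[i + 1] - p[i])
    # Rubber band from the succeeding virtual point of p_{N-1} to p_N
    a = np.zeros((d, m * d)); a[:, (m - 1) * d:] = -eye
    rows.append(a); rhs.append(p[-1] - p[-2])
    A = np.vstack(rows); b = np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x.reshape(m, d)
```

    Because the problem is a plain least-squares system, the global minimum is found directly, without an iterative optimizer.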

    [0108] A subsequent step 1.4 of the algorithm of this example may be formulated as:


    For all virtual target points 42 where ‖p_v2,i − p_v1,i+1‖² < ε², the two virtual target points 42 are replaced by their average, p_v,i  (1.4)

    where ε is a threshold value. The threshold value ε may for example be set based on an average length of the straight lines 44. In FIG. 5, the virtual target points 42-2,2 and 42-1,3 are close to each other and are therefore replaced by a single virtual target point 42-2,2/42-1,3, for example the average of the virtual target points 42-2,2 and 42-1,3. In this way, the number of virtual target points 42 can be reduced. The method can thereby be made less computationally heavy. Furthermore, this avoids two target points lying too close to each other.
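    Step 1.4 can be sketched as a single sweep over an ordered list of virtual target points, merging neighbors closer than the threshold ε into their average. This generic sweep is a simplification made for illustration: step 1.4 merges only the specific pairs (p_v2,i, p_v1,i+1), and the names below are assumptions.

```python
import numpy as np

def merge_close_points(points, eps):
    """Replace consecutive points closer than `eps` by their average,
    reducing the number of virtual target points."""
    merged = [np.asarray(points[0], dtype=float)]
    for q in points[1:]:
        q = np.asarray(q, dtype=float)
        if np.linalg.norm(q - merged[-1]) < eps:
            merged[-1] = (merged[-1] + q) / 2.0  # replace pair by average
        else:
            merged.append(q)
    return merged
```
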

    [0109] In a subsequent step, the algorithm may define a blending zone associated with one or more of the virtual target points 42, such as for each virtual target point 42. The sizes of the blending zones may be maximized such that the entire movement path is covered by blending zones.

    [0110] The result from the algorithm is a movement path defined on the basis of the starting input target point 30-0, the end input target point 30-4, the virtual target points 42, and the blending zones associated with the virtual target points 42. The movement path is defined on the basis of the input target points 30-0 and 30-4 and five virtual target points 42. The movement path does however not comprise the intermediate input target points 30-1, 30-2 and 30-3, with which the virtual target points 42 are associated. Thus, the movement path comprises seven target points, an increase of only two target points over the five input target points 30. The movement path is therefore only slightly more computationally heavy than the movement paths 34 and 38.

    [0111] The movement path can then be implemented in a program for the industrial robot 12 and be executed by the industrial robot 12. The algorithm may be executed automatically based on a set of input target points 30 and output the movement path.

    [0112] FIG. 6 schematically represents the resulting movement path 48 generated by the algorithm using the input target points 30 as input. The movement path 48 comprises the starting input target point 30-0, the end input target point 30-4, the virtual target points 42 and a blending zone associated with each virtual target point 42. Fine points are applied to the starting input target point 30-0 and to the end input target point 30-4. The blending zones may be defined in various ways.

    [0113] As shown in FIG. 6, the method enables the movement path 48 to pass through each intermediate input target point 30-1, 30-2 and 30-3. The virtual target points 42 further enable maximum smoothness of the movement path 48 to be achieved. The movement path 48 in FIG. 6 is both smoother than the movement path 34 in FIG. 3 and more accurate than the movement path 38 in FIG. 4. The smoothness of the movement path 48 enables a high speed trajectory along the movement path 48.

    [0114] The method can be at least partly implemented in software tools, such as RobotStudio®. In this way, smooth and accurate movement paths 48 can be generated according to the method in a simple manner.

    [0115] FIG. 7 schematically represents one example of blending zones 50-1,1, 50-2,1, 50-1,2, 50-2,2/50-1,3 and 50-2,3 associated with the virtual target points 42.

    [0116] The blending zones 50-1,1, 50-2,1, 50-1,2, 50-2,2/50-1,3 and 50-2,3 may alternatively be referred to with reference numeral “50”.

    [0117] The starting input target point 30-0 and the end input target point 30-4 of this example are fine points. Thus, zone borders are provided at the input target points 30-0 and 30-4. Zone borders are also provided at each intermediate input target point 30-1, 30-2 and 30-3. The zone border at each intermediate input target point 30-1, 30-2 and 30-3 may be defined as a plane perpendicular to the respective virtual target vectors 46. The zone border at the starting input target point 30-0 may be defined as a plane perpendicular to the associated straight line 44-1 and the zone border at the end input target point 30-4 may be defined as a plane perpendicular to the associated straight line 44-4.

    [0118] As shown in FIG. 7, the blending zones 50 are maximized and asymmetric. In this example, each blending zone 50 is defined as a triangle with one line connecting respective zone borders, and two lines connecting a virtual target point 42 with a respective zone border.

    [0119] FIG. 7 further shows a plurality of movement segments 52-1, 52-2, 52-3, 52-4, 52-5 and 52-6. The movement segments 52-1, 52-2, 52-3, 52-4, 52-5 and 52-6 may alternatively be referred to with reference numeral “52”. The method may further comprise defining movement segments 52 between the virtual target points 42 and some of the input target points 30, for example the starting input target point 30-0 and the end input target point 30-4.

    [0120] In this example, each movement segment 52 is a linear interpolation between two associated target points. The movement segment 52-1 connects the input target point 30-0 and the virtual target point 42-1,1, the movement segment 52-2 connects the virtual target points 42-1,1 and 42-2,1, the movement segment 52-3 connects the virtual target points 42-2,1 and 42-1,2, the movement segment 52-4 connects the virtual target points 42-1,2 and 42-2,2/42-1,3, the movement segment 52-5 connects the virtual target points 42-2,2/42-1,3 and 42-2,3, and the movement segment 52-6 connects the virtual target point 42-2,3 and the input target point 30-4.
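The linear interpolation underlying each movement segment 52 can be sketched as follows. This is a minimal illustration in Python, not code from the source; the function and point names are hypothetical, and points are modeled as coordinate tuples.

```python
def lerp(p0, p1, t):
    """Linear interpolation between two target points for t in [0, 1]."""
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

def sample_segment(p0, p1, n):
    """Sample n + 1 evenly spaced points along the movement segment p0 -> p1."""
    return [lerp(p0, p1, k / n) for k in range(n + 1)]

# Example: a segment from an input target point to a virtual target point.
points = sample_segment((0.0, 0.0), (10.0, 5.0), 2)
# points == [(0.0, 0.0), (5.0, 2.5), (10.0, 5.0)]
```

Each movement segment is fully determined by its two end target points; the parameter t plays the role of an interpolation index along the segment.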

    [0121] In FIG. 7, each blending zone 50 is defined independently in relation to the movement segments 52 associated with the blending zone 50. By defining the blending zones 50 independently, i.e. by determining the blending zones 50 expressed independently in each of the two consecutive movement segments 52 associated with the blending zones 50, a flexible definition of the blending zones 50 is provided. Instead of being limited by symmetry, the shapes of the blending zones 50 according to the present disclosure are allowed to vary and to be asymmetric.

    [0122] The blending zone 50-1,1 is defined as a triangle comprising a line between the input target points 30-0 and 30-1, a line between the input target point 30-0 and the virtual target point 42-1,1 (here also the movement segment 52-1), and a line between the input target point 30-1 and the virtual target point 42-1,1. The blending zone 50-2,1 is defined as a triangle comprising a line between the input target point 30-1 and a zone border between (e.g. halfway between) the virtual target points 42-2,1 and 42-1,2, a line between the input target point 30-1 and the virtual target point 42-2,1, and a line between the zone border between the virtual target points 42-2,1 and 42-1,2 and the virtual target point 42-2,1. The blending zone 50-1,2 is defined as a triangle comprising a line between the zone border between the virtual target points 42-2,1 and 42-1,2 and the input target point 30-2, a line between the zone border between the virtual target points 42-2,1 and 42-1,2 and the virtual target point 42-1,2, and a line between the input target point 30-2 and the virtual target point 42-1,2. The blending zone 50-2,2/50-1,3 is defined as a triangle comprising a line between the input target points 30-2 and 30-3, a line between the input target point 30-2 and the virtual target point 42-2,2/42-1,3, and a line between the input target point 30-3 and the virtual target point 42-2,2/42-1,3. The blending zone 50-2,3 is defined as a triangle comprising a line between the input target points 30-3 and 30-4, a line between the input target point 30-3 and the virtual target point 42-2,3, and a line between the input target point 30-4 and the virtual target point 42-2,3 (here also the movement segment 52-6).

    [0123] In this example, each blending zone 50 comprises two zone borders and each zone border is defined in relation to a respective one of the two movement segments 52 associated with the virtual target point 42. Each zone border may for example be defined with a percentage of between 0% and 100% in relation to each of the two consecutive movement segments 52.

    [0124] In FIG. 7, the blending zone 50-1,1 extends from a preceding zone border at 100% of the preceding movement segment 52-1 from the virtual target point 42-1,1 to a succeeding zone border at 50% of the succeeding movement segment 52-2 from the virtual target point 42-1,1. The blending zone 50-2,1 extends from a preceding zone border at 50% of the preceding movement segment 52-2 from the virtual target point 42-2,1 to a succeeding zone border at 50% of the succeeding movement segment 52-3 from the virtual target point 42-2,1. The blending zone 50-1,2 extends from a preceding zone border at 50% of the preceding movement segment 52-3 from the virtual target point 42-1,2 to a succeeding zone border at 50% of the succeeding movement segment 52-4 from the virtual target point 42-1,2. The blending zone 50-2,2/50-1,3 extends from a preceding zone border at 50% of the preceding movement segment 52-4 from the virtual target point 42-2,2/42-1,3 to a succeeding zone border at 50% of the succeeding movement segment 52-5 from the virtual target point 42-2,2/42-1,3. The blending zone 50-2,3 extends from a preceding zone border at 50% of the preceding movement segment 52-5 from the virtual target point 42-2,3 to a succeeding zone border at 50% of the succeeding movement segment 52-6 from the virtual target point 42-2,3.
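The percentage-based placement of zone borders described above can be sketched in Python. The helper names are hypothetical; fractions in [0, 1] play the role of the 0%–100% percentages measured from the virtual target point along each movement segment.

```python
def zone_border(virtual, adjacent, fraction):
    """Point at `fraction` (0..1) of the movement segment, measured from the
    virtual target point toward the adjacent target point."""
    return tuple(v + fraction * (a - v) for v, a in zip(virtual, adjacent))

def blending_zone_borders(prev_pt, virtual, next_pt, f_prev, f_next):
    """Preceding and succeeding zone borders, placed independently of each
    other, so the resulting blending zone may be asymmetric."""
    return (zone_border(virtual, prev_pt, f_prev),
            zone_border(virtual, next_pt, f_next))

# Asymmetric zone: 100% of the preceding segment, 50% of the succeeding one.
borders = blending_zone_borders((0.0, 0.0), (4.0, 2.0), (8.0, 0.0), 1.0, 0.5)
# borders == ((0.0, 0.0), (6.0, 1.0))
```

Because each border is computed from its own segment and its own fraction, the two borders of one zone never constrain each other, which is exactly what allows the asymmetric, maximized zones of FIG. 7.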

    [0125] As shown in FIG. 7, the blending zones 50 cover the entire movement path 48. Thus, a distance between the blending zones 50 associated with two consecutive virtual target points 42 is 0. Blending is consequently allowed along the entire movement path 48 between the input target points 30-0 and 30-4.

    [0126] Alternatively, or in addition, each blending zone 50 may be defined with a factor from 0 to 1 in relation to each of the respective two consecutive movement segments 52. The factor may be constituted by an interpolation index that has the value 0 in the virtual target point 42 associated with the blending zone 50 and the value 1 in each adjacent target point.

    [0127] Each blending zone 50 may be defined with a different percentage or factor in relation to each of the respective two consecutive movement segments 52. In case one or more points of the movement path 48 (in addition to the input target points 30-0 and 30-4) are fine points, at least one blending zone 50 associated with a virtual target point 42 may be defined as 100% of the movement segment 52 between the virtual target point 42 and the fine point. The same blending zone 50 may still be defined independently in relation to the other movement segment 52 associated with the blending zone 50.

    [0128] The defined movement path 48 is the same regardless of speeds and accelerations of the industrial robot 12 along the movement path 48. The geometry of the movement path 48 is defined independently of the dynamics of the industrial robot 12. A dynamic coupling, e.g. speeds and accelerations of the industrial robot 12 along the movement path 48, may be generated in a further step to define a movement trajectory. The movement path 48 within the blending zones 50 may however be blended in various ways. Instead of curves, the movement path 48 may for example adopt various polynomial shapes within the blending zones 50. The movement path 48 within each blending zone 50 may be referred to as a corner path.
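The source leaves the exact blend shape within a zone open (“various polynomial shapes”). One common choice, shown here purely as an assumed example and not as the patented method, is a quadratic Bézier corner path whose endpoints are the two zone borders and whose control point is the virtual target point:

```python
def bezier2(border0, ctrl, border1, t):
    """Quadratic Bezier corner path: starts at border0, ends at border1, and
    is pulled toward the control point (here the virtual target point)."""
    u = 1.0 - t
    return tuple(u * u * a + 2.0 * u * t * c + t * t * b
                 for a, c, b in zip(border0, ctrl, border1))

# Sample the corner path through one blending zone.
corner_path = [bezier2((0.0, 0.0), (4.0, 2.0), (8.0, 0.0), k / 4) for k in range(5)]
# At t = 0.5 the path reaches (4.0, 1.0): it flies by the control point
# rather than passing through it.
```

A Bézier corner path is tangent to the two movement segments at the zone borders, which keeps the blended path geometry independent of the robot dynamics, consistent with the path/trajectory separation described above.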

    [0129] Due to the blending zones 50, the industrial robot 12 is allowed to fly-by the virtual target points 42. The movement path 48 is thereby made more smooth and acceleration and deceleration phases along the movement path 48 can be reduced or eliminated. As a consequence, the speed of the industrial robot 12 can be increased and the wear on mechanical components of the industrial robot 12 can be reduced.

    [0130] In this example, the blending zones 50 are positioning blending zones 50, i.e. for positioning the tool 18. Additional orientation blending zones may be defined for orientation of the tool 18. Alternatively, the positioning blending zones 50 may be used also for orientation of the tool 18.

    [0131] FIG. 8 schematically represents a further example of blending zones 50-1,1, 50-2,1, 50-1,2, 50-2,2/50-1,3 and 50-2,3 associated with the virtual target points 42. Mainly differences with respect to FIG. 7 will be described.

    [0132] In FIG. 8, each blending zone 50 is a circle (or sphere in case of a three-dimensional movement path 48). For each blending zone 50, the circle is centered at the associated virtual target point 42.

    [0133] The blending zone 50-1,1 is a partial circle centered at the virtual target point 42-1,1. The radius of the blending zone 50-1,1 corresponds to the distance between the input target point 30-0 and the virtual target point 42-1,1. The blending zone 50-1,1 is limited by a preceding zone border at the input target point 30-1.

    [0134] The blending zone 50-2,1 is a full circle centered at the virtual target point 42-2,1. The radius of the blending zone 50-2,1 corresponds to the distance between the virtual target point 42-2,1 and the input target point 30-1.

    [0135] The blending zone 50-1,2 is a partial circle centered at the virtual target point 42-1,2. The radius of the blending zone 50-1,2 corresponds to the distance between the virtual target point 42-1,2 and the input target point 30-2. The blending zone 50-1,2 is limited by the blending zone 50-2,1.

    [0136] The blending zone 50-2,2/50-1,3 is a partial circle centered at the virtual target point 42-2,2/42-1,3. The radius of the blending zone 50-2,2/50-1,3 corresponds to the distance between the virtual target point 42-2,2/42-1,3 and the input target point 30-2. The blending zone 50-2,2/50-1,3 is limited by a zone border at the input target point 30-3.

    [0137] The blending zone 50-2,3 is a partial circle centered at the virtual target point 42-2,3. The radius of the blending zone 50-2,3 corresponds to the distance between the virtual target point 42-2,3 and the input target point 30-3. The blending zone 50-2,3 is limited by a zone border at the input target point 30-4.

    [0138] Also in FIG. 8, the blending zones 50 are maximized and some of the blending zones 50 (all except blending zone 50-2,1) are asymmetric.
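A circle-based zone of the FIG. 8 kind reduces to a center and a radius. A minimal 2D sketch (hypothetical names, not source code) following the rule that the radius corresponds to the distance from the virtual target point to the adjacent input target point:

```python
import math

def circular_zone(virtual, adjacent_input):
    """Return (center, radius) of a circular blending zone centered at the
    virtual target point and reaching the adjacent input target point."""
    radius = math.hypot(*(v - a for v, a in zip(virtual, adjacent_input)))
    return virtual, radius

center, radius = circular_zone((3.0, 4.0), (0.0, 0.0))
# center == (3.0, 4.0), radius == 5.0
```

In three dimensions the same pair describes a sphere; the partial circles of FIG. 8 would additionally carry the limiting zone borders described above.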

    [0139] The orientation o.sub.1,i of the tool 18 in the virtual target points 42 can be computed from the orientations in the input target points 30 using a slerp (spherical linear interpolation) with

    [00001] o.sub.1,i=slerp(o.sub.i−1,o.sub.i,L.sub.1,i), where L.sub.1,i=|υ.sub.i.sup.T(p.sub.i−1−p.sub.i)|/∥p.sub.i−1−p.sub.i∥

    [0140] o.sub.2,i can be computed in a similar way. o.sub.1,i and o.sub.2,i are unit quaternions representing the orientation of the tool 18 in a normalized 4-element data vector. Using this approach and linear interpolation between the virtual target points 42, both the position and the orientation of the tool 18 in the input target points 30 will be correct in the movement path 48. Other types of interpolation schemes can of course be used to interpolate the orientation of the tool 18.
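A plain-Python sketch of this orientation interpolation, with quaternions modeled as (w, x, y, z) tuples. The function names are illustrative; interp_param computes the interpolation parameter L.sub.1,i from the virtual target vector and the two positions, as in the formula above.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # take the shorter arc
        q1, dot = tuple(-b for b in q1), -dot
    theta = math.acos(min(dot, 1.0))
    if theta < 1e-9:                   # nearly parallel: linear blend suffices
        return tuple(a + t * (b - a) for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def interp_param(v_i, p_prev, p_i):
    """L = |v_i^T (p_prev - p_i)| / ||p_prev - p_i||."""
    d = tuple(a - b for a, b in zip(p_prev, p_i))
    return abs(sum(v * x for v, x in zip(v_i, d))) / math.hypot(*d)
```

For example, slerping halfway between the identity and a 90° rotation about z yields the 45° rotation, so the tool orientation varies evenly along the movement path.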

    [0141] FIG. 9 schematically represents one example of a limitation of the virtual target points 42. FIG. 9 further shows a plurality of distances 54-1,1, 54-2,1, 54-1,2, 54-2,2/54-1,3 and 54-2,3. The distances 54-1,1, 54-2,1, 54-1,2, 54-2,2/54-1,3 and 54-2,3 may alternatively be referred to with reference numeral “54”.

    [0142] The algorithm can be extended with additional constraints. An additional constraint 2.1 of the algorithm of this example may be formulated as:


    The shortest distance 54 from the virtual target points p.sub.v1,i with index i to the straight line l.sub.i connecting the input target points p.sub.i−1 and p.sub.i is ≤ε.sub.tol, and the shortest distance 54 from the virtual target points p.sub.v2,i with index i to the straight line l.sub.i+1 connecting the input target points p.sub.i and p.sub.i+1 is ≤ε.sub.tol  (2.1)

    [0143] ε.sub.tol may for example be set to 1 mm. As shown in FIG. 9, distances between the respective preceding virtual target points 42-1,1, 42-1,2 and 42-2,2/42-1,3 and the respective straight lines 44-1, 44-2 and 44-3 are limited by the respective distances 54-1,1, 54-1,2 and 54-2,2/54-1,3. Furthermore, distances between the respective succeeding virtual target points 42-2,1, 42-2,2/42-1,3 and 42-2,3 and the respective straight lines 44-2, 44-3 and 44-4 are limited by the respective distances 54-2,1, 54-2,2/54-1,3 and 54-2,3.
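Constraint 2.1 is a point-to-line distance test. A 2D sketch under assumed names (a 3D version would instead divide the norm of the vector cross product by the base length):

```python
import math

def point_line_distance(p, a, b):
    """Shortest distance from point p to the infinite line through a and b (2D)."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    apx, apy = p[0] - a[0], p[1] - a[1]
    # |cross product| equals twice the triangle area; divide by the base length.
    return abs(abx * apy - aby * apx) / math.hypot(abx, aby)

def satisfies_constraint_2_1(virtual_pt, p_prev, p_i, eps_tol=1.0):
    """Constraint 2.1: the virtual target point must stay within eps_tol
    (e.g. 1 mm) of the straight line between the input target points."""
    return point_line_distance(virtual_pt, p_prev, p_i) <= eps_tol
```

A virtual target point 0.5 mm off the line between two input target points passes with eps_tol = 1 mm, while one 2 mm off is rejected.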

    [0144] FIG. 10 schematically represents a further example of a limitation of the virtual target points 42. In FIG. 10, ε.sub.tol in constraint 2.1 is reduced in comparison with ε.sub.tol in FIG. 9. As a consequence, the virtual target points 42 are moved closer to their respectively associated input target point 30 and the deviations of the movement path 48 from the straight lines 44 between respective input target points 30 will be made smaller. In this way, the movement path 48 can be restricted to a certain degree of conformity with a linearly interpolated movement path between the input target points 30. However, the movement path 48 will be less smooth if ε.sub.tol is selected to a too low value.

    [0145] FIG. 11 schematically represents a further example of virtual target points 42. The positions of the virtual target points 42-2,2 and 42-1,3 between the input target points 30-2 and 30-3 are close, but in this example not close enough to be replaced by a single virtual target point according to step 1.4.

    [0146] An additional constraint 2.2 of the algorithm of this example may be formulated as:


    ∥υ.sub.i∥.sup.2≤min{∥p.sub.i−p.sub.i−1∥.sup.2,∥p.sub.i+1−p.sub.i∥.sup.2}/κ.sup.2,κ≥1  (2.2)

    [0147] In this way, the positions of the virtual target points 42 are limited in relation to the distances between the input target points 30. κ thus determines how large a part of a distance between two input target points 30 can be utilized for positioning the virtual target points 42. In FIG. 11, κ is for example set to 1.
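Constraint 2.2 can be checked directly. An illustrative sketch with hypothetical names, comparing squared lengths so that no square roots are needed:

```python
def satisfies_constraint_2_2(v_i, p_prev, p_i, p_next, kappa=1.0):
    """Constraint 2.2:
    ||v_i||^2 <= min(||p_i - p_prev||^2, ||p_next - p_i||^2) / kappa^2,
    with kappa >= 1 limiting how large a part of an inter-point distance
    the virtual target vector may use."""
    sq_dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    v_sq = sum(a * a for a in v_i)
    return v_sq <= min(sq_dist(p_i, p_prev), sq_dist(p_next, p_i)) / kappa ** 2
```

A virtual target vector of length 1 between input target points spaced 2 and 3 apart passes with κ = 1 but fails with κ = 3, which is how increasing κ shortens the virtual target vectors as in FIG. 12.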

    [0148] FIG. 12 schematically represents a further example of virtual target points 42. In FIG. 12, κ in constraint 2.2 is set to 3. As a consequence, the lengths of the virtual target vectors 46 are reduced and the movement path 48 is made more smooth between the input target points 30-2 and 30-3.

    [0149] FIG. 13 schematically represents an intermediate vector 56 between the two virtual target points 42-2,1 and 42-1,2. An additional constraint 2.3 of the algorithm of this example may be formulated as:


    The intermediate vector p.sub.v2,i−p.sub.v1,i+1 between a succeeding virtual target point 42 of a preceding input target point 30 and a preceding virtual target point 42 of a succeeding input target point 30 should lie in a cone defined by υ.sub.i and υ.sub.i+1  (2.3)

    [0150] Constraint 2.3 imposes smoothness. With constraint 2.3, direction changes in the movement path 48 will be smoother, since the virtual target vectors υ.sub.i and υ.sub.i+1 represent the direction (derivative) in the respective input target points p.sub.i and p.sub.i+1. As shown in FIG. 13, the inclination of the intermediate vector 56 lies between the inclinations of the succeeding virtual target vector 46-2,1 and the preceding virtual target vector 46-1,2.

    [0151] FIG. 14 schematically represents the intermediate vector 56 and a cone formed by the virtual target vectors 46-2,1 and 46-1,2. As illustrated in FIG. 14, constraint 2.3 puts a constraint on the intermediate vector 56 connecting the virtual target points 42-2,1 and 42-1,2 such that the intermediate vector 56 lies in a cone spanned by the virtual target vectors 46-2,1 and 46-1,2. By means of the intermediate vector 56 constrained in this way, the movement path 48 can be made even more smooth.

    [0152] Constraint 2.3 can be efficiently expressed as follows: c is inside the cone if:


    c̄.sup.T(ā+b̄)≥ā.sup.T(ā+b̄)

    [0153] This constraint also works in a three-dimensional implementation. The vectors used in the inequality are normalized.
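The cone test of constraint 2.3 can be implemented directly on vectors in two or three dimensions. An illustrative sketch (hypothetical names; a and b stand for the two virtual target vectors spanning the cone, c for the intermediate vector):

```python
import math

def _normalize(v):
    """Unit vector in the direction of v."""
    n = math.hypot(*v)
    return tuple(a / n for a in v)

def in_cone(c, a, b):
    """Constraint 2.3 test: the normalized intermediate vector c lies in the
    cone spanned by a and b if c_bar . (a_bar + b_bar) >= a_bar . (a_bar + b_bar)."""
    a_bar, b_bar, c_bar = _normalize(a), _normalize(b), _normalize(c)
    s = tuple(x + y for x, y in zip(a_bar, b_bar))      # cone bisector direction
    dot = lambda u, w: sum(x * y for x, y in zip(u, w))
    return dot(c_bar, s) >= dot(a_bar, s)
```

Geometrically, ā + b̄ points along the bisector of the cone, and the inequality states that c̄ is at least as close to the bisector as the cone edges are, i.e. c lies between them.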

    [0154] While the present disclosure has been described with reference to exemplary embodiments, it will be appreciated that the present invention is not limited to what has been described above. For example, it will be appreciated that the dimensions of the parts may be varied as needed. Accordingly, it is intended that the present invention may be limited only by the scope of the claims appended hereto.