Control of a robot assembly

11839985 · 2023-12-12

Abstract

A method for the control of a robot assembly having at least one robot. The method includes acquiring pose data from an object arrangement having at least one object, which data has a first time interval; determining modified pose data from the object arrangement, which data has a second time interval that is larger or smaller than the first time interval, or is equal to the first time interval, on the basis of the acquired pose data; and controlling the robot assembly on the basis of said modified pose data.

Claims

1. A method for the control of a robot assembly having at least one robot to perform an operation on at least one object, the method comprising: detecting with a first sensor, first pose data of the at least one object, the first pose data comprising a first time interval; determining first modified pose data of the at least one object on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; controlling the robot assembly on the basis of the first modified pose data; detecting second pose data of the at least one object, the second pose data comprising a third time interval that is chronologically parallel to the detection of the first pose data; determining second modified pose data of the at least one object on the basis of the second detected pose data, the second modified pose data comprising a fourth time interval that is less than or greater than the third time interval, or equal to the third time interval; and controlling the robot assembly on the basis of the second modified pose data; wherein at least one of the detected or modified first pose data and/or at least one of the detected or modified second pose data depend on an orientation of the at least one object; and wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the at least one of the first or second modified pose data in such a manner that a rotation of the at least one object in a space between the first modified pose data or the second modified pose data is minimal.

2. The method of claim 1, wherein detecting the second pose data comprises detecting the second pose data with a second sensor.

3. The method of claim 1, wherein at least one of: the first and second pose data comprise quaternions; or a time derivative of the first and second pose data describes an angular velocity of the at least one object.

4. The method of claim 1, wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data in such a manner that the first and second modified pose data satisfy a function that approximates at least one of the first detected pose data or second detected pose data.

5. The method of claim 4, wherein at least one of: the function comprises piecewise polynomials; the function comprises third order polynomials; or the function is satisfied when the first or second modified pose data runs through the first detected pose data, the second detected pose data, or supporting points between successive ones of at least the first detected pose data or the second detected pose data.

6. The method of claim 5, wherein at least one of: the function comprises prespecified derivatives at supporting points; or the piecewise polynomials have the same derivative at respective transition points of the piecewise polynomials.

7. The method of claim 6, wherein the prespecified derivatives are determined by successive detected pose data.

8. The method of claim 1, wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the at least one of the first or second modified pose data by at least one filtering.

9. The method of claim 1, further comprising: when detected pose data is absent at a predetermined follow-up time, then determining the first modified pose data and the second modified pose data in such a manner that a velocity of the at least one object in a space between the first and second modified pose data is constant.

10. The method of claim 1, further comprising: when detected first or second pose data is absent at a predetermined halt time, then at least one of: stopping control of the robot assembly based on the modified pose data; or issuing a warning.

11. The method of claim 1, wherein at least one of: detecting the first or second pose data of the at least one object comprises detecting the pose data using at least one sensor; or transmitting at least one of the first modified pose data or the second modified pose data to a controller for the purpose of controlling the robot assembly.

12. The method of claim 11, wherein the at least one sensor is at least one of a non-contact sensor or an optical sensor.

13. The method of claim 1, wherein control of the robot assembly based on at least one of the first modified pose data or the second modified pose data comprises the combination of the first or second modified pose data with object-based manipulation instructions.

14. A system for controlling a robot assembly having at least one robot to perform an operation on at least one object, the system comprising means for: detecting first pose data of the at least one object, the first pose data comprising a first time interval; determining first modified pose data of the at least one object on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; controlling the robot assembly on the basis of the first modified pose data; detecting second pose data of the at least one object, the second pose data comprising a third time interval that is chronologically parallel to the detection of the first pose data; determining second modified pose data of the at least one object on the basis of the second detected pose data, the second modified pose data comprising a fourth time interval that is less than or greater than the third time interval, or equal to the third time interval; and controlling the robot assembly on the basis of the second modified pose data; wherein at least one of the detected or modified first pose data and/or at least one of the detected or modified second pose data depend on an orientation of the at least one object; and wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data in such a manner that a rotation of the at least one object in a space between the first modified pose data or the second modified pose data is minimal.

15. An arrangement, comprising: a robot assembly having at least one robot; and a system according to claim 14 for controlling the robot assembly.

16. A computer program product for use in controlling a robot assembly having at least one robot to perform an operation on at least one object, the computer program product having program code stored in a non-transitory computer-readable medium that, when executed by a computer, causes the computer to: detect with a first sensor, first pose data of the at least one object, the first pose data comprising a first time interval; determine first modified pose data of the at least one object on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; control the robot assembly on the basis of the first modified pose data; detect second pose data of the at least one object, the second pose data comprising a third time interval that is chronologically parallel to the detection of the first pose data; determine second modified pose data of the at least one object on the basis of the second detected pose data, the second modified pose data comprising a fourth time interval that is less than or greater than the third time interval, or equal to the third time interval; and control the robot assembly on the basis of the second modified pose data; wherein at least one of the detected or modified first pose data and/or at least one of the detected or modified second pose data depend on an orientation of the at least one object; and wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data in such a manner that a rotation of the at least one object in a space between the first modified pose data or the second modified pose data is minimal.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with a general description of the invention given above, and the detailed description given below, serve to explain the principles of the invention.

(2) FIG. 1 shows an assembly having a robot and a system for controlling the robot according to an embodiment of the present invention.

(3) FIG. 2 shows detected and modified pose data.

(4) FIG. 3 shows a method for the control of the robot according to an embodiment of the present invention.

DETAILED DESCRIPTION

(5) FIG. 1 shows an assembly having a robot 1 and a system for controlling the robot 1 according to an embodiment of the present invention.

(6) The system comprises a robot controller 2 and a detection and processing device 30 communicating therewith, which comprises a camera 31, for detecting pose data of objects 4.

(7) By means of the camera 31, the detection and processing device 30 detects (first) pose data x.sub.s, which describe the three-dimensional Cartesian position and/or orientation of the objects 4—of which only one is shown by way of example in FIG. 1—at specific times t.sub.1, t.sub.5, t.sub.9. These are indicated by solid squares in FIG. 2.

(8) Accordingly, these first pose data x.sub.s comprise a first time interval which is constant in this exemplary embodiment and corresponds to the sampling and/or evaluation rate of the camera 31 and/or detection and processing device 30. The detected pose data describe the orientation of the object 4, optionally after appropriate transformation, in quaternions.
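As an illustration of paragraph (8) — the position-plus-quaternion record and the helper names below are assumptions made here, not taken from the patent — a detected pose sample x.sub.s(t) could be represented as a Cartesian position together with a unit orientation quaternion:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    n = math.sqrt(sum(a * a for a in axis))
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0),) + tuple(a / n * s for a in axis)

def make_pose(position, axis, angle):
    """A detected pose sample x_s(t): Cartesian position plus orientation quaternion."""
    return {"p": tuple(position), "q": quat_from_axis_angle(axis, angle)}

pose = make_pose([0.1, 0.2, 0.3], [0.0, 0.0, 1.0], math.pi / 2)
```

Representing the orientation as a unit quaternion rather than, say, Euler angles is what later makes a minimal-rotation interpolation between samples straightforward.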

(9) The camera 31 detects poses of the object 4 in a step S90 (see FIG. 3). The detection and processing device 30 checks in a step S100 whether a prespecified halt time has already passed since the last detected pose was received from the camera 31.

(10) If this is the case (S100: “Y”)—that is to say, there are no (new) detected pose data for the halt time—control of the robot 1 on the basis of the pose data is stopped and a warning is issued in a step S110.

(11) Otherwise (S100: “N”), the detection and processing device 30 checks, in a step S120, whether a prespecified follow-up time has elapsed since the last detected pose was received from the camera 31.

(12) If this is the case (S120: “Y”)—that is to say, there are no (new) detected pose data for the follow-up time—modified pose data y(t) are determined in a step S130, in a manner described below with reference to FIG. 2, in such a manner that a translational and rotational velocity of the object 4 in the space between modified pose data and/or the time derivative of the determined modified pose data is constant, and the robot 1 is controlled, in step S140, on the basis of this modified pose data until either the halt time has passed or in the meantime a new detected pose has been received from the camera 31 (S90).
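The branching of steps S100 through S150 can be read as a simple supervision rule on the age of the last detected pose. The following sketch is an illustrative reading of that flow; the function name and return labels are chosen here and do not appear in the patent:

```python
def supervise(t_now, t_last_pose, follow_up_time, halt_time):
    """Pick the control branch from the age of the last detected pose:
    'track' (S150: new data arrived in time), 'extrapolate' (S130: assume
    constant velocity), or 'stop' (S110: stop control and issue a warning)."""
    age = t_now - t_last_pose
    if age >= halt_time:        # S100: halt time has passed
        return "stop"
    if age >= follow_up_time:   # S120: follow-up time has passed
        return "extrapolate"
    return "track"
```

The halt time is checked before the follow-up time, matching the order of steps S100 and S120.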

(13) In other words, upon a delay of newly detected poses, it is assumed that the object 4 continues to move proceeding from the last supporting point at a constant Cartesian and (absolute) angular velocity, and the robot 1 is controlled on the basis of this assumption until either new poses are detected or the halt time is reached.

(14) If newly detected poses are received by the camera 31 in time and/or within the follow-up time (S120: “N”), the detection and processing device 30 continues with a step S150.

(15) In this step, it determines by linear interpolation a supporting point x.sub.d, indicated in FIG. 2 by an empty square, in the middle between the newly detected and the previously detected pose, as well as a velocity v.sub.d at this supporting point, which likewise results from the linear interpolation and/or the quotient of the distance from the supporting point to the new pose divided by the corresponding time difference.

(16) By way of example, FIG. 2 shows such a supporting point x.sub.d(t.sub.7), which results from linear interpolation between the pose data x.sub.s(t.sub.5) and x.sub.s(t.sub.9), as indicated by dashed lines in FIG. 2, and the corresponding velocity v.sub.d(t.sub.7) at this supporting point, found from [x.sub.s(t.sub.9)−x.sub.d(t.sub.7)]/(t.sub.9−t.sub.7).
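The computation of paragraphs (15) and (16) can be sketched componentwise for the translational part (the orientation part would interpolate the quaternions analogously); the function name is an assumption of this sketch:

```python
def supporting_point(x_old, x_new, t_old, t_new):
    """Midpoint supporting point x_d between two detected poses, and the
    velocity v_d = (x_new - x_d) / (t_new - t_d) at that point."""
    t_d = 0.5 * (t_old + t_new)
    x_d = [0.5 * (a + b) for a, b in zip(x_old, x_new)]
    v_d = [(b - m) / (t_new - t_d) for b, m in zip(x_new, x_d)]
    return t_d, x_d, v_d

# e.g. detections x_s(t_5) and x_s(t_9) yield x_d(t_7) and v_d(t_7)
t_d, x_d, v_d = supporting_point([0.0, 0.0, 0.0], [4.0, 0.0, 0.0], 5.0, 9.0)
```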

(17) Then, in a step S160, the detection and processing device 30 determines a function f defined piecewise by third-order polynomials in such a way that it adjoins the preceding polynomial in a C.sup.1-continuous manner and/or has the same (time) derivative at the transition point, as well as runs through the new supporting point x.sub.d, where it has the determined velocity v.sub.d(t.sub.7).
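Such a polynomial piece is exactly a cubic Hermite segment: prescribing value and derivative at both ends guarantees the C.sup.1-continuous join at the transition points. A scalar sketch, applied per pose component (the names here are illustrative, not from the patent):

```python
def hermite_piece(t0, x0, v0, t1, x1, v1):
    """Third-order polynomial on [t0, t1] with f(t0)=x0, f'(t0)=v0,
    f(t1)=x1, f'(t1)=v1, so consecutive pieces join C^1-continuously."""
    h = t1 - t0
    def f(t):
        s = (t - t0) / h  # normalized parameter in [0, 1]
        h00 = 2 * s**3 - 3 * s**2 + 1
        h10 = s**3 - 2 * s**2 + s
        h01 = -2 * s**3 + 3 * s**2
        h11 = s**3 - s**2
        return h00 * x0 + h10 * h * v0 + h01 * x1 + h11 * h * v1
    return f
```

Evaluating f on the finer grid of the control cycle then yields the modified pose data.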

(18) This function f and/or this polynomial (piece) is evaluated in the control cycle of the robot controller 2 in a step S170—that is, with a shorter time interval T.sub.2—and produces the modified pose data y, which in FIG. 2 are indicated by solid circles and are shown by way of example for the times t.sub.4, t.sub.5, t.sub.6, t.sub.11 and t.sub.12.

(19) On the basis of this modified pose data y, the robot controller 2 controls the robot 1 in step S170—for example, to grasp the object 4, or the like. In one embodiment, for the control, an object-based manipulation instruction—for example, a (processing) path defined relative to the object 4—can be superimposed on the modified pose data y, and the robot 1 can be controlled on the basis of the modified pose data y and the object-based manipulation instruction.

(20) It can be seen in FIG. 2 that a newly detected pose would have to be present for the time point t.sub.13=t.sub.9+T.sub.1 in order to be able to determine a new supporting point for t.sub.11, and thus the next polynomial piece, in the manner described.

(21) However, since such a newly detected pose is not yet available in the exemplary embodiment within the follow-up time, in step S130, the function f is continued linearly at the corresponding point t.sub.11 with the slope f′(t.sub.11) present there, as shown in FIG. 2 by dashed lines, such that the translational and rotational velocity of the object 4 in the space between modified pose data is constant starting from t.sub.11.
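The linear continuation of f at t.sub.11 with the slope f'(t.sub.11) can be sketched as follows; the derivative is approximated numerically here, since the patent does not prescribe how f' is obtained, and the function names are assumptions:

```python
def linear_continuation(f, t_break, eps=1e-6):
    """Continue f past t_break at the constant slope present there, so the
    modeled translational/rotational velocity stays constant from t_break on."""
    x_break = f(t_break)
    slope = (x_break - f(t_break - eps)) / eps  # backward difference for f'(t_break)
    def g(t):
        return f(t) if t <= t_break else x_break + slope * (t - t_break)
    return g
```

The continued function g is then evaluated in the control cycle exactly like f, until new pose data arrive or the halt time is reached.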

(22) Then, in step S140, analogous to step S170, this function is evaluated in the control cycle of the robot controller 2 and/or with the shorter time interval T.sub.2, and produces the modified pose data y(t.sub.12).

(23) If there is no newly detected pose after the halt time (S100: “Y”), the method is stopped.

(24) In the method described, the rotation of the object 4 in the space between modified pose data y is advantageously minimal, in particular due to the polynomial interpolation of the quaternions, such that abrupt flipping of the orientation is prevented.
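One standard way to obtain this minimal rotation when interpolating quaternions is to exploit the double cover of the rotation group: q and -q encode the same orientation, so signs are aligned before interpolating. The patent does not spell this step out, so the following is an illustrative reading:

```python
def align_sign(q_prev, q_next):
    """Return q_next or -q_next, whichever lies closer to q_prev (non-negative
    dot product), so interpolation between the two takes the shorter rotation."""
    dot = sum(a * b for a, b in zip(q_prev, q_next))
    return q_next if dot >= 0.0 else tuple(-c for c in q_next)
```

Without this alignment, an interpolation between nearly identical orientations stored with opposite signs would sweep through a large spurious rotation.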

(25) In parallel, further modified pose data can also be determined for further pose data detected chronologically in parallel by a camera 32 in an analogous manner described herein, and the robot 1 can also be controlled on the basis of this further modified pose data.

(26) Advantageously, the robot controller 2 can be provided with the modified pose data from the camera 31 and the further modified pose data from the camera 32 with the same cycle timing. This can be done by the modified pose data and the further modified pose data being determined with the same second time interval T.sub.2 and/or comprising the same second time interval. Likewise, the modified pose data and the further modified pose data can also have different cycle rates and/or phasing.

(27) It can be seen in FIG. 2 that the modified pose data y “lag behind” the detected pose data x.sub.s by a delay of T.sub.1/2, since the corresponding supporting point (for example, x.sub.d(t.sub.7)) and the velocity and/or time derivative given there (for example, v.sub.d(t.sub.7)) are determined, and used for the approximation by the function f and/or the (fine) interpolation, only once the newly detected pose (for example, x.sub.s(t.sub.9)) has been received, provided the expected next detected pose (for example, x.sub.s(t.sub.13)) is not missing.

(28) Although exemplary embodiments have been explained in the foregoing description, it should be understood that a variety of modifications are possible. It should also be noted that the exemplary embodiments are merely examples that are not intended to limit the scope, applications and construction in any way. Rather, the person skilled in the art is given a guide by the preceding description for the implementation of at least one exemplary embodiment, wherein various modifications, particularly with regard to the function and arrangement of the components described, can be made without departing from the scope of the invention as defined according to the claims and to combinations of features equivalent thereto.

(29) While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.

LIST OF REFERENCE NUMBERS

(30) 1 robot (arrangement)
2 robot controller
30 detection and processing device
31, 32 camera
4 object (arrangement)
f function
T.sub.1 first time interval
T.sub.2 second time interval
v.sub.d velocity
x.sub.s detected pose (data)
x.sub.d supporting point
y modified pose data