CONTROL OF A ROBOT ASSEMBLY
20200114518 · 2020-04-16
Inventors
CPC classification
B25J9/1664
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/37017
PHYSICS
G05B2219/39391
PHYSICS
International classification
Abstract
A method for the control of a robot assembly having at least one robot. The method includes acquiring pose data from an object arrangement having at least one object, which data has a first time interval; determining modified pose data from the object arrangement, which data has a second time interval that is larger or smaller than the first time interval, or is equal to the first time interval, on the basis of the acquired pose data; and controlling the robot assembly on the basis of said modified pose data.
Claims
1-14. (canceled)
15. A method for the control of a robot assembly having at least one robot, the method comprising: detecting, with a first sensor, first pose data of an object arrangement having at least one object, the first pose data comprising a first time interval; determining first modified pose data of the object arrangement on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; and controlling the robot assembly on the basis of the first modified pose data.
16. The method of claim 15, further comprising: detecting second pose data of the object arrangement, the second pose data comprising a third time interval that is chronologically parallel to the detection of the first pose data, determining second modified pose data of the object arrangement on the basis of the second detected pose data, the second modified pose data comprising a fourth time interval that is less than or greater than the third time interval, or equal to the third time interval; and controlling the robot assembly on the basis of the second modified pose data.
17. The method of claim 16, wherein detecting the second pose data comprises detecting the second pose data with a second sensor.
18. The method of claim 16, wherein at least one of the detected or modified first pose data and/or at least one of the detected or modified second pose data depend on an orientation of the object arrangement.
19. The method of claim 18, wherein at least one of: the first and second pose data comprise quaternions; or a time derivative of the first and second pose data describe an angular velocity of the object arrangement.
20. The method of claim 18, wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data in such a manner that a rotation of the object arrangement in the space between the first modified pose data or the second modified pose data is minimal.
21. The method of claim 16, wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data in such a manner that the first and second modified pose data satisfy a function that approximates at least one of the first detected pose data or second detected pose data.
22. The method of claim 21, wherein at least one of: the function comprises piecewise polynomials; the function comprises third order polynomials; or the function is satisfied when the first or second modified pose data runs through the first detected pose data, the second detected pose data, or supporting points between successive ones of at least the first detected pose data or the second detected pose data.
23. The method of claim 22, wherein at least one of: the function comprises, at supporting points, prespecified derivatives, which are particularly determined by successive detected pose data; or the piecewise polynomials have the same derivative at respective transition points of the polynomials.
24. The method of claim 16, wherein determining at least one of the first modified pose data or the second modified pose data comprises determining the modified pose data by at least one filtering.
25. The method of claim 15, further comprising: when detected pose data is absent at a predetermined follow-up time, then determining the first modified pose data and the second modified pose data in such a manner that a velocity of the object arrangement in the space between the first and second modified pose data is constant.
26. The method of claim 16, further comprising: when detected first or second pose data is absent at a predetermined halt time, then at least one of: stopping control of the robot assembly based on the modified pose data; or issuing a warning.
27. The method of claim 16, wherein at least one of: detecting the first or second pose data of the object arrangement comprises detecting the pose data using at least one sensor; or transmitting at least one of the first modified pose data or the second modified pose data to a controller for the purpose of controlling the robot arrangement.
28. The method of claim 27, wherein the at least one sensor is at least one of a non-contact sensor or an optical sensor.
29. The method of claim 16, wherein control of the robot assembly based on at least one of the first modified pose data or the second modified pose data comprises the combination of the first or second modified pose data with object-order-based manipulation instructions.
30. A system for controlling a robot assembly having at least one robot, the system comprising means for: detecting first pose data of an object arrangement having at least one object, the first pose data comprising a first time interval; determining first modified pose data of the object arrangement on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; and controlling the robot assembly on the basis of the first modified pose data.
31. An arrangement, comprising: a robot assembly having at least one robot; and a system according to claim 30 for controlling the robot assembly.
32. A computer program product for use in controlling a robot assembly having at least one robot, the computer program product having program code stored in a non-transitory computer-readable medium that, when executed by a computer, causes the computer to: detect, with a first sensor, first pose data of an object arrangement having at least one object, the first pose data comprising a first time interval; determine first modified pose data of the object arrangement on the basis of the first detected pose data, the first modified pose data comprising a second time interval that is less than or greater than the first time interval, or equal to the first time interval; and control the robot assembly on the basis of the first modified pose data.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0057] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with a general description of the invention given above, and the detailed description given below, serve to explain the principles of the invention.
[0058]
[0059]
[0060]
DETAILED DESCRIPTION
[0061]
[0062] The system comprises a robot controller 2 and a detection and processing device 30 communicating therewith, which comprises a camera 31, for detecting pose data of objects 4.
[0063] By means of the camera 31, the detection and processing device 30 detects (first) pose data x.sub.s, which describe the three-dimensional Cartesian position and/or orientation of the objects 4, of which only one is shown by way of example in
[0064] Accordingly, these first pose data x.sub.s comprise a first time interval which is constant in this exemplary embodiment and corresponds to the sampling and/or evaluation rate of the camera 31 and/or detection and processing device 30. The detected pose data describe the orientation of the object 4, optionally after appropriate transformation, in quaternions.
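By way of illustration only, expressing an orientation in quaternions, as described in paragraph [0064], can be sketched as follows; the function name and the (w, x, y, z) convention are assumptions not stated in the application:

```python
import math

def axis_angle_to_quaternion(axis, angle):
    # Convert a rotation of `angle` radians about `axis` into a unit
    # quaternion (w, x, y, z); one common way to represent the
    # orientation component of detected pose data.
    ax, ay, az = axis
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    ax, ay, az = ax / norm, ay / norm, az / norm
    half = angle / 2.0
    s = math.sin(half)
    return (math.cos(half), ax * s, ay * s, az * s)

# Example: a 90-degree rotation of the object about the vertical axis.
q = axis_angle_to_quaternion((0.0, 0.0, 1.0), math.pi / 2.0)
```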
[0065] The camera 31 detects poses of the object 4 in a step S90 (see
[0066] If this is the case (S100: Y), that is to say, there are no (new) detected pose data within the halt time, control of the robot 1 on the basis of the pose data is stopped and a warning is issued in a step S110.
[0067] Otherwise (S100: N), the detection and processing device 30 checks, in a step S120, whether a prespecified follow-up time has elapsed since the last detected pose was received from the camera 31.
[0068] If this is the case (S120: Y), that is to say, there are no (new) detected pose data within the follow-up time, modified pose data y(t) are determined in a step S130, in a manner described below with reference to
[0069] In other words, upon a delay of newly detected poses, it is assumed that the object 4 continues to move proceeding from the last supporting point at a constant Cartesian and (absolute) angular velocity, and the robot 1 is controlled on the basis of this assumption until either new poses are detected or the halt time is reached.
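The constant-velocity continuation of paragraph [0069] can be sketched for the Cartesian position component as follows; this is a minimal illustration, with names chosen freely rather than taken from the application:

```python
def extrapolate_pose(last_position, last_velocity, t_last, t):
    # While newly detected poses are delayed, assume the object keeps
    # moving from the last supporting point at the constant velocity
    # determined there, and predict its position at time t.
    dt = t - t_last
    return tuple(p + v * dt for p, v in zip(last_position, last_velocity))

# Example: position (1, 2, 3) moving at velocity (0.5, 0, -1),
# predicted two time units after the last supporting point.
pos = extrapolate_pose((1.0, 2.0, 3.0), (0.5, 0.0, -1.0), 0.0, 2.0)
```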
[0070] If newly detected poses are received from the camera 31 in time and/or within the follow-up time (S120: N), the detection and processing device 30 continues with a step S150.
[0071] In this step, it determines by linear interpolation a supporting point x.sub.d, indicated in
[0072] By way of example,
[0073] Then, in a step S160, the detection and processing device 30 determines a function f, defined piecewise by third-order polynomials, in such a way that it adjoins the preceding polynomial in a C.sup.1-continuous manner and/or has the same (time) derivative at the transition point, and runs through the new supporting point x.sub.d, where it has the determined velocity v.sub.d(t.sub.7).
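A piecewise third-order polynomial that is C.sup.1-continuous at the transition points and runs through the supporting points with prescribed velocities, as in step S160, can be sketched for a single scalar component as a cubic Hermite segment; the function names are illustrative, not taken from the application:

```python
def hermite_segment(x0, v0, x1, v1, t0, t1):
    # Return the cubic f on [t0, t1] with f(t0) = x0, f'(t0) = v0,
    # f(t1) = x1, f'(t1) = v1.  Chaining such segments with matching
    # derivatives at the transition points yields a C1-continuous,
    # piecewise third-order function through the supporting points.
    h = t1 - t0

    def f(t):
        s = (t - t0) / h
        h00 = 2 * s**3 - 3 * s**2 + 1   # Hermite basis functions
        h10 = s**3 - 2 * s**2 + s
        h01 = -2 * s**3 + 3 * s**2
        h11 = s**3 - s**2
        return h00 * x0 + h10 * h * v0 + h01 * x1 + h11 * h * v1

    return f

# Segment from x = 0 (velocity 1) to x = 2 (velocity 0) over t in [0, 2].
f = hermite_segment(0.0, 1.0, 2.0, 0.0, 0.0, 2.0)
```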
[0074] This function f and/or this polynomial (piece) is evaluated in the control cycle of the robot controller 2 in a step S170, that is, with a shorter time interval T.sub.2, and produces the modified pose data y, which in
[0075] On the basis of this modified pose data y, the robot controller 2 controls the robot 1 in step S170, for example, to grasp the object 4, or the like. In one embodiment, for the control, an object-based manipulation instruction, for example, a (processing) path defined relative to the object 4, can be superimposed on the modified pose data y, and the robot 1 can be controlled on the basis of the modified pose data y and the object-based manipulation instruction.
[0076] It can be seen in
[0077] However, since such a newly detected pose is not yet available in the exemplary embodiment within the follow-up time, in step S130, the function f is continued linearly at the corresponding point t.sub.11 with the slope f'(t.sub.11) present there, as shown in
[0078] Then, in step S140, analogous to step S170, this function is then evaluated in the control cycle of the robot controller 2 and/or with the shorter time interval T.sub.2, and produces the modified pose data y(t.sub.12).
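Evaluating the interpolating or extrapolating function in the control cycle of the robot controller, i.e. with the shorter time interval T.sub.2, amounts to resampling the pose data on a denser time grid; a minimal sketch, with names assumed:

```python
def resample(f, t_start, t_end, cycle):
    # Evaluate a function f(t) at the controller's (shorter) cycle
    # time, yielding modified pose data at a higher rate than the
    # detected pose data from the camera.
    n = int(round((t_end - t_start) / cycle))
    return [f(t_start + i * cycle) for i in range(n + 1)]

# Example: one camera interval of length 1.0, resampled at a
# controller cycle of 0.25, using a simple linear motion as f.
samples = resample(lambda t: 2.0 * t, 0.0, 1.0, 0.25)
```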
[0079] If there is no newly detected pose after the halt time (S100: Y), the method is stopped.
[0080] In the method described, the rotation of the object 4 in the space between modified pose data y is advantageously minimal, particularly due to the polynomial interpolation of the quaternions, such that, particularly, staggering is prevented.
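One standard reason the rotation between interpolated quaternions stays minimal is that q and -q describe the same orientation, so successive quaternions can be sign-aligned before interpolation; the following sketch of this check is illustrative, and the function name is an assumption:

```python
def align_quaternion(q_prev, q_next):
    # q and -q represent the same orientation.  If the two quaternions
    # lie in opposite hemispheres (negative dot product), flip the sign
    # of q_next so that interpolation between them follows the smaller
    # of the two equivalent rotations.
    dot = sum(a * b for a, b in zip(q_prev, q_next))
    if dot < 0.0:
        return tuple(-c for c in q_next)
    return q_next
```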
[0081] In parallel, further modified pose data can also be determined for further pose data detected chronologically in parallel by a camera 32 in an analogous manner described herein, and the robot 1 can also be controlled on the basis of this further modified pose data.
[0082] Advantageously, the robot controller 2 can be provided with the modified pose data from the camera 31 and the further modified pose data from the camera 32 with the same cycle timing. This can be done by the modified pose data and the further modified pose data being determined with the same second time interval T.sub.2 and/or comprising the same second time interval. Likewise, the modified pose data and the further modified pose data can also have different cycle rates and/or phasing.
[0083] It can be seen in
[0084] Although exemplary embodiments have been explained in the foregoing description, it should be understood that a variety of modifications are possible. It should also be noted that the exemplary embodiments are merely examples that are not intended to limit the scope, applications and construction in any way. Rather, the person skilled in the art is given a guide by the preceding description for the implementation of at least one exemplary embodiment, wherein various modifications, particularly with regard to the function and arrangement of the components described, can be made without departing from the scope of the invention as defined according to the claims and to combinations of features equivalent thereto.
[0085] While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.
LIST OF REFERENCE NUMBERS
[0086] 1 robot (arrangement)
[0087] 2 robot controller
[0088] 30 detection and processing device
[0089] 31, 32 camera
[0090] 4 object (arrangement)
[0091] f function
[0092] T.sub.1 first time interval
[0093] T.sub.2 second time interval
[0094] v.sub.d velocity
[0095] x.sub.s detected pose (data)
[0096] x.sub.d supporting points
[0097] y modified pose data