MONITORING DURING A ROBOT-ASSISTED PROCESS

20230311326 · 2023-10-05

Abstract

A method for monitoring during a robot-assisted first or second process includes (a.1) detecting process data; and (a.2) performing a model-based assessment with the aid of a machine-learned model on the basis of the detected process data; wherein, if the model-based assessment satisfies an examination criterion, in particular depending on an external confirmation: (b.1) a test assessment is performed with the aid of a testing authority; and (b.2) the machine-learned model is trained further on the basis of the test assessment; and then, for the first process, optionally performed again: (c.1) process data are detected; (c.2) the model-based assessment is performed with the aid of the further trained model on the basis of the detected process data; and (c.3) monitoring during the first process is performed on the basis of this assessment.

Claims

1-11. (canceled)

12. A method for monitoring during a robot-assisted first process, wherein the following steps are performed by a robot controller for at least the robot-assisted first process: (a.1) detecting process data; (a.2) performing a first model-based assessment with the aid of a machine-learned model on the basis of the detected process data; in response to the model-based assessment satisfying an examination criterion, then: (b.1) performing a test assessment with the aid of a testing authority, and (b.2) further training the machine-learned model on the basis of the test assessment; (c.1) detecting further process data for the first process; (c.2) performing a second model-based assessment with the aid of the further trained machine-learned model on the basis of the further detected process data; and (c.3) monitoring the robot during the first process and based on the second model-based assessment.

13. The method of claim 12, wherein the examination criterion depends on an external confirmation.

14. The method of claim 12, wherein at least one of: the testing authority comprises at least one person; the testing authority comprises at least one further machine-learned model; the testing authority determines at least one parameter; or the test assessment comprises a test run of at least one robot by which the particular process is performed, the test run being different than the first process.

15. The method of claim 14, wherein the test run is different than a second process.

16. The method of claim 12, wherein at least one of the process data or data used in the test assessment are at least one of: data of at least one robot by which the particular process is performed; data of at least one process product of the particular process; or at least one of audio or video data of the particular process.

17. The method of claim 16, wherein at least one of: the data of at least one robot by which the particular process is performed are time profiles; or the data of at least one process product of the particular process are image data.

18. The method of claim 12, further comprising: monitoring the robot during the first process and based on the first model-based assessment.

19. The method of claim 12, wherein: steps (a.1) and (a.2) are performed at least two times; and then steps (b.1) and (b.2) are performed.

20. The method of claim 12, wherein monitoring comprises at least one of: monitoring at least one robot by which the first process is performed; predictive maintenance monitoring of at least one robot by which the first process is performed; monitoring for errors in the first process; or monitoring for errors in process products of the first process.

21. The method of claim 12, wherein the examination criterion is predefined in such a way that the model-based assessment satisfies the examination criterion when the model-based assessment reveals a specific error or a predefined repetition number of the error.

22. The method of claim 12, wherein: the examination criterion is predefined in such a way that: the model-based assessment satisfies the examination criterion with the aid of the model on the basis of the first detected process data, and the model-based assessment does not satisfy the examination criterion with the aid of the same model on the basis of second detected process data; and an expected gain in information during further training of the model on the basis of the first process data is greater than during further training of the model on the basis of the second process data.

23. The method of claim 12, wherein the model is further trained before step (c.1) additionally on the basis of detected process data without a test assessment being taken into account.

24. The method of claim 23, wherein the model is further trained with the aid of the testing authority.

25. A system for monitoring during a robot-assisted first process, the system comprising: for the robot-assisted first process or a robot-assisted second process, means for: (a.1) detecting process data, and (a.2) performing a model-based assessment with the aid of a machine-learned model on the basis of the detected process data; and means configured to, in response to the model-based assessment satisfying an examination criterion: (b.1) perform a test assessment with the aid of a testing authority, and (b.2) further train the machine-learned model on the basis of the test assessment.

26. The system of claim 25, wherein the system further comprises means for: (c.1) detecting further process data for the first process; (c.2) performing a second model-based assessment with the aid of the further trained model on the basis of the further detected process data; and (c.3) monitoring the robot during the first process and based on the second model-based assessment.

27. The system of claim 25, wherein the examination criterion depends on an external confirmation.

28. A computer program product for monitoring during a robot-assisted first process, the computer program product comprising program code stored on a non-transitory, computer-readable medium, the program code configured, when executed by a computer, to cause the computer to perform, for the robot-assisted first process, the method set forth in claim 12.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0090] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with a general description of the invention given above, and the detailed description given below, serve to explain the principles of the invention.

[0091] FIG. 1 schematically illustrates a system according to one embodiment of the present invention in a cyclic robot-assisted process;

[0092] FIG. 2 illustrates a method for monitoring during the cyclic robot-assisted process according to an embodiment of the present invention; and

[0093] FIG. 3 illustrates a method for monitoring during the cyclic robot-assisted process according to a further embodiment of the present invention.

DETAILED DESCRIPTION

[0094] FIG. 1 shows by way of example a robot 10 with a robot arm 11, which in each cycle uses a tool 12 to machine one workpiece 20; the workpiece is conveyed toward the robot, away from it, or onward on a conveyor belt 21, and is recorded by a camera 30 after each machining operation. A controller of the robot 10 is denoted by 13.

[0095] FIG. 2 shows a method for monitoring during the cyclic robot-assisted process according to an embodiment of the present invention.

[0096] In a step S10, a cycle of the robot-assisted process sketched with reference to FIG. 1 is performed and process data, in the embodiment by way of example driving forces of the robot 10 or the like, are detected.

[0097] With the aid of the model machine-learned beforehand by training the artificial neural network 13.1, a model-based assessment of the robot for predictive maintenance is performed on the basis of these detected process data (FIG. 2: step S20). In the embodiment, again purely by way of example, the model or artificial neural network 13.1 classifies the robot 10, on account of the detected driving forces, either as currently error-free and not in need of maintenance, or as defective and in need of maintenance; i.e., the model-based assessment for monitoring with the aid of the machine-learned model reveals, outputs, or assesses an error.
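Purely for illustration, the assessment of step S20 can be sketched as follows. The names, the simple root-mean-square statistic, and the threshold value are all hypothetical stand-ins; the embodiment itself uses the artificial neural network 13.1 for this classification.

```python
# Hypothetical sketch of step S20: classify the robot from a detected
# driving-force time profile. "OK" stands for error-free / not in need
# of maintenance, "F" for defective / in need of maintenance.

def rms(profile):
    """Root-mean-square of a driving-force time profile."""
    return (sum(f * f for f in profile) / len(profile)) ** 0.5

def assess(profile, threshold=10.0):
    """Return 'F' if the force level indicates a defect, else 'OK'."""
    return "F" if rms(profile) > threshold else "OK"
```

For example, a uniformly low force profile is assessed as "OK", while a profile with significantly elevated forces is assessed as "F" and would trigger the alarm of step S60.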

[0098] As long as no error is assessed (S30: “OK”) and the intended cycles have not yet all been performed (S40: “N”), the next cycle is performed.

[0099] If all the intended cycles have been performed (S40: “Y”), the process is ended (FIG. 2: step S50).

[0100] If an error is assessed (S30: “F”), an alarm is output (FIG. 2: step S60).

[0101] If an external confirmation is then provided by manual input (S70: “Y”), a test assessment is performed with the aid of a testing authority (FIG. 2: step S80) and the model is further trained on the basis of this test assessment (FIG. 2: step S85).

[0102] For this purpose, for example, a reference run is performed with the robot, during which, in the case of defective robots, particularly significant driving forces or the like occur.

[0103] On the basis of this reference run, or of data detected during this run, a testing authority, for example in the form of a different machine-learned model or a signal processing method, performs a test assessment, or correspondingly labels the process data detected in step S10 that led to the error message; in one embodiment, it is also possible to distinguish between different errors of the robot.

[0104] On the basis of this test assessment, the machine-learned model or artificial neural network 13.1 is further trained and subsequently, if necessary, the next cycle is run through.

[0105] Without external confirmation (S70: “N”), a corresponding action is performed in response to the alarm (S60); for example, the robot is repaired (S90).
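The control flow of FIG. 2 (steps S10 through S90) can be sketched, again purely for illustration, in the following self-contained form. Here the “model” is reduced to a mere force threshold and the “testing authority” to a stricter reference threshold; both are hypothetical stand-ins for the artificial neural network 13.1 and the reference runs described above.

```python
# Hypothetical sketch of the FIG. 2 loop. Each force value stands for
# the process data of one cycle (S10). Exceeding the model threshold
# is the error assessment (S20/S30) and raises an alarm (S60); with
# external confirmation (S70), the testing authority decides via the
# reference threshold whether the alarm was correct (S80), and false
# alarms raise the model threshold ("further training", S85).

def run_monitoring(force_per_cycle, threshold, reference_threshold,
                   confirm=lambda: True):
    alarms, false_alarms = 0, 0
    for force in force_per_cycle:                 # S10: detect data
        if force <= threshold:                    # S20/S30: assess
            continue                              # S40: next cycle
        alarms += 1                               # S60: alarm
        if confirm():                             # S70: confirmation
            if force <= reference_threshold:      # S80: test assessment
                false_alarms += 1
                threshold = max(threshold, force) # S85: further train
        # else: S90, repair the robot (omitted in this sketch)
    return alarms, false_alarms, threshold        # S50: process ended
```

Running the same cycles a second time with the adapted threshold then produces no further false alarms, mirroring the decrease in false alarms described below.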

[0106] In a modification, the “F” branch is only taken in step S30 when a predefined repetition number of the error has been reached.

[0107] It can be seen that the artificial neural network 13.1 initially triggers alarms on the basis of the process data detected in the normal working process, in particular if the alarm threshold is initially selected to be low as a precaution.

[0108] Because the testing authority distinguishes the correct alarms from the false alarms with the aid of the reference runs, and the artificial neural network 13.1 is further trained on the basis of this labeling (FIG. 2: steps S80, S85), the number of false alarms decreases over time.

[0109] These reference runs are advantageously performed only when triggered by the artificial neural network 13.1, i.e., when its assessment signals an error of the robot 10.

[0110] FIG. 3 shows a method for monitoring the cyclic robot-assisted process according to a further embodiment of the present invention.

[0111] In a step S11, several cycles of the robot-assisted process sketched with reference to FIG. 1 are performed and process data, in this embodiment images of the processed workpieces from the camera 30, are detected.

[0112] These are labeled during this process or subsequently by the already (pre-)trained artificial neural network 13.1, which classifies the workpiece in question as “error-free” or “defective” (FIG. 3: step S21). In another embodiment, the artificial neural network 13.1 can additionally or alternatively also use other data, in particular kinematic and/or dynamic robot data.

[0113] The image of the processed workpiece from the camera 30 and the associated assessment by the artificial neural network 13.1 (in the other embodiment, the assessment according to the robot data) are stored in each case (FIG. 3: step S31).

[0114] These collected process data and model-based assessments are used in a step S41 to select those cycles in which the reliability of the classification falls below a predefined minimum amount, since the greatest information gain can be expected from further training of the artificial neural network 13.1 with these cycles or images. In modifications, the information gain, entropy, or the like can also be used as an examination criterion, or the examination criterion can depend thereon.
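The selection of step S41 can be sketched as follows, purely for illustration. The function names, the confidence threshold, and the binary-entropy formulation are hypothetical; the embodiment only requires that cycles with unreliable classifications be selected.

```python
# Hypothetical sketch of step S41: select cycles whose classification
# confidence falls below a minimum (equivalently, whose prediction
# entropy is high) for labeling by the testing authority, since the
# greatest information gain is expected from these cycles.
import math

def entropy(p_defective):
    """Binary prediction entropy in bits; maximal (1.0) at p = 0.5."""
    p = min(max(p_defective, 1e-12), 1 - 1e-12)
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def select_for_labeling(predictions, min_confidence=0.8):
    """predictions: list of (cycle_id, p_defective) pairs. Returns the
    cycle ids whose confidence max(p, 1 - p) is below min_confidence."""
    return [cid for cid, p in predictions
            if max(p, 1 - p) < min_confidence]
```

For instance, from predictions {0.99, 0.55, 0.05}, only the middle, uncertain cycle would be selected, while the two confidently classified cycles are kept as self-labeled data.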

[0115] These selected images are labeled by a human (step S51). If, in the above-mentioned modification, the artificial neural network 13.1 uses kinematic and/or dynamic robot data or the like, the model-based assessment and the test assessment are thus performed on the basis of different process data, while the same process data can alternatively also be used.

[0116] In a step S61, the artificial neural network 13.1 is further trained with the process data labeled in step S21 and additionally the process data labeled in step S51.
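The composition of the training set in step S61 can be sketched as follows; the representation of labels as a simple mapping is hypothetical.

```python
# Hypothetical sketch of step S61: the training set combines the
# process data self-labeled by the network in step S21 with the
# human-labeled data of step S51, the human (test-assessment) labels
# taking precedence where both exist.

def build_training_set(model_labeled, human_labeled):
    """Both arguments map cycle_id -> label ('OK' or 'defective')."""
    combined = dict(model_labeled)
    combined.update(human_labeled)  # test assessment overrides
    return combined
```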

[0117] As long as no termination criterion is satisfied, for example the learning progress falls below a predefined minimum level or a predefined repetition number is reached (S71: “N”), the method jumps back to step S11.

[0118] If the termination criterion is satisfied (S71: “Y”), the further training is terminated and the artificial neural network 13.1 is used for quality monitoring in the process (step S81).

[0119] It can be seen that the artificial neural network 13.1 is thus particularly effectively (further) trained on the basis of the images particularly suitable for this purpose, thereby significantly increasing its performance.

[0120] Although embodiments have been explained in the preceding description, it is noted that a large number of modifications are possible.

[0121] Thus, in particular, in addition to the labeled process data, unlabeled process data can also be used for the further training of the artificial neural network 13.1.

[0122] It is also noted that the embodiments are merely examples that are not intended to restrict the scope of protection, the applications, and the structure in any way. Rather, the preceding description provides a person skilled in the art with guidelines for implementing at least one embodiment, with various changes, in particular with regard to the function and arrangement of the described components, being able to be made without departing from the scope of protection as it arises from the claims and from these equivalent combinations of features.

[0123] While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.

LIST OF REFERENCE SIGNS

[0124] 10 Robot
[0125] 11 Robot arm
[0126] 12 Tool
[0127] 13 Controller
[0128] 20 Workpiece
[0129] 21 Conveyor belt
[0130] 30 Camera
