METHOD AND SYSTEM FOR MONITORING A ROBOT ARRANGEMENT
20220314454 · 2022-10-06
Inventors
CPC classification
F16P3/142
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
B25J9/1676
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1666
PERFORMING OPERATIONS; TRANSPORTING
B25J13/087
PERFORMING OPERATIONS; TRANSPORTING
F16P3/144
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
B25J9/163
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
A method for monitoring a robot arrangement having at least one robot includes capturing optical signals from a plurality of signal sources with at least one sensor, wherein the signal sources and/or the sensor is/are positioned on the robot arrangement, and triggering a monitoring reaction if a deviation of an actual arrangement of the captured optical signals from a desired arrangement of these signals exceeds a limit value. In one aspect, a reaction may be triggered if at least a predefined minimum number of signals from the desired arrangement is not present in the actual arrangement of the captured optical signals.
Claims
1-10. (canceled)
11. A method for monitoring a robot arrangement that includes at least one robot, the method comprising: capturing optical signals from a plurality of signal sources with at least one sensor, wherein at least one of the signal sources or the at least one sensor is positioned on the robot arrangement; determining a deviation of an actual arrangement of the captured optical signals from a desired arrangement of the signals; and triggering a monitoring reaction in response to the determined deviation exceeding a predetermined limit value.
12. The method of claim 11, wherein determining the deviation comprises determining whether at least a predefined minimum number of signals from the desired arrangement is not present in the actual arrangement of the captured optical signals.
13. The method of claim 11, wherein the desired arrangement is determined on the basis of a determined position of the signal sources and the sensor relative to one another.
14. The method of claim 13, wherein the determined position of the signal sources and the sensor relative to one another is at least one of: based on a determined position of the robot arrangement; based on a predefined or determined position of the signal sources and the sensor relative to the robot arrangement; or based on at least one of a kinematic model or an optical model.
15. The method of claim 11, wherein at least one of: at least one of the signal sources is attached to a sheath attached to the robot arrangement; or at least one of the signal sources emits the optical signals.
16. The method of claim 15, wherein the emitted optical signals comprise at least one of: at least one of laser light or visible light; infrared light; or UV light.
17. The method of claim 11, wherein at least one of: at least one of the signal sources is illuminated by at least one light source with light, and the light is reflected as an optical signal; or at least one deflection means is arranged in an optical path between at least one of the signal sources and the sensor.
18. The method of claim 11, wherein at least one of: the optical signals of at least one of the signal sources has a predefined time pattern; or the actual arrangement is determined by the at least one sensor based on signal capturing with emitted optical signals and signal capturing without emitted optical signals.
19. The method of claim 11, wherein at least two of the signal sources emit different optical signals.
20. The method of claim 19, wherein at least one of: the at least two signal sources emit different robot link-specific optical signals; the different optical signals have at least one of different geometries, different brightnesses, or different colors; or the different optical signals have mutually different predefined time patterns.
21. The method of claim 11, wherein the monitoring reaction depends on at least one of: at least one of a number or a location of non-existent signals of the desired arrangement in the actual arrangement; or a thermal radiation detected by the sensor.
22. The method of claim 11, further comprising: interrupting an optical path between at least one signal source and the sensor with a robot-guided component.
23. The method of claim 22, wherein interrupting the optical path comprises interrupting the optical path on the robot side.
24. A system for monitoring a robot arrangement that includes at least one robot, the system comprising: at least one sensor designed to capture optical signals from a plurality of signal sources, wherein at least one of the signal sources or the at least one sensor is positioned on the robot arrangement; and means for triggering a monitoring reaction in response to the determination that the deviation of an actual arrangement of the captured optical signals from a desired arrangement of the signals exceeds a predetermined limit value.
25. The system of claim 24, wherein the deviation is determined by evaluating whether at least a predefined minimum number of signals from the desired arrangement is not present in the actual arrangement of the captured optical signals.
26. A computer program product for monitoring a robot arrangement that includes at least one robot, the computer program product including program code stored on a non-transitory, computer-readable medium, the program code, when executed by a computer, causing the computer to: capture optical signals from a plurality of signal sources with at least one sensor, wherein at least one of the signal sources or the at least one sensor is positioned on the robot arrangement; determine a deviation of an actual arrangement of the captured optical signals from a desired arrangement of the signals; and trigger a monitoring reaction in response to the determined deviation exceeding a predetermined limit value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0071] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with a general description of the invention given above, and the detailed description given below, serve to explain the principles of the invention.
[0072]
[0073]
DETAILED DESCRIPTION
[0074]
[0075] The robot arrangement consists of a multi-jointed or multi-axis robot 10 with a stationary or mobile base 11, a carousel 12 rotatable on the base about a vertical axis, a link arm 13 rotatable about a horizontal axis, an arm 14, and a multi-jointed or multi-axis robot hand with an end effector 15.
[0076] Light sources in the form of laser pointers and/or LEDs 20, which have link-specific colors and/or shapes and/or can be activated in specific time patterns, are positioned on links of the robot 10.
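Such activation in link-specific time patterns can be illustrated with a short sketch. The pattern codes and link names below are illustrative assumptions, not taken from the specification: each link's LED blinks a distinct on/off sequence, and the monitoring side identifies the link by matching the observed sequence.

```python
# Sketch: identifying robot links by distinct LED blink patterns.
# The pattern codes are illustrative assumptions, not from the
# specification.

LINK_PATTERNS = {
    "carousel":     (1, 0, 1, 0),  # on/off state per sampling frame
    "link_arm":     (1, 1, 0, 0),
    "arm":          (1, 0, 0, 1),
    "end_effector": (1, 1, 1, 0),
}

def identify_link(observed):
    """Return the link whose predefined time pattern matches the
    observed on/off sequence, or None if no pattern matches."""
    for link, pattern in LINK_PATTERNS.items():
        if tuple(observed) == pattern:
            return link
    return None
```

A link-specific pattern lets a single camera attribute each captured signal to a specific robot link without safety-rated hardware on the emitter side.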
[0077] A plurality of sensors in the form of cameras 30 are distributed around the working space of the robot 10, only a few of which sensors are shown in
[0078] These cameras capture optical signals from the light sources 20.
[0079] The cameras 30 are signal-connected to a monitoring device 100, which monitoring device can be integrated in a controller of the robot 10 and, for example, thereby receives a position of the robot 10 or corresponding joint angles.
[0080] From this, in a step S20 (cf.
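Determining where the light sources should be from the received joint angles can be sketched with a minimal planar forward-kinematics model. The two link lengths are illustrative assumptions; a real kinematic model would cover all joints in three dimensions.

```python
import math

# Sketch: planar forward kinematics for two revolute joints, used to
# predict where link-mounted light sources should be located.
# Link lengths are illustrative assumptions.

L1, L2 = 0.5, 0.4  # assumed lengths of link arm 13 and arm 14 (meters)

def marker_positions(q1, q2):
    """Return the (x, y) positions of markers at the ends of the two
    links for joint angles q1, q2 given in radians."""
    x1 = L1 * math.cos(q1)
    y1 = L1 * math.sin(q1)
    x2 = x1 + L2 * math.cos(q1 + q2)
    y2 = y1 + L2 * math.sin(q1 + q2)
    return (x1, y1), (x2, y2)
```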
[0081] The positions of the individual cameras 30 are also known to the monitoring device 100. These can, for example, have been determined in advance by means of triangulation from known positions of the light sources.
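As a simplified stand-in for such a position determination, the following sketch locates a camera in a plane from measured distances to three light sources at known positions (distance-based trilateration rather than the bearing-based triangulation mentioned above; the anchor coordinates are illustrative assumptions).

```python
# Sketch: locating a camera in 2D from distances to three light
# sources at known positions, by differencing the circle equations
# into a 2x2 linear system. A trilateration stand-in, assuming
# distance measurements are available.

def trilaterate(anchors, dists):
    """Return the (x, y) position consistent with the given distances
    to three non-collinear anchor points."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)
```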
[0082] With the help of an optical model, the monitoring device 100 then predicts desired images of the light sources 20 as they should be captured by the cameras 30, provided that no unexpected obstacle between the robot 10 and the cameras 30 interrupts the optical path from the light sources to the cameras. Known or permitted obstacles can be taken into account in the optical model, for example with the help of appropriate teaching runs or a desired configuration of the surroundings.
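The core of such an optical model can be sketched as a simple pinhole projection that maps a predicted light-source position into a desired pixel position. The focal length and principal point are illustrative assumptions; a real model would additionally handle camera pose, lens distortion, and known obstacles.

```python
# Sketch: predicting the desired image of a light source with a
# pinhole camera model. Intrinsics are illustrative assumptions.

FOCAL = 800.0          # assumed focal length in pixels
CX, CY = 320.0, 240.0  # assumed principal point

def project(point_cam):
    """Project a 3D point given in camera coordinates (x, y, z with
    z > 0 in front of the camera) to pixel coordinates (u, v).
    Return None if the point lies behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None  # no desired image exists for this camera
    return (CX + FOCAL * x / z, CY + FOCAL * y / z)
```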
[0083] In a step S30, the monitoring device compares these desired images with the actual images captured by the cameras 30-32. In one embodiment, images captured with and without active light sources can be subtracted from one another, so that the desired and actual images each contain only the images of the light sources isolated in this way.
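The subtraction of frames with and without active light sources can be sketched as simple frame differencing. Frames are plain nested lists of grayscale values here, and the brightness threshold is an illustrative assumption.

```python
# Sketch: isolating light-source images by subtracting a frame
# captured without active light sources from one captured with them.
# The threshold is an illustrative assumption.

THRESHOLD = 50  # minimum brightness increase counted as a signal

def isolate_signals(frame_on, frame_off):
    """Return the set of (row, col) pixels where the lit frame is
    significantly brighter than the unlit frame."""
    pixels = set()
    for r, (row_on, row_off) in enumerate(zip(frame_on, frame_off)):
        for c, (a, b) in enumerate(zip(row_on, row_off)):
            if a - b >= THRESHOLD:
                pixels.add((r, c))
    return pixels
```

Differencing in this way suppresses ambient light and static scene content, so only the emitted signals remain for comparison.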
[0084] If at least a predefined minimum number of desired images of the light sources is not present in one of the actual images (S40: “Y”), the monitoring device 100 triggers a monitoring reaction (S50), for example reducing the velocity of the robot 10 or, if necessary, stopping it. Otherwise (S40: “N”), the method returns to step S10.
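The comparison and reaction logic of steps S30-S50 can be sketched as follows. The matching radius and the thresholds for slowing down versus stopping are illustrative assumptions; the specification only requires a reaction once a predefined minimum number of signals is missing.

```python
# Sketch of steps S30-S50: count how many desired light-source images
# have no nearby captured signal and trigger a graded monitoring
# reaction. Radius and thresholds are illustrative assumptions.

MATCH_RADIUS = 5.0  # pixels within which a desired image counts as present
MIN_MISSING = 1     # assumed minimum number of missing signals to react
STOP_MISSING = 3    # assumed count above which the robot is stopped

def monitoring_reaction(desired, actual):
    """Return 'none', 'slow_down', or 'stop' depending on how many
    desired pixel positions have no captured signal nearby."""
    missing = 0
    for (du, dv) in desired:
        if not any((du - au) ** 2 + (dv - av) ** 2 <= MATCH_RADIUS ** 2
                   for (au, av) in actual):
            missing += 1
    if missing >= STOP_MISSING:
        return "stop"
    if missing >= MIN_MISSING:
        return "slow_down"
    return "none"
```

Grading the reaction by the number of missing signals matches the dependent claims, which make the reaction depend on the number or location of non-existent signals.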
[0085] For the sake of clarity,
[0086] It can be seen that a failure of this light source 20 or of this camera 32, as well as the obstacle H, leads to a deviation between the desired and actual images and triggers the (same) monitoring reaction, so that neither the light sources nor the cameras have to be implemented in expensive safety-rated technology.
[0087] In addition, the embodiment makes it clear that a component guided on the end effector 15 would interrupt the optical path between the camera 30 and the lower signal source 20 in
[0088] In addition, the embodiment makes it clear that, for example, it is possible to determine based on the actual image from camera 32, whether there is an obstacle—as in
[0089] The embodiment also makes it clear that the desired image or the desired arrangement is predicted on the basis of a kinematic and optical model that comprises known obstacles in the vicinity of the robot. If, for example, the person H is at the position shown in
[0090] Although embodiments have been explained in the preceding description, it is noted that numerous modifications are possible.
[0091] Thus, in addition or as an alternative to LEDs or laser pointers, the signal sources 20 can also have passive signal sources, in particular reflectors, which in one embodiment are specifically illuminated.
[0092] It is also noted that the embodiments are merely examples that are not intended to restrict the scope of protection, the applications and the structure in any way. Rather, the preceding description provides a person skilled in the art with guidelines for implementing at least one embodiment, with various changes, in particular with regard to the function and arrangement of the described components, being able to be made without departing from the scope of protection as it arises from the claims and from these equivalent combinations of features.
[0093] While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.
REFERENCE SIGNS
10 Robot
11 Base
12 Carousel
13 Link arm
14 Arm
15 End effector
20 Light source (laser pointer; LED)
30-32 Camera
100 Monitoring device
200 Deflection mirror
H Person