HANDLING ASSEMBLY COMPRISING A HANDLING DEVICE FOR CARRYING OUT AT LEAST ONE WORK STEP, METHOD, AND COMPUTER PROGRAM
20200246974 · 2020-08-06
Inventors
- Christian Knoll (Stuttgart, DE)
- Corinna Pfeiffer-Scheer (Markgroeningen, DE)
- Dieter Kunz (Ditzingen, DE)
- Jens Hofele (Lenningen, DE)
- Peter Schlaich (Leonberg, DE)
CPC classification
- G05B2219/40099 (PHYSICS)
- B25J9/1664 (PERFORMING OPERATIONS; TRANSPORTING)
- G05B2219/40323 (PHYSICS)
- G05B2219/40607 (PHYSICS)
Abstract
A handling assembly having a handling device for carrying out at least one work step with and/or on a workpiece in a working region of the handling device, stations being situated in the working region; at least one monitoring sensor for optically monitoring the working region and providing the result as monitoring data; and a localization module designed to recognize the stations and to determine a station position for each of the stations.
Claims
1-15. (canceled)
16. A handling assembly, comprising: a handling device configured to carry out at least one work step with and/or on a workpiece in a working region of the handling device, stations being situated in the working region; at least one monitoring sensor configured to optically monitor the working region and to provide monitoring data based on the optical monitoring; and a localization module configured to recognize the stations and to determine a respective station position for each of the stations.
17. The handling assembly as recited in claim 16, wherein the localization module includes a training module with training data, the training module being configured to determine, based on the training data, recognition features as station recognition data for detection of the stations, the localization module being configured to recognize the stations based on the station recognition data.
18. The handling assembly as recited in claim 16, further comprising: a model production module configured to generate a model of the working region.
19. The handling assembly as recited in claim 18, further comprising: a display unit configured to display the generated model, segments being selectable in the displayed model by a user as additional recognition features, the station recognition data including the additional recognition features.
20. The handling assembly as recited in claim 16, further comprising: a control module configured to control the handling device to carry out the work step based on the respective station position.
21. The handling assembly as recited in claim 16, further comprising: a task definition module for a semantic selection and/or definition of the work step.
22. The handling assembly as recited in claim 21, wherein the work step has at least two parameters that are to be defined, the two parameters including a start position and an end position, such that for the user, one of the stations is selectable in the displayed model as the start position, and one of the stations is selectable and/or is capable of being set in the displayed model as the end position.
23. The handling assembly as recited in claim 22, wherein, after termination of the work step, the workpiece is in an end position, the end position being selectable by the user via the task definition module and/or being determined based on the monitoring data.
24. The handling assembly as recited in claim 16, further comprising: a path planning module configured to determine a trajectory of the handling device and/or of the workpiece during a carrying out of the work step.
25. The handling assembly as recited in claim 16, further comprising: a safety module connected in terms of data to the monitoring sensor to take over the monitoring data during the work step, and being configured to control the handling device based on changes in the working region.
26. The handling assembly as recited in claim 16, further comprising: a testing module including rules and configured to control the handling device to carry out and/or to terminate the work step based on the following of the rules.
27. The handling assembly as recited in claim 17, further comprising: an additional sensor configured for fine resolution of a segment of the working region and providing fine resolution data based on the fine resolution; and a fine localization module configured for a more precise determination of an orientation of the stations based on the respective station position, and/or the fine resolution data, and/or the station recognition data, and/or the sensor data.
28. The handling assembly as recited in claim 27, wherein the additional sensor is carried along on the handling device.
29. A method for operating a handling device with a handling assembly, comprising the following steps: optically monitoring, using a monitoring sensor, a working region of the handling device; providing, by the monitoring sensor, monitoring data of the working region; recognizing stations in the working region based on the monitoring data; and determining a station position for the recognized stations.
30. A non-transitory computer-readable storage medium on which is stored a computer program having program code for operating a handling device with a handling assembly, the computer program, when executed by a computer or the handling assembly, causing the computer or handling assembly to perform the following steps: optically monitoring, using a monitoring sensor, a working region of the handling device; providing, by the monitoring sensor, monitoring data of the working region; recognizing stations in the working region based on the monitoring data; and determining a station position for the recognized stations.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0037] Stations 5 are situated in working region 3. Stations 5 are for example deposition locations for a workpiece 6. For example, one station 5 can be understood as a workpiece source, and another station 5 can be understood as a workpiece end point. For example, one station 5 is a pallet on which the workpieces are situated and/or are capable of being situated. Stations 5 are preferably situated at fixed locations in working region 3;
[0038] alternatively, stations 5 can be displaced and/or are movable in working region 3 of handling device 2.
[0039] Handling device 2 is designed to carry out a work step. The work step can, for example, be to grasp a workpiece 6 at a first station 5 with gripper 4, transport workpiece 6 to the other station 5, and put it down there. In addition, handling device 2 can carry out a multiplicity of work steps, for example grasping workpiece 6 with gripper 4 and processing workpiece 6, for example with a drill.
[0040] Handling assembly 1 includes two monitoring sensors 7. Monitoring sensors 7 are designed as monitoring cameras. Monitoring sensors 7 are designed for the optical monitoring of working region 3. For this purpose, monitoring sensors 7 record working region 3 in the form of monitoring images. Monitoring sensors 7 are designed to provide monitoring data, the monitoring data including in particular the monitoring images. In particular, monitoring sensors 7 are configured in such a way that the monitoring images have a region of overlap, the region of overlap showing a common region of working region 3. Particularly preferably, monitoring sensors 7 are stereo cameras, these stereo cameras producing a three-dimensional image of working region 3. The monitoring data are provided to a central evaluation unit 8.
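The overlap between the two monitoring sensors described above can be sketched as follows. This is a minimal illustration only; the class and function names, and the reduction of each sensor's coverage to a one-dimensional angular interval, are assumptions made for the sketch and are not specified in the application.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MonitoringData:
    """Hypothetical container for one sensor's monitoring data."""
    sensor_id: int
    images: List = field(default_factory=list)      # monitoring images (placeholder)
    field_of_view: Tuple[float, float] = (0.0, 0.0)  # covered interval of the working region

def overlap_region(a: MonitoringData, b: MonitoringData) -> Tuple[float, float]:
    """Return the common interval seen by both sensors, or (0, 0) if none.

    The application requires the monitoring images to have a region of
    overlap showing a common part of working region 3."""
    lo = max(a.field_of_view[0], b.field_of_view[0])
    hi = min(a.field_of_view[1], b.field_of_view[1])
    return (lo, hi) if lo < hi else (0.0, 0.0)
```

A stereo pair of such sensors would additionally supply depth, yielding the three-dimensional image of the working region mentioned above.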
[0041] Central evaluation unit 8 is designed, for example, as a computing unit. Central evaluation unit 8 can be situated in decentralized fashion, for example in a server room; alternatively, central evaluation unit 8 can be integrated, for example, into handling device 2.
[0042] Central evaluation unit 8 includes a localization module 9. The monitoring data are provided to localization module 9. Localization module 9 includes station recognition data 10. Station recognition data 10 include in particular information and/or features that permit the inference of a station in the monitoring data and/or in the monitoring images. For example, station recognition data 10 include information about the geometry, the contours, the contrast, and/or the structure of stations 5 in the monitoring data and/or monitoring images.
[0043] Localization module 9 is designed to recognize a station based on the monitoring data and station recognition data 10, and, based thereon, to determine station positions 11 for a recognized station. Station positions 11 are in particular coordinates in a three-dimensional space, and indicate the position of station 5 in working region 3 and/or in the monitoring images. In addition, station positions 11 can also include information about the orientation, for example the angular position.
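The localization step described in paragraphs [0042] and [0043] can be sketched as matching recognition features against feature candidates detected in the monitoring data. The scalar descriptor, the tolerance-based matching rule, and all names below are assumptions for illustration; the application leaves the concrete recognition method open.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class StationPosition:
    """Coordinates in three-dimensional space plus orientation (angular position)."""
    x: float
    y: float
    z: float
    yaw: float  # orientation in radians

def recognize_stations(
    detections: List[Tuple[float, Tuple[float, float, float], float]],
    recognition_data: Dict[str, float],
    tol: float = 0.1,
) -> Dict[str, StationPosition]:
    """detections: (descriptor, (x, y, z), yaw) candidates from the monitoring data.
    recognition_data: station name -> reference descriptor (e.g. a contour feature)."""
    positions = {}
    for name, ref in recognition_data.items():
        for desc, (x, y, z), yaw in detections:
            # Accept the first candidate whose descriptor lies within tolerance
            # of the station recognition data.
            if abs(desc - ref) <= tol:
                positions[name] = StationPosition(x, y, z, yaw)
                break
    return positions
```

In practice the descriptor would encode the geometry, contours, contrast, and/or structure features that the station recognition data 10 are said to contain.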
[0044] Central evaluation unit 8 includes a model production module 12. Model production module 12 receives station positions 11 and the monitoring data. Model production module 12 is designed to produce a model 13 of working region 3, with stations 5 and handling device 2, based on the monitoring data and station positions 11. Model 13 is a three-dimensional model; preferably, model 13 is a CAD model of working region 3, including stations 5 and handling device 2. For describing the orientation and/or the positions of stations 5 and/or of handling device 2 in model 13, model production module 12 can include an auxiliary coordinate system 14.
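Expressing a station position in the auxiliary coordinate system 14 can be sketched as a frame transform. The reduction to a planar rotation plus translation is an assumption made to keep the sketch short; the application does not fix the transform.

```python
import math
from typing import Tuple

def to_model_frame(
    point: Tuple[float, float],
    origin: Tuple[float, float],
    yaw: float,
) -> Tuple[float, float]:
    """Transform a world-frame (x, y) point into an auxiliary coordinate
    frame whose origin and rotation angle (yaw, radians) are given in
    world coordinates."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    c, s = math.cos(-yaw), math.sin(-yaw)
    # Rotate the offset by -yaw to express it in the auxiliary frame.
    return (c * dx - s * dy, s * dx + c * dy)
```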
[0045] Central evaluation unit 8 has a task definition module 15. Task definition module 15 is designed to define and/or select the work step that is to be carried out on workpiece 6 by handling device 2 in working region 3. In particular, task definition module 15 is designed in such a way that a user can define and/or select the task and/or the work step more precisely on a semantic basis. For example, for this purpose task definition module 15 includes semantic phrases 16, such as grip, lift, or transport. The user can define and/or link these semantic phrases 16 by determining and/or inputting station positions 11. In addition, the user can complete the task and/or the semantic phrases 16 by determining an end position 17. End position 17 includes, in addition to the coordinates determining the deposition location, information about the orientation in space, for example three Euler angles. Alternatively, it can be provided that, using task definition module 15, the user can define and/or select the task and/or the work step via optical selection and/or optical marking.
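A semantic task of the kind described in paragraph [0045] could be encoded as a sequence of (phrase, station) pairs plus an end pose carrying three Euler angles. The phrase vocabulary and the class names are assumptions for this sketch.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Assumed vocabulary of semantic phrases 16 (the application names
# grip, lift, and transport as examples).
PHRASES = {"grip", "lift", "transport", "put_down"}

@dataclass
class Task:
    """A work step defined by linking semantic phrases to stations."""
    steps: List[Tuple[str, str]] = field(default_factory=list)
    # End position 17: coordinates plus orientation as three Euler angles,
    # i.e. (x, y, z, roll, pitch, yaw).
    end_pose: Optional[Tuple[float, float, float, float, float, float]] = None

    def add(self, phrase: str, station: str) -> "Task":
        if phrase not in PHRASES:
            raise ValueError(f"unknown semantic phrase: {phrase}")
        self.steps.append((phrase, station))
        return self
```

A user interaction would then build, for example, `Task().add("grip", "station_A").add("transport", "station_B")` and set `end_pose` for the deposition.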
[0046] Central evaluation unit 8 includes a path planning module 18. Path planning module 18 is designed to plan, based on the task, the work step, and/or station positions 11, a trajectory X(t), this trajectory X(t) describing the path-time curve of workpiece 6 during the work step. Path planning module 18 is additionally designed to determine trajectory X(t) in such a way that trajectory X(t) is collision-free, i.e., no collision of workpiece 6 with handling device 2 and/or with objects in working region 3 occurs.
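A minimal version of the collision-free check in paragraph [0046] can be sketched by sampling a straight-line trajectory X(t) and testing each sample against spherical obstacles. This is a deliberate simplification for illustration, not the planning method of the application.

```python
from typing import List, Tuple

Point = Tuple[float, float, float]

def trajectory(start: Point, end: Point, t: float) -> Point:
    """Linear path-time curve X(t) for t in [0, 1]."""
    return tuple(s + t * (e - s) for s, e in zip(start, end))

def is_collision_free(
    start: Point,
    end: Point,
    obstacles: List[Tuple[Point, float]],
    samples: int = 50,
) -> bool:
    """obstacles: (center, radius) spheres standing in for objects in
    working region 3. Returns True if no sampled point of X(t) enters
    any obstacle."""
    for i in range(samples + 1):
        p = trajectory(start, end, i / samples)
        for center, radius in obstacles:
            if sum((a - b) ** 2 for a, b in zip(p, center)) ** 0.5 < radius:
                return False
    return True
```

A real planner would reroute around obstacles rather than merely reject a colliding straight line; the sketch only shows the collision test that any such planner needs.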
[0047] In addition, it is provided that central evaluation unit 8 includes a control module, the control module being designed to control handling device 2 to carry out the work step. For example, the control module controls handling device 2 in such a way that handling device 2 grips workpiece 6 with gripper 4 and transports it along trajectory X(t).
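The control module's grip-transport-release sequence can be sketched as a simple command loop over trajectory waypoints. The `HandlingDevice` interface below is an assumption introduced only for this illustration.

```python
from typing import List, Tuple

class HandlingDevice:
    """Hypothetical command interface of handling device 2; real hardware
    would replace the logging with actuator commands."""
    def __init__(self) -> None:
        self.log: List = []

    def grip(self) -> None:
        self.log.append("grip")

    def move_to(self, p: Tuple[float, float, float]) -> None:
        self.log.append(("move", p))

    def release(self) -> None:
        self.log.append("release")

def execute_work_step(device: HandlingDevice, waypoints: List[Tuple[float, float, float]]) -> List:
    """Grip the workpiece, follow the planned trajectory X(t) sampled as
    waypoints, and put the workpiece down at the end position."""
    device.grip()
    for p in waypoints:
        device.move_to(p)
    device.release()
    return device.log
```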
[0051] In a task definition step 300, a person defines a task. In particular, the task is defined and/or selected by the person in a semantic and/or optical selection. For example, for this purpose the user can select previously accomplished tasks, for example transport and drill workpiece 6. These selected tasks can be defined more precisely in particular using station positions 11, for example grasp a workpiece from station 5 at station position 11 and drill this workpiece 6.
[0052] In a planning step 400, the work step is planned based on the defined task and the station positions, and a trajectory X(t) is determined, this trajectory being a trajectory of workpiece 6 that is free of collisions with objects in working region 3. Based on this trajectory X(t), handling device 2 is controlled in order to carry out the work step.
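The overall method (monitoring, station recognition, task definition step 300, planning step 400, control) can be summarized as a pipeline of interchangeable stages. The function shape below is an assumption; each stage stands in for the corresponding module described above.

```python
from typing import Any, Callable

def run_pipeline(
    monitor: Callable[[], Any],            # optical monitoring of the working region
    recognize: Callable[[Any], Any],       # station recognition and station positions 11
    define_task: Callable[[Any], Any],     # task definition step 300
    plan: Callable[[Any, Any], Any],       # planning step 400, yields trajectory X(t)
    control: Callable[[Any], Any],         # controlling the handling device
) -> Any:
    data = monitor()
    positions = recognize(data)
    task = define_task(positions)
    traj = plan(task, positions)
    return control(traj)
```

Each callable would be backed by the corresponding module of central evaluation unit 8 (localization module 9, task definition module 15, path planning module 18, and the control module).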