HANDLING ASSEMBLY COMPRISING A HANDLING DEVICE FOR CARRYING OUT AT LEAST ONE WORK STEP, METHOD, AND COMPUTER PROGRAM

20200246974 · 2020-08-06

    Abstract

    A handling assembly having a handling device for carrying out at least one work step with and/or on a workpiece in a working region of the handling device, stations being situated in the working region; at least one monitoring sensor for optically monitoring the working region and providing the result as monitoring data; and a localization module designed to recognize the stations and to determine a station position for each of the stations.

    Claims

    1-15. (canceled)

    16. A handling assembly, comprising: a handling device configured to carry out at least one work step with and/or on a workpiece in a working region of the handling device, stations being situated in the working region; at least one monitoring sensor configured to optically monitor the working region and to provide monitoring data based on the optical monitoring; and a localization module configured to recognize the stations and to determine a respective station position for each of the stations.

    17. The handling assembly as recited in claim 16, wherein the localization module includes a training module with training data, the training module being configured to determine, based on the training data, recognition features as station recognition data for detection of the stations, the localization module being configured to recognize the stations based on station recognition data.

    18. The handling assembly as recited in claim 16, further comprising: a model production module configured to generate a model of the working region.

    19. The handling assembly as recited in claim 18, further comprising: a display unit configured to display the generated model, segments being selectable in the displayed model by a user as additional recognition features, the station recognition data including the additional recognition features.

    20. The handling assembly as recited in claim 16, further comprising: a control module configured to control the handling device to carry out the work step based on the respective station position.

    21. The handling assembly as recited in claim 16, further comprising: a task definition module for a semantic selection and/or definition of the work step.

    22. The handling assembly as recited in claim 21, wherein the work step has at least two parameters that are to be defined, the two parameters including a start position and an end position, such that for the user, one of the stations is selectable in the displayed model as the start position, and one of the stations is selectable and/or is capable of being set in the displayed model as the end position.

    23. The handling assembly as recited in claim 22, wherein, after termination of the work step, the workpiece is in an end position, the end position being selectable by the user via the task definition module and/or being determined based on the monitoring data.

    24. The handling assembly as recited in claim 16, further comprising: a path planning module configured to determine a trajectory of the handling device and/or of the workpiece during a carrying out of the work step.

    25. The handling assembly as recited in claim 16, further comprising: a safety module connected in terms of data to the monitoring sensor to take over the monitoring data during the work step, and being configured to control the handling device based on changes in the working region.

    26. The handling assembly as recited in claim 16, further comprising: a testing module including rules and configured to control the handling device to carry out and/or to terminate the work step based on the following of the rules.

    27. The handling assembly as recited in claim 17, further comprising: an additional sensor configured for fine resolution of a segment of the working region and providing fine resolution data based on the fine resolution; and a fine localization module configured for a more precise determination of an orientation of the stations based on the respective station position, and/or the fine resolution data, and/or the station recognition data, and/or the sensor data.

    28. The handling assembly as recited in claim 27, wherein the additional sensor is carried along on the handling device.

    29. A method for operating a handling device with a handling assembly, comprising the following steps: optically monitoring, using a monitoring sensor, a working region of the handling device; providing, by the monitoring sensor, monitoring data of the working region; recognizing stations in the working region based on the monitoring data; and determining a station position for the recognized stations.

    30. A non-transitory computer-readable storage medium on which is stored a computer program having program code for operating a handling device with a handling assembly, the computer program, when executed by a computer or the handling assembly, causing the computer or handling assembly to perform the following steps: optically monitoring, using a monitoring sensor, a working region of the handling device; providing, by the monitoring sensor, monitoring data of the working region; recognizing stations in the working region based on the monitoring data; and determining a station position for the recognized stations.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0033] FIG. 1 shows a schematic view of an exemplary embodiment of a handling assembly.

    [0034] FIGS. 2a and 2b show schematic views of a display unit of the handling assembly of FIG. 1.

    [0035] FIG. 3 shows a flow diagram for an exemplary embodiment of the method for carrying out a work step with the handling device.

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0036] FIG. 1 shows a schematic view of a handling assembly 1. Handling assembly 1 includes a handling device 2. Handling device 2 is situated in a working region 3. Handling device 2 is realized as a multi-axis robot, and is in particular movable and/or pivotable about at least three axes. Handling device 2 has a gripper 4. Gripper 4 is capable of being moved in working region 3 by handling device 2. In particular, gripper 4 is movable in three dimensions in working region 3. Working region 3 is, for example, a manufacturing plant, a production area, and/or a factory floor segment.

    [0037] Stations 5 are situated in working region 3. Stations 5 are for example deposition locations for a workpiece 6. For example, one station 5 can be understood as a workpiece source, and another station 5 can be understood as a workpiece end point. For example, one station 5 is a pallet on which the workpieces are situated and/or are capable of being situated. Stations 5 are preferably situated at fixed locations in working region 3.

    [0038] Alternatively, stations 5 can be displaced and/or are movable in working region 3 of handling device 2.

    [0039] Handling device 2 is designed to carry out a work step. The work step can, for example, be to grasp a workpiece 6 at a first station 5 with gripper 4, transport workpiece 6 to the other station 5, and put it down there. In addition, handling device 2 can carry out a multiplicity of work steps, for example grasping workpiece 6 with gripper 4 and processing workpiece 6, for example with a drill.

    [0040] Handling assembly 1 includes two monitoring sensors 7. Monitoring sensors 7 are designed as monitoring cameras. Monitoring sensors 7 are designed for the optical monitoring of working region 3. For this purpose, monitoring sensors 7 record working region 3 in the form of monitoring images. Monitoring sensors 7 are designed to provide monitoring data, the monitoring data including in particular the monitoring images. In particular, monitoring sensors 7 are configured in such a way that the monitoring images have a region of overlap, the region of overlap showing a common region of working region 3. Particularly preferably, monitoring sensors 7 are stereo cameras, these stereo cameras producing a three-dimensional image of working region 3. The monitoring data are provided to a central evaluation unit 8.
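The stereo cameras mentioned in paragraph [0040] recover a three-dimensional image from the disparity between the two monitoring images. As a minimal illustrative sketch only (not part of the patent), and assuming rectified pinhole cameras with known focal length and baseline, depth follows from disparity as Z = f·B/d:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a stereo disparity (pixels) to a depth (metres).

    Assumes rectified pinhole cameras with focal length `focal_px`
    (in pixels) separated by a baseline of `baseline_m` metres.
    These parameter names are illustrative, not taken from the patent.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

A point seen with 50 px disparity by cameras with a 1000 px focal length and a 0.1 m baseline thus lies at a depth of 2 m.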

    [0041] Central evaluation unit 8 is designed, for example, as a computing unit. It can be provided that central evaluation unit 8 is situated in decentralized fashion, for example in a server room; alternatively, central evaluation unit 8 is integrated, for example, into handling device 2.

    [0042] Central evaluation unit 8 includes a localization module 9. The monitoring data are provided to localization module 9. Localization module 9 includes station recognition data 10. Station recognition data 10 include in particular information and/or features that permit the inference of a station in the monitoring data and/or in the monitoring images. For example, station recognition data 10 include information about the geometry, the contours, the contrast, and/or the structure of stations 5 in the monitoring data and/or monitoring images.

    [0043] Localization module 9 is designed to recognize a station based on the monitoring data and station recognition data 10, and, based thereon, to determine station positions 11 for a recognized station. Station positions 11 are in particular coordinates in a three-dimensional space, and indicate the position of station 5 in working region 3 and/or in the monitoring images. In addition, station positions 11 can also include information about the orientation, for example the angular position.
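The recognition described in paragraphs [0042] and [0043] can be sketched in miniature. The station types, signature values, and function names below are assumptions chosen for illustration, not taken from the patent; a coarse geometric signature (contour area, aspect ratio) stands in for station recognition data 10:

```python
import math

# Hypothetical recognition entries standing in for station recognition
# data 10: each known station type has a coarse geometric signature.
STATION_RECOGNITION_DATA = {
    "pallet": {"area": 1.2, "aspect": 1.5},
    "bin": {"area": 0.4, "aspect": 1.0},
}

def recognize_station(area, aspect, tolerance=0.2):
    """Return the best-matching station type within tolerance, else None."""
    best, best_err = None, tolerance
    for name, signature in STATION_RECOGNITION_DATA.items():
        err = abs(area - signature["area"]) + abs(aspect - signature["aspect"])
        if err < best_err:
            best, best_err = name, err
    return best

def station_position(x, y, z, yaw_deg):
    """Bundle 3-D coordinates with an orientation angle, as in [0043]."""
    return {"xyz": (x, y, z), "yaw_rad": math.radians(yaw_deg)}
```

A detected contour with area 1.25 and aspect ratio 1.45 would here be recognized as a "pallet", and its pose recorded as coordinates plus an angular position.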

    [0044] Central evaluation unit 8 includes a model production module 12. Model production module 12 receives station positions 11 and the monitoring data. Model production module 12 is designed to produce a model 13 of working region 3 with stations 5 and handling device 2, based on the monitoring data and station positions 11. Model 13 is here a three-dimensional model. Preferably, model 13 is a CAD model of working region 3, including stations 5 and handling device 2. For describing the orientation and/or the positions of stations 5 and/or of handling device 2 in model 13, model production module 12 can include an auxiliary coordinate system 14.

    [0045] Central evaluation unit 8 has a task definition module 15. Task definition module 15 is designed to define and/or select the work step that is to be carried out on workpiece 6 by handling device 2 in working region 3. In particular, task definition module 15 is designed in such a way that a user can more precisely define and/or select the task and/or the work step on a semantic basis. For example, for this purpose task definition module 15 includes semantic phrases 16, such as grip, lift, or transport. The user can define and/or link these semantic phrases 16 by determining and/or inputting station positions 11. In addition, the user can also complete the task and/or the semantic phrases 16 by determining an end position 17. End position 17 includes, in addition to the coordinates for determining the deposition location, information about the orientation in space, for example three Euler angles. Alternatively, it can be provided that, using task definition module 15, the user can define and/or select the task and/or the work step via optical selection and/or optical marking.
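The semantic task definition of paragraph [0045] can be sketched as linking semantic phrases with stations. The phrase set, station labels, and field names below are illustrative assumptions, not the patent's interface:

```python
# Illustrative stand-in for semantic phrases 16.
SEMANTIC_PHRASES = {"grip", "lift", "transport", "put_down"}

def define_work_step(phrase, start_station, end_station=None):
    """Link a semantic phrase with a start (and optional end) station."""
    if phrase not in SEMANTIC_PHRASES:
        raise ValueError(f"unknown phrase: {phrase}")
    step = {"phrase": phrase, "start": start_station}
    if end_station is not None:
        step["end"] = end_station
    return step

# A task assembled from linked phrases, as a user might define it.
task = [
    define_work_step("grip", start_station="5b"),
    define_work_step("transport", start_station="5b", end_station="5a"),
    define_work_step("put_down", start_station="5a"),
]
```

Each phrase is completed with station positions, mirroring how the user links phrases by inputting station positions 11 and an end position 17.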

    [0046] Central evaluation unit 8 includes a path planning module 18. Path planning module 18 is designed, based on the task, the work step, and/or station positions 11, to plan a trajectory X(t), this trajectory X(t) describing the path-time curve of workpiece 6 during the work step. Path planning module 18 is in addition designed to determine trajectory X(t) in such a way that the trajectory X(t) is collision-free, i.e., workpiece 6 does not collide with handling device 2 and/or with objects in working region 3.
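The collision-free planning of paragraph [0046] admits many strategies. The following sketch assumes a simple lift-move-lower heuristic, an assumption for illustration rather than the patent's planner: the workpiece is raised to a clearance height, moved horizontally, and lowered, so it passes above any obstacle shorter than that clearance:

```python
def plan_trajectory(start, end, clearance=0.5, steps=20):
    """Plan a simple path-time curve X(t) as a list of waypoints.

    Illustrative sketch only: lift to a clearance height above both
    endpoints, translate horizontally, then lower to the end point.
    `start` and `end` are (x, y, z) tuples.
    """
    sx, sy, sz = start
    ex, ey, ez = end
    top = max(sz, ez) + clearance  # safe transit height
    up = [(sx, sy, sz + (top - sz) * i / steps) for i in range(steps + 1)]
    across = [(sx + (ex - sx) * i / steps,
               sy + (ey - sy) * i / steps, top) for i in range(1, steps + 1)]
    down = [(ex, ey, top + (ez - top) * i / steps) for i in range(1, steps + 1)]
    return up + across + down
```

The resulting waypoint list starts at the start position, stays at the transit height while crossing the working region, and ends at the end position.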

    [0047] In addition, it is provided that central evaluation unit 8 includes a control module, the control module being designed to control handling device 2 to carry out the work step. For example, the control module controls handling device 2 in such a way that handling device 2 grips workpiece 6 with gripper 4 and transports it along trajectory X(t).

    [0048] FIG. 2a shows the view of a display unit, model 13 of working region 3 being displayed. Model 13 includes four stations 5a, 5b, 5c, and 5d. Stations 5a, 5c, and 5d form workpiece end points, and station 5b forms a workpiece source. The work step defined and/or selected via this model is a work step based on the workpiece. The work step includes three processes 19a, 19b, and 19c. Processes 19a, 19b, and 19c are processes that can be carried out by a single arm of handling device 2. For example, process 19a is defined as: grasp a workpiece 6 at station 5b and put it down at station 5a. Process 19b is defined, for example, as: grasp a workpiece 6 at station 5b, transport it to station 5d, and put it down there. Process 19c is defined, for example, as: grasp workpiece 6 at station 5b and put it down at station 5c. For example, the work step can be defined in that a user moves a workpiece 6 from one station 5 to another station 5, this work step corresponding to the transfer of the workpiece from the first station to the second station.

    [0049] FIG. 2b also shows a model 13 of working region 3, this model including, as workpiece sources, stations 5a, 5b, and 5e. Model 13 includes, as workpiece end points, stations 5c, 5d, and 5f. The work step defined and/or selected via this model is a work step based on a pallet. This means in particular that this work step does not transport and/or process any individual workpiece 6; rather, an entire workpiece pallet is transported and/or processed. In particular, processes 19a, 19b, and 19c for carrying out the work step are processes that are to be carried out using two arms of handling device 2. Process 19a is, for example, designed to transport a pallet of workpieces 6 from station 5a to station 5c. Process 19b is, for example, defined so as to transport a pallet from station 5b to station 5f. Process 19c is designed to transport a pallet of workpieces 6 from station 5e to station 5d. In particular, model 13 also illustrates that stations can have different shapes and/or sizes, station 5d being square and much smaller than rectangular station 5a.

    [0050] FIG. 3 shows a flow schema of a method for carrying out a work step with handling device 2. In a training step 100, a large amount of training data is provided to handling assembly 1 and/or to localization module 9. The training data include, for example, images showing working regions 3 with stations 5. Here, localization module 9 includes a training module, and, in training step 100, the training module extracts, from the training data, recognition features for the detection of stations in the monitoring data. These recognition features are stored as station recognition data 10. In particular, classifications and/or structures for recognizing the stations are obtained in this step. This step can be carried out, for example, by a neural network. Training step 100 is followed by a localization step 200. In localization step 200, based on the monitoring data, station positions 11 of a station 5 and/or of a multiplicity of stations 5 are determined. Here, for example, the monitoring data and/or the monitoring images are examined for structures and/or features that indicate stations 5. Based on the stations 5 that are found, the positions and/or orientations of stations 5 are determined as station positions 11.
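Training step 100 and localization step 200 can be sketched end to end. The patent mentions a neural network; here, as an illustrative assumption, a nearest-neighbour classifier over hand-made (area, aspect) feature vectors stands in for it, and the data are invented for the example:

```python
# Illustrative labelled training data: (area, aspect) feature -> label.
TRAINING_DATA = [
    ((1.2, 1.5), "station"),
    ((0.4, 1.0), "station"),
    ((0.1, 3.0), "background"),
]

def train(training_data):
    """Training step 100 (sketch): store the labelled feature vectors."""
    return list(training_data)

def localize(model, detections):
    """Localization step 200 (sketch): keep positions of detections
    whose nearest training feature is labelled as a station."""
    positions = []
    for feature, position in detections:
        nearest = min(model, key=lambda sample: sum(
            abs(a - b) for a, b in zip(sample[0], feature)))
        if nearest[1] == "station":
            positions.append(position)
    return positions
```

A detection resembling a pallet is kept with its position, while background structures are discarded, yielding the station positions 11 used by the downstream modules.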

    [0051] In a task definition step 300, a person defines a task. In particular, the task is defined and/or selected by the person via a semantic and/or optical selection. For example, for this purpose the user can select predefined tasks, for example transporting and drilling workpiece 6. These selected tasks can be defined more precisely using, in particular, station positions 11, for example: grasp a workpiece 6 from station 5 at station position 11 and drill this workpiece 6.

    [0052] In a planning step 400, based on the defined task and station positions 11, the work step is planned and a trajectory X(t) is determined, this trajectory being a trajectory of workpiece 6 free of collisions with objects in working region 3. Based on this trajectory X(t), handling device 2 is controlled in order to carry out the work step.