Method for controlling a plurality of mobile driverless manipulator systems

11117260 · 2021-09-14

Abstract

The invention relates to a method for controlling a plurality of mobile driverless manipulator systems (10, 20), in particular driverless transport vehicles in a logistics environment (40) for manipulating objects (30). In the method, ambient information is provided by a central control device (50), and in one step, an object to be moved (30) in the surroundings is detected. The position and the pose of the detected object are used for updating the ambient information and are taken into account in the path planning of the mobile driverless manipulator systems (10, 20) in that, prior to a movement of the detected object (30), a first mobile driverless manipulator system (10) is used to check whether the detected object (30) is needed for the orientation of a second mobile driverless manipulator system (20).

Claims

1. A method for controlling a plurality of mobile driverless manipulator systems (10, 20) in a logistics environment (40) for moving objects (30), wherein the mobile driverless manipulator systems (10, 20) each comprise a sensor for orientation and have a wireless communication link in order to communicate with at least one central control unit (50), which method comprises the following steps: providing environmental information by way of the at least one central control unit (50); detecting an object (30) to be manipulated by the mobile driverless manipulator systems (10, 20) in the environment and determining position and pose of the detected object (30); updating the environmental information with the position and pose of the detected object (30); taking into consideration the position and pose of the detected object (30) in a path plan of one or more of the mobile driverless manipulator systems (10, 20), by checking before a manipulation of the detected object (30) by a first mobile driverless manipulator system (10) whether the detected object (30) is required for orientation of a second mobile driverless manipulator system (20).

2. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, wherein the sensors are optical sensors.

3. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 2, wherein the detection of position and pose of the detected object (30) is carried out with the aid of the optical sensors of the mobile driverless manipulator systems (10, 20).

4. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 3, wherein the comparison of acquired sensor data to models of objects to be recognized takes place in a decentralized manner in data processing devices of the mobile driverless manipulator systems (10, 20).

5. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 2, wherein at least one of the mobile driverless manipulator systems (10, 20) detects the detected object (30) with its sensor and transmits a message representative of the position and pose of the detected object (30) via the wireless communication link to the central control unit (50) to update the environmental information.

6. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 5, wherein the transmission of the message is carried out using timestamps in order to acquire a precise point in time of the detection.

7. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 5, wherein the message contains the pose and an uncertainty of pose estimation for the detected object (30).

8. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 7, wherein the transmission of the message is carried out using timestamps in order to acquire a precise point in time of the detection.

9. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 8, wherein the message contains the pose and an uncertainty of pose estimation for the detected object (30).

10. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 2, wherein the optical sensors are laser scanners.

11. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 2, wherein the detection of position and pose of the detected object (30) is carried out with the aid of the optical sensors of the mobile driverless manipulator systems (10, 20), and is based on a comparison of acquired sensor data from the detection of position and pose of the detected object to models of objects to be recognized.

12. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following steps: using the updated environmental information to estimate the position of the detected object (30) in relation to the second mobile driverless manipulator system (20) and redetecting the detected object by the sensor of the second mobile driverless manipulator system.

13. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following steps: using the updated environmental information to estimate position of the object (30) in relation to the second mobile driverless manipulator system and intentionally searching for the detected object (30) by the second mobile driverless manipulator system to improve position estimation of the second mobile driverless manipulator system.

14. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following steps: redetecting the detected object (30) by the sensor of one of the mobile driverless manipulator systems (10, 20) and using sensor data from said redetecting to improve an accuracy of determination of the position and pose of the detected object.

15. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following step: responsive to determining that the detected object (30) is required for the orientation of the second mobile driverless manipulator system (20), deciding whether the path plan will be modified for the second mobile driverless manipulator system (20).

16. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following step: responsive to determining that the detected object (30) is required for the orientation of the second mobile driverless manipulator system (20), determining a time for which the detected object has to remain at its position until the second mobile driverless manipulator system has completed its orientation based on the detected object.

17. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, furthermore comprising the following step: managing orders relating to the detected object (30) in the central control unit (50) in a queue, in order to model the dependence of other orders on a present pose of the detected object.

18. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 17, furthermore comprising the following step: prioritizing the orders and responsive to determining that the detected object (30) is required for the orientation of a second driverless manipulator system (20), deciding whether the path plan for the second mobile driverless manipulator system (20) will be modified depending on the priority of the orders.

19. The method for controlling a plurality of mobile driverless manipulator systems (10, 20) as claimed in claim 1, wherein at least one of the mobile driverless manipulator systems (10, 20) detects the detected object (30) with its sensor and transmits a message representative of the position and pose of the detected object (30) via the wireless communication link to the central control unit (50) to update the environmental information.

20. A system for handling objects (30) to be moved in a logistics environment, comprising a plurality of mobile driverless manipulator systems (10, 20) for handling the objects and at least one central control unit (50), wherein the mobile driverless manipulator systems (10, 20) each comprise a sensor for orientation and have a wireless communication link in order to communicate with a central control unit (50), wherein the central control unit (50) and the plurality of mobile driverless manipulator systems (10, 20) are configured to carry out a method as claimed in claim 1.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The present invention will be described hereafter with reference to the appended figures. In the figures:

(2) FIG. 1 schematically shows localizing of a mobile manipulator system with the aid of a detected object;

(3) FIG. 2 schematically shows various path plans with and without object recognition;

(4) FIG. 3 schematically shows path plans utilizing the knowledge of position and pose of detected objects; and

(5) FIG. 4 shows a schematic flow chart of an exemplary method.

DETAILED DESCRIPTION

(6) FIG. 1 schematically shows an example of cooperative localizing with the aid of a detected object. Two mobile driverless manipulator systems, namely driverless transportation vehicles 10 and 20, are provided in an environment 40 in the example. The two manipulator systems 10, 20 have optical sensors (not shown), in particular laser scanners, which have a semicircular scanning region in the travel direction of the manipulator systems. The visual range of the sensors and the scanning region are indicated with the reference signs 11 and 21, respectively (see FIG. 1a). The first manipulator system 10 transports an object 30 to be moved and follows a path plan 12. In FIG. 1b, the manipulator system 10 has moved somewhat along its path plan 12 and deposits the object 30 at the indicated point. The deposited object 30 can be detected, for example, by stationary sensors (not shown), or its position and pose can be estimated on the basis of the position information of the manipulator system 10 when it deposits the object 30 in the position shown in FIG. 1b. Corresponding items of information are transferred to a central control unit 50, which communicates wirelessly with the manipulator systems, as indicated by the line 51. The updated items of environmental information, containing the estimated position and pose of the detected object 30, can now be used in the path plan of the second manipulator system 20. This path plan is indicated by the dashed line having the reference sign 23. The position and pose of the object 30 are only estimated and are relatively inaccurate at the time of FIG. 1b. This is indicated by the uncertainty ellipse 31, which is relatively large in FIGS. 1a and 1b.

(7) In FIG. 1c, the second manipulator system 20 has moved somewhat along its path 23. As a result of this movement, the localizing of the manipulator system 20 is less accurate than at the beginning of the movement, which is indicated by the uncertainty ellipse 22 (see FIG. 1c), enlarged in comparison to FIG. 1b. In the situation shown in FIG. 1c, the sensors of the manipulator system 20 acquire the object 30 at a position at which the object 30 would approximately be expected on the basis of the updated environmental information. This renewed detection of the object 30 can be used, on the one hand, to determine the position and pose of the object 30 more accurately. On the other hand, it can also be used to improve the position estimation of the manipulator system 20. This is indicated in FIG. 1d by the two uncertainty ellipses 31 and 22, which are accordingly reduced in size.
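The uncertainty reduction shown in FIGS. 1c and 1d can be sketched as a variance-weighted fusion of two independent estimates of the same quantity. This is a minimal illustration under the assumption of one-dimensional Gaussian pose errors; the patent does not prescribe an estimation algorithm, and all names below are illustrative.

```python
# Sketch of the uncertainty reduction in FIGS. 1c/1d: the redetection of
# the deposited object by the second vehicle yields a second, independent
# estimate of the object position, which is fused with the first one.
# Assumption: 1-D Gaussian errors; names are illustrative, not from the patent.

def fuse(mean_a, var_a, mean_b, var_b):
    """Variance-weighted fusion of two independent Gaussian estimates.
    The fused variance is always smaller than either input variance."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mean = var * (mean_a / var_a + mean_b / var_b)
    return mean, var

# Object position estimated when vehicle 10 deposits it (large ellipse 31):
obj_mean, obj_var = 4.0, 1.0
# Fused with the redetection by vehicle 20 (FIG. 1c):
obj_mean, obj_var = fuse(obj_mean, obj_var, 4.2, 0.5)
assert obj_var < 0.5  # smaller than either input, as in FIG. 1d
```

The same fusion step, applied with the roles exchanged, improves the position estimate of the manipulator system 20 itself, which is why both ellipses 22 and 31 shrink in FIG. 1d.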

(8) A total of four schematic illustrations 2a to 2d of possible path plans for a manipulator system are shown in FIG. 2. In this case, the illustrations 2a, 2b, and 2c correspond to a path plan without detection of the object 30, and FIG. 2d corresponds to a corresponding path plan with detection of the object 30. A manipulator system 10 is schematically shown in FIG. 2a, having corresponding sensors with a visual range 11. The planned destination of the manipulator system 10 is indicated by 10′. An object to be moved is again designated by the reference sign 30; it could not yet be taken into consideration in the path plan 12 of the manipulator system 10, however, since it has not yet been detected. The path plan 12 of the manipulator system thus presumes in the situation of FIG. 2a that the object 30 is not present. In the situation of FIG. 2a, however, the sensors acquire a part of the object 30, and therefore the path plan 12 has to be dynamically adapted. As shown in FIG. 2b, the manipulator system travels farther and dynamically adapts its path, since the sensors still detect blocking of the route by the object 30. In the situation of FIG. 2c, the manipulator system 10 establishes that it has to travel back again and take an entirely different route to reach the destination 10′ in order to bypass the object 30. Such detours can be avoided in the case of a path plan with knowledge of the position and orientation of the object 30.

(9) In the situation of FIG. 2d, the object 30 was detected and taken into consideration during the updating of the environmental information by the central control unit 50. As can be seen in FIG. 2d, the path plan 12 can now take the presence of the object 30 into consideration from the outset, and therefore the manipulator system 10 can travel to the destination 10′ without detours.
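The difference between FIGS. 2a to 2c and FIG. 2d can be illustrated with a minimal grid planner: once the detected object has been written into the environmental information, the planner treats the occupied cell as blocked from the outset instead of re-planning on contact. The grid representation and breadth-first search are assumptions for illustration only; the patent does not prescribe a particular planning algorithm.

```python
# Minimal sketch of planning with updated environmental information
# (FIG. 2d): the cell occupied by the detected object is blocked from
# the outset. Grid and algorithm are illustrative assumptions.
from collections import deque

def plan(grid, start, goal):
    """Breadth-first shortest path on a 4-connected grid; 1 = blocked."""
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # destination unreachable

# Environment map already updated with the detected object at (1, 1):
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan(grid, (0, 0), (2, 2))  # routes around the object from the start
```

Without the map update, the planner would only discover the blockage when the sensors acquire the object, forcing the dynamic re-planning and detour of FIGS. 2b and 2c.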

(10) In FIG. 3, a first manipulator system 10 is to transport an object 30 to be moved to another location. A second manipulator system 20 is to be moved to its destination 20′. For the path plan of the second manipulator system 20, the object 30 can be used as an orientation point, and therefore, as indicated in FIG. 3c, the manipulator system 20 can be guided on a direct route to its destination 20′. Without consideration of the detected object 30 in the path plan, the manipulator system 20 would have to take the bypass indicated in FIG. 3b, since in this example no other features are present for accurate localizing of the manipulator system 20, and the manipulator system 20 therefore cannot travel the direct route without the orientation point of the detected object 30. Before the detected object 30 is moved by the first manipulator system 10, it is therefore checked whether the detected object 30 is required for the orientation of the second mobile manipulator system 20. Since this is the case here, the movement of the object 30 by the first manipulator system 10 is deferred until the second manipulator system 20 has completed its orientation on the basis of the object 30. It is clear in this case that the second manipulator system 20 does not necessarily have to be moved all the way to its destination 20′ for this purpose; rather, it is sufficient if the sensors of the manipulator system 20 acquire the object 30 and thus enable accurate localizing of the manipulator system 20.
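The deferral check described for FIG. 3 can be sketched as a query against the pending orientation tasks known to the central control unit. The data structures and names below are illustrative assumptions, not part of the patent.

```python
# Sketch of the check before moving a detected object (FIG. 3): the move
# is deferred while any other vehicle still needs the object as an
# orientation landmark. Names and structures are illustrative assumptions.

def may_move(object_id, landmark_needs):
    """True if no pending orientation task depends on the given object.
    landmark_needs maps each vehicle to the set of objects it still
    requires for localizing."""
    return not any(object_id in needs for needs in landmark_needs.values())

# Vehicle 20 still needs object 30 for its orientation:
landmark_needs = {"vehicle_20": {"object_30"}}
assert may_move("object_30", landmark_needs) is False  # defer the move

# Once vehicle 20 has completed its orientation on the object:
landmark_needs["vehicle_20"].discard("object_30")
assert may_move("object_30", landmark_needs) is True   # move may proceed
```

As the description notes, completing the orientation does not require the second vehicle to reach its destination; the need can be cleared as soon as its sensors have acquired the object.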

(11) FIG. 4 schematically shows a flow chart of the method according to the invention. In step S1, items of environmental information are provided by a central control unit, for example a map of the environment in which the mobile manipulator systems are to move. In step S2, an object to be moved is detected in the environment, and the position and pose of the detected object are determined. These items of information are used in step S3 to update the environmental information with the position and pose of the detected object. In step S4, the position and pose of the detected object are then taken into consideration in the path plan of the mobile manipulator systems, in that it is checked, before a movement of the detected object by a first mobile manipulator system, whether the detected object is required for the orientation of a second manipulator system.
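Steps S1 to S3, together with the message contents mentioned in the claims (position, pose, an uncertainty of the pose estimation, and a timestamp), can be sketched as follows. All data formats and names are illustrative assumptions; the patent does not prescribe them.

```python
import time

# Minimal end-to-end sketch of steps S1-S3 of FIG. 4 and of the
# detection message of claims 5 to 9. Formats are illustrative assumptions.

def make_detection_message(obj_id, position, pose, uncertainty):
    """Message a vehicle sends to the central control unit: position and
    pose of the detected object, an uncertainty of the pose estimation,
    and a timestamp marking the point in time of the detection."""
    return {"object": obj_id, "position": position, "pose": pose,
            "uncertainty": uncertainty, "timestamp": time.time()}

def update_environment(env, msg):
    """Step S3: update the shared environmental information with the
    reported position and pose of the detected object."""
    env["objects"][msg["object"]] = msg
    return env

env = {"objects": {}}                                              # S1
msg = make_detection_message("object_30", (4.0, 2.0), 90.0, 0.3)   # S2
env = update_environment(env, msg)                                 # S3
# S4: the path planners read env["objects"], and before a detected object
# is moved, it is checked whether a second vehicle still requires it for
# orientation (cf. FIG. 3).
```

The timestamp allows the control unit to order detections of the same object from different vehicles, and the uncertainty field allows newer, more accurate detections to refine earlier ones.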

LIST OF REFERENCE SIGNS

(12) 10, 20 mobile driverless manipulator systems

(13) 11, 21 visual range of the sensors

(14) 30 object

(15) 22, 31 uncertainty ellipse

(16) 12, 23 path plan

(17) 40 environment

(18) 50 central control unit

(19) 51 line (wireless communication link)

(20) 10′, 20′ destination of the manipulator system