METHOD AND SYSTEM FOR LEARNING A DRIVING ROUTE OF AN AUTONOMOUSLY DRIVING AGRICULTURAL DRIVING ROBOT
20250377661 · 2025-12-11
Inventors
- Jan MÜHLNICKEL (Unna, DE)
- Rainer GRASER (Laupheim, DE)
- Timo BRESSMER (Ulm, DE)
- Peter FISCHER (Ulm, DE)
- Manuel WOPFNER (Langenau, DE)
CPC classification
G05D1/617
PHYSICS
G05D2111/32
PHYSICS
G05D2105/50
PHYSICS
G05D1/246
PHYSICS
International classification
G05D1/246
PHYSICS
Abstract
A method for learning a route of an autonomously driving agricultural robot involves providing an environmental map, driving along a route in the region of the environmental map from a starting point to an end point by manually controlling the driving robot, localizing the driving robot while it travels along the route with the aid of a sensor arranged on the driving robot, and recording coordinates of waypoints along the route based on the localization. The route is validated by autonomously driving along the route taking the stored coordinates of the waypoints into account, wherein the route is manually confirmed and the confirmed route is then marked as drivable autonomously.
Claims
1-19. (canceled)
20. A method for learning a route of an autonomously driving agricultural robot, the method comprising: providing an environmental map; driving the autonomously driving agricultural robot along a route lying within a region of the environmental map from a starting point to an end point by manually controlling the autonomously driving agricultural robot; localizing the autonomously driving agricultural robot while the autonomously driving agricultural robot travels along the route using at least one sensor arranged on the autonomously driving agricultural robot; recording coordinates of waypoints along the route based on the localizing; validating the route by autonomously driving the autonomously driving agricultural robot along the route taking into account the recorded coordinates of the waypoints, wherein the route is confirmed manually; and marking the manually confirmed route as autonomously drivable.
21. The method of claim 20, wherein, before validating the route, the autonomously driving agricultural robot is manually steered back to the starting point and the validating the route comprises autonomously traveling the route in a same direction specified in the driving the autonomously driving agricultural robot by manually controlling the autonomously driving agricultural robot.
22. The method of claim 20, wherein the validating the route comprises autonomously traveling the route in a direction opposite to a direction in which the route was specified in the driving the autonomously driving agricultural robot by manually controlling the autonomously driving agricultural robot.
23. The method of claim 20, wherein the route is confirmed section by section.
24. The method of claim 20, wherein the autonomously driving agricultural robot is controlled manually via a remote control when traveling the route.
25. The method of claim 24, wherein the remote control is coupled wirelessly to a control device of the autonomously driving agricultural robot.
26. The method of claim 24, wherein the route or a route section is considered confirmed when an action is actively performed by a user during the validating the route.
27. The method of claim 26, wherein the action is an operation of a control element on the remote control.
28. The method of claim 20, wherein the route or a route section is considered to be validated if there is no intervention by a user correcting or stopping movement of the autonomously driving agricultural robot during the validating the route.
29. The method of claim 20, wherein a distance to a surrounding object is measured via at least one distance sensor during the manually controlled or automatic travel of the route.
30. The method of claim 29, wherein the distance to the surrounding object is compared with a safety distance and an acoustic or optical warning is emitted when the distance to the surrounding object falls below the safety distance.
31. The method of claim 20, wherein at least one marked point is defined by a user during the driving the autonomously driving agricultural robot by manually controlling the autonomously driving agricultural robot while the autonomously driving agricultural robot is stationary.
32. The method of claim 20, wherein one or more functions to be performed are assigned to the autonomously driving agricultural robot at the marked point.
33. The method of claim 31, wherein the at least one marked point includes the starting or end point of at least one route.
34. The method of claim 20, wherein the route is manually modified in whole or in sections by a user before the validating the route by changing the coordinates of the waypoints.
35. The method of claim 20, wherein the route is automatically modified in whole or in sections before the validating the route by changing the coordinates of the waypoints to smooth a course of the route in whole or in sections.
36. The method of claim 35, wherein route elements comprising a plurality of neighboring waypoints are transformed into smoothed route elements by mathematical functions.
37. The method of claim 36, wherein restrictive boundary conditions for modifying the coordinates of the waypoints are predetermined.
38. A system comprising: an autonomously driving agricultural robot comprising a control device to autonomously drive the autonomously driving agricultural robot; and a remote control coupled to the control device and configured to manually control the autonomously driving agricultural robot, wherein the control device is configured to drive the autonomously driving agricultural robot along a route lying within a region of an environmental map from a starting point to an end point by manually controlling the autonomously driving agricultural robot; localize the autonomously driving agricultural robot while the autonomously driving agricultural robot travels along the route using at least one sensor arranged on the autonomously driving agricultural robot; record coordinates of waypoints along the route based on the localizing; validate the route by autonomously driving the autonomously driving agricultural robot along the route taking into account the recorded coordinates of the waypoints, wherein the route is confirmed manually; and mark the manually confirmed route as autonomously drivable.
Description
BRIEF DESCRIPTION OF THE DRAWING FIGURES
[0021] The invention is explained in more detail below by means of an exemplary embodiment with the aid of figures.
DETAILED DESCRIPTION
[0028] In this example, the driving robot 1 is a so-called feeding robot, which is set up to pick up feed from a dispensing point, mix it automatically, and unload it at one or more feeding points. The driving robot 1 is therefore also referred to below as a feeding robot or simply robot.
[0029] Identical reference signs indicate elements that are identical or have the same effect in all figures. For reasons of clarity, not every element in every figure is provided with a reference sign. In the description, the terms right and left refer to the respective representation of the figure. The terms top and bottom, on the other hand, refer to the natural orientation of the driving robot. The terms front and rear refer to a forward direction of travel 10 of the driving robot 1. The forward direction of travel 10, which is indicated by a directional arrow in
[0030] The driving robot 1 has two main components, a chassis 100 and a body 110.
[0031] The chassis 100 is preferably universally applicable and can be used together with various functional units if necessary. In
[0032] The body 110 essentially determines the functionality of the driving robot and thus its intended use within the barn or yard area.
[0033] In the case of the driving robot 1 equipped as a feeding robot in the present case, the body 110 has a feed container 111 as a key component. The feed to be distributed is taken into the feed container 111 and can be mixed during filling, in a charging station 28 (see
[0034] The body 110 further comprises a cladding formed using a plurality of cladding elements, typically cladding panels 114. The cladding panels 114 can preferably be removed separately in order to gain access to underlying components and their maintenance or replacement. Elements accessible from the outside are integrated into the cladding, for example charging contacts 115 (see
[0035] The driving robot 1 is also equipped with a navigation system that enables navigation in the barn or yard area without fixed infrastructure elements such as rails or guide cables. For this purpose, the driving robot is equipped with a plurality of sensors that are either integrated into the cladding or protrude from the cladding.
[0037] Other sensors (not visible here) are mechanical sensors that detect the application of force to one or both bumpers 104. For this purpose, the respective bumper 104 can be movably mounted, for example, so that one of possibly several sensors is actuated when moving against a spring force. In an alternative design, the bumper 104 can be formed in an outer area from an elastically deformable material, in particular a foam material, into which a sensor is incorporated, which detects a deformation preferably along the entire edge of the bumper 104. In this way, a collision with an obstacle is advantageously damped and detected at the same time. In one design, for example, two spaced-apart electrodes can be embedded in the elastic material along the edge of the bumper 104, between which a capacitance is detected. A change in capacitance indicates a deformation of the material. In a further design, a tension chain can be incorporated into the elastic material, which is coupled to a switch or sensor. A deformation of the elastic material leads to a tension on the tension chain, which is detected by the switch or sensor.
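The capacitive variant of the bumper sensor described above can be illustrated by a minimal sketch in Python. The baseline capacitance, the tolerance value, and the unit (picofarads) are assumptions chosen for illustration only; the application does not specify concrete values:

```python
def bumper_deformed(capacitance_pf: float,
                    baseline_pf: float,
                    tolerance_pf: float = 0.5) -> bool:
    """Return True if the capacitance measured between the two
    electrodes embedded in the elastic bumper material deviates from
    the undeformed baseline by more than the tolerance, which would
    indicate a deformation and thus a probable collision."""
    return abs(capacitance_pf - baseline_pf) > tolerance_pf
```

In a real controller, the baseline would be calibrated at start-up and the check evaluated cyclically by the control device.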
[0038] The driving robot 1 has at least one control device that controls the actuators of the driving robot, including driving motors, and reads and evaluates signals from the sensors. The control device also performs navigation tasks and maintains a stored map of the environment, which is used, among other things, to localize the driving robot in its environment. The map is preferably created by the driving robot 1 itself in a so-called SLAM (Simultaneous Localization and Mapping) process by evaluating the sensor data recorded during various journeys. In addition, the control device is equipped or coupled with communication interfaces, in particular for wireless communication. The communication interfaces are used, for example, to connect to a higher-level operations management system that coordinates the use of the driving robot 1. The communication interfaces can also be used to control the driving robot 1 using a remote control.
[0040] In the example shown here, the farm 2 comprises two barns, specifically a first barn 20 and a second barn 21, for keeping animals, for example cows, and a surrounding yard area. In the example, the two barns 20, 21 differ in size, with the second barn 21 being an outbuilding to the larger first barn 20, which in this sense is to be regarded as the main barn. The number of two barns 20, 21 on the farm 2 is purely exemplary, as are their size and arrangement.
[0041] Both barns 20, 21 have walls 22 on the outside and a plurality of supporting pillars 23 on the inside. This is also purely exemplary. The barns 20, 21 could also be provided with walls on the inside. Furthermore, a gate 24 is provided for each of the barns 20, 21.
[0042] Several animal areas 25 are provided within each of the two barns 20, 21, i.e., areas in which animals, e.g., the aforementioned cows, are kept. Each animal area 25 is assigned a so-called feed fence 26, in front of which feed is placed, which the animals can pick up from the animal area 25.
[0043] Outside the barns 20, 21, i.e., in the yard area of the farm 2, several feed bunkers 27 are set up in which different types of feed are kept for the animals. By way of example, three larger feed bunkers 27 are shown, which are used to hold silage feed, for example. A smaller feed bunker 27, shown round in the schematic
[0044] The map of farm 2 shown in
[0045] In order to be able to use the driving robot 1 on the farm 2, routes, i.e., possible paths of the driving robot 1, are defined in accordance with the invention in a learning method. During operation of the driving robot 1, a navigation method can then compile a route from the acquired routes, depending on the task to be completed, along which the driving robot moves in order to be able to fulfill its tasks.
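The compilation of a route from the acquired route elements can be sketched as a shortest-path search over the graph of marked points. The use of Dijkstra's algorithm and the edge representation below are illustrative assumptions; the application does not prescribe a particular navigation algorithm:

```python
import heapq

def shortest_route(edges, start, goal):
    """Compile a route from learned route elements using Dijkstra's
    algorithm. `edges` maps a POI name to a list of (neighbor, length)
    pairs, one pair per validated route element; returns the ordered
    list of POIs to traverse, or None if the goal is unreachable."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, length in edges.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + length, neighbor, path + [neighbor]))
    return None
```

Only route elements marked as validated would be entered into `edges`, so the navigation method can never compile a route over an unconfirmed section.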
[0046] In the following, a method according to the invention is explained using the farm 2 shown in
[0047] In the farm 2 shown, the charging station 28 is mounted adjacent to the feed bunkers 27. The charging station 28 is approached by the driving robot 1 in order to charge the batteries for its energy supply. The charging station 28 comprises contacts which, when the driving robot 1 is correctly positioned, make contact with the charging contacts 115 (see
[0048] In the vicinity of the charging station 28, an alternative positioning method can be used in which markings on the charging station are detected optically. Such markings are, for example, the reflectors 29 shown in
[0050] In a first step, which is carried out in preparation for the learning method according to the application, the driving robot 1 is manually steered into the vicinity of the charging station 28 so that it is at a distance of about 1-2 m from the charging station 28 and is aligned with the charging station 28 in its direction of travel 10. The distance and position are selected so that it is possible to navigate into the charging station 28 using the reflectors 29.
[0051] To control the driving robot 1, a remote control (not shown here) is used, which preferably, but not necessarily, communicates wirelessly with the driving robot 1. An optical or radio-based communication link can be used for transmission. In particular, a universally usable mobile end device of a user, e.g., a tablet computer, can be used as a remote control. The communication link can be established directly between the tablet computer and a receiving device of the driving robot 1 or also via a shared communication network, for example a WLAN (Wireless Local Area Network) network, which is available on the farm 2.
[0052] After the driving robot 1 has been brought into the position shown in
[0053] In the exemplary embodiment shown, the position taken in this way represents a first marked point, the coordinates of which are stored in the control unit of the driving robot 1. The coordinates refer to the environmental map created by the driving robot 1. This first marked point can be regarded as a kind of fixed point for a route network 3 to be set up, as it is fixed by design due to the positioning of the charging station 28. It is therefore also referred to below as anchor point 30. In an alternative design of the method, it can also be provided that the first marked point, the anchor point 30, is set to the position of the driving robot 1 in the charging station 28.
[0054] At the same time, the anchor point 30 represents a starting point for the first route of the route network 3 to be learned. To define the route, further marked points on the farm 2 are approached manually by the user using the remote control, e.g., a marked point 30a, which is located in front of a first of the feed bunkers 27 and represents the position at which the driving robot 1 can pick up feed from this first feed bunker 27.
[0055] The position and orientation of the driving robot 1 in front of the first feed bunker 27 is noted in the map maintained by the driving robot 1 as a marked point 30a, wherein this marked point is assigned information about the function, in this case the picking up of feed from the first feed bunker 27. Marked points, which also include the anchor point 30, are also abbreviated below as POI (Point Of Interest).
[0056] The other feed bunkers 27 are then approached in the same way, corresponding POIs 30b-d are defined in the map of the driving robot 1, and the functions that the driving robot 1 can perform at these points are stored. One possible sequence of functions is, for example:
[0057] Stop
[0058] Start mixing device
[0059] Request filling of x kg of feed from the feed bunker with the number y
[0060] Wait until the requested amount of feed has been filled
[0061] Continue driving.
[0062] Here, x and y represent parameter values to be used, which are usually specified by the higher-level farm management system. In addition to a list of functions to be executed, it may also be provided to define more complex sequences, e.g., to determine how to deal with errors, e.g., if no or too little feed is dispensed from the feed bunker.
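The storage of marked points with assigned function sequences can be sketched as follows. The data structure, the action names, and the `feed_pickup_sequence` helper are hypothetical, chosen only to mirror the sequence of paragraphs [0057]-[0061] with the parameters x and y supplied by the farm management system:

```python
from dataclasses import dataclass, field

@dataclass
class MarkedPoint:
    """A POI on the environmental map with pose and an assigned
    sequence of (action, parameters) pairs."""
    name: str
    x: float
    y: float
    heading: float
    actions: list = field(default_factory=list)

def feed_pickup_sequence(amount_kg: float, bunker_no: int) -> list:
    """Build the example function sequence for picking up `amount_kg`
    of feed (parameter x) from bunker number `bunker_no` (parameter y)."""
    return [
        ("stop", {}),
        ("start_mixer", {}),
        ("request_fill", {"kg": amount_kg, "bunker": bunker_no}),
        ("wait_for_fill", {"kg": amount_kg}),
        ("continue", {}),
    ]
```

A more complex sequence definition, e.g., with error branches for a bunker that dispenses too little feed, could replace the flat list with a small state machine.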
[0063] This results in the situation shown in
[0064] While driving along the route from the starting point, the anchor point 30, to an end point, the POI 30d, not only the POIs 30a-d are recorded on the map, but also a plurality of waypoints 31, of which only two are marked between the POIs 30a and 30b in
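The recording of waypoints 31 from the stream of localization results can be sketched as a simple down-sampling of the robot's pose. The minimum-spacing criterion and its value are assumptions for illustration; the application does not specify how densely waypoints are stored:

```python
import math

def record_waypoints(poses, min_spacing=0.5):
    """Down-sample the (x, y) localization results produced while the
    robot is driven manually: a pose is stored as a waypoint only once
    the robot has moved at least `min_spacing` metres away from the
    last stored waypoint."""
    waypoints = []
    for x, y in poses:
        if not waypoints or math.hypot(x - waypoints[-1][0],
                                       y - waypoints[-1][1]) >= min_spacing:
            waypoints.append((x, y))
    return waypoints
```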
[0065] During manual driving and recording of the route, the remote control is only used to control the driving robot 1 and, when the driving robot 1 is stationary, to mark a marked point and, optionally, to define actions. The latter can also be done retrospectively. The user's concentration can thus be fully focused on the driving robot 1 and distances to obstacles, etc.
[0066] Before a recorded route specified by waypoints 31 or POIs 30, 30a-d can be traveled autonomously, this route is validated according to the invention by an already autonomously controlled travel of the route, which, however, takes place under the supervision of the user. Only when this validation step has been carried out for a specific route or route section is this route or route section marked as being able to be driven autonomously and can be used as part of an automatic navigation process.
[0067] The previously specified route between anchor point 30 and POI 30d is intended to be traveled in both directions. In the case of such a route, it may be provided that validation must also take place in both directions of travel. Alternatively, it can also be provided that validation in one direction is sufficient, wherein this direction does not necessarily have to be the direction in which this route was specified.
[0068] It may also be provided that the route to be validated or sections of the route to be validated are revised beforehand. Manual control when learning the route can result in unplanned swerves or similar that lead to non-optimal routing. A non-optimal route results in a longer distance being covered and leads to actually avoidable cornering, acceleration and braking actions, which cost energy and put unnecessary strain on the material of the driving robot 1.
[0069] To correct the route, in an advantageous design of the method, the recorded route of the previously created route network 3 can be shown on a display, in particular a display of the remote control of the driving robot 1, before validation, wherein the user has the option of correcting route elements 32 or waypoints 31 and sections of route elements 32 between POIs 30, 30a-d manually or automatically. Manual correction of coordinates of POIs 30a-d can also be provided.
[0070] Manual correction can, for example, involve moving a waypoint 31 or a POI 30a-d. It may be provided that this shift is only possible by a certain distance in order to prevent the risk of moving waypoints into the area of obstacles. In the case of POIs 30a-d, the maximum possible displacement of the coordinates can be more strictly limited so as not to run the risk of the assigned function (e.g., picking up feed) being impaired after a displacement. Furthermore, it may be provided that distances between the waypoints 31 and boundaries on the map are determined and that certain safety distances between waypoints 31 and boundary objects, e.g., the walls 22 or the support pillars 23, must be maintained when moving.
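The manual shifting of a waypoint with a limited displacement and enforced safety distances can be sketched as follows. The point-shaped obstacle model and all parameter names are illustrative simplifications; real boundary objects such as walls would be line segments on the map:

```python
import math

def move_waypoint(wp, target, max_shift, obstacles, safety_dist):
    """Shift waypoint `wp` towards `target`, limited to `max_shift`
    metres, and reject the move entirely if it would bring the
    waypoint closer than `safety_dist` to any boundary object."""
    dx, dy = target[0] - wp[0], target[1] - wp[1]
    dist = math.hypot(dx, dy)
    if dist > max_shift:                       # clamp the displacement
        scale = max_shift / dist
        dx, dy = dx * scale, dy * scale
    new_wp = (wp[0] + dx, wp[1] + dy)
    for ox, oy in obstacles:                   # enforce safety distance
        if math.hypot(new_wp[0] - ox, new_wp[1] - oy) < safety_dist:
            return wp                          # keep the original point
    return new_wp
```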
[0071] In addition to manual correction of route elements 32 or parts of route elements 32, automated correction options may also be available.
[0072] One possibility for automated correction is provided by a smoothing function. Here, route elements 32 are smoothed by filter algorithms, e.g., a low-pass filter, in order to reduce possible snaking or swerving.
[0073] The route elements 32 are thus replaced by smoothed curves, wherein certain boundary conditions, for example a continuous, non-kinking connection to following or preceding route elements 32, e.g., by maintaining tangent gradients in the POIs 30, 30a-d, are taken into account. An upper limit for a maximum displacement of waypoints 31 resulting from the automated smoothing can also be defined.
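The smoothing of a route element by a filter with a bounded displacement can be sketched with a moving average as a simple low-pass filter. The window size and the displacement limit are illustrative values, and fixing only the first and last points is a simplification of the tangent-continuity conditions described above:

```python
import math

def smooth_route(waypoints, window=5, max_shift=0.3):
    """Low-pass filter a route element: each interior waypoint is
    replaced by the mean of its neighbours within `window`, but never
    moved further than `max_shift` metres from its recorded position.
    The first and last waypoints (the POIs) stay fixed."""
    smoothed = [waypoints[0]]
    half = window // 2
    for i in range(1, len(waypoints) - 1):
        lo, hi = max(0, i - half), min(len(waypoints), i + half + 1)
        mx = sum(p[0] for p in waypoints[lo:hi]) / (hi - lo)
        my = sum(p[1] for p in waypoints[lo:hi]) / (hi - lo)
        dx, dy = mx - waypoints[i][0], my - waypoints[i][1]
        shift = math.hypot(dx, dy)
        if shift > max_shift:                  # cap the displacement
            dx, dy = dx * max_shift / shift, dy * max_shift / shift
        smoothed.append((waypoints[i][0] + dx, waypoints[i][1] + dy))
    smoothed.append(waypoints[-1])
    return smoothed
```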
[0074] Another option for automated correction, which also results in curve smoothing, is to completely recalculate the position of the waypoints 31 of route elements 32 using suitable parametrically modeled curves. For example, route elements 32 or parts of route elements 32 can be replaced by so-called splines or Bézier curves. Splines and Bézier curves are mathematically determined curve elements that are composed of polynomial functions. As with smoothing using a filter function, certain boundary conditions can also be provided for smoothing using fully calculated waypoints, for example a continuous, non-kinking connection to following or preceding route elements 32. In addition, it may also be provided here that minimum distances to boundary objects, e.g., the walls 22 or the supporting pillars 23, are maintained in order to prevent collisions and to correctly avoid the objects. In addition, it may be provided that certain waypoints 31 are regarded as unchangeable in order to be able to manually influence the route guidance.
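The replacement of a route element by a parametric curve can be sketched with a cubic Bézier evaluation. The choice of the control points to preserve entry and exit tangents is left to the caller, and the sampling density `n` is an illustrative parameter:

```python
def cubic_bezier(p0, p1, p2, p3, n=10):
    """Sample a cubic Bézier curve that replaces a route element.
    p0 and p3 are the fixed end waypoints (e.g., POIs); p1 and p2 are
    control points that determine the tangents at the ends. Returns
    n + 1 points along the curve as the new waypoints."""
    points = []
    for i in range(n + 1):
        t = i / n
        u = 1 - t
        x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
        y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
        points.append((x, y))
    return points
```

Because the curve starts at p0 tangent to p0-p1 and ends at p3 tangent to p2-p3, matching these tangents across neighbouring route elements yields the continuous, non-kinking connection described above.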
[0075] As a result, the route elements 32 are converted into smoothed route elements 33 by the manual or one of the automatic corrections described, as shown in
[0076] In the actual validation step, the driving robot 1 then travels along the recorded and possibly smoothed route, either back towards the starting point or again in the direction of recording, wherein the user must confirm that route elements 32 have been traveled. For this purpose, it may be provided, for example, that the automatic travel is carried out for as long as a button on the remote control is held down. Each section that is traveled again is marked as successfully validated. If there is a problem during the run, the user can release the button, whereupon the driving robot 1 stops immediately and switches from the step of validating the route back to the step of recording by manual control. In this way, a route or at least a section of it can then be specified again in corrected form.
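The dead-man-button style of validation described above can be sketched as follows. The callback `button_held` stands in for the real remote-control query, and validating at the granularity of whole sections is an illustrative simplification:

```python
def validate_route(sections, button_held):
    """Drive the route section by section; a section counts as
    validated only while the confirmation button on the remote
    control is held. Releasing the button stops the robot and returns
    the section at which recording by manual control must resume.
    Returns (validated_sections, resume_section_or_None)."""
    validated = []
    for section in sections:
        if not button_held(section):
            return validated, section      # stop: re-record from here
        validated.append(section)
    return validated, None                 # whole route validated
```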
[0077] If the route is fully validated by driving over it and confirming, it is then available for autonomous travel.
[0079] A navigation system can use the route element 32 between the anchor point 30 and the marked point 30e for both navigation tasks, i.e., a journey to the first barn 20 or to the second barn 21. A further branching is possible at POI 30f, where a total of four route elements 32 converge.
[0080] A first possibility starting from the marked point 30f runs via the POI 30g along the upper animal area 25 of the first barn 20 in
[0081] From the marked point 30f, an alternatively movable route element 32 leads to a marked point 30j, at which feeding takes place in the lower animal area 25 in
[0082] From the end points of the feed ejection, the POIs 30h and 30k respectively, route elements 32 lead to a marked point 30i, at which the return paths join after feeding has been completed. From the marked point 30i, a further route element 32 leads back to the marked point 30f, wherein the intermediate route element 32 is characterized, for example, by a higher travel speed, which is possible with an emptied feed container 111 of the driving robot 1.
[0083] In a similar way to the first barn 20, route elements 32 lead from the marked point 30e into the second barn 21. Here, a branching point is defined as the marked point 30l, from which route elements 32 lead via the marked points 30m and 30n along the upper feed fence 26 in
[0084] The recorded route elements 32 shown in
[0085] Subsequently, a validation run is performed for the recorded route elements 32 and, optionally, the smoothed route elements 33, during which the driving robot 1 travels along the route elements 32, 33 and then identifies them as autonomously usable if they are confirmed by the user. Straightening the route elements 32 is particularly useful along the feed fence 26 in order to ensure that feed is ejected or pushed at a defined and constant distance from the feed fence 26.
[0086] As described above, no additional information, for example measured distances to obstacles, is displayed on the user's remote control during the learning of the routes, i.e., during the specification of the route elements 32 by manual movement. However, both during learning and during validation, a breach of the safety distances to detected obstacles can be signaled by means of an acoustic and/or visual warning signal on the driving robot 1 itself. This prevents possible collisions without distracting the user's attention from the position and movement of the driving robot 1.
[0087] It may be provided that a confirmation of a route element 32 or a route section comprising several route elements 32 or the entire route is actively performed by an action of the user. Alternatively, it may be provided that a confirmation is assumed if the user does not intervene during the validation run by changing the route traveled or stopping the driving robot 1.
[0088] Although the invention has been illustrated and described in detail by way of preferred embodiments, the invention is not limited by the examples disclosed, and other variations can be derived from these by the person skilled in the art without leaving the scope of the invention. It is therefore clear that there is a plurality of possible variations. It is also clear that the embodiments stated by way of example are really only examples that are not to be seen as limiting the scope, application possibilities, or configuration of the invention in any way. In fact, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in a concrete manner. With knowledge of the disclosed inventive concept, the person skilled in the art is able to undertake various changes, for example with regard to the functioning or arrangement of individual elements stated in an exemplary embodiment, without leaving the scope of the invention, which is defined by the claims and their legal equivalents, such as further explanations in the description.
LIST OF REFERENCE SIGNS
[0089] 1 Driving robot
[0090] 10 Direction of travel
[0091] 100 Chassis
[0092] 101 Drive wheel
[0093] 102 Swivel wheel
[0094] 103 Apron (feed pusher)
[0095] 104 Bumper
[0096] 110 Body
[0097] 111 Feed container
[0098] 112 Mixing device
[0099] 113 Feed conveyor
[0100] 114 Cladding panels
[0101] 115 Charging contacts
[0102] 116 Operating and/or display elements
[0103] 117 Lidar sensor
[0104] 118 Ultrasonic sensor
[0105] 2 Farm
[0106] 20 First barn
[0107] 21 Second barn
[0108] 22 Wall
[0109] 23 Support pillar
[0110] 24 Gate
[0111] 25 Animal area
[0112] 26 Feed fence
[0113] 27 Feed bunker
[0114] 28 Charging station
[0115] 29 Reflectors
[0116] 3 Route network
[0117] 30, 30a-q Marked point (POI)
[0118] 31 Waypoint
[0119] 32 Route element
[0120] 33 Smoothed route element