METHOD FOR AUTOMATED WEED CONTROL OF AGRICULTURAL LAND AND ASSOCIATED STAND-ALONE SYSTEM
20230368312 · 2023-11-16
Inventors
CPC classification
A01M21/00
HUMAN NECESSITIES
A01B76/00
HUMAN NECESSITIES
A01B69/001
HUMAN NECESSITIES
International classification
A01M21/00
HUMAN NECESSITIES
Abstract
The invention relates to a method for maintaining agricultural land (110), comprising: preparing, by means of a computing device (160) comprising a processor (161) and a computer memory (162), a map of the agricultural land from previously acquired images of the agricultural land, a plurality of growing areas (120), each comprising a plurality of parallel growing rows (130), being defined in the map prepared; assigning a list of growing areas to be treated to a stand-alone vehicle (140) depending on an initial position and a final position of the stand-alone vehicle and the position of the growing areas to be treated; determining a travel plan for the stand-alone vehicle from the list of growing areas assigned to the vehicle with due regard to the orientation of the rows in the growing areas to be treated; and moving the stand-alone vehicle according to the previously established travel plan.
Claims
1-15. (canceled)
16-30. (canceled)
31. A method for maintaining an agricultural land, comprising: preparing, by a computing device comprising a processor and a computer memory, a map of the agricultural land, the map delimiting a plurality of growing areas, each growing area of the plurality comprising a plurality of parallel growing rows; assigning to a stand-alone vehicle a list of growing areas to be treated according to an initial position and a final position of the stand-alone vehicle and a position of the growing areas to be treated; determining a travel plan for the stand-alone vehicle from the list of growing areas to be treated, with reference to an orientation of the parallel growing rows of the growing areas to be treated; moving the stand-alone vehicle according to the travel plan; and the method further comprising a step of generating, by the computing device, a diagnosis of an agronomic pressure inside of the plurality of growing areas of the agricultural land, the diagnosis of the agronomic pressure depending on a nature of a weed present in a growing area of the plurality of growing areas, as well as a number of weeds present and a development state of the weed present, the list of growing areas to be treated by the stand-alone vehicle being ordered according to the agronomic pressure associated with each growing area to be treated.
32. The method of claim 31 wherein preparing the map of the agricultural land further comprises using a geolocation data of an agricultural tool having previously covered the agricultural land.
33. The method of claim 31 wherein preparing the map of the agricultural land further comprises using previously acquired images of the agricultural land.
34. The method of claim 31, wherein the step of generating the diagnosis of the agronomic pressure comprises a sub-step of automatic recognition of the nature of the weed present by an image of the weed present and an automatic learning method trained on a database comprising identified weed images.
35. The method of claim 31, wherein the diagnosis of the agronomic pressure in the growing area is further based on a comparison between a growth of the weed present and the growth of a plant cultivated in the growing area.
36. The method of claim 31, wherein the step of generating the diagnosis of the agronomic pressure comprises a sub-step of predicting an evolution of a developmental state of the weed present at a given time point as a function of at least one among a measured meteorological data and a predicted meteorological data.
37. The method of claim 31, implementing a plurality of stand-alone vehicles, wherein the list of growing areas to be treated and assigned to the stand-alone vehicle takes into account an order of the growing areas to be treated according to the agronomic pressure associated with each growing area and a position of each stand-alone vehicle at a given time.
38. The method of claim 31, wherein the list of growing areas to be treated and assigned to the stand-alone vehicle takes into account an autonomy of the stand-alone vehicle at a given time with respect to a charging station adapted to charging the stand-alone vehicle.
39. The method of claim 31, further comprising recognizing a direction of the parallel growing rows in the growing area and determining a spacing between two parallel growing rows from at least one acquired image covering the growing area.
40. The method of claim 39, wherein recognizing a direction of the parallel growing rows in the growing area and determining a spacing between two parallel growing rows comprises using a record of a plurality of positions of a seeding tool used during a preliminary seeding phase, the seeding tool comprising a geolocation device.
41. The method of claim 39, further comprising automatically defining the growing area by analyzing the direction of the parallel growing rows.
42. The method of claim 40, further comprising automatically defining the growing area by analyzing the direction of the parallel growing rows.
43. The method of claim 31, comprising a prior step of acquiring an image of the agricultural land by at least one stand-alone image acquisition device.
44. The method of claim 43, wherein the at least one stand-alone image acquisition device is an aerodyne adapted to flying over the agricultural land at a maximum altitude of 100 meters with respect to a surface of the agricultural land for preparing the map of the agricultural land and adapted to flying over the agricultural land at a maximum altitude of 3 meters relative to the surface of the agricultural land for making the diagnosis of the agronomic pressure.
45. The method of claim 31, further comprising determining at least one area on the map of the agricultural land where a movement of the stand-alone vehicle is prohibited.
46. The method of claim 31, further comprising associating a culture type to at least one growing area of the agricultural land.
47. A system for maintaining an agricultural land, comprising the computing device including the computer memory adapted for storing instructions for the implementation of the method of claim 31, and at least one stand-alone vehicle equipped with a mechanical tool configured to perform an agricultural land maintenance operation.
48. The system of claim 47, also comprising at least one stand-alone image acquisition device.
49. The system of claim 47, further comprising a charging station adapted for charging the stand-alone vehicle.
50. The system of claim 48, further comprising a charging station adapted for charging the at least one stand-alone image acquisition device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0064] Other advantages, aims and particular features of the present invention will appear from the following nonlimiting description of at least one particular embodiment of the devices and methods objects of the present invention, with reference to the appended drawings, wherein:
DETAILED DESCRIPTION OF THE INVENTION
[0073] This description is given without limitation, each feature of an embodiment being able to be advantageously combined with any other feature of any other embodiment.
[0074] It should be noted that the figures are not to scale.
Example of a Particular Embodiment
[0076] In the present nonlimiting example of the invention, the agricultural land 110 comprises three contiguous growing areas 120, forming a parcel. Nonetheless, the invention can also be applied to a discontinuous agricultural land, having for example parcels remote from each other, each parcel comprising at least one growing area.
[0077] Furthermore, the plant, such as barley, wheat, maize or rapeseed, selected for cultivation in the three growing areas 120 is herein identical but could very well be distinct.
[0078] The maintenance system 100 herein comprises at least one stand-alone vehicle 140 configured to perform maintenance operations on the agricultural land 110, at least one drone 150, which is an aerodyne, configured to acquire at least one aerial-type image of the agricultural land 110 by flying over the agricultural land 110, at least one station 145 for charging the stand-alone vehicle, at least one station 155 for charging the drone 150 and a computing device 160, in this case a server, configured to manage the maintenance system 100.
[0079] It should be pointed out that the drone 150 is a stand-alone image acquisition device. Alternatively, or in addition to the drone 150, another motor-driven stand-alone image acquisition device, such as a second drone or a terrestrial mobile robot 151 acquiring partial images of the land during its movements, may be included in the maintenance system 100.
[0080] The computing device 160, comprising a processor 161 and a computer memory 162, processes instructions of a maintenance method 200 illustrated in
[0081] The maintenance method 200 comprises a first step 210 of acquiring images by means of the drone 150 flying over the agricultural land 110 at a maximum altitude on the order of one hundred meters with respect to the surface of the agricultural land located vertically below the drone 150. Geographic coordinates, obtained for example by means of a geolocation device included in the drone 150, as well as an orientation of the image with respect to a cardinal point, are herein advantageously associated with each acquired image. The images acquired by the drone 150 are also generally time-stamped.
[0082] An example of an image 300 of the agricultural land 110 acquired by the drone 150 is shown in
[0083] Based on the acquired images, the computing device 160 prepares, during a second step 220 of the method 200, a map of the agricultural land 110 in which the different growing areas 120 as well as the growing rows 130 are automatically defined.
[0084] To this end, an image of the agricultural land 110 is reconstituted beforehand from a plurality of images acquired by means of the drone 150 during a first sub-step 221 of step 220. The reconstruction is performed while taking into account the geographical coordinates and the orientation associated with each image. The position of each image can be adjusted by superimposing at least two elements common to the images, for example by computing a sharpness criterion over the superimposed region.
[0085] Afterwards, the growing rows 130 are recognized in the reconstituted image of the agricultural land 110 during a second sub-step 222 of step 220 by using, for example, an image segmentation or boundary detection method.
[0086] Alternatively or in complement, the recognition of growing rows is based on a record of the positions of a seeding tool used during a preliminary phase of seeding the agricultural land 110. To this end, the seeding tool comprises a geolocation device, for example of the GPS type (acronym of "Global Positioning System"), providing a position at regular intervals. The positions recorded over a period of time form a trace on a map. Thus, by analyzing this trace, it is possible to identify different growing rows 130 by identifying portions of the trace that are substantially straight over a predetermined length, for example equal to at least five or ten meters. An example of a trace 310 of the seeding tool covering a parcel 320 of the agricultural land 110 is illustrated in
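The trace analysis described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the assumption of metric (x, y) coordinates sampled at regular intervals, and the tolerance values are all invented for the example.

```python
import math

def straight_segments(trace, min_length=10.0, angle_tol_deg=5.0):
    """Split a geolocation trace into nearly straight runs.

    trace: list of (x, y) positions in metres, sampled at regular intervals.
    Returns the runs, as (start, end) index pairs, whose heading stays
    within angle_tol_deg of the run's initial heading and whose length
    reaches min_length metres; each run is a candidate growing row.
    """
    segments = []
    start = 0
    ref_heading = None
    for i in range(1, len(trace)):
        dx = trace[i][0] - trace[i - 1][0]
        dy = trace[i][1] - trace[i - 1][1]
        heading = math.degrees(math.atan2(dy, dx))
        if ref_heading is None:
            ref_heading = heading
        # Heading deviation, wrapped into [-180, 180)
        dev = (heading - ref_heading + 180.0) % 360.0 - 180.0
        if abs(dev) > angle_tol_deg:
            # Run broken by a turn: keep it if long enough, then restart.
            if math.dist(trace[start], trace[i - 1]) >= min_length:
                segments.append((start, i - 1))
            start = i - 1
            ref_heading = None
    if math.dist(trace[start], trace[-1]) >= min_length:
        segments.append((start, len(trace) - 1))
    return segments
```

For instance, a trace that runs fifteen metres east and then fifteen metres north yields two runs, one per straight portion, while a run shorter than the predetermined length is discarded.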
[0087] When the position of the rows 130 is detected, their direction is analyzed in order to automatically define the growing areas 120, each defined by substantially parallel growing rows, in the reconstituted image of the agricultural land 110 during a third sub-step 223 of step 220. It should be pointed out that two growing rows are generally considered to be in the same growing area when they form an angle smaller than five degrees. For example, three growing areas 120 are determined on the image 300 of
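The five-degree grouping rule above can be sketched in a few lines. The function name and the greedy grouping strategy are assumptions made for the illustration; row directions are compared modulo 180 degrees, since a row has no preferred travel sense.

```python
def group_rows_by_direction(row_directions_deg, tol_deg=5.0):
    """Assign each growing row to a growing area according to its direction.

    Two rows fall in the same area when their directions differ by less
    than tol_deg (modulo 180 degrees). Returns one area label per row.
    """
    def direction_gap(a, b):
        d = abs(a - b) % 180.0
        return min(d, 180.0 - d)

    area_directions = []  # representative direction of each area found so far
    labels = []
    for direction in row_directions_deg:
        for area_id, reference in enumerate(area_directions):
            if direction_gap(direction, reference) < tol_deg:
                labels.append(area_id)
                break
        else:
            area_directions.append(direction)
            labels.append(len(area_directions) - 1)
    return labels
```

Note that a row at 178 degrees groups with a row at 0 degrees, since the two directions differ by only 2 degrees modulo 180.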
[0088] It should be pointed out that an operator can possibly intervene during the method to make a correction or supplement to a boundary of a growing area 120 defined on the map.
[0089] Afterwards, an average spacing between two rows 130 may be determined for each growing area 120 during a fourth sub-step 224 of step 220. The average spacing is calculated based on lines each representing a row.
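One way to compute such an average spacing, assuming each detected row line is reduced to a representative point and all rows of the area share a common direction (simplifications made for this sketch, with invented names):

```python
import math

def average_row_spacing(row_points, direction_deg):
    """Average spacing between parallel growing rows.

    row_points: one representative (x, y) point per detected row, in metres.
    direction_deg: the shared direction of the rows. Each row is projected
    onto the axis perpendicular to that direction; the mean gap between
    consecutive projections is the average spacing.
    """
    theta = math.radians(direction_deg)
    nx, ny = -math.sin(theta), math.cos(theta)  # unit normal to the rows
    offsets = sorted(x * nx + y * ny for x, y in row_points)
    gaps = [b - a for a, b in zip(offsets, offsets[1:])]
    return sum(gaps) / len(gaps)
```

Three east-west rows at y = 0 m, 0.75 m and 1.5 m, for example, give an average spacing of 0.75 m regardless of where along each row the representative point sits.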
[0090] An association of a type of culture with at least one growing area 120 may also be performed during a possible fifth sub-step 225. This association may be automatically performed by recognition of the type of culture on an image or manually by an operator.
[0091] During a third step 230 of the method 200, an automatic analysis of the map on which an initial position and a final position of the stand-alone vehicle 140 are determined is performed by the computing device 160 to assign to the stand-alone vehicle 140 a list of growing areas to be treated, while taking into account its autonomy to move between the initial position and the final position. The initial position corresponds to a position at a given time point of the stand-alone vehicle 140, this position may be instantaneous or predicted. For example, the final position corresponds to the position of the charging station 145 of the stand-alone vehicle 140.
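A possible sketch of this autonomy-aware assignment follows, using straight-line distances as a crude stand-in for actual path lengths. The greedy rule and all names are illustrative assumptions, not the patent's algorithm.

```python
import math

def assign_areas(initial_pos, final_pos, areas, autonomy_m):
    """Assign growing areas to a vehicle within its autonomy budget.

    areas: list of (name, (x, y)) area centres, already ordered by priority.
    An area is kept if visiting it, then still reaching the final position
    (e.g. the charging station), fits in the remaining autonomy.
    """
    assigned = []
    pos = initial_pos
    used = 0.0
    for name, centre in areas:
        detour = math.dist(pos, centre)
        back = math.dist(centre, final_pos)
        if used + detour + back <= autonomy_m:
            assigned.append(name)
            used += detour
            pos = centre
    return assigned
```

With both initial and final positions at the charging station, an area 10 m away fits a 30 m budget, while farther areas are deferred until the vehicle has been recharged.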
[0092] It should be pointed out that the stand-alone vehicle 140 generally moves slowly, at a speed in the range of 0.5 km/h to 2 km/h during maintenance of a growing area 120, and at a speed in the range of 4 km/h to 6 km/h during a relocation travel, i.e. when it is not performing maintenance. This relocation may take place inside or outside a growing area 120.
[0093] A travel plan for the stand-alone vehicle 140 is then determined during a fourth step 240 of the method 200. This travel plan comprises an optimized trajectory for the stand-alone vehicle 140 allowing making the best use of the energy stored in an energy storage device comprised in the stand-alone vehicle 140, while complying with the movement constraints according to the growing rows of each growing area treated by the stand-alone vehicle 140.
[0094] During a fifth step 250 of the method 200, the stand-alone vehicle 140 moves by following the previously established travel plan.
[0095] In order to improve the assignment of the growing areas to be treated to the stand-alone vehicle 140, the method 200 may comprise, prior to step 230, a step 260 of generating, by the computing device 160, a diagnosis of the agronomic pressure of the weeds inside all or part of the growing areas 120. The agronomic pressure due to a weed is assessed according to the nature of this weed, its density in a growing area and/or its development state. The agronomic pressure of a weed reflects the capacity of the weed to become predominant in the analyzed growing area, thereby slowing down the growth of the cultivated plant in the area.
[0096] Thus, if the agronomic pressure of the weeds is low, the plant selected for cultivation can grow normally. Conversely, if the agronomic pressure of the weeds is high, for example higher than a predetermined threshold, the growth of the selected cultivated plant is much lower because the weeds become predominant in the growing area and divert elementary resources, such as water and sunlight, to their benefit, to the detriment of the growth of the selected cultivated plant.
[0097] For illustration, Table 1 presents the number of plants per m² sufficient to reduce the yield of straw cereals by 5%.
TABLE 1

  Weed                   Plants per m²
  Galium                   1.8
  Common wild oat          5.3
  Poppy                   22.0
  Matricaria              22.0
  Ryegrass                25.0
  Foxtail grass           26.0
  Starwort                26.0
  Veronica persica        26.0
  Veronica filiformis     44.0
  Lamium                  44.0
  Myosotis                66.0
  Viola                  133.0
  Alchemilla             133.0
[0098] In order to assess the agronomic pressure related to a weed in a growing area, a recognition of the weed is generally performed during a first sub-step 261 of step 260 illustrated by a specific block diagram shown in
[0099] For example, the recognition is performed by means of a neural network type automatic learning method, trained beforehand on a database of images of weeds whose identification has been performed beforehand. More specifically, the neural network allows indicating a probability of the presence of a weed in an image provided to the neural network.
[0100] It should be pointed out that the images used for recognition are generally distinct from those used for mapping. Indeed, in order to have a good recognition, it is preferable to use images having a very good resolution with a more restricted field of view than that used for mapping. To this end, the drone 150 generally performs an overflight of the agricultural land at a lower altitude, on the order of three meters, in order to acquire more accurate images of the ground during an optional preliminary sub-step 269. The drone 150 may also be equipped with several image acquisition devices with separate objectives, one dedicated to the acquisition of images for mapping with a wide field of view, i.e. with a short focal length, and the other dedicated to the acquisition of images for the recognition of a weed with a narrower field of view, i.e. with a longer focal length.
[0101] In the case of two stand-alone image acquisition devices, it may be considered to dedicate one to the acquisition of images for mapping and the other one to the acquisition of images for the recognition of a weed.
[0102] Afterwards, an assessment of the number of weeds of the same species in an image is performed during a second sub-step 262 of step 260. It should be pointed out that the assessment of the number of weeds on a surface is equivalent to the assessment of the density of weeds, for example per square meter.
[0103] For example, the assessment of the number of weeds may be performed automatically by counting the recognized weeds for a given species. This assessment may be performed on all images covering a growing area 120, or on a sample of images associated to a growing area 120.
[0104] The development state associated with a weed species detected in a growing area 120 is assessed during a third sub-step 263 of step 260 from the acquired images, by estimating for example a characteristic value related to the growth of the weed, such as the size of the weed, the number of leaves, the leaf density, the size of a leaf, etc. Afterwards, this characteristic value is compared to a scale characteristic of the growth of the weed to assess the associated development state.
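The comparison of a characteristic value against a growth scale can be sketched as follows; the scale thresholds and state names below are invented for the illustration, real scales depend on the species.

```python
def development_state(characteristic_value, scale):
    """Map a measured characteristic value to a development state.

    scale: list of (threshold, state) pairs in increasing threshold order;
    the state returned is the last one whose threshold the measured value
    reaches (e.g. a leaf count read against a BBCH-like staging scale).
    """
    state = scale[0][1]
    for threshold, name in scale:
        if characteristic_value >= threshold:
            state = name
    return state
```

For instance, with a toy scale based on leaf count, a weed with four leaves is staged between the two-leaf state and tillering.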
[0105] In a variant of this particular implementation of the invention, the development state is obtained concomitantly with the identification of the weed by the automatic learning method. In such case, the database comprises weed images each associated with a development state of the weed.
[0106] A growth model adapted to the weed may be used to predict the evolution at a given time point of the development state of the weed, during an optional fourth sub-step 264 of step 260. For example, such growth models are described in the book "Plant Architecture and Growth, Modeling and Applications" by P. de Reffye et al.
[0107] It should be pointed out that the growth model used can be supplemented with current and/or predicted meteorological data, such as temperature, wind direction, hygrometry, sunshine or rainfall, in order to improve the prediction of the evolution of the state of the weed. This prediction may be performed over one day, three days or five days, and possibly over one or two months.
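As an illustration of supplementing a growth model with meteorological data, a very common simple model accumulates growing degree-days from forecast daily temperatures. This sketch is not the patent's model; the base temperature below is an assumption, and real values depend on the weed species.

```python
def predict_development(current_gdd, forecast_temps_c, base_temp_c=5.0):
    """Accumulate growing degree-days over forecast daily mean temperatures.

    Each day contributes max(0, mean temperature - base temperature).
    Returns the accumulated value after each forecast day; a development
    state can then be read off a per-species degree-day scale.
    """
    gdd = current_gdd
    trajectory = []
    for temp in forecast_temps_c:
        gdd += max(0.0, temp - base_temp_c)
        trajectory.append(gdd)
    return trajectory
```

A day below the base temperature contributes nothing, so a cold spell flattens the predicted evolution of the development state.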
[0108] Based on the current and/or future development state of the weed, an assessment of the agronomic pressure associated with each weed detected in a growing area is performed during the fifth sub-step 265 of step 260. For example, a weed with an advanced development state has a higher agronomic pressure than a weed with a lower development state.
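A possible scoring of the agronomic pressure of one weed, combining its nature (via the harmfulness thresholds of Table 1), its density and its development state, can be sketched as follows. The scoring formula itself is an assumption made for the illustration; the patent does not specify one.

```python
# Plants per m² sufficient to reduce the yield of straw cereals by 5%
# (excerpt from the Table 1 values)
HARMFULNESS_THRESHOLD = {
    "galium": 1.8,
    "common wild oat": 5.3,
    "poppy": 22.0,
}

def agronomic_pressure(species, density_per_m2, development_factor=1.0):
    """Illustrative pressure score for one weed in a growing area.

    The density is normalized by the species' harmfulness threshold, then
    scaled by a factor reflecting the development state (1.0 for a
    reference state, higher for a more advanced state).
    """
    return development_factor * density_per_m2 / HARMFULNESS_THRESHOLD[species]
```

Under this rule, galium at 3.6 plants per m² scores twice the pressure of poppy at its own 5%-yield-loss threshold, reflecting the much greater harmfulness of galium at equal density.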
[0109] It should be pointed out that the assessment of this agronomic pressure may also take into account a comparison of the growth of the weed with the growth of the cultivated plant at a given time point, either current or future.
[0110] For illustration, Table 2 presents an example of the evolution of the agronomic pressure of different types of weeds, taking into account a simplified model of plant growth according to the sunshine duration, the temperature and the hygrometry. The parameters used herein are 14 h of sunshine per day, a hygrometry of 70% and a temperature of 10° C.
TABLE 2

  Area     Evolution index    D0    D0+5    D0+10    D0+15    D0+20
  PAA 1    0.3                12    14      17       21       27
  PAA 2    0.2                25    26      28       32       37
  PAA 3    1.4                 2     9      23       45       73
  PAA 4    1.1                57    62      73       89      110
  PAA 5    2.1                 5    16      37       68      111
[0111] In Table 2, it can be noticed that the area PAA 5 could be treated as a priority instead of the area PAA 2 because, although it has a lower agronomic pressure at time D0, the weed will tend to grow very quickly, as shown by the value of the agronomic pressure after 20 days, which is three times higher than the agronomic pressure of the area PAA 2. Thus, the order of the growing areas 120 to be treated may be modified while taking into account the agronomic pressures of the weeds expected at a given time period.
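The reordering described above can be sketched with the PAA 2 and PAA 5 values from Table 2; the function and variable names are invented for the example.

```python
# Agronomic pressure per area: (value at D0, predicted value at D0 + 20 days),
# taken from the Table 2 example
AREAS = {"PAA 2": (25, 37), "PAA 5": (5, 111)}

def priority_order(areas, horizon):
    """Order growing areas by decreasing agronomic pressure.

    horizon selects which value drives the ordering: 0 for the current
    pressure, 1 for the pressure predicted at the later time point.
    """
    return sorted(areas, key=lambda area: areas[area][horizon], reverse=True)
```

Ordered on the current pressure, the list starts with PAA 2; ordered on the 20-day prediction, it starts with PAA 5, matching the reasoning above.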
[0112] A representative value of the agronomic pressure of the weeds is calculated during a sixth sub-step 266 of step 260 for each growing area 120 by combining the agronomic pressures of each weed detected on each growing area 120. For example, the representative value may correspond to a maximum value, to an average value of the agronomic pressures or to a sum of the values of the agronomic pressures associated with each weed detected in the growing area 120.
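The three combination rules mentioned (maximum, average or sum) can be sketched in a few lines; the function name is an assumption for the illustration.

```python
def representative_pressure(weed_pressures, mode="max"):
    """Combine the pressures of all weeds detected in one growing area.

    weed_pressures: one agronomic pressure value per detected weed.
    mode: "max", "mean" or "sum", matching the three options above.
    """
    if mode == "max":
        return max(weed_pressures)
    if mode == "mean":
        return sum(weed_pressures) / len(weed_pressures)
    if mode == "sum":
        return sum(weed_pressures)
    raise ValueError("unknown mode: " + mode)
```

The choice of rule matters: "max" highlights the single worst weed, while "sum" penalizes areas infested by many moderate weeds.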
[0113] It should be pointed out that an agronomic pressure or a representative value of the agronomic pressure in a growing area may be weighted according to the type of the cultivated plant in the corresponding growing area, in order to take into account the relationship between the development of the cultivated plant and of the weeds. Furthermore, this weighting allows better comparing the agronomic pressures of the weeds between different growing areas in which different plants are cultivated.
[0114] Afterwards, a classification of the growing areas 120 to be treated as a priority is performed during a seventh sub-step 267 of step 260.
[0115] The list assigned to the stand-alone vehicle 140 during step 230 can then take into account the order of the growing areas 120 to be treated as a priority. Thus, a growing area 120 wherein the agronomic pressure of weeds is high can be treated as a priority by the stand-alone vehicle 140. It should be pointed out that this priority growing area 120 is not necessarily the growing area 120 closest to the stand-alone vehicle 140, which will thus expend energy to move to this priority growing area 120.
[0116] In the case where the maintenance system 100 comprises a plurality of stand-alone vehicles 140, the distribution of the growing areas 120 to be treated between the different stand-alone vehicles is optimized by taking into account the order of the growing areas 120 to be treated as a priority and the position of each stand-alone vehicle 140 at a given time point. This optimization generally also takes into account the position of one or more charging station(s) 145 by managing the charging times of each stand-alone vehicle 140.
[0117] Indeed, it should be pointed out that the number of charging stations 145 may be smaller than the number of stand-alone vehicles 140. In such a case, since all stand-alone vehicles 140 cannot be charged simultaneously, a clever management of the charging periods of each stand-alone vehicle 140 may be put in place in order to avoid unnecessarily immobilizing a stand-alone vehicle 140.
[0118] In order to prevent a stand-alone vehicle 140 from being disturbed by an obstacle during its movement, or from moving outside of a dedicated space, the method 200 may advantageously comprise, before the step 240 of determining a travel plan, a step 270 of determining at least one area of the map where the movement of a stand-alone vehicle 140 is prohibited. This prohibited area may be automatically identified beforehand on an image by recognizing, for example, an obstacle such as a pole, a tree, a river, a ditch, a significant relief or a roadside. This prohibited area may also be added to the map by an operator.
[0119] Thus, the stand-alone vehicle 140 can maintain the agricultural land 110 with due regard to the order of areas 120 to be treated which are assigned thereto.
[0120] To this end, as illustrated in more detail in
[0121] The frame 420 accommodates an electronic device 460 equipped with a processor 461 and a computer memory 462, which in particular controls the movement of the stand-alone vehicle 400 based on the previously established travel plan.
[0122] To follow the travel plan, the stand-alone vehicle 140 may advantageously comprise a geolocation device 470, for example of the satellite geo-positioning type such as the GPS system (acronym of “Global Positioning System”) or Galileo.
[0123] It should be pointed out that the travel plan has generally been communicated to the stand-alone vehicle 140 via a wireless communication device 475 included in the stand-alone vehicle 400, and connected to the electronic device 460.
[0124] To locally detect the presence of a weed, the stand-alone vehicle 140 is herein provided with four cameras 490 located at the front of the vehicle 140, whose optical axes are advantageously oriented towards the ground on which the stand-alone vehicle 140 moves. Thus, images acquired at regular intervals by the cameras are analyzed by the electronic device 460, which can control the action of a mechanical tool 410 depending on the weed recognized in the image. The recognition of a weed is herein performed by a neural network-type automatic learning method, previously trained on a database of images of identified weeds.
[0125] The images acquired by the cameras 490 may also be transmitted to the computing device 160 of the maintenance system 100 in order to improve mapping of the agricultural land and determination of a better agronomic pressure of the weeds. Thus, a correlation may be performed between the agronomic pressure of the weeds assessed on the growing area 120 being treated by the stand-alone vehicle 400 and the agronomic pressure of the weeds observed during the treatment. This correlation allows improving the assessment of the agronomic pressure of weeds based on the images acquired by means of the drone 150.
[0127] The drone 150 includes a frame 510 provided with four arms 515 extending in a main plane 511 of the frame 510. A motor-driven propeller 520 is disposed at the end of each arm 515, each rotating in a plane parallel to the main plane 511 of the frame 510. The four propellers 520, disposed facing a so-called upper face of the drone 150, thereby generate, during actuation thereof, a lift force for the drone 150.
[0128] The drone 150 has on a face opposite to the upper face, an image acquisition device 530, in this instance a camera, whose optical axis 531 is herein perpendicular to the main plane 511 of the frame 510. The optical axis 531 is directed so as to enable the camera 530 to acquire aerial images of the land 110 captured substantially vertically.
[0129] The drone 150 also comprises a battery 540 for storing electrical energy as well as an inductive charging device 545.
[0130] It should be pointed out that the drone 150 can move automatically or be piloted by an operator.