METHOD FOR PLANTATION TREATMENT OF A PLANTATION FIELD WITH A VARIABLE APPLICATION RATE
20220167546 · 2022-06-02
Inventors
CPC classification
A01C21/002
HUMAN NECESSITIES
A01B79/02
HUMAN NECESSITIES
International classification
A01B79/02
HUMAN NECESSITIES
A01M7/00
HUMAN NECESSITIES
Abstract
A method for plantation treatment of a plantation field, the method comprising: determining (S10) an application rate decision logic (10) based on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); determining (S40) an application rate based on the determined application rate decision logic (10) and the recognized objects (30); and determining (S50) a control signal (S) for controlling a treatment arrangement (270) of a treatment device (200) based on the determined application rate.
Claims
1. A method for plantation treatment of a plantation field, the method comprising: determining (S10) an application rate decision logic (10) based on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); determining (S40) an application rate based on the determined application rate decision logic (10) and the recognized objects (30); and determining (S50) a control signal (S) for controlling a treatment arrangement (270) of a treatment device (200) based on the determined application rate.
2. The method of claim 1, wherein: taking (S20) an image (20) of a plantation of a plantation field (300), recognizing (S30) objects (30) on the taken image (20), determining (S40) an application rate and determining (S50) a control signal (S) for controlling a treatment arrangement (270) are carried out as a real-time process, such that the treatment device (200) is instantaneously controllable based on taken images of the plantation field as the treatment device traverses the field at the time of treatment at a specific location of the plantation field (300).
3. The method of claim 1, wherein: the application rate decision logic (10) provides a logic to determine the application rate for treating the plantation depending on an expected efficacy loss of a crop to be cultivated in the plantation field.
4. The method of claim 1, wherein: the application rate decision logic (10) includes variable application rates depending on one or more parameter(s) derived from the image and/or object recognition.
5. The method of claim 1, wherein: recognizing objects (30) is based on the taken image (20) and/or geometric object profiles, modeling the geometry of objects based on their species and/or their growth stage.
6. The method of claim 1, wherein: determining an application rate based on the recognized objects includes determining object species, object growth stages and/or object density.
7. The method of claim 1, further comprising: receiving online field data (Don) by the treatment device (200) relating to current conditions on the plantation field (300); and determining the control signal (S) dependent on the determined application rate decision logic (10) and the determined recognized objects (30) and/or the determined online field data (Don).
8. The method of claim 7, wherein: the online field data (Don) relates to current weather condition data, current plantation growth data and/or current soil data.
9. The method of claim 1, further comprising: providing validation data (V) dependent on a performance review of the treatment of the plantation; and adjusting the application rate decision logic (10) dependent on the validation data (V).
10. The method of claim 9, further comprising: adjusting the geometric object profiles based on the validation data (V).
11. The method of claim 1, further comprising: adjusting the application rate decision logic (10) using a machine learning algorithm.
12. A field manager system (100) for a treatment device (200) for plantation treatment of a plantation field (300), the field manager system (100) comprising: an offline field data interface (150) being adapted for receiving offline field data (Doff) relating to expected conditions on the plantation field (300); a machine learning unit (110) being adapted for determining the application rate decision logic (10) of the treatment device (200) dependent on the offline field data (Doff); and a decision logic interface (140), being adapted for providing the application rate decision logic (10) to a treatment device (200).
13. The field manager system (100) of claim 12, further comprising: a validation data interface (160) being adapted for receiving validation data (V), wherein the machine learning unit (110) is adapted for adjusting the application rate decision logic (10) dependent on the validation data (V).
14. A treatment device (200) for plantation treatment of a plant, the treatment device (200) comprising: an image capture device (220) being adapted for taking an image (20) of a plantation; a decision logic interface (240) being adapted for receiving an application rate decision logic (10) from a field manager system (100) according to claim 12; a treatment arrangement (270) being adapted for treating the plantation dependent on the received application rate decision logic (10); an image recognition unit (230) being adapted for recognizing objects (30) on the taken image (20); and a treatment control unit (210) being adapted for determining a control signal (S) for controlling a treatment arrangement (270) dependent on the received application rate decision logic (10) and the recognized objects (30); wherein the decision logic interface (240) of the treatment device (200) is connectable to a decision logic interface (140) of the field manager system (100); and wherein the treatment device (200) is adapted to activate the treatment arrangement (270) based on the control signal (S) of the treatment control unit (210).
15. The treatment control device of claim 14, further comprising: an online field data interface (240) being adapted for receiving online field data (Don) relating to current conditions on the plantation field (300), wherein the treatment control unit (210) is adapted for determining a control signal (S) for controlling a treatment arrangement (270) dependent on the received application rate decision logic (10) and the recognized objects (30) and/or the online field data (Don).
16. The treatment device of claim 14, wherein the image capture device (220) comprises one or a plurality of cameras, in particular on a boom of the treatment device (200), wherein the image recognition unit (230) is adapted for recognizing objects using red-green-blue RGB data and/or near infrared NIR data.
17. The treatment device of claim 14, wherein the treatment device (200) is designed as a smart sprayer, and wherein the treatment arrangement (270) is a nozzle arrangement.
18. The treatment device of claim 14, wherein the image capture device (220) comprises a plurality of cameras and the treatment arrangement (270) comprises a plurality of nozzle arrangements, each being associated to one of the plurality of cameras, such that images captured by the cameras are associated with the area to be treated by the respective nozzle arrangement.
19. A treatment system comprising a field manager system according to claim 12.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0098] Exemplary embodiments will be described in the following with reference to the following drawings:
DETAILED DESCRIPTION OF EMBODIMENTS
[0104] The treatment device 200, preferably a smart sprayer, comprises a treatment control unit 210, an image capture device 220, an image recognition unit 230 and a treatment arrangement 270, as well as an application rate decision logic interface 240 and an online field data interface 250.
[0105] The image capture device 220 comprises at least one camera, configured to take an image 20 of a plantation field 300. The taken image 20 is provided to the image recognition unit 230 of the treatment device 200.
[0106] The field manager system 100 comprises a machine learning unit 110. Additionally, the field manager system 100 comprises an offline field data interface 150, an application rate decision logic interface 140 and a validation data interface 160. The field manager system 100 may refer to a data processing element such as a microprocessor, microcontroller, field programmable gate array (FPGA), central processing unit (CPU) or digital signal processor (DSP) capable of receiving field data, e.g. via a universal serial bus (USB), a physical cable, Bluetooth, or another form of data connection. The field manager system 100 may be provided for each treatment device 200. Alternatively, the field manager system may be a central field manager system, e.g. a personal computer (PC), for controlling multiple treatment devices 200 in the field 300.
[0107] The field manager system 100 is provided with offline field data Doff relating to expected condition data of the plantation field 300. Preferably, the offline field data Doff comprises local yield expectation data, resistance data relating to a likelihood of resistance of the plantation against a treatment product, expected weather condition data, expected plantation growth data, zone information data relating to different zones of the plantation field, expected soil moisture data and/or legal restriction data.
[0108] The offline field data Doff is provided from external repositories. For example, the expected weather condition data is provided by a weather station providing a weather forecast. The weather station can also be a local weather station disposed on the plantation field or on the treatment device. Alternatively, the expected weather condition data can be provided by a service provider, which preferably uses satellite data for forecasting the weather. The expected plantation growth data is, for example, provided by a database storing different plantation growth stages, or by plantation growth stage models, which make statements on the expected growth stage of a crop plant, a weed and/or a pathogen dependent on past field condition data. Alternatively, the expected plantation growth data is provided by plantation models, which are essentially digital twins of the respective plantation and estimate its growth stage, in particular dependent on former field data. Further, the expected soil moisture data is, for example, determined dependent on past, present and expected weather condition data. The offline field data Doff may also be provided by an external service provider.
[0109] Dependent on the offline field data Doff, the machine learning unit 110 determines an application rate decision logic 10. Preferably, the machine learning unit 110 knows the planned time of treatment of the plantation. For example, a farmer provides the field manager system 100 with the information that he plans to treat the plantation in a certain field the next day. The application rate decision logic 10 is preferably represented as a configuration file that is provided to the application rate decision logic interface 140 of the field manager system 100. Ideally, the application rate decision logic 10 is determined by the machine learning unit 110 on the same day the treatment device 200 uses it. Via the application rate decision logic interface 140, the application rate decision logic 10 is provided to the treatment device 200, in particular to the application rate decision logic interface 240 of the treatment device 200. For example, the application rate decision logic 10 in the form of a configuration file is uploaded to a memory of the treatment device 200.
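The description does not specify the format of the configuration file. Purely as an illustration, the application rate decision logic could be serialized as follows; all field names and the JSON structure shown here are assumptions, not the format used by the actual system:

```python
import json

# Hypothetical serialization of an application rate decision logic (10)
# as a configuration file. Rates and thresholds follow the worked example
# in the description; the structure itself is illustrative.
decision_logic = {
    "crop": "soy",
    "rules": [
        {"if": "biomass_pct < 0.2", "then": {"rate_l_per_ha": 0.0}},
        {"if": "species == 'Digitaria sanguinalis'", "then": {"rate_l_per_ha": 1.0}},
        {"if": "species == 'Amaranthus retroflexus'", "then": {"rate_l_per_ha": 2.0}},
        {"if": "biomass_pct > 4.0", "then": {"rate_multiplier": 1.10}},
    ],
    "expected_soil_moisture": "above_threshold",
}

# The configuration file uploaded to the treatment device's memory.
config_text = json.dumps(decision_logic, indent=2)
print(config_text)
```

A real configuration file would additionally encode the nozzle mapping and validity period; this sketch only shows how a decision tree can be carried as structured data.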
[0110] When the application rate decision logic 10 is received by the treatment device 200, in particular the treatment control unit 210, the treatment of plantation in the plantation field 300 can begin.
[0111] The treatment device 200 moves around the plantation field 300 and detects and recognizes objects 30, in particular crop plants, weeds, pathogens and/or insects on the plantation field 300.
[0112] To this end, the image capture device 220 constantly takes images 20 of the plantation field 300. The images 20 are provided to the image recognition unit 230, which runs an image analysis on each image 20 and detects and/or recognizes objects 30 on the image 20. The objects 30 to detect are preferably crops, weeds, pathogens and/or insects. Recognizing objects includes recognizing a plantation, preferably a type of plantation and/or a plantation size, an insect, preferably a type of insect and/or an insect size, and/or a pathogen, preferably a type of pathogen and/or a pathogen size. For example, the difference between Amaranthus retroflexus and Digitaria sanguinalis, or between a bee and a locust, is recognized. The objects 30 are provided to the treatment control unit 210.
[0113] If the image recognition analysis detects an object 30 but is not able to recognize the object 30 and/or its growth stage, the image recognition unit 230 is provided with geometric object profiles relating to expected geometric appearances of different plantations. The image recognition unit 230 may be unable to recognize the object 30 and/or its growth stage because of many different factors, such as reflections, unexpected weather conditions and/or unexpected growth stages of the plantation.
[0114] The geometric object profiles model the geometry of objects based on their species and/or their growth stage. Based on the taken image 20 and the geometric object profiles, the image recognition unit 230 may be able to recognize objects 30 and/or their growth stage that could not have been recognized without the geometric object profiles.
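As an illustration of how such geometric object profiles could be applied, the following sketch matches a detected but unrecognized object against expected size ranges per species. The profile structure and the size values are assumptions for illustration, not taken from the description:

```python
# Hypothetical geometric object profiles: expected ground-area ranges per
# species and growth stage. Values are illustrative assumptions only.
PROFILES = [
    {"species": "Digitaria sanguinalis", "growth_stage": "early",
     "min_area_cm2": 1.0, "max_area_cm2": 25.0},
    {"species": "Amaranthus retroflexus", "growth_stage": "early",
     "min_area_cm2": 25.0, "max_area_cm2": 400.0},
]

def match_profile(object_area_cm2: float):
    """Return the first profile whose expected size range covers the
    detected object's area, or None if the object stays unidentified."""
    for profile in PROFILES:
        if profile["min_area_cm2"] <= object_area_cm2 <= profile["max_area_cm2"]:
            return profile
    return None

print(match_profile(120.0)["species"])  # -> Amaranthus retroflexus
```

In the example of paragraph [0123], Digitaria sanguinalis is expected to be much smaller than Amaranthus retroflexus at treatment time, which is what makes a size-based profile like this discriminative.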
[0115] The treatment control unit 210 has been provided with the application rate decision logic 10 in the form of the configuration file. The application rate decision logic 10 can be illustrated as a decision tree, wherein, based on input data and over different layers of decisions, the treatment of a plantation and the application rate of the treatment product are decided. For example, in a first step it is checked whether the biomass of the detected weed exceeds a predetermined threshold set up by the application rate decision logic 10. The biomass of the weed generally relates to the degree of coverage of the weed in the taken image 20. For example, if the biomass of the weed is below 4%, it is decided that the weed is not treated at all. If the biomass of the weed is above 4%, further decisions are made. For example, in a second step, if the biomass of the weed is above 4%, it is decided dependent on the moisture of the soil whether the weed is treated. If the moisture of the soil exceeds a predetermined threshold, it is decided to treat the weed, and otherwise it is decided not to treat the weed. This is because the herbicides used to treat the weed are more efficient when the weed is in a growth phase, which is triggered by a high soil moisture. The application rate decision logic 10 already includes information about the expected soil moisture. Since it has been raining in the past days, the expected soil moisture is above the predetermined threshold and it would be decided to treat the weed. However, the treatment control unit 210 is also provided with online field data Don, in this case from a soil moisture sensor, providing the treatment control unit 210 with additional data. The decision tree of the configuration file is therefore evaluated based on the online field data Don. In an exemplary embodiment, the online field data Don comprises the information that the soil moisture is below the predetermined threshold. Thus, it is decided not to treat the weed.
[0116] The treatment control unit 210 generates a treatment control signal S based on the application rate decision logic 10, the recognized objects and/or the online field data Don. The treatment control signal S therefore contains the information whether the recognized object 30 should be treated or not. The treatment control unit 210 then provides the treatment control signal S to the treatment arrangement 270, which treats the plantation based on the control signal S. The treatment arrangement 270 comprises in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, plant growth regulator, insecticide and/or fungicide with high precision.
[0117] Thus, an application rate decision logic 10 is provided dependent on offline field data Doff relating to an expected field condition. Based on the application rate decision logic 10, a treatment device 200 can decide which plantations should be treated based only on the objects recognized in the field. Thus, the efficiency of the treatment and/or the efficacy of the treatment product can be improved. In order to further improve the efficiency of the treatment and/or the efficacy of the treatment product, online field data Don can be used to include currently measurable conditions of the plantation field.
[0118] The provided treatment system 400 is additionally capable of learning. The machine learning unit 110 determines the application rate decision logic 10 dependent on a given heuristic. After the plantation treatment based on the provided application rate decision logic 10, it is possible to validate the efficiency of the treatment and the efficacy of the treatment product. For example, the farmer can provide the field manager system 100 with field data of a part of the plantation field that has previously been treated based on the application rate decision logic 10. This information is referred to as validation data V. The validation data V is provided to the field manager system 100 via the validation data interface 160, which passes the validation data V to the machine learning unit 110. The machine learning unit 110 then adjusts the application rate decision logic 10, or the heuristic used to determine it, according to the validation data V. If, for example, the validation data V indicates that a weed treated based on the application rate decision logic 10 was not killed, the adjusted application rate decision logic 10 lowers the threshold for treating the plantation in one of the branches of the underlying decision tree.
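The adjustment described above can be sketched as a simple rule. This toy heuristic merely stands in for the machine learning unit 110; the dictionary keys and the fixed step size are assumptions for illustration:

```python
def adjust_threshold(treat_threshold_pct: float,
                     validation: dict,
                     step_pct: float = 0.5) -> float:
    """Toy stand-in for the learning step: if the validation data V
    reports that treated weeds survived, lower the biomass threshold
    in the relevant decision-tree branch so that smaller weeds are
    treated on the next pass. Keys and step size are illustrative."""
    if validation.get("treated_weed_survived", False):
        return max(0.0, treat_threshold_pct - step_pct)
    return treat_threshold_pct

print(adjust_threshold(4.0, {"treated_weed_survived": True}))   # -> 3.5
print(adjust_threshold(4.0, {"treated_weed_survived": False}))  # -> 4.0
```

An actual machine learning unit would fit such thresholds from many validated treatments rather than apply a fixed decrement.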
[0119] As an alternative to the application rate decision logic 10 in the form of a configuration file provided by an external field manager system 100 to a treatment device 200, the functionality of the field manager system 100 can also be embedded into the treatment device 200. For example, a treatment device with relatively high computational power is capable of integrating the field manager system 100 within the treatment device 200. Alternatively, the whole described functionality of the field manager system 100, and the functionality up to the determination of the control signal S by the treatment device 200, can be calculated externally to the treatment device 200, preferably in a cloud service. The treatment device 200 is then only a “dumb” device treating the plantation dependent on a provided control signal S.
[0121] In this example, the plantation to be treated is weed on a plantation field 300. The weed is to be treated with a treatment product such as a herbicide. The crop to be cultivated in this example is soy.
[0122] The country of cultivation in this example is Brazil. The weed spectrum known to the machine learning unit 110 for the application rate decision logic 10 comprises Digitaria sanguinalis (sourgrass) and Amaranthus retroflexus (pigweed).
[0123] Based on rainfall events, the maximum growth stage of the weeds is modelled for both species and geometric object profiles are derived to differentiate them. For example, Digitaria sanguinalis is expected to be much smaller than Amaranthus retroflexus at the time of the treatment. Based on these assumptions, application rates are calculated to treat Digitaria sanguinalis with 1 l/ha of the herbicide and Amaranthus retroflexus with 2 l/ha of the herbicide.
[0124] The user is provided with a configuration file for the treatment device 200 and a tank recipe corresponding to the expected weeds.
[0125] On the plantation field 300, the user validates the presence of Digitaria sanguinalis and Amaranthus retroflexus with an image recognition app, for example xarvio scouting.
[0126] The user sets up the treatment device 200 with the configuration file and fills the treatment device 200 according to the tank recipe, in particular with a combination of water and the herbicide. The treatment device 200 comprises a treatment arrangement 270 with several independent nozzles. The configuration file will tell the treatment device 200 how to control each nozzle of the treatment device 200 based on the treatment control signal S and in particular a GPS-position and online data from sensors of the treatment device 200.
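The per-nozzle control described above can be sketched as follows, assuming one camera per nozzle as in claim 18; the nozzle count and the detection format are illustrative assumptions:

```python
# Illustrative per-nozzle control: each nozzle is associated with one
# camera (claim 18), so a weed detected in camera i's image opens only
# nozzle i. Nozzle count and detection format are assumptions.
NUM_NOZZLES = 8

def nozzle_states(detections):
    """detections: list of (camera_index, treat_flag) pairs derived from
    the treatment control signal S. Returns an on/off flag per nozzle."""
    states = [False] * NUM_NOZZLES
    for camera_index, treat in detections:
        if treat and 0 <= camera_index < NUM_NOZZLES:
            states[camera_index] = True
    return states

print(nozzle_states([(2, True), (5, False), (7, True)]))
# -> [False, False, True, False, False, False, False, True]
```

In the described system the configuration file additionally conditions this mapping on the GPS position and online sensor data; this sketch shows only the camera-to-nozzle association.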
[0127] When treating the weeds, the following steps are executed.
[0128] In step S10, an application rate decision logic 10 is determined based on offline field data Doff relating to expected conditions on the plantation field 300. In step S20, an image 20 of a plantation of a plantation field 300 is taken. In step S30, objects 30 are recognized on the taken image 20.
[0129] In step S40, the application rate is determined based on the application rate decision logic 10 and the recognized objects 30.
[0130] In this case, according to the application rate decision logic 10, it is checked whether the biomass of the recognized object 30 is greater than 0.2%. The percent value relates to the percentage of the taken image 20 covered by the recognized object 30. If the biomass of the recognized object 30 is lower than 0.2%, the application rate of the herbicide is set to 0 l/ha, since the recognized object 30 is too small to be efficiently treated. If the biomass of the recognized object 30 is greater than 0.2%, the species of the recognized object 30 is checked. The percentage of the biomass of the recognized object 30 directly relates to the species and/or the growth stage of the object 30.
[0131] If the species of the weed, and therefore of the recognized object 30, is Digitaria sanguinalis, the application rate of the herbicide is set to 1 l/ha. If the species of the weed is Amaranthus retroflexus, the application rate of the herbicide is set to 2 l/ha. If the species of the weed cannot be determined and the object therefore remains unidentified, geometric object profiles are provided to the image recognition unit 230, in particular by the machine learning unit 110, and step S30, recognizing objects 30 on the taken image 20, is redone.
[0132] Afterwards, the species of the weed is likely to be recognized and the application rate is set according to the respective species. Alternatively, it is checked whether the species of the weed is most likely Amaranthus retroflexus, and in that case the application rate is set to 2 l/ha of herbicide. If the species of the weed is not likely Amaranthus retroflexus, the application rate is set to 1 l/ha of herbicide.
[0133] According to the application rate decision logic 10, it is now checked whether the biomass of the recognized weed is greater than 4%. If the biomass of the recognized weed is greater than 4%, the application rate of the herbicide is increased by 10%. Otherwise, the application rate of the herbicide is unchanged.
[0134] According to the application rate decision logic 10, it is now checked whether the humidity of the environment of the plantation field 300 is greater than 50%. If the humidity is greater than 50%, the application rate of the herbicide is increased by 10%. Otherwise, the application rate of the herbicide is unchanged.
[0135] According to the application rate decision logic 10, it is now checked whether an intensive tillage system is present on the plantation field 300. If so, the application rate of the herbicide is decreased by 5%. Otherwise, the application rate of the herbicide is unchanged.
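The sequence of checks in paragraphs [0130] to [0135] can be summarized in a single function. The thresholds and rates are the ones given above; treating an undetermined species as "likely Amaranthus retroflexus" (paragraph [0132]) is a simplification:

```python
def application_rate(species: str, biomass_pct: float,
                     humidity_pct: float, intensive_tillage: bool) -> float:
    """Compute the herbicide application rate (l/ha) per the example
    decision logic of paragraphs [0130]-[0135]."""
    if biomass_pct < 0.2:
        return 0.0  # object too small to be efficiently treated
    if species == "Digitaria sanguinalis":
        rate = 1.0
    else:
        # Amaranthus retroflexus, or undetermined but likely so
        rate = 2.0
    if biomass_pct > 4.0:
        rate *= 1.10  # large weed: increase by 10%
    if humidity_pct > 50.0:
        rate *= 1.10  # humid environment: increase by 10%
    if intensive_tillage:
        rate *= 0.95  # intensive tillage system: decrease by 5%
    return round(rate, 3)

# Large Amaranthus retroflexus, humid conditions, no intensive tillage:
print(application_rate("Amaranthus retroflexus", 5.0, 60.0, False))  # -> 2.42
```

Whether the percentage adjustments compound, as here, or apply to the base rate is not specified in the description; compounding is an assumption of this sketch.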
[0136] Furthermore, in step S50, a control signal S for controlling a treatment arrangement 270 of a treatment device 200 is determined based on the determined application rate.
[0138] The UAV 200 has an image capture device 220 comprising one or a plurality of cameras, and as it flies over the plantation field 300 imagery is acquired. The UAV 200 also has a GPS and inertial navigation system, which enables both the position of the UAV 200 to be determined and the orientation of the camera 220 also to be determined. From this information, the footprint of an image on the ground can be determined, such that particular parts in that image, such as the example of the type of crop, weed, insect and/or pathogen can be located with respect to absolute geospatial coordinates. The image data acquired by the image capture device 220 is transferred to an image recognition unit 230.
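The footprint computation described above can be sketched for the simplified case of a nadir-pointing camera over flat ground; the actual system also uses the camera orientation from the inertial navigation system, and all parameters here are illustrative:

```python
def pixel_to_ground(uav_east: float, uav_north: float, altitude_m: float,
                    px: float, py: float, image_w: int, image_h: int,
                    focal_px: float):
    """Map an image pixel to local ground coordinates for a nadir-pointing
    camera over flat ground (a simplifying assumption). focal_px is the
    focal length expressed in pixels."""
    gsd = altitude_m / focal_px          # metres on the ground per pixel
    dx = (px - image_w / 2.0) * gsd      # offset east of the UAV
    dy = (image_h / 2.0 - py) * gsd      # offset north (image y grows down)
    return uav_east + dx, uav_north + dy

# A pixel at the exact image centre maps to the point directly below the UAV.
print(pixel_to_ground(100.0, 200.0, 30.0, 320, 240, 640, 480, 1000.0))
# -> (100.0, 200.0)
```

With the ground coordinates of a detected weed known, the position can be handed to the treatment arrangement or recorded against absolute geospatial coordinates as described above.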
[0139] The image acquired by the image capture device 220 is at a resolution that enables one type of crop to be differentiated from another, one type of weed to be differentiated from another, and one type of pathogen to be differentiated from another, and that enables insects not only to be detected but also one type of insect to be differentiated from another.
[0140] The image recognition unit 230 may be external to the UAV 200, but the UAV 200 itself may have the necessary processing power to detect and identify crops, weeds, insects and/or pathogens. The image recognition unit 230 processes the images using a machine learning algorithm, for example based on an artificial neural network that has been trained on numerous image examples of different types of crops, weeds, insects and/or pathogens, to determine which object is present and also to determine the type of object.
[0141] The UAV also has a treatment arrangement 270, in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, plant growth regulator, insecticide and/or fungicide with high precision.
[0142] As shown in
[0143] Therefore, the efficiency of the image capture device 220 can be improved.
REFERENCE SIGNS
[0144] 10 application rate decision logic
[0145] 20 image
[0146] 30 objects on image
[0147] 100 field manager system
[0148] 110 machine learning unit
[0149] 140 decision logic interface
[0150] 150 offline field data interface
[0151] 160 validation data interface
[0152] 200 treatment device (UAV)
[0153] 210 treatment control unit
[0154] 220 image capture device
[0155] 230 image recognition unit
[0156] 240 decision logic interface
[0157] 250 online field data interface
[0158] 270 treatment arrangement
[0159] 300 plantation field
[0160] 400 treatment system
[0161] 410 crop
[0162] 420 weed (Amaranthus retroflexus)
[0163] 430 weed (Digitaria sanguinalis)
[0164] 440 unidentified weed
[0165] S treatment control signal
[0166] Don online field data
[0167] Doff offline field data
[0168] V validation data
[0169] S10 determining application rate decision logic
[0170] S20 taking image
[0171] S30 recognizing objects on image
[0172] S40 determining application rate
[0173] S50 determining control signal