METHOD FOR PLANTATION TREATMENT OF A PLANTATION FIELD

20220167606 · 2022-06-02

    Abstract

    A method for plantation treatment of a plantation field, the method comprising: receiving (S10) a parametrization (10) for controlling a treatment device (200) by the treatment device (200) from a field manager system (100), wherein the parametrization (10) is dependent on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); determining (S40) at least one treatment product composition (40) optionally based on the determined parametrization (10), online field data and/or the recognized objects (30); and determining (S50) a control signal (S) for controlling a treatment arrangement (270) of the treatment device (200) based on the determined parametrization (10), the recognized objects (30) and the determined treatment product composition (40).

    Claims

    1. A method for plantation treatment of a plantation field with a treatment product, the method comprising: receiving (S10) a parametrization (10) for controlling a treatment device (200) by the treatment device (200) from a field manager system (100), wherein the parametrization (10) is dependent on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); determining (S40) a treatment product composition optionally based on the determined parametrization, online field data and/or the recognized objects (30); and determining (S50) a control signal (S) for controlling a treatment arrangement (270) of the treatment device (200) based on the determined parametrization (10), the recognized objects (30) and the determined treatment product composition (40).

    2. The method of claim 1, wherein: taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); determining (S40) a treatment product composition; and determining (S50) a control signal (S) for controlling a treatment arrangement (270) are carried out as a real-time process, such that the treatment device (200) is instantaneously controllable based on images of the plantation field taken as the treatment device traverses the field at the time of treatment in a specific location of the field.

    3. The method of claim 1, further comprising: receiving the offline field data (Doff) by the field manager system (100); determining the parametrization (10) of the treatment device (200) dependent on the offline field data (Doff); and providing the determined parametrization (10) to the treatment device (200).

    4. The method of claim 1, wherein determining the control signal (S) includes generating a tank actuator signal and a treatment arrangement signal to control release of the determined treatment product composition.

    5. The method of claim 4, wherein: the treatment device includes a treatment arrangement with more than one nozzle, and the treatment arrangement signal triggers one or more nozzles separately.

    6. The method of claim 1, wherein: the control signal is provided to a control unit of the treatment device to initiate treatment of the plantation field, wherein the control signal is configured to change the treatment product composition or at least one active ingredient of the treatment product composition during field treatment.

    7. The method of claim 1, wherein recognizing (S30) objects (30) includes recognizing a plantation, preferably a type of plantation and/or a plantation size, an insect, preferably a type of insect and/or an insect size, and/or a pathogen, preferably a type of pathogen and/or a pathogen size.

    8. The method of claim 1, further comprising: receiving online field data (Don) by the treatment device (200) relating to current conditions on the plantation field (300); and determining the control signal (S) dependent on the determined parametrization (10), the determined treatment product composition (40) and the determined recognized objects (30) and/or the determined online field data (Don).

    9. The method of claim 8, wherein: the online field data (Don) relates to current weather condition data, current plantation growth data and/or current soil moisture data.

    10. The method of claim 1, further comprising: providing validation data (V) dependent on a performance review of the treatment of the plantation; and adjusting the parametrization (10) dependent on the validation data (V).

    11. The method of claim 1, further comprising: adjusting the parametrization (10) using a machine learning algorithm.

    12. The method of claim 1, wherein: determining a parametrization (10) comprises determining a tank recipe for a treatment product tank of the treatment device (200).

    13. (canceled)

    14. (canceled)

    15. A treatment device (200) for plantation treatment of a plantation, the treatment device (200) comprising: an image capture device (220) being adapted for taking an image (20) of a plantation; a parametrization interface (240) being adapted for receiving a parametrization (10) from a field manager system (100); a treatment arrangement (270) being adapted for treating the plantation dependent on the received parametrization (10); an image recognition unit (230) being adapted for recognizing objects (30) on the taken image (20); and a treatment control unit (210) being adapted for determining (S40) a treatment product composition (40) optionally based on the determined parametrization (10), online field data and/or the recognized objects (30); and adapted for determining a control signal (S) for controlling the treatment arrangement (270) of the treatment device (200) based on the determined parametrization (10), the recognized objects (30) and the determined treatment product composition (40); wherein the parametrization interface (240) of the treatment device (200) is connectable to a parametrization interface (140) of the field manager system (100); wherein the treatment device (200) is adapted to activate the treatment arrangement (270) based on the control signal (S) of the treatment control unit (210).

    16. The treatment device (200) of claim 15, further comprising: an online field data interface (250) being adapted for receiving online field data (Don) relating to current conditions on the plantation field (300); wherein the treatment control unit is adapted for controlling the treatment arrangement (270) of the treatment device (200) based on the determined parametrization (10), the recognized objects (30) and the determined treatment product composition (40) and/or the online field data.

    17. The treatment device of claim 15, wherein the image capture device (220) comprises one or a plurality of cameras, in particular on a boom of the treatment device (200), wherein the image recognition unit (230) is adapted for recognizing insects, weed and/or plantation using red-green-blue RGB data and/or near infrared NIR data.

    18. The treatment device of claim 15, wherein the treatment device (200) is designed as a smart sprayer, wherein the treatment arrangement (270) is a nozzle arrangement.

    19. The treatment device of claim 15, wherein the image capture device (220) comprises a plurality of cameras and the treatment arrangement (270) comprises a plurality of nozzle arrangements, each being associated to one of the plurality of cameras, such that images captured by the cameras are associated with the area to be treated by the respective nozzle arrangement.

    20. A treatment system comprising the treatment device according to claim 15.

    21. A field manager system (100) for a treatment device (200) for treatment of a plantation field (300), the field manager system (100) comprising: an offline field data interface (150) being adapted for receiving offline field data (Doff) relating to expected conditions on the plantation field (300); a machine learning unit (110) being adapted for determining a parametrization (10) of the treatment device (200) dependent on the offline field data (Doff); and a parametrization interface (140) being adapted for providing the parametrization (10) to a treatment device (200) according to claim 15.

    22. The field manager system (100) of claim 21, further comprising: a validation data interface (160) being adapted for receiving validation data (V); wherein the machine learning unit (110) is adapted for adjusting the parametrization (10) dependent on the validation data (V).

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0103] Exemplary embodiments will be described in the following with reference to the following drawings:

    [0104] FIG. 1 shows a schematic diagram of a plantation treatment system;

    [0105] FIG. 2 shows a flow diagram of a plantation treatment method;

    [0106] FIG. 3 shows a schematic view of a zone map of a plantation field;

    [0107] FIG. 4 shows a schematic view of a treatment device on a plantation field; and

    [0108] FIG. 5 shows a schematic view of an image with detected objects.

    DETAILED DESCRIPTION OF EMBODIMENTS

    [0109] FIG. 1 shows a plantation treatment system 400 for treating a plantation of a plantation field 300 by at least one treatment device 200 controlled by a field manager system 100.

    [0110] The treatment device 200, preferably a smart sprayer, comprises a treatment control unit 210, an image capture device 220, an image recognition unit 230 and a treatment arrangement 270 as well as a parametrization interface 240 and an online field data interface 250.

    [0111] The image capture device 220 comprises at least one camera, configured to take an image 20 of a plantation field 300. The taken image 20 is provided to the image recognition unit 230 of the treatment device 200.

    [0112] The field manager system 100 comprises a machine learning unit 110. Additionally, the field manager system 100 comprises an offline field data interface 150, a parametrization interface 140 and a validation data interface 160. The field manager system 100 may refer to a data processing element such as a microprocessor, microcontroller, field programmable gate array (FPGA), central processing unit (CPU), digital signal processor (DSP) capable of receiving field data, e.g. via a universal serial bus (USB), a physical cable, Bluetooth, or another form of data connection. The field manager system 100 may be provided for each treatment device 200. Alternatively, the field manager system may be a central field manager system, e.g. a cloud computing environment or a personal computer (PC), for controlling multiple treatment devices 200 in the field 300.

    [0113] The field manager system 100 is provided with offline field data Doff relating to expected condition data of the plantation field 300. Preferably, the offline field data Doff comprises local yield expectation data, resistance data relating to a likelihood of resistance of the plantation against a treatment product, expected weather condition data, expected plantation growth data, zone information data, relating to different zones of the plantation field, expected soil data, e.g. soil moisture data, and/or legal restriction data.

    [0114] The offline field data Doff is provided from external repositories. For example, the expected weather data may be based on satellite data or measured weather data used for forecasting the weather. The expected plantation growth data is, for example, provided by a database having stored different plantation growth stages or by plantation growth stage models, which make statements on the expected growth stage of a crop plant, a weed and/or a pathogen dependent on past field condition data. The expected plantation growth data may be provided by plantation models, which are basically digital twins of the respective plantation and estimate the growth stage of the plantation, in particular dependent on former field data. Further, the expected soil moisture data may, for example, be determined dependent on past, present and expected weather condition data. The offline field data Doff may also be provided by an external service provider.

    [0115] Dependent on the offline field data Doff, the machine learning unit 110 determines a parametrization 10. Preferably, the machine learning unit 110 knows the planned time of treatment of the plantation. For example, a farmer provides the field manager system 100 with the information that he plans to treat the plantation in a certain field the next day. The parametrization 10 preferably is represented as a configuration file that is provided to the parametrization interface 140 of the field manager system 100. Ideally, the parametrization 10 is determined by the machine learning unit 110 on the same day the treatment device 200 uses the parametrization 10. Here the machine learning unit 110 may include trained machine learning algorithm(s), wherein the output of the machine learning algorithm(s) may be used for the parametrization. The determination of the parametrization may also be conducted without involvement of any machine learning algorithm(s). Via the parametrization interface 140, the parametrization 10 is provided to the treatment device 200, in particular the parametrization interface 240 of the treatment device 200. For example, the parametrization 10 in the form of a configuration file is transferred to and stored in a memory of the treatment device 200.
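    Purely for illustration, such a parametrization 10 in the form of a configuration file can be sketched as follows. All field names and threshold values in this sketch are assumptions chosen to match the running example, not definitions taken from the application:

```python
import json

# Hypothetical sketch: derive a parametrization (10) from offline field
# data (Doff) and serialize it as a configuration file for the treatment
# device. Keys and values are illustrative assumptions.
def build_parametrization(offline_field_data):
    return {
        "weed_biomass_threshold_pct": 4.0,   # minimum weed coverage to treat
        "expected_soil_moisture": offline_field_data["expected_soil_moisture"],
        "soil_moisture_threshold": 0.30,     # treat only above this moisture
        "treatment_product_compositions": ["AI1", "AI2"],
    }

doff = {"expected_soil_moisture": 0.42}      # e.g. after several rainy days
config = build_parametrization(doff)
config_file = json.dumps(config)             # transferred to the device memory
```

    In this sketch the configuration file is the only artifact that crosses the parametrization interfaces 140/240, which is consistent with the device-side decision making described below.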

    [0116] Additionally, the machine learning unit determines at least one treatment product composition 40 expected to be used for treating the plantation in the field 300. The determination is made in view of the whole plantation field 300 or at least the part of the plantation field 300 that is planned to be treated. The at least one product composition 40 relates to different herbicides, fungicides and/or insecticides as well as mixing solutions like water or nutrient solutions like nitrogen solutions for mixing with the treatment product. For example, the machine learning unit determines a first active ingredient AI1 and a second active ingredient AI2, both of which are different herbicides. The treatment product composition 40 preferably is represented as part of the parametrization in a configuration file that is provided to the parametrization interface 140 of the field manager system 100. Ideally, the treatment product composition 40 is determined by the machine learning unit 110 on the same day the treatment device 200 uses the treatment product composition 40. Via the parametrization interface 140, the treatment product composition 40 is provided to the treatment device 200, in particular the parametrization interface 240 of the treatment device 200. For example, the treatment product composition 40 in the form of a configuration file is uploaded to a memory of the treatment device 200.

    [0117] When the parametrization 10 including the treatment product composition 40 is received by the treatment device 200, in particular the treatment control unit 210, the treatment of the plantation in the plantation field 300 can begin. Ideally, the user, in particular a farmer, is additionally provided with a tank recipe by the field manager system 100. The tank recipe is determined dependent on the parametrization 10 including the determined treatment product compositions 40. Thus, the farmer knows approximately how much treatment product of which treatment product composition 40 is needed to treat the plantation in the plantation field 300.

    [0118] The treatment device 200 moves around the plantation field 300 and detects and recognizes objects 30, in particular crop plants, weeds, pathogens and/or insects on the plantation field 300.

    [0119] Therefore, the image capture device 220 constantly takes images 20 of the plantation field 300. The images 20 are provided to the image recognition unit 230, which runs an image analysis on the image 20 and detects and/or recognizes objects 30 on the image 20. The objects 30 to detect are preferably crops, weeds, pathogens and/or insects. Recognizing objects includes recognizing a plantation, preferably a type of plantation and/or a plantation size, an insect, preferably a type of insect and/or an insect size, and/or a pathogen, preferably a type of pathogen and/or a pathogen size. For example, the difference between Amaranthus retroflexus and Digitaria sanguinalis, or between a bee and a locust, is recognized. The objects 30 are provided to the treatment control unit 210.

    [0120] The treatment control unit 210 was provided with the parametrization 10, including the treatment product composition(s), the first active ingredient AI1 and the second active ingredient AI2, in the form of the configuration file. The parametrization 10 can be illustrated as a decision tree, wherein, based on input data, a treatment of a plantation is decided over different layers of decisions and optionally the dose and composition of the treatment product is decided. For example, in a first step, it is checked if the biomass of the detected weed exceeds a predetermined threshold set by the parametrization 10. The biomass of the weed generally relates to the degree of coverage of the weed in the taken image 20. For example, if the biomass of the weed is below 4%, it is decided that the weed is not treated at all. If the biomass of the weed is above 4%, further decisions are made. For example, in a second step, if the biomass of the weed is above 4%, it is decided dependent on the moisture of the soil whether the weed is treated. If the moisture of the soil exceeds a predetermined threshold, it is decided to treat the weed; otherwise it is decided not to treat the weed. This is because the herbicides used to treat the weed may be more effective when the weed is in a growth phase, which is triggered by a high soil moisture. The parametrization 10 already includes information about the expected soil moisture. Since it has been raining in the past days, the expected soil moisture is above the predetermined threshold and it will be decided to treat the weed. However, the treatment control unit 210 is also provided with online field data Don, in this case from a soil moisture sensor, providing the treatment control unit 210 with additional data. The decision tree of the configuration file will therefore be evaluated based on the online field data Don.
In an exemplary embodiment, the online field data Don comprises the information that the soil moisture is below the predetermined threshold. Thus, it is decided not to treat the weed.
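    The two-layer decision just described can be sketched as a small function. The 4% biomass threshold, the soil moisture check, and the preference for measured online field data Don over the expected offline value mirror the running example; the function and field names themselves are illustrative assumptions:

```python
# Sketch of the decision tree: step 1 checks the weed biomass threshold
# from the parametrization (10); step 2 checks soil moisture, preferring
# a current sensor value (Don) over the expected offline value (Doff).
def decide_treatment(weed_biomass_pct, parametrization, online_field_data=None):
    # Step 1: weed coverage in the image too low -> no treatment at all.
    if weed_biomass_pct < parametrization["weed_biomass_threshold_pct"]:
        return False
    # Step 2: use measured soil moisture if available, else the expected value.
    if online_field_data and "soil_moisture" in online_field_data:
        moisture = online_field_data["soil_moisture"]
    else:
        moisture = parametrization["expected_soil_moisture"]
    return moisture >= parametrization["soil_moisture_threshold"]

params = {"weed_biomass_threshold_pct": 4.0,
          "expected_soil_moisture": 0.42,   # high after recent rain
          "soil_moisture_threshold": 0.30}

# Expected moisture would allow treatment, but the sensor reports dry soil,
# so the online field data overrides the offline expectation: do not treat.
decision = decide_treatment(5.0, params, {"soil_moisture": 0.20})
```

    This illustrates why the same image can lead to different control signals depending on whether current online field data is available at the time of treatment.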

    [0121] The treatment control unit 210 generates a treatment control signal S based on the parametrization 10, the recognized objects and/or the online field data Don. The treatment control signal S therefore contains information on whether the recognized object 30 should be treated or not. The treatment control unit 210 then provides the treatment control signal S to the treatment arrangement 270, which treats the plantation based on the control signal S. The treatment arrangement 270 comprises in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, insecticide and/or fungicide with high precision.

    [0122] Thus, a parametrization 10 is provided dependent on offline field data Doff relating to an expected field condition. Based on the parametrization 10, the treatment device 200 can decide which plantation should be treated based only on the objects situationally recognized in the field. Thus, the efficiency of the treatment and/or the efficacy of the treatment product can be improved. In order to further improve the efficiency of the treatment and/or the efficacy of the treatment product, online field data Don can be used to include current measurable conditions of the plantation field.

    [0123] The provided treatment system 400 is additionally capable of learning. The machine learning unit 110 determines the parametrization 10 dependent on a given heuristic. After the plantation treatment based on the provided parametrization 10, it is possible to validate the efficiency of the treatment and the efficacy of the treatment product. For example, the farmer can provide the field manager system 100 with field data of a part of the plantation field that has been treated before based on the parametrization 10. This information is referred to as validation data V. The validation data V is provided to the field manager system 100 via the validation data interface 160, which provides the validation data V to the machine learning unit 110. The machine learning unit 110 then adjusts the parametrization 10, or the heuristic used to determine the parametrization 10, according to the validation data V. For example, if the validation data V indicates that the weed that has been treated based on the parametrization 10 was not killed, the adjusted parametrization 10 lowers the threshold to treat the plantation in one of the branches of the underlying decision tree.
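    The validation feedback loop can be sketched as follows; the adjustment rule and step size are illustrative assumptions standing in for whatever heuristic or learned update the machine learning unit 110 actually applies:

```python
# Sketch of the feedback step: if validation data (V) reports that treated
# weeds survived, lower the biomass threshold so that treatment triggers
# earlier on the next pass. The 1.0 percentage-point step is an assumption.
def adjust_parametrization(parametrization, validation_data):
    adjusted = dict(parametrization)          # keep the original unchanged
    if validation_data.get("treated_weed_survived"):
        adjusted["weed_biomass_threshold_pct"] = max(
            0.0, adjusted["weed_biomass_threshold_pct"] - 1.0)
    return adjusted

params = {"weed_biomass_threshold_pct": 4.0}
validation = {"treated_weed_survived": True}  # farmer's field observation
new_params = adjust_parametrization(params, validation)
```

    Returning a copy rather than mutating in place keeps the previously deployed parametrization available for comparison against the adjusted one.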

    [0124] As an alternative to the parametrization 10 and/or the treatment product composition 40 in the form of a configuration file provided by an external field manager system 100 to a treatment device 200, the functionality of the field manager system 100 can also be embedded into the treatment device 200. For example, a treatment device with relatively high computational power is capable of integrating the field manager system 100 within the treatment device 200. Alternatively, the whole described functionality of the field manager system 100 and the functionality up to the determination of the control signal S by the treatment device 200 can be calculated externally of the treatment device 200, preferably via a cloud service. The treatment device 200 is then only a “dumb” device treating the plantation dependent on a provided control signal S.

    [0125] FIG. 2 shows a flow diagram of a plantation treatment method.

    [0126] In step S10, a parametrization 10 for controlling a treatment device 200 is received by the treatment device 200 from a field manager system 100, wherein the parametrization 10 is dependent on offline field data Doff relating to expected conditions on the plantation field 300, and at least one treatment product composition 40 expected to be used for treating the plantation dependent on the parametrization 10 is received. In step S20, an image 20 of a plantation of a plantation field 300 is taken. In step S30, objects 30 on the taken image 20 are recognized. In step S40, at least one of the at least one treatment product composition 40 is chosen for treating the plantation dependent on the determined parametrization 10 and the recognized objects 30. In step S50, a control signal S for controlling a treatment arrangement 270 of the treatment device 200 is determined based on the determined parametrization 10, the recognized objects 30 and the chosen treatment product composition 40.
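    The steps S10 to S50 can be sketched as a minimal pipeline; every function passed in below is a hypothetical stand-in for the corresponding component described above, wired together only to show the data flow:

```python
# Sketch of the flow of FIG. 2: each stage consumes the previous stage's
# output, ending in the control signal (S) for the treatment arrangement.
def treatment_pipeline(receive_parametrization, take_image, recognize_objects,
                       choose_composition, determine_control_signal):
    parametrization, compositions = receive_parametrization()             # S10
    image = take_image()                                                  # S20
    objects = recognize_objects(image)                                    # S30
    composition = choose_composition(parametrization, compositions, objects)  # S40
    return determine_control_signal(parametrization, objects, composition)    # S50

# Stand-in stages, purely illustrative:
signal = treatment_pipeline(
    receive_parametrization=lambda: ({"threshold": 4.0}, ["AI1", "AI2"]),
    take_image=lambda: "image-20",
    recognize_objects=lambda img: ["weed-421"],
    choose_composition=lambda p, comps, objs: comps[0] if objs else None,
    determine_control_signal=lambda p, objs, comp: {"treat": bool(objs),
                                                    "product": comp},
)
```

    The point of the sketch is the ordering constraint: the composition choice (S40) needs both the parametrization from S10 and the recognized objects from S30 before the control signal (S50) can be formed.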

    [0127] FIG. 3 shows a zone map 33 of a plantation field 300. The zone map 33 divides the plantation field 300 into different zones ZB, ZC depending on a type of zone map 33. In this case, the zone map 33 divides the plantation field 300 into a center zone ZC and a border zone ZB. The border zone ZB extends around the edges of the plantation field 300. The border zone ZB is easily accessible to unauthorized persons and is therefore subject to stricter legal restrictions than the center zone ZC. Based on the zone map 33, zone information is determined, indicating the special legal restrictions of the different zones ZB, ZC.

    [0128] FIG. 4 shows a treatment device 200 in the form of an unmanned aerial vehicle (UAV) flying over a plantation field 300 containing a crop 410. Between the crop 410 there are also a number of weeds 421, 422. The weeds 421, 422 are particularly virulent, produce numerous seeds and can significantly affect the crop yield. These weeds 421, 422 should not be tolerated in the plantation field 300 containing this crop 410.

    [0129] The UAV 200 has an image capture device 220 comprising one or a plurality of cameras, and as it flies over the plantation field 300 imagery is acquired. The UAV 200 also has a GPS and inertial navigation system, which enables both the position of the UAV 200 and the orientation of the camera 220 to be determined. From this information, the footprint of an image on the ground can be determined, such that particular parts in that image, such as the example of the type of crop, weed, insect and/or pathogen, can be located with respect to absolute geospatial coordinates. The image data acquired by the image capture device 220 is transferred to the image recognition unit 230.
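    The footprint computation mentioned here can be sketched under strong simplifying assumptions (flat ground, camera pointing straight down); the pinhole geometry and all parameter names are illustrative, not taken from the application:

```python
import math

# Sketch: ground footprint of a nadir-pointing camera from UAV altitude and
# horizontal field of view, and mapping of a pixel column to a metric offset
# from the footprint center. Flat-ground, nadir-camera assumptions.
def footprint_width_m(altitude_m, horizontal_fov_deg):
    return 2.0 * altitude_m * math.tan(math.radians(horizontal_fov_deg / 2.0))

def pixel_to_offset_m(px, image_width_px, altitude_m, horizontal_fov_deg):
    width = footprint_width_m(altitude_m, horizontal_fov_deg)
    return (px / image_width_px - 0.5) * width   # 0 at image center

width = footprint_width_m(10.0, 90.0)            # 10 m altitude, 90° FOV
offset = pixel_to_offset_m(960, 1280, 10.0, 90.0)
```

    Combining such an offset with the GPS position and heading of the UAV 200 would yield the absolute geospatial coordinates of a detected object; a real system would additionally account for camera tilt and terrain.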

    [0130] The image acquired by the image capture device 220 is at a resolution that enables one type of crop to be differentiated from another type of crop, and at a resolution that enables one type of weed to be differentiated from another type of weed, and at a resolution that enables not only insects to be detected but enables one type of insect to be differentiated from another type of insect, and at a resolution that enables one type of pathogen to be differentiated from another type of pathogen.

    [0131] The image recognition unit 230 may be external to the UAV 200, but the UAV 200 itself may have the necessary processing power to detect and identify crops, weeds, insects and/or pathogens. The image recognition unit 230 processes the images, using a machine learning algorithm, for example based on an artificial neural network that has been trained on numerous image examples of different types of crops, weeds, insects and/or pathogens, to determine which object is present and also to determine the type of object.

    [0132] The UAV also has a treatment arrangement 270, in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, insecticide and/or fungicide with high precision.

    [0133] In order to treat the weeds 421, 422, the UAV 200 is able to use two different treatment product compositions, a first active ingredient AI1 and a second active ingredient AI2. For example, the weeds 421, 422 are Amaranthus retroflexus and Digitaria sanguinalis, which are expected to be treated on the field. Both weeds are especially well treated with the first active ingredient AI1. The first active ingredient AI1 is cheaper and more efficient than the second active ingredient AI2 but is also considered more ecologically harmful. The field manager system 100 provides the farmer with a tank recipe. In this case, the field 300 comprises a relatively big center zone ZC and a relatively small border zone ZB, as shown in FIG. 3. In view of the legal restrictions, the first active ingredient AI1 is more legally restricted than the second active ingredient AI2. In this case, this means that in the border zone ZB, the first active ingredient AI1 is legally not allowed to be used. Thus, the provided tank recipe indicates that the first active ingredient AI1, which can be used in the relatively big center zone ZC, is needed in a larger amount than the second active ingredient AI2, which is allowed in the relatively small border zone ZB. The farmer then can equip the treatment device with the respective treatment products. The first active ingredient AI1 is stored in the first active ingredient tank 271 and the second active ingredient AI2 is stored in the second active ingredient tank 272. The treatment arrangement 270 is able to treat the plantation in the plantation field from the first active ingredient tank 271 and/or the second active ingredient tank 272.
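    The zone-dependent product choice of this example can be sketched as follows. The preference for AI1 (cheaper, more effective) with a legal fallback to AI2 in the border zone ZB follows the example; the restriction table and function names are illustrative assumptions:

```python
# Sketch: per-zone legal restrictions on active ingredients. AI1 is barred
# from the border zone ZB, so AI2 is used there instead; elsewhere the
# preferred ingredient AI1 is chosen.
RESTRICTED_IN_ZONE = {"ZB": {"AI1"}}   # border zone forbids AI1

def choose_active_ingredient(zone, preferred="AI1", fallback="AI2"):
    if preferred in RESTRICTED_IN_ZONE.get(zone, set()):
        return fallback
    return preferred

# First weed 421 lies in the center zone ZC, second weed 422 in the border
# zone ZB, matching the example of FIG. 5.
product_421 = choose_active_ingredient("ZC")
product_422 = choose_active_ingredient("ZB")
```

    Without such a per-object choice, the whole field would have to be treated with the legally unrestricted but less efficient AI2, which is exactly the fallback described in the next paragraph.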

    [0134] As shown in FIG. 5, the image capture device 220 takes an image 20 of the field 300. The image recognition analysis detects four objects 30 and identifies two crops 410 (triangle), a first unwanted weed 421 (circle) and a second unwanted weed 422 (circle). Therefore, the UAV 200 is controlled to treat the unwanted weeds 421, 422. However, the first weed 421 is arranged in the center zone ZC of the plantation field and the second weed 422 is arranged in the border zone ZB of the plantation field. Based on the online field data Don, the taken image 20 and the treatment product composition, it is determined to treat the weeds 421, 422 with the cheaper and more efficient first active ingredient AI1. However, the first active ingredient AI1 is not allowed to be used in the border zone ZB. Therefore, the second weed 422 in the border zone ZB is determined to be treated with the second active ingredient AI2. Without the determination of different treatment products AI1, AI2, the UAV 200 would only be able to treat the whole plantation field 300 with the second active ingredient AI2 in order not to violate the specific legal restrictions of the border zone ZB.

    [0135] Thus, an improved method for plantation treatment of a plantation field is provided, improving the economic return on investment and reducing the impact on the ecosystem.

    REFERENCE SIGNS

    [0136] 10 parametrization
    [0137] 20 image
    [0138] 30 objects on image
    [0139] 40 treatment product composition
    [0140] 100 field manager system
    [0141] 110 machine learning unit
    [0142] 140 parametrization interface
    [0143] 150 offline field data interface
    [0144] 160 validation data interface
    [0145] 200 treatment device (UAV)
    [0146] 210 treatment control unit
    [0147] 220 image capture device
    [0148] 230 image recognition unit
    [0149] 240 parametrization interface
    [0150] 250 online field data interface
    [0151] 270 treatment arrangement
    [0152] 271 first active ingredient tank
    [0153] 272 second active ingredient tank
    [0154] 300 plantation field
    [0155] 400 treatment system
    [0156] 410 crop
    [0157] 421 first weed
    [0158] 422 second weed
    [0159] S treatment control signal
    [0160] Don online field data
    [0161] Doff offline field data
    [0162] V validation data
    [0163] ZC centre zone
    [0164] ZB border zone
    [0165] AI1 treatment product (first active ingredient)
    [0166] AI2 treatment product (second active ingredient)
    [0167] S10 receiving parametrization and treatment product composition
    [0168] S20 taking image
    [0169] S30 recognizing object
    [0170] S40 choosing treatment product
    [0171] S50 determining control signal