METHOD FOR PLANTATION TREATMENT OF A PLANTATION FIELD

20220167605 · 2022-06-02

    Abstract

    A method for plantation treatment of a plantation field, the method comprising: receiving (S10) a parametrization (10) for controlling a treatment device (200) by the treatment device (200) from a field manager system (100), wherein the parametrization (10) is dependent on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); and determining (S40) a control signal (S) for controlling a treatment arrangement (270) of the treatment device (200) based on the received parametrization (10) and the recognized objects (30).

    Claims

    1. A method for plantation treatment of a plantation field, the method comprising: receiving (S10) a parametrization (10) for controlling a treatment device (200) by the treatment device (200) from a field manager system (100), wherein the parametrization (10) is dependent on offline field data (Doff) relating to expected conditions on the plantation field (300); taking (S20) an image (20) of a plantation of a plantation field (300); recognizing (S30) objects (30) on the taken image (20); and determining (S40) a control signal (S) for controlling a treatment arrangement (270) of the treatment device (200) based on the received parametrization (10) and the recognized objects (30).

    2. The method of claim 1, wherein: taking (S20) an image (20) of a plantation of a plantation field (300), recognizing (S30) objects (30) on the taken image (20), and determining (S40) a control signal (S) for controlling a treatment arrangement (270) are carried out as a real-time process, such that the treatment device (200) is instantaneously controllable based on images of the plantation field taken as the treatment device traverses the field at the time of treatment in a specific location of the field.

    3. The method of claim 1, further comprising: receiving the offline field data (Doff) by the field manager system (100); determining the parametrization (10) of the treatment device (200) dependent on the offline field data (Doff); and providing the determined parametrization (10) to the treatment device (200).

    4. The method of claim 1, wherein the parametrization includes a first layer relating to an on/off decision, a second layer relating to a composition of a treatment product and/or a third layer relating to a dosage of the treatment product.

    5. The method of claim 4, wherein: the parametrization of an on/off decision includes thresholds relating to parameter(s) derived from the taken image and/or the object recognition, and at least one parameter derived from the taken image and/or object recognition relates to an object coverage.

    6. The method of claim 1, wherein the parametrization for controlling the treatment device is at least in part spatially resolved for the plantation field.

    7. The method of claim 1, further comprising: receiving online field data (Don) by the treatment device (200) relating to current conditions on the plantation field (300); and determining the control signal (S) dependent on the received parametrization (10) and the recognized objects (30) and/or the online field data (Don).

    8. The method of claim 7, wherein: the online field data (Don) relates to current weather condition data, current plantation growth data and/or current soil data.

    9. The method of claim 1, further comprising: providing validation data (V) dependent on a performance review of the treatment of the plantation; and adjusting the parametrization (10) dependent on the validation data (V).

    10. The method of claim 8, wherein: the online field data (Don) and the validation data (V) are at least in part spatially resolved for the plantation field.

    11. A field manager system (100) for a treatment device (200) for plantation treatment of a plantation field (300), the field manager system (100) comprising: an offline field data interface (150) being adapted for receiving offline field data (Doff) relating to expected conditions on the plantation field (300); a machine learning unit (110) being adapted for determining the parametrization (10) of the treatment device (200) dependent on the offline field data (Doff); and a parametrization interface (140), being adapted for providing the parametrization (10) to the treatment device (200) according to claim 10.

    12. The field manager system (100) of claim 11, further comprising: a validation data interface (160) being adapted for receiving validation data (V); wherein the machine learning unit (110) is adapted for adjusting the parametrization (10) dependent on the validation data (V).

    13. A treatment device (200) for plantation treatment of a plantation, the treatment device (200) comprising: an image capture device (220) being adapted for taking an image (20) of a plantation; a parametrization interface (240) being adapted for receiving a parametrization (10) from a field manager system (100) according to claim 9; a treatment arrangement (270) being adapted for treating the plantation dependent on the received parametrization (10); an image recognition unit (230) being adapted for recognizing objects (30) on the taken image (20); a treatment control unit (210) being adapted for determining a control signal (S) for controlling a treatment arrangement (270) dependent on the received parametrization (10) and the recognized objects (30); wherein the parametrization interface (240) of the treatment device (200) is connectable to a parametrization interface (140) of the field manager system (100); wherein the treatment device (200) is adapted to activate the treatment arrangement (270) based on the control signal (S) of the treatment control unit (210).

    14. The treatment device of claim 13, further comprising: an online field data interface (240) being adapted for receiving online field data (Don) relating to current conditions on the plantation field (300); wherein the treatment control unit (210) is adapted for determining a control signal (S) for controlling a treatment arrangement (270) dependent on the received parametrization (10) and the recognized objects (30) and/or the online field data (Don).

    15. The treatment device of claim 13, wherein the image capture device (220) comprises one or a plurality of cameras, in particular on a boom of the treatment device (200), wherein the image recognition unit (230) is adapted for recognizing objects using red-green-blue (RGB) data and/or near infrared (NIR) data.

    16. The treatment device of claim 13, wherein the treatment device (200) is designed as a smart sprayer, wherein the treatment arrangement (270) is a nozzle arrangement.

    17. The treatment device of claim 13, wherein the image capture device (220) comprises a plurality of cameras and the treatment arrangement (270) comprises a plurality of nozzle arrangements, each being associated with one of the plurality of cameras, such that images captured by the cameras are associated with the area to be treated by the respective nozzle arrangement.

    18. A treatment system comprising a field manager system according to claim 11.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0084] Exemplary embodiments will be described in the following with reference to the following drawings:

    [0085] FIG. 1 shows a schematic diagram of a plantation treatment system;

    [0086] FIG. 2 shows a flow diagram of a plantation treatment method;

    [0087] FIG. 3 shows a schematic view of a treatment device on a plantation field; and

    [0088] FIG. 4 shows a schematic view of an image with detected objects.

    DETAILED DESCRIPTION OF EMBODIMENTS

    [0089] FIG. 1 shows a plantation treatment system 400 for treating a plantation of a plantation field 300 by at least one treatment device 200 controlled by a field manager system 100.

    [0090] The treatment device 200, preferably a smart sprayer, comprises a treatment control unit 210, an image capture device 220, an image recognition unit 230 and a treatment arrangement 270 as well as a parametrization interface 240 and an online field data interface 250.

    [0091] The image capture device 220 comprises at least one camera, configured to take an image 20 of a plantation field 300. The taken image 20 is provided to the image recognition unit 230 of the treatment device 200.

    [0092] The field manager system 100 comprises a machine learning unit 110. Additionally, the field manager system 100 comprises an offline field data interface 150, a parametrization interface 140 and a validation data interface 160. The field manager system 100 may refer to a data processing element such as a microprocessor, microcontroller, field programmable gate array (FPGA), central processing unit (CPU) or digital signal processor (DSP) capable of receiving field data, e.g. via a universal serial bus (USB), a physical cable, Bluetooth, or another form of data connection. The field manager system 100 may be provided for each treatment device 200. Alternatively, the field manager system may be a central field manager system, e.g. a cloud computing environment or a personal computer (PC), for controlling multiple treatment devices 200 in the field 300.

    [0093] The field manager system 100 is provided with offline field data Doff relating to expected condition data of the plantation field 300. Preferably, the offline field data Doff comprises local yield expectation data, resistance data relating to a likelihood of resistance of the plantation against a treatment product, expected weather condition data, expected plantation growth data, zone information data relating to different zones of the plantation field, expected soil data, e.g. soil moisture data, and/or legal restriction data.

    [0094] The offline field data Doff is provided from external repositories. For example, the expected weather data may be based on satellite data or on measured weather data used for forecasting the weather. The expected plantation growth data is, for example, provided by a database storing different plantation growth stages or by plantation growth stage models, which make statements on the expected growth stage of a crop plant, a weed and/or a pathogen dependent on past field condition data. The expected plantation growth data may also be provided by plantation models, which are essentially digital twins of the respective plantation and estimate the growth stage of the plantation, in particular dependent on former field data. Further, the expected soil moisture data may, for example, be determined dependent on the past, present and expected weather condition data. The offline field data Doff may also be provided by an external service provider.

    [0095] Dependent on the offline field data Doff, the machine learning unit 110 determines a parametrization 10. Preferably, the machine learning unit 110 knows the planned time of treatment of the plantation. For example, a farmer provides the field manager system 100 with the information that he plans to treat the plantation in a certain field the next day. The parametrization 10 is preferably represented as a configuration file that is provided to the parametrization interface 140 of the field manager system 100. Ideally, the parametrization 10 is determined by the machine learning unit 110 on the same day on which the treatment device 200 uses the parametrization 10. Here the machine learning unit 110 may include trained machine learning algorithm(s), the output of which may be used for the parametrization. The determination of the parametrization may also be conducted without involvement of any machine learning algorithm(s). Via the parametrization interface 140, the parametrization 10 is provided to the treatment device 200, in particular to the parametrization interface 240 of the treatment device 200. For example, the parametrization 10 in the form of a configuration file is transferred to and stored in a memory of the treatment device 200.
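
    The patent does not fix a concrete file format for the parametrization 10; the following is a minimal sketch, assuming a JSON configuration file whose three sections mirror the layers of claim 4 (on/off decision, composition, dosage). All field names and values are illustrative assumptions, not part of the disclosure.

```python
import json

# Hypothetical layout of a parametrization (10) as a configuration file.
# Section names and thresholds are assumptions for illustration only.
parametrization = {
    "field_id": "field-300",
    "valid_for": "2022-06-02",            # ideally the day of treatment
    "on_off": {
        # threshold on weed coverage in the taken image, cf. claim 5
        "min_weed_coverage_pct": 4.0,
        # expected soil moisture from offline data, cf. paragraph [0099]
        "min_soil_moisture_pct": 20.0,
    },
    "composition": {"product": "herbicide-A", "mix_ratio": 0.8},
    "dosage": {"liters_per_hectare": 1.5},
}

# Serialize for transfer via the parametrization interface and restore
# on the treatment device side.
config_text = json.dumps(parametrization, indent=2)
restored = json.loads(config_text)
```

    A format like this would let the treatment control unit 210 read the thresholds without any connection to the field manager system 100 at treatment time.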

    [0096] When the parametrization 10 is received by the treatment device 200, in particular the treatment control unit 210, the treatment of plantation in the plantation field 300 can begin.

    [0097] The treatment device 200 moves around the plantation field 300 and detects and recognizes objects 30, in particular crop plants, weeds, pathogens and/or insects on the plantation field 300.

    [0098] To this end, the image capture device 220 constantly takes images 20 of the plantation field 300. The images 20 are provided to the image recognition unit 230, which runs an image analysis on each image 20 and detects and/or recognizes objects 30 on the image 20. The objects 30 to detect are preferably crops, weeds, pathogens and/or insects. Recognizing objects includes recognizing a plantation, preferably a type of plantation and/or a plantation size, an insect, preferably a type of insect and/or an insect size, and/or a pathogen, preferably a type of pathogen and/or a pathogen size. For example, the difference between Amaranthus retroflexus and Digitaria sanguinalis, or between a bee and a locust, is recognized. The objects 30 are provided to the treatment control unit 210.
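
    Paragraph [0099] below relates the "biomass" of a weed to its degree of coverage in the taken image 20. As a minimal sketch, assuming the image recognition unit 230 yields a per-pixel class mask (a simplification of the actual recognition step), the coverage is simply the fraction of weed-labelled pixels:

```python
# Sketch: weed coverage as the fraction of weed pixels in a class mask.
# The mask representation (list of rows of string labels) is an assumption.
def weed_coverage_pct(mask, weed_label="weed"):
    """Return weed coverage as a percentage of all pixels in the mask."""
    total = sum(len(row) for row in mask)
    weed = sum(cell == weed_label for row in mask for cell in row)
    return 100.0 * weed / total if total else 0.0

# Toy 4x5 mask: 3 weed pixels out of 20, i.e. 15% coverage.
mask = [
    ["soil", "soil", "crop", "soil", "soil"],
    ["soil", "weed", "weed", "soil", "soil"],
    ["crop", "soil", "weed", "soil", "soil"],
    ["soil", "soil", "soil", "crop", "soil"],
]
```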

    [0099] The treatment control unit 210 has been provided with the parametrization 10 in the form of the configuration file. The parametrization 10 can be illustrated as a decision tree in which, based on input data, a treatment of a plantation is decided over different layers of decisions, and optionally the dose and composition of the treatment product are decided. For example, in a first step it is checked whether the biomass of the detected weed exceeds a predetermined threshold set by the parametrization 10. The biomass of the weed generally relates to the degree of coverage of the weed in the taken image 20. For example, if the biomass of the weed is below 4%, it is decided that the weed is not treated at all. If the biomass of the weed is above 4%, further decisions are made. For example, in a second step, if the biomass of the weed is above 4%, it is decided dependent on the moisture of the soil whether the weed is treated. If the moisture of the soil exceeds a predetermined threshold, it is decided to treat the weed; otherwise it is decided not to treat the weed. This is because the herbicides used to treat the weed may be more effective when the weed is in a growth phase, which is triggered by high soil moisture. The parametrization 10 already includes information about the expected soil moisture. Since it has been raining during the past days, the expected soil moisture is above the predetermined threshold and it will be decided to treat the weed. However, the treatment control unit 210 is also provided with online field data Don, in this case from a soil moisture sensor, providing the treatment control unit 210 with additional data. The decisions in the tree of the configuration file are therefore made based on the online field data Don. In an exemplary embodiment, the online field data Don comprises the information that the soil moisture is below the predetermined threshold. Thus, it is decided not to treat the weed.

    [0100] The treatment control unit 210 generates a treatment control signal S based on the parametrization 10, the recognized objects and/or the online field data Don. The treatment control signal S therefore contains information on whether the recognized object 30 should be treated or not. The treatment control unit 210 then provides the treatment control signal S to the treatment arrangement 270, which treats the plantation based on the control signal S. The treatment arrangement 270 comprises in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, insecticide and/or fungicide with high precision.

    [0101] Thus, a parametrization 10 is provided dependent on offline field data Doff relating to an expected field condition. Based on the parametrization 10, a treatment device 200 can decide which plantation should be treated based only on the situationally recognized objects in the field. Thus, the efficiency of the treatment and/or the efficacy of the treatment product can be improved. To further improve the efficiency of the treatment and/or the efficacy of the treatment product, online field data Don can be used to include current measurable conditions of the plantation field.

    [0102] The provided treatment system 400 is additionally capable of learning. The machine learning unit 110 determines the parametrization 10 dependent on a given heuristic. After the plantation treatment based on the provided parametrization 10, it is possible to validate the efficiency of the treatment and the efficacy of the treatment product. For example, the farmer can provide the field manager system 100 with field data of a part of the plantation field that has previously been treated based on the parametrization 10. This information is referred to as validation data V. The validation data V is provided to the field manager system 100 via the validation data interface 160, which passes the validation data V to the machine learning unit 110. The machine learning unit 110 then adjusts the parametrization 10, or the heuristic used to determine the parametrization 10, according to the validation data V. For example, if the validation data V indicates that a weed treated based on the parametrization 10 has not been killed, the adjusted parametrization 10 lowers the threshold for treating the plantation in one of the branches of the underlying decision tree.
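
    The threshold adjustment at the end of paragraph [0102] can be sketched as follows. The disclosure does not state how much the threshold is lowered; the 10% step and the field names in the validation data are assumptions for illustration.

```python
# Sketch of the validation feedback of [0102]: if validation data V reports
# that treated weeds survived, lower the on/off coverage threshold so the
# next parametrization treats more aggressively. Step size is an assumption.
def adjust_threshold(threshold, validation, step=0.10, floor=0.5):
    """Return a lowered threshold when treated weeds were not killed."""
    if validation.get("treated_weed_survived", False):
        return max(floor, threshold * (1.0 - step))
    return threshold

lowered = adjust_threshold(4.0, {"treated_weed_survived": True})
unchanged = adjust_threshold(4.0, {"treated_weed_survived": False})
```

    In the full system this update would be applied by the machine learning unit 110 to the heuristic that generates the parametrization, not only to a single threshold.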

    [0103] As an alternative to the parametrization 10 in the form of a configuration file provided by an external field manager system 100 to a treatment device 200, the functionality of the field manager system 100 can also be embedded into the treatment device 200. For example, a treatment device with relatively high computational power is capable of integrating the field manager system 100 within the treatment device 200. Alternatively, the whole described functionality of the field manager system 100, as well as the functionality up to the determination of the control signal S by the treatment device 200, can be calculated externally of the treatment device 200, preferably via a cloud service. The treatment device 200 is then only a “dumb” device treating the plantation dependent on a provided control signal S.

    [0104] FIG. 2 shows a flow diagram of a plantation treatment method. In step S10 a parametrization 10 for controlling a treatment device 200 is received by the treatment device 200 from a field manager system 100, wherein the parametrization 10 is dependent on offline field data Doff relating to expected conditions on the plantation field 300. In step S20 an image 20 of a plantation of a plantation field 300 is taken. In step S30 objects 30 are recognized on the taken image 20. In step S40, a control signal S for controlling a treatment arrangement 270 of the treatment device 200 is determined based on the received parametrization 10 and the recognized objects 30.

    [0105] FIG. 3 shows a treatment device 200 in the form of an unmanned aerial vehicle (UAV) flying over a plantation field 300 containing a crop 410. Between the crop 410 there are also a number of weeds 420. The weed 420 is particularly virulent, produces numerous seeds and can significantly affect the crop yield. This weed 420 should not be tolerated in the plantation field 300 containing this crop 410.

    [0106] The UAV 200 has an image capture device 220 comprising one or a plurality of cameras, and imagery is acquired as it flies over the plantation field 300. The UAV 200 also has a GPS and inertial navigation system, which enables both the position of the UAV 200 and the orientation of the camera 220 to be determined. From this information, the footprint of an image on the ground can be determined, such that particular parts in that image, such as a particular type of crop, weed, insect and/or pathogen, can be located with respect to absolute geospatial coordinates. The image data acquired by the image capture device 220 is transferred to the image recognition unit 230.
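
    The footprint computation mentioned above can be sketched for the simplest case of a nadir-pointing camera with a pinhole lens model: from the UAV altitude and the camera's field of view, the width of the imaged ground strip follows, and a pixel column maps to a lateral offset from the UAV's GPS position. Real systems would additionally use the inertial orientation and a full camera model; the function names and the simplified geometry are assumptions.

```python
import math

# Sketch of the ground-footprint geometry of [0106] for a nadir camera.
def ground_footprint(altitude_m, fov_deg):
    """Width (m) of the imaged ground strip for a nadir-pointing camera."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def pixel_to_offset(px, image_width_px, altitude_m, fov_deg):
    """Lateral ground offset (m) of a pixel column from the image centre."""
    width_m = ground_footprint(altitude_m, fov_deg)
    return (px / (image_width_px - 1) - 0.5) * width_m

# At 10 m altitude with a 60 degree field of view the strip is about 11.5 m
# wide; the centre column of a 101-pixel-wide image lies directly below
# the UAV (offset 0 m).
width = ground_footprint(10.0, 60.0)
center_offset = pixel_to_offset(50, 101, 10.0, 60.0)
```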

    [0107] The image acquired by the image capture device 220 is at a resolution that enables one type of crop to be differentiated from another type of crop, and at a resolution that enables one type of weed to be differentiated from another type of weed, and at a resolution that enables not only insects to be detected but enables one type of insect to be differentiated from another type of insect, and at a resolution that enables one type of pathogen to be differentiated from another type of pathogen.

    [0108] The image recognition unit 230 may be external to the UAV 200, but the UAV 200 itself may have the necessary processing power to detect and identify crops, weeds, insects and/or pathogens. The image recognition unit 230 processes the images, using a machine learning algorithm, for example based on an artificial neural network that has been trained on numerous image examples of different types of crops, weeds, insects and/or pathogens, to determine which object is present and also to determine the type of object.

    [0109] The UAV also has a treatment arrangement 270, in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, insecticide and/or fungicide with high precision.

    [0110] As shown in FIG. 4, the image capture device 220 takes an image 20 of the field 300. The image recognition analysis detects four objects 30 and identifies two crops 410 (triangle) and two unwanted weeds 420 (circle). The UAV 200 is therefore controlled to treat the unwanted weeds 420 based on the parametrization 10, which was determined dependent on the offline field data Doff and therefore allows a more precise treatment of the plantation.

    [0111] Thus, the efficiency of the treatment and/or the efficacy of the treatment product can be improved. An improved method for plantation treatment of a plantation field is thus provided, improving the economic return on investment and reducing the impact on the ecosystem.

    REFERENCE SIGNS

    [0112] 10 parametrization
    [0113] 20 image
    [0114] 30 objects on image
    [0115] 100 field manager system
    [0116] 110 machine learning unit
    [0117] 140 parametrization interface
    [0118] 150 offline field data interface
    [0119] 160 validation data interface
    [0120] 200 treatment device (UAV)
    [0121] 210 treatment control unit
    [0122] 220 image capture device
    [0123] 230 image recognition unit
    [0124] 240 parametrization interface
    [0125] 250 online field data interface
    [0126] 270 treatment arrangement
    [0127] 300 plantation field
    [0128] 400 treatment system
    [0129] 410 crop
    [0130] 420 weed
    [0131] S treatment control signal
    [0132] Don online field data
    [0133] Doff offline field data
    [0134] V validation data
    [0135] S10 receiving parametrization
    [0136] S20 taking image
    [0137] S30 recognizing object
    [0138] S40 determining control signal