METHOD FOR PLANTATION TREATMENT BASED ON IMAGE RECOGNITION
20220254155 · 2022-08-11
Inventors
- Ole JANSSEN (Köln, DE)
- Matthias TEMPEL (Leverkusen, DE)
- Bjoern KIEPE (Köln, DE)
- Mirwaes WAHABZADA (Langenfeld, DE)
CPC classification
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
G05B19/4155
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
International classification
A01M21/04
HUMAN NECESSITIES
A01M7/00
HUMAN NECESSITIES
G05B19/4155
PHYSICS
Abstract
Method for plantation treatment of a plantation field, the method comprising taking an image of a plantation of a plantation field; recognizing items on the taken image by running a first image recognition analysis of a first complexity on the taken image based on a stored parametrization of a machine learning algorithm; identifying an unsatisfying image analysis result; determining ambient data corresponding to the taken image; recognizing items on the taken image by running a second image recognition analysis of a second complexity on the image based on the ambient data on an external device, wherein the second complexity is higher than the first complexity; determining an improved parametrization based on the second image recognition analysis for the machine learning algorithm for improving the first image recognition analysis; and controlling a treatment arrangement of a treatment device based on the first image recognition analysis.
Claims
1. A method for plantation treatment of a plantation field, the method comprising: taking (S10) an image (10) of a plantation of a plantation field (300); recognizing (S20) items (20) on the taken image (10) by running a first image recognition analysis of a first complexity on the taken image (10) based on a stored parametrization (P) of a machine learning algorithm; identifying (S30) an unsatisfying image analysis result (R); determining (S40) ambient data (21) corresponding to the taken image (10); recognizing (S50) items (20) on the taken image (10) by running a second image recognition analysis of a second complexity on the image (10) based on the ambient data (21) on an external device (400), wherein the second complexity is higher than the first complexity; determining (S60) an improved parametrization (PI) based on the second image recognition analysis for the machine learning algorithm for improving the first image recognition analysis; and controlling (S70) a treatment arrangement (70) of a treatment device (200) based on the first image recognition analysis.
2. The method according to claim 1, wherein the ambient data (21) comprises a type of a field crop and/or a growth stage of the field crop and/or illumination characteristics and/or weather conditions.
3. The method according to claim 1, wherein the unsatisfying image analysis result (R) is indicated by a low confidence of the machine learning algorithm.
4. The method according to claim 1, further comprising: buffering the image (10) and/or the ambient data (21) before running the second image recognition analysis.
5. The method according to claim 4, wherein the buffered image (10) and/or buffered ambient data (21) are transmitted to the external device (400), preferably an internet server, based on the availability of a transmission technology, in particular a cell phone coverage, an idle service and/or a WLAN connection.
6. The method according to claim 1, wherein the second image recognition analysis is run based on additional data sources, preferably smart phone apps and/or drone imagery; wherein preferably the additional data sources provide geographical information and/or expected phenotypical differences between regions.
7. The method according to claim 1, wherein the second image recognition analysis is based on more layers and/or more nodes and/or different, more complex algorithms for background segmentation than the first image recognition analysis.
8. A controlling device (100) for a treatment device (200) for plantation treatment of a plantation of a plantation field, the controlling device comprising: an image interface (110) being adapted for receiving an image (10) of a plantation of a plantation field; a treatment control interface (130); an image recognition unit (120) being adapted for recognizing items (20) on the taken image (10) by running a first image recognition analysis of a first complexity on the image based on a stored parametrization (P) of a machine learning algorithm; the image recognition unit (120) being adapted for identifying an unsatisfying image analysis result (R); the image recognition unit (120) being adapted for determining ambient data (21) corresponding to the taken image (10); a communication interface (150) being adapted for transmitting the taken image (10) and the determined ambient data (21) to an external device (400) being adapted for recognizing items (20) on the taken image (10) by running a second image recognition analysis of a second complexity based on the ambient data (21), wherein the second complexity is higher than the first complexity; the communication interface (150) being adapted for receiving an improved parametrization (PI) for the machine learning algorithm for improving the first image recognition analysis from the external device (400); a controlling unit (170) being adapted for generating a treatment controlling signal (S) for a treatment arrangement (70) of a treatment device (200) based on the improved first image recognition analysis; and the controlling unit (170) being adapted for outputting the treatment controlling signal (S) to the treatment control interface (130).
9. The controlling device according to claim 8, further comprising: a machine learning unit (160), being adapted for indicating an unsatisfying image analysis result (R) by a low confidence of the machine learning algorithm.
10. The controlling device according to claim 8, further comprising: a buffer interface (180), being configured for transmitting to and receiving from a buffer (80) the image (10) and the ambient data (21) before they are transmitted to the external device (400).
11. The controlling device according to claim 10, wherein the communication interface (150) is adapted for transmitting the buffered image (10) and buffered ambient data (21) to the external device (400) based on the availability of a transmission technology, in particular a cell phone coverage, an idle service and/or a WLAN connection.
12. The controlling device according to claim 8, wherein the second image recognition analysis is run based on additional data sources, preferably smart phone apps and/or drone imagery, wherein preferably the additional data sources provide geographical information and/or expected phenotypical differences between regions.
13. A treatment device (200) for plantation treatment of a plantation of a plantation field, the treatment device comprising: an image capture device (220) being adapted for taking an image (10) of a plantation field; a treatment arrangement (270); an image interface (210) being adapted for providing an image (10) captured by the image capture device (220) to a controlling device (100); and a treatment control interface (230) being adapted for receiving a treatment controlling signal (S) from the controlling device (100); wherein the image interface (210) of the treatment device (200) is connectable to an image interface (110) of the controlling device (100); wherein the treatment control interface (230) of the treatment device (200) is connectable to a treatment control interface (130) of the controlling device (100); wherein the treatment device (200) is adapted to activate the treatment arrangement (270) based on the treatment controlling signal (S) received from the controlling device (100) via the treatment control interface (230) of the treatment device (200).
14. The treatment device according to claim 13, wherein the image capture device (220) comprises one or a plurality of cameras, in particular on a boom of the treatment device (200), wherein the image recognition unit (120) is adapted for recognizing insects, plantation and/or pathogens using red-green-blue (RGB) data and/or near-infrared (NIR) data.
15. The treatment device according to claim 13, further comprising a controlling device (100).
16. The treatment device according to claim 13, wherein the treatment device (200) is designed as a smart sprayer, wherein the treatment arrangement (270) is a nozzle arrangement.
17. A method for plantation treatment of a plantation field, the method comprising: taking an image of a plantation of a plantation field; recognizing items on the taken image by running a first image recognition analysis of a first complexity on the taken image based on an initially stored parametrization of a machine learning algorithm; identifying an unsatisfying image analysis result; determining ambient data corresponding to the taken image; recognizing items on the taken image by running a second image recognition analysis of a second complexity on the image based on the ambient data and the stored parametrization of the first image recognition on an external device, wherein the second complexity is higher than the first complexity; determining an improved parametrization based on the second image recognition analysis for the machine learning algorithm and updating the stored parametrization of the first image recognition with the improved parametrization of the second image recognition for improving the first image recognition analysis; and controlling a treatment arrangement of a treatment device based on the first image recognition analysis of the taken image using the updated, improved parametrization.
18. A method for plantation treatment of a plantation field, the method comprising: (step 1) taking an image of a plantation of a plantation field; (step 2) recognizing items on the taken image by running a first image recognition analysis of a first complexity on the taken image based on a stored parametrization of a machine learning algorithm; (step 3) identifying an unsatisfying image analysis result; (step 4) determining ambient data corresponding to the taken image; (step 5) recognizing items on the taken image by running a second image recognition analysis of a second complexity on the image based on the ambient data on an external device, wherein the second complexity is higher than the first complexity; (step 6) determining an improved parametrization based on the second image recognition analysis for the machine learning algorithm for improving the first image recognition analysis; (step 7) controlling a treatment arrangement of a treatment device based on the first image recognition analysis with the stored (initial) parametrization until the improved parametrization is determined; and (step 8) controlling a treatment arrangement of a treatment device based on the first image recognition analysis with the improved parametrization when the improved parametrization is determined.
19. The method according to claim 17, wherein controlling a treatment arrangement of a treatment device based on the first image recognition analysis with the improved parametrization is conducted a certain time period after controlling a treatment arrangement of a treatment device based on the first image recognition analysis with the stored parametrization has started.
20. The method of claim 19, wherein the time period is selected from the group consisting of 0 to 100 seconds, 0 to 100 minutes, 0 to 100 hours, 0 to 10 days, 0 to 10 weeks, and 0 to 12 months.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0105] Exemplary embodiments will be described in the following with reference to the drawings.
DETAILED DESCRIPTION OF EMBODIMENTS
[0112] Step S10 comprises taking an image 10 of a plantation of a plantation field 300.
[0113] In step S20, a first image recognition analysis of a first complexity is run on the taken image 10, based on a stored parametrization P of a machine learning algorithm. The machine learning algorithm preferably is an artificial neural network. Thus, items 20 on the taken image 10 are recognized, i.e. at least detected and ideally identified.
[0114] In step S30, it is checked whether the first image recognition analysis provides a satisfying image analysis result R. If an item, which corresponds to an object like a crop, weed, insect or pathogen, is detected but cannot be identified, the image analysis result R is unsatisfying. If the image analysis result R is satisfying, the method jumps to step S70, the first image recognition analysis is complete, and a treatment arrangement 270 of a treatment device 200 is controlled based on the first image recognition analysis. If the image analysis result R is unsatisfying, the method jumps to step S40. However, the treatment arrangement 270 of the treatment device 200 is still controlled based on the first image recognition analysis with respect to the detected and identified items 20; the objects that were not identified are not treated. Alternatively, the treatment arrangement 270 of the treatment device 200 is provided with a supplied map, indicating how the field has been treated in the past, and treats the plantation in the field based on the supplied map. Alternatively, no plantation is treated at all if the image analysis result R is unsatisfying; this is the safest variant in view of potential environmental and/or economic risks.
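The decision and fallback logic of step S30 can be illustrated with a minimal Python sketch. This is not part of the patent: the Item class, the function names and the confidence threshold are hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

CONFIDENCE_THRESHOLD = 0.8  # hypothetical cut-off for a "satisfying" identification

@dataclass
class Item:
    label: Optional[str]            # None: detected but not identified
    confidence: float
    position: Tuple[float, float]   # (x, y) in image coordinates

def is_satisfying(items: List[Item]) -> bool:
    # Step S30: the result R is unsatisfying as soon as any detected
    # item could not be identified with sufficient confidence.
    return all(i.label is not None and i.confidence >= CONFIDENCE_THRESHOLD
               for i in items)

def select_for_treatment(items: List[Item], fallback: str = "treat_identified"):
    # The variants described above: treat only the identified items,
    # follow a supplied historical map, or treat nothing at all.
    if fallback == "treat_identified":
        return [i for i in items if i.label is not None]
    if fallback == "use_map":
        return []   # treatment would follow the supplied map (not modeled here)
    return []       # safest variant: no treatment at all
```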
[0115] In step S40, in addition to the image 10, ambient data 21 corresponding to the taken image 10 is determined. The ambient data 21 preferably comprises the type of crop, the growth stage of the plantation and/or illumination characteristics. All of the information constituting the ambient data 21 is a snapshot of the moment at which the image 10 was taken. The method jumps to step S50.
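For illustration only, the ambient data 21 can be modeled as a simple immutable record; the field names and example values below are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class AmbientData:
    # Step S40: snapshot of the conditions at the moment the image was taken.
    crop_type: str          # e.g. "sugar beet"
    growth_stage: str       # e.g. a BBCH code such as "BBCH 14"
    illumination_lux: float # measured illumination characteristics
    weather: str            # e.g. "overcast"
    taken_at: datetime

ambient = AmbientData("sugar beet", "BBCH 14", 12000.0, "overcast", datetime.now())
```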
[0116] In step S50, a second image recognition analysis of a second complexity is run on the taken image 10 and the ambient data 21. The second complexity of the second image recognition analysis is higher than the first complexity of the first image recognition analysis. Normally, the capabilities of a device running the first image recognition analysis on a plantation field 300 are limited. Therefore, the second image recognition analysis is run on an external device 400. The second image recognition analysis is used to recognize and identify items 20 on the image 10. The second image recognition analysis is thereby run by a further machine learning algorithm. The method jumps to step S60.
[0117] In step S60, the further machine learning algorithm determines an improved parametrization PI based on the second image recognition analysis. The improved parametrization PI is then used to improve the first image recognition analysis and to train, in an improved way, the machine learning algorithm providing the parametrization P to the first image recognition analysis. The method then jumps to step S20. Ideally, the first image recognition analysis has been improved in such a way that it produces a satisfying image analysis result the next time such a situation occurs.
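The round trip of steps S50 and S60 (second analysis on the external device 400, improved parametrization PI replacing the stored parametrization P) might look like the following sketch; all class and method names are hypothetical stand-ins, not the patent's implementation.

```python
class LocalModel:
    # Stand-in for the machine learning algorithm behind the first analysis.
    def __init__(self, parameters: dict):
        self.parameters = parameters      # stored parametrization P

    def load_parameters(self, parameters: dict) -> None:
        self.parameters = parameters      # P := PI (step S60)

class ExternalDevice:
    # Stand-in for the external device 400 running the second analysis.
    def second_analysis(self, image, ambient) -> dict:
        # In reality: a deeper network re-analyses the image together with
        # the ambient data and returns retrained weights (PI).
        return {"weights": "retrained on image + ambient data"}

def refine(image, ambient, external: ExternalDevice, local: LocalModel) -> None:
    improved_pi = external.second_analysis(image, ambient)  # step S50/S60
    local.load_parameters(improved_pi)  # improves the first analysis (back to S20)
```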
[0119] A treatment device 200, preferably a smart sprayer, comprises an image capture device 220 and a treatment arrangement 270 as well as an image interface 210 and a treatment control interface 230.
[0120] The image capture device 220 comprises at least one camera, configured to take an image 10 of a plantation field 300. The taken image 10 is provided to an image interface 210 of the treatment device 200. The image interface 210 transmits the image 10 to a controlling device 100, in particular an image interface 110 of the controlling device 100.
[0121] The controlling device 100 comprises an image recognition unit 120, a machine learning unit 160 and a controlling unit 170. Additionally, the controlling device 100 comprises an image interface 110, a treatment control interface 130, a communication interface 150 and a buffer interface 180. The controlling device 100 may refer to a data processing element such as a microprocessor, microcontroller, field programmable gate array (FPGA), central processing unit (CPU) or digital signal processor (DSP), capable of receiving field data, e.g. via a universal serial bus (USB), a physical cable, Bluetooth, or another form of data connection. The controlling device 100 may be provided for each treatment device 200. Alternatively, the controlling device may be a central controlling device, e.g. a personal computer (PC), for controlling multiple treatment devices 200 in the field 300.
[0122] The image interface 110 receives the image 10 from the image interface 210 of the treatment device 200 and provides the image 10 to the image recognition unit 120. The image recognition unit 120 runs a first image recognition analysis based on parameters P, which are provided by the machine learning unit 160. Here, the machine learning unit 160 may include trained machine learning algorithm(s), whose output may be used for the image recognition. Based on the first image recognition analysis, the image recognition unit 120 determines image analysis results R. The image analysis results R, for example the recognized and identified items 20 of the analyzed image 10, are provided to the controlling unit 170. The controlling unit 170 determines a treatment controlling signal S based on the image analysis results R. For example, when the image analysis results R contain an identified weed that is harmful to the crop and has to be treated, in particular destroyed, the controlling unit 170 determines a treatment controlling signal S that instructs the treatment arrangement 270 to treat the identified weed. In this case, the treatment arrangement 270, comprising a nozzle arrangement of several independent nozzles, is instructed to aim for the identified weed and sprays the weed with a herbicide through the aimed nozzle. This, however, can only be done for items 20 which are both detected and identified by the image recognition unit 120. If an item 20 is detected, in other words, the image recognition unit 120 is certain that an object has been found, but the item 20 cannot be identified, the controlling unit 170 cannot determine a fitting treatment controlling signal S for this object, since it is not clear whether it is a crop or a weed, or which type of insect or pathogen was detected. The image recognition unit 120 thus determines that the image analysis results R are unsatisfying.
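The mapping from image analysis results R to a treatment controlling signal S, including the unsatisfying case, can be illustrated as follows, reusing the hypothetical Item class from the earlier sketch; the label value "weed" and the command tuples are illustrative inventions.

```python
def treatment_signal(items):
    # Sketch of the controlling unit 170: map identified weeds to nozzle
    # commands; a detected-but-unidentified item marks the result unsatisfying.
    commands, unsatisfying = [], False
    for item in items:                      # Item instances from the sketch above
        if item.label is None:
            unsatisfying = True             # detected, but not identified
        elif item.label == "weed":
            commands.append(("spray_herbicide", item.position))
    return commands, unsatisfying
```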
[0123] In the case of an unsatisfying analysis result R, the image recognition unit 120 provides the image 10, in particular the raw data from the image capture device 220, and additionally ambient data 21 like the type of field crop, the growth stage and/or illumination characteristics, to an external device 400 via a communication interface 150 of the controlling device 100 and a communication interface 450 of the external device 400. The external device 400 preferably is an internet server.
[0124] The image 10 and the ambient data 21 are provided to an image recognition unit 420, which runs a second image recognition analysis that is more complex than the first image recognition analysis. More complex in this case refers to more and/or deeper layers and/or different algorithms for background segmentation. In addition to the higher complexity, the second image recognition analysis is provided with additional data from additional data sources. For example, geographical information and/or expected phenotypical differences between regions can be provided by smart phone apps and/or drone imagery. The second image recognition analysis is also based on an improved parametrization PI of a machine learning algorithm of a machine learning unit 460, which, based on the higher amount of input data and the better quality of the image recognition analysis, has improved training and learning characteristics. Here, the machine learning unit 460 may include trained machine learning algorithm(s), whose output may be used for the improved image recognition. Therefore, the external device 400 can provide an improved parametrization PI from the machine learning unit 460 via the communication interface 450 of the external device 400 and the communication interface 150 of the controlling device 100 to the machine learning unit 160 of the controlling device 100.
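A minimal sketch of how the external device 400 might fuse the image, the ambient data 21 and the additional data sources before running the deeper model; the source objects with name/fetch attributes and run_deep_model are hypothetical stand-ins.

```python
def run_deep_model(features: dict) -> dict:
    # Stand-in for the more complex network (more/deeper layers) of the
    # second analysis; returns the improved parametrization PI.
    return {"weights": f"retrained on {sorted(features)}"}

def second_analysis(image, ambient, extra_sources) -> dict:
    # Combine the image, the ambient data and additional sources (e.g.
    # drone imagery, regional phenotype data) into one model input.
    features = {"image": image, "ambient": ambient}
    for source in extra_sources:                  # hypothetical source objects
        features[source.name] = source.fetch(ambient)
    return run_deep_model(features)
```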
[0125] Based on the improved parametrization PI, the machine learning unit 160 can train the machine learning algorithm in an improved way. Therefore, the parametrization P provided to the image recognition unit 120 improves the first image recognition analysis and reduces the cases of an unsatisfying image analysis result R.
[0126] The above method will be described along an exemplary embodiment as follows: For the image recognition, a smart sprayer may be equipped with 10 or more cameras. The cameras may have a reaction or response time of less than 100 milliseconds and may record 20 and more images per second. As there is a closed control loop between the cameras and the system, the sprayer is activated at almost the same moment. Image recognition with high accuracy requires large computing capacities. However, it may be too expensive to install, for example, a very powerful processor costing several hundred EUR on each camera; this limitation is compensated by the approach of this invention. It may take about 50 to 150 milliseconds from the image acquisition of the camera to the nozzle control, i.e. a nozzle control must already be performed about 50 to 150 milliseconds after the first image analysis. A smart sprayer drives over the plantation field and sometimes does not recognize certain weeds; single images are then sent to an external server (via e.g. LTE/5G) or to the CCU (i.e. central computing unit/central processing unit, also referred to as master unit). With the camera including the computational resources on site, for example, only 4-5 weeds (or weed classes) can be distinguished from each other, whereas the database can distinguish 110 weeds. However, this requires computational power and in particular an efficient and adapted image recognition with an improved parametrization. This is provided by an external device to which the image data are transferred for computing the updated parametrization.
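The closed control loop with its acquisition-to-nozzle time budget can be sketched as follows; camera.grab and nozzles.apply are hypothetical APIs, and the 100 ms budget is one illustrative value within the 50-150 ms window described above.

```python
import time

FRAME_BUDGET_S = 0.100   # within the 50-150 ms acquisition-to-nozzle window

def control_loop(camera, analyse, nozzles, frames: int) -> None:
    # Closed control loop: grab a frame, run the first analysis and
    # actuate the nozzles, checking the per-frame time budget.
    for _ in range(frames):
        t0 = time.monotonic()
        image = camera.grab()             # hypothetical camera API
        commands, _unsatisfying = analyse(image)
        nozzles.apply(commands)           # hypothetical nozzle API
        elapsed = time.monotonic() - t0
        if elapsed > FRAME_BUDGET_S:
            print(f"frame overran budget: {elapsed * 1000:.0f} ms")
```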
[0127] However, there might be cases in which it is not wanted or not possible to transmit the image 10 and/or the ambient data 21 directly to the external device 400. For example, different snapshots of images 10 and ambient data 21 should be collected before providing them to the external device 400. In another example, the external device 400 simply cannot be reached by the communication interface 150 because the controlling device 100 has no access to any communication means, like WLAN or mobile data (LTE, 5G). In such cases, the image 10 and the ambient data 21 are transmitted to a buffer interface 180. The buffer interface 180 transmits the image 10 and the ambient data 21 to a buffer interface 81 of a buffer 80. The buffer 80 can be any kind of storage device, as long as it is suitable to store the received data for as long as it needs to be stored. When the buffered data is needed again, the buffer 80 transmits the image 10 and the ambient data 21 back to the controlling device 100 via the buffer interface 81 of the buffer 80 and the buffer interface 180 of the controlling device 100. The image 10 and the ambient data 21 are then directly transmitted from the buffer interface 180 of the controlling device 100 via the communication interface 150 of the controlling device 100 to the communication interface 450 of the external device 400 for the second image recognition analysis. Preferably, the buffer interface 180 is provided with a trigger signal (not shown), indicating whether a transmission technology is available. Only if the trigger signal is present will the image 10 and the ambient data 21 and/or data buffered in the buffer 80 be transmitted to the communication interface 450 of the external device 400 via the communication interface 150 of the controlling device 100. If the trigger signal is not present, the image 10 and the ambient data 21 are transmitted to the buffer interface 81 of the buffer 80.
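The buffer-and-trigger behavior can be sketched as follows; the Buffer class and the forward function are illustrative inventions, not the patent's implementation.

```python
from collections import deque

class Buffer:
    # Sketch of buffer 80: holds (image, ambient) snapshots until a
    # transmission technology (WLAN, LTE/5G, ...) becomes available.
    def __init__(self):
        self._queue = deque()

    def store(self, image, ambient) -> None:
        self._queue.append((image, ambient))

    def drain(self):
        while self._queue:
            yield self._queue.popleft()

def forward(image, ambient, buffer: Buffer, link_available: bool, send) -> None:
    # Trigger-signal logic: transmit directly (plus any backlog) when a
    # link is available, otherwise store the snapshot in the buffer.
    if link_available:
        for buffered_image, buffered_ambient in buffer.drain():
            send(buffered_image, buffered_ambient)
        send(image, ambient)
    else:
        buffer.store(image, ambient)
```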
[0128] There may be different situations with respect to the access to the external device, i.e. the CCU (central computing unit or connectivity control unit); depending on the country, a different use case is important. In some cases, mobile internet is available in the field. The time intervals between the first image analysis and the second image analysis are then short (a few seconds, at most a few minutes): while the farming machine is driving, the nozzle control can be adjusted after only a few meters (e.g. 5 or 10 meters) by means of the “parametrization” of the second image analysis, which is used to update the parametrization of the first image recognition. In other cases, no mobile internet is available in the field, but a CCU is installed on the farming machine, which can carry out the arithmetic operations for the second image analysis; the time intervals between the first and second image analysis are then also short (a few seconds), and while the tractor is driving, the nozzle control can already be adapted after only a few meters (e.g. 5 or 10 meters). In yet other cases, there is neither mobile internet available in the field nor a CCU installed on the farming machine, so that the second image analysis can only be carried out after the entire crop protection application has been completed. The time intervals between the first image analysis, which takes place approximately 80 milliseconds after image acquisition, and the second image analysis can then be several hours, and the adaptation of the nozzle control by means of the “parametrization” of the second image analysis can take much longer, because the nozzle control is only adapted the next time the machine drives onto the field; this can be weeks, months or one season later.
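The three connectivity scenarios and their approximate update latencies can be summarized in a small illustrative mapping; the enum names are hypothetical and the values paraphrase the intervals described above.

```python
from enum import Enum

class Connectivity(Enum):
    MOBILE_INTERNET = "mobile internet available in the field"
    ONBOARD_CCU = "no internet, but a CCU on the farming machine"
    OFFLINE = "neither internet nor an onboard CCU"

# Approximate latency until the improved parametrization PI reaches the
# nozzle control in each scenario, per the intervals described above.
UPDATE_LATENCY = {
    Connectivity.MOBILE_INTERNET: "seconds to minutes (after a few meters)",
    Connectivity.ONBOARD_CCU: "a few seconds (after a few meters)",
    Connectivity.OFFLINE: "hours up to weeks/months (next pass over the field)",
}
```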
[0130] The UAV 200 has an image capture device 220 comprising one or a plurality of cameras, and as it flies over the plantation field 300 imagery is acquired. The UAV 200 also has a GPS and inertial navigation system, which enables both the position of the UAV 200 and the orientation of the camera of the image capture device 220 to be determined. From this information, the footprint of an image on the ground can be determined, such that particular parts in that image, for example a certain type of crop, weed, insect and/or pathogen, can be located with respect to absolute geospatial coordinates. The image data acquired by the image capture device 220 is transferred to an image recognition unit 120.
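Locating a detection in absolute geospatial coordinates from the UAV position, altitude and camera orientation can be sketched under simplifying assumptions (nadir-pointing camera, flat ground, square pixels). The patent leaves the projection method open, so the following is purely illustrative.

```python
import math

def pixel_to_ground(lat: float, lon: float, altitude_m: float,
                    heading_rad: float, px: float, py: float,
                    width: int, height: int, fov_rad: float):
    # Locate a detection at image pixel (px, py) in geospatial coordinates,
    # assuming a nadir-pointing camera over flat ground.
    footprint_w = 2 * altitude_m * math.tan(fov_rad / 2)  # ground width of image
    m_per_px = footprint_w / width                        # square pixels assumed
    dx = (px - width / 2) * m_per_px      # metres right of the flight track
    dy = (height / 2 - py) * m_per_px     # metres ahead along the track
    # Rotate the track-relative offsets by the UAV heading into north/east.
    north = dy * math.cos(heading_rad) - dx * math.sin(heading_rad)
    east = dy * math.sin(heading_rad) + dx * math.cos(heading_rad)
    dlat = north / 111_320.0                               # metres -> degrees
    dlon = east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```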
[0131] The image acquired by the image capture device 220 is at a resolution that enables one type of crop to be differentiated from another type of crop, one type of weed to be differentiated from another type of weed, one type of insect not only to be detected but also to be differentiated from another type of insect, and one type of pathogen to be differentiated from another type of pathogen.
[0132] The image recognition unit 120 may be external to the UAV 200, but the UAV 200 itself may have the necessary processing power to detect and identify crops, weeds, insects and/or pathogens. The image recognition unit 120 processes the images, using a machine learning algorithm, for example based on an artificial neural network that has been trained on numerous image examples of different types of crops, weeds, insects and/or pathogens, to determine which object is present and also to determine the type of object.
[0133] The UAV also has a treatment arrangement 270, in particular a chemical spot spray gun with different nozzles, which enables it to spray an herbicide, insecticide and/or fungicide with high precision.
REFERENCE LIST
[0135] 10 image
[0136] 20 (recognized) item on image
[0137] 21 ambient data
[0138] 80 buffer
[0139] 81 buffer interface
[0140] 100 controlling device
[0141] 110 image interface
[0142] 120 image recognition unit
[0143] 130 treatment control interface
[0144] 150 communication interface
[0145] 160 machine learning unit
[0146] 170 controlling unit
[0147] 180 buffer interface
[0148] 200 treatment device, smart sprayer, UAV
[0149] 210 image interface
[0150] 220 image capture device
[0151] 230 treatment control interface
[0152] 270 treatment arrangement
[0153] 300 plantation field
[0154] 400 external device
[0155] 420 image recognition unit
[0156] 450 communication interface
[0157] 460 machine learning unit
[0158] 510 crop
[0159] 520 weed
[0160] 530 unidentified object
[0161] P parametrization
[0162] PI improved parametrization
[0163] R image analysis result
[0164] S treatment controlling signal
[0165] S10 taking image
[0166] S20 recognizing items by first image recognition analysis
[0167] S30 identifying unsatisfying image analysis result
[0168] S40 determining ambient data
[0169] S50 recognizing items by second image recognition analysis
[0170] S60 determining improved parametrization
[0171] S70 controlling treatment arrangement