Apparatus for plant management
11373288 · 2022-06-28
CPC classification
A01N25/00
HUMAN NECESSITIES
A01M1/026
HUMAN NECESSITIES
G06Q10/04
PHYSICS
International classification
A01M7/00
HUMAN NECESSITIES
Abstract
The present invention relates to an apparatus for plant management. It is described to provide (210) a processing unit with at least one image of a field. The processing unit analyses (220) the at least one image to determine information relating to a plant that is present. The processing unit determines (230) if the plant is to be controlled or is not to be controlled by a plant control technology based on the information relating to the plant. An output unit outputs (240) information that is useable to activate at least one plant control technology if the determination is made that the plant is to be controlled by the plant control technology.
Claims
1. An apparatus (10) for plant management, comprising: an input unit (20); a processing unit (30); and an output unit (40); wherein, the input unit is configured to provide the processing unit with at least one image of a field; wherein, the processing unit is configured to analyse the at least one image to determine information relating to a plant that is present; wherein, the processing unit is configured to determine if the plant is to be controlled or is not to be controlled by a plant control technology based on the information relating to the plant, wherein the determination if the plant is to be controlled comprises application of at least one biodiversity setting and/or at least one agronomic rule, the at least one biodiversity setting and/or at least one agronomic rule comprising a list of at least one weed that is to be controlled and/or a list of at least one weed that is not to be controlled, wherein, when a type of weed is on the list of weeds that are not to be controlled, the type of weed is to be controlled when a number of weeds of that type divided by the area of the field exceeds a threshold number of weeds of that type per unit area; and wherein, if the determination is made that the plant is to be controlled by the plant control technology, the output unit is configured to output information useable to activate the plant control technology.
2. Apparatus according to claim 1, wherein the processing unit is configured to analyse the at least one image to identify the plant, and wherein the information relating to the plant comprises the identity of the plant.
3. Apparatus according to claim 1, wherein the processing unit is configured to analyse the at least one image to determine if the plant is a specimen of a crop grown in the field or if the plant is a weed; and wherein the information relating to the plant comprises the determination of the plant belonging to the crop or the determination that the plant is a weed.
4. Apparatus according to claim 3, wherein if the determination is made that the plant is a weed, the processing unit is configured to analyse the at least one image to determine a type of weed for the plant; and wherein the information relating to the plant comprises the determined type of weed.
5. Apparatus according to claim 4, wherein the processing unit is configured to analyse the at least one image to determine a number of weeds, in particular of that type, in an area of the field; and wherein the information relating to the plant comprises a determination if the number of weeds of that type divided by the area exceeds a threshold number of weeds of that type per unit area.
6. Apparatus according to claim 1, wherein the processing unit is configured to analyse the at least one image to detect at least one insect on the plant; and wherein the information relating to the plant comprises the detection of the at least one insect.
7. Apparatus according to claim 6, wherein the processing unit is configured to analyse the at least one image to determine a type of insect for an insect of the at least one insect detected; wherein the information relating to the plant comprises the type of insect, wherein in particular the processing unit is configured to analyse the at least one image to determine a number of insects of that type on the plant, and the information relating to the plant comprises the determined number of insects of that type on the plant exceeding a threshold number, and wherein further in particular the processing unit is configured to analyse the at least one image to determine a number of plants in an area of the field that have that type of insect, and the information relating to the plant comprises a determination if the number of plants having that type of insect in that area divided by the area exceeds a threshold number.
8. Apparatus according to claim 1, wherein the at least one image was acquired by at least one camera; and wherein the input unit is configured to provide the processing unit with at least one geographical location associated with the at least one camera when the at least one image was acquired.
9. Apparatus according to claim 1, wherein analysis of the at least one image comprises utilisation of a machine learning algorithm.
10. A system (100) for plant management, comprising: at least one camera (110); an apparatus (10) for plant management according to claim 1; and at least one plant control technology (120); wherein, the at least one camera is configured to acquire the at least one image of the field; wherein, the at least one plant control technology is mounted on a vehicle (130); and wherein, the apparatus is configured to activate the at least one plant control technology at a location of the plant if the determination is made that the plant is to be controlled.
11. System according to claim 10, wherein the apparatus is mounted on the vehicle; and wherein the at least one camera is mounted on the vehicle.
12. A method (200) for plant management, comprising: a) providing (210) a processing unit with at least one image of a field; b) analysing (220) by the processing unit the at least one image to determine information relating to a plant that is present; c) determining (230) by the processing unit if the plant is to be controlled or is not to be controlled by a plant control technology based on the information relating to the plant, wherein determining if the plant is to be controlled comprises applying at least one biodiversity setting and/or at least one agronomic rule, the at least one biodiversity setting and/or at least one agronomic rule comprising a list of at least one weed that is to be controlled and/or a list of at least one weed that is not to be controlled, wherein, when a type of weed is on the list of weeds that are not to be controlled, the type of weed is to be controlled when a number of weeds of that type divided by the area of the field exceeds a threshold number of weeds of that type per unit area; and d) outputting (240) information by an output unit that is useable to activate the plant control technology if the determination is made that the plant is to be controlled.
13. A computer program element for controlling an apparatus which when executed by a processor is configured to carry out the method of claim 12.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Exemplary embodiments will be described in the following with reference to the drawings.
DETAILED DESCRIPTION OF EMBODIMENTS
(11) In this embodiment, the insect control technology is an insecticide and the weed control technology is a herbicide. Also, the plant control technology is configured to be applied as a spray.
(12) The processing unit 30 is configured to analyse the at least one image to determine information relating to a plant that is present. The processing unit 30 is configured also to determine if the plant is to be sprayed with an insecticide and/or herbicide or is not to be sprayed with the insecticide and/or herbicide based on the information relating to the plant. If the determination is made that the plant is to be sprayed with the insecticide and/or herbicide, the output unit is configured to output information useable to activate at least one spray gun.
(13) In an example, the apparatus is operating in real-time, where images are acquired and immediately processed and a decision is immediately made to spray a plant or not to spray the plant, and if the plant is to be sprayed the spray gun is immediately used to spray the plant, and if the plant is not to be sprayed the spray gun is not activated. Thus, for example a vehicle can acquire imagery of its environment and process that imagery to determine if a plant is to be sprayed or not. Thus, for example a UAV can fly around a field and acquire imagery and determine if plants should be sprayed or not, via a spray gun located on the UAV. Thus, for example a robotic land vehicle can move around a field and acquire imagery and determine if plants should be sprayed or not, via a spray gun located on the robotic land vehicle.
(14) In an example, the apparatus is operating in quasi real time, where images are acquired of a field and immediately processed to determine if plants should be sprayed or not. That information can later be used by an appropriate system (or systems) that travel(s) within the field and uses its spray gun to spray some of the plants and leave others of the plants alone. Thus, for example, a first vehicle, such as an unmanned aerial vehicle (UAV) or drone equipped with one or more cameras, can travel within a field and acquire imagery. This imagery can be immediately processed to determine which plants, whether crop plants or weeds, should be sprayed, with the other crop plants and weeds being left alone. Thus, in effect a "spray map" is generated detailing the location of crop plants and weeds that need to be sprayed with an insecticide and/or herbicide. Later, a vehicle equipped with a spray gun or several spray guns that can spray insecticide and/or herbicide can travel within the field and spray the crop plants and weeds that were previously determined to need to be sprayed. Thus, for example, a UAV with a chemical spray gun then flies to the location of the weeds that need to be controlled and sprays the weeds, or a robotic land vehicle travels within the field and uses its chemical spray gun to apply the herbicide and/or insecticide to the plants that have been determined to require spraying.
(15) In an example, the apparatus is operating in an offline mode. Thus, imagery that has previously been acquired is provided later to the apparatus. The apparatus then processes the imagery and determines which plants in the field, whether crop plants or weeds, should be sprayed with an insecticide and/or herbicide, to in effect generate a spray map of specific plants and their locations that need to be sprayed. The spray map is then used later by one or more vehicles that then travel within the field and activate their spray guns at the locations of the plants that need to be sprayed, using the spray map, to manage insects in the field.
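For illustration, the offline workflow reduces to three moves: take classified plant observations, apply the control rules, and emit a spray map of locations for a later vehicle. A minimal Python sketch of the final move, where PlantObservation and build_spray_map are hypothetical names standing in for the apparatus's internal representation:

```python
from dataclasses import dataclass
from typing import List, Optional, Set, Tuple

@dataclass
class PlantObservation:
    lat: float                       # geographical location of the plant
    lon: float
    kind: str                        # "crop" or "weed"
    weed_type: Optional[str] = None  # set when kind == "weed"

def build_spray_map(observations: List[PlantObservation],
                    weeds_to_control: Set[str]) -> List[Tuple[float, float]]:
    """Offline mode: turn classified observations into a spray map of
    locations at which a later vehicle should activate its spray gun."""
    return [(o.lat, o.lon) for o in observations
            if o.kind == "weed" and o.weed_type in weeds_to_control]

# Observations would come from the image-analysis step described above.
field = [
    PlantObservation(52.1000, 10.5000, "crop"),
    PlantObservation(52.1001, 10.5002, "weed", "blackgrass"),
]
print(build_spray_map(field, {"blackgrass"}))  # [(52.1001, 10.5002)]
```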
(16) In an example, the output unit outputs a signal that is directly useable to activate one or more spray guns.
(17) According to an example, the processing unit is configured to analyse the at least one image to identify the plant. The information relating to the plant can then comprise the identity of the plant.
(18) According to an example, the processing unit is configured to analyse the at least one image to determine if the plant is a specimen of a crop grown in the field or if the plant is a weed. The information relating to the plant can then comprise the determination of the plant belonging to the crop or the determination that the plant is a weed.
(19) According to an example, if the determination is made that the plant is a weed, the processing unit is configured to analyse the at least one image to determine a type of weed for the plant.
(20) The information relating to the plant can then comprise the determined type of weed.
(21) According to an example, the processing unit is configured to analyse the at least one image to determine a number of weeds of that type in an area of the field. The information relating to the plant can then comprise a determination if the number of weeds of that type divided by the area exceeds a threshold number of weeds of that type per unit area.
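Concretely, this density test is a single comparison of an observed count against an area-scaled threshold. A minimal sketch, with illustrative numbers:

```python
def weed_over_density_threshold(num_weeds_of_type: int,
                                area_m2: float,
                                threshold_per_m2: float) -> bool:
    """True when the observed density of a weed type exceeds the
    per-unit-area threshold, so that weed type is to be controlled."""
    if area_m2 <= 0:
        raise ValueError("area must be positive")
    return num_weeds_of_type / area_m2 > threshold_per_m2

# 3 weeds of one type in 20 m2 against a 0.1 / m2 threshold:
print(weed_over_density_threshold(3, 20.0, 0.1))  # 0.15 > 0.1 -> True
```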
(22) In an example, the processing unit is configured to analyse the at least one image to determine a location of the plant in the at least one image. In other words, an image will have an areal footprint on the ground, and by locating the crop plant or weed in the image, the actual position of the crop plant or weed on the ground can be determined to an accuracy better than the overall footprint of the image. Thus a chemical spray gun carried by the vehicle that acquired and processed the image can be accurately used to spray the crop plant or weed in order to manage the insects in the field more effectively and efficiently. Also, by knowing the position of the crop plant or weed accurately, a vehicle different from the one that acquired and processed the image can go through the field and spray the required crop plants and weeds at their locations using its spray gun(s).
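One simple way to picture this: if the camera is assumed to look straight down and the image's rectangular ground footprint is known, a detection's pixel coordinates interpolate linearly to ground coordinates. A sketch under those simplifying assumptions (a real system would also correct for camera tilt and terrain):

```python
def pixel_to_ground(px: float, py: float,
                    image_w: int, image_h: int,
                    footprint: tuple) -> tuple:
    """Map a pixel location to ground coordinates, assuming a
    nadir-looking camera whose image covers a known rectangular
    footprint (min_x, min_y, max_x, max_y) in local metres."""
    min_x, min_y, max_x, max_y = footprint
    gx = min_x + (px / image_w) * (max_x - min_x)
    gy = min_y + (py / image_h) * (max_y - min_y)
    return gx, gy

# A weed detected at pixel (1024, 768) of a 4096 x 3072 image whose
# footprint spans 10 m x 7.5 m on the ground:
print(pixel_to_ground(1024, 768, 4096, 3072, (0.0, 0.0, 10.0, 7.5)))
# -> (2.5, 1.875): far better than whole-footprint accuracy
```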
(23) According to an example, the processing unit is configured to analyse the at least one image to detect at least one insect on the plant. The information relating to the plant can then comprise the detection of the at least one insect.
(24) According to an example, the processing unit is configured to analyse the at least one image to determine a type of insect for an insect of the at least one insect detected. The information relating to the plant can then comprise the type of insect.
(25) According to an example, the processing unit is configured to analyse the at least one image to determine a number of insects of that type on the plant. The information relating to the plant can then comprise the determined number of insects of that type on the plant exceeding a threshold number.
(26) According to an example, the processing unit is configured to analyse the at least one image to determine a number of plants in an area of the field that have that type of insect. The information relating to the plant can then comprise a determination if the number of plants having that type of insect in that area divided by the area exceeds a threshold number.
(27) In an example, the processing unit is configured to analyse the at least one image to determine a location or a plurality of locations of the at least one insect in the at least one image.
(28) In this way, not all of the plant or area around a plant need be sprayed, but just at the location of an insect or insects, using for example a high precision, or ultra high precision, chemical sprayer.
(29) According to an example, the at least one image was acquired by at least one camera. The input unit can then be configured to provide the processing unit with at least one geographical location associated with the at least one camera when the at least one image was acquired.
(30) In an example, a GPS unit is used to determine the location of the at least one camera when specific images were acquired.
(31) In an example, an inertial navigation unit is used alone, or in combination with a GPS unit, to determine the location of the at least one camera when specific images were acquired. Thus, for example, the inertial navigation unit, comprising for example one or more laser gyroscopes, is calibrated or zeroed at a known location (such as a docking or charging station) and as it moves with the at least one camera the movement away from that known location in x, y, and z coordinates can be determined, from which the location of the at least one camera when images were acquired can be determined.
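Conceptually this is dead reckoning: accumulate displacement increments from the zeroed origin. A toy sketch (a real inertial unit integrates accelerations and rotation rates; here the x, y, z increments are taken as given):

```python
def dead_reckon(origin, displacements):
    """Accumulate (dx, dy, dz) increments reported by an inertial unit,
    starting from a known, calibrated origin such as a docking or
    charging station; returns the position at each step."""
    x, y, z = origin
    track = []
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
        track.append((x, y, z))
    return track

# Camera positions at successive image-acquisition instants:
print(dead_reckon((0.0, 0.0, 0.0), [(1.0, 0.0, 0.5), (0.5, 2.0, 0.0)]))
# -> [(1.0, 0.0, 0.5), (1.5, 2.0, 0.5)]
```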
(32) In an example, image processing of acquired imagery is used alone, or in combination with a GPS unit, or in combination with a GPS unit and inertial navigation unit, to determine the location of the at least one camera when specific images were acquired. Thus visual markers can be used alone, or in combination with a GPS unit and/or an inertial navigation unit to determine the location of the at least one camera when specific images were acquired.
(33) According to an example, analysis of the at least one image comprises utilisation of a machine learning algorithm.
(34) In an example, the machine learning algorithm comprises a decision tree algorithm.
(35) The decision tree algorithm takes into account at least one of the following factors: biodiversity of insects on the ground; insects on the plants; weeds; the vulnerability potential of weeds depending on the number of seeds; the competitive situation in the field with respect to the crop plant, which defines if the crop plant is already larger than the weed and will outgrow the weed anyway; and the green bridge effect, in the sense of a host plant for pests, such as diseases relevant to the culture or insect pests.
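To make this concrete, the listed factors can be encoded as a per-plant feature vector and a standard tree learner trained on ground-truth spray/leave decisions. A sketch using scikit-learn; the feature encoding and the toy training rows are illustrative assumptions, not the patent's data:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-plant encoding of the factors named above:
# [beneficials_present, pests_present, is_weed, weed_seed_count,
#  crop_outgrows_weed, green_bridge_risk]
X = [
    [1, 0, 1, 200, 1, 0],  # weed feeding beneficials, crop winning -> leave
    [0, 1, 1, 900, 0, 1],  # pest host, heavy seeder, competitive   -> spray
    [0, 1, 0,   0, 0, 0],  # crop plant carrying pests              -> spray
    [1, 0, 0,   0, 0, 0],  # crop plant carrying beneficials        -> leave
]
y = ["leave", "spray", "spray", "leave"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[0, 1, 1, 500, 0, 1]]))  # classify a new observation
```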
(36) In an example, the machine learning algorithm comprises an artificial neural network.
(37) In an example, the machine learning algorithm has been taught on the basis of a plurality of images. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of at least one type of weed. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of a plurality of weeds. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of at least one type of crop plant. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of a plurality of crop plants. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of at least one type of insect. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of a plurality of insects.
(38) The imagery acquired by a camera is at a resolution that enables one type of weed to be differentiated from another type of weed, and at a resolution that enables one type of crop plant to be differentiated from another type of crop plant, and at a resolution that enables not only insects to be detected but enables one type of insect to be differentiated from another type of insect. Thus a vehicle, such as a UAV, with a camera can fly around a field and acquire imagery. The UAV (drone) can have a Global Positioning System (GPS) and this enables the location of acquired imagery to be determined. The drone can also have inertial navigation systems, based for example on laser gyroscopes. The inertial navigation systems can function alone without a GPS to determine the position of the drone where imagery was acquired, by determining movement away from a known or a number of known locations, such as a charging station. The camera passes the acquired imagery to the processing unit.
(39) Image analysis software operates on the processing unit. The image analysis software can use feature extraction, such as edge detection, and object detection analysis that for example can identify structures in and around the field, such as buildings, roads, fences, hedges, etc. Thus, on the basis of known locations of such objects, the processing unit can patch the acquired imagery together to in effect create a synthetic representation of the environment that can in effect be overlaid over a geographical map of the environment. Thus, the geographical location of each image can be determined, and there need not be GPS and/or inertial navigation based information associated with acquired imagery. In other words, an image based location system can be used to locate the drone. However, if there is GPS and/or inertial navigation information available, then such image analysis, which can place specific images at specific geographical locations only on the basis of the imagery, is not required. If GPS and/or inertial navigation based information is available, such image analysis can nevertheless be used to augment the geographical location associated with an image.
(40) The processing unit therefore runs image processing software that comprises a machine learning analyser. Images of specific weeds are acquired, with information also relating to the size of the weeds being used. Information relating to a geographical location in the world where such a weed is to be found, and information relating to a time of year when that weed is to be found, including when in flower etc., can be tagged with the imagery. The names of the weeds can also be tagged with the imagery of the weeds. The machine learning analyser, which can be based on an artificial neural network or a decision tree analyser, is then trained on this ground truth acquired imagery. In this way, when a new image of vegetation is presented to the analyser, where such an image can have an associated time stamp, such as time of year, and a geographical location, such as Germany or South Africa, tagged to it, the analyser determines the specific type of weed that is in the image through a comparison of imagery of a weed found in the new image with imagery of different weeds it has been trained on, where the size of weeds, and where and when they grow, can also be taken into account. The specific location of that weed type on the ground within the environment, and its size, can therefore be determined. The machine learning algorithm, or another machine learning algorithm that runs on the processing unit, is similarly trained to identify crop plants and to detect insects and identify insect types on the basis of ground truth imagery, as described above with respect to weeds.
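As one possible concrete form of such an analyser, a small convolutional network can be trained on ground-truth image patches tagged with their weed type. The sketch below (PyTorch, random stand-in data, hypothetical class count and architecture) shows the shape of one training step, not an actual trained model:

```python
import torch
from torch import nn

class WeedClassifier(nn.Module):
    """Tiny CNN standing in for the trained analyser."""
    def __init__(self, num_weed_types: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, num_weed_types)

    def forward(self, x):                 # x: (batch, 3, 64, 64) RGB patches
        return self.head(self.features(x).flatten(1))

model = WeedClassifier(num_weed_types=5)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)        # stand-ins for tagged ground truth
labels = torch.randint(0, 5, (8,))        # weed-type indices
loss = loss_fn(model(images), labels)
optimiser.zero_grad()
loss.backward()
optimiser.step()
print(float(loss))
```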
(41) Thus, the UAV can fly around a field and acquire imagery from which a decision can be made to spray a plant or not. This information is then later used by another vehicle that has a one or more spray guns, to enter the field and spray the plants that have been determined to need to be sprayed, whilst leaving other plants alone. Similarly, the image acquisition, processing and spraying can be done by the same platform, for example a UAV with a camera, processing unit and spray guns or a land robot with a camera, processing unit and spray guns.
(42) In this way, a vehicle can operate in real time acquiring imagery and spray some plants whilst permitting other plants (probably the majority) not to be sprayed as the vehicle interrogates a field.
(43) The processing unit has access to a database containing different weed types, different types of crop plants, and different types of insects. This database has been compiled from experimentally determined data.
(44) The vehicle could be a robotic land vehicle or a UAV.
(45) According to an example, the apparatus is mounted on the vehicle and the at least one camera is mounted on the vehicle.
(46) In an example, the vehicle is a robotic land vehicle. In an example, the vehicle is a UAV.
(47) Thus, the system can be used to identify a weed, such as a cornflower, and in general not spray it, because it is known that it will attract many bees (as it has nectar). However, if the cornflower was at a stage of growth before it flowered but aphids were anticipated to be in the field, or indeed had been detected on the plant, then the plant could be sprayed with an insecticide to kill the aphids, if it was known that the efficacy of the insecticide would have reduced, by the time the cornflower was in flower and attracting bees, to an extent that these bees would not be killed. Also, account can be taken of a weed having gone beyond flowering, and so not attracting pollinators, which can then be sprayed with an insecticide, including a systemic insecticide. It is to be noted that spraying can be carried out at night, when pollinators are not around; however, the system enables spraying during the day, because a determination can be made of the identity of the insects present. In some cases, the biodiversity and agronomic settings can be such that insecticides are only applied to sentinel/sacrificial plants.
(48) Thus, insects can be managed in a field using, in effect, biodiversity and/or agronomic settings to encourage beneficial insects such as bees and ladybirds whilst controlling pests such as aphids and corn borers, and at the same time encouraging beneficial plants that: provide feed value for beneficial insects, also ensuring a food supply for beneficial insects after or before flowering of the main crop; provide continuous root growth with respect to certain plants/weeds, feeding the edaphone (soil life) the entire year; provide deep root growth with respect to certain plants/weeds, loosening the soil and draining excess water into the subsoil later in the season; attract nematode activity, in order for example that nematodes hatch from their eggs, but without then feeding them, helping to lower nematode pressure next season; provide ground cover later in the season, which can lead to a reduction in weeds and a reduction in wind and water erosion, and can help the soil to heat up, but in a way that does not compete with crops (natural underseeds); and encourage wind obstacles, reducing wind erosion of the leaves of young sugar beets for example, which may be especially applicable in strip-till.
(49) The at least one biodiversity setting and/or at least one agronomic rule can include one or more of the following, in addition to what is described elsewhere within this document (one possible encoding of such settings as data is sketched after this list):
(50) feed value for beneficial insects—also ensuring a food supply for beneficial insects after or before flowering of the main crop;
(51) continuous root growth with respect to certain plants/weeds—feeding the edaphone (soil life) the entire year;
(52) deep root growth with respect to certain plants/weeds—soil loosening and draining excess water into subsoil later in the season;
(53) attracting nematode activity, in order for example that the nematodes hatch from their eggs—but not feeding them then. This can help to lower nematode pressure next season;
(54) ground cover later in season—this can lead to a reduction in weeds and a reduction in wind and water erosion, and can help the soil to heat up, but in a way that does not compete with crops (natural underseeds);
(55) encouraging wind obstacles—reducing wind erosion of the leaves of young sugar beets for example, which may be especially applicable in strip-till.
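As referenced above, here is one possible, purely illustrative encoding of such biodiversity settings and agronomic rules as plain data, combined with the list-based decision of claim 1; all keys, weed names and thresholds are assumptions:

```python
biodiversity_settings = {
    "weeds_to_control": {"blackgrass", "couch grass"},
    "weeds_not_to_control": {
        # tolerated up to a threshold density (weeds per square metre)
        "cornflower": {"threshold_per_m2": 0.1,
                       "reason": "feed value for beneficial insects"},
        "chamomile": {"threshold_per_m2": 0.02,
                      "reason": "continuous root growth feeds the edaphone"},
    },
}

def must_control(weed_type: str, count: int, area_m2: float,
                 settings: dict) -> bool:
    """Always control listed weeds; control tolerated weeds only once
    their number density exceeds the per-unit-area threshold."""
    if weed_type in settings["weeds_to_control"]:
        return True
    rule = settings["weeds_not_to_control"].get(weed_type)
    if rule is not None:
        return count / area_m2 > rule["threshold_per_m2"]
    return False

print(must_control("cornflower", 12, 100.0, biodiversity_settings))
# 12 / 100 = 0.12 > 0.1 -> True, so even a tolerated weed is controlled
```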
(56) A method 200 for plant management is also provided, comprising the following steps:
(57) in a providing step 210, also referred to as step a), providing a processing unit with at least one image of a field;
(58) in an analysing step 220, also referred to as step c), analysing by the processing unit the at least one image to determine information relating to a plant that is present;
(59) in a determining step 230, also referred to as step d), determining by the processing unit if the plant is to be sprayed with an insecticide and/or herbicide or is not to be sprayed with the insecticide and/or herbicide based on the information relating to the plant; and
(60) in an outputting step 240, also referred to as step e), outputting information by an output unit that is useable to activate at least one spray gun if the determination is made that the plant is to be sprayed with the insecticide and/or herbicide.
(61) In an example, step c) comprises step c1) analysing 221 the at least one image to identify the plant, and wherein in step d) the information relating to the plant comprises the identity of the plant.
(62) In an example, step c) comprises step c2) analysing 222 the at least one image to determine if the plant is a specimen of a crop grown in the field or if the plant is a weed, and wherein in step d) the information relating to the plant comprises the determination of the plant belonging to the crop or the determination that the plant is a weed.
(63) In an example, in step c2) if the determination is made that the plant is a weed, step c2) comprises step c2a) analysing 223 the at least one image to determine a type of weed for the plant, and wherein in step d) the information relating to the plant comprises the determined type of weed.
(64) In an example, following step c2a) the method comprises step c3) analysing 224 the at least one image to determine a number of weeds of that type in an area of the field, and wherein in step d) the information relating to the plant comprises a determination if the number of weeds of that type divided by the area exceeds a threshold number of weeds of that type per unit area.
(65) In an example, step c) comprises step c4) analysing 225 the at least one image to detect at least one insect on the plant, and wherein in step d) the information relating to the plant comprises the detection of the at least one insect.
(66) In an example, step c4) comprises step c4a) analysing 226 the at least one image to determine a type of insect for an insect of the at least one insect detected, and wherein in step d) the information relating to the plant comprises the type of insect.
(67) In an example, following step c4a) the method comprises step c5) analysing 227 the at least one image to determine a number of insects of that type on the plant, and wherein in step d) the information relating to the plant comprises the determined number of insects of that type on the plant exceeding a threshold number.
(68) In an example, following step c4a) and/or step c5) the method comprises step c6) analysing 228 the at least one image to determine a number of plants in an area of the field that have that type of insect, and wherein in step d) the information relating to the plant comprises a determination if the number of plants having that type of insect in that area divided by the area exceeds a threshold number.
(69) In an example, the at least one image was acquired by at least one camera, and wherein the method comprises step b) providing 250 at least one geographical location associated with the at least one camera when the at least one image was acquired, and wherein in step d) the information relating to the plant comprises the at least one geographical location.
(70) In an example of the method, analysis of the at least one image comprises utilisation of a machine learning algorithm.
(71) The apparatus, system and method for insect management are now described in more detail with respect to the drawings.
(73) The UAV has a camera, and as it flies over the field imagery is acquired. The UAV also has a GPS and inertial navigation system, which enables both the position of the UAV to be determined and the orientation of the camera also to be determined. From this information the footprint of an image on the ground can be determined, such that particular parts in that image, such as the example of the first type of weed, can be located with respect to absolute geospatial coordinates, and also the position of insects on the plants can be located. The imagery is also acquired at a resolution, either due to the camera having a very large CCD element, or using a zoom function or flying close to individual plants, that enables insects to be spotted against the plant foliage and even for the type of insect to be identified. The image data acquired by the camera is transferred to a processing unit external to the UAV. The processing unit processes the images, using a machine learning algorithm based on an artificial neural network that has been trained on numerous image examples of different types of weeds, crop plants, and insects, to determine if a plant being imaged is a crop plant or is a weed, if a weed to determine the type of weed, and whether there are insects on the plant and what insects are present. The processing unit then utilizes biodiversity settings and insect management settings, for this crop in this part of the world, and at the stage of its growth between seeding and harvesting, to determine if the plant should be sprayed with an insecticide and/or herbicide. The biodiversity settings and insect management settings are used by the processing unit to determine if a crop plant with particular insects should be sprayed with an insecticide or not, and whether a weed, with or without insects, should be sprayed with an insecticide and/or herbicide. Also, a decision can be made to spray a plant taking into account any associated conditions, such as a threshold number density per unit area.
(74) Referring to the specific weeds shown and discussed with respect to the drawings:
(75) As described above, some weeds if present can help feed beneficial insects now and in the future, but if in large enough quantities can contaminate the harvest and/or be at such a number density per unit area that they would constitute too large a source of food for such beneficial insects. Thus, over a specific number density of weeds per unit area, which can vary between different types of weeds, the weed stops being of benefit with respect to insect management and starts to be of detrimental value with respect to insect management, and to how such insect management relates to the final harvest. Therefore, when the processing unit detects a weed of a particular type that is beneficial to beneficial insects, it logs the type of weed and its location, and does this for all weeds that are beneficial. If, however, the number density of a particular type of weed goes above a threshold number density of that weed per unit area, where the beneficial effects start to become detrimental, the processing unit determines that a number of those weeds need to be sprayed with herbicide to kill those weeds as part of the overall insect management strategy. Threshold number densities for particular types of weeds can be 0.1 per square metre (such that in 10 square metres of crop one weed (or less) of this type is permitted) or 0.02 per square metre (such that in 50 square metres of crop one weed (or less) of this type is permitted). Other threshold number densities of weeds can be used, such as 5, 1, 0.5, 0.05, 0.005, 0.001 per square metre, and threshold values can be above or below these figures.
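The stated densities follow a reciprocal rule: a threshold of t weeds per square metre permits at most one weed of that type per 1/t square metres. A two-line check of the figures above:

```python
# 0.1 / m2 -> one weed per 10 m2; 0.02 / m2 -> one weed per 50 m2
for t in (0.1, 0.02):
    print(f"threshold {t}/m2 -> one weed permitted per {1 / t:.0f} m2")
```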
(76) The processing unit thus generates a spray map of the locations of the plants, taking into account whether they are crop plants or weeds, whether there are pest insects or beneficial insects on those weeds, and taking into account the type of weed and information relating to the present time in the growing season and how weeds could be beneficial to insects now and in the future.
(77) A second robotic land vehicle then enters the field and, using its own GPS and inertial navigation system, moves to the locations of the crop plants and weeds that need to be sprayed, and uses a first spray gun to spray certain crop plants and certain weeds with an insecticide. The robotic land vehicle also uses a second spray gun to spray some of the weeds that have been sprayed with an insecticide with a herbicide, and sprays other weeds with the herbicide even when they were not sprayed with the insecticide, as part of the insect management process.
(80) The above has been described with respect to aphids, bees and ladybirds; however, other pests and beneficial insects can be managed in the above described manner through the targeted use of insecticides and/or herbicides on the basis of image processing.
(81) Image processing to enable analysis to determine a weed type.
(82) A specific example of how an image is processed, and determined to be suitable for image processing in order that a type of weed can be determined, is now described (a sketch implementing the contour and sharpness checks follows the list):
(83) 1. A digital image—in particular a colored image—of a weed is captured.
(84) 2. Areas with a predefined color and texture within the digital image are contoured within a boundary contour. Typically, one may expect one contoured area from one weed plant. However, there may also be more than one contoured area from different, potentially unconnected leaves, from two weed plants, or the like. Such a detection or determining process detects boundaries of green areas of the digital image. During this process at least one contoured area (e.g., one or more leaves, as well as one or more weed plants) may be built comprising pixels relating to the weed within a boundary contour. However, it may also be possible that the digital image has captured more than one leaf and/or the stem. Consequently, more than one contoured area may be determined.
(85) 3. Determining if the boundary contour covers a large enough area, and determining a sharpness (e.g. degree of focus) of the image data within the boundary contour. This firstly ensures that there will be sufficient image data upon which a determination can be made as to the type of weed, and secondly ensures that a minimum quality of the digital image will be satisfied in order that the determination of the type of weed can be made.
(86) 4. If both criteria in 3) are satisfied, the digital image, and specifically the image data within the boundary contour, is sent to the processing unit for image analysis by the artificial neural network to determine the type of weed as described above.
(87) 5. Similar image processing steps are used to distinguish one crop plant from another crop plant, and used to detect insects and to identify the type of insect.
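The sketch promised above: steps 2 and 3 expressed with OpenCV, contouring green areas by colour and gating on contour area and on sharpness measured as Laplacian variance. The colour band and both thresholds are illustrative assumptions:

```python
import cv2

def weed_patch_is_usable(image_bgr,
                         min_area_px: float = 2000.0,
                         min_sharpness: float = 100.0) -> bool:
    """Steps 2-3: contour green areas, then require both a large enough
    boundary contour and sufficient focus inside the contoured region."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))   # "green" band
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area_px:
        return False                         # too little image data
    roi = cv2.bitwise_and(image_bgr, image_bgr, mask=mask)
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= min_sharpness

# e.g. weed_patch_is_usable(cv2.imread("weed.jpg"))
```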
(88) In another exemplary embodiment, a computer program or computer program element is provided that is characterized by being configured to execute the method steps of the method according to one of the preceding embodiments, on an appropriate apparatus or system.
(89) The computer program element might therefore be stored on a computing unit, which might also be part of an embodiment. This computing unit may be configured to perform or induce performing of the steps of the method described above. Moreover, it may be configured to operate the components of the above described apparatus and/or system. The computing unit can be configured to operate automatically and/or to execute the orders of a user. A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method according to one of the preceding embodiments.
(90) This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning, and a computer program that by means of an update turns an existing program into a program that uses the invention.
(91) Furthermore, the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
(92) According to a further exemplary embodiment of the present invention, a computer readable medium, such as a CD-ROM, USB stick or the like, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described in the preceding section.
(93) A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
(94) However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to a further exemplary embodiment of the present invention, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
(95) A method 200 for weed management is also provided, comprising the following steps:
(96) in a providing step 210, also referred to as step a), providing a processing unit with at least one image of a field;
(97) in an analysing step 220, also referred to as step b), analysing by the processing unit the at least one image to determine that a weed is present;
(98) in a determining step 230, also referred to as step g), determining by the processing unit if the weed is to be controlled by a weed control technology or is not to be controlled by the weed control technology; and
(100) in an outputting step 240, also referred to as step h), outputting information by an output unit that is useable to activate the weed control technology, if the determination is made that the weed is to be controlled by the weed control technology.
(101) In an example, step g) comprises step g1) applying 232 at least one biodiversity setting and/or at least one agronomic rule.
(102) In an example, the method comprises step c) determining 250 a type of weed for the weed that is present, and wherein in step g1) the at least one biodiversity setting and/or at least one agronomic rule comprises a list of at least one weed that is to be controlled and/or wherein in step g1) the at least one biodiversity setting and/or at least one agronomic rule comprises a list of at least one weed that is not to be controlled.
(103) In an example, the method comprises step d) analysing 260 the at least one image to determine that a number of weeds in an area are present, and wherein in step g1) the at least one biodiversity setting and/or at least one agronomic rule comprises a threshold number of weeds per unit area, and wherein in step g) the determination that the weed is to be controlled comprises the number of weeds divided by the area exceeding the threshold number of weeds per unit area.
(104) In an example, step d) comprises analysing 262 the at least one image to determine that a number of weeds of a particular type in an area are present, and wherein in step g1) the at least one biodiversity setting and/or at least one agronomic rule comprises a threshold number of weeds of that type per unit area, and wherein in step g) the determination that the weed is to be controlled comprises the number of weeds of a particular type divided by the area exceeding the threshold number of weeds of that type per unit area.
(105) In an example of the method, when the type of weed is on the list of weeds that are not to be controlled, the type of weed is controlled when a number of weeds of that type divided by the area exceeds the threshold number of weeds of that type per unit area.
(106) In an example, the method comprises step e) analysing 270 the at least one image to determine a location of the weed in the at least one image.
(107) In an example, the at least one image was acquired by at least one camera, and wherein the method comprises step f) providing 280 at least one geographical location associated with the at least one camera when the at least one image was acquired.
(108) In an example of the method, analysis of the at least one image comprises utilisation of a machine learning algorithm.
(109) The apparatus, system and method for weed management are now described in more detail with respect to the drawings.
(111) The UAV has a camera, and as it flies over the field imagery is acquired. The UAV also has a GPS and inertial navigation system, which enables both the position of the UAV to be determined and the orientation of the camera also to be determined. From this information the footprint of an image on the ground can be determined, such that particular parts in that image, such as the example of the first type of weed, can be located with respect to absolute geospatial coordinates. The image data acquired by the camera is transferred to a processing unit external to the UAV. The processing unit processes the images, using a machine learning algorithm based on an artificial neural network that has been trained on numerous image examples of different types of weeds, to determine if a weed is present and also to determine the type of weed. The processing unit then utilizes biodiversity settings, for this crop in this part of the world, and at the stage of its growth between seeding and harvesting. The biodiversity settings are used by the processing unit to determine if specific weeds cannot be tolerated, or whether other specific weeds can be tolerated, and if so whether there are any associated conditions, such as a threshold number density per unit area.
(113) In another example, the UAV simply detects a weed as being a plant that is not a crop plant, and destroys all but a number of these weeds, to leave a certain number per unit area in the crop or a certain number in the field. For example, a determination can be made to destroy nine out of every ten weeds (or 99 out of 100) detected, to leave one tenth (or 1%) of the original weeds in the field irrespective of the weed type, or to leave 0.1 weeds per square metre in the field. Thus, in this manner, there is no need to identify the type of weed or to use biodiversity settings.
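A minimal sketch of such identity-free thinning: keep a random subset of detected weeds matching the target density and mark the rest for destruction. The function name, seeding and target value are illustrative:

```python
import random

def thin_weeds(weed_locations: list, area_m2: float,
               target_per_m2: float = 0.1, seed: int = 0) -> list:
    """Return the weeds to destroy so that roughly target_per_m2 weeds
    per square metre remain, chosen irrespective of weed type."""
    keep_n = min(len(weed_locations), int(target_per_m2 * area_m2))
    kept = set(random.Random(seed).sample(range(len(weed_locations)), keep_n))
    return [loc for i, loc in enumerate(weed_locations) if i not in kept]

# 100 weeds in 200 m2 at a 0.1 / m2 target: 20 kept, 80 to destroy
weeds = [(i, float(i)) for i in range(100)]
print(len(thin_weeds(weeds, 200.0)))  # 80
```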
(115) It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application. However, all features can be combined providing synergetic effects that are more than the simple summation of the features.
(116) While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.
(117) In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.