UNMANNED AERIAL VEHICLE

20220127000 · 2022-04-28

Assignee

Inventors

CPC classification

International classification

Abstract

An unmanned aerial vehicle (UAV) for application of an active ingredient to agricultural crops comprises at least one liquid reservoir, at least one liquid application unit, a processing unit, at least one set of rotor blades, and a plurality of legs. The at least one liquid application unit is configured to receive at least one input from the processing unit. The at least one input is useable to activate the at least one liquid application unit. The UAV is configured to fly within an environment using the at least one set of rotor blades, land within the environment, and walk on the plurality of legs to a location to apply liquid from the liquid reservoir to at least one plant. The location to apply the liquid is determined based on image analysis of one or more of at least one image of the environment.

Claims

1. An unmanned aerial vehicle for application of an active ingredient to agricultural crops, comprising: at least one liquid reservoir; at least one liquid application unit; a processing unit; at least one set of rotor blades; and a plurality of legs; wherein the liquid reservoir is configured to hold a liquid comprising the active ingredient; wherein the at least one liquid application unit is in fluid communication with the at least one liquid reservoir; wherein the at least one liquid application unit is configured to receive at least one input from the processing unit, wherein the at least one input is useable to activate the at least one liquid application unit; wherein the unmanned aerial vehicle is configured to fly within an environment using the at least one set of rotor blades; wherein the unmanned aerial vehicle is configured to land within the environment; and wherein the unmanned aerial vehicle is configured to walk on the plurality of legs to a location to apply the liquid to at least one plant, and wherein the location is determined based on image analysis of one or more of at least one image of the environment.

2. The unmanned aerial vehicle of claim 1, wherein the processing unit is configured to carry out analysis of one or more of the at least one image to determine the location for application of the liquid to the at least one plant.

3. The unmanned aerial vehicle of claim 1, wherein analysis of the at least one image to determine the at least one location for application of the liquid comprises a determination of at least one type of weed, and/or comprises a determination of at least one type of disease, and/or comprises a determination of at least one type of pest, and/or comprises a determination of at least one type of insect, and/or comprises a determination of at least one type of nutritional deficiency.

4. The unmanned aerial vehicle of claim 1, wherein a landing location for the unmanned aerial vehicle is determined based on image analysis of one or more of the at least one image of the environment.

5. The unmanned aerial vehicle of claim 4, wherein the one or more images analyzed for the determination of the landing location are the same as the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

6. The unmanned aerial vehicle of claim 4, wherein the one or more images analyzed for the determination of the landing location are different from the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

7. The unmanned aerial vehicle of claim 1, wherein an end of each of the plurality of legs that is distal to an end that is connected to a body of the unmanned aerial vehicle comprises at least one stability structure.

8. The unmanned aerial vehicle of claim 1, wherein the at least one liquid application unit is moveable with respect to a body of the unmanned aerial vehicle, wherein the processing unit of the unmanned aerial vehicle is configured to move the at least one liquid application unit.

9. The unmanned aerial vehicle of claim 8, wherein the at least one liquid application unit is mounted on at least one extendable arm.

10. The unmanned aerial vehicle of claim 8, wherein when the unmanned aerial vehicle has landed and walked to the location for application of the liquid to the at least one plant, the processor is configured to move the at least one liquid application unit to a specific location for activation of the at least one liquid application unit based on the image analysis of one or more of the at least one image of the environment.

11. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle comprises a camera connected to a body of the unmanned aerial vehicle, wherein the camera is configured to acquire the at least one image.

12. The unmanned aerial vehicle of claim 11, wherein the camera is configured to move with respect to the body of the unmanned aerial vehicle, wherein the processing unit of the unmanned aerial vehicle is configured to move the camera.

13. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle is configured to determine the location for application of the liquid after the unmanned aerial vehicle has landed within the environment.

14. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle is configured to determine the location for application of the liquid before the unmanned aerial vehicle has landed within the environment.

15. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle comprises a location determination unit.

16. The unmanned aerial vehicle of claim 1, wherein a determination is made to land and walk to the location to apply the liquid based on one or more of: a wind speed, a wind direction, a state of precipitation.

17. The unmanned aerial vehicle of claim 16, wherein the unmanned aerial vehicle is configured to receive information from an external system relating to one or more of: the wind speed, the wind direction, the state of precipitation.

18. The unmanned aerial vehicle of claim 16, wherein the unmanned aerial vehicle comprises one or more of: a wind speed sensor, a wind direction sensor, a precipitation sensor.

19. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle is configured to stop or feather the at least one set of rotor blades when the unmanned aerial vehicle has landed in the environment.

20. The unmanned aerial vehicle of claim 1, wherein at least one protective cage or protective mesh surrounds the at least one set of rotor blades.

21. The unmanned aerial vehicle of claim 1, wherein the unmanned aerial vehicle is configured to fly to a location to apply the liquid to at least one plant whilst the unmanned aerial vehicle is flying, wherein the location is determined based on image analysis of one or more of the at least one image of the environment.

22. The unmanned aerial vehicle of claim 21, wherein the processing unit is configured to carry out analysis of the at least one image to determine the location for application of the liquid to the at least one plant whilst the unmanned aerial vehicle is flying.

23. The unmanned aerial vehicle of claim 21, wherein the processing unit is configured to utilize an algorithm to determine locations within the environment to which the unmanned aerial vehicle should walk to apply the liquid to at least one plant whilst on the ground and locations within the environment to which the unmanned aerial vehicle should fly to apply the liquid to at least one plant whilst in the air, wherein the determination comprises an analysis of the at least one image.

24. The unmanned aerial vehicle of claim 23, wherein the determination of the locations to which the unmanned aerial vehicle should walk to apply the liquid and the determination of the locations to which the unmanned aerial vehicle should fly to apply the liquid comprise utilization of a determined power level of a battery configured to power the unmanned aerial vehicle and/or utilization of a determined operation duration required for the environment.

25. The unmanned aerial vehicle of claim 1, wherein each of the at least one liquid application unit is situated beneath one or more of the at least one set of rotor blades.

26. The unmanned aerial vehicle of claim 25, wherein each liquid application unit of the at least one liquid application unit is situated beneath a different set of rotor blades of the at least one set of rotor blades.

27. The unmanned aerial vehicle of claim 1, wherein the at least one liquid application unit comprises at least one nozzle applicator or at least one spinning disc applicator.

28. A method for application of an active ingredient by an unmanned aerial vehicle to agricultural crops, wherein the unmanned aerial vehicle comprises at least one liquid reservoir, at least one liquid application unit, at least one set of rotor blades, and a plurality of legs; the method comprising: holding a liquid comprising the active ingredient in the liquid reservoir housed within or attached to a body of the unmanned aerial vehicle, wherein the at least one liquid application unit is connected to the body of the unmanned aerial vehicle, and the at least one liquid application unit is in fluid communication with the liquid reservoir; receiving by the at least one liquid application unit at least one input from a processing unit, wherein the at least one input is useable to activate the at least one liquid application unit; flying the unmanned aerial vehicle within an environment using the at least one set of rotor blades; landing the unmanned aerial vehicle within the environment to apply the liquid to at least one plant; and walking on the plurality of legs to a location to apply the liquid to at least one plant, wherein the location is determined based on image analysis of one or more of at least one image of the environment.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0073] Exemplary embodiments will be described in the following, by way of example only, with reference to the following drawings:

[0074] FIG. 1 shows a schematic set up of an example of an unmanned aerial vehicle for application of an active ingredient to agricultural crops;

[0075] FIG. 2 shows a method for application of an active ingredient by an unmanned aerial vehicle to agricultural crops; and

[0076] FIGS. 3a-3f show detailed examples of unmanned aerial vehicles in operation.

DETAILED DESCRIPTION OF EMBODIMENTS

[0077] FIG. 1 shows an example of an unmanned aerial vehicle (UAV) 10 for application of an active ingredient to agricultural crops, according to some embodiments. Features shown in solid lines are essential features, whilst features shown in hashed lines are optional. The UAV comprises at least one liquid reservoir 20, at least one liquid application unit 30, a processing unit 40, at least one set of rotor blades 50, and a plurality of legs 60. This could be three legs, four legs, or even more than four legs. The liquid reservoir is configured to hold a liquid comprising the active ingredient. The at least one liquid application unit is in fluid communication with the at least one liquid reservoir. The at least one liquid application unit is configured to receive at least one input from the processing unit. The at least one input is useable to activate the at least one liquid application unit. The unmanned aerial vehicle is configured to fly within an environment using the at least one set of rotor blades. The unmanned aerial vehicle is configured to land within the environment. The unmanned aerial vehicle is configured to walk on the plurality of legs to a location to apply the liquid to at least one plant. The location to apply the liquid is determined based on image analysis of one or more of the at least one image of the environment.

[0078] In an example, the liquid application unit comprises a spray gun or spray nozzle or rotating disc, configured to spray the liquid, which can comprise atomization of the liquid as part of the spray process.

[0079] In an example, the liquid application unit comprises an application device configured to contact vegetation during application of the liquid. An example of such an application device is a paintbrush-type device, which dispenses liquid to the brushes, which is then applied to foliage in a brushing manner.

[0080] In an example, the unmanned aerial vehicle comprises moveable vegetation holding means, and when the unmanned aerial vehicle has landed within the environment the processor is configured to move the vegetation holding means to hold the at least one plant based on the image analysis of the at least one image of the environment. Thus a plant to which the liquid is being applied can be held steady during application. In an example, the moveable vegetation holding means comprises a moveable arm. In an example, the moveable arm is extendable.

[0081] In an example, the unmanned aerial vehicle is used for weed control along railway tracks and the surrounding area.

[0082] According to an example, the processing unit is configured to carry out the analysis of one or more of the at least one image to determine the location for application of the liquid to the at least one plant.

[0083] In an example, analysis of the at least one image to determine the at least one location for activation of the liquid application unit comprises a determination of at least one weed, and/or comprises a determination of at least one disease, and/or comprises a determination of at least one pest, and/or comprises a determination of at least one insect, and/or comprises a determination of at least one nutritional deficiency.

[0084] According to an example, analysis of the at least one image to determine the at least one location for application of the liquid comprises a determination of at least one type of weed, and/or comprises a determination of at least one type of disease.

[0085] According to an example, analysis of the at least one image to determine the at least one location for application of the liquid comprises a determination of at least one type of pest, and/or comprises a determination of at least one type of insect.

[0086] According to an example, analysis of the at least one image to determine the at least one location for application of the liquid comprises a determination of at least one type of nutritional deficiency.

[0087] In an example, analysis of the at least one image comprises utilization of a machine learning algorithm.

[0088] In an example, the machine learning algorithm comprises a decision tree algorithm.

[0089] In an example, the machine learning algorithm comprises an artificial neural network.

[0090] In an example, the machine learning algorithm has been taught on the basis of a plurality of images. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of at least one type of weed, and/or at least one type of plant suffering from one or more diseases, and/or at least one type of plant suffering from insect infestation from one or more types of insect, and/or at least one type of insect (when the imagery has sufficient resolution), and/or at least one type of plant suffering from one or more pests, and/or at least one type of plant suffering from one or more types of nutritional deficiency.
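By way of illustration, the kind of taught classification described above can be sketched as a hand-written decision procedure. This is a minimal sketch only; the feature names, thresholds, and category labels below are hypothetical placeholders, not values from the disclosure, and a real system would use a decision tree or artificial neural network trained on the plurality of images as described.

```python
# Hypothetical sketch: map simple image-patch features to categories such
# as weed, nutritional deficiency, or healthy crop. All feature names and
# thresholds are invented for illustration.

def classify_patch(features):
    """Classify an image patch from hypothetical extracted features.

    features: dict with keys 'green_ratio' (0..1, fraction of green
    pixels), 'leaf_edge_irregularity' (0..1), and 'yellowing' (0..1).
    """
    if features["green_ratio"] < 0.2:
        return "soil"                    # little vegetation present
    if features["yellowing"] > 0.5:
        return "nutritional_deficiency"  # chlorotic foliage
    if features["leaf_edge_irregularity"] > 0.7:
        return "weed"                    # ragged leaf outline
    return "healthy_crop"

print(classify_patch({"green_ratio": 0.6,
                      "leaf_edge_irregularity": 0.8,
                      "yellowing": 0.1}))
```

A trained model would replace the hand-set thresholds with parameters learned from labeled imagery, but the input/output shape of the determination is the same.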

[0091] Thus a UAV (also called a drone) can have a camera and a processing unit which uses the imagery acquired by the camera to activate the liquid application unit. The camera acquires imagery of the environment of a field. The imagery need not be acquired by the drone itself, but could be acquired by a different drone and then passed to the drone for processing. The imagery acquired by the camera is at a resolution that enables vegetation to be identified as vegetation, and indeed can be at a resolution that enables one type of weed to be differentiated from another type of weed. The imagery can be at a resolution that enables pest or insect infested crops to be determined, either from the imagery of the crop itself or from imagery of, for example, the insects themselves. The drone can have a Global Positioning System (GPS), and this enables the location of acquired imagery to be determined. For example, the orientation of cameras and the position of the drone when imagery was acquired can be used to determine the geographical footprint of the image at the ground plane. The drone can also have inertial navigation systems, based for example on laser gyroscopes. In addition to being used to determine the orientation of the drone, and hence of the camera, facilitating a determination of where on the ground the imagery has been acquired, the inertial navigation systems can function alone, without a GPS, to determine the position of the drone by determining movement away from one or more known locations, such as the filling/charging station. The camera passes the acquired imagery to the processing unit. Image analysis software operates on the processing unit. The image analysis software can use feature extraction, such as edge detection, and object detection analysis that can, for example, identify structures in and around the field such as buildings, roads, fences, hedges, etc.
Thus, on the basis of known locations of such objects, the processing unit can patch the acquired imagery together to in effect create a synthetic representation of the environment that can be overlaid over a geographical map of the environment. Thus, the geographical location of each image can be determined, and the acquired imagery need not have associated GPS and/or inertial navigation based information. In other words, an image based location system can be used to locate the drone. However, if GPS and/or inertial navigation information is available, then such image analysis, which can place specific images at specific geographical locations on the basis of the imagery alone, is not required. Although, if GPS and/or inertial navigation based information is available, such image analysis can be used to augment the geographical location associated with an image.
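The geographical-footprint determination mentioned above can be sketched for the simplest case of a nadir-pointing camera over flat ground, where the footprint follows from the drone's altitude and the camera's field of view. The field-of-view values and coordinates below are assumptions for illustration, not parameters from the disclosure.

```python
import math

# Sketch: estimate the ground footprint of a downward-pointing camera and
# map a pixel to a ground position relative to the drone. Assumes flat
# ground and a camera pointing straight down (nadir at image centre).

def ground_footprint(altitude_m, hfov_deg, vfov_deg):
    """Return (width_m, height_m) of the image footprint on the ground."""
    width = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(vfov_deg) / 2.0)
    return width, height

def pixel_to_ground(px, py, img_w, img_h, drone_x, drone_y,
                    altitude_m, hfov_deg, vfov_deg):
    """Map a pixel coordinate to a ground position in the drone's frame."""
    w, h = ground_footprint(altitude_m, hfov_deg, vfov_deg)
    gx = drone_x + (px / img_w - 0.5) * w
    gy = drone_y + (0.5 - py / img_h) * h  # image y axis grows downward
    return gx, gy
```

With tilted cameras or uneven terrain the projection is more involved, which is where the orientation information from the inertial navigation systems comes in.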

[0092] The processing unit runs further image processing software. This software analyzes an image to determine the areas within the image where vegetation is to be found, and also analyzes the imagery to determine where vegetation is not to be found (for example at pathways across a field, around the borders of a field, and even tractor wheel tracks across a field). This latter information can be used to determine where the liquid is not required to be applied.

[0093] Vegetation can be detected based on the shape of features within acquired images, where for example edge detection software is used to delineate the outer perimeter of objects and the outer perimeter of features within the outer perimeter of the object itself; organic material between ballast can be detected in a similar manner when the unmanned aerial vehicle is used for weed control along a railway track environment. A database of vegetation imagery can be used in helping determine if a feature in imagery relates to vegetation or not, using for example a trained machine learning algorithm such as an artificial neural network or decision tree analysis. The camera can acquire multi-spectral imagery, with imagery having information relating to the color within images, and this can be used alone, or in combination with feature detection to determine where in an image vegetation is to be found. As discussed above, because the geographical location of an image can be determined, from knowledge of the size of an image on the ground, the location or locations of vegetation, and/or other areas where the liquid is to be applied, can be found in an image and can then be mapped to the exact position of that vegetation (area) on the ground.
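The color-based vegetation determination described above might, for simple RGB imagery, be sketched with the well-known excess-green heuristic; the threshold value here is illustrative only and is not from the disclosure.

```python
# Sketch: per-pixel vegetation test using the excess-green index
# ExG = 2g - r - b, a common agronomic heuristic. Threshold is invented.

def is_vegetation(r, g, b, threshold=20):
    """Return True when an RGB pixel (0-255 channels) is likely vegetation."""
    exg = 2 * g - r - b
    return exg > threshold

print(is_vegetation(60, 140, 50))    # strongly green pixel
print(is_vegetation(120, 110, 100))  # greyish soil-like pixel
```

In practice such a color test would be combined with the feature and shape detection discussed above, or with multi-spectral channels, rather than used alone.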

[0094] The processing unit has access to a database containing different weed types and, for each weed type, the optimum liquid to be applied over that weed. This database has been compiled from experimentally determined data. The image processing software, using the machine learning algorithm, has also been taught to recognize insects, plants infested with insects, plants suffering from pests, and plants that are suffering from nutritional deficiencies. This is done in the same manner as discussed above, through training based on previously acquired imagery. The database also contains what liquid should be applied in what situation.
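A minimal sketch of such a finding-to-liquid lookup is shown below; the weed names and liquid identifiers are invented placeholders, since the actual database is compiled from experimentally determined data not given here.

```python
# Sketch: map a detected finding to the liquid that should be applied.
# All keys and values are hypothetical placeholders.

WEED_TREATMENTS = {
    "broadleaf_weed_a": "herbicide_1",
    "grass_weed_b": "herbicide_2",
    "insect_infestation": "insecticide_1",
    "fungal_disease": "fungicide_1",
}

def liquid_for(finding):
    """Return the liquid to apply for a detected finding, or None when the
    database has no entry (so the task can be logged for a later pass)."""
    return WEED_TREATMENTS.get(finding)
```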

[0095] According to an example, a landing location for the unmanned aerial vehicle is determined based on image analysis of one or more of the at least one image of the environment.

[0096] According to an example, the one or more images analyzed for the determination of the landing location are the same as the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

[0097] According to an example, the one or more images analyzed for the determination of the landing location are different from the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

[0098] According to an example, an end 62 of each of the plurality of legs that is distal to an end that is connected to a body 70 of the unmanned aerial vehicle comprises at least one stability structure 64.

[0099] In an example, the at least one stability structure comprises one or more of: a spike; a disc; a ball; a cone; a mesh.

[0100] According to an example, the at least one liquid application unit is moveable with respect to a body of the unmanned aerial vehicle. The processing unit of the unmanned aerial vehicle is configured to move the at least one liquid application unit.

[0101] According to an example, the at least one liquid application unit is mounted on at least one extendable arm 80.

[0102] According to an example, when the unmanned aerial vehicle has landed and walked to the location for application of the liquid to at least one plant, the processor is configured to move the at least one liquid application unit to a specific location for activation of the at least one liquid application unit based on the image analysis of one or more of the at least one image of the environment.

[0103] According to an example, the unmanned aerial vehicle comprises a camera 90 connected to a body of the unmanned aerial vehicle, wherein the camera is configured to acquire the at least one image.

[0104] According to an example, the camera is configured to move with respect to the body of the unmanned aerial vehicle. The processing unit of the unmanned aerial vehicle is configured to move the camera.

[0105] In an example, the camera is mounted on an extendable arm.

[0106] In an example, the extendable arm upon which the camera is mounted is the same extendable arm upon which the liquid application unit is mounted.

[0107] In an example, determination of the location for activation of the liquid application unit comprises movement of the camera.

[0108] In an example, the processor of the unmanned aerial vehicle that is configured to move the camera is the processing unit that is configured to analyze the image of the environment.

[0109] According to an example, the unmanned aerial vehicle is configured to determine the location for application of the liquid after the unmanned aerial vehicle has landed within the environment.

[0110] According to an example, the unmanned aerial vehicle is configured to determine the location for application of the liquid before the unmanned aerial vehicle has landed within the environment.

[0111] According to an example, the unmanned aerial vehicle comprises location determining means 100.

[0112] In an example, the location determining means is configured to provide the processing unit with at least one location associated with the camera when the at least one image was acquired.

[0113] The location can be a geographical location, with respect to a precise location on the ground, or can be a location on the ground that is referenced to another position or positions on the ground, such as a boundary of a field or the location of a drone docking station or charging station. In other words, an absolute geographical location can be utilized, or a location on the ground that need not be known in absolute terms but that is referenced to a known location can be used. Thus, by correlating an image with the location where it was acquired, the liquid application unit can be accurately activated at that location. Thus, even when, for example, a drone has run out of liquid and is flying back to a larger reservoir to fill up, it can continue to acquire imagery to be used to activate the liquid application unit at specific locations, even if a location is not immediately addressed but the liquid is applied later, after the drone has recharged. Also, when the drone determines that a location should have applied to it a liquid that the drone is not carrying, that information can be logged and used by that drone later, when it carries the required liquid, or transmitted to another drone that carries that liquid; that other drone can then fly to the location and apply its liquid there.
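The logging-and-handover behaviour described above can be sketched as a simple shared task log; the locations, liquid names, and data layout are illustrative assumptions, not part of the disclosure.

```python
# Sketch: when a drone detects a location needing a liquid it does not
# carry, record the task so that it (after refilling) or another drone
# carrying that liquid can serve it later. All values are illustrative.

pending_tasks = []

def log_application_task(location, liquid):
    """Record a deferred application task for later service."""
    pending_tasks.append({"location": location, "liquid": liquid})

def tasks_for(carried_liquid):
    """Return the logged tasks a drone carrying `carried_liquid` can serve."""
    return [t for t in pending_tasks if t["liquid"] == carried_liquid]

log_application_task((51.0, 7.1), "herbicide_2")
log_application_task((51.2, 7.3), "fungicide_1")
print(len(tasks_for("herbicide_2")))
```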

[0114] In an example, the location is an absolute geographical location.

[0115] In an example, the location is a location that is determined with reference to a known location or locations. In other words, an image can be determined to be associated with a specific location on the ground without knowing its precise geographical position; by knowing the location where an image was acquired with respect to known position(s) on the ground, the liquid application unit can then be activated at a later time at that location, by moving the liquid application unit to that location or by enabling another unmanned aerial vehicle to move to that location and activate its liquid application unit there.

[0116] In an example, a GPS unit is used to determine, and/or is used in determining, the location, such as the location of the camera when specific images were acquired.

[0117] In an example, an inertial navigation unit is used alone, or in combination with a GPS unit, to determine the location, such as the location of the camera when specific images were acquired. Thus, for example, the inertial navigation unit, comprising one or more laser gyroscopes, is calibrated or zeroed at a known location (such as a drone docking or charging station), and as it moves with the at least one camera, the movement away from that known location in x, y, and z coordinates can be determined, from which the location of the at least one camera when images were acquired can be determined.
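The dead-reckoning idea described above, accumulating movement away from a known calibration point, can be sketched as follows; the displacement samples are made up for the example, and a real inertial unit would integrate accelerations and rotations rather than receive displacements directly.

```python
# Sketch: dead reckoning from a known start point (e.g. the charging
# station) by accumulating per-step displacements in x, y, z.

def dead_reckon(start, displacements):
    """start: (x, y, z) of a known calibration point.
    displacements: iterable of (dx, dy, dz) per time step.
    Returns the position after applying all displacements."""
    x, y, z = start
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

pos = dead_reckon((0.0, 0.0, 0.0),
                  [(1.0, 0.5, 0.2), (0.5, 0.5, -0.1), (-0.2, 1.0, 0.0)])
print(pos)
```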

[0118] In an example, image processing of acquired imagery is used alone, or in combination with a GPS unit, or in combination with a GPS unit and inertial navigation unit, to determine the location, such as the location of the camera when specific images were acquired. In other words, as the vehicle moves it can acquire imagery that is used to render a synthetic representation of the environment and from specific markers, such as the position of trees, field boundaries, roads etc. the vehicle can determine its position within that synthetic environment from imagery it acquires.

[0119] According to an example, a determination is made to land and walk to the location to apply the liquid based on one or more of: a wind speed, a wind direction, a state of precipitation.

[0120] According to an example, the unmanned aerial vehicle is configured to receive information from an external system 110 relating to one or more of: the wind speed, the wind direction, the state of precipitation.

[0121] According to an example, the unmanned aerial vehicle comprises one or more of: a wind speed sensor 120, a wind direction sensor 130, and a precipitation sensor 140.
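A hedged sketch of how such wind and precipitation information might feed the land-and-walk determination is given below; the wind-speed threshold is a hypothetical value, not one stated in the disclosure.

```python
# Sketch: decide to land and walk when wind or rain would make aerial
# application problematic (spray drift, wash-off). Threshold is invented.

def should_land_and_walk(wind_speed_ms, precipitating,
                         max_aerial_wind_ms=5.0):
    """Return True when conditions favour landing and walking to the
    application location rather than applying the liquid from the air."""
    if precipitating:
        return True  # rain would wash off or dilute an aerial spray
    return wind_speed_ms > max_aerial_wind_ms  # spray-drift risk

print(should_land_and_walk(7.2, False))
print(should_land_and_walk(2.0, False))
```

The inputs could come either from the on-board sensors 120-140 or from the external system 110 described above.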

[0122] According to an example, the unmanned aerial vehicle is configured to stop or feather the at least one set of rotor blades when the unmanned aerial vehicle has landed in the environment.

[0123] According to an example, at least one protective cage or protective mesh 150 surrounds the at least one set of rotor blades.

[0124] According to an example, the unmanned aerial vehicle is configured to fly to a location to apply the liquid to at least one plant whilst the unmanned aerial vehicle is flying, wherein the location is determined based on image analysis of one or more of the at least one image of the environment.

[0125] According to an example, the processing unit is configured to carry out the analysis of the at least one image to determine the location for application of the liquid to the at least one plant whilst the unmanned aerial vehicle is flying.

[0126] According to an example, the processing unit is configured to utilize an algorithm to determine locations within the environment to which the unmanned aerial vehicle should walk to apply the liquid to at least one plant whilst on the ground and locations within the environment to which the unmanned aerial vehicle should fly to apply the liquid to at least one plant whilst in the air, wherein the determination comprises an analysis of the at least one image.

[0127] According to an example, the determination of the locations to which the unmanned aerial vehicle should walk to apply the liquid comprises utilization of a determined power level of a battery configured to power the unmanned aerial vehicle and/or comprises utilization of a determined operation duration required for the environment. Also, the determination of the locations to which the unmanned aerial vehicle should fly to apply the liquid comprises utilization of the determined power level of the battery configured to power the unmanned aerial vehicle and/or comprises utilization of the determined operation duration required for the environment.
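One possible, simplified form of the walk-versus-fly allocation described above, trading battery level against remaining operation time, is sketched below; the time and battery constants are invented for illustration and a real planner would work per location rather than globally.

```python
# Sketch: choose between walking (battery-cheap, slow) and flying
# (battery-hungry, fast) for the remaining application sites.
# All constants are hypothetical.

def choose_mode(battery_frac, minutes_left, sites_left,
                walk_minutes_per_site=4.0, low_battery=0.3):
    """Return 'walk' or 'fly' for serving the remaining sites."""
    if battery_frac < low_battery:
        return "walk"  # conserve energy: walking costs far less power
    if sites_left * walk_minutes_per_site > minutes_left:
        return "fly"   # walking everywhere would exceed the time budget
    return "walk"

print(choose_mode(0.2, 10.0, 5))
print(choose_mode(0.8, 10.0, 5))
```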

[0128] According to an example, each of the at least one liquid application unit is situated beneath one or more of the at least one set of rotor blades.

[0129] According to an example, each liquid application unit of the at least one liquid application unit is situated beneath a different set of rotor blades of the at least one set of rotor blades.

[0130] According to an example, the at least one liquid application unit comprises at least one nozzle applicator 160 or at least one spinning disc applicator 170.

[0131] FIG. 2 shows an example of a method 200 for application of an active ingredient by an unmanned aerial vehicle to agricultural crops, according to some embodiments. The unmanned aerial vehicle comprises at least one liquid reservoir, at least one liquid application unit, at least one set of rotor blades, and a plurality of legs. The method 200 comprises:

[0132] in a holding step 210, also referred to as step a), holding a liquid comprising the active ingredient in the liquid reservoir housed within or attached to a body of the unmanned aerial vehicle, wherein the at least one liquid application unit is connected to the body of the unmanned aerial vehicle, and the at least one liquid application unit is in fluid communication with the liquid reservoir;

[0133] in a receiving step 220, also referred to as step b), receiving by the at least one liquid application unit at least one input from a processing unit, wherein the at least one input is useable to activate the at least one liquid application unit;

[0134] in a flying step 230, also referred to as step c), flying the unmanned aerial vehicle within an environment using the at least one set of rotor blades;

[0135] in a landing step 240, also referred to as step d), landing the unmanned aerial vehicle within the environment to apply the liquid to at least one plant; and

[0136] in a walking step 250, also referred to as step e), walking on the plurality of legs to a location to apply the liquid to at least one plant, wherein the location is determined based on image analysis of one or more of at least one image of the environment.

[0137] In an example, the method comprises analyzing by the processing unit the one or more images of the at least one image to determine the location for application of the liquid to the at least one plant.

[0138] In an example, the analyzing of the at least one image to determine the at least one location for application of the liquid comprises a determination of at least one type of weed, at least one type of disease, at least one type of pest, at least one type of insect, and/or at least one type of nutritional deficiency.

[0139] In an example, the method comprises determining a landing location for the unmanned aerial vehicle based on image analysis of one or more images of the at least one image of the environment acquired by the camera.

[0140] In an example, the one or more images analyzed for the determination of the landing location are the same as the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

[0141] In an example, the one or more images analyzed for the determination of the landing location are different from the one or more images analyzed for the determination of the location to apply the liquid to at least one plant.

[0142] In an example, an end of each of the plurality of legs that is distal to an end that is connected to a body of the unmanned aerial vehicle comprises at least one stability structure.

[0143] In an example, the at least one liquid application unit is moveable with respect to a body of the unmanned aerial vehicle, and wherein the method comprises moving the at least one liquid application unit under control of the processing unit of the UAV.

[0144] In an example, the at least one liquid application unit is mounted on at least one extendable arm.

[0145] In an example, when the unmanned aerial vehicle has landed and walked to the location for application of the liquid to at least one plant, the method comprises moving the at least one liquid application unit under control of the processing unit to a specific location for activation of the at least one liquid application unit based on the image analysis of one or more images of the at least one image of the environment.

[0146] In an example, the unmanned aerial vehicle comprises a camera connected to a body of the unmanned aerial vehicle, wherein the camera is configured to acquire the at least one image.

[0147] In an example, the camera is configured to move with respect to the body of the unmanned aerial vehicle, wherein the processing unit of the unmanned aerial vehicle is configured to move the camera.

[0148] In an example, the method comprises determining the location for application of the liquid after the unmanned aerial vehicle has landed within the environment.

[0149] In an example, the method comprises determining the location for application of the liquid before the unmanned aerial vehicle has landed within the environment.

[0150] In an example, the unmanned aerial vehicle comprises location determining means.

[0151] In an example, the method comprises determining to land and walk to the location to apply the liquid based on one or more of: a wind speed, a wind direction, a state of precipitation.

[0152] In an example, the method comprises receiving information by the unmanned aerial vehicle from an external system relating to one or more of: the wind speed, the wind direction, the state of precipitation.

[0153] In an example, the unmanned aerial vehicle comprises one or more of: a wind speed sensor, a wind direction sensor, a precipitation sensor.
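The decision logic described in the preceding examples, in which the vehicle elects to land and walk rather than spray from the air based on weather conditions, could be sketched as follows. This is a minimal illustration only: the threshold value, function name, and the use of wind speed and precipitation alone (omitting wind direction) are assumptions, not details taken from the disclosure.

```python
# Illustrative sketch: selecting ground application when aerial
# spraying would drift off-target. The 4.0 m/s threshold is an
# invented example value, not part of the disclosure.
def choose_application_mode(wind_speed_ms, precipitating,
                            max_aerial_wind_ms=4.0):
    """Return 'land_and_walk' or 'aerial_spray' from sensor readings."""
    if precipitating:
        # Rain promotes wash-off; apply close to the plant instead.
        return "land_and_walk"
    if wind_speed_ms > max_aerial_wind_ms:
        # Aerial spray would suffer excessive drift in strong wind.
        return "land_and_walk"
    return "aerial_spray"
```

The readings could come either from the vehicle's own wind speed and precipitation sensors or from the external system mentioned above.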

[0154] In an example, the method comprises stopping or feathering the at least one set of rotor blades when the unmanned aerial vehicle has landed in the environment.

[0155] In an example, at least one protective cage or protective mesh surrounds the at least one set of rotor blades.

[0156] In an example, the method comprises flying the unmanned aerial vehicle to a location and applying the liquid to at least one plant whilst the unmanned aerial vehicle is flying, wherein the method comprises determining the location based on image analysis of one or more images of the at least one image of the environment.

[0157] In an example, the method comprises analyzing by the processing unit the at least one image and determining the location for application of the liquid to the at least one plant whilst the unmanned aerial vehicle is flying.

[0158] In an example, the method comprises utilization of an algorithm to determine locations within the environment to which the unmanned aerial vehicle should walk to apply the liquid to at least one plant whilst on the ground, and locations within the environment to which the unmanned aerial vehicle should fly to apply the liquid to at least one plant whilst in the air, wherein the determination comprises an analysis of the at least one image.

[0159] In an example, the determining of the locations to which the unmanned aerial vehicle should walk to apply the liquid and the determining of the locations to which the unmanned aerial vehicle should fly to apply the liquid comprises utilizing a determined power level of a battery configured to power the unmanned aerial vehicle and/or utilizing a determined spray duration required for the environment.
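One simple way to realize the battery-aware division of spray targets between walking and flying is a greedy assignment against the remaining energy budget. The sketch below is illustrative only; the energy cost model, field names (`fly_wh`, `walk_wh`), and the cheapest-first ordering are assumptions, not the disclosed algorithm.

```python
# Illustrative greedy sketch: each spray target carries an estimated
# energy cost for reaching it by flying and by walking; the cheaper
# mode is chosen per target, subject to the remaining battery budget.
def assign_modes(targets, battery_wh):
    """targets: list of dicts with 'id', 'fly_wh', 'walk_wh' entries.
    Returns (plan, remaining_wh); plan maps target id -> 'fly'/'walk'."""
    plan, remaining = {}, battery_wh
    # Treat the cheapest targets first so more of them fit in budget.
    for t in sorted(targets, key=lambda t: min(t["fly_wh"], t["walk_wh"])):
        mode = "walk" if t["walk_wh"] <= t["fly_wh"] else "fly"
        cost = min(t["fly_wh"], t["walk_wh"])
        if cost <= remaining:
            plan[t["id"]] = mode
            remaining -= cost
        # Targets left out of the plan must wait for a recharge.
    return plan, remaining
```

For instance, with a 6 Wh budget and two targets where one is cheap to walk to and the other cheap to fly to, the plan assigns each target its cheaper mode as long as the budget allows.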

[0160] In an example, each of the at least one liquid application unit is situated beneath one or more of the at least one set of rotor blades.

[0161] In an example, each liquid application unit of the at least one liquid application unit is situated beneath a different set of rotor blades of the at least one set of rotor blades.

[0162] In an example, the at least one liquid application unit comprises at least one nozzle applicator or at least one spinning disc applicator.

[0163] FIGS. 3a-3f show detailed examples of UAVs flying around, and landing within, an environment. The individual figures can relate to the same UAV or to different UAVs. In FIG. 3a, a UAV (also called a drone) is flying around an environment. Its camera, shown in FIG. 3f, can acquire imagery that is analyzed to determine where crop plants need to be sprayed, and is analyzed to determine where to land, in order that the UAV can walk to that location (see FIGS. 3c, 3d, 3e and 3f). The UAV also analyzes the imagery and determines where it should fly and spray the crop whilst flying, see FIG. 3b. The UAV uses an algorithm to determine, based on its battery lifetime and on the time required to spray the environment, where it should land and then walk, and where it should fly. An example of such an algorithm is a Monte-Carlo minimization routine: this UAV or a different UAV can scan the environment to determine where crop needs to be sprayed, and the UAV as shown in FIG. 3 then determines how best to divide that spraying between walking and flying. Whilst on the ground, as shown in FIG. 3f, the camera can be on an extendable arm to better view the environment, and can also rotate, and one of the spray units is also located on an extendable arm to better spray specific plants or parts of those plants. The liquid chemical can also be applied directly using a brush. Other spray units are located directly under the rotors; in this way the spray becomes entrained in the downwash and suffers from reduced drift due to the wind. The rotors can still operate when the UAV is on the ground to entrain the spray without generating sufficient lift for take-off, and such entrainment also applies when the UAV sprays whilst flying. A cage or mesh, not shown, surrounds each of the rotors in order that the rotors are not damaged by vegetation, and that vegetation is not damaged by the rotors.
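The Monte-Carlo minimization mentioned above can be sketched as a random search over walk/fly assignments: candidate plans are sampled at random, infeasible plans (exceeding the battery budget) are discarded, and the feasible plan with the lowest total spray time is kept. All cost figures and field names below are invented for illustration; the actual routine is not disclosed in detail.

```python
import random

# Illustrative Monte-Carlo minimization: sample random walk/fly
# assignments, reject those exceeding the battery budget, keep the
# feasible assignment with the lowest total time.
def monte_carlo_plan(targets, battery_wh, iterations=2000, seed=0):
    """targets: dicts with 'id', 'walk_s', 'fly_s', 'walk_wh', 'fly_wh'."""
    rng = random.Random(seed)
    best_plan, best_time = None, float("inf")
    for _ in range(iterations):
        plan = {t["id"]: rng.choice(("walk", "fly")) for t in targets}
        time = sum(t[plan[t["id"]] + "_s"] for t in targets)
        energy = sum(t[plan[t["id"]] + "_wh"] for t in targets)
        if energy <= battery_wh and time < best_time:
            best_plan, best_time = plan, time
    return best_plan, best_time
```

With targets that are fast but energy-hungry to spray in flight, a tight battery budget steers the sampled optimum toward a mixed plan in which some targets are reached on foot.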

[0164] The processing unit 50 then runs further image processing software, which can be part of the image processing that determines vegetation location on the basis of feature extraction, if that is used. This software comprises a machine learning analyzer. Images of specific weeds are acquired, with information also relating to the size of the weeds. Information relating to a geographical location in the world where such a weed is to be found, and information relating to a time of year when that weed is to be found, including when it is in flower, can be tagged with the imagery. The names of the weeds can also be tagged with the imagery of the weeds. The machine learning analyzer, which can be based on an artificial neural network or a decision tree analyzer, is then trained on this ground truth acquired imagery. In this way, when a new image of vegetation is presented to the analyzer, where such an image can have an associated time stamp, such as time of year, and a geographical location, such as Germany or South Africa, tagged to it, the analyzer determines the specific type of weed in the image through a comparison with imagery of the different weeds it has been trained on, where the size of the weeds, and where and when they grow, can also be taken into account. The specific location of that weed type on the ground within the environment, and its size, can therefore be determined.
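The tagged-training idea above, in which image-derived features are combined with size, region, and season tags, can be illustrated with a toy classifier. A 1-nearest-neighbour match stands in here for the artificial neural network or decision tree analyzer named in the text; every feature value, weed name, and the distance weighting are invented purely for illustration.

```python
# Hedged sketch: training records pair image-derived features with
# region and month tags and a weed-name label. A nearest-neighbour
# lookup stands in for the trained neural network / decision tree.
TRAINING = [
    # (leaf_length_cm, region, flowering_month) -> weed name (invented)
    ((4.0, "Germany", 6), "chickweed"),
    ((12.0, "Germany", 7), "thistle"),
    ((8.0, "South Africa", 1), "blackjack"),
]

def classify_weed(leaf_length_cm, region, month):
    """Return the training label closest to the observed features."""
    def distance(rec):
        (length, rec_region, rec_month), _ = rec
        # Region mismatch adds a fixed penalty; size and month of year
        # contribute smoothly, mirroring the size/where/when tags above.
        return (abs(length - leaf_length_cm)
                + (0 if rec_region == region else 10)
                + abs(rec_month - month))
    return min(TRAINING, key=distance)[1]
```

A production analyzer would of course learn from many tagged images rather than a hand-written table, but the role of the geographic and seasonal tags in narrowing the match is the same.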

[0165] In this way significantly less active ingredient(s) is required, since the target weeds, insects and disease are treated directly rather than the whole crop. Furthermore, products can be applied directly and do not first need to be diluted in larger volumes of water for spray application. This has the additional advantage that the weight of product for application that the drone carries can be substantially reduced, allowing for the use of much smaller, cheaper and more efficient drones with extended operating times between recharging or exchange of the batteries. Similarly, this application method allows the formulator to exploit the advantages of more concentrated active ingredients and surfactants in smaller deposits.

[0166] Thus, purposely designed formulations with appropriate physical stability can be utilized, providing appropriate wetting for the crop, appropriate biodelivery for the active ingredients, and appropriate resistance to wash-off by rain.

[0167] Off-target losses by drift can be greatly reduced or even effectively eliminated, allowing application to occur in populated and environmentally sensitive areas. Furthermore, the drone can continue to operate in conditions where the wind is too strong for application methods that generate even low levels of spray drift.

[0168] The drone can operate autonomously, reducing the labor required to control targets in agricultural crops.

[0169] The images from the camera can be analyzed by suitable image analysis software to identify targets. This can be performed autonomously onboard the drone with a dedicated processing unit, or it can be performed remotely by a separate processing unit with or without input from the operator.

Weed Type Determination

[0170] The following relates to one method by which an image can be processed to determine a type of plant/weed, and which also has utility for the detection of types of insects, as would be appreciated by the skilled person:

1. An image of a plant is acquired.
2. Different parts of the plant are segmented, for example through contouring.
3. Image data that is within a segment boundary, for example within a contour, is analyzed by an artificial neural network to determine the type of weed.
4. The above can be used to determine one type of crop plant from another type of crop plant, and to detect and identify insects.
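Steps 1 to 3 above can be sketched on a toy binary "image": plant pixels are grouped into connected regions (a simple stand-in for contour-based segmentation), and each region is then labelled. The size-threshold rule below is a placeholder for the trained artificial neural network, and the example image and threshold are invented for illustration.

```python
# Minimal sketch of the segmentation + classification pipeline.
def segments(image):
    """Yield sets of (row, col) for each 4-connected region of 1-pixels."""
    seen, h, w = set(), len(image), len(image[0])
    for r in range(h):
        for c in range(w):
            if image[r][c] and (r, c) not in seen:
                region, stack = set(), [(r, c)]
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < h and 0 <= x < w) \
                            or not image[y][x]:
                        continue
                    seen.add((y, x))
                    region.add((y, x))
                    stack += [(y+1, x), (y-1, x), (y, x+1), (y, x-1)]
                yield region

def classify(region):
    # Placeholder rule: a real system would pass the segment's pixel
    # data to the trained neural network instead of thresholding area.
    return "weed" if len(region) < 4 else "crop"

# Toy 3x4 binary image: a large region (left) and a small one (right).
image = [[1, 1, 0, 0],
         [1, 1, 0, 1],
         [1, 1, 0, 1]]
labels = [classify(s) for s in segments(image)]
```

On this toy input the pipeline finds two regions and labels the large one as crop and the small one as weed, mirroring how step 4 distinguishes one plant type from another per segment.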

[0171] It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application. However, all features can be combined providing synergetic effects that are more than the simple summation of the features.

[0172] While the invention, according to some embodiments, has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.

[0173] In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.