APPARATUS FOR WEED CONTROL
20210015087 · 2021-01-21
Assignee
Inventors
- Sergio JIMENEZ TARODO (Düsseldorf, DE)
- Michael Kilian (Leverkusen, DE)
- James HADLOW (Newmarket, GB)
- Virginie Giraud (Ecully, FR)
- Thomas ARIANS (Rommerskirchen, DE)
CPC classification
G05B2219/2639
PHYSICS
International classification
A01M21/04
HUMAN NECESSITIES
Abstract
An apparatus for weed control includes a processing unit provided with at least one sensor data of an environment. The processing unit analyzes at least some of the at least one sensor data to determine at least one ground property for each of a plurality of locations of the environment. The processing unit determines a power setting for activation of at least one electrode based weed control unit for each of the plurality of locations. The determination of the power setting for activation of the electrode based weed control unit comprises utilization of the at least one ground property associated with that location. An output unit outputs information useable to activate the at least one electrode based weed control unit.
Claims
1. An apparatus for weed control, comprising: an input unit; a processing unit; and an output unit; wherein, the input unit is configured to provide the processing unit with at least one sensor data of an environment; wherein, the processing unit is configured to analyze at least some of the at least one sensor data to determine at least one ground property for each of a plurality of locations of the environment; wherein, the processing unit is configured to determine a power setting for activation of at least one electrode based weed control unit for each of the plurality of locations, wherein determination of the power setting for activation of the electrode based weed control unit comprises utilization of the at least one ground property associated with that location; and wherein the output unit is configured to output information useable to activate the at least one electrode based weed control unit.
2. The apparatus of claim 1, wherein the at least one sensor data comprises at least one image, and wherein the processing unit is configured to analyze the at least one image to determine at least one activation location of the plurality of locations for activation of the at least one electrode based weed control unit.
3. The apparatus of claim 2, wherein analysis of the at least one image to determine the at least one activation location comprises a determination of at least one location of vegetation.
4. The apparatus of claim 3, wherein determination of the power setting for activation of the at least one electrode based weed control unit for the at least one activation location comprises utilization of the determined at least one location of vegetation.
5. The apparatus of claim 2, wherein analysis of the at least one image to determine the at least one activation location comprises a determination of at least one type of weed.
6. The apparatus of claim 5, wherein determination of the power setting for activation of the at least one electrode based weed control unit for the at least one activation location comprises utilization of the determined at least one type of weed.
7. The apparatus of claim 2, wherein analysis of the at least one image comprises utilization of a machine learning algorithm.
8. The apparatus of claim 1, wherein the at least one ground property comprises one or more of: a measure of ground moisture; a measure of ground texture; a measure of ground conductivity; a measure of ground temperature; a measure of ground hardness; a measure of plant root occurrence; a measure of ground type; and a measure of salinity.
9. The apparatus of claim 1, wherein the at least one sensor data was acquired by at least one sensor, and wherein the input unit is configured to provide the processing unit with at least one location associated with the at least one sensor when the at least one sensor data was acquired.
10. The apparatus of claim 9, wherein the at least one sensor comprises one or more of: a camera; a ground moisture sensor; a ground texture sensor; an electrical conductivity sensor; a soil insertion sensor; an electromagnetic induction sensor; a temperature sensor; a ground hardness sensor; a root occurrence sensor; a ground type sensor; a salinity sensor; at least one reflectance sensor configured to operate in one or more of the visible, the infrared, the near infrared, the mid infrared, and the far infrared.
11. A system for weed control, comprising: at least one sensor; the apparatus for weed control of claim 1; and the at least one electrode based weed control unit; wherein, the at least one sensor is configured to acquire the at least one sensor data of the environment; wherein, the at least one electrode based weed control unit is mounted on a vehicle; and wherein, the apparatus is configured to activate the at least one electrode based weed control unit.
12. The system of claim 11, wherein the apparatus is mounted on the vehicle; and wherein the at least one sensor is mounted on the vehicle.
13. The system of claim 11, wherein the at least one sensor comprises one or more of: a camera; a ground moisture sensor; a ground texture sensor; a ground conductivity sensor; an electromagnetic induction sensor; a ground temperature sensor; a soil insertion sensor; a ground hardness sensor; a root occurrence sensor; a ground type sensor; a salinity sensor, and a reflectance sensor.
14. A method for weed control, comprising: providing a processing unit with at least one sensor data of an environment; analyzing, by the processing unit, at least some of the at least one sensor data to determine at least one ground property for each of a plurality of locations of the environment; determining, by the processing unit, a power setting for activation of at least one electrode based weed control unit for each of the plurality of locations, wherein determination of the power setting for activation of the electrode based weed control unit comprises utilization of the at least one ground property associated with that location; and outputting, by an output unit, information useable to activate the at least one electrode based weed control unit.
15. A non-transitory computer readable medium storing one or more programs, the one or more programs comprising instructions, which when executed by a processor, cause the processor to: provide a processing unit with at least one sensor data of an environment; analyze, by the processing unit, at least some of the at least one sensor data to determine at least one ground property for each of a plurality of locations of the environment; determine, by the processing unit, a power setting for activation of at least one electrode based weed control unit for each of the plurality of locations, wherein determination of the power setting for activation of the electrode based weed control unit comprises utilization of the at least one ground property associated with that location; and output, by an output unit, information useable to activate the at least one electrode based weed control unit.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0043] Exemplary embodiments will be described in the following with reference to the following drawings:
DETAILED DESCRIPTION OF EMBODIMENTS
[0053] In an example, the apparatus is operating in real-time, where sensor data are acquired and immediately processed and one, or more than one, electrode based weed control unit is configured and activated.
[0054] In an example, the apparatus is operating in quasi real time, where sensor data are acquired of an environment and immediately processed to determine the correct configuration of the electrode based weed control units. That information can later be used by an appropriate system (or systems) that travel(s) within the environment and activates the electrode based weed control units at particular parts of that environment. Thus for example, a first vehicle, such as a car, train, lorry or unmanned aerial vehicle (UAV) or drone equipped with one or more sensors can travel within an environment and acquire sensor data. This sensor data can be immediately processed to determine ground properties around an environment, from which the configuration of the electrode weed control units can be determined for different locations in the environment. Later, a vehicle equipped with an electrode based weed control unit or units can travel within the environment and activate the electrodes at different specific areas of the environment, where the electrode weed control units are appropriately configured for the different locations.
[0055] In an example, the apparatus is operating in an offline mode. Thus, sensor data that has previously been acquired is provided later to the apparatus. The apparatus then determines what the configuration of the electrode based weed control units should be at different locations in the environment. This information is then used later by one or more vehicles that then travel within the area and activate their electrode weed control units, appropriately configured, to specific parts of the environment.
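The quasi-real-time and offline modes above both amount to precomputing a per-location power map from ground properties, which a later vehicle pass only needs to look up. A minimal sketch in Python; the property names and thresholds are hypothetical, as the text does not specify any:

```python
# Illustrative sketch: building a per-location power map offline.
# "moisture", "conductivity" and the thresholds below are invented
# placeholders, not values from the patent.

def power_setting(ground):
    """Map hypothetical ground properties to a relative power level."""
    power = 1.0
    # Drier ground is assumed to conduct less, so more power is applied.
    if ground["moisture"] < 0.2:
        power *= 1.5
    # Highly conductive ground is assumed to need less power.
    if ground["conductivity"] > 0.5:
        power *= 0.8
    return power

def build_power_map(sensor_records):
    """sensor_records: iterable of (location, ground_properties) pairs."""
    return {loc: power_setting(props) for loc, props in sensor_records}

records = [((0, 0), {"moisture": 0.1, "conductivity": 0.3}),
           ((0, 1), {"moisture": 0.4, "conductivity": 0.6})]
power_map = build_power_map(records)
```

A vehicle travelling through the environment later can then configure its electrode based weed control unit by looking up its current location in `power_map`.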
[0056] In an example, the output unit outputs a signal that is directly useable to activate the at least one electrode based weed control unit.
[0057] According to an example, the at least one sensor data comprises at least one image. The processing unit is configured to analyze the at least one image to determine at least one activation location of the plurality of locations for activation of the at least one electrode based weed control unit.
[0058] According to an example, analysis of the at least one image to determine the at least one activation location comprises a determination of at least one location of vegetation.
[0059] According to an example, determination of the power setting for activation of the at least one electrode based weed control unit for the activation location comprises utilization of the determined at least one location of vegetation.
[0060] According to an example, analysis of the at least one image to determine the at least one activation location comprises a determination of at least one type of weed.
[0061] According to an example, determination of the power setting for activation of the at least one electrode based weed control unit for the activation location comprises utilization of the determined at least one type of weed.
[0062] According to an example, analysis of the at least one image comprises utilization of a machine learning algorithm.
[0063] In an example, the machine learning algorithm comprises a decision tree algorithm.
[0064] In an example, the machine learning algorithm comprises an artificial neural network.
[0065] In an example, the machine learning algorithm has been taught on the basis of a plurality of images. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of at least one type of weed. In an example, the machine learning algorithm has been taught on the basis of a plurality of images containing imagery of a plurality of weeds.
[0066] According to an example, the at least one ground property comprises one or more of: a measure of ground moisture; a measure of ground texture; a measure of ground conductivity; a measure of ground temperature; a measure of ground hardness; a measure of plant root occurrence; a measure of ground type; a measure of salinity.
[0067] According to an example, the at least one sensor data was acquired by at least one sensor. The input unit is configured to provide the processing unit with at least one location associated with the at least one sensor when the at least one sensor data was acquired.
[0068] In an example, the location is an absolute geographical location.
[0069] In an example, the location is a location that is determined with reference to the position or positions of the electrode based weed control units. In other words, an image can be determined to be associated with a specific location on the ground, without knowing its precise geographical position, but by knowing the position of the electrode based weed control units with respect to that location at the time the image was acquired, the required power can then be applied at a later time at that location by moving the appropriate electrode based weed control unit to that location, for example through movement of a vehicle to which the electrode based weed control unit is attached. Thus, for example one or more sensors can be mounted on a vehicle, such as a train, forward of the position where electrodes for controlling weeds are mounted. Knowledge of the distance between these mounting positions, and a speed of the train, enables sensor data to be acquired at a location and then at an appropriate time later depending upon the speed of the train, the electrodes can be activated at the same location where the sensor data was acquired.
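The distance-and-speed timing just described reduces to a simple delay calculation; the mounting distance and vehicle speed below are illustrative values, not figures from the text:

```python
# Sketch of the timing described above: sensors are mounted a known
# distance ahead of the electrodes on a moving vehicle, so activation
# is delayed until the electrodes reach the sensed location.

def activation_delay(sensor_to_electrode_m, speed_m_per_s):
    """Seconds to wait after acquisition before the electrodes pass
    over the location where the sensor data was acquired."""
    if speed_m_per_s <= 0:
        raise ValueError("vehicle must be moving forward")
    return sensor_to_electrode_m / speed_m_per_s

# A sensor 5 m ahead of the electrodes on a train moving at 10 m/s:
delay_s = activation_delay(5.0, 10.0)
```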
[0070] In an example, a GPS unit is used to determine, and/or is used in determining, the location of the at least one sensor when specific sensor data were acquired.
[0071] In an example, an inertial navigation unit is used alone, or in combination with a GPS unit, to determine the location of the at least one sensor when specific data were acquired. Thus for example, the inertial navigation unit, comprising for example one or more laser gyroscopes, is calibrated or zeroed at a known location and as it moves with the at least one sensor the movement away from that known location in x, y, and z coordinates can be determined, from which the location of the at least one sensor when sensor data were acquired can be determined.
[0072] In an example, image processing of acquired imagery is used alone, or in combination with a GPS unit, or in combination with a GPS unit and inertial navigation unit, to determine the location of the at least one sensor when specific data were acquired. Thus visual markers can be used alone, or in combination with GPS derived information.
[0073] According to an example, the at least one sensor comprises one or more of: a camera; a ground moisture sensor; a ground texture sensor; an electrical conductivity sensor; a soil insertion sensor; an electromagnetic induction sensor; a temperature sensor; a ground hardness sensor; a root occurrence sensor; a ground type sensor; a salinity sensor; at least one reflectance sensor configured to operate in one or more of the visible, the infrared, the near infrared, the mid infrared, the far infrared.
[0075] According to an example, the apparatus is mounted on the vehicle, and the at least one sensor is mounted on the vehicle.
[0076] In an example, the vehicle is a train.
[0077] In an example, the vehicle is a lorry or truck or Unimog.
[0078] In an example, the input unit is configured to provide the processing unit with at least one location associated with the at least one sensor when the at least one sensor data was acquired. In an example, the location is a geographical location.
[0079] According to an example, the at least one sensor comprises one or more of: a camera; a ground moisture sensor; a ground texture sensor; a ground conductivity sensor; an electromagnetic induction sensor; a ground temperature sensor; a soil insertion sensor; a ground hardness sensor; a root occurrence sensor; a ground type sensor; a salinity sensor, a reflectance sensor.
[0080] In an example, the moisture sensor comprises a frequency domain reflectometer, a time domain transmission sensor, or a time domain reflectometer. In this way, a probe or probes can be inserted into the ground and, from signal frequency or speed of propagation signatures, the dielectric constant of a volume element of the ground can be determined, from which the moisture content can be determined.
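The dielectric-to-moisture step can be sketched with the widely used empirical relation of Topp et al. (1980); the text does not name a specific formula, so this is one common choice rather than the patent's method:

```python
# Hedged sketch: converting an apparent dielectric constant K, as measured
# by a time/frequency domain reflectometer, to volumetric water content.
# Uses the Topp et al. (1980) empirical polynomial for mineral soils.

def volumetric_water_content(k_dielectric):
    """Apparent dielectric constant -> volumetric water content (m^3/m^3)."""
    k = k_dielectric
    return -5.3e-2 + 2.92e-2 * k - 5.5e-4 * k**2 + 4.3e-6 * k**3

# Dry soil (K around 4) versus wet soil (K around 25):
dry = volumetric_water_content(4.0)
wet = volumetric_water_content(25.0)
```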
[0081] In an example, the moisture sensor comprises a ground resistance sensor determining the resistance between two electrodes or probes inserted into the ground, from which the moisture content can be determined in addition to the ground resistance and conductivity itself.
[0082] In an example, the ground texture and hardness are determined from the force required to insert one or more probes into the ground.
[0083] In an example, the ground temperature is measured using a temperature probe that is inserted into the ground.
[0084] In an example, the root occurrence sensor comprises a camera that acquires images, the analysis of which can be used to determine the types of weed present and their ground density from which the expected occurrence of roots and the types of roots in the ground can be determined.
[0085] Additionally, a reflectance signal from the ground, which could be spectrographically interrogated, can be analyzed to provide information regarding the ground properties.
[0086] In this manner, one or more sensors or probes can be pushed into the ground, or otherwise acquire data at that position, and the ground properties determined. This can occur in real time, where the sensors or probes are mounted on the vehicle ahead of the electrode based weed control units, and are continuously pushed into the ground, pulled out, and then pushed into the next section of ground as the vehicle moves forward, and/or image or reflectance sensor data are acquired. The ground properties are then determined, and the electrode based weed control units can be correctly configured, such that if they are required to be activated at any position they are already correctly primed to do so in an optimum manner.
[0088] in a providing step 210, also referred to as step a), providing a processing unit with at least one sensor data of an environment;
[0089] in an analyzing step 220, also referred to as step b), analyzing by the processing unit at least some of the at least one sensor data to determine at least one ground property for each of a plurality of locations of the environment;
[0090] in a determining step 230, also referred to as step d), determining by the processing unit a power setting for activation of at least one electrode based weed control unit for each of the plurality of locations, wherein determination of the power setting for activation of the electrode based weed control unit comprises utilization of the at least one ground property associated with that location; and
[0091] in an outputting step 240, also referred to as step e), outputting, by an output unit, information useable to activate the at least one electrode based weed control unit.
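Steps a), b), d) and e) above can be sketched end-to-end; every function body here is a hypothetical stand-in for processing the text leaves to the implementation, with moisture as the single example ground property:

```python
# Illustrative end-to-end sketch of the method steps; all names,
# thresholds, and data are invented placeholders.

def analyze_ground(sensor_data):
    """Step b): derive one ground property per location (moisture here)."""
    return {loc: {"moisture": reading} for loc, reading in sensor_data}

def determine_power(ground_by_loc):
    """Step d): a power setting per location from its ground property."""
    return {loc: (1.5 if g["moisture"] < 0.2 else 1.0)
            for loc, g in ground_by_loc.items()}

def output_activation_info(power_by_loc):
    """Step e): information useable to activate the weed control unit."""
    return sorted(power_by_loc.items())

sensor_data = [((0, 0), 0.1), ((0, 1), 0.35)]                # step a)
info = output_activation_info(determine_power(analyze_ground(sensor_data)))
```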
[0092] In an example, the at least one sensor data comprises at least one image, and wherein the method comprises step c), analyzing 250 by the processing unit the at least one image to determine at least one activation location of the plurality of locations for activation of the at least one electrode based weed control unit.
[0093] In an example, step c) comprises determining at least one location of vegetation.
[0094] In an example, step d) comprises utilizing the determined at least one location of vegetation.
[0095] In an example, step c) comprises determining at least one type of weed.
[0096] In an example, step d) comprises utilizing the determined at least one type of weed.
[0097] In an example, step c) comprises utilizing a machine learning algorithm.
[0098] In an example, the at least one ground property comprises one or more of: a measure of ground moisture; a measure of ground texture; a measure of ground conductivity; a measure of ground temperature; a measure of ground hardness; a measure of plant root occurrence; a measure of ground type; a measure of salinity.
[0099] In an example, the at least one sensor data was acquired by at least one sensor, and wherein the input unit is configured to provide the processing unit with at least one location associated with the at least one sensor when the at least one sensor data was acquired.
[0100] In an example, the at least one sensor comprises one or more of: a camera; a ground moisture sensor; a ground texture sensor; an electrical conductivity sensor; a soil insertion sensor; an electromagnetic induction sensor; a temperature sensor; a ground hardness sensor; a root occurrence sensor; a ground type sensor; a salinity sensor; at least one reflectance sensor configured to operate in one or more of the visible, the infrared, the near infrared, the mid infrared, the far infrared.
[0101] The apparatus, system and method for weed control are now described in more detail in conjunction with
[0103] Continuing with
[0104] In further detail, an input unit 20 of the apparatus 10 passes the acquired imagery to a processing unit 30. Image analysis software operates on the processor 30. The image analysis software can use feature extraction, such as edge detection, and object detection analysis that for example can identify structures such as railway tracks, sleepers, trees, level crossings, and station platforms. Thus, on the basis of known locations of objects, such as the locations of buildings within the environment, and on the basis of known structure information such as the distance between sleepers and the distance between the railway tracks, the processing unit can patch together the acquired imagery to, in effect, create a synthetic representation of the environment that can be overlaid over a geographical map of the environment. Thus, the geographical location of each image can be determined, and acquired imagery need not have associated GPS and/or inertial navigation based information. However, if GPS and/or inertial navigation information is available, then such image analysis, which can place specific images at specific geographical locations on the basis of the imagery alone, is not required. Even so, if GPS and/or inertial navigation based information is available, such image analysis can be used to augment the geographical location associated with an image.
Thus for example, if on the basis of GPS and/or inertial navigation based information the center of an acquired image is deemed to be located 22 cm from the side edge and 67 cm from the end of a particular railway sleeper of a section of railway, whilst from the actual acquired imagery, through the use of the above described image analysis, the center of the image is determined to be located 25 cm from the edge and 64 cm from the end of the sleeper, then the GPS/inertial navigation based derived location can be augmented by shifting the location 3 cm in one direction and 3 cm in another direction as required.
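The worked augmentation above (22 cm/67 cm from GPS versus 25 cm/64 cm from imagery) can be sketched directly; coordinates are in centimetres relative to the sleeper, and the choice to trust the image-derived position is an assumption consistent with the example:

```python
# Sketch of the augmentation step: the GPS/inertial estimate of where the
# image centre sits relative to a sleeper is corrected by the offsets the
# image analysis reports.

def augment_location(gps_estimate_cm, image_estimate_cm):
    """Return the correction (dx, dy) and the augmented location,
    taking the image-derived position as the corrected one."""
    dx = image_estimate_cm[0] - gps_estimate_cm[0]
    dy = image_estimate_cm[1] - gps_estimate_cm[1]
    return (dx, dy), image_estimate_cm

# GPS: 22 cm from the edge, 67 cm from the end; imagery: 25 cm and 64 cm.
shift, corrected = augment_location((22, 67), (25, 64))
```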
[0105] The processor 30 runs further image processing software. This software analyzes an image to determine the areas within the image where vegetation is to be found. Vegetation can be detected based on the shape of features within acquired images, where for example edge detection software is used to delineate the outer perimeter of objects and the outer perimeter of features within the outer perimeter of the object itself. A database of vegetation imagery can be used in helping determine if a feature in imagery relates to vegetation or not, using for example a trained machine learning algorithm such as an artificial neural network or decision tree analysis. The camera can acquire multi-spectral imagery, with imagery having information relating to the color within images, and this can be used alone, or in combination with feature detection, to determine where in an image vegetation is to be found. As discussed above, because the geographical location of an image can be determined, from knowledge of the size of an image on the ground, the location or locations of vegetation to be found in an image can then be mapped to the exact position of that vegetation on the ground.
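A simple per-pixel colour cue of the kind mentioned above can be sketched with the excess-green index, a common vegetation heuristic; the index and threshold are an assumption here, not something the text prescribes:

```python
# Illustrative colour-based vegetation check on 0-255 RGB channels.
# The excess-green index ExG = 2g - r - b and the threshold are a
# common heuristic, used here only as a stand-in.

def is_vegetation(r, g, b, threshold=20):
    """True if the pixel's excess-green index exceeds the threshold."""
    return (2 * g - r - b) > threshold

# A grass-green pixel versus a grey ballast pixel:
green = is_vegetation(60, 160, 50)
ballast = is_vegetation(120, 120, 120)
```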
[0106] The processor 30 then runs further image processing software that can be part of the image processing that determines vegetation location on the basis of feature extraction, if that is used. This software comprises a machine learning analyzer. Images of specific weeds are acquired, with information also relating to the size of weeds being used. Information relating to a geographical location in the world, where such a weed is to be found and information relating to a time of year when that weed is to be found, including when in flower etc. can be tagged with the imagery. The names of the weeds can also be tagged with the imagery of the weeds. The machine learning analyzer, which can be based on an artificial neural network or a decision tree analyzer, is then trained on this ground truth acquired imagery. In this way, when a new image of vegetation is presented to the analyzer, where such an image can have an associated time stamp such as time of year and a geographical location such as Germany or South Africa tagged to it, the analyzer determines the specific type of weed that is in the image through a comparison of imagery of a weed found in the new image with imagery of different weeds it has been trained on, where the size of weeds, and where and when they grow can also be taken into account. The specific location of that weed type on the ground within the environment, and its size, can therefore be determined.
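The trained analyzer above combines appearance with weed size, geography, and season. As a toy stand-in (not a neural network, and not the patent's training data), a decision-rule classifier over such hand-made features might look like:

```python
# Toy decision rules standing in for the trained machine learning analyzer.
# Feature names, rules, and categories are entirely hypothetical.

def classify_weed(leaf_width_cm, month, region):
    # Broad leaves suggest a broadleaf weed regardless of season.
    if leaf_width_cm > 2.0:
        return "broadleaf"
    # Narrow leaves in summer in Germany suggest a grass weed.
    if region == "Germany" and 5 <= month <= 9:
        return "grass"
    return "unknown"
```

A real implementation would learn such rules (or network weights) from the tagged ground truth imagery described above rather than hand-coding them.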
[0107] The processor 30 has access to a database containing different weed types, and the optimum mode of the electrode based weed control technology to be used in controlling each weed type, which has been compiled from experimentally determined data. This database can be the same database that accounts for configuration of the electrode based weed control unit as a function of the ground properties, and provides a juxtaposition of information relating to different weeds in different ground types. For example, the voltage and/or current, and indeed the duration of application, can vary to account for different weeds at different locations.
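The lookup described above can be sketched as a table keyed by weed type and ground type; all entries below are invented placeholders, not experimentally determined values from the patent:

```python
# Hedged sketch of the experimentally compiled control database.
# Weed types, ground types, voltages, and durations are hypothetical.

CONTROL_DB = {
    ("broadleaf", "dry_sand"): {"voltage_kv": 8.0, "duration_s": 0.4},
    ("broadleaf", "wet_clay"): {"voltage_kv": 5.0, "duration_s": 0.2},
    ("grass",     "dry_sand"): {"voltage_kv": 9.0, "duration_s": 0.5},
}

def lookup_setting(weed_type, ground_type):
    """Return the electrode configuration for this weed/ground pairing."""
    return CONTROL_DB[(weed_type, ground_type)]

# Wet, conductive ground needs less power for the same weed:
setting = lookup_setting("broadleaf", "wet_clay")
```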
[0108] Thus, sensor data is acquired that enables an electrode based weed control technology to be activated optimally to account for ground conditions and for weeds at those locations.
[0109] With continued reference to
[0112] Continuing with
[0115] Continuing with
[0116] The above detailed examples have been discussed with respect to a railway; however, a weed control train, truck, lorry, or Unimog can have electrode based weed control units mounted on or within it, and can use sensors to determine ground properties and types of weeds in order to control those specific weed types as discussed above.
Image Processing to Enable Analysis to Determine a Weed Type
[0117] A specific example of how an image is processed, and determined to be suitable for image processing in order that a type of weed can be determined, is now described:
[0118] 1. A digital image, in particular a colored image, of a weed is captured.
[0119] 2. Areas with a predefined color and texture within the digital image are contoured within a boundary contour. Typically, one may expect one contoured area from one weed plant. However, there may also be more than one contoured area, from different, potentially not connected leaves, from two weed plants, or the like. Such a detection or determining process detects boundaries of green areas of the digital image. During this process at least one contoured area, e.g. one or more leaves, as well as one or more weed plants, may be built comprising pixels relating to the weed within a boundary contour. However, it may also be possible that the digital image has captured more than one leaf and/or the stem. Consequently, more than one contoured area may be determined.
[0120] 3. Determining if the boundary contour covers a large enough area, and determining a sharpness (e.g. degree of focus) of the image data within the boundary contour. This firstly ensures that there will be sufficient image data upon which a determination can be made as to the type of weed, and secondly ensures that a minimum quality of the digital image is satisfied in order that a determination of the type of weed can be made.
[0121] 4. If both criteria in 3) are satisfied, the digital image, and specifically the image data within the boundary contour, is sent to the processing unit for image analysis by the artificial neural network to determine the type of weed as described above.
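The suitability test in step 3 can be sketched with an area threshold plus a variance-of-Laplacian focus measure, a common sharpness heuristic; the thresholds and the choice of focus measure are assumptions, as the text specifies neither:

```python
# Sketch of the suitability check: accept an image region for weed
# classification only if the contoured area is large enough and the
# pixels inside it are sharp enough. Thresholds are illustrative.

def laplacian_variance(pixels):
    """pixels: 2-D list of grey values. Variance of the 4-neighbour
    Laplacian over interior pixels; higher means sharper."""
    laps = []
    for y in range(1, len(pixels) - 1):
        for x in range(1, len(pixels[0]) - 1):
            lap = (pixels[y - 1][x] + pixels[y + 1][x] +
                   pixels[y][x - 1] + pixels[y][x + 1] - 4 * pixels[y][x])
            laps.append(lap)
    mean = sum(laps) / len(laps)
    return sum((v - mean) ** 2 for v in laps) / len(laps)

def region_suitable(area_px, pixels, min_area=400, min_sharpness=10.0):
    """True only if both criteria of step 3 are satisfied."""
    return area_px >= min_area and laplacian_variance(pixels) >= min_sharpness

sharp = [[0, 0, 0, 0], [0, 255, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]  # strong edge
flat = [[10] * 4 for _ in range(4)]                                 # uniform, out of focus
```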
[0122] In some exemplary embodiments, a computer program or computer program element is provided that is characterized by being configured to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
[0123] The computer program element might therefore be stored on a computing unit, which might also be part of an embodiment. This computing unit may be configured to perform or induce performing of the steps of the method described above. Moreover, it may be configured to operate the components of the above described apparatus and/or system. The computing unit can be configured to operate automatically and/or to execute the orders of a user. A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method according to one of the preceding embodiments.
[0124] Some exemplary embodiments of the invention cover both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
[0125] Furthermore, the computer program element might be able to provide all steps necessary to fulfill the procedure of some exemplary embodiments of the method as described above.
[0126] According to some exemplary embodiments of the present invention, a computer readable medium, such as a CD-ROM, USB stick or the like, is presented, wherein the computer readable medium has stored on it the computer program element described in the preceding section.
[0127] A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
[0128] However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to some exemplary embodiments of the present invention, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
[0129] It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application. However, all features can be combined providing synergetic effects that are more than the simple summation of the features.
[0130] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.
[0131] In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.