RESIDUE SPREAD MONITORING
20220369553 · 2022-11-24
Inventors
- Martin Peter Christiansen (US)
- Ramon Buchaca Tarragona (Randers, DK)
- Morten Stigaard Laursen (Randers, DK)
- Kenneth Düring Jensen (Randers, DK)
- Thomas Mygind Bojsen (Randers, DK)
CPC classification
G06V10/7715
PHYSICS
A01D41/1243
HUMAN NECESSITIES
G06T3/4015
PHYSICS
A01D41/127
HUMAN NECESSITIES
International classification
A01D41/127
HUMAN NECESSITIES
G06T3/40
PHYSICS
Abstract
Systems and methods are disclosed for monitoring the distribution of residue material from a spreader tool of an agricultural machine. Image data is received from an imaging sensor and is indicative of residue material spread by the spreader tool within a sensing region rearwards of the agricultural machine. One or more image transformations, including colour, distortion and/or corrective transformations, are applied to the image data to generate an enhanced image of the residue material distribution for viewing by an operator of the machine. A user interface associated with the agricultural machine is controlled to provide an indicator, indicative of the enhanced image, which is easily interpreted.
Claims
1. A system for monitoring the distribution of residue material from a spreader tool of an agricultural machine, the system comprising: an imaging sensor having a sensing region rearwards of the agricultural machine; and at least one controller, configured to: receive image data from the sensor indicative of residue material spread by the spreader tool within the sensing region; apply one or more image transformations to the image data to generate an enhanced image of the residue material distribution; and output one or more control signals for a user interface associated with the agricultural machine comprising an indicator indicative of the enhanced image.
2. The system of claim 1, wherein the imaging sensor is configured to apply a Bayer filter to the image data.
3. The system of claim 1, wherein the at least one controller is configured to apply a colour transformation to the received image data.
4. The system of claim 3, wherein the at least one controller is configured to convert the received image data to RGB data using a Malvar Cutler method.
5. The system of claim 1, wherein the at least one controller is configured to apply a corrective transformation to the image data; and optionally the corrective transformation applied is a vignette correction transformation.
6. The system of claim 1, wherein the at least one controller is configured to apply a distortion correction transformation to the image data; and optionally the distortion correction transformation applied to the image data comprises: a delta angle correction; a barrel distortion correction; or a lens distortion correction.
7. The system of claim 4, wherein the at least one controller is operable to convert the RGB data to a different colour space; and optionally the different colour space is a LAB colour space or a CIELAB colour space.
8. The system of claim 1, wherein the at least one controller is configured to remap a tonality of the image data; and optionally wherein the at least one controller is configured to apply an S-curve to the image data.
9. The system of claim 8, wherein the at least one controller is configured to apply a curve to an “L-channel” or “luminance-channel” of the image data after conversion to a LAB or CIELAB colour space.
10. The system of claim 1, wherein the at least one controller is configured to apply a haze removal transformation to the image data.
11. The system of claim 1, wherein the at least one controller is operable to employ a data buffer of a predetermined time period to the data received from the sensor.
12. The system of claim 1, wherein the at least one controller is configured to extract a value for each corresponding pixel in each stored image forming a data buffer, wherein the value comprises: a statistical value for each pixel; or a quartile value for each pixel based on the stored data in the data buffer.
13. The system of claim 1, wherein the at least one controller is configured to apply a colour transformation to processed image data and generate an enhanced image for display; and optionally wherein the colour transformation includes converting the processed image data to a jet or turbo colour space.
14. The system of claim 1, wherein the at least one controller is configured to apply a frequency filter to the image data, the frequency filter being dependent on a forward speed of the agricultural machine.
15. The system of claim 1, wherein the user interface comprises a display, wherein the display is part of the agricultural machine or is an interface on a portable device which is remotely operable from the machine.
16. The system of claim 1, configured such that an indicator on the enhanced image comprises a line overlaid on the image at a position corresponding to the maximum lateral extent at which the material is ejected from the spreader tool.
17. An agricultural machine comprising the system of claim 1.
18. A method for monitoring the distribution of residue material from a spreader tool of an agricultural machine, the method comprising: receiving image data from an imaging sensor indicative of residue material spread by the spreader tool within a sensing region rearwards of the agricultural machine; applying one or more image transformations to the image data to generate an enhanced image of the residue material distribution; and controlling a user interface associated with the agricultural machine for providing an indicator indicative of the enhanced image.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0049] One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
DETAILED DESCRIPTION
[0054] The combine 10 is coupled to a header 12 which is operable, in use, to cut and gather a strip of crop material as the combine 10 is driven across a field/area to be harvested during a harvesting operation. A conveyor section 14 conveys the cut crop material from the header 12 into a crop processing apparatus 16 operable to separate grain and non-grain (i.e. material other than grain (MOG) or residue material (used interchangeably herein)) as will be appreciated. It is noted here that apparatus for separating grain and non-grain material are well-known in the art and the present invention is not limited in this sense. The skilled person will appreciate that numerous different configurations for the crop processing apparatus may be used as appropriate. Clean grain separated from the cut crop material is collected in a grain bin 18, which may be periodically emptied, e.g. into a collection vehicle, storage container, etc., utilising the unloading auger 20. The remaining non-grain material (MOG)/residue material is separately moved to a spreader tool 22 which is operable in use to eject the non-grain material or MOG from the rear of the combine 10 and onto the ground.
[0055] The combine 10 also typically includes, amongst other features, an operator cab 26, wheels 28, an engine (not shown) and a user interface 32.
[0056] As will be discussed in detail herein, the combine 10 additionally includes a sensor in the form of a camera 30. The camera 30 is used, by a control system 100 of the combine, to obtain image data of a sensing or measurement region rear of the combine 10, indicative of a distribution of residue material associated with the spreader tool 22. In the illustrated embodiment, the camera 30 is shown mounted to a rear surface of the combine 10, and is angled downwards providing a field of view which encompasses a ground surface rear of the combine 10.
[0058] The processor 104 is operable to receive image data via input 106 which, in the illustrated embodiment, takes the form of input signals 105 received from the camera 30. As described in detail herein, the camera 30 has a sensing region rearward of the combine 10, with the image data received from the camera 30 being illustrative of residue material within the sensing region. Using the received image data, the processor 104 is operable to perform one or more image processing transformations on the image data to generate an enhanced image for display by the user interface 32 in the manner described herein. Advantageously, the process steps discussed herein may result in the generation of an enhanced image which more clearly illustrates the residue spread associated with the spreader tool 22 for understanding by an operator of the combine 10.
[0059] In the illustrated embodiment of
[0060] It will be appreciated that in alternative arrangements a user interface may instead be provided remote from the combine 10, e.g. as part of a smartphone, tablet computer or computer, for example, for a remote operator to visualise the spread pattern.
[0061] In a variant, as illustrated by
[0063] As discussed herein, aspects of the invention relate to the performance of one or more image transformations to the image data received from the camera 30 in order to generate an enhanced image for an operator of the combine 10.
[0064] The following description covers an embodiment of a series of processing steps performed on the image data received from camera 30 to obtain an enhanced image 500 (
[0065] As discussed above,
[0066] As a next step, the corrected RGB data is converted to a different colour space. Specifically, in the illustrated embodiment, this includes converting the RGB data to a LAB colour space before remapping a tonality of the converted LAB image data. Here, this comprises applying an S-curve to the converted image data, and specifically to the “L-channel” or “luminance-channel” of the LAB image data. Advantageously, this results in the residue material being more clearly defined with respect to the background in the image data by effectively enhancing any “luminous” pixels corresponding to the pixel values associated with residue material whilst subduing pixels corresponding to background. At this stage, the remapped image data may be used to generate an enhanced image for display. An example enhanced image 300 is shown in
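By way of illustration, the S-curve remapping of the luminance channel described above might be sketched as follows. This is a minimal sketch and not taken from the application: the sigmoid form, gain and pivot values are assumptions, and the conversion to a LAB colour space is presumed to have been performed beforehand (e.g. by an imaging library), leaving a normalised L-channel.

```python
import numpy as np

def s_curve(l_channel: np.ndarray, gain: float = 8.0, pivot: float = 0.5) -> np.ndarray:
    """Apply a sigmoid S-curve to a luminance channel normalised to [0, 1].

    Values above the pivot (bright residue pieces) are boosted while
    values below it (darker background) are subdued. Gain and pivot
    are illustrative choices, not values from the application.
    """
    x = np.clip(l_channel, 0.0, 1.0)
    curved = 1.0 / (1.0 + np.exp(-gain * (x - pivot)))
    # Rescale so that 0 maps exactly to 0 and 1 maps exactly to 1.
    lo = 1.0 / (1.0 + np.exp(gain * pivot))
    hi = 1.0 / (1.0 + np.exp(-gain * (1.0 - pivot)))
    return (curved - lo) / (hi - lo)
```

The effect is that mid-to-bright pixels (residue) gain contrast against the darker background, which is the stated purpose of the remapping.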
[0067] In a variant, a haze removal transformation may be applied to the image data. Advantageously, this may filter out dust and other small particles present within the image data which may otherwise obscure the residue material pieces, and suppress stray background light in the image.
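The application does not name a particular haze removal algorithm; one common choice is a dark-channel-prior style approach, sketched below under that assumption. The patch size, percentile fraction and strength parameter are illustrative guesses, not values from the application.

```python
import numpy as np

def dark_channel_dehaze(img: np.ndarray, patch: int = 7, strength: float = 0.9) -> np.ndarray:
    """Small dark-channel-prior style dehazing sketch.

    img: float RGB image, shape (H, W, 3), values in [0, 1].
    """
    # Dark channel: per-pixel minimum over colour channels, then a local minimum filter.
    dark = img.min(axis=2)
    h, w = dark.shape
    pad = patch // 2
    padded = np.pad(dark, pad, mode="edge")
    local_min = np.stack(
        [padded[i:i + h, j:j + w] for i in range(patch) for j in range(patch)]
    ).min(axis=0)
    # Atmospheric light: brightest colour among the haziest ~0.1% of pixels.
    idx = np.argsort(local_min.ravel())[-max(1, local_min.size // 1000):]
    atmosphere = img.reshape(-1, 3)[idx].max(axis=0)
    # Transmission estimate, then recover the scene radiance.
    transmission = np.clip(1.0 - strength * local_min / atmosphere.max(), 0.1, 1.0)
    return np.clip((img - atmosphere) / transmission[..., None] + atmosphere, 0.0, 1.0)
```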
[0068] In a next step, a data buffer is employed storing data from a number of sequential images obtained by the camera 30. This is typically of the order of 10-100 separate images. For each corresponding pixel across the buffered images, a value is extracted which corresponds to a statistical “value” for that pixel across the multiple images. The statistical values are used to generate an enhanced image comprising the processed pixels—see
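The buffering and per-pixel statistic described above can be sketched as follows. The class name, default buffer length and the choice of a quartile as the statistic are illustrative assumptions, consistent with claim 12.

```python
import collections
import numpy as np

class PixelStatBuffer:
    """Rolling buffer of the last `maxlen` frames; extracts a per-pixel quartile."""

    def __init__(self, maxlen: int = 30):
        self.frames = collections.deque(maxlen=maxlen)

    def push(self, frame: np.ndarray) -> None:
        # Oldest frame is evicted automatically once the buffer is full.
        self.frames.append(frame)

    def quartile_image(self, q: float = 75.0) -> np.ndarray:
        # Per-pixel percentile across the buffered frames; a high quartile
        # favours pixels that are repeatedly bright (falling residue) over
        # transient single-frame noise.
        return np.percentile(np.stack(self.frames), q, axis=0)
```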
[0069] In a final step of the process, a further colour transformation is applied to the processed image data. The further colour transformation here includes converting the processed image data to a false/pseudo colour space, and in particular to a jet colour space. An example image 500 obtained following the further colour transformation is shown in
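A jet-style false-colour mapping can be approximated with simple piecewise-linear ramps per channel; the following is an illustrative sketch rather than the exact colormap used by the system.

```python
import numpy as np

def jet_colormap(gray: np.ndarray) -> np.ndarray:
    """Map a [0, 1] grayscale image to an approximate 'jet' false-colour RGB image.

    Low values map toward blue, mid values toward green, high values toward red,
    so spread intensity differences become easy to read at a glance.
    """
    x = np.clip(gray, 0.0, 1.0)
    r = np.clip(1.5 - np.abs(4.0 * x - 3.0), 0.0, 1.0)
    g = np.clip(1.5 - np.abs(4.0 * x - 2.0), 0.0, 1.0)
    b = np.clip(1.5 - np.abs(4.0 * x - 1.0), 0.0, 1.0)
    return np.stack([r, g, b], axis=-1)
```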
[0070] In an extension of the method, a frequency filter may be applied to the image data. This may be applied at any point in the above described method. The frequency filter may advantageously be dependent on a forward speed of the agricultural machine to filter from the image data non-residue “stationary” objects within the vehicle's environment.
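The speed-dependent frequency filter is described only at a high level; one way to realise the idea is a temporal high-pass filter whose background-update rate scales with forward speed, so that slowly-changing "stationary" scene content is absorbed into a running background estimate while moving residue survives in the output. The mapping from speed to filter coefficient below is an invented placeholder.

```python
import numpy as np

class SpeedAdaptiveHighPass:
    """Temporal high-pass filter whose cutoff tracks the machine's forward speed."""

    def __init__(self) -> None:
        self.background = None  # running low-pass (background) estimate

    def filter(self, frame: np.ndarray, speed_mps: float) -> np.ndarray:
        # Faster travel -> the scene changes faster -> heavier background update.
        # The 0.05 + 0.02*speed mapping is an illustrative assumption.
        alpha = min(1.0, 0.05 + 0.02 * speed_mps)
        if self.background is None:
            self.background = frame.astype(float)
        self.background = (1.0 - alpha) * self.background + alpha * frame
        # High-pass output: what remains after the slow background is removed.
        return frame - self.background
```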
[0071] Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the embodiments in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
[0072] It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, devices or integrated circuits, or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a program comprising code for implementing a system or method as set out herein and a machine-readable storage storing such a program. Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection, and embodiments suitably encompass the same.
[0073] It will be appreciated that the above embodiments are discussed by way of example only. Various changes and modifications can be made without departing from the scope of the present application.