METHOD AND ARRANGEMENT FOR MANURE HANDLING
20210000065 · 2021-01-07
Inventors
- Józef FURDAK (Tumba, SE)
- Piotr HOFMAN (Tumba, SE)
- Bartlomiej JAKLIK (Tumba, SE)
- Marcin MALECKI (Tumba, SE)
- Mateusz PROKOWSKI (Tumba, SE)
CPC classification
A01K1/0128
HUMAN NECESSITIES
Abstract
A method and arrangement for controlling the operation of a manure scraper that operates responsive to information obtained from one or more cameras mounted so as to capture images of the area of operation of the scraper.
Claims
1-10. (canceled)
11. A method for controlling the operation of a manure scraper, the method comprising: obtaining information from one or more cameras regarding a position of the scraper along a scraping route of operation of the scraper by means of image processing applied to image data generated by the one or more cameras, the one or more cameras mounted so as to capture images of a length of the scraping route; determining whether said obtained information fulfills a criterion, the criterion defined as whether the scraper has reached a predefined position along the scraping route and/or whether the scraper has entered a predefined situation relating to a proximity of the scraper to an object located within the scraping route; and in the event that the criterion is satisfied by the obtained information, triggering an action of the scraper, the action comprising an adjustment of a speed or operation of the scraper.
12. The method according to claim 11, wherein the criterion is defined as a position of the scraper in relation to a start position of the scraper along the scraping route, and wherein the action, carried out when the obtained information determines that the scraper has reached the start position, causes a motor that actuates the scraper to stop reversing the scraper and begin a forward scraping motion of the scraper.
13. The method according to claim 11, wherein the criterion is defined as a position of the scraper in relation to an obstacle located along the length of the scraping route, or a relation between positions and/or velocities of the scraper and the obstacle based on the information obtained from the one or more cameras, and wherein the action carried out when the obtained information determines that the scraper will collide with the obstacle is to cause the scraper to change a velocity thereof in order to avoid colliding with the obstacle.
14. The method according to claim 13, wherein upon a determination from the obtained information that an alley in which the scraper operates is clear of animals and objects, the scraper is controlled to run at a higher speed than upon a determination from the obtained information of a risk of an animal stepping in the way of the scraper.
15. The method according to claim 11, wherein the criterion is defined as a position of the scraper in relation to an intermediate position of the scraper along the scraping route, and wherein the action carried out when the obtained information determines that the scraper has reached the intermediate position is to adjust the speed of the scraper.
16. The method according to claim 15, wherein the intermediate position is defined as either of a feeding station and an animal crossing pathway where a change of the speed of the scraper is required in order to avoid scaring or hitting an animal, and wherein the action carried out when the obtained information determines that the scraper has reached the intermediate position is to lower the speed of the scraper.
17. The method according to claim 11, wherein the criterion is defined as a position of the scraper in relation to an end position of the scraper along the scraping route, and wherein the action, carried out when the obtained information determines that the scraper has reached the end position, is to cause a driving mechanism of the scraper to stop a forward scraping motion of the scraper and begin a reverse motion of the scraper.
18. The method according to claim 11, wherein the scraper is driven by means of a wire, and the method further comprising: determining an estimated position of the scraper in an alley in which the scraper operates along the length of the scraping route, based on information associated with a driving force of a motor that drives the wire and/or a run length of the wire, comparing the estimated position with a position of the scraper along the scraping route observed by the one or more cameras, determining a difference between the estimated position and the observed position, and correcting the estimated position based on the difference between the estimated position and the observed position, such that a scraper driving mechanism is adjusted based on a determined elongation of the wire and/or slippage of the wire.
19. The method according to claim 18, wherein the wire is automatically shortened or stretched, by means of a shortening/stretching mechanism, based on the difference determined between the estimated position and the observed position.
20. The method according to claim 11, further comprising: placing the scraper in a first position along the scraping route; registering the first position as observed by the one or more cameras as a starting position; moving the scraper to a second position along the scraping route; and registering the second position as observed by the one or more cameras as an end position.
21. The method according to claim 11, wherein a virtual map is created of an area in which the scraper operates based on information derived from the images captured by the one or more cameras, and wherein by use of a user control interface, first and second positions in the virtual map are defined respectively as end and start positions of the scraper.
22. The method according to claim 21, wherein the virtual map is presented as an image of an entire area of operation of the scraper, said image composed from said information from the one or more cameras.
23. The method according to claim 11, wherein the one or more cameras are mounted such that a combined field of view of the one or more cameras covers the length of the scraping route.
24. A scraper arrangement configured to perform the method according to claim 11.
25. A user-interface for performing the method according to claim 11.
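The control loop claimed in claim 11 can be illustrated with a minimal sketch. All names here (`Observation`, `control_step`, the action strings) are hypothetical stand-ins for the claimed steps, assuming the image processing yields a per-frame position estimate and, when relevant, a distance to the nearest object in the route:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    # Scraper position along the scraping route (e.g. metres from the
    # start position), as derived by image processing from camera frames.
    position: float
    # Distance to the nearest object detected within the scraping route,
    # or None when the route ahead is clear.
    obstacle_distance: Optional[float] = None

def control_step(obs: Observation, end_position: float,
                 safety_distance: float) -> str:
    """One iteration of the claimed loop: determine whether the obtained
    information fulfils a criterion and, if so, trigger an action."""
    # Criterion: the scraper has reached a predefined (end) position.
    if obs.position >= end_position:
        return "reverse"      # stop forward motion, begin reversing (cf. claim 17)
    # Criterion: the scraper is in a predefined proximity situation.
    if obs.obstacle_distance is not None and obs.obstacle_distance < safety_distance:
        return "slow_down"    # adjust speed to avoid a collision (cf. claim 13)
    return "continue"
```

A real implementation would of course run this check repeatedly against the camera stream and map the returned action onto driving-mechanism commands; the sketch only shows the criterion/action structure of the claim.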
Description
[0031] Exemplifying Embodiments of Control Unit (Processing Unit)
[0032] An exemplifying embodiment of a control unit is illustrated in a general manner in
[0033] The control unit may be comprised in a system controller in a barn, or may be comprised in, or provided as an add-on module (additional functionality) to, one of the cameras. Such a module could alternatively be external to the one or more cameras and/or other central control equipment. For example, the control unit could be part of a central system or arrangement for controlling a plurality of items of barn equipment. The control unit may alternatively be denoted e.g. control device or processing unit. The communication between the control unit and other entities may be performed over a state-of-the-art wireless and/or wired interface. The control unit 500 is configured to perform the actions of at least one of the method embodiments described above, and is associated with the same technical features, objects and advantages as the previously described method embodiments. The control unit will therefore be described only in brief, in order to avoid unnecessary repetition.
[0034] The control unit may be implemented and/or described as follows:
[0035] The control unit 500 comprises processing circuitry 501 and a communication interface 502. The processing circuitry 501 is configured to cause the control unit 500 to obtain information from other entities, such as one or more cameras. The processing circuitry 501 is further configured to cause the control unit 500 to trigger an action, such as an adjustment of the speed or operation of the scraper, based on the obtained information. The communication interface 502, which may also be denoted e.g. Input/Output (I/O) interface, includes a wired and/or a wireless interface for sending data, such as commands, to other nodes or entities, and for obtaining/receiving information from other nodes or entities, such as sensors or user devices.
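The split described in [0035] between processing circuitry 501 and communication interface 502 might be sketched as follows. The class and method names are hypothetical and only mirror the roles named in the text (obtaining information, evaluating it, and sending commands), not any actual implementation:

```python
class CommunicationInterface:
    """Stand-in for the wired/wireless I/O interface 502; the concrete
    transport (wired bus, radio link, etc.) is left open, as in the text."""
    def __init__(self):
        self.sent = []      # commands sent to other nodes or entities
        self.inbox = []     # information obtained from other entities
    def receive(self):
        # Obtain information from other nodes, e.g. camera image data.
        return self.inbox.pop(0) if self.inbox else None
    def send(self, command):
        # Send data, such as commands, to other nodes or entities.
        self.sent.append(command)

class ControlUnit:
    """Sketch of control unit 500: the processing circuitry causes the
    unit to obtain information and to trigger actions via interface 502."""
    def __init__(self, interface, decide):
        self.interface = interface   # communication interface 502
        self.decide = decide         # decision logic of circuitry 501
    def run_once(self):
        info = self.interface.receive()    # obtain information (e.g. cameras)
        action = self.decide(info)         # determine whether a criterion is met
        if action is not None:
            self.interface.send(action)    # trigger an action of the scraper
        return action
```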
[0037] An alternative implementation of the processing circuitry 501 is shown in
[0038] The processing circuitry 501 could comprise further units configured to cause the control unit to perform actions associated with one or more of the method embodiments described herein. As an example, unit 508 is provided, shown with a dashed outline. Alternatively, any of the units 507, 509-510 could be configured to also cause the control unit to perform such other actions. The control unit 500 could, for example, comprise a determining unit for determining whether the scraper arrangement is set in a specific mode implicating certain features. The control unit 500 could further comprise an image analysis and/or object recognition unit 508, for detecting an object, a position of an object and/or a predefined scenario in at least one image captured by the one or more cameras. These and other tasks could alternatively be performed by one of the other units.
[0039] The control unit 500 may comprise further functionality, for carrying out control unit functions not specifically mentioned herein, related e.g. to standard operation of the scraper arrangement.
[0040] The foregoing description of a control unit 500 is not intended to be limiting. The processing circuitry may also be implemented by other techniques known in the art, such as, e.g., hard-wired transistor logic or application-specific integrated circuits arranged in a manner sufficient to carry out the actions of the control unit 500 as described above.
[0043] The features of embodiments described herein could be used separately or in combination, e.g. depending on needs or preference. To summarize some of the elements and features of the described subject matter:
EXEMPLIFYING ELEMENTS AND FEATURES ASSOCIATED WITH EMBODIMENTS
[0044] One or more cameras, operable to at least one of:
[0045] capture two- or three-dimensional images of an area;
[0046] provide information related to the captured images to other entities;
[0047] perform image processing and/or object recognition.
[0048] One or more scrapers, operable to be pulled (or pushed) along a route of operation in order to clear e.g. manure from a floor.
[0049] A scraper driving mechanism, operable to at least one of:
[0050] pull (and/or push) the scraper along a route of operation;
[0051] reverse the scraper to its start position;
[0052] change the speed and/or force with which the scraper is pulled;
[0053] stop the scraper in response to a command, sensor input, or similar;
[0054] receive input from a processing unit, the input being based on information derived from images captured by one or more cameras overlooking e.g. at least part of an area of operation of the scraper;
[0055] provide information on characteristics of a motor, such as an applied force over time and/or a resistance felt by the scraper, to other entities;
[0056] estimate a position of the scraper along its route of operation based on characteristics of the driving mechanism, such as a determined/logged operation of a motor or a determined/logged cable/wire distance;
[0057] provide estimated information to other entities, such as e.g. a processing unit associated with the one or more cameras;
[0058] calibrate the position of the scraper, e.g. by correcting the estimated position based on information (based on camera input) received from a processing unit, or by shortening of a pulling wire (or similar/corresponding) or other adjustment to the pulling (or pushing) mechanism.
[0059] The scraper driving mechanism could comprise or be connected to a control unit with processing circuitry, operable to handle e.g. wired or wireless communication with other entities, estimation and/or logging operations, and control functions.
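The position estimation and calibration capabilities of the driving mechanism ([0056]–[0058], cf. claim 18) might look like the following sketch, assuming the drive logs the paid-out wire length and the cameras occasionally report an observed position. The function names and the `slip_factor` model are hypothetical; the source only states that the estimate is corrected by the determined difference:

```python
def estimated_position(wire_run_length: float, slip_factor: float = 1.0) -> float:
    """Dead-reckoned scraper position from the logged wire run length.
    slip_factor models accumulated wire elongation/slippage (1.0 = none)."""
    return wire_run_length * slip_factor

def calibrate(wire_run_length: float, observed_position: float,
              slip_factor: float = 1.0):
    """Compare the estimate with the camera-observed position, determine
    the difference, and return (corrected_position, updated_slip_factor)
    so the driving mechanism can compensate for elongation or slippage."""
    estimate = estimated_position(wire_run_length, slip_factor)
    difference = observed_position - estimate
    corrected = estimate + difference        # equals the observed position
    if wire_run_length:
        # Update the model so future dead reckoning tracks the cameras.
        slip_factor = observed_position / wire_run_length
    return corrected, slip_factor
```

In the same spirit, the determined difference could drive the shortening/stretching mechanism of claim 19 instead of (or in addition to) updating the software estimate.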
[0060] A processing unit, operable to at least one of:
[0061] obtain information from one or more cameras related to images captured by said one or more cameras;
[0062] obtain information associated with a scraper driving mechanism, related e.g. to an estimated position of a scraper;
[0063] perform image processing and/or object recognition;
[0064] trigger actions to be performed by a scraper driving mechanism;
[0065] provide information to a scraper driving mechanism (e.g. a processing unit associated therewith);
[0066] provide information or alert signals to a user interface;
[0067] obtain information from a user interface, e.g. for configuration/initialization of scraper positions;
[0068] define or adjust a schedule for operation of a scraper based on information obtained from one or more cameras;
[0069] initialize a scraper arrangement, e.g. by defining start and end positions, based on information obtained from one or more cameras and/or on information obtained from a user interface.
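The initialization mentioned in [0069] (cf. claim 20) could be sketched as follows: the scraper is placed in a first position, the camera-observed position is registered as the start, the scraper is moved, and the next observed position is registered as the end. `observed_positions` is a hypothetical iterable standing in for the positions reported by the image processing:

```python
def initialize_route(observed_positions):
    """Register camera-observed start and end positions (cf. claim 20):
    the first observation, taken while the scraper sits in its first
    position, becomes the start; the second, taken after the scraper has
    been moved to a second position, becomes the end."""
    it = iter(observed_positions)
    start = next(it)   # scraper placed in a first position -> start position
    end = next(it)     # scraper moved to a second position -> end position
    return {"start": start, "end": end}
```

Claim 21 describes the complementary route: picking the two positions directly in a camera-derived virtual map through a user control interface rather than by physically moving the scraper.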
[0070] To further summarize, the steps, functions, procedures, modules, units and/or blocks described herein may be implemented in hardware using any conventional technology, such as discrete circuit or integrated circuit technology, including both general-purpose electronic circuitry and application-specific circuitry.
[0071] Alternatively, at least some of the steps, functions, procedures, modules, units and/or blocks described above may be implemented in software such as a computer program for execution by suitable processing circuitry including one or more processing units. The software could be carried by a carrier, such as an electronic signal, an optical signal, a radio signal, or a computer readable storage medium before and/or during the use of the computer program in the nodes.
[0072] The flow diagram or diagrams presented herein may be regarded as a computer flow diagram or diagrams, when performed by one or more processors. A corresponding apparatus may be defined as a group of function modules, where each step performed by the processor corresponds to a function module. In this case, the function modules are implemented as a computer program running on the processor.
[0073] It should also be understood that it may be possible to re-use the general processing capabilities of any conventional device or unit in which the proposed technology is implemented. It may also be possible to re-use existing software, e.g. by reprogramming of the existing software or by adding new software components.
[0074] The embodiments described above are merely given as examples, and it should be understood that the proposed technology is not limited thereto. It will be understood by those skilled in the art that various modifications, combinations and changes may be made to the embodiments without departing from the present scope. In particular, different part solutions in the different embodiments can be combined in other configurations, where technically possible.
[0075] When used herein, the words "comprise" or "comprising" shall be interpreted as non-limiting, i.e. as meaning "consist at least of".
[0076] It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts.
[0077] It is to be understood that the choice of interacting units, as well as the naming of the units within this disclosure, are only for exemplifying purposes, and nodes suitable to execute any of the methods described above may be configured in a plurality of alternative ways in order to be able to execute the suggested procedure actions.
[0078] It should also be noted that the units described in this disclosure are to be regarded as logical entities and not with necessity as separate physical entities.