SYSTEM AND COMPUTER-IMPLEMENTED METHOD FOR IMAGE DATA QUALITY ASSURANCE IN AN INSTALLATION ARRANGED TO PERFORM ANIMAL-RELATED ACTIONS, COMPUTER PROGRAM AND NON-VOLATILE DATA CARRIER
20230252615 · 2023-08-10
Inventors
CPC classification
A01J5/007
HUMAN NECESSITIES
G02B27/0006
PHYSICS
International classification
G06V10/75
PHYSICS
Abstract
An imaging system registers image data (D.sub.img) in connection with an installation performing at least one action relating to an animal. A system for image data quality assurance contains a control unit and a digital storage unit. The control unit obtains image data (D.sub.img) registered by the imaging system when the installation is in an idle mode. The control unit analyzes the obtained image data (D.sub.img) to determine if a cleaning action to remove dirt (D) from a front window of the imaging system has been performed. If it is determined that such a cleaning action has been performed, the control unit (120) causes a point in time for the cleaning action to be recorded in the digital storage unit (130) for use in performance tracking of the installation in conjunction with the cleaning actions.
Claims
1. A system for image data quality assurance of an installation arranged to perform at least one action relating to an animal (100), the system comprising: a control unit (120) and a digital storage unit (130) communicatively connected to the control unit (120), the control unit (120) being configured to: obtain image data (D.sub.img) registered by an imaging system (110) of the installation when the installation is in an idle mode; analyze the obtained image data (D.sub.img) to determine if a cleaning action to remove dirt (D) from a front window (111) of the imaging system (110) has been performed; and if it is determined that the cleaning action has been performed, cause a point in time (t.sub.i, t.sub.1, t.sub.2, t.sub.3, t.sub.4) for the cleaning action to be recorded in the digital storage unit (130).
2. The system according to claim 1, wherein the control unit (120) is configured to determine that said cleaning action has been performed manually outside of a schedule for automatic cleaning of the front window (111).
3. The system according to claim 1, wherein the control unit (120) is configured to determine that the cleaning action is performed based on at least one of: image data (D.sub.img) estimated to represent a first object (201) covering at least a predefined portion of a total image area (A) registrable by the imaging system (110) through the front window (111), which first object (201) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; image data (D.sub.img) estimated to represent a second object (202) located within a distance range (d.sub.R) from the front window (111), which second object (202) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; image data (D.sub.img) designating an intensity of light which is estimated to be reflected from a third object that moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; and image data (D.sub.img) estimated to represent a fourth object (204) having a shape that matches an outline of a predefined cleaning tool to a degree above a threshold matching level, which fourth object (204) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom.
4. The system according to claim 1, wherein the control unit (120) is configured to determine that the cleaning action is performed based on a confidence measure indicating a reliability of the image data (D.sub.img).
5. The system according to claim 1, wherein the control unit is configured to analyze the obtained image data (D.sub.img) by executing an image processing algorithm adapted to detect an amount of dirt on the front window (111).
6. The system according to claim 5, wherein the control unit is further configured to: execute the image processing algorithm at first and second points in time, the second point in time being later than the first point in time; and if the image processing algorithm detects that the amount of dirt on the front window (111) is lower at the second point in time than at the first point in time, cause the second point in time to be recorded in the digital storage unit (130) as a point in time when the cleaning action was performed.
7. The system according to claim 3, wherein the control unit (120) is further configured to determine that the cleaning action is performed based on a duration of a period during which at least one of the first, second, third and fourth objects (201, 202, 204) is located within a view range of the imaging system (110).
8. The system according to claim 1, wherein the control unit (120) is configured to: obtain, from the digital storage unit (130), a series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}) for the cleaning action; and generate a report (R) based on said series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}), which report (R) describes cleaning activity performed to remove dirt (D) from the front window (111) of the imaging system (110).
9. The system according to claim 8, wherein the control unit (120) is further configured to forward the report (R) to a display unit for presentation of the report (R) together with information reflecting a performance of said installation.
10. The system according to claim 9, wherein the performance reflects at least a number of animals (100) served by said installation per unit of time during at least one period in relation to at least one point in time in said series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}).
11. A computer-implemented method of quality assuring image data of an installation arranged to perform at least one action relating to an animal (100), the method comprising: obtaining image data (D.sub.img) having been registered by an imaging system (110) of said installation when the installation is in an idle mode; analyzing the obtained image data (D.sub.img) to determine if a cleaning action to remove dirt (D) from a front window (111) of the imaging system (110) has been performed; and if it is determined that such a cleaning action has been performed, causing a point in time (t.sub.i, t.sub.1, t.sub.2, t.sub.3, t.sub.4) for the cleaning action to be recorded in a digital storage unit (130).
12. The method according to claim 11, wherein said cleaning action is determined to have been performed manually outside of a schedule for automatic cleaning of the front window (111).
13. The method according to claim 11, wherein the cleaning action is determined to be performed based on at least one of: image data (D.sub.img) estimated to represent a first object (201) covering at least a predefined portion of a total image area (A) registrable by the imaging system (110) through the front window (111), which first object (201) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; image data (D.sub.img) estimated to represent a second object (202) located within a distance range (d.sub.R) from the front window (111), which second object (202) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; image data (D.sub.img) designating an intensity of light which is estimated to be reflected from a third object that moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom; and image data (D.sub.img) estimated to represent a fourth object (204) having a shape that matches an outline of a predefined cleaning tool to a degree above a threshold matching level, which fourth object (204) moves (M) and/or is positioned relative to the front window (111) in a way deemed to cause dirt (D) to be removed therefrom.
14. The method according to claim 11, wherein the cleaning action is determined to be performed based on a confidence measure indicating a reliability of the image data (D.sub.img).
15. The method according to claim 11, wherein the analyzing of the obtained image data (D.sub.img) involves executing an image processing algorithm adapted to detect an amount of dirt on the front window (111).
16. The method according to claim 15, wherein the method comprises: executing the image processing algorithm at first and second points in time, the second point in time being later than the first point in time; and if the image processing algorithm detects that the amount of dirt on the front window (111) is lower at the second point in time than at the first point in time, causing the second point in time to be recorded in the digital storage unit (130) as a point in time when the cleaning action was performed.
17. The method according to claim 11, further comprising determining that the cleaning action is performed based on a duration of a period during which at least one of the first, second, third and fourth objects (201, 202, 204) is located within a view range of the imaging system (110).
18. The method according to claim 11, further comprising: obtaining, from the digital storage unit (130), a series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}) for the cleaning action; and generating a report (R) based on said series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}), which report (R) describes manual cleaning activity performed to remove dirt (D) from the front window (111) of the imaging system (110).
19. The method according to claim 18, further comprising: forwarding the report (R) to a display unit for presentation of the report (R) together with information reflecting a performance of said installation.
20. The method according to claim 19, wherein the performance reflects at least a number of animals (100) served by said installation per unit of time during at least one period in relation to at least one point in time in said series of recorded points in time ({t.sub.1, t.sub.2, t.sub.3, t.sub.4}).
21. A non-volatile data carrier (626) containing a computer program (627), the non-volatile data carrier (626) being communicatively connected to a processing unit (625), the computer program (627) comprising software for executing the method according to claim 11 when the computer program (627) is run on the processing unit (625).
22. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The invention is now to be explained more closely by means of preferred embodiments, which are disclosed as examples, and with reference to the attached drawings.
[0022]
[0023]
[0024]
[0025]
[0026]
DETAILED DESCRIPTION
[0027] In
[0028] The imaging system 110 is configured to forward the image data D.sub.img to a control unit 120, which is further communicatively connected to a digital storage unit 130. Here, the control unit 120 and the storage unit 130 form part of a system for image data quality assurance according to one embodiment of the invention.
[0029] The control unit 120 is configured to obtain the image data D.sub.img registered by the imaging system 110 of the installation when the installation is in an idle mode, i.e. when the imaging system 110 is not engaged in registering image data D.sub.img to be used for monitoring and/or controlling a device in the installation, such as a milking robot.
[0030] The control unit 120 is configured to analyze the obtained image data D.sub.img to determine if a cleaning action is performed, which cleaning action aims at removing dirt D from a front window 111 of the imaging system 110. The front window 111 preferably denotes a protective glass in front of the imaging system's 110 lens arrangement. However, in the absence of such a protective glass, the frontmost lens of the lens arrangement itself constitutes the front window 111. If it is determined that said cleaning action is performed, the control unit 120 is configured to cause a point in time t.sub.i for the cleaning action to be recorded in the digital storage unit 130.
[0031] Although the detected cleaning action may be carried out automatically as part of a programmed, or predefined, cleaning routine for the imaging system 110, it is preferably presumed that the detected cleaning action is performed manually outside of a schedule for automatic cleaning of the front window 111. For example, the detected cleaning action may instead be carried out by an operator who is wiping the front window 111 with a piece of cloth, a squeegee or a similar tool.
[0032]
[0033]
[0034] As yet another example (not illustrated), a third object may be used to perform a cleaning action aiming at removing dirt D from the front window 111 of the imaging system 110. According to one embodiment of the invention, the control unit 120 is configured to determine that the cleaning action is performed based on image data D.sub.img designating an intensity of light which is estimated to be reflected from the third object that, regardless of its shape and distance to the front window 111, moves M and/or is positioned relative to the front window 111 in such a way that it is deemed to cause dirt D to be removed from the front window 111. Thus, the third object may be any kind of object suitable for cleaning the front window 111.
[0035]
[0036] Further, the control unit 120 may be configured to determine that the cleaning action is performed based on a duration of a period during which at least one of the first, second, third and/or fourth objects is located within a view range of the imaging system 110.
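Purely as a non-limiting illustration, the object-based detection criteria discussed above, combined with the duration condition of paragraph [0036], could be sketched as follows. All function names, parameter names and threshold values are hypothetical and do not form part of the claimed subject matter:

```python
def cleaning_action_detected(area_fraction, distance_mm, intensity_reflected,
                             shape_match, visible_seconds,
                             min_area=0.5, max_distance_mm=100.0,
                             min_intensity=0.7, min_match=0.8,
                             min_duration_s=1.0):
    """Illustrative combination of the detection criteria: evidence from any
    one of the four hypothetical object criteria, gated by the duration
    during which the object stays within the view range."""
    object_criterion = (
        area_fraction >= min_area                 # first object: covers a
                                                  # predefined image portion
        or distance_mm <= max_distance_mm         # second object: within a
                                                  # distance range d_R
        or intensity_reflected >= min_intensity   # third object: reflected
                                                  # light intensity
        or shape_match >= min_match               # fourth object: matches a
                                                  # cleaning-tool outline
    )
    return object_criterion and visible_seconds >= min_duration_s
```

For instance, an object covering most of the image area for two seconds would be taken as a cleaning action, whereas the same object glimpsed for a fraction of a second would not.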
[0037] According to one embodiment of the invention, the control unit 120 is configured to determine that the cleaning action is performed based on a confidence measure indicating a reliability of the image data D.sub.img. The confidence measure may be represented by one or more data bits associated with the image data D.sub.img, for example with each pixel thereof, which designate how reliable said image data D.sub.img is. Thus, the control unit 120 may weigh a reliability factor into the analysis of the image data D.sub.img, and consequently rely on this data in proportion to its information value.
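The confidence-weighted analysis of paragraph [0037] could, as a non-limiting sketch, be expressed as a weighted average in which each pixel's dirt score contributes in proportion to its reliability. The function name and the [0, 1] value scales are hypothetical:

```python
def weighted_dirt_estimate(pixel_dirt, pixel_confidence):
    """Combine per-pixel dirt scores with per-pixel confidence values so that
    unreliable pixels contribute less to the overall estimate. Both inputs
    are equally long sequences of values in [0, 1]."""
    total_conf = sum(pixel_confidence)
    if total_conf == 0:
        # No reliable information at all: report no detectable dirt.
        return 0.0
    return sum(d * c for d, c in zip(pixel_dirt, pixel_confidence)) / total_conf
```

A pixel with confidence zero is thereby ignored entirely, while fully confident pixels are relied upon at face value.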
[0038] According to one embodiment of the invention, the control unit 120 is configured to analyze the obtained image data D.sub.img by executing an image processing algorithm, which is adapted to detect an amount of dirt on the front window 111. The image processing algorithm may for example be configured to recognize how particles of dirt D occur on the front window 111.
[0039] Preferably, the control unit 120 is further configured to execute the image processing algorithm at first and second points in time, where the second point in time is later than the first point in time. If the image processing algorithm detects that the amount of dirt D on the front window 111 is lower at the second point in time than at the first point in time, it is reasonable that a cleaning action has been performed sometime in between the first and second points in time. Of course, however, it cannot be deduced exactly when. Therefore, if the image processing algorithm detects that the amount of dirt D on the front window 111 is lower at the second point in time than at the first point in time, the control unit 120 is configured to cause the second point in time to be recorded in the digital storage unit 130 as a point in time when the cleaning action was performed.
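The two-sample logic of paragraphs [0038] and [0039] can be illustrated by the following non-limiting sketch, in which a plain list stands in for the digital storage unit 130 and the dirt amounts are hypothetical outputs of the image processing algorithm:

```python
from datetime import datetime

def record_cleaning_if_dirt_decreased(dirt_t1, dirt_t2, t2, storage):
    """If the detected amount of dirt is lower at the later point in time t2
    than at the earlier one, record t2 in the storage as the point in time
    when the cleaning action was performed."""
    if dirt_t2 < dirt_t1:
        storage.append(t2)
        return True
    return False

# Hypothetical example: the dirt amount dropped between the two executions
# of the image processing algorithm, so t2 is recorded.
storage = []
t2 = datetime(2023, 8, 10, 14, 30)
record_cleaning_if_dirt_decreased(0.8, 0.1, t2, storage)
```

As the description notes, the exact cleaning instant between the two executions cannot be deduced; recording the second point in time is the conservative choice this sketch follows.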
[0040]
[0041]
[0042] According to one embodiment of the invention, the control unit 120 is also configured to obtain, from the digital storage unit 130, a series of recorded points in time, for example {t.sub.1, t.sub.2, t.sub.3, t.sub.4} for the cleaning actions, and generate a report R based on the series of recorded points in time {t.sub.1, t.sub.2, t.sub.3, t.sub.4}. The report R describes the cleaning activity having been performed to remove dirt D from the front window 111 of the imaging system 110, and thus maintain a reasonably clear view for the imaging system 110 over a period of time.
[0043] Preferably, the control unit 120 is further configured to forward the report R to a display unit (not shown) for presentation of the report R together with information reflecting a performance of the installation in which the imaging system 110 is included. For instance, the performance may reflect a number of animals 100 served by the installation per unit of time during at least one period in relation to at least one point in time in said series of recorded points in time {t.sub.1, t.sub.2, t.sub.3, t.sub.4}. This means that the report R may, for example, describe the performance before and after a point in time t.sub.2 when a cleaning action was detected. Thus, the farmer can readily determine the effects of his or her efforts to keep the front window 111 clean, in addition to any automatic cleaning procedures applied for the same purpose.
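As a non-limiting illustration of the report R of paragraphs [0042] and [0043], the sketch below pairs each recorded cleaning time with the mean throughput before and after it. The data structures, including the (timestamp, animals-per-hour) log, are hypothetical:

```python
def generate_report(cleaning_times, throughput_log):
    """Build a simple report: for each recorded cleaning time, the mean
    number of animals served per unit of time before and after that time.
    throughput_log is a list of (timestamp, animals_per_hour) samples."""
    report = []
    for t in cleaning_times:
        before = [v for (ts, v) in throughput_log if ts < t]
        after = [v for (ts, v) in throughput_log if ts >= t]
        report.append({
            "cleaning_time": t,
            "mean_before": sum(before) / len(before) if before else None,
            "mean_after": sum(after) / len(after) if after else None,
        })
    return report
```

A throughput that is markedly higher after a recorded cleaning time than before it would then directly visualize the benefit of keeping the front window 111 clean.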
[0044] It is generally advantageous if the control unit 120 is configured to effect the above-described procedure in an automatic manner by executing a computer program 627. Therefore, the control unit 120 may include a memory unit 625, i.e. a non-volatile data carrier, storing the computer program 627, which, in turn, contains software for making processing circuitry in the form of at least one processor 623 in the control unit 120 execute the above-described actions when the computer program 627 is run on the at least one processor 623.
[0045] To sum up, and with reference to the flow diagram in Figure 7, the general computer-implemented method according to the invention of quality assuring image data in an installation arranged to perform at least one action in relation to an animal 100 will now be described.
[0046] In a first step 710, image data D.sub.img are obtained, which image data D.sub.img have been registered by an imaging system 110 of the installation when the installation is in an idle mode.
[0047] In a subsequent step 720, the obtained image data D.sub.img are analyzed to determine if a cleaning action is performed aimed at removing dirt D from a front window 111 of the imaging system 110. If it is determined that such a cleaning action is performed, a step 730 follows. Otherwise, the procedure loops back to step 710. In step 730, a point in time t.sub.i for the cleaning action is recorded in a digital storage unit. Thereafter, the procedure loops back to step 710.
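The loop formed by steps 710, 720 and 730 can be sketched as follows, purely by way of a non-limiting illustration. The callables standing in for the imaging system, the analysis and the clock, as well as the list standing in for the digital storage unit, are all hypothetical:

```python
def quality_assurance_loop(get_idle_image, cleaning_detected, storage, now,
                           iterations):
    """Sketch of the method: obtain idle-mode image data (step 710), analyze
    it for a cleaning action (step 720) and, if one is detected, record the
    current point in time in the storage unit (step 730); then loop back."""
    for _ in range(iterations):
        img = get_idle_image()          # step 710
        if cleaning_detected(img):      # step 720
            storage.append(now())       # step 730

# Hypothetical demo: three idle-mode frames, of which the second one
# shows a cleaning action being performed.
frames = iter([{"clean": False}, {"clean": True}, {"clean": False}])
storage = []
quality_assurance_loop(
    get_idle_image=lambda: next(frames),
    cleaning_detected=lambda img: img["clean"],
    storage=storage,
    now=lambda: "t_2",
    iterations=3,
)
```

In a deployment, `cleaning_detected` would encapsulate the image analysis of step 720, and `storage` would be the digital storage unit 130.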
[0048] All of the process steps, as well as any sub-sequence of steps, described with reference to
[0049] Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
[0050] The term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components. The term does not preclude the presence or addition of one or more additional elements, features, integers, steps or components or groups thereof. The indefinite article “a” or “an” does not exclude a plurality. In the claims, the word “or” is not to be interpreted as an exclusive or (sometimes referred to as “XOR”). On the contrary, expressions such as “A or B” cover all the cases “A and not B”, “B and not A” and “A and B”, unless otherwise indicated. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
[0051] It is also to be noted that features from the various embodiments described herein may freely be combined, unless it is explicitly stated that such a combination would be unsuitable.
[0052] The invention is not restricted to the described embodiments in the figures, but may be varied freely within the scope of the claims.