
20200074613 · 2020-03-05


    Abstract

    Image acquisition for meat grading. In one embodiment, a method includes aligning a suspended slaughtered animal half according to a predetermined orientation and retaining this orientation by a positioning device designed as part of a processing line until image acquisition of a cut surface; executing a 3D measurement and setting a trigger point in a control device, wherein the position of the cut surface is determined by an image processing device from a three-dimensional image recorded by a first image acquisition device; aligning the optical components of a second image acquisition device in accordance with the position in space determined for the cut surface; and trigger-initiated execution of at least one two-dimensional image acquisition of the cut surface by the second image acquisition device, the optical components of which are aligned beforehand, when a trigger condition coupled to the trigger point is met, as identified by the above control device.

    Claims

    1. A method for image acquisition, in accordance with which, at a suspended slaughtered animal half moving along a tube track through a processing line, for a meat grading by way of an electronic image processing device, at least one subregion of a cut surface produced previously at the slaughtered animal half undergoes image acquisition, comprising the steps: a. alignment of the suspended slaughtered animal half in accordance with a predetermined orientation and retention of this orientation by means of a positioning device, which is designed as a part of the processing line, until the conclusion of the image acquisition of the cut surface or the subregion thereof; b. execution of a 3D measurement and, in association therewith, setting of a trigger point in a control device, wherein, for the slaughtered animal half, the position in space of the cut surface that is to undergo image acquisition for the meat grading is determined by the image processing device from a three-dimensional image recorded by a first image acquisition device; c. alignment of the optical components of a second image acquisition device, which is designed for the image acquisition of the cut surface or of the subregion of the cut surface, in accordance with the position in space determined for the cut surface that is to undergo image acquisition; d. trigger-initiated execution of at least one two-dimensional image acquisition of the cut surface or of the subregion of the cut surface by means of the second image acquisition device, the optical components of which are aligned beforehand, when a trigger condition coupled to the trigger point set in accordance with method step b) is met, as identified by the mentioned control device.

    2. A method for image acquisition, according to which, for a grading of the rib eye of a side of beef of a slaughtered animal by means of an electronic image processing device, a cut surface, namely, the top cut surface or the bottom cut surface of a cut made beforehand between the 12th rib and the 13th rib in the slaughtered animal half, undergoes image acquisition, with said cut not completely severing the slaughtered animal half and running essentially horizontally in regard to the suspended slaughtered animal half at a tube track moving through a processing line, comprising the steps: a. alignment of the suspended slaughtered animal half, furnished with the gaping cut, in accordance with a predetermined orientation and retention of this orientation by means of a positioning device formed as a part of the processing line until the image acquisition of the cut surface has concluded; b. execution of a 3D measurement and, in association therewith, setting of a trigger point in a control device, wherein, for the slaughtered animal half, the position of the cut surface that contains the rib eye and is to undergo image acquisition for the meat grading is determined by the image processing device from a three-dimensional image recorded by a first image acquisition device; c. alignment of the optical components of a second image acquisition device designed for the image acquisition of the cut surface in accordance with the position in space determined for the cut surface that is to undergo image acquisition; d. trigger-initiated execution of at least one two-dimensional image acquisition of the cut surface by means of the second image acquisition device, the optical components of which are aligned beforehand, when a trigger condition coupled to the trigger point set in accordance with method step b) is met, as identified by the mentioned control device.

    3. The method according to claim 2, wherein, in method step d), started by the trigger, a two-dimensional image acquisition of the cut surface by means of the second image acquisition device occurs first and, immediately afterwards, a three-dimensional image acquisition of the cut surface occurs.

    4. The method according to claim 2 in continuous application for a plurality of slaughtered animal halves that are suspended next to one another and move through the processing line, wherein the halves of each carcass of a slaughtered animal are aligned in such a way that the entry sides of the cuts made for producing the cut surface in these two slaughtered animal halves of the respective carcass of a slaughtered animal face each other.

    5. The method according to claim 2 in continuous application for a plurality of slaughtered animal halves that are suspended next to one another and move through the processing line, wherein the halves of each carcass of a slaughtered animal are aligned in such a way that their bone sides face the optical components of the image acquisition devices and the halves of each carcass of a slaughtered animal face each other at the side of the backbone.

    6. The method according to claim 5, wherein an applied force is exerted by the positioning device against the free end of the slaughtered animal half situated below the cut and the slaughtered animal half is aligned at an inclination toward the vertical transversely to the tube track in such a way that the bottom cut surface of the slaughtered animal half is inclined toward the optical components of the two image acquisition devices, whereas the top cut surface is aligned at an inclination away from the optical components of the image acquisition devices.

    7. The method according to claim 2, wherein, as a positioning device, at least one vertically arranged conveyor belt, which is brought into contact with the slaughtered animal half to be aligned in each case, is used, the conveying direction of which corresponds to the direction of movement of the slaughtered animal half in the processing line and which is equipped with an incremental measuring wheel encoder, wherein, for setting a trigger point upon the conclusion of the positional determination for the cut surface that is to be evaluated, a counter counting the revolutions of the measuring wheel of said encoder is set to zero, and wherein the trigger condition for starting the subsequent image acquisition of the cut surface that is to be evaluated by means of the second image acquisition device involves a path distance that is to be traversed by the vertically arranged conveyor belt after the setting of the trigger point and is coded by a number of revolutions of the measuring wheel of the encoder stored in the control device.

    8. A system for image acquisition of a cut surface, which has, as a bottom cut surface or as a top cut surface of a slaughtered animal suspended along a tube track and moving through a processing line, a horizontally running cut, which does not completely sever the slaughtered animal half, for the purpose of the meat grading, wherein the system has a control device as well as an image processing device, a positioning device for alignment of the suspended slaughtered animal half in accordance with a predetermined orientation for image acquisition of the cut surface, a first image acquisition device for recording a three-dimensional image for subsequently determining, in interaction with the image processing device in the course of a 3D measurement, the position in space of the cut surface that is to undergo image acquisition for the meat grading, with means interacting with the control device for setting a trigger point at the conclusion of the 3D measurement for determining the position of the cut surface, a second image acquisition device, which is aligned by the control device in regard to the optical components thereof in accordance with the position determined for the cut surface, for an acquisition, started by trigger control by means of the control device, of at least one image of the cut surface of the slaughtered animal half that is to be analyzed by the image processing device for the meat grading.

    9. The system according to claim 8, wherein the positioning device is arranged for alignment and guiding of the suspended slaughtered animal half by at least one vertically arranged conveyor belt that is brought into contact with the slaughtered animal, the conveying direction of which is identical to the direction of movement of the slaughtered animal half in the processing line and the top edge of which extends below the bottom cut surface produced by the cut at the slaughtered animal half.

    10. The system according to claim 9, further characterized in that the conveying speed of the at least one vertically arranged conveyor belt forming the positioning device corresponds to the speed at which the slaughtered animal half is moved along the tube track through the processing line.

    11. The system according to claim 9, wherein the at least one vertically arranged conveyor belt forming the positioning device is equipped with an incremental measuring wheel encoder, to which is assigned a counter in the control device interacting with it, said counter being set to zero for setting a trigger point, and wherein the aforementioned counter is incremented with each revolution of the measuring wheel of the measuring wheel encoder and is compared by the control device with a value stored in the control device as a trigger condition, which codes a path distance traversed by the at least one vertically arranged conveyor belt since the setting of the trigger point.

    12. The system according to claim 9, wherein the at least one vertical conveyor belt of the positioning device is arranged in a common vertical plane with the tube track at which the slaughtered animal half is moved through the processing line.

    13. The system according to claim 9, wherein the positioning device comprises a second vertically arranged conveyor belt that is brought into contact with the slaughtered animal half, the bottom edge of which extends above the top cut surface produced by the cut at the slaughtered animal half and the conveying speed of which is equal to the conveying speed of the other vertically arranged conveyor belt of the positioning device.

    14. The system according to claim 13, wherein the two conveying belts of the positioning device have a common drive.

    15. The system according to claim 8, wherein the optical components of the second image acquisition device serving for the actual acquisition of the cut surface produced at a slaughtered animal half for the purpose of the meat grading are arranged at an arm of a robot.

    16. The system according to claim 15 wherein the optical components of the second image acquisition device are arranged jointly with the optical components of the first image acquisition device at the arm of a robot.

    17. The system according to claim 8, wherein the first image acquisition device is a laser scanner.

    18. The system according to claim 8, wherein the second image acquisition device is designed for two-dimensional and three-dimensional image acquisition of the cut surface that is to undergo image acquisition for the meat grading.

    19. The system according to claim 18, wherein the second image acquisition device comprises a 2D camera and a laser that projects a plurality of parallel light strips for application of the light section method onto each cut surface that is to undergo image acquisition.

    20. The system according to claim 8, wherein at least one of the two image acquisition devices comprises a stereo camera.

    21. The system according to claim 8, wherein at least one of the two image acquisition devices comprises a TOF camera, that is, a camera operating according to the time-of-flight principle.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0047] FIG. 1: a plan view of a possible embodiment of the system in a schematic illustration,

    [0048] FIG. 2: the embodiment in accordance with FIG. 1 in a side view.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0049] FIG. 1 shows a possible embodiment of the system according to the invention in a strongly schematic illustration. The drawing shows, in plan view, key components of the system in relation to a section of the tube track 2 of a meat processing line; for illustration, four slaughtered animal halves 1, that is, the sides of beef of two slaughtered-animal carcasses, are also shown suspended at the aforementioned tube track as they pass the system during their movement through the processing line.

    [0050] In accordance herewith, the system essentially comprises a positioning device 4, a first image acquisition device 5, and a second image acquisition device 6 as well as an image processing device 8 and a control device 7, with the two last-mentioned devices forming a common unit in the example shown. In the example, the positioning device 4 is a vertically arranged conveyor belt, the conveying direction and conveying speed of which correspond to the direction of movement 12 and the speed with which the slaughtered animal halves 1 are moved through the processing line. The aforementioned devices and units of the system can be connected by wire or wirelessly in order to exchange data and control commands and can accordingly be brought into operative connection.

    [0051] By means of the positioning device 4, that is, by means of the conveyor belt brought into contact with the slaughtered animal halves 1 suspended at the tube track, the slaughtered animal halves 1 are aligned in such a way that their respective bone side 10 faces the optical components (not individually shown) of the image acquisition devices, namely, the first image acquisition device 5 and the second image acquisition device 6. The vertically installed conveyor belt of the positioning device 4 is arranged together with the tube track 2 nearly in a common vertical plane. As a result, the slaughtered animal halves 1 suspended at the tube track 2 are tilted slightly away from the vertical by the positioning device 4 (the conveyor belt) in the direction of the arrow 13: the free bottom end of each suspended slaughtered animal half 1 is pushed by the conveyor belt of the positioning device 4 away from the image acquisition devices 5, 6. In this way, the bottom cut surface 3 produced by making a cut in each slaughtered animal half (not seen here, but shown in FIG. 2) is inclined toward the optical components of the image acquisition devices 5, 6.

    [0052] By means of the first image acquisition device 5, in interaction with the image processing device 8, the position in space of the cut made in each of the slaughtered animal halves 1 (not seen here), and hence of the cut surface 3 taken for the meat grading, is determined through a 3D measurement. Once the position of this cut surface 3 has been determined for a respective slaughtered animal half 1, a trigger point is set in the control device 7 in that a counter, which is assigned to a measuring wheel encoder 9 provided at the vertically arranged conveyor belt of the positioning device 4, is set to the value zero. In the course of the further movement of the conveyor belt and, accordingly, of the related slaughtered animal half 1 in contact with the conveyor belt of the positioning device 4 along the tube track 2, the aforementioned counter contained in the control device 7 is incremented with each revolution of the measuring wheel of the measuring wheel encoder 9. The counter value obtained in each case is continuously compared with a value stored as a trigger condition in the control device, said value coding the path distance that the conveyor belt of the positioning device 4, and accordingly the slaughtered animal half in contact with it, must traverse after the trigger point has been set (after the counter assigned to the measuring wheel has been zeroed) until the cut surface 3 made in the slaughtered animal half 1 for the meat grading can undergo reliable image acquisition by means of the second image acquisition device 6.
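    The zero-set-and-compare trigger logic described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; the class and method names, and driving the counter by individual revolution events, are assumptions made for clarity.

```python
class TriggerController:
    """Sketch of the trigger mechanism: a counter assigned to the
    measuring wheel encoder is zeroed at the trigger point and the
    trigger fires once a stored number of revolutions (coding the
    path distance to traverse) has elapsed."""

    def __init__(self, trigger_revolutions: int):
        # Stored trigger condition: revolutions of the measuring wheel
        # that code the path distance the conveyor belt must traverse.
        self.trigger_revolutions = trigger_revolutions
        self.counter = None  # None until a trigger point has been set

    def set_trigger_point(self) -> None:
        # Called when the 3D measurement has located the cut surface.
        self.counter = 0

    def on_wheel_revolution(self) -> bool:
        # Called on each revolution of the measuring wheel; returns
        # True exactly when the trigger condition is met.
        if self.counter is None:
            return False
        self.counter += 1
        return self.counter == self.trigger_revolutions
```

    In use, `set_trigger_point()` corresponds to zeroing the encoder counter at the conclusion of the positional determination, and the `True` return value corresponds to starting the image acquisition by the second image acquisition device.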

    [0053] During the further movement of the conveyor belt and of the slaughtered animal half in contact with it, the optical components (not shown here) of the second image acquisition device 6 are moreover positioned in space in accordance with the position determined for the cut surface that is to undergo image acquisition for the meat grading by use of the first image acquisition device 5. The positioning of these optical components required for this, that is, any change in their position along the coordinate of the direction of movement 12 in the processing line, is taken into consideration in the comparison value representing the trigger condition for the counter of the measuring wheel encoder 9.

    [0054] When the trigger condition is met, that is, after the conveyor belt of the positioning device 4 and, together with it, the slaughtered animal half 1 to be evaluated have traversed a predetermined, typically short distance within the processing line, the image acquisition of the cut surface 3 produced in this slaughtered animal half 1 by means of the second image acquisition device 6 is triggered. The image or images recorded by the second image acquisition device 6 are analyzed by the image processing device 8 and software running on it, and statements in regard to meat quality, meat yield, or both are derived from this analysis. In this way, for example, statements about the meat quality and the meat yield are made by way of the image acquisition of a cut surface 3 that was produced beforehand by making a cut between the 12th rib and the 13th rib in a side of beef of a slaughtered animal and that includes the rib eye.

    [0055] In FIG. 2, the embodiment of the system in accordance with FIG. 1 is shown once again in a somewhat more detailed, though still strongly schematic, side view. In this illustration, in particular, the outer edges of the cut surfaces 3 produced by making a cut in the slaughtered animal halves 1 can also be seen. By means of the first image acquisition device 5, a laser scan of a respective slaughtered animal half 1 is carried out, by way of which, in interaction with the image processing device 8, the position of the cut surface 3 produced at the slaughtered animal half 1 and containing the rib eye is calculated in three-dimensional space. In accordance with the outcome of this calculation, the optical components of the second image acquisition device 6 are aligned for the subsequent acquisition of at least one two-dimensional image of the cut surface 3 on which the meat grading is based. The optical components of the second image acquisition device 6 are, for example, a 2D camera and a laser.
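    The patent does not specify how the position of the roughly planar cut surface is computed from the laser scan. One common approach, shown here purely as an assumed illustration, is to fit a plane to the scanned 3D points by a singular value decomposition of the centered point cloud; the resulting centroid and normal describe the cut surface's pose for aligning the second image acquisition device.

```python
import numpy as np


def fit_cut_surface_plane(points: np.ndarray):
    """Fit a plane to 3D points sampled on a (roughly planar) cut surface.

    points: (N, 3) array of scanner points on the cut surface.
    Returns (centroid, unit_normal) of the least-squares plane: the
    direction of least variance of the centered points is the normal.
    """
    centroid = points.mean(axis=0)
    # Rows of vt are principal directions, ordered by singular value;
    # the last row spans the direction of least variance (the normal).
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

    The sign of the normal is ambiguous in this sketch; a real system would orient it toward the cameras, e.g. by comparing it with the viewing direction of the image acquisition devices.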

    [0056] After the trigger-controlled start of the image acquisition of the cut surface 3 by the second image acquisition device 6, a two-dimensional image thereof is initially acquired by means of the 2D camera. Immediately following this, a three-dimensional image acquisition of the cut surface 3 occurs. To this end, parallel lines are projected onto the cut surface by means of the laser; by use of these lines, in interaction with the image processing device 8, a three-dimensional image of the cut surface 3 is obtained from the images acquired with the 2D camera of the second image acquisition device 6 in accordance with the light section method. Through analysis of the color variation in the two-dimensional image or images acquired from the cut surface by means of the image processing device 8, it is then possible to make statements in regard to the marbling of the rib eye contained in the cut surface 3, the fat/meat ratio, and the total meat yield for the rib eye in question.
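    As a simplified sketch of the kind of color analysis mentioned above, pixels of the rib-eye region can be classified as fat (near-white marbling) or lean meat (dark red) and the fat/meat ratio reported. Production grading software uses calibrated color models; the simple per-channel brightness threshold here is an illustrative assumption, not the patent's method.

```python
def fat_meat_ratio(pixels, fat_threshold=200):
    """Classify rib-eye pixels as fat or lean and return the fat/meat ratio.

    pixels: iterable of (r, g, b) tuples from the rib-eye region.
    A pixel counts as fat when all three channels exceed the threshold
    (near-white marbling); otherwise it counts as lean meat.
    """
    fat = meat = 0
    for r, g, b in pixels:
        if r > fat_threshold and g > fat_threshold and b > fat_threshold:
            fat += 1
        else:
            meat += 1
    # Guard against an (unrealistic) all-fat region.
    return fat / meat if meat else float("inf")
```

    The same pixel counts can also feed a marbling score or a meat-yield estimate once the rib-eye contour and the image scale (from the 3D measurement) are known.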

    [0057] The 3D image or images of the cut surface 3 obtained, subsequently to the two-dimensional acquisition, by means of the laser-based light section method is or are used for any corrections of the results of the two-dimensional acquisition of the cut surface 3 that may be required; such corrections can become necessary on account of changes in the position of the cut surface 3 during the movement of the slaughtered animal half 1 between the first image acquisition device 5 and the second image acquisition device 6.

    [0058] In the exemplary embodiment shown in FIGS. 1 and 2, the first image acquisition device 5 and the second image acquisition device 6 are arranged at stations that are separated from one another at the side of the tube track 2. As can be seen in FIG. 2, in spite of the schematic illustration, the optical components of the second image acquisition device 6 are arranged at an arm of a robot 11 that serves for the alignment thereof. The robot arm 11 has six dynamic degrees of freedom and is moved under the control of the control device 7, in accordance with the position data for the cut surface 3 at a respective slaughtered animal half 1 obtained in the course of the 3D measurement at the work station with the first image acquisition device 5, in order to align the optical components of the second image acquisition device 6. The optical components of the first image acquisition device 5, for example a laser scanner, are arranged in a fixed position at a tripod in the exemplary embodiment shown. At this tripod 14, in the course of setting up the system for the positional determination of the cut surfaces 3 of the slaughtered animal halves 1 that pass it, said optical components are aligned once, by means of the described grayscale-value principle, with respect to a locally minimal distance of the cut surfaces 3 serving as a reference line for this purpose.

    [0059] Fundamentally, it is also conceivable to arrange the optical components of the first image acquisition device 5 and of the second image acquisition device 6 jointly at an arm of a robot 11 and, after the spatial position of the cut surface 3 within the slaughtered animal half 1 has been established, to track the robot arm 11 along the carcasses 1 of the slaughtered animals as they move on, for the purpose of the actual image acquisition of the cut surface 3 by means of the optical components of the second image acquisition device 6.