METHOD AND DEVICE FOR CHECKING THE FILL LEVEL OF CONTAINERS

20230236057 · 2023-07-27

    Abstract

    A method for checking the fill level of containers, wherein the containers are transported by a transporter as a container mass flow and measurement data are captured by a sensor unit, and wherein the measurement data are evaluated by an evaluation unit and the fill level of the containers is determined in each case, wherein the measurement data are evaluated by the evaluation unit using an evaluation method that works on the basis of artificial intelligence in order to determine the fill level.

    Claims

    1. A method of checking a fill level of containers, wherein the containers are transported by a transporter as a container mass flow and measurement data of the containers are captured by a sensor unit, and wherein the measurement data are evaluated by an evaluation unit, thereby determining the respective fill level of the containers, wherein the measurement data are evaluated by the evaluation unit using artificial intelligence so as to determine the fill level.

    2. The method according to claim 1, wherein the evaluation unit using artificial intelligence comprises a deep neural network, wherein the measurement data are evaluated with the deep neural network so as to determine the fill level.

    3. The method according to claim 1, wherein the sensor unit comprises a camera, with which the containers are captured as image data, and wherein the measurement data comprises the image data.

    4. The method according to claim 1, wherein the sensor unit comprises different sensors, each operating with a different measurement method, and wherein the containers are captured as the measurement data by the different sensors.

    5. The method according to claim 1, wherein the evaluation unit using artificial intelligence is trained with training data sets, each comprising training measurement data of a training container.

    6. The method according to claim 5, wherein the training measurement data is at least partially evaluated by a user, thereby manually determining additional information.

    7. The method according to claim 5, wherein the training measurement data is at least partially evaluated by a further evaluation unit using a conventionally operating evaluation method while automatically determining additional information.

    8. The method according to claim 5, wherein the training measurement data of the training container is captured by another sensor unit.

    9. The method according to claim 5, further comprising determining additional information, wherein the additional information comprises at least one of a desired fill level, a completely overfilled state, and a completely underfilled state of the training container captured in the training measurement data, and/or comprises evaluability information about the training measurement data.

    10. The method according to claim 8, wherein the training measurement data of a plurality of training containers is captured to establish the training data sets therefrom, wherein each of the plurality of training containers is of a different container type and/or grade.

    11. A device for checking a fill level of containers, wherein measurement data are captured and evaluated to determine the fill level, the device comprising a transporter configured to transport the containers as a container mass flow, a sensor unit configured to capture the measurement data of the containers, and an evaluation unit configured to evaluate the measurement data and to determine the fill level of each of the containers, wherein the evaluation unit is configured to evaluate the measurement data using artificial intelligence to determine the fill level.

    12. The device according to claim 11, wherein the artificial intelligence comprises a deep neural network, and wherein the measurement data are evaluated with the deep neural network to determine the fill level.

    13. The device according to claim 11, wherein the sensor unit comprises a camera to capture the containers as image data, and wherein the measurement data comprises the image data.

    14. The device according to claim 11, wherein the sensor unit comprises different sensors, each configured with a different measurement method to capture the containers as the measurement data.

    Description

    BRIEF DESCRIPTION OF THE FIGURES

    [0035] Further features and advantages of the invention are explained in more detail below with reference to the embodiments shown in the figures. In the figures:

    [0036] FIG. 1 shows a top view of a device for checking the fill level of containers of an embodiment according to the invention;

    [0037] FIG. 2 shows an example of measurement data acquired during checking the fill level; and

    [0038] FIGS. 3A-3B show, as a flow chart, a method of checking the fill level of containers according to an embodiment of the invention.

    DETAILED DESCRIPTION

    [0039] FIG. 1 shows in a top view an embodiment according to the invention of a device 1 for checking the fill level of containers 2. The device 1 is configured for carrying out the method 100 in FIGS. 3A-3B described below.

    [0040] It is evident that the containers 2 are first transferred to the filler 6 by the inlet starwheel 9 and are filled there with a filling material, for example with a beverage. The filler 6 includes, for example, a carousel with filling elements arranged thereon (not shown here), with which the containers 2 are filled with the filling material during transport. Subsequently, the containers 2 are transferred via the intermediate starwheel 10 to the capper 7, where they are provided with a closure, for example with a cork, crown cork, or screw cap. This protects the contents in the containers 2 from environmental influences and prevents them from leaking out of the containers 2.

    [0041] Subsequently, the containers 2 are transferred via the discharge starwheel 11 to the transporter 3, which transports the containers 2 as a container mass flow to the sensor unit 4. The transporter is designed here, by way of example, as a transporter belt, on which the containers 2 are transported in an upright position. The sensor unit 4 arranged thereon includes a first sensor with the illumination device 42 as transmitter and the camera 41 as receiver in order to capture the containers 2 with electromagnetic light radiation in transmitted light. For example, infrared light is used. The illumination device 42 has, for example, a diffusing light emitting disc that is backlit with a plurality of LEDs and that thus forms an illuminated image background for the containers 2 as seen by the camera 41. The camera 41 is then used to capture the measurement data of the containers 2, which is transmitted to the computer system 5 as digital signals. An example of such measurement data from the camera 41 is explained in more detail below with reference to FIG. 2.

    [0042] In addition, it is conceivable that the containers 2 are optionally captured by a second sensor 43, 44, which operates with a different measuring method than the first sensor 41, 42. For example, an X-ray source 44 and an X-ray receiver 43 may be used as transmitter and as receiver, respectively. When the X-ray beam passes through the product, it is attenuated differently than when it passes through the air or foam above the liquid level. Consequently, the containers 2 may be captured by different measuring methods so that in the subsequent evaluation the fill level may be determined even more reliably for different container types and/or grades.
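    The attenuation contrast described in [0042] follows the Beer-Lambert law. The sketch below illustrates why the X-ray beam distinguishes product from headspace; the attenuation coefficients and path length are hypothetical values chosen for illustration, not figures from the source:

```python
import math

# Beer-Lambert law: transmitted intensity I = I0 * exp(-mu * d).
# Illustrative (hypothetical) linear attenuation coefficients in 1/cm.
MU_LIQUID = 0.20   # attenuation of the filling material
MU_AIR = 0.0001    # attenuation of air/foam headspace
PATH_CM = 6.0      # assumed beam path length through the container

def transmitted_intensity(i0, mu, path_cm):
    """Intensity measured behind the container for a medium with coefficient mu."""
    return i0 * math.exp(-mu * path_cm)

i0 = 1000.0
through_liquid = transmitted_intensity(i0, MU_LIQUID, PATH_CM)
through_air = transmitted_intensity(i0, MU_AIR, PATH_CM)

# The beam crossing the product is attenuated far more strongly than the
# beam crossing the headspace, which is what makes the level detectable.
print(through_liquid < through_air)  # True
```

    Scanning such transmitted intensities along the container height yields a profile whose transition marks the liquid level.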

    [0043] Furthermore, the computer system 5 with the evaluation units 51, 52 is illustrated. The computer system 5 includes, for example, a CPU, a memory unit, an input and output unit, and a network interface. Accordingly, the evaluation units 51, 52 are implemented in the computer system 5 as a computer program product.

    [0044] The evaluation unit 51 is designed to evaluate the measurement data of the containers 2 using an evaluation method operating on the basis of artificial intelligence in order to determine the fill level. This is described in more detail below with reference to FIGS. 3A-3B.

    [0045] The further evaluation unit 52 is only optionally present and is used to evaluate training measurement data acquired from training containers (not shown here) by the sensor unit 4. The further evaluation unit 52 is designed to evaluate the training measurement data of the training containers using a conventionally operating evaluation method and, in the process, to automatically determine additional information associated with the respective training container. The additional information is a desired fill level, a completely overfilled state, and/or a completely underfilled state of the training container recorded in the training measurement data and/or evaluability information for the training measurement data. Consequently, the further evaluation unit 52 may be used to automatically provide a plurality of training data sets on a conventional basis to subsequently train the evaluation method of the evaluation unit 51 operating on the basis of artificial intelligence. This is explained in more detail below with reference to FIGS. 3A-3B.

    [0046] After checking the fill level, the containers 2 with the desired fill level are fed to further processing steps, for example to a labeling machine and/or a palletizer. In contrast, containers 2 with a different fill level are diverted from the container mass flow by a diverter for recycling or disposal.

    [0047] FIG. 2 shows an example of measurement data from the camera 41 acquired during inspection of the fill level. In this case, it is image data, in which the container 2 is shown in a lateral view with the container body 23, the container shoulder 22 and the container mouth 21. It is evident that the container 2 is filled with the product F, above which the foam S has formed towards the container mouth 21.

    [0048] It can also be seen that the area B2 near the container shoulder 22 and the areas B3.2 at the edge of the container body 23 are shown dark in the measurement data. In contrast, the central area B3.1 of the container body 23 appears bright. This is because the electromagnetic light radiation is refracted by the container's transparent material (for example glass or PET) and by the filling material F as it passes through the container 2, so that a direct light path from the illumination device 42 to the camera 41 exists only in the central area B3.1 of the container body 23.

    [0049] Furthermore, it is evident that the area B2 in the vicinity of the container shoulder 22 allows no, or only a small, direct light path because of the even stronger light refraction there. Consequently, depending on the grade, this area B2 is penetrated to a greater or lesser extent by scattered light. In addition, the foam S likewise transmits only scattered light towards the camera 41, since the electromagnetic light radiation is refracted repeatedly by the bubbles of the foam S.

    [0050] Consequently, the liquid level FS in the measurement data of FIG. 2 cannot be identified simply by a jump in brightness. Conventional image processing algorithms would have to be adapted to the container type and the grade of product F by suitable parameterization. This is where the invention comes in to determine the fill level H.
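    The conventional jump-detection idea that [0050] refers to can be sketched as follows; the vertical brightness profile is hypothetical. Foam and scattered light blur this jump in practice, which is precisely the limitation the AI-based evaluation addresses:

```python
# Hypothetical vertical brightness profile along the central container axis
# (one value per image row, top to bottom; 0 = dark, 255 = bright).
profile = [40, 42, 41, 200, 205, 210, 208, 60, 58, 55, 57, 56]
#          shoulder area   | bright air gap     | liquid below the level

def level_row_by_jump(profile):
    """Return the row index just below the largest bright-to-dark jump."""
    jumps = [profile[i] - profile[i + 1] for i in range(len(profile) - 1)]
    return max(range(len(jumps)), key=lambda i: jumps[i]) + 1

row = level_row_by_jump(profile)
print(row)  # 7: first row of the liquid in this toy profile
```

    With foam, the jump spreads over many rows, so a fixed threshold or gradient rule would need re-parameterization per container type and grade.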

    [0051] FIGS. 3A-3B show a flow chart of an embodiment of a method 100 according to the invention for checking the fill level of containers 2. The method 100 is described only by way of example with reference to the device 1 previously described with reference to FIG. 1.

    [0052] First, in step 101, the containers 2 are transported by the transporter 3 as a container mass flow. This is done, for example, by means of a transporter belt or a carousel. In this process, the containers 2 are transported to the sensor unit 4.

    [0053] In the subsequent step 102, the measurement data of the containers 2 are captured by the sensor unit 4. For example, the containers 2 are transilluminated by a sensor including the illumination unit 42 and the camera 41 and thus captured in the form of image data.

    [0054] Optionally, in step 103, the containers 2 are additionally captured by a differently configured sensor. For example, an X-ray beam from the X-ray source 44 passes through the containers 2 and is captured by the X-ray receiver 43. Because the containers 2 are inspected with the different measuring methods of the sensors 41, 42 and 43, 44, the determination of the fill level H is particularly reliable.

    [0055] Subsequently, in step 104, the measurement data are evaluated with the evaluation unit 51 using an evaluation method operating on the basis of artificial intelligence, whereby the fill level H of each of the containers 2 is determined. For this purpose, the evaluation method includes at least one method step with a deep neural network, for example a convolutional neural network. In this process, the measurement data first pass through an input layer, several convolution layers and/or hidden layers, a pooling layer and an output layer. With the output layer, for example, the fill level H is output directly. It is also conceivable that a completely overfilled state, a completely underfilled state of the container recorded in the measurement data and/or evaluability information is additionally output for the measurement data.
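    The pipeline in [0055] (input layer, convolution, pooling, output) can be sketched with a minimal forward pass; the weights are random and untrained, and the image size, kernel, and dense head are illustrative stand-ins for the actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2-D convolution (single channel, no padding, stride 1)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 8x8 "container image" and random, untrained weights (illustrative only).
image = rng.random((8, 8))
kernel = rng.random((3, 3))
dense_w = rng.random(9)  # pooled 3x3 feature map, flattened

features = max_pool(np.maximum(conv2d(image, kernel), 0.0))  # conv -> ReLU -> pool
fill_level = float(features.flatten() @ dense_w)  # dense output layer: predicted level
print(features.shape)  # (3, 3)
```

    A trained network of this shape could equally output classification heads for the overfilled and underfilled states and the evaluability information mentioned above.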

    [0056] If the fill level H determined in this way is found to be in order in the subsequent step 106, the containers 2 are fed to further treatment steps in step 107. Otherwise, the containers are rejected for recycling or disposal in step 108.

    [0057] The evaluation method of step 104, which operates on the basis of artificial intelligence, is trained in advance with a large number of training data sets (step 105). The training data sets each include training measurement data of a training container and associated additional information. The additional information describes, for example, the fill level, a completely overfilled state, a completely underfilled state of the training container recorded in the training measurement data, and/or evaluability information about the training measurement data. Consequently, for training the deep neural network, data of both the input layer in the form of the training measurement data and the output layer in the form of the associated additional information are known, and the deep neural network may be trained accordingly on different container types and/or grades. Consequently, the user does not have to elaborately parameterize the evaluation to the different container types and/or grades.
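    The pairing of training measurement data with additional information can be sketched as a simple record type; the field names and example values are hypothetical, not from the source:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingRecord:
    """One training data set: measurement data plus additional information."""
    image: list                             # training measurement data (pixel rows)
    fill_level_mm: Optional[float] = None   # annotated (desired) fill level
    overfilled: bool = False                # completely overfilled state
    underfilled: bool = False               # completely underfilled state
    evaluable: bool = True                  # evaluability information

# Hypothetical examples: a correctly filled and an underfilled training container.
records = [
    TrainingRecord(image=[[0.1] * 4] * 4, fill_level_mm=182.0),
    TrainingRecord(image=[[0.9] * 4] * 4, underfilled=True),
]

# Inputs (images) and targets (additional information) are both known,
# so the network can be trained in a supervised fashion.
inputs = [r.image for r in records]
targets = [(r.fill_level_mm, r.overfilled, r.underfilled) for r in records]
print(len(inputs) == len(targets))  # True
```

    Keeping the annotation alongside the raw measurement data in one record is what allows the same data sets to serve both regression (fill level) and classification (over/underfilled, evaluability) outputs.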

    [0058] As shown in FIG. 3B, to create the training data sets, the training containers are captured as the training measurement data by the sensor unit 4 or by another sensor unit not shown here (step 109).

    [0059] Thereafter, in step 110, the training measurement data may be at least partially evaluated by a user to manually determine the additional information. For example, the user may manually mark the fill level H in the image data, as shown in FIG. 2.

    [0060] Alternatively or additionally, in step 111, the training measurement data is at least partially evaluated with the further evaluation unit 52 using a conventionally operating evaluation method, and the additional information is determined automatically in the process. In this way, a particularly large number of training data sets may be provided. This is suitable, for example, for already known container types, for which the conventionally operating evaluation method works particularly well and reliably.

    [0061] Subsequently, in step 112, the training data sets are formed, each including the training measurement data of a training container and the associated additional information. The training data sets are then transferred to step 105, thereby training the evaluation method operating on the basis of artificial intelligence.
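    The supervised training carried out in step 105 can be illustrated with a toy gradient-descent loop; a single scalar feature and a linear model stand in for the deep network, and the (feature, fill level) pairs are hypothetical:

```python
# Toy supervised training: learn fill level (mm) from one brightness feature.
# Hypothetical (feature, fill_level_mm) pairs standing in for training data sets.
data = [(0.2, 150.0), (0.4, 160.0), (0.6, 170.0), (0.8, 180.0)]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):  # gradient descent on the mean squared error
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

# The toy data lies exactly on y = 50*x + 140, so training recovers that line.
pred = w * 0.5 + b
print(round(pred, 1))  # 165.0
```

    A deep network is trained the same way in principle: known inputs (training measurement data) and known targets (additional information), with the loss gradient propagated back through all layers instead of two scalar parameters.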

    [0062] By evaluating the measurement data by the evaluation unit 51 using the evaluation method operating on the basis of artificial intelligence in order to determine the fill level H, the evaluation method may be set up equally for different container types and/or grades without requiring renewed parameterization when changes occur. Consequently, the evaluation method operating on the basis of artificial intelligence no longer needs to be extensively parameterized and optimized by an experienced user in order to specifically set it up for a container type and/or grade. In addition, incorrect settings may be reduced, whereby the method 100 and the device 1 operate more reliably and thus more cost-efficiently.

    [0063] It is understood that features mentioned in the previously described embodiments are not limited to these feature combinations, but are also usable individually or in any other feature combinations.