Optical inspection method and optical inspection device for containers

10262405 · 2019-04-16

Abstract

An optical inspection method for containers. A container is at least partly illuminated or transilluminated with light of an illuminating device and is captured by at least one camera as a camera image from each of different viewing directions (R4-R6). In a first image analysis step, first image information of a first inspection zone (A) of the container, for example a stain on the container front face, is ascertained from at least two of the camera images by stereoscopically pairing image points. In a second image analysis step, an individual one of the two camera images is analyzed, wherein the first image information is first excluded and second image information of a second inspection zone (B) of the container, for example a crack on the container rear face, is then ascertained.

Claims

1. An optical inspection method for containers, comprising: at least partly illuminating or transilluminating a container with light of an illuminating device and capturing said container from different viewing directions as a camera image in each case by means of at least one camera; ascertaining, in a first image analysis step, first image information of a first inspection zone (A) of said container from at least two of said camera images by stereoscopically pairing image points, wherein said stereoscopic pairing comprises at least one of: two-dimensional superpositioning of said camera images, perspective equalization of said camera images, associating 3D points on the container with respective image points of said camera images, ascertaining a three-dimensional position of a feature on said container, and photogrammetric analysis of said camera images or triangulation analysis of said camera images; and analyzing, in a second image analysis step, an individual camera image of said two camera images, wherein said first image information is first excluded and second image information of a second inspection zone (B) of said container is then ascertained.

2. The optical inspection method according to claim 1, wherein one of said camera images is analyzed in a third image analysis step, wherein said second image information is first excluded and then third image information of a third inspection zone (C) of said container is ascertained.

3. The optical inspection method according to claim 2, wherein said second inspection zone (B) is arranged behind said first inspection zone (A) in the viewing direction (R5) of said camera image used in said second image analysis step, and said third inspection zone (C) is arranged in front of said second inspection zone (B) in the viewing direction (R4) of said camera image used in said third image analysis step.

4. The optical inspection method according to claim 2, wherein in said third image analysis step, a different camera image is analyzed than in said second image analysis step.

5. The optical inspection method according to claim 2, wherein said third inspection zone (C) is captured only by said camera image analyzed in said third image analysis step.

6. The optical inspection method according to claim 1, wherein said inspection zones (A-C) do not overlap each other.

7. The optical inspection method according to claim 1, wherein said image information is ascertained by an imaging rule of said camera which correlates spatial points and image points with one another, for example, by way of lens parameters.

8. The optical inspection method according to claim 1, wherein said inspection zones (A-C) are each associated with different sections of said container.

9. An optical inspection device for containers, comprising an illuminating device, at least one camera which is at least partly directed onto a light-emitting surface of said illuminating device, and an image processing device, said image processing device being configured to perform the optical inspection method according to claim 1.

10. The optical inspection device according to claim 9, wherein a mirror cabinet is arranged in front of said camera for capturing several viewing directions onto said container in one camera image.

11. A computer program product on a non-transitory computer readable data carrier comprising machine commands for a microprocessor for performing the optical inspection method according to claim 1.

Description

(1) Further features and advantages of the invention shall be explained below with reference to the figures by way of example, where

(2) FIG. 1 shows a representation in a lateral view of an embodiment of an optical inspection device for containers when performing the optical inspection method; and

(3) FIG. 2 shows a representation in a top view of an embodiment of an optical inspection device for containers when performing the optical inspection method.

(4) FIG. 1 shows a representation in a lateral view of an optical inspection device 1 for containers 2 when performing the optical inspection method. It can be seen that container 2 is transported on a conveyor belt 7 between an illuminating device 3 and two cameras 4 and 5, where it is transilluminated. The direction of transport in FIG. 1 runs perpendicular to the drawing plane. Illuminating device 3 comprises a light-emitting surface 3a which emits homogeneously distributed light. The light passes through container 2 to cameras 4 and 5, where it is captured as a camera image. However, it is also conceivable that container 2 is alternatively or additionally inspected by way of at least one further illuminating device in incident light and/or in the dark field.

(5) It can also be seen that different sections of container 2 are respectively associated with the different inspection zones A-E. Container 2 shown here has a stain 10 in the region of the first inspection zone A, a crack 11 in the region of the second inspection zone B, and a label residue 12 in the region of the third inspection zone C. These objects 10-12 are to be recognized with the optical inspection method described below, and container 2 is then to be sorted out.

(6) It can also be seen that first camera 4 is directed in the viewing direction R.sub.4 obliquely from above onto the container side wall. The image field of camera 4 is roughly divided into parts O1 and O2. Part O1 captures inspection zones C and B, and part O2 captures inspection zones A and E. Furthermore, second camera 5 captures container 2 in the viewing direction R.sub.5 obliquely from below. Here as well, the image field of second camera 5 is subdivided into parts U1 and U2, where part U1 captures inspection zones A and B and part U2 captures inspection zones D and E. It is understood that this division is purely by way of example in order to explain the embodiment; any other division of the image fields is conceivable.

(7) The optical inspection method is performed with optical inspection device 1 as follows:

(8) One camera image is captured with each of the two cameras 4 and 5 from the two viewing directions R.sub.4 and R.sub.5. This occurs substantially simultaneously. Container 2 is transilluminated by the light emitted from light-emitting surface 3a of illuminating device 3, so that both the rear and the front face of container 2 are visible in the camera images. However, inspection zones C and B, as well as A and E, are superimposed in the camera image of first camera 4. Likewise, inspection zones A and B, as well as D and E, are superimposed in the camera image of second camera 5.

(9) In the first image analysis step, image points of the two camera images are now superimposed by stereoscopic pairing. With a telecentric image, it would be possible, for example, to superimpose the image points two-dimensionally directly in the overlapping area. Alternatively, by use of an imaging rule and with knowledge of the lens or camera parameters, the individual camera images can be equalized and likewise superimposed two-dimensionally. When superimposing the two camera images, stain 10 then appears at the expected image points in the region of inspection zone A. If the stain were within the container or on the rear face, it would appear doubled in the area of overlap of the two camera images and could be recognized there as not belonging to first inspection zone A. Stain 10 is now stored as first image information with its type, location, and size in a memory. In other words, stain 10 is visible in the lower region of both partial image fields O2 and U1.
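Purely for illustration, the two-dimensional superposition of this step can be sketched as follows. This is a minimal toy example, not the patented implementation: it assumes the two camera images are already equalized to a common pixel grid, and all array sizes, coordinates, and the threshold are invented. A feature on the container front face then appears at the same image points in both views, while a rear-face feature is shifted by the stereo disparity and appears "doubled":

```python
import numpy as np

def pair_feature(img_a, img_b, threshold=128):
    """Return the image points where BOTH camera images show a dark
    feature, i.e. where a front-face feature pairs stereoscopically."""
    dark_a = img_a < threshold
    dark_b = img_b < threshold
    return np.argwhere(dark_a & dark_b)

# Two toy 8x8 camera images with a white (transilluminated) background.
img_a = np.full((8, 8), 255, dtype=np.uint8)
img_b = np.full((8, 8), 255, dtype=np.uint8)

# A stain on the front face projects to the same pixel in both views ...
img_a[2, 3] = 0
img_b[2, 3] = 0
# ... while a rear-face defect is shifted between the views (disparity)
# and therefore does not pair up.
img_a[5, 1] = 0
img_b[5, 4] = 0

paired = pair_feature(img_a, img_b)
print(paired)  # only the front-face stain at (2, 3) pairs up
```

The unpaired dark pixels are exactly the "doubled" features that the method assigns to other inspection zones in the later steps.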

(10) In the second image analysis step, only the camera image of second camera 5 is used. Stain 10 as well as crack 11 can be seen in part U1 of the image field of second camera 5; no stain, however, is captured in second part U2. Since stain 10 has already been recognized in the first image analysis step, its first image information is first deleted from the camera image or marked as recognized. The camera image is then further analyzed with known image evaluation algorithms. For example, crack 11 is detected as a blackening by use of a threshold value. Since it is further known that container 2 is empty, the detected crack 11 can be associated with second inspection zone B. The type, the location, and the size of crack 11 are now stored in the memory as second image information.
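The exclusion of already recognized image information before threshold detection can be sketched as follows; again a toy illustration only, with invented array sizes, coordinates, and threshold:

```python
import numpy as np

def detect_remaining_defects(image, known_mask, threshold=128):
    """Detect blackenings (dark pixels) in a single camera image,
    excluding image points already explained by earlier image
    information."""
    dark = image < threshold
    return np.argwhere(dark & ~known_mask)

# Toy 8x8 camera image of the second camera: white background with two
# dark defects.
image = np.full((8, 8), 255, dtype=np.uint8)
image[2, 3] = 0            # stain, already known from the first step
image[6, 5] = 0            # crack on the rear face, still unknown

# First image information: the stain's image points, marked as recognized.
known_mask = np.zeros((8, 8), dtype=bool)
known_mask[2, 3] = True

defects = detect_remaining_defects(image, known_mask)
print(defects)             # only the crack at (6, 5) remains
```

Masking out the stain first is what lets the threshold step attribute the remaining blackening unambiguously to the second inspection zone.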

(11) It is additionally or alternatively conceivable that the image points of crack 11 in the two camera images are correlated by stereoscopic pairing by way of an imaging rule of cameras 4, 5, since second inspection zone B is captured both with part O1 of first camera 4 and with part U1 of second camera 5.
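Such a correlation via an imaging rule can be sketched with a standard linear (DLT) triangulation: given known projection matrices for the two cameras, the image points of one feature in both views determine its three-dimensional position on the container. The camera matrices and the 3D point below are invented for illustration and do not reflect the actual geometry of the device:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views with
    projection matrices P1, P2 and image points x1, x2."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector of A (homogeneous point)
    return X[:3] / X[3]

def project(P, X):
    """Apply the imaging rule: project a 3D point to an image point."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy pinhole cameras: one at the origin, one shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])   # a 3D point on the container
x1 = project(P1, X_true)
x2 = project(P2, X_true)

X = triangulate(P1, P2, x1, x2)
print(X)   # recovers approximately (0.5, 0.2, 4.0)
```

With noise-free toy data the triangulated point matches exactly; in practice the result depends on calibration quality, which is why knowledge of the lens or camera parameters is required.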

(12) In a third image analysis step, only the camera image of first camera 4 is then analyzed with respect to inspection zone C, which is captured exclusively with part O1. Accordingly, the inspection of inspection zone C cannot be accomplished by stereoscopically pairing image points. First, as in the second image analysis step, the second image information (of crack 11) is excluded from the further analysis. The camera image is subsequently analyzed with image processing algorithms known per se, and label residue 12 is recognized as a blackening. Here again, the type, the location, and the size are determined as third image information and stored in a memory.

(13) Overall, the reliability of detecting errors in the area of third inspection zone C is increased by excluding the previously ascertained image information, thereby increasing the efficiency of the inspection method. This makes it possible in the optical inspection method to perform the inspection more reliably in non-overlapping image regions.

(14) FIG. 2 shows a further embodiment in a top view of an optical inspection device 1 for containers 2 when performing the optical inspection method. It can be seen that container 2 is transported with a conveyor belt 7 between an illuminating device 3 and the cameras and is transilluminated. Here as well, light is emitted by illuminating device 3 in the region of light-emitting surface 3a, transilluminates container 2, and is captured by cameras 4-6. It is further conceivable that container 2 is alternatively or additionally inspected by way of at least one further illuminating device in incident light and/or in the dark field. Optical inspection device 1 shown in FIG. 2 differs from that in FIG. 1 in that three cameras instead of two capture container 2. An even larger area can thereby be inspected circumferentially on container 2.

(15) Accordingly, container 2 is here divided into a total of eight inspection zones A-H. Camera 4 in the viewing direction R.sub.4 captures third inspection zone C and second inspection zone B with the first part L1 of its image field, and inspection zones E and F with the second part L2. Furthermore, second camera 5 in the viewing direction R.sub.5 captures inspection zones E and G with the first part M1, and first inspection zone A and second inspection zone B with the second part M2. Third camera 6 in the viewing direction R.sub.6 captures inspection zone A and inspection zone H with the first part R1, and inspection zones D and G with the second part R2.

(16) The optical inspection method is used with optical inspection device 1 shown in FIG. 2 as follows:

(17) A camera image of container 2 is first captured by the three cameras 4, 5 and 6 substantially simultaneously from the viewing directions R.sub.4-R.sub.6 illustrated, where the container is transilluminated with illuminating device 3.

(18) Subsequently, the camera images of the second and third cameras 5, 6 are analyzed in a first image analysis step by stereoscopically pairing image points. This results in an overlap in the region of first inspection zone A, which has been captured with both camera images. With the stereoscopic pairing, stain 10 appears in the right region of both parts M2 and R1. The type, the location, and the size of stain 10 are then stored as first image information in a memory.

(19) In a second image analysis step, the camera image of second camera 5 is now analyzed. It is already known from the first image information that stain 10 is located in part M2, and the corresponding image points are therefore excluded from further analysis. Furthermore, a crack 11 on the container rear face is detected in the same part M2 of the camera image, in the second inspection zone B. Since the superimposed image information of stain 10 has already been deleted, the crack can be recognized particularly reliably with image processing algorithms. The type, the location, and the size of crack 11 are now stored as second image information in the memory.

(20) In the subsequent third image analysis step, the camera image of camera 4 alone is analyzed. In part L1, it captures both the already known crack 11 and additionally a label residue 12 in the third inspection zone C. Since crack 11 is already known from the second image information, it is first removed from the camera image, whereby label residue 12 superimposed with it becomes easier to recognize. It can also be seen that third inspection zone C is captured only by first camera 4; consequently, no stereoscopic pairing is possible in inspection zone C. Nevertheless, higher recognition reliability is achieved in the third image analysis step by the optical inspection method described, since crack 11 has already been recognized in the camera image and was therefore excluded from further processing.
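The sequential exclusion across the three analysis steps can be summarized in one sketch. For illustration only, it assumes that all findings have already been mapped into one common image frame (in the actual method this mapping is done by the imaging rule); each step subtracts everything found in earlier steps before detecting what remains:

```python
import numpy as np

def run_steps(images, threshold=128):
    """Analyze the camera images in order; each step excludes the image
    information ascertained in the earlier steps."""
    known = np.zeros(images[0].shape, dtype=bool)
    findings = []
    for img in images:
        dark = img < threshold
        findings.append(np.argwhere(dark & ~known))  # new defects only
        known |= dark                                # remember them
    return findings

shape = (8, 8)
# Step 1 (stereo pair of cameras 5 and 6): only the stain pairs up.
step1 = np.full(shape, 255, np.uint8); step1[1, 1] = 0
# Step 2 (camera 5): stain and crack are superimposed.
step2 = np.full(shape, 255, np.uint8); step2[1, 1] = 0; step2[4, 4] = 0
# Step 3 (camera 4): crack and label residue are superimposed.
step3 = np.full(shape, 255, np.uint8); step3[4, 4] = 0; step3[6, 2] = 0

stain, crack, label = run_steps([step1, step2, step3])
print(stain, crack, label)   # (1, 1), then (4, 4), then (6, 2)
```

Each step thus reports exactly one new defect, even though the later camera images each contain two superimposed ones, which is the gain in recognition reliability described above.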

(21) It is understood that objects in the region of the inspection zones D-H can also be recognized by way of the optical inspection method described with reference to FIGS. 1 and 2. Moreover, optical inspection devices 1 of FIGS. 1 and 2 are not restricted to the illustrated orientations of containers 2 but are configured for the inspection of containers 2 with any orientation relative to the device.

(22) Furthermore, the optical inspection devices of FIGS. 1 and 2 also comprise an image processing device (presently not shown) in a computer, where the optical inspection method is stored as a computer program product (software) in a memory and executed on a microprocessor.

(23) It is understood that the features mentioned in the embodiments described above are not restricted to these specific combinations and are also possible in any other combination.