Image capturing system and a method for determining the position of an embossed structure on a sheet element
11448501 · 2022-09-20
Assignee
Inventors
Cpc classification
G06V10/751
PHYSICS
G06V10/145
PHYSICS
G01B11/26
PHYSICS
G06V10/60
PHYSICS
G01N2021/8909
PHYSICS
G06V30/144
PHYSICS
International classification
G01B11/26
PHYSICS
G06V30/144
PHYSICS
G06V10/60
PHYSICS
G06V10/145
PHYSICS
G06V10/75
PHYSICS
Abstract
An image capturing system (10), and a corresponding method, for determining the position of an embossed structure (30) on a sheet element (4) moved through a viewing area (18). It includes a camera (12) adapted for capturing a line image of the surface of the sheet element (4) in the viewing area (18), and an illumination unit (14) with a plurality of light sources (20, 21, . . . ) each adapted for illuminating the viewing area (18). The light sources (20, 21, . . . ) are arranged at different inclinations with respect to the viewing area (18). An image evaluation unit (16) determines an inclination-related parameter for the surface of the sheet element (4) in the viewing area (18) across the viewing area (18).
Claims
1. An image capturing system for determining the position of an embossed structure on a sheet element, wherein the sheet element is moved through a viewing area, the system comprising: a camera configured for capturing a line image of the surface of the sheet element in the viewing area, and an illumination unit comprising a plurality of light sources, each light source being configured for illuminating the viewing area to generate a respective line image across the viewing area that is captured by the camera; the light sources being arranged at respective different inclinations with respect to the viewing area; and an image evaluation unit configured to receive the line images from the light sources captured by the camera, configured to determine intensity of reflected light from each light source for each location in the viewing area, configured to construct an intensity profile based on the intensities of reflected light from the light sources for each location in the viewing area, and configured for determining an inclination-related parameter for each location of the surface of the sheet element in the viewing area and across the viewing area based on the intensity profile.
2. The image capturing system of claim 1, wherein the system is configured to determine an inclination-related parameter for each of a plurality of locations in the surface of the sheet element by determining for each of the plurality of the light sources an intensity value of light reflected by each of the plurality of locations.
3. The image capturing system of claim 2, wherein the system is configured to (i) generate an array of inclination-related parameter data corresponding to an area of the sheet element, (ii) identify a subset of the inclination-related parameter data corresponding to a feature of the embossed structure, and (iii) determine the position of the feature of the embossed structure in the sheet from position data associated with the subset of the inclination-related parameter data.
4. The image capturing system of claim 1, wherein the light sources all generate light at an identical wavelength.
5. The image capturing system of claim 1, wherein the wavelength of the light generated by one of the light sources is different from the wavelength of the light generated by any other light source in the illumination unit.
6. The image capturing system of claim 1, wherein each light source comprises a plurality of LEDs arranged adjacent each other.
7. The image capturing system of claim 1, wherein the light sources are arranged in a same plane.
8. The image capturing system according to claim 1, wherein the illumination unit comprises a diffuser.
9. The image capturing system according to claim 1, wherein the camera is a line camera.
10. The image capturing system according to claim 1, wherein the camera is a color camera.
11. The image capturing system according to claim 1, further comprising a synchronizing module configured for synchronizing activation of the light sources with the camera.
12. The image capturing system according to claim 11, wherein the synchronizing module is linked to movement of a sheet element.
13. A method of determining the position of an embossed structure on a sheet element by using an image capturing system according to claim 1, the method comprising the following steps: directing light from a first one of the light sources onto the viewing area, and using the camera to capture a first line image (I.sub.1) of the viewing area; directing light from a second one of the light sources onto the viewing area, and using the camera to capture a second line image (I.sub.2) of the viewing area; optionally, directing light from at least a further one of the light sources onto the viewing area, and using the camera to capture a corresponding line image (I.sub.n) of the viewing area; communicating the captured line images to the image evaluation unit; and using the image evaluation unit for determining an inclination-related parameter for the surface of the sheet element in the viewing area across the viewing area.
14. The method of claim 13, further comprising obtaining an inclination-related parameter for each of a plurality of locations in the surface of the sheet element by determining by the evaluation unit for each of the plurality of the light sources an intensity value of light reflected by that location.
15. The method of claim 14, further comprising (i) generating an array of inclination-related parameter data corresponding to an area of the sheet element, (ii) identifying a subset of the inclination-related parameter data corresponding to a feature of the embossed structure, and (iii) determining the position of the feature of the embossed structure in the sheet from position data associated with the subset of the inclination-related parameter data.
16. The method of claim 13, wherein the different light sources are activated one after the other.
17. The method of claim 13, wherein the different light sources are activated simultaneously.
18. The method of claim 13, wherein the inclination-related parameter is an inclination.
19. The method of claim 13, wherein the inclination-related parameter is determined by using a look-up table.
20. The method of claim 19, further comprising providing a plurality of look-up tables, wherein an appropriate table is chosen depending on parameters of the surface of the sheet element.
21. The method of claim 13, wherein two different viewing areas are used which are arranged at different angles with respect to a direction of movement of the sheet elements.
22. The method of claim 13, further comprising considering a surface finish of the surface of the sheet element when determining the inclination-related parameter.
23. The method of claim 22, wherein the surface finish considered includes color reflectivity or glossiness.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The invention will now be described with reference to a preferred embodiment which is shown in the enclosed drawings. In the drawings,
DESCRIPTION OF EMBODIMENTS
(11) In
(12) In the quality control station 2, an image capturing system 10 for determining the position of an embossed structure on the sheet element 4 moved through the quality control station 2 is implemented.
(13) The image capturing system 10 comprises a camera 12, an illumination unit 14 and an image evaluation unit 16.
(14) The camera 12 is here a line camera which is adapted for capturing line images of a viewing area 18 (a “line of interest”) on the surface of the sheet elements 4. The viewing area 18 is an elongate zone which extends transversely to the direction A across the width of the sheet processing machine. The resolution of the camera 12 is such that elements of the order of 0.05 to 0.3 mm on the surface of the sheet elements 4 can be resolved, preferably of 0.1 mm.
(15) The illumination unit 14 in
(16) In the simplified embodiment shown in
(17) The positions and arrangements of the light sources 20, 21, 22 with respect to camera 12 and the line of interest 18 are explained with the aid of a median plane M shown in
(18) In the cross section of
(19) Since light sources 20 and 22 are arranged at angles with respect to median plane M that are different from the angle α, light generated by either of these light sources and reflected from the surface of sheet element 4 as a specular reflection cannot be detected by camera 12.
(20) In practice, the reflection from the surface of sheet element 4 will not be (purely) specular but more or less diffuse. Nevertheless, the intensity of the reflected light originating from the light sources 20, 21, 22 will be different. In particular, the intensity of the light originating from light source 21 will be higher than the intensity of the light originating from light sources 20, 22.
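The principle described above — the light source whose inclination best matches the local surface orientation produces the strongest reflection at the camera — can be sketched as a per-pixel classification. A minimal sketch in Python; the source ordering (index 0 = lower source 20, 1 = middle source 21, 2 = upper source 22) and the intensity values are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch: classify the local surface inclination at one pixel
# from the reflected intensities measured for each light source.
# Index 0 = lower source 20, 1 = middle source 21 (specular for a level
# surface), 2 = upper source 22. Labels follow the patent's terminology.
LABELS = {0: "upward inclination", 1: "no inclination", 2: "downward inclination"}

def classify_pixel(intensities):
    """Return the inclination label of the brightest light source."""
    brightest = max(range(len(intensities)), key=lambda i: intensities[i])
    return LABELS[brightest]

# A level surface reflects the middle source 21 most strongly:
print(classify_pixel([0.3, 0.9, 0.3]))  # -> no inclination
# A "forward" inclined flank reflects the lower source 20 most strongly:
print(classify_pixel([0.8, 0.5, 0.2]))  # -> upward inclination
```

Even with partly diffuse reflection, only the relative ordering of the per-source intensities matters for this classification, which is why the method tolerates non-specular surfaces.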
(21) With the aid of
(22) For this embodiment, multi-wavelength light sources are used.
(23) In the upper half of
(24) For each line image, all light sources 20, 21, 22 are activated, and camera 12 captures a line image of the reflected light. Camera 12 is here assumed to be able to determine which intensity originates from which light source 20, 21, 22. This can be achieved by using different wavelengths for the different light sources 20, 21, 22, and a color camera 12. Thus, this embodiment uses a hyper-spectral camera.
(25) While an embodiment using different wavelengths does produce meaningful results to a certain extent, there is a disadvantage in practice: if the sheet elements have a colored surface, the color influences the intensity of the reflected light. Nevertheless, with the restriction to sheet elements of neutral color (white only or gray only), this simplified embodiment does work and is used here for explaining the basic principles on which the subject system and the subject method are based.
(26) In a first line image, referred to as I.sub.1, the embossed structure 30 is upstream of the line of interest. Thus, the surface of sheet element 4 is level. In view of light source 21 being arranged in a mirror-symmetric manner with respect to camera 12, the intensity of the light originating from light source 21 is higher than the intensity of the light originating from light sources 20, 22.
(27) Image evaluation unit 16, with the aid of a look-up table, interprets this as “no inclination”. This inclination-related parameter is shown in the lower half of
(28) A second line image I.sub.2 is taken when the embossed structure 30 has been moved to the point where its “forward” inclined surface is at the line of interest 18. When being illuminated, it is no longer the reflected light originating from light source 21 which has the highest intensity but the light originating from light source 20 (because of the now changed orientation of the surface which reflects the light towards the camera).
(29) Again referring to the look-up table, image evaluation unit 16 interprets the fact that the highest intensity of the reflected light is associated with light source 20 as an "upwardly inclined" surface. This inclination-related parameter is shown in the lower half of
(30) A third line image I.sub.3 is taken when the embossed structure 30 has been moved to the point where its top surface is at the line of interest 18. When being illuminated, it is the middle light source 21 which results in the highest intensity of the reflected light at camera 12 at the center of the embossed structure 30 while it is the lower light source 20 which results in the highest intensity of the reflected light at the lateral portions of the embossed structure.
(31) Again referring to the look-up table, image evaluation unit 16 transforms this distribution of the intensity into inclination-related parameters, namely "no inclination" for the two center pixels and "upward inclination" for the two pixels on either side of the center pixels.
(32) A fourth line image I.sub.4 is taken when the embossed structure 30 has been moved by another increment. When being illuminated, it again is the middle light source 21 which results in the highest intensity of the reflected light at camera 12 at the center of the embossed structure 30 while it is now the higher light source 22 which results in the highest intensity of the reflected light at the lateral portions of the embossed structure 30.
(33) Again referring to the look-up table, image evaluation unit 16 transforms this distribution of the intensity into inclination-related parameters, namely "no inclination" for the two center pixels and "downward inclination" for the two pixels on either side of the center pixels. The inclination-related parameter "downward inclination" is indicated as a crossed pixel.
(34) A fifth line image I.sub.5 is taken when the embossed structure 30 has been moved by another increment. When being illuminated, it is the lateral pixels for which the highest intensity results from the middle light source 21, being interpreted as “no inclination”, and the central pixels for which the highest intensity results from the upper light source 22, being interpreted as “downward inclination”.
(35) A sixth line image I.sub.6 is taken when the embossed structure 30 has been moved by another increment. As the embossed structure now has completely passed the line of interest 18, it is again the middle light source 21 which results in the highest intensity of the light reflected towards camera 12. Accordingly, the image evaluation unit understands that there is “no inclination” at row R.sub.6.
(36) Image evaluation unit 16 thus is able to derive from the inclination-related parameter where an embossed structure 30 is present (by determining where it starts rising above the surface of the sheet element 4), how long and wide it is, etc. With a higher resolution than in the simplified example of
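The derivation in paragraph (36) — locating the embossed structure and measuring its extent from the inclination map — can be sketched as a simple scan over the parameter array. The encoding (0 = no inclination, nonzero = inclined) is a hypothetical choice for illustration:

```python
def bounding_box(param_array):
    """Return (row_min, col_min, row_max, col_max) of the inclined region,
    or None if no embossed structure was detected.
    param_array: 2-D list of inclination codes, 0 meaning 'no inclination'."""
    rows = [r for r, row in enumerate(param_array) if any(v != 0 for v in row)]
    cols = [c for row in param_array for c, v in enumerate(row) if v != 0]
    if not rows:
        return None
    return (min(rows), min(cols), max(rows), max(cols))

# Toy 4x5 inclination map with a small embossed feature:
grid = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 2, 2, 1, 0],
    [0, 0, 0, 0, 0],
]
print(bounding_box(grid))  # -> (1, 1, 2, 3)
```

The box extent directly yields the length and width of the structure in scan lines and pixels, which can be converted to millimetres using the known line pitch and pixel size.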
(37) As an alternative to the inclination-related parameter being "no inclination", "upward inclination" and "downward inclination", the pixels of the rows of the captured line images could be directly coded with the number of the light source which resulted in the highest intensity of the reflected light. If the captured line images are to be visualized, different colors can be used for visualizing different inclinations.
(38) In practice, the system operates with sheet elements 4 being moved at speeds of up to 15 m/s. Camera 12 captures line images at rates of 10,000 to 40,000 line images per second, but higher rates can be used as well.
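The figures above imply the spatial pitch between successive line images; a quick arithmetic sketch (values taken from the paragraph above):

```python
def scan_pitch_mm(speed_m_per_s, lines_per_s):
    """Distance the sheet travels between two successive line images, in mm."""
    return speed_m_per_s / lines_per_s * 1000.0

# At 15 m/s, the pitch along direction A is:
print(scan_pitch_mm(15, 40000))  # -> 0.375 (mm per line image)
print(scan_pitch_mm(15, 10000))  # -> 1.5  (mm per line image)
```

At the higher line rates the pitch approaches the 0.05 to 0.3 mm resolution stated for the camera, which is why high frame rates matter at full transport speed.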
(39) A more elaborate embodiment of the image capturing system is shown in
(40) Generally speaking, an array of inclination-related parameters is obtained for a plurality of locations on the surface of sheet element 4. Based on this array of parameters, a subset of data will be generated which corresponds to a feature of the embossed structure. Then, the position of the feature of the embossed structure will be determined.
(41) The difference over the embodiment of
(42) Another difference over the embodiment of
(43) The main advantage achieved herewith is that the identification of the embossed structure is independent of the color of the surface of the sheet element 4. Preferably, the LEDs appear white to the camera.
(44) It is possible to use a more elaborate strategy for activating the light sources. Depending on the circumstances, some light sources can be activated in pairs simultaneously, or the activation of certain light sources can be "skipped" (meaning that no corresponding sub-image is captured either) if it is clear to the image evaluation unit that this particular sub-image is not necessary for determining the inclination-related parameter.
(45) Camera 12 here is a line camera.
(46) The image evaluation unit 16 evaluates each sub-image. During this evaluation, the fact that the sheet element and the embossed structure to be identified have moved slightly between the individual sub-images can be taken into account and compensated. The compensation can in particular be done by resampling the images to the same position.
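The motion compensation mentioned above can be sketched as a resampling step. A minimal sketch, assuming the displacement between two successive sub-images is known as a fraction of the line pitch; the function name and the linear interpolation scheme are illustrative assumptions, not details from the patent:

```python
def resample_to_position(line_a, line_b, t):
    """Estimate the line image at a common reference position by linear
    interpolation between two successive sub-images line_a and line_b,
    where t in [0, 1] is the fractional displacement along direction A."""
    return [a * (1 - t) + b * t for a, b in zip(line_a, line_b)]

# Halfway between the two capture positions:
print(resample_to_position([0.0, 2.0], [4.0, 6.0], 0.5))  # -> [2.0, 4.0]
```

After resampling, the per-source intensities of all sub-images refer to the same physical location on the sheet, so the intensity profile for each pixel can be built consistently.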
(47) An example of how the image evaluation unit 16 processes captured sub-images is explained with reference to
(48) The image evaluation unit 16 evaluates, for each pixel of interest, which light source resulted in the highest intensity of the reflected light. The image evaluation unit 16 fits a mathematical curve C which best fits the different intensity measurements for this pixel of interest. The number of intensity measurements equals the number of sub-images acquired sequentially. Here in
(49) The curve of
(50) Thus, an array of inclination-related parameter data either for the entire surface of sheet element 4 or for a portion of interest is obtained.
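The curve fit described in paragraph (48) can be sketched with a simple three-point parabolic interpolation around the brightest sample. The patent does not specify the fitting function, so the parabola and the angle values are illustrative assumptions:

```python
def x_max_parabolic(angles, intensities):
    """Estimate the abscissa X_max of the intensity-profile maximum by
    fitting a parabola through the brightest sample and its two neighbours.
    Assumes uniformly spaced angles (one sample per light source)."""
    k = max(range(len(intensities)), key=lambda i: intensities[i])
    k = min(max(k, 1), len(intensities) - 2)   # keep both neighbours in range
    y0, y1, y2 = intensities[k - 1], intensities[k], intensities[k + 1]
    h = angles[k] - angles[k - 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:
        return angles[k]
    # Vertex of the parabola through the three sample points:
    return angles[k] + 0.5 * h * (y0 - y2) / denom

# Symmetric profile: the maximum sits exactly at the middle source angle.
print(x_max_parabolic([0.0, 1.0, 2.0], [1.0, 3.0, 1.0]))  # -> 1.0
```

Fitting a curve rather than taking the brightest source directly gives sub-source angular resolution, i.e. an X_max that can lie between the discrete illumination angles.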
(51) Once the inclination-related parameter has been identified for all lines of interest (in the x direction) and along the lines of interest (in the y direction), it can be visualized. An example is shown in
(52) The "gap" in the visualization at the 3 o'clock position and the 9 o'clock position is due to the fact that this part of the embossed feature has local normal vectors which are almost perpendicular to plane P, so that there is (almost) no slope signal in this region and the slopes cannot be properly measured. As explained above, the measured slope is based on a projection of the local normal vector onto plane P. If the vector points towards plane P, then sensible measurement results cannot be obtained.
(53) It is important to note that the position of any embossed structure on the sheet element 4 can be detected independently of any decorative effects overlaying the embossing (such as, for example, print, foil, hologram or varnish). This is because the method does not rely on measuring absolute reflectivity but on a change of the reflectivity profile of the surface due to a change in inclination.
(54) However, the glossiness of the surface of the inspected sheet elements 4 results in a change of the intensity profile: a foil will have a sharp profile, but a diffuse substrate will have a flatter intensity profile ("flat intensity profile" here meaning that the intensities for the different light angles are more similar). Additionally, the relation between the real physical inclination angle and the position of the maximum of the intensity profile varies as a function of the reflectivity type. In a preferred embodiment, to obtain an inclination-related parameter which is independent of the type of reflectivity (matte or shiny), the computation is separated into three steps:
(55) 1: analyze the intensity profile to determine where the position X.sub.max of the maximum is located (
(56) 2: analyze the intensity profile to determine which look-up table to apply, as a function of the reflectivity type (matte, shiny or a mix).
(57) 3: apply the determined look-up table on X.sub.max to output an inclination-related parameter.
(58) This set of look-up tables (multiple LUTs) is constructed by means of a calibration. As an example, five or ten look-up tables are built corresponding to five or ten types of reflectivity, from matte to shiny. The look-up tables then link the profile to the slope value. The parameter used for identifying the correct look-up table (LUT) is the sharpness (width) of the profile.
(60) Instead of using multiple look-up tables, an appropriate formula or a mathematical computation can be used to achieve the same result.
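The three-step computation above can be sketched as follows. The full-width-at-half-maximum width measure, the LUT contents, and the calibrated widths are all illustrative assumptions; real tables would come from the calibration described above:

```python
def profile_width(angles, intensities):
    """Step 2 helper: sharpness (full width at half maximum) of the profile."""
    half = max(intensities) / 2.0
    above = [a for a, i in zip(angles, intensities) if i >= half]
    return max(above) - min(above) if above else 0.0

def pick_lut(width, luts):
    """Step 2: choose the LUT whose calibrated profile width is closest."""
    return min(luts, key=lambda lut: abs(lut["width"] - width))

def inclination_from_x_max(x_max, lut):
    """Step 3: map X_max to an inclination via the chosen LUT (linear interp)."""
    xs, slopes = lut["x"], lut["slope"]
    if x_max <= xs[0]:
        return slopes[0]
    if x_max >= xs[-1]:
        return slopes[-1]
    for i in range(len(xs) - 1):
        if xs[i] <= x_max <= xs[i + 1]:
            t = (x_max - xs[i]) / (xs[i + 1] - xs[i])
            return slopes[i] * (1 - t) + slopes[i + 1] * t

# Two toy LUTs: a "shiny" (sharp-profile) and a "matte" (flat-profile) table.
luts = [
    {"width": 0.8, "x": [0.0, 1.0], "slope": [0.0, 10.0]},  # shiny
    {"width": 3.0, "x": [0.0, 1.0], "slope": [0.0, 5.0]},   # matte
]
w = profile_width([0, 1, 2, 3, 4], [0, 2, 4, 2, 0])   # FWHM = 2
lut = pick_lut(w, luts)                                # -> matte table
print(inclination_from_x_max(0.5, lut))                # -> 2.5
```

Separating the width-based table selection (step 2) from the X_max mapping (step 3) is what makes the resulting inclination estimate independent of whether the surface is matte or shiny.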
(61) If desired or necessary, two image capturing systems can be used which are arranged with the direction of their viewing areas inclined with respect to each other. As an example, an additional image capturing system of the type described above can be used, with the orientation of the viewing area 18 being different from the orientation of the viewing area of the first image capturing system.
(62) It is also possible to have the viewing area 18 of a first image capturing system arranged at an angle of +45° with respect to the travel direction A of the sheet elements and the viewing area 18 of a second image capturing system arranged at an angle of −45° with respect to direction A.
(63) The same result (differently oriented viewing areas) can be achieved by rotating the sheet element between two subsequent inspections.
(64) The surface inspection system 10 can be part of a more complex inspection unit with other illumination units. In particular, the camera 12 and the illumination unit 14 can be part of more complex illumination arrangements which are used for detecting creases and embossed structures on the sheet elements.
(65) With reference to the previous figures, it was described that the line images were obtained by illuminating the sheet elements with only one light source. In practice, the sheet elements are illuminated by different light sources in order to inspect the surface of the sheet elements in different respects. An example is a combination of bright-field illumination and dark-field illumination. The interlaced line images captured under each of the illumination conditions are in practice used for reconstructing a BFI image (consisting of the line images captured under bright-field illumination) and a DFI image (consisting of the line images captured under dark-field illumination), and these images are then analyzed by the image evaluation unit.
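The interlacing scheme described above can be sketched as a simple de-interleaving of the line-image stream. The alternation order (even-indexed lines BFI, odd-indexed lines DFI) is an assumption for illustration:

```python
def deinterlace(lines):
    """Split an interlaced stream of line images into the two reconstructed
    images: even-indexed lines -> BFI image, odd-indexed lines -> DFI image."""
    bfi = lines[0::2]
    dfi = lines[1::2]
    return bfi, dfi

bfi, dfi = deinterlace(["L0", "L1", "L2", "L3"])
print(bfi)  # -> ['L0', 'L2']
print(dfi)  # -> ['L1', 'L3']
```

Because consecutive lines of one reconstructed image are two line pitches apart on the sheet, each reconstructed image has half the scan resolution of the raw stream, which the downstream analysis must account for.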
(68) Thus, a subset of inclination-related parameter data is obtained.
(69) Should camera 12 capture line images under more than the two illumination conditions which are described here (BFI and DFI) and shown in
(70) Image evaluation unit 16 processes the reconstructed images 40, 50 (either entirely or in those portions which are of interest) in order to detect an item of interest. Here, the reconstructed images 40, 50 are compared in order to identify embossed surface portions. In particular, the position of a feature of interest (e.g. specific corners) is determined based on the subset of inclination-related parameter data.