METHOD FOR DETERMINING A REFERENCE POSITION
20230252669 · 2023-08-10
Inventors
CPC classification
G06V10/25
PHYSICS
International classification
G06V10/25
PHYSICS
Abstract
A method for determining a reference position on one side of a flat piece, wherein: image data is acquired; a first reference pattern is detected based on the image data, and first position data is determined for the first reference pattern; based on the first position data, a region of interest is defined in the image data; in the region of interest, a second reference pattern is detected and second position data is determined for the second reference pattern; based on the first position data and the second position data, the reference position is determined. In addition, the present disclosure relates to a system for determining a reference position including an image data acquisition unit and an evaluation unit.
Claims
1. A method for determining a reference position on one side of a flat piece, the method comprising: acquiring image data; detecting a first reference pattern based on the image data, and determining first position data for the first reference pattern; based on the first position data, defining a region of interest in the image data; in the region of interest, detecting a second reference pattern and determining second position data for the second reference pattern; based on the first position data and the second position data, determining the reference position.
2. The method of claim 1, wherein a color-specific or wavelength-specific illumination is applied while acquiring the image data.
3. The method of claim 1, wherein the first reference pattern comprises at least one of: crossing lines, a dot shape, an elliptical shape, a round shape, a rectangular shape, or a square shape.
4. The method of claim 3, wherein the second reference pattern comprises a plurality of features that are arranged periodically along a periodicity axis.
5. The method of claim 4, wherein the second reference pattern comprises parallel lines that run perpendicular to the periodicity axis.
6. The method of claim 4, wherein at least one of an integration step or an averaging step is performed on the image data within the region of interest, and wherein the image data is at least one of integrated or averaged along a direction perpendicular to the periodicity axis.
7. The method of claim 1, wherein determining the second position data comprises a line scan within the region of interest, including at least one of integrating or averaging the image data in a direction perpendicular to the direction of the line scan.
8. The method of claim 1, further comprising, based on the second reference pattern, determining an orientation of the second reference pattern in relation to at least one of the first reference pattern or the region of interest.
9. The method of claim 1, further comprising, based on the second reference pattern, determining a focus property corresponding to acquisition of the image data.
10. The method of claim 1, further comprising determining a plurality of second reference positions for a plurality of second reference patterns.
11. The method of claim 1, further comprising, based on the reference position, determining calibration data and using the calibration data to calibrate a treatment unit.
12. The method of claim 1, further comprising determining the reference position relative to a global or local coordinate system.
13. A system for determining a reference position on one side of a flat piece, the system comprising: an image data acquisition unit configured to acquire image data; and an evaluation unit configured to: receive the image data from the image data acquisition unit; detect a first reference pattern based on the image data, and determine first position data for the first reference pattern; based on the first position data, define a region of interest in the image data; in the region of interest, detect a second reference pattern and determine second position data for the second reference pattern; based on the first position data and the second position data, determine the reference position; and output the reference position.
14. The system of claim 13, further comprising an illumination unit configured to illuminate a detection area on the flat piece.
15. The system of claim 14, further comprising a treatment unit, wherein the evaluation unit is further configured to determine calibration data based on the reference position and to provide the calibration data for a calibration of the treatment unit.
16. The system of claim 15, wherein the treatment unit comprises a laser engraving unit.
17. The system of claim 14, wherein the illumination unit is configured to be color-selective or wavelength-selective.
18. The method of claim 1, wherein acquiring the image data comprises using a wavelength-selective photosensitive device.
19. The method of claim 11, wherein the treatment unit comprises a laser engraving unit.
Description
[0091] With reference to
[0092] The system 10 has an image data acquisition unit 14, in the present example a camera 14.
[0093] The camera 14 is directed to a surface of an object 12, in the present embodiment, a plastic card 12. The card 12 is held by a holder 18 in a defined position.
[0094] The card 12 may be a plain plastic card, for example, made of polycarbonate PC. The card 12 may have a window, i.e., a transparent area, or a structured area, e.g., with a lenticular structure or a tactile feature on the surface.
[0095] The camera 14 is configured as known in the art.
[0096] In the present case, the camera 14 is configured to acquire greyscale image data within a detection area.
[0097] In another embodiment, the camera 14 may be equipped with a filter to detect a wavelength-selective image. For example, a cyan color filter may be used to selectively image locations according to whether they reflect or absorb cyan wavelengths.
[0098] In the present embodiment, an illumination unit 13 is provided that is configured to illuminate the detection area on the surface of the card 12. Herein, the illumination unit 13 has a filter of a color that is complementary to the color of features that are included in the first and/or second reference patterns 22, 24, 32, 34 (for further details see below), thus using light at a specific wavelength or within a specific wavelength range.
[0099] In the present embodiment, a red filter is used for the illumination unit 13. Thus, cyan features (i.e., areas on the surface of the card 12 that reflect cyan wavelengths) are represented as low-intensity or dark areas within the acquired image data.
[0100] On the surface of the plastic card 12, reference positions 12a, 12b, 12c, 12d are located and provided with a first reference pattern. In the embodiment, the reference positions 12a, 12b, 12c, 12d are configured such that they can be detected by the wavelength-specific detection system of the camera 14; in particular, a cyan color is used to provide the first reference pattern.
[0101] Furthermore, the system 10 includes a treatment unit 16, which herein is a laser engraving unit 16. The laser engraving unit 16 is configured to perform a laser treatment and produce a laser engraving on the surface of the plastic card 12.
[0102] The system 10 also has an evaluation unit 15, which is coupled to both the image data acquisition unit 14 and the treatment unit 16. In particular, the evaluation unit may act as a control unit for the camera 14 and/or the treatment unit 16.
[0103] Also, the illumination unit 13 is coupled to and may be controlled by the evaluation unit 15, for example to switch the illumination unit 13 on or off.
[0104] In further embodiments, the illumination unit 13 may be controlled to change filters in order to apply illumination with different colors for imaging differently colored features and/or reference patterns 22, 24, 32, 34 in subsequent image acquisition steps.
[0105] With reference to
[0106] The second reference pattern 24 includes a plurality of parallel straight lines, which are represented as running horizontally in
[0107] To form the first reference pattern 22, a square is formed between two of the lines of the second reference pattern 24 in this embodiment. In this example, the square has an edge length of twice the distance between neighboring lines of the second reference pattern 24. In particular, the position of the first reference pattern 22 is defined as the position of the center of the square between the two lines of the second reference pattern 24. In the case of this embodiment, the square shape of the first reference pattern 22 is located on one of the lines of the second reference pattern 24. This knowledge can later be used to optimize the determined reference position.
[0108] In this embodiment, the first reference pattern 22 and second reference pattern 24 are formed such that they can be observed within a range of cyan color. For example, the lines of the second reference pattern 24 are formed in cyan, and the filter of the camera 14 is used to suppress other structures, such as further lines in magenta and yellow between these cyan lines. In the embodiment, this leads to image data in which lower intensity values are detected at the positions of the lines and higher intensity values in the background.
[0109] In further embodiments, other colors or wavelengths or wavelength ranges can be used in a similar way. For example, the first reference pattern 22 and/or second reference pattern 24 can be formed with a color that is outside the visible range, such as infrared or ultraviolet.
[0110] With reference to
[0111] In a first step 42, image data is acquired. This is performed using the camera 14, while the surface of the card 12 is illuminated by the illumination unit 13 with a red filter, i.e., it is illuminated with red light. Thus, cyan colored features appear dark in the image data. In the resulting image data, intensity values are determined for each pixel of a pixel matrix, thus representing the intensity of reflected light that is detected at corresponding locations within the detection area.
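The effect of the complementary (red) illumination on cyan features can be illustrated with a coarse RGB reflectance model. This is a hypothetical sketch, not part of the claimed method; the function and the reflectance values are illustrative assumptions.

```python
# Illustrative sketch: under red illumination, a grayscale sensor
# essentially measures the red reflectance of each surface point,
# so cyan features (which absorb red) appear dark.

def detected_intensity(reflectance_rgb, illumination_rgb):
    """Detected grayscale intensity as the overlap of surface
    reflectance and illumination spectrum (coarse RGB model)."""
    return sum(r * i for r, i in zip(reflectance_rgb, illumination_rgb))

red_light = (1.0, 0.0, 0.0)          # illumination unit with red filter
cyan_feature = (0.05, 0.9, 0.9)      # cyan absorbs red, reflects green/blue
white_background = (0.9, 0.9, 0.9)   # card background reflects broadly

# Cyan features come out much darker than the background under red light.
assert detected_intensity(cyan_feature, red_light) < detected_intensity(white_background, red_light)
```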
[0113] In a second step 44, a matching algorithm is used to determine the position of the first reference pattern, i.e., to identify the square shape of the first reference pattern 32 and to find its center point between the parallel lines 34. In this step, first position data is determined with a first position estimate 38 for the reference position, which is indicated by a cross 38 in
[0114] As can be seen from
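One way such a matching step could work is template matching, e.g., minimizing the sum of absolute differences (SAD) against a template of the dark square. This is a hypothetical sketch of one possible matching algorithm, not the patent's specific implementation; the toy image and names are illustrative.

```python
# Hypothetical matching sketch: locate a dark square (first reference
# pattern) in a grayscale image by minimizing the sum of absolute
# differences (SAD) against a template.

def find_template(image, template):
    """Return (row, col) of the top-left corner with the lowest SAD score."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

# Bright background (255) with a dark 2x2 square at rows 2-3, cols 3-4.
image = [[255] * 8 for _ in range(6)]
for i in (2, 3):
    for j in (3, 4):
        image[i][j] = 10
template = [[10, 10], [10, 10]]

corner = find_template(image, template)   # top-left corner of the match
center = (corner[0] + 1, corner[1] + 1)   # rough first position estimate
```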
[0115] Also in this step 44, a region of interest (ROI) 36 is defined by using the determined first position estimate 38 as a midpoint and selecting a rectangle within the image data. The rectangular region of interest 36 is elongated, with its length or longitudinal axis in the direction of the periodicity axis 30 and its width or lateral axis in the direction of the parallel lines 34.
[0116] In further embodiments, the region of interest 36 may be positioned relative to the first position estimate 38 in another way, such as with a defined offset relative to the first position estimate 38. The first position estimate 38 may be positioned at one corner of the region of interest 36 or it may be located at one of its sides. In further embodiments, the region of interest 36 may be defined such that it has a defined size and it is positioned such that it both includes the first position estimate 38 and an optimized area of the second reference pattern 34 as well.
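The midpoint variant of the ROI definition can be sketched as follows. This is an illustrative assumption-level sketch (the periodicity axis is taken to run along the image rows, and clamping to the image bounds is an added detail not stated in the text):

```python
# Sketch of defining an elongated rectangular ROI centered on the first
# position estimate: long along the periodicity axis (rows here),
# narrow across it. Clamping to the image bounds is an assumption.

def define_roi(estimate, image_shape, half_length, half_width):
    """Return a half-open ROI (row_min, row_max, col_min, col_max),
    clamped to the image; the periodicity axis runs along the rows."""
    r, c = estimate
    rows, cols = image_shape
    return (max(0, r - half_length), min(rows, r + half_length + 1),
            max(0, c - half_width), min(cols, c + half_width + 1))

roi = define_roi(estimate=(50, 80), image_shape=(100, 200),
                 half_length=40, half_width=5)
# roi spans rows 10..90 and columns 75..85 around the estimate
```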
[0117] In a subsequent step 46, a line scan is performed within the region of interest 36. The intensity values for each row of pixels within the region of interest 36 are summed up and plotted versus the pixel position. The graph of
[0118] In another embodiment, a second region of interest is defined around the first position estimate 38 and this area is excluded from the line scan. Thus, the line scan data is not affected by the irregularities that are caused by the first reference pattern 32. For example, it can be avoided that the square shape in this embodiment influences the line scan data, which is supposed to reflect the properties of the second reference pattern 34.
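The line scan of step 46, including the optional exclusion of an inner region around the first position estimate, can be sketched as below. Variable names, the toy image, and the dictionary-based profile are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the line scan: sum the intensities of each pixel row inside
# the ROI to get a 1-D profile along the periodicity axis, optionally
# skipping rows covered by an inner ROI around the first position
# estimate so the square shape does not distort the profile.

def line_scan(image, roi, exclude_rows=()):
    """Return {row_index: summed intensity} for rows in the ROI."""
    row_min, row_max, col_min, col_max = roi
    profile = {}
    for r in range(row_min, row_max):
        if r in exclude_rows:
            continue  # rows occupied by the first reference pattern
        profile[r] = sum(image[r][col_min:col_max])
    return profile

# Toy image: dark lines (value 10) on every 4th row, bright elsewhere.
image = [[10] * 20 if r % 4 == 0 else [200] * 20 for r in range(16)]
profile = line_scan(image, roi=(0, 16, 5, 15))
dark_rows = [r for r, v in profile.items() if v < 1000]
# dark_rows picks out the line rows: 0, 4, 8, 12
```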
[0119] A fitting algorithm is used to locate positions along the line scan 34a where a local minimum is reached, each such minimum indicating the position of one of the parallel lines 34. In the graph of
[0120] Using the fitted data 34b, the positions of peaks are determined. In particular, the positions are found as parameters of the fitted function during the fitting procedure. On the other hand, it is known as a boundary condition that the parallel lines 34 are equidistantly distributed, i.e., there are equal distances between the peaks. In
[0121] In addition to that, the graph of
[0122] In another step 48, the reference position is determined from the first and second position data. To this end, it is determined which peak of the second reference pattern is located closest to the first position estimate 38a, and a corrected value is determined for the reference position 32a, which is shown as a dotted line. Herein, the knowledge is used that the center point of the square shape that represents the first reference pattern 32 is located on one of the parallel lines 34.
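The peak-fitting and correction of steps [0119] to [0122] can be sketched as fitting an equidistant model (position_k = offset + k · pitch) to the detected minima by linear least squares, then snapping the first position estimate to the nearest fitted line. The closed-form least-squares formulas and the example numbers are illustrative assumptions.

```python
# Sketch: fit equally spaced line positions to noisy minima, then
# snap the first position estimate to the closest fitted line.

def fit_equidistant(minima):
    """Least-squares fit of minima ~ offset + k * pitch, k = 0, 1, 2, ..."""
    n = len(minima)
    ks = list(range(n))
    k_mean = sum(ks) / n
    m_mean = sum(minima) / n
    pitch = (sum((k - k_mean) * (m - m_mean) for k, m in zip(ks, minima))
             / sum((k - k_mean) ** 2 for k in ks))
    offset = m_mean - pitch * k_mean
    return offset, pitch

def snap_to_nearest_line(estimate, offset, pitch):
    """Corrected reference position: the fitted line closest to the estimate."""
    k = round((estimate - offset) / pitch)
    return offset + k * pitch

# Noisy minima around an ideal pitch of 5.0 starting near 10.0.
minima = [10.1, 14.9, 20.2, 24.8, 30.0]
offset, pitch = fit_equidistant(minima)
corrected = snap_to_nearest_line(21.3, offset, pitch)  # near the third line
```

The regression averages the individual minima positions, so single noisy lines perturb the corrected reference position far less than they would a direct single-line measurement.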
[0123] Finally, the corrected value for the reference position is output and transmitted to the treatment unit 16 for a calibration step. To this end, calibration data is determined and the treatment is adjusted such that it is applied to the desired area on the card 12.
[0124] By this method, the reference position can be measured accurately with respect to a feature of the second reference pattern, such as the position of a line of the second reference pattern. A standard model match procedure can be used first to estimate the fiducial location and thereby determine the region of interest. Then, starting from the estimate for the fiducial, a line scan or cross-sectional profile of the image can be generated over a number of lines within the second reference pattern. This profile represents an average over a defined width of the cross-section for each of the lines, and the center position of each averaged line can be determined as a local minimum within this profile. A linear regression method can be applied to this set of minima to average out disturbances or noise in the image data. The desired position of the cyan line next to the fiducial can then be extracted from these measurements by taking the closest negative peak position. As a result, the desired cyan line position is measured based on the average of the position estimates of a set of cyan lines around the required position. This makes the method less sensitive to disturbances, noise and deviations in the image, and, because of the averaging over an area, the resulting position is determined with higher accuracy.
REFERENCE SIGNS
[0125] 10 System [0126] 12 Object; flat piece; plastic card [0127] 12a, 12b, 12c, 12d Reference position [0128] 13 Illumination unit [0129] 14 Image data acquisition unit; camera [0130] 15 Evaluation unit [0131] 16 Treatment unit; laser engraving unit [0132] 18 Holder [0133] 22 First reference pattern [0134] 24 Second reference pattern; parallel lines [0135] 24a Periodicity axis [0136] 30 Periodicity axis [0137] 30a Line scan axis [0138] 32 First reference pattern [0139] 32a Reference position (graph) [0140] 34 Second reference pattern; parallel lines [0141] 34a Line scan (intensity) [0142] 34b Fit data [0143] 34c Peak positions [0144] 36 Region of interest [0145] 38 Cross, first position estimate [0146] 38a Position estimate (graph) [0147] 42, 44, 46, 48 Step