Method for Analyzing a Workpiece Surface for a Laser Machining Process and Analysis Device for Analyzing a Workpiece Surface
20230241710 · 2023-08-03
International classification
B23K26/03
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for analyzing a workpiece surface for a laser machining process includes radiating a light line of light of a first wavelength range into a workpiece surface area and illuminating the area with light of at least one second wavelength range. An image of the workpiece surface area is captured by a sensor device including an image sensor and imaging optics, the optics having different refractive indices for the first and second wavelength ranges. A first plane is defined by the light line and a light exit point; this plane, the optics and the image sensor are arranged in a Scheimpflug arrangement. The image is evaluated to analyze workpiece surface features based on a predetermined offset between the first plane and a second plane for which the light of the second wavelength range is sharply imaged by the optics on a sensor plane of the image sensor. An analysis device carries out the method.
Claims
1. A method for analyzing a workpiece surface for a laser machining process, comprising the steps of: radiating a flat fan-shaped light beam of light of a first wavelength range to generate a light line on said workpiece surface and illuminating said workpiece surface with light of at least a second wavelength range; capturing an image of said workpiece surface by means of a sensor device, which comprises an image sensor and optics for imaging light on said image sensor, wherein said optics has different refractive indices for the first and second wavelength ranges, and wherein a first plane defined by the plane of the fan-shaped light beam, said optics and said image sensor are arranged in a Scheimpflug arrangement; and evaluating the image to analyze features of said workpiece surface based on a predetermined offset on the workpiece surface between the first plane and a second plane for which the light of the second wavelength range is imaged on said image sensor by said optics.
2. The method according to claim 1, wherein the first plane is arranged perpendicularly to said workpiece surface and/or wherein an optical axis of said optics and/or an optical axis of said sensor device form an acute angle with the first plane.
3. The method according to claim 1, wherein the features of said workpiece surface include a weld seam or a joint edge and an optical axis of said sensor device and/or an optical axis of said optics lie in a plane which extends perpendicularly to said workpiece surface and in parallel to the weld seam or joint edge.
4. The method according to claim 1, wherein the first wavelength range comprises blue light, preferably light with a wavelength of 400 nm to 500 nm, particularly preferably 450 nm, and/or wherein the second wavelength range comprises red light, preferably light with a wavelength of 620 nm to 720 nm, particularly preferably 660 nm, and/or wherein a third wavelength range comprises light with a wavelength of 720 nm.
5. The method according to claim 1, wherein evaluating the image comprises: evaluating intensity data of light of the first wavelength range for generating a height profile of the workpiece surface.
6. The method according to claim 1, wherein a partial area of said workpiece surface surrounds a line of intersection of the second plane with said workpiece surface, and wherein evaluating the image comprises: evaluating intensity data of light of the second wavelength range in an area of the image corresponding to the partial area in order to obtain a gray image of the partial area.
7. The method according to claim 1, wherein said workpiece surface is further illuminated with light of at least a third wavelength range and said optics has different refractive indices for the first, the second and the third wavelength ranges, and wherein evaluating the image for analyzing features of said workpiece surface is also carried out based on a predetermined offset on the workpiece surface between the first plane and a third plane for which the light of the third wavelength range is imaged on said image sensor by said optics.
8. The method according to claim 1, wherein said workpiece surface is subsequently moved relative to said sensor device and the above steps are repeated.
9. A method for machining a workpiece using a laser beam, in particular laser welding or laser cutting, comprising: radiating a laser beam onto a point along a machining path on a workpiece surface; the method according to claim 1, wherein the light line is radiated onto said workpiece surface in advance and/or in the wake of the point.
10. An analysis device for analyzing a workpiece surface, comprising: a sensor device with an image sensor for capturing an image and optics for imaging light on said image sensor, said optics having different refractive indices for a first wavelength range and at least one second wavelength range; a light line unit for radiating a light line of a first wavelength range and an illumination unit for radiating light of at least one second wavelength range, and an evaluation unit for evaluating the image captured by said image sensor, wherein said analysis device is configured to carry out the method for analyzing the workpiece surface according to claim 1.
11. The analysis device according to claim 10, wherein said optics comprises a lens, a lens group, a focusing lens, a focusing lens group, an objective and/or a zoom objective.
12. The analysis device according to claim 10, wherein said image sensor comprises a matrix image sensor, a two-dimensional optical sensor, a camera sensor, a CCD sensor, a CMOS sensor, and/or a photodiode array.
13. The analysis device according to claim 10, wherein said light line unit comprises an LED or an LED array.
14. A laser machining head for machining a workpiece by means of a laser beam, comprising an analysis device according to claim 10.
15. The laser machining head according to claim 14, wherein said laser machining head is configured to carry out a method for machining a workpiece using a laser beam, in particular laser welding or laser cutting, comprising: radiating a laser beam onto a point along a machining path on a workpiece surface; analyzing the workpiece surface for a laser machining process, comprising the steps of: radiating a flat fan-shaped light beam of light of a first wavelength range to generate a light line on said workpiece surface and illuminating said workpiece surface with light of at least a second wavelength range; capturing an image of said workpiece surface by means of a sensor device, which comprises an image sensor and optics for imaging light on said image sensor, wherein said optics has different refractive indices for the first and second wavelength ranges, and wherein a first plane defined by the plane of the fan-shaped light beam, said optics and said image sensor are arranged in a Scheimpflug arrangement; and evaluating the image to analyze features of said workpiece surface based on a predetermined offset on the workpiece surface between the first plane and a second plane for which the light of the second wavelength range is imaged on said image sensor by said optics; wherein the light line is radiated onto said workpiece surface in advance and/or in the wake of the point.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0040] The invention is described in detail below with reference to figures.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0050] Unless otherwise noted, the same reference numbers are used below for identical elements and elements with the same effect.
[0051] For a better understanding of the invention, an analysis device 10′ that carries out a conventional light section method is described first.
[0052] The analysis device 10′ is configured to carry out a light section method or a light section triangulation method in order to detect a three-dimensional height profile of a workpiece surface 22′. In this way, for example, a joining edge or step 23′ of the workpiece surface 22′ can be detected. The analysis device 10′ may also be referred to as a triangulation sensor.
[0053] The analysis device 10′ comprises a sensor device 12′ with an image sensor 14′ for capturing an image and optics 16′ for imaging light on the image sensor 14′. The analysis device 10′ further comprises a light line unit 18′ for generating a light line 20′ on the workpiece surface 22′, an image of which is captured by the sensor device 12′. The light line unit 18′ is configured to radiate a fan-shaped light beam 24′, i.e. a beam spread out in only one plane, onto the workpiece surface 22′ in order to generate the light line 20′ on the workpiece surface 22′. A plane 26′ spanned by the fan-shaped light beam 24′, the optics 16′ and a sensor plane of the image sensor 14′ are arranged in the Scheimpflug arrangement, i.e. they fulfill the Scheimpflug condition. As a result, the plane 26′ is imaged sharply on the image sensor 14′ by the optics 16′. In other words, all points on the plane 26′ are mapped sharply on the image sensor 14′. Since the plane 26′ also includes the light line 20′, light of the light line 20′ reflected by the workpiece surface 22′ is also sharply imaged on the image sensor 14′. The plane 26′ is typically perpendicular to the workpiece surface 22′ and intersects the workpiece surface 22′. Due to the Scheimpflug arrangement, a line of intersection 44′ of the plane 26′ with the workpiece surface 22′ corresponds to the light line 20′, i.e. coincides with the light line 20′.
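The Scheimpflug property described above — all points of the tilted plane 26′ are imaged sharply on the sensor plane — can be illustrated numerically. The following Python sketch (the focal length, plane position and slope are illustrative values, not taken from this disclosure) images sample points of a tilted object plane through an ideal thin lens and verifies that their images are again collinear, i.e. that a single tilted image plane captures all of them in focus:

```python
import numpy as np

F = 50.0  # thin-lens focal length in mm (illustrative)

def image_of(x, z):
    """Image an object point (x, z), z < 0, through a thin lens at z = 0
    using the thin-lens equation 1/s + 1/s' = 1/f."""
    s = -z                                  # object distance
    s_img = 1.0 / (1.0 / F - 1.0 / s)       # image distance
    m = -s_img / s                          # transverse magnification
    return np.array([m * x, s_img])

# Sample points of an object plane tilted against the lens plane:
# it crosses the optical axis at z = -80 mm with slope 0.5.
zs = [-70.0, -80.0, -90.0, -100.0]
imgs = [image_of(0.5 * (z + 80.0), z) for z in zs]

# Scheimpflug: the imaged points are collinear, i.e. one tilted
# image plane (the sensor plane) holds all of them in focus.
v0 = imgs[1] - imgs[0]
residuals = [abs(v0[0] * (q - imgs[0])[1] - v0[1] * (q - imgs[0])[0])
             / np.linalg.norm(v0) for q in imgs[2:]]
print(max(residuals))  # ~0: all image points lie on one line
```

Because thin-lens imaging is a projective map, any object line maps to an image line; the Scheimpflug arrangement simply places the sensor on the image line of the light-section plane.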
[0054] In the direction of the optical axis 17′ of the optics 16′, in front of and behind the plane 26′, there is also a certain depth of focus within which points are imaged sufficiently sharply on the image sensor 14′ by the optics 16′. In other words, the depth of focus surrounds the first plane 26′. Since the plane 26′ intersects the workpiece surface 22′, all points on planes that intersect the plane 26′ are sharply imaged as long as they lie within the depth of focus. Accordingly, a partial area 28′ of the workpiece surface 22′ which surrounds the line of intersection 44′ of the plane 26′ with the workpiece surface 22′ in the direction of the optical axis 17′ is imaged sharply on the image sensor 14′. In the direction of the optical axis of the optics 16′, the partial area 28′ lies in front of and behind the line of intersection 44′. The partial area 28′ may also be referred to as the “sharp area”, since this area of the workpiece surface 22′ is imaged sufficiently sharply for the image evaluation in the captured image.
[0055] The plane 26′, for which all points are sharply imaged on the image sensor 14′ due to the Scheimpflug arrangement, together with the depth of focus of the optics 16′ surrounding the first plane 26′ in the direction of the optical axis of the optics 16′, may be defined as the measurement range. The measurement range may also be referred to as the “object field”. The first plane 26′ may also be referred to as the “object plane” of the sensor device 12′. If the workpiece surface 22′ intersects the plane 26′, light reflected from the light line 20′ is sharply imaged on the image sensor 14′. The area of the workpiece surface 22′ that lies within the object field, i.e., the partial area 28′, is also imaged sufficiently sharply on the image sensor 14′.
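The extent of such a depth of focus can be estimated with the standard thin-lens depth-of-field relations. The sketch below is a simplified model; the focal length, f-number, permissible circle of confusion and working distance are illustrative assumptions, not values from this disclosure:

```python
def depth_of_focus_limits(f_mm, f_number, coc_mm, s_mm):
    """Near and far object distances still imaged 'sufficiently sharply'
    for a thin lens focused at s_mm, given a permissible circle of
    confusion coc_mm on the sensor."""
    h = f_mm * f_mm / (f_number * coc_mm) + f_mm        # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2.0 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# f = 50 mm, f/8, 10 um circle of confusion, focused at 200 mm:
near, far = depth_of_focus_limits(50.0, 8.0, 0.01, 200.0)
print(near, far)  # roughly 199 mm .. 201 mm, i.e. about 2 mm depth of focus
```

The narrow band obtained here illustrates why, without the Scheimpflug arrangement, only a thin strip of an obliquely viewed surface is in focus at once.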
[0057] Due to the color error of the optics, the image distance, i.e. the position of the measurement range or the object plane, depends on the wavelength. That is, with an optics 16′ exhibiting color errors, the plane 26′ is imaged differently for different wavelengths. For light of a first wavelength range, for example blue light with a wavelength of approx. 450 nm, the plane 26′ would be imaged sharply on the image sensor 14′ by the optics 16′. Due to the color error of the optics 16′, however, the plane 26′ would not be imaged sharply on the image sensor 14′ for light of a second wavelength range, for example red light with a wavelength of approx. 660 nm, since the image distance of the optics 16′ is changed for red light. The object distance, i.e. a distance of the optics 16′ to the plane 26′, would have to be changed in order to obtain a sharp image. In other words, the image plane would be at a different distance from a main plane of the uncorrected optics 16′ for each wavelength range.
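The wavelength dependence of the image distance can be made concrete with the lensmaker's equation and a simple dispersion model. The following sketch uses Cauchy coefficients roughly matching BK7 glass and an illustrative biconvex lens; all numbers are assumptions chosen for illustration, not values from this disclosure:

```python
def n_glass(lam_um):
    """Cauchy dispersion model, coefficients roughly matching BK7 glass."""
    return 1.5046 + 0.00420 / lam_um**2

def focal_length(lam_um, r1_mm=50.0, r2_mm=-50.0):
    """Thin biconvex lens via the lensmaker's equation."""
    return 1.0 / ((n_glass(lam_um) - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm))

def image_distance(f_mm, s_mm=200.0):
    """Thin-lens image distance for an object at s_mm."""
    return 1.0 / (1.0 / f_mm - 1.0 / s_mm)

f_blue = focal_length(0.450)   # approx. 450 nm (first wavelength range)
f_red = focal_length(0.660)    # approx. 660 nm (second wavelength range)
shift = image_distance(f_red) - image_distance(f_blue)
print(f_blue, f_red, shift)    # red focuses farther: the image distance grows
```

With these assumed parameters the red image distance lies on the order of a millimeter or two behind the blue one, which is exactly the longitudinal color error the invention exploits.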
[0058] According to the invention, the color error of the optics is used to separate a sharply imaged area of the workpiece surface for a two-dimensional image, in particular for a gray image, from an image of the light line for the laser triangulation. According to the present invention, a sensor device with an image sensor and optics is used for the three-dimensional detection of a workpiece surface using a light section method, the optics having different refractive indices for light in a first wavelength range and light in a second wavelength range. In parallel to capturing a sharp image of a light line reflected from a workpiece surface, a sharp image of a further area of the workpiece surface which is spaced away from the light line may be captured. With the help of the present invention, the area of the workpiece surface that is sharply imaged for light in the second wavelength range is spatially separated from the sharply imaged light line in the first wavelength range, in order to combine the advantage of Scheimpflug imaging with a gray image representation with great depth of focus. In this way, the depth of focus, which otherwise severely restricts the evaluation range in the gray image, can be extended to the measurement range of the Scheimpflug image.
[0060] Optics that has a color error, in particular a longitudinal color error or a (longitudinal) chromatic aberration, and thus different refractive indices for light of different wavelengths or wavelength ranges, is also referred to as “uncorrected optics” in the context of this disclosure. The optics may include at least one optical element, for example a lens, with such a color error. The uncorrected optics has different object planes for light of different wavelengths or wavelength ranges. In other words, the optics has different planes for light of different wavelength ranges which are imaged sharply on a predefined image plane, for example a sensor plane of an image sensor, by the optics. For each wavelength range, there is a corresponding object or focal plane which is imaged sharply on this image plane by the optics for light in this wavelength range.
[0063] The analysis device 10 is configured to carry out a method for analyzing a workpiece surface for a laser machining process according to embodiments of the present invention.
[0064] The analysis device 10 is configured to capture a two-dimensional image of the workpiece surface 22 and to evaluate the captured image in order to identify or analyze features of the workpiece surface. In particular, the analysis device 10 is configured to carry out a light section method or a light section triangulation method in order to detect a three-dimensional height profile of the workpiece surface 22. The analysis device 10 may detect, for example, a joint edge or step 23 of the workpiece surface 22 or a weld seam.
[0065] The analysis device 10 comprises a sensor device 12 with an image sensor 14 for capturing an image and optics 16 for imaging light on the image sensor 14 and may comprise an evaluation unit (not shown) for evaluating the image captured by image sensor 14. The image sensor 14 is a planar or two-dimensional optical sensor, for example a CMOS or CCD sensor. The optics 16 may be configured as a lens or an objective, but the invention is not limited thereto.
[0066] The analysis device 10 further comprises a light line unit 18 for generating a light line 20 of light of a first wavelength range on the workpiece surface 22. The light line unit 18 is configured to radiate a fan-shaped light beam 24 onto an area of the workpiece surface 22 in order to generate the light line 20 on the workpiece surface 22. In particular, the light line 20 may be oriented perpendicularly to a course of the joining edge or step 23. The light line unit 18 may be configured as a line laser. Accordingly, the light beam 24 may be a laser beam and the light line 20 may be a laser line, but the invention is not limited thereto. According to embodiments, the light line unit 18 is configured to generate a blue laser beam with a wavelength of approximately 400 nm to 500 nm. According to embodiments, the light line unit 18 is arranged in such a way that it can radiate the light beam 24 perpendicularly onto the workpiece surface 22.
[0067] The analysis device 10 further comprises an illumination unit 42 configured to generate light of a second wavelength range and to radiate it onto the area of the workpiece surface 22. According to embodiments, the illumination unit 42 is configured as red LED lighting and generates red light, e.g. with a wavelength of approximately 620 nm to 720 nm. The area of the workpiece surface 22 may be illuminated with light of the second wavelength range over a large area and/or non-directionally. The arrangement or orientation of the illumination unit 42 may be arbitrary as long as the area of the workpiece surface 22 is illuminated. The image sensor 14 is sensitive to both wavelength ranges.
[0068] The optics 16 has different refractive indices for light in the first wavelength range and light in the second wavelength range. The optics 16 may be referred to as an uncorrected optics. According to embodiments, an optical axis 17 of the optics 16 may lie in a plane that extends perpendicularly to the workpiece surface 22 and in parallel to the joint edge or step 23 (or through the joint edge or step 23).
[0069] A first plane 26 is defined by a light exit point of the light of the first wavelength range from the light line unit 18 and the light line 20. In other words, the first plane 26 is spanned by the planar fan-shaped light beam 24. The first plane 26, the optics 16 and a sensor plane of the image sensor 14 are arranged in the Scheimpflug arrangement, i.e. they meet the Scheimpflug condition. As a result, all points of the first plane 26 are imaged sharply on the sensor plane of the image sensor 14 by the optics 16. The first plane 26 may also be referred to as the “triangulation plane”. Since the first plane 26 also includes the light line 20, the light of the light line 20 reflected by the workpiece surface 22 is also imaged sharply on the image sensor 14. The first plane 26 intersects the workpiece surface 22 in a first line of intersection 44. All points of the workpiece surface 22 on planes intersecting the first plane 26 are imaged sharply within the depth of focus. The first line of intersection 44 of the first plane 26 with the workpiece surface 22 corresponds to the light line 20, i.e. coincides with the light line 20.
[0070] The first plane 26 is preferably arranged substantially perpendicular to the workpiece surface 22. In other words, an optical axis of the light line unit 18 may extend substantially perpendicularly to the workpiece surface 22. The optical axis 17 of the optics 16 may intersect the first plane 26 at an acute angle.
[0071] As previously described, the workpiece surface 22 is also illuminated by the illumination unit 42 with light in the second wavelength range. A second plane 30, for which the light of the second wavelength range is imaged sharply on the sensor plane of the image sensor 14 by the optics 16, differs from the first plane 26, as was explained above.
[0072] Moreover, there is a certain three-dimensional depth of focus range within which points are imaged sufficiently sharply by the optics 16 on the image sensor 14, in front of and behind the first plane 26 and in front of and behind the second plane 30 in the direction of the optical axis 17 of the optics 16. In other words, a first depth of focus surrounds the first plane 26 and a second depth of focus surrounds the second plane 30. Since the first plane 26 intersects the workpiece surface, all points on planes intersecting the first plane 26 within the first depth of focus are imaged sharply for reflected light of the first wavelength range. Since the second plane 30 intersects the workpiece surface, all points on planes intersecting the second plane 30 within the second depth of focus are imaged sharply for reflected light of the second wavelength range. Accordingly, a first partial area 28 of the workpiece surface 22, which surrounds the line of intersection 44 of the first plane 26 with the workpiece surface 22, is imaged sufficiently sharply on the image sensor 14 for light of the first wavelength range, and a second partial area 32 of the workpiece surface 22, which surrounds the line of intersection 46 of the second plane 30 with the workpiece surface 22, is imaged sufficiently sharply on the image sensor 14 for light of the second wavelength range. The partial area 28 lies in front of and behind the line of intersection 44 in the direction of the optical axis 17 of the optics 16. The partial area 32 lies in front of and behind the line of intersection 46 in the direction of the optical axis 17 of the optics 16. According to embodiments, the two partial areas 28, 32 are spaced apart from one another and do not intersect or overlap. Accordingly, the light line 20 of the first wavelength range does not lie in the second partial area 32, which is sharply imaged for light of the second wavelength range.
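The spacing between the two lines of intersection 44, 46 on the workpiece surface follows from simple geometry once the separation of the two object planes is known. As a simplified sketch, treating the two object planes as parallel (the numeric values are illustrative assumptions, not values from this disclosure):

```python
import math

def line_offset_on_surface(plane_sep_mm, plane_tilt_deg):
    """Lateral offset on the workpiece surface between the intersection
    lines of two parallel planes a perpendicular distance plane_sep_mm
    apart, each meeting the surface at an angle of plane_tilt_deg."""
    return plane_sep_mm / math.sin(math.radians(plane_tilt_deg))

# 2 mm plane separation, planes meeting the surface at 30 degrees:
print(line_offset_on_surface(2.0, 30.0))  # about 4.0 mm on the surface
# At 90 degrees (planes perpendicular to the surface) the offset equals
# the plane separation itself:
print(line_offset_on_surface(2.0, 90.0))  # about 2.0 mm
```

A shallow plane angle thus widens the offset between the light line and the sharply imaged gray-image strip, which is one way the predetermined offset could be tuned.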
[0073] Of course, it is also possible to use other central wavelengths for illumination, provided that they differ from one another, i.e. are spaced apart from one another. Here, central wavelength refers to the wavelength in the middle of a wavelength range. For this purpose, an illumination unit 42, 42′ may be provided for each illumination wavelength. Alternatively, an illumination unit 42 that radiates light in a large wavelength range, for example a broadband light source or white light source, may be provided together with one or more filters that transmit a plurality of wavelengths or wavelength ranges spaced apart from one another. Such a filter is arranged in the beam path upstream of the image sensor 14, i.e. it may either be arranged on the illumination unit 42 or in front of or behind the optics 16. When light of at least one further, third wavelength range is used for illumination, a further, third plane 31 results, which generates a third partial area 33 on the workpiece surface offset in parallel to the second partial area 32. Many wavelengths, such as those produced by an illumination unit with a continuous spectrum, generate many such planes, which intersect the workpiece surface in parallel to the second plane 30. Each wavelength generates a depth of focus range, i.e. a partial area on the workpiece surface, offset in parallel to the second partial area 32. In total, the surface area that is imaged sharply can thus be enlarged. This is shown as an example in the figures.
[0075] Thus, the sharply imaged light line 20 may be extracted from the captured image and evaluated in order to carry out the light section method for detecting the three-dimensional height profile of the workpiece surface. Based on the offset 47 described above, the sharply imaged second partial area 32 for the light of the second wavelength range, which is offset from the light line, may be extracted from the image in order to obtain a sharp image, in particular a gray image, of the second partial area 32 of the workpiece surface 22.
[0076] The readout of the image sensor, i.e. the extraction from the image, is optimized in such a way that the light line 20 is extracted and the second partial area 32 is extracted offset thereto, i.e. offset to the peak intensity of the light line 20. The second partial area 32 typically comprises between 20 and 100 rows in the image, depending on the imaging by the optics and the image sensor used. The points of the light line may be converted directly into distance values via calibration values.
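The extraction described above — locating the light line per column, converting it to heights, and reading out a row band at the predetermined offset — might be sketched as follows. The synthetic image, the linear row-to-height calibration and the offset/strip constants are hypothetical placeholders, not values from this disclosure:

```python
import numpy as np

H, W = 120, 16
rng = np.random.default_rng(0)
red = rng.uniform(0.2, 0.8, size=(H, W))     # diffusely lit scene (red channel)
blue = np.zeros((H, W))                      # light line channel (blue)

true_rows = 30 + np.linspace(0.0, 4.0, W)    # simulated light line row per column
rows = np.arange(H)
for col in range(W):
    blue[:, col] = np.exp(-0.5 * ((rows - true_rows[col]) / 1.5) ** 2)

# 1) Light line extraction: intensity center of gravity per image column.
weights = blue / blue.sum(axis=0)
line_rows = (rows[:, None] * weights).sum(axis=0)

# 2) Height profile via a hypothetical linear calibration (row -> mm).
MM_PER_ROW, ROW_AT_ZERO = 0.05, 30.0
height_mm = (line_rows - ROW_AT_ZERO) * MM_PER_ROW

# 3) Gray image: a band of red-channel rows at a predetermined offset
#    from the light line peak (both constants are assumptions).
OFFSET_ROWS, STRIP_ROWS = 40, 20
start = int(round(line_rows.mean())) + OFFSET_ROWS
gray_strip = red[start:start + STRIP_ROWS, :]
print(height_mm[:3], gray_strip.shape)
```

The center-of-gravity step recovers the line position to sub-pixel accuracy, which is what makes a direct conversion to distance values via calibration practical.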
[0077] According to embodiments, the analysis device 10 is arranged on a laser machining head for machining a workpiece by means of a laser beam. The analysis device may be attached to a housing of the laser machining head, for example.
[0079] The method comprises the following steps: First, a light line 20 of light in a first wavelength range is radiated into an area of a workpiece surface 22, and the area of the workpiece surface 22 is illuminated with light in a second wavelength range (S1). An image of the area of the workpiece surface 22 is then captured using a sensor device 12 (S2). The sensor device 12 comprises an image sensor 14 and optics 16 for imaging light on the image sensor 14, wherein the optics 16 has different refractive indices for the first and second wavelength ranges, and wherein a first plane 26, defined by the light line 20 and a light exit point of the light of the first wavelength range, the optics 16 and the image sensor 14 are arranged in a Scheimpflug arrangement. Thus, the image is captured based on light of the first wavelength range and the second wavelength range which is reflected by the area of the workpiece surface 22 and imaged by the optics 16 on the image sensor 14. Next, the image is evaluated in order to analyze features of the workpiece surface (S3).
[0080] The evaluation is based on the predetermined offset between the first plane 26 and a second plane 30 for which the light of the second wavelength range is sharply imaged by the optics 16 on the sensor plane of the image sensor 14. The specified offset may be a predetermined or known offset. The offset may be modeled, calculated or determined by measurement. The offset may be specified with respect to the workpiece surface or along an optical axis 17 of the optics 16 or the sensor device 12.
[0081] In the captured image, the sharply imaged light line 20 and the sharply imaged second partial area 32, offset therefrom, can thus be identified and extracted, as previously described.
[0082] The method according to the invention may be part of a method for machining a workpiece by means of a laser beam, for example laser welding. In such a method, a laser beam 48 is radiated onto a point 50 along a machining path 52 on the workpiece surface, and the light line 20 is radiated onto the workpiece surface in the advance 54 and/or in the wake 56 of the point 50 with respect to the machining direction 58.
[0083] According to embodiments, the method for analyzing a workpiece surface for a laser machining process may comprise moving the workpiece surface relative to the sensor device and repeating S1 to S3 described above. In other words, a plurality of images of the workpiece surface 22 may be captured one after the other. For this purpose, the respective second partial areas 32 extracted from the plurality of images may be combined to form a gray image of the workpiece surface 22. As a result, a sharp gray image of a larger area of the workpiece surface 22 can be obtained.
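Combining the strips extracted from successive captures into one gray image might look as follows; the sketch assumes the relative feed between frames equals exactly one strip height, i.e. it omits any overlap handling, which is a simplification:

```python
import numpy as np

def compose_gray_image(strips):
    """Stack the sharp strips extracted from successive frames into one
    composite gray image (assumes feed per frame == strip height)."""
    return np.vstack(strips)

# e.g. five frames, each contributing a sharp 20-row strip of width 64:
strips = [np.full((20, 64), frame / 4.0) for frame in range(5)]
mosaic = compose_gray_image(strips)
print(mosaic.shape)  # (100, 64)
```

In practice the feed per frame would be measured or commanded, and strips would be resampled or blended accordingly before stacking.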
[0084] By way of example, such a composite gray image 60 of a weld seam 62 is shown in the figures.
[0085] Correspondingly, the respective first partial areas 28 extracted from the plurality of images can be combined to form a three-dimensional height profile of a larger area of the workpiece surface 22.
[0086] The present invention is based on the idea that an optics with a longitudinal color error images the object planes differently for different wavelengths. In this case, the image plane is at a different distance from a main plane of the optics for each wavelength. This property may be used in a laser machining process in order to separate an area of a workpiece surface that is sharply imaged by the optics on an image sensor from an image of a light section line for a light section method. As a result, the advantage of a large measuring range in a Scheimpflug arrangement or image can be combined with the advantage of a large depth of focus in a gray image display. This means that the depth of focus, which otherwise severely restricts an evaluation area in the gray image, is extended to the measurement range of the Scheimpflug image.
LIST OF REFERENCE SYMBOLS
[0087] 10 analysis device
[0088] 12 sensor device
[0089] 14 image sensor
[0090] 16 optics
[0091] 17 optical axis of the optics
[0092] 18 light line unit
[0093] 20 light line
[0094] 22 workpiece surface
[0095] 23 workpiece step
[0096] 24 light beam
[0097] 26 first plane
[0098] 28 first partial area
[0099] 30 second plane
[0100] 32 second partial area
[0101] 31 third plane
[0102] 33 third partial area
[0103] 34 sensor plane
[0104] 36 light of a first wavelength range
[0105] 38 light of a second wavelength range
[0106] 40 main plane
[0107] 42 illumination unit
[0108] 44 first line of intersection
[0109] 46 second line of intersection
[0110] 45 third line of intersection
[0111] 47 offset
[0112] 48 laser beam
[0113] 50 point on the machining path
[0114] 52 machining path
[0115] 54 advance
[0116] 56 wake
[0117] 58 machining direction
[0118] 60 gray image
[0119] 62 weld seam