METHOD FOR SELECTION OF CAMERA IMAGE SECTIONS

20240420306 · 2024-12-19

    Abstract

    A method for monitoring the process in laser material processing is provided, comprising the steps of: capturing, with a camera arranged in or on a laser material processing head, a real-time image comprising the position and surroundings of the process at which material processing occurs; determining at least one image section in the real-time image and its position on a camera sensor; determining an actual position of the process in the material processing and a nominal position of the relevant image section using a projection of programmed path data for controlling the laser material processing head into the section of the real-time image; and transferring the at least one image section from the camera to a computer.

    Claims

    1. A method of monitoring laser material processing of a workpiece by a laser material processing head, the method comprising: capturing real-time image data of a spatial area of the workpiece that includes a process point of the laser material processing performed by the laser material processing head; selecting, at a current time (T1), a region of interest of the real-time image data, the region of interest including: a current position within the real-time image data of the process point at the current time (T1); and a desired position within the real-time image data of the process point at a future time (T2) in accordance with programmed path data used to control the process point of the laser material processing head; and transferring the real-time image data from within the selected region of interest to a computer configured to calculate a deviation between an actual position of the process point at the future time (T2) and the desired position of the process point at the future time (T2).

    2. The method of claim 1, wherein the region of interest of the real-time image data for each of a plurality of times during the laser material processing is predetermined.

    3. The method of claim 2, wherein the programmed path data includes, for each of the plurality of times, both the desired position of the process point and the region of interest of the real-time image data.

    4. The method of claim 1, wherein selecting the region of interest comprises: identifying the current position within the real-time image data of the process point at the current time (T1); identifying, based on the programmed path data, the desired position within the real-time image data of the process point at the future time (T2); and selecting a region of interest that includes both the current position of the process point at the current time (T1) and the desired position of the process point at the future time (T2).

    5. The method of claim 4, wherein the current position within the real-time image data of the process point at the current time (T1) and the desired position within the real-time image data of the process point at the future time (T2) are identified before the current time (T1).

    6. The method of claim 5, wherein the region of interest of the real-time image data for the current time (T1) is predetermined.

    7. The method of claim 6, wherein the programmed path data includes the region of interest of the real-time image data for the current time (T1).

    8. The method of claim 1, wherein the real-time image data within the region of interest is smaller than the real-time image data of the spatial area.

    9. The method of claim 1, further comprising: predicting, before the future time (T2), a region of interest of real-time image data captured at the future time (T2), the predicted region of interest including: a position of the process point at the future time (T2) within the real-time image data captured at the future time (T2); and a desired position, within the real-time image data captured at the future time (T2), of the process point at a further future time (T3).

    10. The method of claim 9, wherein the region of interest at the future time (T2) is predicted based on the programmed path data and the deviation between the actual position of the process point at the future time (T2) and the desired position of the process point at the future time (T2).

    11. A system for monitoring laser material processing of a workpiece by a laser material processing head, comprising: non-transitory computer readable storage media that stores programmed path data used to control a process point of the laser material processing head; a camera sensor that captures real-time image data of a spatial area of the workpiece that includes the process point of the laser material processing performed by the laser material processing head; a hardware controller that, at a current time (T1), selects a region of interest of the real-time image data that includes: a current position within the real-time image data of the process point at the current time (T1); and a desired position within the real-time image data of the process point at a future time (T2) in accordance with programmed path data used to control the process point of the laser material processing head; and a computer that receives the real-time image data from within the selected region of interest and calculates a deviation between an actual position of the process point at the future time (T2) and the desired position of the process point at the future time (T2).

    12. The system of claim 11, wherein the region of interest of the real-time image data for each of a plurality of times during the laser material processing is predetermined.

    13. The system of claim 12, wherein the region of interest of the real-time image data for each of the plurality of times is pre-stored in the computer readable storage media.

    14. The system of claim 11, wherein selecting the region of interest comprises: identifying the current position within the real-time image data of the process point at the current time (T1); identifying, based on the programmed path data, the desired position within the real-time image data of the process point at the future time (T2); and selecting a region of interest that includes both the current position of the process point at the current time (T1) and the desired position of the process point at the future time (T2).

    15. The system of claim 14, wherein the current position within the real-time image data of the process point at the current time (T1) and the desired position within the real-time image data of the process point at the future time (T2) are identified before the current time (T1).

    16. The system of claim 15, wherein the region of interest of the real-time image data for the current time (T1) is predetermined.

    17. The system of claim 16, wherein the region of interest of the real-time image data for the current time (T1) is pre-stored in the computer readable storage media.

    18. The system of claim 11, wherein the real-time image data within the region of interest is smaller than the real-time image data of the spatial area.

    19. The system of claim 11, wherein the hardware controller is further configured to predict, before the future time (T2), a region of interest of real-time image data captured at the future time (T2), the predicted region of interest including: a position of the process point at the future time (T2) within the real-time image data captured at the future time (T2); and a desired position, within the real-time image data captured at the future time (T2), of the process point at a further future time (T3).

    20. The system of claim 19, wherein the hardware controller is configured to predict the region of interest at the future time (T2) based on the programmed path data and the deviation between the actual position of the process point at the future time (T2) and the desired position of the process point at the future time (T2).

    Description

    BRIEF DESCRIPTION OF THE FIGURES

    [0041] The invention will be described based on figures. It will be understood that the embodiments and aspects of the invention described in the figures are only examples and do not limit the protective scope of the claims in any way. The invention is defined by the claims and their equivalents. It will be understood that features of one aspect or embodiment of the invention can be combined with a feature of a different aspect or aspects of other embodiments of the invention, in which:

    [0042] FIG. 1 shows in its left part the seam to be welded (line, 1), the working area of the remote welding head (large rectangle, 2), and the camera image (small square, 5). The middle part shows the camera image (large square, 5), the seam to be welded (line, 1), and the image section (area of interest, AOI, small rectangle, 10). The right part of the figure shows the image section (large square, 10), the seam to be welded (line, 1), and the image center (point in the square, 12).

    DETAILED DESCRIPTION OF THE INVENTION

    [0043] The above-stated object of the invention is achieved by the features of the independent claims. The dependent claims cover further specific embodiments of the invention.

    [0044] The invention provides a method by which the material processing of a workpiece can be monitored, or even corrected, before, during, or after the joining or cutting process. In this case, the future position of the material processing is calculated in advance by combining the path data along which the processing head is to be guided with a section of a real-time image of the material processing.

    [0045] By means of the method according to the invention, it is possible to monitor ahead of the process, that is to say, the location where welding is going to take place soon. The deviation between the nominal and the actual position of the material processing can thus be detected and corrected. In this way, the position of the (future) process can be corrected, rather than the current process or the current position of the material processing; time is needed for such a correction. The method according to the invention advantageously makes it possible to look ahead a few millimeters (i.e., into the future). The approximate future location can be deduced, in particular in the remote welding process, from the current welding position in combination with the known desired position from the programmed path.
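
    As an illustrative sketch only, the look-ahead step described above could be expressed as follows in Python; the function name, the two-dimensional path representation, and the 3 mm default look-ahead distance are assumptions made for this example and are not prescribed by the invention:

```python
import numpy as np

def predict_future_position(current_pos, path_points, lookahead_mm=3.0):
    """Estimate where the process will be a few millimeters ahead (hypothetical helper).

    current_pos:  measured (x, y) process position at the current time T1, in mm.
    path_points:  Nx2 array of programmed (nominal) path coordinates, in mm.
    lookahead_mm: assumed look-ahead distance along the path.
    """
    path = np.asarray(path_points, dtype=float)
    pos = np.asarray(current_pos, dtype=float)
    # Find the programmed path point closest to the current actual position.
    idx = int(np.argmin(np.linalg.norm(path - pos, axis=1)))
    # Walk forward along the nominal path until the accumulated arc length
    # reaches the look-ahead distance; that point approximates the future position.
    travelled = 0.0
    for i in range(idx, len(path) - 1):
        travelled += float(np.linalg.norm(path[i + 1] - path[i]))
        if travelled >= lookahead_mm:
            return path[i + 1]
    return path[-1]  # the programmed path ends within the look-ahead window
```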

    [0046] A position may also be considered at which the process has already taken place, in the sense of inspecting where welding occurred. In this case, quality control is usually the aim. The relevant position can be deduced from the position where welding took place, i.e., from the data that are available to the remote welding head, possibly including the correction discussed in the previous paragraph.

    [0047] It is also possible to look at the ongoing process, i.e., the position at which welding is currently being carried out. This variant offers the possibility of regulating the process itself. This location, too, is well known, both in real space and on the camera.

    [0048] Due to the use of the planned and known path data of the laser material processing head, the method of the present invention makes it possible not only to match the actual state with a nominal state or to accurately reproduce the position of the material processing process, but also, on the basis of the previously known data for the path guidance of the laser processing head, to take directly upcoming events into account. The selection of the image area can thus already be planned in advance, and the processing of the image is faster because fewer pixels have to be transmitted and processed. As a result, relatively complex algorithms can be used while it is nevertheless ensured that the relevant image sections are processed or determined fast enough.

    [0049] In the method according to the invention, the image area can be selected freely within the search range of the camera at runtime. In connection with the method according to the invention, it should be emphasized that the selection of the image area is accompanied by a reduction in the number of pixels. The measured deviations from the programmed path are calculated, and the relevant section in the camera image is predicted on that basis. In the method according to the invention, it is also possible to use only the programmed path for the prediction, that is to say, entirely without seam guidance.
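
    A minimal sketch of this prediction step, under the assumption that the correction is a simple positional offset; the coordinates and the measured deviation are invented values for illustration:

```python
import numpy as np

def predicted_aoi_center(nominal_next, deviation=(0.0, 0.0)):
    """Shift the programmed next position by the last measured deviation.

    nominal_next: desired process position for the next step, from the programmed path.
    deviation:    last measured offset between the actual and the nominal seam;
                  leaving it at (0, 0) gives the pure feed-forward variant that
                  uses only the programmed path, without any seam guidance.
    """
    return np.asarray(nominal_next, dtype=float) + np.asarray(deviation, dtype=float)

# Invented numbers for illustration: nominal next position shifted by a measured offset.
center = predicted_aoi_center(nominal_next=(412.0, 305.0), deviation=(1.5, -0.8))
print(center)  # [413.5 304.2]
```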

    [0050] FIG. 1 shows in its left part the seam 1 to be welded, the working space 2 of the remote welding head, and the camera image 5. In the center of FIG. 1, the camera image 5, the seam 1 to be welded, and the relevant area of interest (AOI) are shown. The image section 10 marks the part of the entire camera image 5 that is sufficient for the method according to the invention.

    [0051] In the right part of FIG. 1, the image section 10 is shown, in which the seam 1 to be welded and the image center 12 can be seen. The image center 12 corresponds to the position at which the remote welding head predicts the seam.

    [0052] The remote welding head can weld seams anywhere in its working space 2 (FIG. 1, left part of the image). There are also no restrictions on the orientation of the seams relative to the workspace and thus to the remote welding head.

    [0053] The seams are preprogrammed, so the remote welding head knows the course of the seams in three-dimensional space. However, small deviations may occur during processing, because the workpiece has a different shape than planned, the preparation of the workpiece was less accurate, or the welding head is oriented differently in space. In order to ensure that welding takes place exactly where it should, according to the invention a point is considered which is to be welded in immediate proximity, i.e., a few millimeters ahead of the position of the actual welding process. Because this point can in principle lie anywhere, a camera captures the entire spatial area around the position of the welding process (see FIG. 1, middle).

    [0054] According to the present invention, it is intended not to use the entire camera image, since too many pixels would enter the calculation, making the calculation more complex and requiring more computing time. Furthermore, when the entire image is used, it contains many details that are not relevant and that may interfere with seam-detection algorithms.
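
    To make the data-reduction argument concrete, the following sketch crops an image section out of a full frame; the sensor resolution (1024 x 1024) and section size (128 x 128) are assumed values chosen for illustration, not taken from the invention:

```python
import numpy as np

FULL_W, FULL_H = 1024, 1024  # assumed full camera frame size, in pixels
AOI_W, AOI_H = 128, 128      # assumed size of the selected image section

frame = np.zeros((FULL_H, FULL_W), dtype=np.uint8)  # stand-in for a camera frame

def crop_aoi(frame, cx, cy, w=AOI_W, h=AOI_H):
    """Cut the image section around (cx, cy), clamped to the sensor borders."""
    x0 = min(max(cx - w // 2, 0), frame.shape[1] - w)
    y0 = min(max(cy - h // 2, 0), frame.shape[0] - h)
    return frame[y0:y0 + h, x0:x0 + w]

aoi = crop_aoi(frame, cx=500, cy=300)
# Only about 1.6% of the pixels have to be transferred and processed:
print(aoi.size / frame.size)  # 0.015625, i.e., a 64-fold reduction
```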

    [0055] According to the invention, therefore, a small image section (AOI, area of interest) is selected. Welding is intended to take place in this image section at time T2. Since T2 lies in the future, there is still time for a correction. Thus, already at time T1, the current time of processing, it has to be determined where the image section for time T2 will lie exactly. This is calculated from the current (T1) position of the welding process and the programmed path, ultimately from the course of the nominal seam. Since the determination takes place in close spatial proximity, a few millimeters ahead of the position of the welding process, the deviation between the nominal and the actual seam is small, and the actual seam is still located within the selected image section (FIG. 1, right part of the image).
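
    The selection at time T1 could be sketched as follows; centering the section on the midpoint so that both the T1 and the T2 position fall inside it is one possible design choice, and all coordinates and sizes are assumed:

```python
import numpy as np

def select_aoi(pos_t1, desired_t2, aoi_size=(128, 128)):
    """At the current time T1, choose the image section (AOI) for time T2.

    pos_t1:     actual process position at T1, in sensor pixels.
    desired_t2: desired process position at T2, from the programmed path.
    Returns (x0, y0, width, height) of a section containing both positions.
    """
    p1 = np.asarray(pos_t1, dtype=float)
    p2 = np.asarray(desired_t2, dtype=float)
    w, h = aoi_size
    # Centering on the midpoint keeps both positions inside the section,
    # provided the look-ahead distance does not exceed the section size.
    assert abs(p2[0] - p1[0]) <= w and abs(p2[1] - p1[1]) <= h, \
        "look-ahead distance exceeds the chosen AOI size"
    cx, cy = (p1 + p2) / 2.0
    return int(round(cx - w / 2)), int(round(cy - h / 2)), w, h

x0, y0, w, h = select_aoi(pos_t1=(500, 300), desired_t2=(540, 310))
print(x0, y0, w, h)  # 456 241 128 128
```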

    [0056] FIG. 1 and the text above describe the method of selecting an image section when it is to be used for seam tracking. A similar procedure can be used to look into the welding process itself or to capture images of the already welded seam, e.g., for quality control. These positions can be calculated exactly from the available data. Likewise, these methods can be applied to other laser material processing processes, such as laser cutting. A sketch of the overall data flow follows below.
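
    For orientation only, the overall data flow described above (capture, select a section, transfer only that section, compare actual with nominal) can be summarized in a short sketch; all names, sizes, and the stub seam detector are hypothetical and not part of the claims:

```python
import numpy as np

def capture_frame():
    """Stub for the camera: a full frame of the spatial area (assumed 1024 x 1024)."""
    return np.zeros((1024, 1024), dtype=np.uint8)

def monitor_step(pos_t1, desired_t2, measure_actual, aoi=(128, 128)):
    """One monitoring cycle at the current time T1 for the future time T2.

    pos_t1:         current process-point position on the sensor (pixels).
    desired_t2:     desired position at T2 from the programmed path data.
    measure_actual: callable that locates the actual process point at T2
                    inside the transferred section (e.g., a seam detector).
    """
    frame = capture_frame()
    w, h = aoi
    # Select a region of interest containing both the T1 and the T2 position.
    cx, cy = (np.asarray(pos_t1, float) + np.asarray(desired_t2, float)) / 2.0
    x0, y0 = int(cx - w / 2), int(cy - h / 2)
    section = frame[y0:y0 + h, x0:x0 + w]  # only this section is transferred
    actual_t2 = measure_actual(section, (x0, y0))
    return np.asarray(actual_t2, float) - np.asarray(desired_t2, float)

# Usage with a dummy detector that simply reports an assumed actual position:
deviation = monitor_step((500, 300), (540, 310), lambda sec, origin: (541.0, 309.4))
print(deviation)  # [ 1.  -0.6]
```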

    [0057] The foregoing description of the preferred embodiment of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The embodiment was chosen and described in order to explain the principles of the invention and its practical application to enable one skilled in the art to utilize the invention in various embodiments as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto, and their equivalents. The entirety of each of the aforementioned documents is incorporated by reference herein.

    REFERENCE NUMERALS

    [0058] 1 seam (to be welded)
    [0059] 2 working space of the remote welding head
    [0060] 5 camera image
    [0061] 10 image section
    [0062] 12 image center