METHOD AND DEVICE FOR DEMONSTRATING THE INFLUENCE OF CUTTING PARAMETERS ON A CUT EDGE

20220339739 · 2022-10-27

    Abstract

    A method for recognizing cutting parameters which are particularly important for specific features of a cut edge. A recording of the cut edge is analyzed by an algorithm having a neural network for determining the cutting parameters. Those recording pixels which play a significant part in ascertaining the cutting parameters are identified by backpropagation of this analysis. An output in the form of a representation of these significant recording pixels, in particular in the form of a heat map, shows a user of the method which cutting parameters need to be changed in order to improve the cut edge. A computer program product and a device for carrying out the method are also provided.

    Claims

    1. A method for analyzing a cut edge created by a machine tool, the method comprising the following steps: reading in at least one recording of the cut edge, the recording having a multiplicity of recording pixels; analyzing the recording by way of a trained neural network for determining at least one cutting parameter; analyzing a backpropagation of the neural network for determining a relevance of the recording pixels for ascertaining the determined at least one cutting parameter; and outputting the recording with identification of at least one of particularly relevant recording pixels or particularly irrelevant recording pixels.

    2. The method according to claim 1, wherein the trained neural network is a convolutional neural network having a plurality of layers.

    3. The method according to claim 2, wherein each of the plurality of layers has a plurality of filters.

    4. The method according to claim 1, wherein the backpropagation is a layer-wise relevance propagation.

    5. The method according to claim 4, wherein an assignment of the relevance in the layer-wise relevance propagation is based on deep Taylor decomposition.

    6. The method according to claim 1, wherein the identification of the particularly relevant and/or particularly irrelevant recording pixels is outputted as a heat map.

    7. The method according to claim 1, wherein the recording is an RGB photograph or a 3D point cloud.

    8. The method according to claim 1, further comprising creating the recording via a camera.

    9. The method according to claim 8, wherein the camera is a camera of the machine tool.

    10. The method according to claim 1, further comprising creating the cut edge with the machine tool.

    11. The method according to claim 10, wherein the machine tool is a laser cutting machine.

    12. The method according to claim 11, wherein the at least one cutting parameter comprises: beam parameters; transport parameters; gas dynamics parameters; and/or material parameters.

    13. The method according to claim 12, wherein the beam parameters are a focus diameter and/or laser power.

    14. The method according to claim 12, wherein the transport parameters are focus position, nozzle-focus distance and/or feed.

    15. The method according to claim 12, wherein the gas dynamics parameters are gas pressure and/or nozzle-workpiece distance.

    16. The method according to claim 12, wherein the material parameters are degree of gas purity and/or melting point of the workpiece.

    17. A computer program product configured for carrying out the method according to claim 1, wherein the computer program product comprises the neural network.

    18. A device, comprising: a machine tool; and a computer having a computer program product configured for carrying out the method according to claim 1, wherein the computer program product comprises the neural network.

    19. The device according to claim 18, wherein the machine tool is a laser cutting machine.

    20. The device according to claim 18, further comprising a camera.

    Description

    BRIEF DESCRIPTION OF THE FIGURES

    [0050] FIG. 1 shows a schematic illustration of a machine tool in the form of a laser cutting machine for elucidating essential cutting parameters.

    [0051] FIG. 2 shows a schematic overview of the method according to the invention comprising the following method steps:

    [0052] A) creating the cut edge with a plurality of cutting parameters;

    [0053] B) creating a recording of the cut edge;

    [0054] C) reading in the recording;

    [0055] D) analyzing the recording by means of a neural network for determining the cutting parameters;

    [0056] E) backpropagation of the neural network for determining the relevance of the recording pixels with respect to the determined cutting parameters; and

    [0057] F) identified representation of the relevant and/or irrelevant recording pixels.

    [0058] FIG. 3 shows recordings of various cut edges.

    [0059] FIG. 4 schematically shows the functioning of the neural network or more specifically of the backpropagation.

    [0060] FIG. 5 shows, in the left-hand column, recordings of two cut edges and, in the further columns, determined cutting parameters and outputs of the recordings with highlighting of the recording pixels that are relevant for the determined cutting parameters.

    DETAILED DESCRIPTION OF THE INVENTION

    [0061] FIG. 1 shows part of a machine tool 10 in the form of a laser cutting machine. In this case, a cutting head 12 passes over a workpiece 14 with the workpiece 14 being subjected to laser irradiation and exposure to gas. A cut edge 16 is produced in the process. The cut edge 16 is influenced in particular by the following cutting parameters 18: gas pressure 20, feed 22, nozzle-workpiece distance 24, nozzle-focus distance 26 and/or focus position 28.

    [0062] The influence of the individual cutting parameters 18 on the appearance of the resulting cut edge 16 is largely unclear even to experts. If striation occurs on the cut edge 16, for example, the cutting parameters 18 must be varied until the striation disappears. Such variation is associated with a high consumption of material, energy and time, and it frequently produces new artefacts of its own. There is therefore a need for a method and a device by which cutting parameters 18 are assigned to the features of a cut edge 16 in a targeted manner. These cutting parameters 18 can then be changed in order to change the feature of the cut edge 16. The invention therefore solves a problem which, on account of its complexity, cannot be solved by human users ("superhuman performance").

    [0063] FIG. 2 shows an overview of the method according to the invention. In method step A), the cut edge 16 is produced by the machine tool 10 using the cutting parameters 18. In method step B), the cut edge 16 (see method step A)) is recorded using a camera 30. The camera 30 can be configured in the form of a photographic camera and/or a video camera. In method step C), the created recording 32 is read in. In method step D), the recording 32 is analyzed by an algorithm 34. The algorithm 34 has a neural network 36. The neural network 36 serves for determining 38 the cutting parameters 18. The determined cutting parameters 18 can be compared with the set cutting parameters 18 (see method step A)), for example in order to determine a defect of the machine tool 10 (see method step A)).
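    Purely by way of illustration of method steps C) and D), the following Python sketch (using PyTorch) shows how a read-in recording could be passed through a trained network in order to determine cutting parameters and compare them with the set values. The function and parameter names as well as the tolerance check are assumptions made for this example and are not taken from the disclosure.

```python
import torch

# Illustrative sketch only: helper names, parameter list and tolerance
# check are assumptions, not details taken from the disclosure.
def analyze_recording(recording: torch.Tensor,
                      network: torch.nn.Module,
                      set_parameters: dict,
                      rel_tolerance: float = 0.1) -> dict:
    """Determine cutting parameters from a recording of the cut edge and
    compare them with the parameters actually set on the machine tool."""
    network.eval()
    with torch.no_grad():
        # recording: channels x height x width tensor of the cut edge
        predicted = network(recording.unsqueeze(0)).squeeze(0)

    names = ["gas_pressure", "feed", "nozzle_workpiece_distance",
             "nozzle_focus_distance", "focus_position"]
    report = {}
    for name, value in zip(names, predicted.tolist()):
        set_value = set_parameters[name]
        deviation = abs(value - set_value)
        # A large deviation between the determined and the set parameter
        # may indicate a defect of the machine tool (method step A).
        report[name] = {
            "determined": value,
            "set": set_value,
            "suspicious": deviation > rel_tolerance * max(abs(set_value), 1e-6),
        }
    return report
```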

    [0064] In method step E), the algorithm 34 effects a backpropagation 40 in the neural network 36. The backpropagation 40 of the cutting parameters 18 with respect to the recording 32 establishes the relevance of individual recording pixels 42a, 42b of the recording 32 when determining the cutting parameters 18 in method step D). In method step F), the recording pixels 42a, b are represented (only the recording pixels 42a, b being provided with a reference sign in FIG. 2, for reasons of clarity) and their respective relevance is identified. In the present case, the particularly relevant recording pixel 42a is identified using a first color (for example red) and the particularly irrelevant recording pixel 42b is identified using a second color (for example blue or gray). Owing to formal stipulations, the different colors are represented by different patterns (hatchings) in the present description. On the basis of the particularly relevant recording pixels 42a, a user can directly recognize which regions of the recorded cut edge 16 (see method step A)) are particularly influenced by the cutting parameter 18 respectively determined (see method step D)).

    [0065] FIG. 3 shows by way of example three recordings 32a, 32b, 32c, the recordings 32a-c having been created with different cutting parameters 18 (see FIG. 1):

    TABLE-US-00001
    recording 32a:
        gas pressure 20                15 bar
        feed 22                        21 m/min
        nozzle-workpiece distance 24   1.5 mm
        nozzle-focus distance 26       −2 mm

    [0066] By comparison therewith, recording 32b was created with an increased nozzle-focus distance 26. Recording 32c was created with a reduced feed 22 by comparison with recording 32a. It is evident from FIG. 3 that the influence of the cutting parameters 18 (see FIG. 1) is not directly inferable from the recordings 32a-c for human users.

    [0067] FIG. 4 schematically shows the algorithm 34 or, more specifically, the neural network 36. The neural network 36 is constructed as a convolutional neural network having a plurality of blocks 44a, 44b, 44c, 44d, 44e. An input block 44a is provided, and the blocks 44b-e each have three convolutional layers 46a, 46b, 46c, 46d, 46e, 46f, 46g, 46h, 46i, 46j, 46k, 46l. The blocks 44a-e have filters 48a, 48b, 48c, 48d, 48e: each layer of the input block 44a has 32 filters 48a, the layers of block 44b likewise have 32 filters 48b, the layers of block 44c have 64 filters 48c, the layers of block 44d have 128 filters 48d, and the layers of block 44e have 256 filters 48e. The filters 48a-e reduce the resolution of the recording 32 (e.g. from 200 pixels×200 pixels to 7 pixels×7 pixels) while at the same time increasing the depth (the number of channels). In each block 44a-e it is the filters 48a-e of the third layer that reduce the resolution; convolutional layers are thus also used for the pooling. The depth increases from one block 44a-e to the next. By way of example, block 44b consists of three convolutional layers, each having 32 filters: in the first two layers the spatial resolution is 112×112 pixels, and from the second to the third layer it decreases from 112×112 pixels to 56×56 pixels. Upon the transition from block 44b (last layer) to block 44c (first layer), the depth increases from 32 to 64 while the spatial resolution remains constant.
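    As a minimal sketch of the block structure described above, the following PyTorch model reproduces the sequence of blocks with 32, 32, 64, 128 and 256 filters, using a stride-2 convolution as the third layer of each block. The 224×224 pixel input size, the 3×3 kernels and the regression head are assumptions chosen only to obtain a runnable example and are not taken from the disclosure.

```python
import torch
import torch.nn as nn

def conv_block(in_channels: int, out_channels: int) -> nn.Sequential:
    """Three convolutional layers; the third one halves the spatial
    resolution (convolutional pooling, as described for blocks 44a-e)."""
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_channels, out_channels, kernel_size=3, stride=2, padding=1),
        nn.ReLU(inplace=True),
    )

class CutEdgeNet(nn.Module):
    """Convolutional network with blocks of 32, 32, 64, 128 and 256 filters."""
    def __init__(self, num_parameters: int = 5):
        super().__init__()
        self.blocks = nn.Sequential(
            conv_block(3, 32),     # block 44a: 224x224 -> 112x112
            conv_block(32, 32),    # block 44b: 112x112 -> 56x56
            conv_block(32, 64),    # block 44c: 56x56 -> 28x28
            conv_block(64, 128),   # block 44d: 28x28 -> 14x14
            conv_block(128, 256),  # block 44e: 14x14 -> 7x7
        )
        self.head = nn.Sequential(  # regression head for the cutting parameters (assumption)
            nn.Flatten(),
            nn.Linear(256 * 7 * 7, num_parameters),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.blocks(x))
```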

    [0068] The neural network 36 thereby enables determining 38 of the cutting parameters 18. In the present case, layer-wise relevance propagation is used in the backpropagation 40. The results of this are illustrated in FIG. 5.
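    The layer-wise relevance propagation itself is not reproduced in detail in the disclosure. The following NumPy sketch of the z+ rule (one common LRP rule, which for ReLU layers corresponds to a deep Taylor decomposition) merely illustrates how a relevance value assigned to a determined cutting parameter can be redistributed back to the inputs of one layer; applied layer by layer through the whole network, this yields the pixel relevances visualized in FIG. 5. The toy network and its dimensions are assumptions for the example.

```python
import numpy as np

def lrp_zplus(a: np.ndarray, W: np.ndarray, R_out: np.ndarray,
              eps: float = 1e-9) -> np.ndarray:
    """Propagate the relevance R_out of one ReLU layer back to its inputs
    using the z+ rule (only positive weight contributions count)."""
    Wp = np.maximum(W, 0.0)   # positive part of the weights
    z = a @ Wp + eps          # forward activations with positive weights
    s = R_out / z             # relevance per unit of activation
    return a * (s @ Wp.T)     # redistribute relevance to the inputs

# Tiny two-layer example: 4 "pixels" -> 3 hidden units -> 2 cutting parameters.
rng = np.random.default_rng(0)
a0 = rng.random(4)                        # input "recording pixels"
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(3, 2))
a1 = np.maximum(a0 @ W1, 0.0)             # hidden ReLU activations
out = a1 @ W2                             # determined cutting parameters

R2 = np.zeros(2)
R2[0] = out[0]                            # start relevance at one parameter
R1 = lrp_zplus(a1, W2, R2)                # back to the hidden layer
R0 = lrp_zplus(a0, W1, R1)                # back to the input pixels
print(R0)                                 # pixel relevances for a heat map
```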

    [0069] FIG. 5 shows the recording 32a in the upper row and the recording 32b in the lower row. The recordings 32a, b are reproduced a number of times in each row, the recording pixels 42a, b influenced greatly or little by the feed 22 being highlighted in the second column, the recording pixels 42a, b influenced greatly or little by the focus position 28 being highlighted in the third column, and the recording pixels 42a, b influenced greatly or little by the gas pressure 20 being highlighted in the fourth column. In this case, the outputs 50 can be present in the form of heat maps.
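    Purely as an illustration of such an output 50, the following matplotlib sketch overlays per-pixel relevances on a recording as a heat map, one image per determined cutting parameter. The array contents, the colormap and the threshold for showing only particularly relevant pixels are assumptions for the example.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_heat_map(recording: np.ndarray, relevance: np.ndarray,
                  title: str, threshold: float | None = None) -> None:
    """Overlay a relevance heat map on a grayscale recording of the cut edge.
    If a threshold is given, only the particularly relevant pixels are shown."""
    if threshold is not None:
        relevance = np.where(relevance >= threshold, relevance, np.nan)
    plt.imshow(recording, cmap="gray")
    plt.imshow(relevance, cmap="jet", alpha=0.5)  # warm colors = highly relevant
    plt.title(title)
    plt.axis("off")
    plt.show()

# Example call with dummy data standing in for a recording and
# the relevances determined for one cutting parameter (e.g. the feed):
recording = np.random.rand(200, 200)
relevance = np.random.rand(200, 200)
plot_heat_map(recording, relevance, "feed", threshold=0.8)
```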

    [0070] The recording pixels 42b influenced particularly little by the respective cutting parameter 18 (see FIG. 1) serve primarily for checking the plausibility of the outputs 50. Preferably, in the outputs 50, only the recording pixels 42a influenced particularly greatly by the respective cutting parameter 18 (see FIG. 1) are highlighted in order to facilitate handling of the outputs 50 for a user.

    [0071] Taking all the figures of the drawing jointly into consideration, the invention relates in summary to a method for recognizing cutting parameters 18 which are particularly important for specific features of a cut edge 16. In this case, a recording 32, 32a-c of the cut edge 16 is analyzed by an algorithm 34 having a neural network 36 for determining 38 the cutting parameters 18. Those recording pixels 42a, b which play a significant part in ascertaining the cutting parameters 18 are identified by backpropagation 40 of this analysis. An output 50 in the form of a representation of these significant recording pixels 42a, b, in particular in the form of a heat map, shows a user of the method which cutting parameters 18 need to be changed in order to improve the cut edge 16. The invention furthermore relates to a computer program product and to a device for carrying out the method.

    [0072] The following is a summary list of reference numerals and the corresponding structure used in the above description of the invention:

    10 Machine tool
    12 Cutting head
    14 Workpiece
    16 Cut edge
    18 Cutting parameters
    20 Gas pressure
    22 Feed
    24 Nozzle-workpiece distance
    26 Nozzle-focus distance
    28 Focus position
    30 Camera
    32, 32a-c Recording
    34 Algorithm
    36 Neural network
    38 Determining the cutting parameters 18
    40 Backpropagation
    42a, b Recording pixels
    44a-e Blocks of the neural network 36
    46a-l Layers of the neural network 36
    48a-e Filters of the neural network 36
    50 Output