MONITORING OF A LASER MACHINING PROCESS USING A NEUROMORPHIC IMAGE SENSOR
20230036295 · 2023-02-02
Inventors
CPC classification
B23K31/006
PERFORMING OPERATIONS; TRANSPORTING
International classification
B23K26/03
PERFORMING OPERATIONS; TRANSPORTING
B23K31/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A system for monitoring a laser machining process on a workpiece is disclosed. The system includes: a neuromorphic image sensor configured to generate image data of the laser machining process, and a computing unit configured to determine input data based on the image data, and to determine output data based on the input data by means of a transfer function, the output data containing information about the laser machining process. Further, a method for monitoring a laser machining process on a workpiece is disclosed.
Claims
1. A system for monitoring a laser machining process on a workpiece, said system comprising: a neuromorphic image sensor configured to generate image data from a surface of the workpiece; and a computing unit configured to determine input data based on the image data, and to determine output data based on the input data by means of a transfer function, said output data containing information about the laser machining process.
2. The system according to claim 1, wherein said neuromorphic image sensor is configured to generate image data from a machining area, an area in advance of the machining area and/or an area in the wake of the machining area.
3. The system according to claim 1, wherein said neuromorphic image sensor is configured to transmit image data to said computing unit continuously and/or asynchronously.
4. The system according to claim 1, wherein said neuromorphic image sensor comprises a plurality of pixels configured to generate image data independently of one another in response to changes in brightness sensed by the respective pixel.
5. The system according to claim 4, wherein the image data of a pixel comprise at least a pixel address corresponding to the pixel and a time stamp corresponding to the sensed change in brightness.
6. The system according to claim 1, wherein said computing unit is configured to generate the input data by means of a further transfer function based on the image data, and/or wherein the image data transmitted from said neuromorphic image sensor are the input data.
7. The system according to claim 1, wherein the transfer function between the input data and the output data and/or the further transfer function between the image data and the input data is formed by a trained neural network.
8. The system according to claim 7, wherein the trained neural network comprises a convolutional neural network, CNN, a binary neural network, BNN, and/or a recurrent neural network, RNN.
9. The system according to claim 1, wherein the information about the laser machining process includes information about a state of the laser machining process, about a machining result, about a machining error and/or about a machining area of said workpiece.
10. The system according to claim 1, wherein the computing unit is configured to output the output data as control data for a laser machining system carrying out the laser machining process.
11. A laser machining system for machining a workpiece using a laser beam, said laser machining system comprising: a laser machining head for radiating a laser beam onto said workpiece; and the system according to claim 1.
12. The laser machining system according to claim 11, wherein said computing unit is arranged on or in said laser machining head, and/or wherein said neuromorphic image sensor is arranged on an outside of said laser machining head and/or on said laser machining head.
13. The laser machining system according to claim 11, further comprising: a laser source configured to generate the laser beam; and a control unit configured to control, based on the output data determined by said computing unit, said laser machining system and/or said laser machining head and/or said laser source and/or to control the laser machining process.
14. A method for monitoring a laser machining process on a workpiece, said method comprising the steps of: generating image data from a surface of said workpiece using a neuromorphic image sensor; determining input data based on the image data; and determining output data based on the input data by means of a transfer function, said output data containing information about the laser machining process.
15. The method according to claim 14, further comprising the step of: controlling, in real time, at least one parameter of the laser machining process based on the determined output data.
Description
DETAILED DESCRIPTION OF THE DRAWINGS
[0037] Embodiments of the invention are described in detail below with reference to figures, wherein:
DETAILED DESCRIPTION OF THE INVENTION
[0041] Unless otherwise noted, the same reference symbols are used below for elements that are the same or have the same effect.
[0043] A laser machining system 1 is configured to machine a workpiece 2 using a laser beam 3. The laser machining system 1 includes a laser machining head 14, such as a laser cutting or laser welding head, and a laser device 15, also called a “laser source”, for providing the laser beam 3. The laser machining head 14 is configured to radiate the laser beam 3 onto the workpiece 2. The laser machining head 14 may comprise collimating optics for collimating the laser beam and/or focusing optics for focusing the laser beam 3. The area of the workpiece surface on which the laser beam 3 is incident on the workpiece 2 may also be referred to as the “machining area” or “process zone” and may in particular include a puncture hole, a vapor capillary and/or a melt pool.
[0044] The laser machining system 1 or parts thereof, in particular the laser machining head 14, and the workpiece 2 may be movable relative to one another in a machining or feed direction 4. For example, the laser machining system 1 or parts thereof, in particular the laser machining head 14, may be moved in the feed direction 4. Alternatively, the workpiece 2 may be moved in the feed direction 4 relative to the laser machining system 1 or to a part thereof, in particular relative to the laser machining head 14. The feed direction 4 may be a cutting or welding direction. In general, the feed direction 4 is horizontal. The speed at which the laser machining system 1 and the workpiece 2 move relative to each other along the feed direction 4 may be referred to as the “feed speed”.
[0045] The laser machining system 1 is configured to perform a laser machining process such as laser cutting and laser welding. The laser machining system 1 includes a control unit 10 configured to control the machining head 14 and/or the laser device 15. The control unit 10 may be configured to control the laser machining process. The control includes changing, adjusting or setting at least one parameter of the laser machining process. The at least one parameter may include, for example, the laser power of the laser device 15, the feed rate of the laser machining head 14, and the focal position of the laser beam 3.
[0046] The laser machining system 1 further includes a system for monitoring a laser machining process. The system for monitoring a laser machining process includes a neuromorphic image sensor 13 and a computing unit 11.
[0047] The neuromorphic image sensor 13 is configured to generate image data of the laser machining process or of a surface of the workpiece 2. The computing unit 11 is configured to determine input data based on the image data and to determine output data based on the input data using a transfer function, said output data containing information about the laser machining process. The computing unit 11 may be configured to form the output data in real time. The computing unit 11 or the control unit 10 may be configured to execute the method described below for monitoring a laser machining process. In other words, the method may be executed by the computing unit 11 or the control unit 10.
[0048] The neuromorphic image sensor 13 is based on the principle of outputting or recording only the change in exposure level of each individual pixel. Neuromorphic image sensors, also known as event-based image sensors, sense changes in brightness, so-called “events”. Data transfer takes place asynchronously: only the information from pixels that have detected a change in brightness is transmitted, continuously as it occurs. In comparison to frame-based cameras, in which the brightness values of all pixels (including those that have not changed compared to the previous image) are transmitted with each frame, neuromorphic image sensors only transmit data when the brightness of a pixel changes significantly. The temporal quantization of the individual pixels results in fewer redundancies than in frame-based image sensors or cameras, while at the same time the loss of information is lower.
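The event-generation principle described above can be illustrated with a minimal simulation. All names, the threshold value and the event format are assumptions made for illustration only; they are not part of the disclosed system.

```python
# Minimal illustration of the event-based principle: only pixels whose
# brightness changes by more than a contrast threshold produce output.
# Names, threshold and event layout are illustrative assumptions.

THRESHOLD = 10  # minimum brightness change to trigger an event

def events_from_frames(prev_frame, curr_frame, timestamp):
    """Compare two brightness frames and emit (x, y, t, polarity) events
    only for pixels whose change exceeds THRESHOLD."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(c - p) >= THRESHOLD:
                polarity = 1 if c > p else -1
                events.append((x, y, timestamp, polarity))
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 130], [100, 100]]  # exactly one pixel brightened
print(events_from_frames(prev, curr, timestamp=42))
# → [(1, 0, 42, 1)] : only the changed pixel is reported,
#   instead of all four pixel values as with a frame-based camera
```

The single emitted tuple, versus four transmitted brightness values for a frame camera, illustrates the data reduction discussed in the paragraph above.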
[0049] Neuromorphic image sensors have a number of advantages. These include a high dynamic range, e.g. from approx. 100 to 130 dB, so that additional illumination is not required in most cases. In addition, neuromorphic image sensors have a high temporal resolution and are largely unaffected by overexposure, underexposure or fast movement. The recording speed of neuromorphic image sensors is comparable to that of a high-speed camera, which may have several thousand fps, although with neuromorphic image sensors there are no frames but a continuous data stream. The neuromorphic image sensor 13 may have, for example, a dynamic range of approximately 120 dB, a temporal resolution in the microsecond range, an equivalent frame rate of 1,000,000 fps, and/or a spatial resolution of 0.1-0.2 MP.
[0050] Due to the greatly reduced amount of data, the computing unit 11 requires significantly less computing power and may therefore move closer to the location of the image data generation, i.e. the neuromorphic image sensor 13.
[0051] According to the embodiment shown in
[0052] In contrast, according to the embodiment shown in
[0053] According to embodiments, the computing unit 11 may be combined with or integrated into the control unit 10. In other words, the functionality of the computing unit 11 may be combined with that of the control unit 10.
[0054] The neuromorphic image sensor 13 is configured to generate image data from the workpiece surface and is in particular configured to generate image data from the machining area of the workpiece surface. According to embodiments, the neuromorphic image sensor 13 may be configured in particular to generate image data from an area in advance of the process zone in the feed direction 4 and/or an area in the wake of the process zone in the feed direction 4.
[0055] The image data of a pixel include, for example, the pixel address or the pixel identity and a time stamp. In addition, the image data may also include the polarity (increase or decrease) of the brightness change or the currently sensed brightness level.
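The per-pixel image data described above can be modeled, purely for illustration, as a small record type; the field names and units are assumptions, not part of the disclosure.

```python
# Illustrative record for a single event as described above: pixel
# address, time stamp, and optionally the polarity of the brightness
# change. Field names and units are assumptions for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    x: int             # pixel address, column
    y: int             # pixel address, row
    timestamp_us: int  # time stamp of the sensed brightness change
    polarity: int      # +1 brightness increase, -1 brightness decrease

ev = Event(x=320, y=240, timestamp_us=105_003, polarity=+1)
print(ev)
```

An asynchronous event stream is then simply a time-ordered sequence of such records, rather than a sequence of full frames.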
[0056] The information about the laser machining process, which is contained in the output data determined by the computing unit 11, may include information about a state of the laser machining process, information about a machining result, a machining error and/or a machining area of the workpiece 2. In particular, the machining result may be a current machining result.
[0057] Due to the high recording speed, processing the image data of the neuromorphic image sensor 13 with conventional image processing algorithms entails a loss of performance. Therefore, embodiments of the present invention preferably use machine learning methods for image data processing or for image data evaluation. For example, the transfer function between the input data and the output data may be formed by a trained neural network. The transfer function may be used for image processing or image evaluation of the input data. Advantageously, so-called “CNNs” may be used for image processing and evaluation, “BNNs” for reducing the amount of image data, and “RNNs” for the temporal analysis of the events. In this way, in particular, a loss of performance compared to conventional methods of image processing or evaluation can be avoided. For example, the image data is not converted into frames, but transferred to a suitable vector space, for example by spatio-temporal filtering in the spike event domain.
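One possible way to transfer an event stream to a suitable vector space, as mentioned above, is spatio-temporal binning of events into a voxel grid that a CNN can consume. The following sketch assumes the illustrative (x, y, t, polarity) event layout and grid dimensions; it is not the specific filtering of the disclosure.

```python
# Sketch: accumulate asynchronous events into a spatio-temporal voxel
# grid (t_bins x height x width) of signed event counts, instead of
# converting them into frames. Event format and grid layout are
# assumptions for illustration only.

def events_to_voxel_grid(events, width, height, t_bins, t_start_us, t_end_us):
    """Return a t_bins x height x width grid of signed event counts."""
    grid = [[[0.0] * width for _ in range(height)] for _ in range(t_bins)]
    span = max(t_end_us - t_start_us, 1)
    for x, y, t_us, polarity in events:
        # map the time stamp to one of t_bins temporal slices
        b = min(int((t_us - t_start_us) * t_bins / span), t_bins - 1)
        grid[b][y][x] += polarity
    return grid

events = [(0, 0, 0, +1), (1, 0, 60, -1), (1, 1, 99, +1)]
grid = events_to_voxel_grid(events, width=2, height=2, t_bins=2,
                            t_start_us=0, t_end_us=100)
print(grid[0][0][0], grid[1][0][1], grid[1][1][1])  # → 1.0 -1.0 1.0
```

The resulting dense tensor can be fed to a CNN while preserving coarse temporal structure of the events.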
[0058] With the aid of the neuromorphic image sensors, smaller models compared to frame-based cameras can be used in machine learning methods while achieving comparable performance. Due to the elimination of redundant information in neuromorphic image sensors, the machine learning model has to take fewer features into account, which in the case of a neural network is equivalent to a reduction in the number of neurons contained in the network. This makes it much easier to train the machine learning models since smaller models usually require far fewer examples to train the model. The omission of redundant information also allows for faster execution of the transfer function or the algorithm (“inference”) for image processing or image analysis. In this way, in particular real-time control of the laser machining process becomes possible.
[0059] According to embodiments, the computing unit 11 may be configured to generate control data based on the output data and to transmit them to the control unit 10. Alternatively, the output data are transmitted to the control unit 10 and the control unit 10 may be configured to generate control data. The control unit 10 may further be configured to control and/or regulate the laser machining system or the laser machining process, preferably in real time, based on the output data determined by the computing unit 11. For example, the control unit 10 may be configured to control the laser machining head 14 and/or the laser source 15 based on the output data.
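The control path described above can be sketched as a simple rule that maps output data to control data for the laser source. The quality score, the thresholds and the control rule below are purely illustrative assumptions, not the actual regulation of the disclosure.

```python
# Illustrative closed-loop rule: map output data (here, a hypothetical
# process-quality score in [0, 1]) to control data for the laser source.
# Thresholds, gains and names are assumptions for illustration only.

def control_data_from_output(quality_score, current_power_w,
                             min_power_w=500.0, max_power_w=4000.0):
    """Raise laser power when process quality drops, lower it slightly
    when quality is high, clamped to the admissible power range."""
    if quality_score < 0.5:        # reduced process quality: counteract
        target = current_power_w * 1.05
    elif quality_score > 0.9:      # stable process: back off slightly
        target = current_power_w * 0.98
    else:                          # acceptable quality: hold power
        target = current_power_w
    return max(min_power_w, min(max_power_w, target))

print(control_data_from_output(0.4, 2000.0))   # → 2100.0 (power raised)
print(control_data_from_output(0.95, 2000.0))  # → 1960.0 (power lowered)
```

In practice such a rule would run per control cycle, with feed rate and focal position adjusted analogously.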
[0060] The computing unit 11 may further be configured to transmit the output data determined to a quality assurance unit 12 of the laser machining system. The quality assurance unit 12 may be configured to determine optimum parameters for at least one step of the laser machining process based on the output data and to transmit them to the control unit 10.
[0062] The method 100 comprises the steps of: generating image data of the laser machining process using a neuromorphic image sensor (S101), determining input data based on the image data (S102), and determining output data based on the input data using a transfer function, the output data containing information about the laser machining process (S103).
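The three steps S101 to S103 can be summarized, purely schematically, as a processing pipeline. The sensor read-out, the pre-processing and the transfer function below are placeholders standing in for the neuromorphic sensor and the trained neural network of the disclosure.

```python
# Schematic pipeline for method steps S101-S103. All three functions are
# placeholder stand-ins for illustration; they are not the disclosed
# sensor read-out or the trained transfer function.

def generate_image_data():                     # S101 (placeholder sensor)
    # toy event stream in the illustrative (x, y, t_us, polarity) layout
    return [(0, 0, 10, +1), (1, 0, 20, +1), (0, 1, 30, -1)]

def determine_input_data(image_data):          # S102 (toy pre-processing)
    # e.g. keep only events with positive polarity
    return [ev for ev in image_data if ev[3] > 0]

def determine_output_data(input_data):         # S103 (stand-in transfer fn)
    # toy "transfer function": event rate as a proxy for process activity
    return {"event_count": len(input_data)}

output = determine_output_data(determine_input_data(generate_image_data()))
print(output)  # → {'event_count': 2}
```

In the disclosed system, S103 would instead apply the trained neural network to produce the information about the laser machining process.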
[0063] The method may also include controlling, in particular in real time, at least one parameter of the laser machining process based on the determined output data. The parameter may include the laser power of the laser source, a feed rate and a focal position.
[0064] The present invention may advantageously be used to control a laser machining process. The output data are preferably transmitted from the computing unit 11 directly to the control unit 10, which may also be referred to as “machine control”. The control unit 10 may be configured to control at least one parameter of the laser machining process or the laser machining system, in particular in real time, based on the output data. The parameter may include the laser power of the laser source, a feed rate and a focal position. This allows the parameters to be adjusted to the current process status in real time, which means that better machining results can be achieved. These include, for example, better surface quality, an increased feed rate, and a shorter piercing time in laser cutting.
[0065] When laser cutting, for example, the piercing process can be analyzed and controlled in real time thanks to the extremely high equivalent frame rate and the resulting high temporal resolution of the camera. In addition, the high dynamic range of the sensor in combination with the high temporal resolution can be used to monitor the cutting front during a laser cutting process, and the process quality can be determined in real time. As a result, the cutting process can be controlled, for example by counteracting reduced process quality through changing or adapting the parameters of the laser machining process, in particular the laser power, feed rate and focal position. The present invention also makes it possible to monitor spatter with an extremely high temporal resolution during laser cutting or laser welding in order to draw conclusions about the process quality. In laser welding, the present invention allows for direct monitoring of the weld pool and control of laser welding parameters.