SPATIAL MAPPING OF SENSOR DATA COLLECTED DURING ADDITIVE MANUFACTURING
20200276669 · 2020-09-03
Assignee
Inventors
- John DARDIS (Stroud, GB)
- Kiera Megan JONES (Stroud, GB)
- Ceri BROWN (Plaisance-du-Touch, FR)
- Nicholas Henry Hannaford JONES (Stroud, GB)
Cpc classification
PERFORMING OPERATIONS; TRANSPORTING
- B33Y10/00
- B23K15/0013
- B23K26/082
- B23K15/0086
- B33Y50/00
- B22F10/28
- B22F10/366
- B22F12/44
- B23K15/002
- B22F12/90
- B22F12/41
- B33Y50/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02P10/25
International classification
B33Y10/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method of generating a spatial map of sensor data collected during additive manufacturing, in which a plurality of layers of powder are selectively melted with an energy beam to form an object. The method includes receiving sensor data collected during additive manufacturing of an object, the sensor data including sensor values, the sensor values captured for different coordinate locations of the energy beam during the additive manufacturing of the object, and generating cell values for a corresponding cell-based spatial mapping of the sensor data. Each of the cell values is determined from a respective plurality of the sensor values extending over an area/volume comparable to an extent of the melt pool or the energy beam spot.
Claims
1.-36. (canceled)
37. A method of generating a spatial map of sensor data collected during additive manufacturing, in which a plurality of layers of powder are selectively melted with an energy beam to form an object, the method comprising receiving sensor data collected during additive manufacturing of an object, the sensor data comprising sensor values, the sensor values captured for different coordinate locations of the energy beam during the additive manufacturing of the object, and generating cell values for a corresponding cell-based spatial mapping of the sensor data, wherein each of the cell values is determined from a respective plurality of the sensor values attributed to coordinate locations that are within a defined distance of a cell location, the defined distance extending beyond a spatial extent of the cell.
38. A method according to claim 37, wherein the area/volume has a dimension of between 65 microns and 500 microns.
39. A method according to claim 37, wherein each cell represents a spatial extent comparable to the size of the melt pool or the energy beam spot.
40. A method according to claim 37, wherein each cell has a spatial extent significantly smaller than the size of the melt pool and/or the energy beam spot but whose value is determined from a respective plurality of the sensor values extending over the area/volume comparable to the melt pool or the energy beam spot.
41. A method according to claim 37, wherein each cell value is determined from sensor values from different ones of the plurality of layers.
42. A method according to claim 37, wherein determining each cell value comprises a weighted summation of the respective plurality of sensor values based upon a distance of each sensor value from a cell location.
43. A method according to claim 37, wherein determining cell values comprises a blurring to smooth the sensor values out among the cells of the cell-based spatial mapping.
44. A method according to claim 37, wherein the plurality of sensor values used for the generation of a cell value or the algorithm used to generate the cell value is based upon directionality of a scan and/or a time of capture of the sensor values during the additive manufacturing process.
45. A method according to claim 37, comprising receiving a user/operator input defining the area/volume and determining the respective plurality of sensor values to use in determining the cell value based upon the user/operator input.
46. A method according to claim 37, wherein the cell-based spatial mapping is a volumetric model of the sensor values and a voxel size of the volumetric model is greater than a thickness of a layer in the additive manufacturing process.
47. A method according to claim 37, wherein the sensor data comprises at least one selected from the group of: i) sensor values derived from radiation emitted from melted regions and/or plasma generated during the additive manufacturing process, for example, the sensor data may be values captured by photodiodes or cameras arranged to view the melted region through an optical train used to deliver a laser beam; ii) sensor values derived from laser spot positions derived from measured positions of elements, such as mirrors, of a scanner during the additive manufacturing process; and iii) sensor values derived from a beam monitoring sensor, for example, the sensor values may be derived from a photodiode monitoring the laser beam and may be a measurement of energy beam power or a measurement of energy beam modulation.
48. A method according to claim 37, comprising comparing one or more of the cell values to a target value or range and controlling the additive manufacturing process based upon the comparison.
49. A method according to claim 48, wherein the target value or range is determined from an average cell value for sensor data generated when producing one or more test parts having a required/target density.
50. A method according to claim 48, comprising carrying out closed loop control during the additive manufacturing based upon the comparison of the one or more cell values to the target cell value or range.
51. A method according to claim 48, wherein controlling the additive manufacturing process comprises adjusting at least one scanning parameter.
52. A method according to claim 48, comprising comparing an aggregation of cell values to the target value or range.
53. A method of controlling an additive manufacturing process, in which layers of powder are selectively melted with a laser beam to form an object, the method comprising receiving sensor data collected during additive manufacturing of an object, the sensor data comprising sensor values generated by a sensor for detecting radiation collected by an optical train used for steering the laser beam onto powder, generating a process value from a plurality of the sensor values, comparing the process value to a setpoint value and adjusting the additive manufacturing process based upon the comparison.
54. A method according to claim 53, wherein the plurality of sensor values extend over an area/volume comparable to an extent of the melt pool or the energy beam spot.
55. A method according to claim 53, wherein the process value is determined by summing the plurality of sensor values.
56. A method according to claim 53, wherein each sensor value is associated with a corresponding layer of a plurality of layers solidified during the additive manufacturing of the object and a coordinate value localising the sensor value to a point within the corresponding layer, wherein the process value is determined from a weighted sum of the plurality of the sensor values, a weighting determined from the coordinate values localising the plurality of the sensor values.
57. A method according to claim 53, comprising determining a process value for a process value location within a layer, the process value determined by summing the plurality of sensor values based upon a distance of each sensor value from the process value location.
58. A method according to claim 53, wherein adjusting the additive manufacturing process comprises adjusting parameters of an additive manufacturing apparatus during manufacture of the object.
59. A method according to claim 53, wherein adjusting the additive manufacturing process comprises adjusting parameters of the additive manufacturing process for manufacture of a further object using the additive manufacturing process.
60. A data carrier having instructions stored thereon, wherein the instructions, when executed by a processor cause the processor to carry out the method of claim 37.
61. A visualisation system comprising a display and a processor, the processor arranged to receive sensor data from an additive manufacturing apparatus, carry out the method of claim 37 to generate cell values for a corresponding cell-based spatial mapping of the sensor data and cause the display to display a representation of the sensor values based on the cell-based spatial mapping.
62. A method according to claim 37, wherein the defined distance is comparable to an extent of a dimension of a melt pool or an energy beam spot.
Description
DESCRIPTION OF THE DRAWINGS
[0056]
[0057]
[0058]
[0059]
[0060]
[0061]
[0062]
[0063]
[0064]
[0065]
[0066]
[0067]
[0068]
[0069]
[0070]
[0071]
DESCRIPTION OF EMBODIMENTS
[0072] Referring to
[0073] Layers of powder are formed across a working plane by lowering the platform 102 and spreading powder dispensed from dispensing apparatus 108 using wiper 109. For example, the dispensing apparatus 108 may be apparatus as described in WO2010/007396.
[0074] At least one laser module, in this embodiment laser module 105, generates a laser beam 118 for melting the powder 104. The laser beam 118 is directed as required by a corresponding scanner, in this embodiment optical module 106. The laser beam 118 enters the chamber 101 via a window 107. In this embodiment, the laser module 105 comprises a fibre laser, such as an Nd:YAG fibre laser. The laser beam enters the optical module from above and is directed over the surface (the working plane) of the powder bed 104 by movable, tiltable mirrors 150 (only one of which is shown for the optical module 106). One of the mirrors 150 is tiltable to steer the laser beam in an X-direction and the other mirror 150 is tiltable to steer the laser beam in a Y-direction perpendicular to the X-direction. Movement of each tiltable mirror 150 is driven by a galvanometer. A position of each galvanometer is measured by a transducer 157. In this embodiment, the transducer is in accordance with the transducer described in U.S. Pat. No. 5,844,673. The optical module 106 further comprises movable focussing optics 155 for adjusting the focal length of the corresponding laser beam.
[0075] A beam splitter 156 directs light of the laser wavelength from an input to the tiltable mirrors 150 and transmits light of other wavelengths that is emitted from the powder bed 104 to an in-process monitoring module 160. The in-process monitoring module 160 comprises at least one photodetector 161 for detecting an integrated intensity and/or spatial intensity of the transmitted light. The scanner further comprises a beam dump 163 for collecting a small proportion of the laser light that passes through the beam splitter 156. In the beam dump 163 is a beam monitor 164 which generates sensor signals based upon the laser light that is transmitted to the beam dump 163.
[0076] The signals from sensors 157, 161 and 164 are sent to and stored in controller 121. As described in WO2017/085469, each sensor value is associated with a time stamp indicating the time at which the sensor value was generated and with the measured positions of mirrors 150. The controller 121 packages the sensor data for a layer together with demand data for that layer, such as the demand positions for the mirrors at different times during solidification of the layer, demand laser modulation, demand laser power and layer thickness. The package may also comprise laser parameters such as laser spot size, hatch spacing, point spacing and exposure time. As the build progresses, the controller 121 packages this data on a per layer basis and transmits this to visualisation apparatus.
[0077]
[0078] In other machines, rather than the laser spot being scanned along a hatch line in a series of point exposures, a laser spot is scanned continuously along the hatch lines. In such an arrangement, it is typical for scan speed to be a defined scan parameter rather than point distance and exposure time.
[0079]
[0080] On receipt of a package of sensor data for a layer, the visualisation apparatus generates a two-dimensional pixel map of the sensor data based upon user settings and a representation of this two-dimensional spatial map can be displayed on a display of the visualisation apparatus, if requested by a user. The user settings comprise pixel size, and an algorithm for calculating the pixel values.
[0081] To determine the pixel values, first a position is attributed to each sensor value of the sensor data selected to be visualised. The position is based upon the measured mirror positions or the demand positions for the mirrors at the time the sensor value was generated. For the photodiode data of detector 161, the sensor values may be reduced to those that were generated when the laser was firing, for example as determined from the demand data for laser modulation or the detector 164 in the beam dump 163. Pixel values are then determined from this reduced set of sensor values and the selected algorithm. In this embodiment, the selected algorithm may be any one of the maximum sensor value of the sensor values that fall within a spatial extent of the pixel, a mean of the sensor values that fall within a spatial extent of the pixel, an extreme value of the sensor values that fall within a spatial extent of the pixel or a sum of the sensor values that fall within a spatial extent of the pixel; the algorithm may incorporate Gaussian blurring.
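The binning of positioned sensor values into pixels described in [0081] might be sketched as follows. This is an illustrative Python sketch only, not part of the disclosure; the function name, grid handling and parameters are assumptions:

```python
import numpy as np

def pixel_map(xs, ys, values, pixel_size, width, height, algorithm="mean"):
    """Bin scattered sensor values into a 2D pixel map.

    xs, ys: coordinate locations attributed to each sensor value (same
            units as pixel_size), e.g. derived from the mirror positions.
    values: the sensor values captured at those locations.
    algorithm: "max", "mean" or "sum" over the values falling in each pixel.
    """
    nx = int(np.ceil(width / pixel_size))
    ny = int(np.ceil(height / pixel_size))
    grid = np.zeros((ny, nx))
    counts = np.zeros((ny, nx))
    for x, y, v in zip(xs, ys, values):
        i, j = int(y // pixel_size), int(x // pixel_size)
        if algorithm == "max":
            grid[i, j] = max(grid[i, j], v)  # assumes non-negative values
        else:  # "mean" and "sum" both accumulate; mean divides afterwards
            grid[i, j] += v
        counts[i, j] += 1
    if algorithm == "mean":
        np.divide(grid, counts, out=grid, where=counts > 0)
    return grid
```

A reduced set of sensor values (for example, only those captured while the laser was firing) would be passed in as `xs`, `ys`, `values`.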
[0082]
[0083]
[0084] To determine the pixel values using the Gaussian blur, each sensor value is first assigned to a pixel of a fine pixel mesh having a pixel size less than that pixel size of the final two-dimensional mapping. In the representation shown in
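The fine-mesh Gaussian blur of [0084] could be sketched as below: sensor values are first assigned to a fine mesh, the mesh is blurred with a separable Gaussian kernel, and fine pixels are then summed into the coarser output cells. Illustrative only; the 3-sigma kernel radius and downsampling factor are assumptions:

```python
import numpy as np

def blurred_map(fine_grid, sigma_px, factor):
    """Gaussian-blur a fine pixel mesh, then downsample to the final map.

    fine_grid: sensor values assigned to a fine mesh (factor x finer than
               the final two-dimensional mapping).
    sigma_px:  blur radius in fine pixels, chosen comparable to the
               melt-pool or beam-spot extent.
    """
    # Separable 1D Gaussian kernel, truncated at 3 sigma and normalised
    r = int(3 * sigma_px)
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma_px) ** 2)
    k /= k.sum()
    # Convolve along each axis in turn
    blurred = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, fine_grid)
    blurred = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, blurred)
    # Sum blocks of fine pixels into coarse output cells
    ny, nx = blurred.shape
    cy, cx = ny - ny % factor, nx - nx % factor
    return blurred[:cy, :cx].reshape(cy // factor, factor, cx // factor, factor).sum(axis=(1, 3))
```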
[0085]
[0086] In the above described embodiment, a contribution of each sensor value to a pixel value is independent of direction of the sensor value from a centre of the pixel or a time at which the sensor value was generated. However, the scanning process has directionality (both in terms of scan direction and hatch formation direction, D) and therefore, time dependence as the scan of the laser beam progresses across each layer. The algorithm used to determine the pixel values may take these factors into account. For example, the blur may use a normal distribution skewed based upon a direction of the scan and/or hatch formation direction.
[0087] The visualisation apparatus 202a, 202b, 202c may also be arranged to generate a volumetric mapping of the sensor data. In this embodiment, the volumetric mapping is generated from a plurality of pixel maps, for example as described above, corresponding to layers of the additive manufacturing process. The sensor data may be sent to the visualisation apparatus 202a, 202b, 202c in packets or batches, each containing sensor data for a layer. On receipt of each packet of sensor data, a two-dimensional map is formed for the layer as described above.
[0088] After the receipt of sensor data for a predetermined number of layers having a combined thickness corresponding to a depth of a melt pool, voxel values are calculated from the pixel values of the two-dimensional maps generated for that sensor data. The voxel values for the volumetric mapping may be determined by summing together all pixels that fall within a volume represented by the voxel, or by determining a mean value for all pixels that fall within that volume.
[0089] Alternatively, a blurring algorithm may be applied in the determination of the voxel values. In this latter case, the voxels may have a size smaller than the melt pool or even the same as a thickness of a layer but the voxel values are determined from sensor or pixel values for a plurality of different layers. As a melt pool is formed across many layers of the powder bed, sensor values captured for melt pools generated for multiple different layers may contribute to a voxel value. However, the sensor values that can contribute to a voxel value may be confined to those for layers intersecting the voxel and, optionally, a predetermined number of layers above the voxel. The predetermined number of layers may be based upon an expected melt pool depth. Accordingly, blurring for the volumetric model may also have directionality such that sensor data for lower layers does not provide a contribution to the voxel value.
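The directional combination of per-layer maps into voxel values described in [0088] and [0089] might look like the following sketch. The function name and the simple mean aggregation are assumptions; the key point illustrated is that a voxel draws on its own layers plus a look-ahead based on melt-pool depth, while lower layers never contribute:

```python
import numpy as np

def voxel_values(layer_maps, layers_per_voxel, lookahead):
    """Combine per-layer pixel maps into voxel values.

    layer_maps: list of same-shaped 2D pixel maps, index 0 = lowest layer.
    layers_per_voxel: layers whose combined thickness matches a voxel depth.
    lookahead: extra layers above the voxel allowed to contribute (based on
               expected melt-pool depth), capturing re-melting by later
               layers; layers below the voxel never contribute.
    """
    stack = np.stack(layer_maps)              # (n_layers, ny, nx)
    n = stack.shape[0]
    voxels = []
    for base in range(0, n - layers_per_voxel + 1, layers_per_voxel):
        top = min(base + layers_per_voxel + lookahead, n)
        voxels.append(stack[base:top].mean(axis=0))
    return np.stack(voxels)                   # (n_voxels_z, ny, nx)
```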
[0090] This volumetric model may be determined and displayed in real-time on the visualisation apparatus 202a, 202b, 202c as sensor data is received during the additive manufacturing process.
[0091] In another embodiment, the pixel values for the two-dimensional mapping may be determined based upon sensor values generated for multiple different layers, such as sensor data for the predetermined number of layers above the layer being mapped.
[0092] In the embodiments in which the cell values of the mapping are dependent on sensor values collected for subsequent layers, a plurality of spatial maps may be generated for a particular layer or volume for different times during the build. Accordingly, the user could view a mapping of the sensor data for layer 1 after layer 1 has been formed but also after layers 2, 3, etc have been formed. These spatial maps may be displayed as an animation allowing the user to view how the pixel/voxel values change with time during the build.
[0093] Furthermore, a directionality or weighting applied to sensor values for the blurring for a voxel value may vary dependent on the time during the build represented by the mapping. For example, a sensor/pixel value of a layer may be given greatest weighting (such as 100% weighting) when determining a voxel/pixel value corresponding to a mapping representing a time just after formation of that layer but may be given a lesser weighting for mappings representing a time after the formation of later layers.
[0094] It is envisaged that the spatial mappings described above may also be used in a computer implemented process, such as closed loop control of additive manufacturing apparatus. As such, rather than the determination of the spatial mappings being carried out in a visualisation apparatus remote from the additive manufacturing apparatus, the controller of the additive manufacturing apparatus may be arranged to determine and analyse the spatial mappings. A setpoint value may be determined for the pixel values as determined in accordance with the method described with reference to
EXAMPLE
[0095] A series of test parts (each comprising a cube shape built on the top of an inverted pyramid) were built in a single build from Inconel 718 using a Renishaw RenAM 500 M additive manufacturing machine, wherein 13 different combinations of scanning parameters were used. Each set of scanning parameters was used three times at different locations on the bed. The layer thickness was 60 micrometres.
[0096]
[0097]
[0098]
[0099]
[0100]
[0101] Accordingly, it is believed that there is sufficient difference between the Summed Pixel Values for bad parts and good parts and small enough variation within a part such that these values can be used as a process variable for a control loop in which a setpoint value is defined for the Summed Pixel value.
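One step of such a control loop, with the Summed Pixel Value as the process variable, might be sketched as below. The proportional gain, the choice of laser power as the adjusted scanning parameter, and the limits are all illustrative assumptions, not values from the disclosure:

```python
def control_step(summed_pixel_value, setpoint, power, gain=0.01,
                 power_limits=(100.0, 500.0)):
    """One step of a hypothetical proportional control loop.

    Compares the Summed Pixel Value (process variable) to the setpoint
    and nudges laser power towards reducing the error, clamped to limits.
    """
    error = setpoint - summed_pixel_value
    new_power = power + gain * error
    return min(max(new_power, power_limits[0]), power_limits[1])
```

In practice the adjusted parameter could equally be another scanning parameter, such as scan speed or exposure time.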
[0102] It is believed that other sensor data could be used for feedback control of the process. For example, Summed Pixel Values for a photodiode detecting radiation having wavelengths of above 1000 nm.
[0103] Further factors may be taken into account when determining a setpoint value. For example, a direction of a scan and a position of a scan in the working plane.
[0104]
[0105] The variation in the mean value of the Summed Pixel Values with position in the working plane is illustrated by
[0106] It is believed that the Summed Pixel Value provides a better representation of the physical effects detected by the sensor and therefore, the Summed Pixel Values provide a better basis for in-process control.
[0107] The invention is not limited to generating a spatial mapping from sensor data collected during a single build but may generate a spatial mapping from sensor data generated during multiple nominally identical builds. Each cell value of the spatial mapping may be generated from a plurality of sensor values generated across the plurality of builds. For example, the sensor values used to create the spatial mapping may be values generated from builds that have been verified as good builds by appropriate post-testing of the manufactured objects. The cell values may be generated using the Summed Pixel Value technique as described above, but when applied to the sensor values generated during the good builds. Spatial mappings may be generated for multiple statistical measurements, such as a mean of the sensor values, standard deviation of the sensor values, sum of the sensor values and/or inter-quartile range for the multiple good builds.
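The per-cell statistical measures over multiple good builds described above might be computed as in this sketch (illustrative; the function name and returned measures are assumptions):

```python
import numpy as np

def reference_statistics(build_maps):
    """Per-cell statistics over spatial mappings from verified good builds.

    build_maps: list of same-shaped cell-value arrays, one per good build.
    Returns per-cell mean, standard deviation and inter-quartile range.
    """
    stack = np.stack(build_maps)                      # (n_builds, ny, nx)
    q75, q25 = np.percentile(stack, [75, 25], axis=0)
    return {
        "mean": stack.mean(axis=0),
        "std": stack.std(axis=0),
        "iqr": q75 - q25,
    }
```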
[0108] A spatial mapping for a subsequent nominally identical build can then be compared to the spatial mapping(s) generated for the multiple builds on a cell by cell basis to determine if the cell values for the subsequent build fall within the expected cell values for a good build. In this way, the comparison takes account of expected variations in the sensor values with, for example, position in the build, scan direction and changes in the scan parameters during the build. Such variations may be build specific being dependent on geometry of the object and the build design. This may provide a more sensitive method for determining whether an object has been formed within or outside a defined specification compared to comparing all cell values to a global allowable range for the cell values.
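The cell-by-cell comparison of a subsequent build against the expected cell values might be sketched as follows. The use of a mean-plus-or-minus-n-sigma band as the "expected" range is an assumption for illustration; other bands (for example based on the inter-quartile range) could equally be used:

```python
import numpy as np

def out_of_spec_cells(cell_values, ref_mean, ref_std, n_sigma=3.0):
    """Flag cells of a new build falling outside the expected range.

    cell_values: spatial mapping for the subsequent, nominally identical build.
    ref_mean, ref_std: per-cell statistics from the good-build mappings.
    Returns a boolean mask of cells outside ref_mean +/- n_sigma * ref_std.
    """
    return np.abs(cell_values - ref_mean) > n_sigma * ref_std
```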
[0109] The results of the comparison may be used to verify the build/the object. If it is determined that one or more cell values are not within the expected values, the object may be sent for further testing, processing and/or may be discarded. In the case of there being multiple spatial mappings for good builds based on multiple statistical measures, the results of the comparison may be visualised, for example by colour coding a representation of the object, to identify which statistical measure has been failed at which location on the subsequent build. This visualisation may be used by a user to assess whether the build/object meets a specification. The comparison may be used as the basis for generation of an alert if cell values generated from the sensor values fall outside the expected cell values as determined through the spatial mapping.
[0110] Rather than generating the spatial mapping for multiple builds, a single spatial mapping may be generated from sensor values collected for multiple nominally identical objects whether they are all built in a single build or across multiple builds.
[0111] Furthermore, the multi-build spatial mapping may be determined from multiple nominally identical builds in the same machine. The expected variation in cell values for a spatial mapping created from sensor values from a single machine may be less than what would be expected across multiple machines, for example because the sensor response across different machines may not have been normalised. Hence, a variation in sensor values collected across multiple machines may not represent a variation in the build but may represent a variation in a response of the sensors. This problem may be overcome by spatially mapping z-scores (standard scores) for the cell values or other suitable statistical normalisation of the cell values between different machines.
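The z-score normalisation mentioned above could be sketched as below (illustrative; the guard against zero standard deviation is an assumption):

```python
import numpy as np

def z_score_map(cell_values, machine_mean, machine_std):
    """Normalise a spatial mapping to standard scores for one machine.

    machine_mean/machine_std: per-cell (or scalar) statistics for that
    machine, so that mappings from machines with un-normalised sensor
    responses can be compared on a common scale.
    """
    safe_std = np.where(machine_std > 0, machine_std, 1.0)
    return (cell_values - machine_mean) / safe_std
```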
[0112] The spatial mapping generated from multiple builds and/or multiple objects may be generated from only a portion of the sensor values generated for the builds/objects. For example, through appropriate analysis of the build or object it may be possible to identify good portions of the build/object (i.e. portions of the build/object that meet a specification as determined, for example, by a CT scan or other non-destructive (NDT) or even destructive testing (DT) method) and bad portions of the build/object (i.e. portions of the build/object that do not meet the specification). The spatial mapping may be generated from the sensor values generated only for the good portions of the builds/objects. In this way, even a portion of the sensor data generated during a bad build can be aggregated in the spatial mapping to provide a benchmark for the cell values of later spatial mappings for nominally identical builds/builds of nominally identical objects. Types of testing that may be carried out to determine whether a build or object is a good or bad build/object are disclosed in WO2017/085468, which is incorporated herein by reference.
[0113] It will be understood that modifications and alterations may be made to the above described embodiments without departing from the invention as defined in the claims.