METHOD FOR PRODUCING A THREE-DIMENSIONAL OBJECT AND CORRESPONDING DEVICE

20230226750 · 2023-07-20

    Abstract

    A method for producing a three-dimensional object by an additive manufacturing process includes introducing at least one manufacturing material fed in a flowable state from at least one feed-in opening of at least one feed-in needle into a support material. After being fed in, the at least one manufacturing material is cured, yet remains flexible or elastic after curing. The contour and/or position of at least one part of the three-dimensional object within the support material is detected by at least one sensor.

    Claims

    1. A method for producing a three-dimensional object by an additive manufacturing process, comprising: introducing at least one manufacturing material in a flowable state from at least one feed-in opening of at least one feed-in needle into a support material; curing the at least one manufacturing material to produce all or a portion of the three-dimensional object, wherein the at least one manufacturing material is elastic or flexible after curing; and detecting at least one of a contour and a position of at least one part of the three-dimensional object within the support material with at least one sensor.

    2. The method according to claim 1, wherein the detecting step detects the contour of an entirety of the three-dimensional object.

    3. The method according to claim 1, further comprising providing at least one object in the support material whose contour and/or position within the support material is known or is detected by the at least one sensor.

    4. The method according to claim 1 wherein the contour of the three-dimensional object detected by the at least one sensor comprises one or more of an outer surface, an inner surface, and a wall thickness.

    5. The method according to claim 1 wherein the at least one sensor comprises at least one optical sensor.

    6. The method according to claim 5 wherein the at least one sensor is a camera.

    7. The method according to claim 1 wherein the at least one sensor comprises at least one ultrasonic sensor.

    8. The method according to claim 1 further comprising moving the at least one sensor relative to the three-dimensional object in order to detect the contour.

    9. The method according to claim 1 wherein the at least one sensor is positioned in the support material.

    10. The method according to claim 9, wherein the at least one sensor is positioned in the support material so as to detect an inner surface of the three-dimensional object.

    11. The method according to claim 9 wherein the at least one sensor is positioned in the support material so as to permit printing to occur at least partially around the at least one sensor.

    12. The method according to claim 1 wherein the contour is detected after or during the curing of the at least one manufacturing material.

    13. The method according to claim 1 further comprising detecting the at least one manufacturing material introduced through the at least one feed-in needle using at least one additional sensor arranged on the at least one feed-in needle.

    14. The method according to claim 1 wherein the step of detecting at least one of the contour and the position is performed while the at least one manufacturing material is introduced in the flowable state, and further comprising the step of influencing further introduction of the at least one manufacturing material based on the detected contour and/or position.

    15. A device for conducting a method according to claim 1, comprising: a container filled or fillable with support material; at least one feed-in needle by which a manufacturing material can be introduced into the container; and at least one sensor for detecting a contour of at least one part of a three-dimensional object located in or created by an additive process in the container.

    Description

    DESCRIPTION OF THE DRAWINGS

    [0024] In the following, a number of embodiment examples of the invention will be explained in more detail with the aid of the accompanying drawings. The same reference numerals are used throughout the drawings to denote the same elements.

    [0025] FIG. 1 is a schematic representation of one embodiment of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0026] FIG. 2 is a schematic representation of another embodiment of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0027] FIG. 3 is a schematic representation of a variation on the embodiment shown in FIG. 1 of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0028] FIG. 4 is a schematic representation of a variation on the embodiment shown in FIG. 2 of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0029] FIG. 5 is a schematic representation of yet another embodiment of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0030] FIG. 6 is a schematic representation of a different variation on the embodiment shown in FIG. 1 of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0031] FIG. 7 is a schematic representation of a different variation on the embodiment shown in FIG. 2 of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0032] FIG. 8 is a schematic representation of still another embodiment of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    [0033] FIG. 9 is a schematic representation of a further embodiment of a device configuration for conducting a method for detecting a contour or position of an object in a container.

    DETAILED DESCRIPTION

    [0034] FIG. 1 depicts a container 2 in which support material 4 is located. A three-dimensional object 6, which is to be measured and has been produced in the container 2, is depicted in the support material 4. To this end, an emitter 8 is provided, which emits measurement radiation 10. The measurement radiation 10 is refracted at the wall of the container 2 and strikes the object 6 in the support material 4. There, it is reflected and exits the container as reflected radiation 12, which is detected by a detector 14. The emitter 8 and the detector 14 are arranged outside of the container 2. In the embodiment example shown, they are not movable. Emitter 8 and detector 14 together form the sensor that detects the contour and/or position of the object 6 within the container 2. The measurement radiation 10 can be strip lighting, for example.
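    The refraction at the container wall mentioned above can be compensated with Snell's law when reconstructing the contour from outside the container. A minimal sketch, assuming illustrative refractive indices (air at 1.0, an assumed index of 1.4 for the container wall and support material; these values are not taken from the specification):

```python
import math

def refracted_angle(theta_in_deg, n1=1.0, n2=1.4):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).

    n1: refractive index of the medium the ray comes from (air here);
    n2: assumed index of the support material (illustrative value).
    Returns the angle of the refracted ray, measured from the
    surface normal, in degrees.
    """
    s = (n1 / n2) * math.sin(math.radians(theta_in_deg))
    return math.degrees(math.asin(s))

# A ray of measurement radiation hitting the container wall at 30
# degrees to the normal bends toward the normal inside the denser medium.
angle_inside = refracted_angle(30.0)
```

Applying this correction to both the incoming measurement radiation 10 and the outgoing reflected radiation 12 removes the apparent displacement of the object 6 that the wall would otherwise introduce.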

    [0035] FIG. 2 depicts a similar embodiment. The emitter 8 and the detector 14 are positioned within the container 2 and within the support material 4. This has the advantage that neither the measurement radiation 10 nor the reflected radiation 12 is refracted at the wall of the container 2.

    [0036] FIG. 3 shows an embodiment similar to FIG. 1. Here, too, the emitter 8 and the detector 14 are arranged outside of the container 2. Both are arranged together on an actuator 16, which in the embodiment example shown is designed as a robotic arm. As a result, emitter 8 and detector 14 can be moved collectively and it is possible to scan the object 6 and detect it from different sides and different perspectives. For the sake of simplicity, the measurement radiation 10 and the reflected radiation 12 are not shown as refracted at the wall of the container. Nevertheless, the radiation is refracted every time it passes through the interfaces.

    [0037] FIG. 4 shows the actuator 16 with emitter 8 and detector 14 within the container 2. As in the other figures, the object 6 is shown as a liner. In contrast to FIGS. 1 to 3, the liner in FIG. 4 has been produced with an open end 18 at the bottom, which is the proximal end. This makes it easier to remove the three-dimensional object 6 from the container 2.

    [0038] In FIG. 5, the emitter 8 and the detector 14 are arranged within the object 6. Instead of detecting the outer side, as in the other figures, the embodiment in FIG. 5 detects the inner side of the object 6.

    [0039] FIG. 6 corresponds to the representation from FIG. 1 with a different emitter 8 and a different detector 14. In this case, the emitter 8 is a laser, wherein the determination of the contour and/or position of the object in the embodiment example shown occurs via triangulation. Both are arranged outside of the container 2.
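    The triangulation mentioned for the laser-based embodiment can be illustrated with plain law-of-sines geometry. A minimal sketch; the baseline length and angles are illustrative assumptions, not values from the patent:

```python
import math

def triangulate_distance(baseline_m, alpha_deg, beta_deg):
    """Distance of the laser spot from the emitter-detector baseline.

    baseline_m: separation between laser emitter and detector;
    alpha_deg / beta_deg: angles of the outgoing beam and of the
    observed reflection, each measured against the baseline.
    The spot, emitter, and detector form a triangle, so the
    perpendicular distance follows from the law of sines.
    """
    a = math.radians(alpha_deg)
    b = math.radians(beta_deg)
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

# Example: a 10 cm baseline with beam and viewing angles of 60 and 70
# degrees places the reflecting point of the object a bit over 10 cm away.
d = triangulate_distance(0.10, 60.0, 70.0)
```

Sweeping the laser over the object 6 and repeating this computation for each spot yields the contour point by point.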

    [0040] FIG. 7 depicts the emitter 8 and the detector 14 within the container 2.

    [0041] In FIG. 8, emitter 8 and detector 14 are arranged on a common actuator 16 by which they can be moved collectively.

    [0042] FIG. 9 shows a different configuration. The emitter 8, which is designed as an ultrasonic transmitter, is inside the container 2. The measurement radiation 10 is emitted in the form of ultrasonic waves. It is represented by solid lines. The measurement radiation 10 strikes the object 6 and is reflected by the outer surface 20 and the inner surface 22. This results in two different reflected radiations 12, which are shown as dashed and dotted lines. The emitter 8 is simultaneously the detector 14 and sends the measurement data to an electrical control unit 24. The transmission peak 26 and the two reflection peaks 28 can be seen in the measurement data.
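    The two reflection peaks 28 allow the wall thickness of the object 6 to be estimated from the delay between the echoes of the outer surface 20 and the inner surface 22. A minimal time-of-flight sketch; the speed of sound in the cured manufacturing material is an assumed illustrative value:

```python
def wall_thickness(t_outer_s, t_inner_s, c_material_m_s=1500.0):
    """Wall thickness from the delay between the two reflection peaks.

    t_outer_s / t_inner_s: arrival times of the echoes from the outer
    and inner surfaces; c_material_m_s: assumed speed of sound in the
    cured manufacturing material (illustrative value, not from the
    patent). The factor 2 accounts for the round trip through the wall.
    """
    return c_material_m_s * (t_inner_s - t_outer_s) / 2.0

# Echoes arriving 4 microseconds apart at 1500 m/s imply a 3 mm wall.
thickness_m = wall_thickness(40e-6, 44e-6)
```

This matches claim 4, where the detected contour may comprise the wall thickness in addition to the outer and inner surfaces.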

    Reference List

    [0043] 2 container
    [0044] 4 support material
    [0045] 6 object
    [0046] 8 emitter
    [0047] 10 measurement radiation
    [0048] 12 reflected radiation
    [0049] 14 detector
    [0050] 16 actuator
    [0051] 18 open end
    [0052] 20 outer surface
    [0053] 22 inner surface
    [0054] 24 electrical control unit
    [0055] 26 transmission peak
    [0056] 28 reflection peak