METHOD FOR POSITIONING A LIMP, FLAT WORKPIECE AND POSITIONING APPARATUS

20220244699 · 2022-08-04


    Abstract

    A method of positioning a limp flat workpiece is disclosed, wherein the flat workpiece is provided in a random state on a manipulation surface. Subsequently, a camera image showing the flat workpiece is generated, and a grippable edge of the flat workpiece is identified by extracting characteristic image features of the camera image. Thereafter, a first gripping point for a first gripper and a second gripping point for a second gripper are determined, the second gripping point being spaced apart from the first gripping point. A positioning device for positioning a limp flat workpiece is also disclosed.

    Claims

    1-19. (canceled)

    20: A method of positioning a limp flat workpiece using a positioning device which includes at least one control unit, a camera, at least one first gripper and at least one second gripper, the method comprising the steps of: a) providing the flat workpiece in a random state on a manipulation surface; b) generating, by the camera, a camera image showing the flat workpiece; c) identifying, by a first machine learning module of the control unit, a grippable edge of the flat workpiece by extracting characteristic image features of the camera image; d) determining, by a second machine learning module of the control unit, a first gripping point for the first gripper at the grippable edge; and e) determining, by the second machine learning module of the control unit, a second gripping point for the second gripper at the grippable edge, the second gripping point being spaced apart from the first gripping point.

    21: The method according to claim 20, wherein the flat workpiece is gripped in an automated manner by the first gripper and the second gripper, the first gripper gripping at the first gripping point and the second gripper gripping at the second gripping point.

    22: The method according to claim 20, wherein the flat workpiece is held by the second gripper and the first gripper is moved along the gripped edge until the first gripper grips a first corner of the flat workpiece, and/or the flat workpiece is held by the first gripper and the second gripper is moved along the gripped edge until the second gripper grips a second corner of the flat workpiece.

    23: The method according to claim 22, wherein for moving along with the first gripper, a gripping force of the first gripper is reduced in comparison to a gripping force of the second gripper, or the first gripper is opened by a specified amount.

    24: The method according to claim 22, wherein for moving along with the second gripper, a gripping force of the second gripper is reduced in comparison to a gripping force of the first gripper, or the second gripper is opened by a specified amount.

    25: The method according to claim 21, wherein the flat workpiece is deposited in a flat state by the first gripper and the second gripper.

    26: The method according to claim 20, wherein prior to generating the camera image, an area of contact of the flat workpiece with the manipulation surface is enlarged starting from the random state.

    27: The method according to claim 20, wherein prior to identifying the grippable edge, a type of the flat workpiece is identified by a third machine learning module of the control unit.

    28: The method according to claim 27, wherein the third machine learning module includes a trained artificial neural network configured to receive the camera image or partial images of the camera image for extracting characteristic image features.

    29: The method according to claim 20, wherein the first machine learning module includes a trained artificial neural network configured to receive the camera image or partial images of the camera image for extracting characteristic image features.

    30: The method according to claim 20, wherein for identifying the grippable edge, the camera image is subdivided into a plurality of partial images, and characteristic image features are extracted by the first machine learning module of the control unit for each partial image and, based on the characteristic image features, for each partial image a probability is established with which it shows an edge of the flat workpiece.

    31: The method according to claim 20, wherein the first gripping point and the second gripping point are determined in a section of the flat workpiece for the imaging of which in the camera image or in an associated partial image the relatively highest probability with which the camera image or the partial image shows an edge has been established by the first machine learning module.

    32: The method according to claim 20, wherein the second machine learning module includes a trained artificial neural network which, for determining the first gripping point and the second gripping point, receives the camera image or at least one partial image of the camera image and extracts characteristic image features of the camera image or of the at least one partial image, wherein, based on the characteristic image features, gripping coordinates and a gripping orientation are calculated for the first gripper and/or gripping coordinates and a gripping orientation are calculated for the second gripper.

    33: The method according to claim 20, wherein the flat workpiece is provided in a folded state on a manipulation surface in step a).

    34: A positioning device for positioning a limp flat workpiece, comprising a manipulation surface on which the flat workpiece can be provided, a camera that is configured to generate an image of the flat workpiece present on the manipulation surface, a first manipulator carrying a first gripper, and a second manipulator carrying a second gripper, wherein the first gripper and the second gripper are configured to each grip at least a portion of the flat workpiece provided on the manipulation surface.

    35: The positioning device according to claim 34, wherein a control unit is provided which comprises a first machine learning module having a trained artificial neural network which is configured to identify a grippable edge of the flat workpiece, and/or a second machine learning module having a trained artificial neural network which is configured to determine a first gripping point for the first gripper and a second gripping point for the second gripper on the grippable edge, and/or a third machine learning module having a trained artificial neural network which is configured to identify a type of the flat workpiece.

    36: The positioning device according to claim 35, wherein the control unit and the camera are configured to carry out a method comprising the steps of: a) providing the flat workpiece in a random state on a manipulation surface; b) generating, by the camera, a camera image showing the flat workpiece; c) identifying, by a first machine learning module of the control unit, a grippable edge of the flat workpiece by extracting characteristic image features of the camera image; d) determining, by a second machine learning module of the control unit, a first gripping point for the first gripper at the grippable edge; and e) determining, by the second machine learning module of the control unit, a second gripping point for the second gripper at the grippable edge, the second gripping point being spaced apart from the first gripping point.

    37: The positioning device according to claim 34, wherein a drop distance for the flat workpiece and a ventilation unit are provided, the ventilation unit comprising a plurality of outflow openings and delimiting the drop distance at least in sections.

    38: The positioning device according to claim 37, wherein the plurality of outflow openings is provided at a ventilation surface, the ventilation surface extends substantially horizontally, and the ventilation surface coincides with the manipulation surface.

    39: The positioning device according to claim 37, wherein the plurality of outflow openings is provided at a ventilation surface that is inclined at an angle of less than 90 degrees in relation to a horizontal plane, and the manipulation surface adjoins a lower edge of the ventilation surface.

    Description

    [0060] The invention will be described below by reference to various exemplary embodiments that are shown in the accompanying drawings, in which:

    [0061] FIG. 1 shows a perspective illustration of a positioning device according to the invention, by means of which a method according to the invention can be carried out, with a further processing machine being additionally shown;

    [0062] FIG. 2 shows a side view of the positioning device from FIG. 1;

    [0063] FIG. 3 shows a perspective illustration of a detail of the positioning device from FIGS. 1 and 2;

    [0064] FIG. 4 shows a schematic illustration of an alternative embodiment of a ventilation surface of the positioning device according to the invention in a first operating state;

    [0065] FIG. 5 shows a schematic illustration of the ventilation surface from FIG. 4 in a second operating state;

    [0066] FIG. 6 and FIG. 7 show perspective detailed views of the positioning device according to the invention from FIGS. 1 and 2;

    [0067] FIGS. 8 to 13 show detailed views of the positioning device according to the invention at different points in time while carrying out gripping operations of the method according to the invention;

    [0068] FIG. 14 shows an alternative embodiment of a gripper of a positioning device according to the invention;

    [0069] FIG. 15 shows the gripper of FIG. 14 in a sectional view along the plane XV-XV; and

    [0070] FIG. 16 shows the gripper of FIG. 15 in a sectional view along the plane XVI-XVI.

    [0071] FIGS. 1 and 2 show a positioning device 6 for positioning a limp flat workpiece 8.

    [0072] In the exemplary embodiment shown, the positioning device 6 serves to place the flat workpiece 8 onto and in full surface contact with a feed surface 10 of a further processing machine 11, which is shown only in part.

    [0073] The limp, flat workpiece 8 here is a piece of laundry, more precisely a flat piece of laundry, which has already gone through a laundering process and a drying process. The further processing machine 11 is, for example, a folding machine.

    [0074] To this end, the positioning device 6 comprises a manipulation surface 12, on which the flat workpiece 8 can be provided, and a first manipulator 14a having a first gripper 15 and a second manipulator 14b having a second gripper 16.

    [0075] Here, both grippers 15, 16 are configured to each grip at least a portion of the flat workpiece 8 made available on the manipulation surface 12.

    [0076] In the illustrated embodiment, the manipulators 14a, 14b are positioned and secured on the manipulation surface 12.

    [0077] In the present case, the manipulators 14a, 14b used are industrial robots.

    [0078] The grippers 15, 16 are illustrated in detail in FIG. 3 and each include a pair of fingers 15a, 15b, 16a, 16b, the respective lower fingers 15a, 16a being provided with a so-called single ball and the respective upper fingers 15b, 16b being provided with a double ball. The gripping surfaces provided by the lower fingers 15a, 16a thus comprise a spherical surface section. The gripping surfaces provided by the upper fingers 15b, 16b each comprise two spherical surface sections arranged next to each other. In the closed state of the grippers 15, 16, the spherical surface section of the respective lower finger 15a, 16a engages in the space between the two spherical surface sections of the associated upper finger 15b, 16b.

    [0079] Of course, it is also possible here to equip the respective upper fingers with a single ball and the respective lower fingers with a double ball.

    [0080] Furthermore, a camera 18 is provided which is designed to generate a camera image B of the flat workpiece 8 that lies on the manipulation surface 12.

    [0081] In addition, the manipulation surface 12 may be illuminated by means of an illumination unit 20.

    [0082] Furthermore, a drop distance 22 for the flat workpiece 8 is provided within the positioning device 6, which is symbolized by a dashed arrow in FIGS. 1 and 2. In the vertical direction at the bottom, the drop distance 22 is bounded by a ventilation unit 24, which includes a plurality of outflow openings 26, all of which are arranged on a ventilation surface 28.

    [0083] In the embodiment illustrated in FIGS. 1 and 2, the ventilation surface 28 is inclined by 45 degrees to 60 degrees, in particular by about 50 degrees, in relation to a horizontal plane. The ventilation surface 28 continues at its lower edge 30 into the manipulation surface 12. In other words, the manipulation surface 12 adjoins the lower edge 30 of the ventilation surface 28.

    [0084] An alternative embodiment of the ventilation surface 28 can be seen in FIGS. 4 and 5. Here, the ventilation surface 28 extends substantially horizontally and coincides with the manipulation surface 12.

    [0085] The positioning device 6 also comprises a control unit 32, which, on the one hand, is connected to the camera 18 in terms of signaling and, on the other hand, is coupled to the manipulators 14a, 14b and the grippers 15, 16 in terms of signaling.

    [0086] The control unit 32 includes a total of three machine learning modules, with a first machine learning module 34 comprising an artificial neural network 36 which is configured to ascertain a grippable edge K of the flat workpiece 8 (see FIGS. 6 to 13).

    [0087] A second machine learning module 38 also includes an artificial neural network 40 which is configured to determine a first gripping point on the grippable edge K for the first gripper 15 and a second gripping point on the grippable edge K for the second gripper 16.

    [0088] A third machine learning module 42 also comprises an artificial neural network 44 that is configured to identify a type of the flat workpiece 8.

    [0089] Here, all artificial neural networks 36, 40, 44 have already been trained. The training is performed on the basis of images or partial images that have been annotated in accordance with the task of the respective artificial neural network 36, 40, 44.

    [0090] Such a positioning device 6 can be used to carry out a method for positioning the limp flat workpiece 8.

    [0091] In the present example, the aim of this method is to place the flat workpiece 8, starting from a random state, in a defined and planar manner onto the feed surface 10 of the further processing machine 11 (see FIGS. 1 and 7).

    [0092] To this end, the flat workpiece 8 is introduced into the positioning device 6 via the drop distance 22, while air flows from the outflow openings 26 and towards the flat workpiece 8. The flat workpiece 8 is thus dropped onto the ventilation surface 28 against the airflow thus generated, and then slides over the edge 30 and onto the manipulation surface 12 owing to the inclined position of the ventilation surface 28 (see FIG. 1).

    [0093] In the event that the ventilation surface 28 is designed according to the embodiment of FIGS. 4 and 5, the ventilation surface 28 coincides with the manipulation surface 12. The flat workpiece 8 is then dropped directly onto the manipulation surface 12 against the airflow.

    [0094] Thus, in both alternatives, the flat workpiece 8 is provided on the manipulation surface 12.

    [0095] Here, both the state in which the flat workpiece 8 is introduced into the drop distance 22 and the state in which the flat workpiece 8 comes to rest on the manipulation surface 12 are random.

    [0096] However, the movement of the flat workpiece 8 relative to the airflow causes its area of contact with the manipulation surface 12 to be increased, which facilitates the method steps described below.

    [0097] Subsequently, a camera image B showing the flat workpiece 8 is generated by means of the camera 18. This image is a two-dimensional color image with a resolution of 2456 pixels by 2054 pixels.

    [0098] This camera image B is received by the third machine learning module 42, more precisely by the associated artificial neural network 44. In order to keep the computational effort in the artificial neural network 44 low, the camera image B may be reduced beforehand to a resolution of 224 pixels by 224 pixels.

    [0099] The artificial neural network 44 serves to identify a type of flat workpiece 8, for which purpose characteristic image features of the camera image B are identified by means of the artificial neural network 44.

    [0100] In the embodiment illustrated, in this connection a certain number of predefined types are stored on the control unit 32. By means of the artificial neural network 44, probabilities are calculated which comprise, for each predefined type, a value for the probability with which the flat workpiece 8 in the camera image B belongs to this type.

    [0101] Since in the present case the flat workpiece 8 involved is a flat piece of laundry, the predefined types are, e.g., small towel, large towel, pillowcase, napkin and tablecloth.

    [0102] The third machine learning module 42 assigns to the flat workpiece 8 the type for which the highest probability value has been established.

    [0103] Such a type determination serves two purposes.

    [0104] For one thing, flat workpieces 8 that do not match any of the predefined types may be removed from the manipulation surface 12. Such flat workpieces 8 are thus not treated further. Alternatively, the process may be cancelled or aborted if the flat workpiece 8 does not match any of the predefined types. For this purpose, a minimum value for the probability values assigned to the predefined types may be fixed.
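The type assignment of paragraphs [0100] to [0104] can be sketched as follows. This is an illustrative sketch only, not code from the patent: the function name and the minimum probability value are assumptions, and the type names are taken from the example in paragraph [0101].

```python
# Sketch: assign the workpiece type with the highest probability value, or
# reject the workpiece (return None) when no value reaches a minimum.
PREDEFINED_TYPES = ["small towel", "large towel", "pillowcase", "napkin", "tablecloth"]
MIN_PROBABILITY = 0.5  # hypothetical minimum value for accepting a classification

def assign_type(probabilities):
    """probabilities: one value per predefined type, as output by the
    artificial neural network of the third machine learning module."""
    if len(probabilities) != len(PREDEFINED_TYPES):
        raise ValueError("one probability value per predefined type is expected")
    best_index = max(range(len(probabilities)), key=probabilities.__getitem__)
    if probabilities[best_index] < MIN_PROBABILITY:
        return None  # no matching type: remove the workpiece or abort
    return PREDEFINED_TYPES[best_index]
```

    A returned None corresponds to the case in which the flat workpiece is removed from the manipulation surface or the process is aborted.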

    [0105] For another thing, the type as detected of the flat workpiece 8 may be used to set the operation of the grippers 15, 16 specifically for this type. For example, gripping forces may be configured depending on the type.

    [0106] Further, the camera image B is partitioned by the control unit 32 into partial images which, in the example shown, each have a size of 224 pixels by 224 pixels and either overlap only slightly or are extracted from the camera image B without overlapping. A so-called sliding window approach is used for this purpose.

    [0107] These partial images are received by the first machine learning module 34, more precisely by the associated artificial neural network 36, which is used to ascertain the grippable edge K of the flat workpiece 8.

    [0108] By means of the artificial neural network 36, characteristic image features are extracted for each partial image for this purpose, and for each of the partial images a value of the probability is determined with which it shows an edge of the flat workpiece.

    [0109] In this connection, too, predefined categories are again used and, by means of the artificial neural network 36 and on the basis of the characteristic image features, for each partial image probability values are established with which it belongs to these categories. In the exemplary embodiment illustrated, the categories are “no edge”, “grippable edge”, “ungrippable edge”. The category “grippable edge” here also includes the case in which the partial image shows a corner.

    [0110] The first machine learning module 34 then selects that partial image which shows an edge with the highest probability value, i.e. which belongs, with the highest probability value, to the “grippable edge” category.
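The selection step of paragraphs [0109] and [0110] amounts to choosing, over all partial images, the one with the highest probability value for the "grippable edge" category. The following sketch uses assumed data structures; only the three category names come from the exemplary embodiment.

```python
# Sketch: pick the partial image that belongs to the "grippable edge"
# category with the highest probability value.
CATEGORIES = ("no edge", "grippable edge", "ungrippable edge")

def select_partial_image(tile_scores):
    """tile_scores maps a tile index to a dict of per-category probability
    values; returns the index with the highest 'grippable edge' value."""
    return max(tile_scores, key=lambda i: tile_scores[i]["grippable edge"])
```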

    [0111] This partial image is received by the second machine learning module 38, more precisely by the associated artificial neural network 40, which serves to determine gripping points for the first gripper 15 and the second gripper 16.

    [0112] This means that the first gripping point and the second gripping point are set in that section of the flat workpiece 8 whose imaging, in the associated partial image, has been determined by the first machine learning module 34 to show an edge with the relatively highest probability value.

    [0113] By means of the artificial neural network 40, characteristic image features of the partial image are again extracted. On this basis, gripping coordinates and a gripping orientation are calculated for each of the first gripper 15 and the second gripper 16.

    [0114] These are first ascertained in image coordinates in the partial image and then converted to machine coordinates of the manipulators 14a, 14b and the grippers 15, 16.

    [0115] Since the flat workpiece 8 is always gripped at a specified height above the manipulation surface 12, it is sufficient in the present case to ascertain the gripping coordinates and the gripping orientations in two dimensions by means of the artificial neural network 40.
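The conversion of paragraphs [0114] and [0115] from two-dimensional image coordinates to machine coordinates at a fixed gripping height can be sketched as a planar similarity transform. All calibration values (scale, rotation, offset, gripping height) are hypothetical placeholders; the patent does not disclose them.

```python
# Sketch: convert 2D gripping coordinates and a gripping orientation from a
# partial image into the machine coordinates of the manipulators/grippers.
import math

def image_to_machine(u, v, theta_img, tile_origin, scale=0.5, rotation=0.0,
                     offset=(100.0, 50.0), grip_height=20.0):
    """u, v: pixel coordinates in the partial image; theta_img: gripping
    orientation in the image plane (radians). scale is mm per pixel;
    rotation/offset describe the camera-to-machine alignment (assumed)."""
    # shift into full-image pixel coordinates, then scale to millimetres
    px = (tile_origin[0] + u) * scale
    py = (tile_origin[1] + v) * scale
    # rotate and translate into the coordinate frame of the manipulators
    x = px * math.cos(rotation) - py * math.sin(rotation) + offset[0]
    y = px * math.sin(rotation) + py * math.cos(rotation) + offset[1]
    # the z coordinate is the fixed gripping height above the surface
    return (x, y, grip_height, theta_img + rotation)
```

    The fixed gripping height above the manipulation surface 12 is what makes a two-dimensional determination by the artificial neural network 40 sufficient.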

    [0116] Here, the second gripping point for the second gripper 16 is always located spaced apart from the first gripping point for the first gripper 15.

    [0117] The flat workpiece 8 is now gripped in an automated fashion by means of the two grippers 15, 16 at the specified gripping points. For this purpose, the grippers 15, 16 are controlled by the control unit 32. Both grippers 15, 16 utilize a predefined gripping force of essentially the same magnitude, which may depend on the type of flat workpiece 8.

    [0118] The approach to and gripping of the grippable edge K can be seen for the first gripper 15 in the sequence of FIGS. 8 to 10. For the second gripper 16, this is effected analogously.

    [0119] Proceeding from the gripped state, now the grippers 15, 16 are to be moved to opposite corners of the flat workpiece 8, which define the grippable edge K.

    [0120] For this purpose, the flat workpiece 8 is first held by means of the second gripper 16, and the first gripper 15 is moved along the gripped edge K until the first gripper 15 grips a first corner E1 of the flat workpiece 8 (cf. arrow 45 in FIGS. 11 and 12).

    [0121] In doing so, a gripping force of the first gripper 15 is reduced in comparison to the previous gripping and in comparison to the second gripper 16, so that the first gripper 15 can slide along the edge K without, however, letting go of the flat workpiece 8 (cf. FIGS. 10 to 12).

    [0122] Subsequently, the second gripper 16 is moved, proceeding in the same way.

    [0123] That is, the flat workpiece 8 is held by the first gripper 15, for which purpose its gripping force is increased to the original level again. At the same time, the second gripper 16 moves along the gripped edge K until it grips a second corner E2 of the flat workpiece 8.

    [0124] To this end, a gripping force of the second gripper 16 is reduced in comparison with a gripping force of the first gripper 15 and in comparison with the previous gripping by means of the second gripper 16.

    [0125] In order to facilitate the movement of the grippers 15, 16 along the edge, in the illustrated example the flat workpiece 8 is slightly lifted off the manipulation surface 12. Sliding along is thus performed a few centimeters above the manipulation surface 12.

    [0126] In this connection, the corners E1, E2 may be detected by detecting an increased sliding resistance of the grippers 15, 16, which is possible in particular if the flat workpiece 8 is provided with a seam or border at its edges. Alternatively, the corners E1, E2 may be detected by means of a distance sensor. It is also possible to detect the corners E1, E2 in a camera-based manner or by means of acoustic vibration measurement.
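The corner-finding of paragraphs [0120] to [0126] can be summarized as the following control-flow sketch. All interface names, the step limit, and the resistance threshold are hypothetical; the patent describes the behavior but discloses no code, and only the sliding-resistance variant of corner detection is shown here.

```python
# Sketch: move a gripper along the gripped edge with reduced gripping force
# until the corner is detected via an increased sliding resistance (e.g. a
# seam or border at the edge of the flat workpiece).
def slide_to_corner(read_resistance, step_along_edge, max_steps=500,
                    resistance_threshold=5.0):
    """Step along the edge until the measured sliding resistance exceeds the
    threshold, then stop. Returns the number of steps taken, or None if no
    corner was found within max_steps."""
    for step in range(max_steps):
        if read_resistance() > resistance_threshold:
            return step  # corner reached: seam/border raises the resistance
        step_along_edge()
    return None
```

    The same loop structure applies to both grippers in turn: first the first gripper 15 slides to the first corner E1 while the second gripper 16 holds, then the roles are exchanged.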

    [0127] After the first corner E1 has been gripped by means of the first gripper 15 and the second corner E2 has been gripped by means of the second gripper 16, the flat workpiece 8 can be deposited in a flat state on the feed surface 10 by means of both grippers 15, 16. The target state is thus attained.

    [0128] FIGS. 14 to 16 show an alternative embodiment of a gripper, which is illustrated using the example of the first gripper 15. The second gripper is designed in the same way.

    [0129] This gripper also includes two fingers 15a, 15b. However, in the open state, these fingers are now arranged such that the grippable edge can be approached from above (see FIG. 14 in comparison to FIGS. 8 and 9).

    [0130] To close the gripper, the lower finger 15a is now moved parallel to the manipulation surface 12 towards the upper finger 15b. In the process, it first moves with its tip 46 under the grippable edge K and in this way slightly lifts the flat workpiece 8.

    [0131] The edge region of the flat workpiece 8 comprising the edge K is then pressed against the upper finger 15b by means of the curvature adjoining the tip 46. The curvature is designed to be so flat here that the edge K will not fold over as a result of being approached by means of the finger 15a.

    [0132] As in the previous embodiment, the lower finger 15a is again provided with a single ball and the upper finger 15b with a double ball (see in particular FIG. 16). As an alternative, it is possible in this connection as well to fit the respective upper fingers with a single ball and the respective lower fingers with a double ball.