METHOD FOR THE CONTROL OF A PROCESSING MACHINE OR OF AN INDUSTRIAL ROBOT

20210402613 · 2021-12-30


    Abstract

    A method for determining a position of a workpiece includes: acquiring image data of a workpiece via a camera which defines an optical axis parallel to an impact direction of a tool in a z-direction; searching for a reference structure of the workpiece using the acquired image data; determining a current position of at least one point of the structure in an x/y direction relative to the optical axis; comparing the current position with a nominal position thereof; generating commands to place the tool at an area of the workpiece to be machined; and, determining the current position in the z-direction of the optical axis by determining a current x/y image size of the structure, determining a distance of the structure from the camera by comparison with the known x/y size of the structure, and considering that distance when generating the commands.

    Claims

    1. A method for controlling a processing machine or an industrial robot, the method comprising: acquiring image data of at least one image of a workpiece via a first camera, the first camera defining an optical axis which is parallel to an impact direction of a tool in a z-direction; searching for a reference structure of the workpiece using the acquired image data; determining a current actual position of at least one reference point of the reference structure in an x/y direction relative to the optical axis of the first camera; comparing the current actual position of the reference structure with a nominal position of the reference structure; generating control commands to place the tool at at least one area or location of the workpiece to be machined; and, determining the current actual position of the reference structure in the z-direction of the optical axis by determining a current x/y image size of the reference structure in the acquired image data and by determining a distance of the reference structure from the first camera by comparison with the known actual x/y size of the reference structure and taking the distance of the reference structure from the first camera into account when generating the control commands.

    2. The method of claim 1, wherein the reference structure is an opening defined by the workpiece, a body edge of the workpiece or an elevation of the workpiece and at least one of a length dimension, a circumferential dimension and a diameter of the reference structure is known.

    3. The method of claim 1, wherein a virtual intersection point of at least two structures of the workpiece is formed and the intersection point serves as a reference point for subsequent relative movements between the workpiece and the tool.

    4. The method of claim 1, wherein the image data of the workpiece are acquired via a second camera from a recording direction oblique to the z-direction, the method further comprising carrying out a pattern recognition at least in regions of the workpiece on the basis of the acquired image data.

    5. The method of claim 4 further comprising: detecting a current machining state of the workpiece via the pattern recognition; and, comparing the detected current machining state with a nominal machining state.

    6. The method of claim 1, wherein the tool is a stud welding apparatus.

    7. The method of claim 1, wherein the tool is a projection welding apparatus.

    8. The method of claim 1, wherein the tool is a device for screwing.

    9. The method of claim 1, wherein the tool is a device for riveting.

    10. A method for determining a position of a workpiece, the method comprising: acquiring image data of at least one image of the workpiece via a first camera, wherein the first camera defines an optical axis; determining a reference structure of the workpiece on the basis of the acquired image data via an evaluation unit; determining a current actual position of at least one reference point of the reference structure in an x/y direction relative to the optical axis of the first camera via the evaluation unit; comparing the current actual position of the reference structure with a nominal position of the reference structure via the evaluation unit and generating comparison data; and, inferring the position of the workpiece with respect to a base coordinate system from the comparison data via the evaluation unit.

    11. The method of claim 10 further comprising: acquiring further image data of the workpiece via at least one further camera, wherein the at least one further camera defines a further optical axis; determining at least one further reference structure of the workpiece via the evaluation unit on the basis of the acquired further image data; determining the current actual position of at least one reference point of the further reference structure in the x/y direction relative to the further optical axis of the further camera via the evaluation unit; comparing the current actual position of the further reference structure with a nominal position of the further reference structure and generating further comparison data; and, considering the further comparison data when drawing conclusions about the position of the workpiece via the evaluation unit.

    12. The method of claim 10 further comprising: determining the current actual position of the reference structure in the z-direction of the optical axis of the first camera by determining a current x/y image size of the reference structure in the acquired image data; determining a distance of the reference structure from the first camera by comparison with the known actual x/y size of the reference structure; and, wherein the determined actual position of the reference structure in the z-direction is taken into account when inferring the position of the workpiece via the evaluation unit.

    13. The method of claim 11 further comprising: determining the current actual position of the reference structure and of the further reference structure in the z-direction of the optical axis of the corresponding one of the first camera and the further camera by determining a current x/y image size of the reference structure and of the further reference structure in respective ones of the acquired image data and the acquired further image data; determining a distance of the reference structure from the first camera by comparison with the known actual x/y size of the reference structure; determining a distance of the further reference structure from the further camera by comparison with the known actual x/y size of the further reference structure; and, wherein the determined actual position of the reference structure and the determined actual position of the further reference structure in the z-direction are taken into account when inferring the position of the workpiece via the evaluation unit.

    14. The method of claim 10, wherein the reference structure is an opening defined by the workpiece, a body edge of the workpiece or an elevation of the workpiece and at least one of a length dimension, a circumferential dimension and a diameter of the reference structure is known.

    15. The method of claim 10 further comprising: generating control commands via a control unit; and, moving a tool to at least one area or location of the workpiece to be machined via the generated control commands.

    16. The method of claim 10 further comprising: generating control commands via a control unit; and, interrupting or stopping a machining operation of the workpiece via the generated control commands.

    17. The method of claim 10, wherein the method is for controlling an industrial robot.

    18. A device for protecting a camera comprising: a sleeve defining an obliquely cut opening; and, said sleeve having an elongated part configured to have the camera arranged therein.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0042] The invention will now be described with reference to the drawings wherein:

    [0043] FIG. 1 shows a flow chart of a first embodiment of the method according to the disclosure;

    [0044] FIG. 2 shows a schematic representation of an example of a device suitable for carrying out the method;

    [0045] FIG. 3 shows a schematic representation of a construction of a virtual reference point;

    [0046] FIG. 4 shows a flow chart of a further embodiment of the method according to the disclosure;

    [0047] FIG. 5A shows a schematic representation of a first camera with a protective sleeve in a frontal view; and,

    [0048] FIG. 5B shows a schematic representation of the first camera with the protective sleeve in a side view.

    DESCRIPTION OF THE PREFERRED EMBODIMENTS

    [0049] The essential steps of a first embodiment of the method according to the disclosure are shown schematically in FIG. 1. With reference to FIG. 2, the method steps and the technical units for carrying out the method are described below.

    [0050] The step of capturing image data of a workpiece 1 may be performed using a first camera 2. The first camera 2 may be a miniature camera. It is aligned with its optical axis oa parallel to an impact direction of a tool 3, the impact direction pointing in the direction of the z-axis z of a Cartesian coordinate system with the axes x, y and z.

    [0051] A search for a reference structure 4 may be performed using image processing software installed on a computer unit 5. The computer unit 5 is, for example, a single-board PC. This makes decentralized image processing possible, so that a corresponding arrangement for carrying out the method can have a modular structure. Moreover, the arrangement can be retrofitted at low cost by replacing or reconfiguring the decentralized computer unit 5. In a further embodiment, the computer unit 5 can also be configured for image processing of a second camera 6 and be connected thereto via a data link.
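
    Merely as an illustration of such a search, and not as the actual software of the disclosure, a circular reference structure such as an opening could be located roughly as follows in Python with the OpenCV library (the function name and all parameter values are assumptions):

        import cv2

        def find_reference_structure(image_gray):
            # Search a grayscale camera image for a circular reference
            # structure (e.g., an opening) and return its center and
            # radius in pixel coordinates, or None if no plausible
            # candidate is found.
            blurred = cv2.GaussianBlur(image_gray, (9, 9), 2)
            circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT,
                                       dp=1.2, minDist=50, param1=100,
                                       param2=40, minRadius=10,
                                       maxRadius=200)
            if circles is None:
                return None
            x, y, r = circles[0][0]  # strongest candidate
            return float(x), float(y), float(r)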

    [0052] The determination of at least one reference point + of the reference structure 4 on the basis of the captured image data, as well as a comparison of the determined actual position of the reference structure 4 and the reference point + with a nominal position, can be carried out via an evaluation unit 7. The evaluation unit 7 may be a physical or virtual sub-unit of the computer unit 5 or a separate unit.

    [0053] The algorithms of the image processing of the computer unit 5 can be used for a determination of a distance a of the first camera 2 from the workpiece 1 in the z-direction (impact direction). For this purpose, detected and predetermined areas or sections of the reference structure 4, or the entire reference structure 4, are evaluated with respect to their x/y image size in the captured image and are related to the previously known actual x/y size of the reference structure 4.
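
    Under a simple pinhole-camera assumption, this relation reduces to one line of arithmetic. A minimal sketch, assuming the focal length in pixels is a known calibration value:

        def distance_from_image_size(image_size_px, actual_size_mm, focal_px):
            # Pinhole model: image_size_px = focal_px * actual_size_mm / a,
            # hence the distance a of the reference structure from the camera:
            return focal_px * actual_size_mm / image_size_px

        # Example: an opening of 10 mm diameter imaged with a diameter of
        # 80 px by a camera with an assumed focal length of 1200 px lies
        # about 1200 * 10 / 80 = 150 mm in front of the camera.
        a = distance_from_image_size(80, 10.0, 1200)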

    [0054] The determined information on the current position of the reference structure 4, in particular of the selected reference point +, together with the determined distance a in the z-direction, enables the generation of control commands to feed the tool 3 to a desired position of the workpiece 1 and to machine the workpiece 1. The control commands can be generated in a control unit 8. In response to the control commands, the tool 3 is fed and moved via a drive 9 (also referred to generally as the application). Via an actuator 10, for example a robot, the workpiece 1 and/or the drive 9 can be tracked on the basis of the data generated by the evaluation unit 7. All data connections can be configured as plug-in connections (shown schematically), whereby a higher flexibility and an increased ease of maintenance are achieved.
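
    As a purely illustrative sketch of this step (the function name and the calibration value focal_px are assumptions), the pixel offset of the reference point + could be converted into a metric x/y correction at the working distance a before control commands are generated:

        def correction_vector(actual_px, nominal_px, a_mm, focal_px):
            # At a working distance a, one pixel covers a_mm / focal_px
            # millimeters; scale the actual/nominal offset accordingly.
            scale = a_mm / focal_px
            dx = (actual_px[0] - nominal_px[0]) * scale
            dy = (actual_px[1] - nominal_px[1]) * scale
            return dx, dy

    The control unit 8 could then, for example, offset the programmed target position of the tool 3 by this vector.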

    [0055] All units may be connected to each other in a network. A database 11 may be connected to the control unit 8. Alternatively, the database 11 may be directly connected to the network.

    [0056] Since the first camera 2 and the tool 3 have a known spatial relationship, the position of the first camera 2 relative to the reference point + can also be used to determine the current position of the tool 3, both generally, for example within a base coordinate system with the axes x, y and z, and relative to the reference point +.
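
    Such a known spatial relationship can be modeled as a fixed homogeneous transformation between camera coordinates and tool coordinates. A minimal sketch, in which the offset values are placeholders measured once during setup:

        import numpy as np

        # Fixed camera-to-tool transform; all values are placeholders.
        T_CAM_TO_TOOL = np.array([[1.0, 0.0, 0.0, 25.0],
                                  [0.0, 1.0, 0.0,  0.0],
                                  [0.0, 0.0, 1.0, 40.0],
                                  [0.0, 0.0, 0.0,  1.0]])

        def in_tool_coordinates(point_cam_mm):
            # Express a point given in camera coordinates (mm) in the
            # tool coordinate system via the homogeneous transform.
            p = np.append(np.asarray(point_cam_mm, dtype=float), 1.0)
            return (T_CAM_TO_TOOL @ p)[:3]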

    [0057] The optionally provided second camera 6 is directed obliquely towards the workpiece 1. The optical axes of the first camera 2 and the second camera 6 enclose, for example, an angle greater than zero and less than 90°. Via the second camera 6, image data of the workpiece 1 can be acquired from the recording direction oblique to the z-direction. Based on the captured image data from the first camera 2 and/or the second camera 6, a pattern recognition is performed at least in regions of the workpiece 1. In this regard, the computer unit 5 and the evaluation unit 7 are configured to acquire this image data and to execute the pattern recognition process (see FIG. 1). If the optional step of pattern recognition is carried out, the information obtained in the pattern recognition can be taken into account in the generation of the control commands, or in a generation of further control commands, in the sense of a closed-loop control, as illustrated by the arrows drawn with broken lines in FIG. 1.
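
    One possible, merely illustrative realization of such a pattern recognition is normalized template matching against a stored nominal pattern (the OpenCV-based sketch below, including the threshold value, is an assumption and not the actual recognition process of the disclosure):

        import cv2

        def machining_state_ok(region_gray, template_gray, threshold=0.8):
            # Correlate a region of the workpiece image with a stored
            # nominal pattern; a correlation peak below the threshold
            # indicates a deviation from the nominal machining state.
            result = cv2.matchTemplate(region_gray, template_gray,
                                       cv2.TM_CCOEFF_NORMED)
            _, max_val, _, _ = cv2.minMaxLoc(result)
            return max_val >= threshold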

    [0058] In addition to checking the success of processing steps, the pattern recognition can serve to verify the presence of the reference structure 4 and to detect a permissible shape of the reference structure 4. In the event of a detected malfunction and/or an impermissible reference structure 4, an error signal may be output by the evaluation unit 7. Thereupon, a warning signal can be output, for example by the actuator 10, and optionally the tool 3 can be stopped via the control unit 8 and a corresponding control of the drive 9.

    [0059] A reference point + may actually exist or may be determined virtually. For example, a reference point + may be the center of a circular reference structure 4 (see FIG. 2) or may be a virtual intersection point. FIG. 3 illustrates the construction of such a virtual intersection point. Two body edges of the workpiece 1 recognized as reference structures 4, which do not actually abut, are virtually extended. The virtual intersection point of the reference structures 4 extended in this way is stored and used as the reference point +.
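
    Geometrically, this construction amounts to intersecting two straight lines. A minimal sketch, assuming each detected body edge has already been fitted as a point p and a direction d in the image plane:

        import numpy as np

        def virtual_intersection(p1, d1, p2, d2):
            # Solve p1 + t*d1 = p2 + s*d2 for t; (nearly) parallel
            # edges yield no usable virtual intersection point.
            A = np.column_stack((np.asarray(d1, float),
                                 -np.asarray(d2, float)))
            b = np.asarray(p2, float) - np.asarray(p1, float)
            if abs(np.linalg.det(A)) < 1e-9:
                return None
            t = np.linalg.solve(A, b)[0]
            return np.asarray(p1, float) + t * np.asarray(d1, float)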

    [0060] The essential steps of a further embodiment of the method according to the disclosure are shown in FIG. 4. FIG. 2 can also be used here to explain the technical units required for carrying out the method.

    [0061] The initial step of acquiring image data of a workpiece 1 is performed via the first camera 2. The first camera 2 may be set up as described above. The first camera 2 is fixedly arranged in a production line and forms the origin of a base coordinate system.

    [0062] The search is performed via image processing software installed on the computer unit 5. The computer unit 5 is preferably set up as already described above. The computer unit 5 may also be configured for image processing of at least one further camera 6 and may be data-connected thereto. The further camera 6 is likewise arranged in a stationary manner at a known distance from the first camera 2 and is oriented, at an angle greater than zero and less than 90° relative to the first camera 2, toward the same object field as the first camera 2. Both cameras thus capture the same image section from different perspectives. In other configurations, the optical axes of the first camera 2 and the further camera 6 are aligned in parallel and capture different image sections. For example, it may be provided that the first camera 2 captures a front side of the workpiece 1 and the second camera 6 captures a rear side of the workpiece 1.
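
    Where both cameras observe the same reference structure, their comparison data can be combined when inferring the position of the workpiece 1. A minimal sketch (function name and weights are assumptions), presupposing that both offset estimates have already been expressed in the base coordinate system:

        import numpy as np

        def fuse_offsets(offset_cam1_mm, offset_cam2_mm, w1=1.0, w2=1.0):
            # Weighted mean of the two displacement estimates; the
            # weights could reflect each camera's calibration accuracy.
            o1 = np.asarray(offset_cam1_mm, dtype=float)
            o2 = np.asarray(offset_cam2_mm, dtype=float)
            return (w1 * o1 + w2 * o2) / (w1 + w2)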

    [0063] The determination of at least one reference point + of the reference structure 4 on the basis of the acquired image data, as well as a comparison of the determined actual position of the reference structure 4 and the reference point + with a nominal position, is carried out via the evaluation unit 7, which generates comparison data therefrom.

    [0064] Finally, the evaluation unit 7 is arranged to infer, based on the comparison data, the position of the workpiece 1 with respect to the base coordinate system.

    [0065] Preferably, the algorithms of the image processing of the computer unit 5 are used for a determination of a distance a of the first camera 2 from the workpiece 1 in the z-direction, as described above (not shown in FIG. 4).

    [0066] Again, optionally, pattern recognition may be performed at least in regions of the workpiece 1. In this case, the computer unit 5 and the evaluation unit 7 are configured to acquire this image data and to execute the pattern recognition process.

    [0067] In order to protect the first camera 2 from damage caused during machining of the workpiece 1, for example by swarf, sparks, weld spatter and the like, the first camera 2 may be at least partially surrounded by a sleeve 12 having an obliquely cut opening. The first camera 2 is mounted on an inner side of the sleeve 12 in the area of the obliquely cut opening (FIG. 5A). In this way, the first camera 2 is protected on one side by the part of the sleeve 12 which is elongated in the region of the opening. The remaining detection range of the first camera 2 is nevertheless sufficiently large (FIG. 5B).

    [0068] It is understood that the foregoing description is that of the preferred embodiments of the invention and that various changes and modifications may be made thereto without departing from the spirit and scope of the invention as defined in the appended claims.

    REFERENCE SIGNS

    [0069] 1 Workpiece
    [0070] 2 First camera
    [0071] 3 Tool
    [0072] 4, 4′ Reference structure
    [0073] 5 Computer unit
    [0074] 6 Second/additional camera
    [0075] 7 Evaluation unit
    [0076] 8 Control unit
    [0077] 9 Drive, application
    [0078] 10 Actuator
    [0079] 11 Database
    [0080] 12 Sleeve
    [0081] oa Optical axis
    [0082] + Reference point