METHOD FOR WORK PIECE CALIBRATION AND ROBOT SYSTEM USING THE SAME
20190232499 · 2019-08-01
Inventors
- Jiajing Tan (Shanghai, CN)
- Hao Gu (Shanghai, CN)
- Jinsong Li (Beijing, CN)
- Yan Xu (Shanghai, CN)
- Shaojie Cheng (Shanghai, CN)
- Lei Mao (Shanghai, CN)
CPC classification
G05B2219/39032
PHYSICS
G05B19/404
PHYSICS
B25J9/1664
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1617
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
A method for calibration of a work piece mounted in a predetermined manner to a work object, and a robot system using the same. The work object has a first surface, a second surface and a third surface, and the work object frame of reference is defined by a first coordinate line, a second coordinate line, and a third coordinate line at the intersections of the first surface, the second surface and the third surface converging on a point. The method includes: touching a first number of locations on the first surface of the work object positioned by the robot touch probe to measure their actual locations on the first surface in the robot frame of reference, and storing the measured first coordinates for the measured locations; touching a second number of locations on the second surface of the work object positioned by the robot touch probe to measure their actual locations on the second surface in the robot frame of reference, and storing the measured second coordinates for the measured locations; touching a third number of locations on the third surface of the work object positioned by the robot touch probe to measure their actual locations on the third surface in the robot frame of reference, and storing the measured third coordinates for the measured locations; and calculating the orientation and origin of the work object frame of reference relative to the robot frame of reference based on the measured first, second and third coordinates for the measured locations, where the work object is positioned in the robot cell. The method provides all the necessary data to determine the orientation and origin of the actual work object frame of reference relative to the robot frame of reference, and enables the robot to perform machining operations accurately at locations on a work object.
Claims
1. A method for calibration of a work piece mounted in a predetermined manner to a work object in a robot system, wherein the work object has a first surface, a second surface and a third surface, and wherein the work object frame of reference is defined by a first coordinate line, a second coordinate line, and a third coordinate line at the intersections of the first surface, the second surface and the third surface converging on a point, the method comprising: touching a first number of locations on the first surface of the work object positioned by the robot touch probe to measure their actual locations on the first surface in the robot frame of reference, and storing the measured first coordinates for the measured locations; touching a second number of locations on the second surface of the work object positioned by the robot touch probe to measure their actual locations on the second surface in the robot frame of reference, and storing the measured second coordinates for the measured locations; touching a third number of locations on the third surface of the work object positioned by the robot touch probe to measure their actual locations on the third surface in the robot frame of reference, and storing the measured third coordinates for the measured locations; and calculating the orientation and origin of the work object frame of reference relative to the robot frame of reference based on the measured first, second and third coordinates for the measured locations, where the work object is positioned in the robot cell.
2. The method according to claim 1, wherein the calculating comprises: determining the orientation of the work object frame of reference relative to the robot frame of reference based on orientations of the first coordinate line, the second coordinate line and the third coordinate line, which are calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations; and determining the origin offset between the work object frame of reference and the robot frame of reference based on the coordinates in the robot frame of reference for the converging point of the first, second and third coordinate lines, which is calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations.
3. The method according to claim 2, further comprising: regulating orientations of the second and third coordinate lines of the work object frame of reference to be normal to that of the first coordinate line of the work object frame of reference in the robot frame of reference.
4. The method according to claim 1, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
5. The method according to claim 1, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
6. A robot system, including: a work object having a first surface, a second surface and a third surface, wherein the work object frame of reference is defined by a first coordinate line, a second coordinate line, and a third coordinate line at the intersections of the first surface, the second surface and the third surface converging on a point, the work object being configured for having a work piece mounted thereto in a predetermined manner; a manipulator holding a touch probe; and a robot controller having a motion control module, a calculation module and a memory module; wherein: the motion control module is adapted for controlling the manipulator to touch a first number of locations on the first surface of the work object positioned by the robot touch probe to measure their actual locations on the first surface in the robot frame of reference, touch a second number of locations on the second surface of the work object positioned by the robot touch probe to measure their actual locations on the second surface in the robot frame of reference, and touch a third number of locations on the third surface of the work object positioned by the robot touch probe to measure their actual locations on the third surface in the robot frame of reference; the memory module is adapted for storing the measured first coordinates for the measured locations, storing the measured second coordinates for the measured locations, and storing the measured third coordinates for the measured locations; and the calculation module is adapted for calculating the orientation and origin of the work object frame of reference relative to the robot frame of reference based on the measured first, second and third coordinates for the measured locations, where the work object is positioned in the robot cell.
7. The robot system according to claim 6, wherein the calculation performed by the calculation module includes: determining the orientation of the work object frame of reference relative to the robot frame of reference based on orientations of the first coordinate line, the second coordinate line and the third coordinate line, which are calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations; and determining the origin offset between the work object frame of reference and the robot frame of reference based on the coordinates in the robot frame of reference for the converging point of the first, second and third coordinate lines, which is calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations.
8. The robot system according to claim 7, wherein: orientations of the second and third coordinate lines of the work object frame of reference may be regulated to be normal to that of the first coordinate line of the work object frame of reference in the robot frame of reference.
9. The robot system according to claim 6, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
10. The robot system according to claim 6, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
11. The method according to claim 2, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
12. The method according to claim 3, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
13. The method according to claim 2, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
14. The method according to claim 3, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
15. The method according to claim 4, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
16. The robot system according to claim 7, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
17. The robot system according to claim 8, wherein: the first, second and third surfaces are arranged substantially perpendicular to each other.
18. The robot system according to claim 7, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
19. The robot system according to claim 8, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
20. The robot system according to claim 9, wherein: the first number is equal to or greater than three; the second number is equal to or greater than three; and the third number is equal to or greater than three.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The subject matter of the invention will be explained in more detail in the following text with reference to preferred exemplary embodiments which are illustrated in the drawings, in which:
[0013]
[0014]
[0015] The reference symbols used in the drawings, and their meanings, are listed in summary form in the list of reference symbols. In principle, identical parts are provided with the same reference symbols in the figures.
PREFERRED EMBODIMENTS OF THE INVENTION
[0016] In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular circuits, circuit components, interfaces, techniques, etc., in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known methods, programming procedures, devices, and circuits are omitted so as not to obscure the description of the present invention with unnecessary detail.
[0017]
[0018] The work object 10, for example, may be shaped like a block, a cube or a rectangular parallelepiped, having a first surface S1, a second surface S2 and a third surface S3. The work object frame of reference Fw is defined by a first coordinate line Xw, a second coordinate line Yw, and a third coordinate line Zw at the intersections of the first surface S1, the second surface S2 and the third surface S3 converging on a point Ow. The work object 10 in the shape of a block is described as an example hereafter, wherein the first surface S1, the second surface S2 and the third surface S3 are three orthogonal planes.
[0019] The manipulator 11, holding the touch probe 12 or terminating in the touch probe 12, may be controlled by the robot controller 13. The probe 12, which may be a linear touch sensor, is operable to provide a signal to the robot controller 13 whenever its pointer 120 comes into contact with an object, i.e. is displaced.
[0020] The robot controller 13, which controls the functions of the robot system 1, may have a motion control module 130, a calculation module 131 and a memory module 132. The robot controller 13 is provided with a data link 14 for receiving commands to control the position of the manipulator 11 and of tools and sensors affixed thereto. A computer 16 provides an operator interface for editing the data stored in the memory module 132 of the robot controller 13.
[0021] For example, where the work object 10 is transferred from the first location L1 to a second location L2 in the robot cell within the manipulator's working range, the location of the work object 10 with respect to the robot frame of reference Fr has changed, so the positional relationship between the robot and the work object needs to be updated for the performance of a variety of tasks, such as machining work pieces. In practice, greater accuracy and operation efficiency may be required. Therefore, it is essential to determine the actual location and orientation of the work object relative to the robot. This is accomplished, generally, by identifying the work object frame of reference in coordinates of the robot frame of reference.
[0022] To identify the work object frame of reference in coordinates of the robot frame of reference at the second location L2, the motion control module 130 of the robot controller 13 may control the manipulator 11, in response to the command from the computer 16, to cause the touch probe 12 to touch the first surface S1 of the work object 10 at a first number of locations S1a, S1b, S1c, S1d; the actual locations of the first number of locations S1a, S1b, S1c, S1d are measured by the touch probe 12, and accordingly the measured first coordinates in the robot frame of reference Fr are provided to the robot controller 13 and stored in its memory module 132. The motion control module 130 of the robot controller 13 may control the manipulator 11, in response to the command from the computer 16, to cause the touch probe 12 to touch the second surface S2 of the work object 10 at a second number of locations S2a, S2b, S2c; the actual locations of the second number of locations S2a, S2b, S2c are measured by the touch probe 12, and accordingly the measured second coordinates in the robot frame of reference Fr are provided to the robot controller 13 and stored in its memory module 132. The touching locations on a surface of the work object may be randomly selected, provided that at least three of them are not collinear. In addition, the motion control module 130 of the robot controller 13 may control the manipulator 11, in response to the command from the computer 16, to cause the touch probe 12 to touch the third surface S3 of the work object 10 at a third number of locations S3a, S3b, S3c; the actual locations of the third number of locations S3a, S3b, S3c are measured by the touch probe 12, and accordingly the measured third coordinates in the robot frame of reference Fr are provided to the robot controller 13 and stored in its memory module 132.
In actual operation, the touch probe 12 may be moved quickly, in gross increments, to contact the work object surface, and then backed off in high-resolution increments, the position measurement being taken when the touch probe 12 loses contact with the work object surface; hence the concept of "touching off".
[0023] Using the measured coordinates for the first number of locations S1a, S1b, S1c, S1d (four locations, for example), the second number of locations S2a, S2b, S2c (three locations, for example) and the third number of locations S3a, S3b, S3c (three locations, for example), the first surface S1, the second surface S2 and the third surface S3 of the work object 10 may be modeled via mathematical algorithms, for example, equations (1) and (2), and thus the coordinates in the robot frame of reference Fr for the origin Ow of the work object frame of reference Fw may be calculated.
[0024] Where (X, Y, Z) indicate the coordinates in the robot frame of reference for the origin of the work object frame of reference, a.sub.s1, b.sub.s1 and c.sub.s1 are the three factors describing the first surface S1, a.sub.s2, b.sub.s2 and c.sub.s2 are the three factors describing the second surface S2, and a.sub.s3, b.sub.s3 and c.sub.s3 are the three factors describing the third surface S3, in three-dimensional algebraic form.
[0025] The 3D least-squares fitting algorithm to solve the a.sub.s1, b.sub.s1, c.sub.s1 factors minimizes the sum of the squared residuals (a.sub.s1X.sub.n,s1+b.sub.s1Y.sub.n,s1+c.sub.s1Z.sub.n,s1-1).sup.2 over the measured locations on the first surface S1, and likewise for the factors of the second and third surfaces.
[0026] Where X.sub.n,si, Y.sub.n,si, Z.sub.n,si are the measured coordinates for the nth measured location on the ith surface of the work object.
[0027] For example:
[0028] (X.sub.1,s1, Y.sub.1,s1, Z.sub.1,s1) indicate the measured coordinates in the robot frame of reference Fr for the first measured location S1a on the first surface S1;
[0029] (X.sub.2,s1, Y.sub.2,s1, Z.sub.2,s1) indicate the measured coordinates in the robot frame of reference Fr for the second measured location S1b on the first surface S1;
[0030] (X.sub.3,s1, Y.sub.3,s1, Z.sub.3,s1) indicate the measured coordinates in the robot frame of reference Fr for the third measured location S1c on the first surface S1;
[0031] (X.sub.4,s1, Y.sub.4,s1, Z.sub.4,s1) indicate the measured coordinates in the robot frame of reference Fr for the fourth measured location S1d on the first surface S1;
[0032] (X.sub.1,s2, Y.sub.1,s2, Z.sub.1,s2) indicate the measured coordinates in the robot frame of reference Fr for the first measured location S2a on the second surface S2;
[0033] (X.sub.2,s2, Y.sub.2,s2, Z.sub.2,s2) indicate the measured coordinates in the robot frame of reference Fr for the second measured location S2b on the second surface S2;
[0034] (X.sub.3,s2, Y.sub.3,s2, Z.sub.3,s2) indicate the measured coordinates in the robot frame of reference Fr for the third measured location S2c on the second surface S2;
[0035] (X.sub.1,s3, Y.sub.1,s3, Z.sub.1,s3) indicate the measured coordinates in the robot frame of reference Fr for the first measured location S3a on the third surface S3;
[0036] (X.sub.2,s3, Y.sub.2,s3, Z.sub.2,s3) indicate the measured coordinates in the robot frame of reference Fr for the second measured location S3b on the third surface S3;
[0037] (X.sub.3,s3, Y.sub.3,s3, Z.sub.3,s3) indicate the measured coordinates in the robot frame of reference Fr for the third measured location S3c on the third surface S3.
[0038] In summary, the origin offset between the work object frame of reference Fw and the robot frame of reference Fr may be determined based on the coordinates in the robot frame of reference for the converging point of the first, second and third coordinate lines, which is calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations.
[0039] The orientation of the work object frame of reference Fw relative to the robot frame of reference Fr may be determined based on the orientations of the first coordinate line Xw, the second coordinate line Yw, and the third coordinate line Zw, which may be calculated by applying a least-squares fitting algorithm respectively to the measured first, second and third coordinates for the measured locations.
[0040] Three vectors may be created from the calculation results. In particular, the vectors (a.sub.s1 b.sub.s1 c.sub.s1), (a.sub.s2 b.sub.s2 c.sub.s2) and (a.sub.s3 b.sub.s3 c.sub.s3) representing the orientation of the work object frame of reference Fw are stored in the memory module 132 of the robot controller 13.
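The surface fitting and origin calculation described above can be sketched as follows. This is an illustrative reconstruction, not the patent's own code: it assumes each surface is modeled as a plane a·X+b·Y+c·Z=1 fitted by least squares, with the origin Ow taken as the intersection of the three fitted planes; all function and variable names are hypothetical.

```python
def solve3(A, b):
    """Solve a 3x3 linear system A x = b by Gaussian elimination with pivoting."""
    n = 3
    M = [list(row) + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_plane(points):
    """Least-squares fit of a*x + b*y + c*z = 1 to >= 3 non-collinear points.

    Solves the normal equations (M^T M) p = M^T 1, where the rows of M are
    the measured coordinates (X_n, Y_n, Z_n) of the touched locations."""
    MtM = [[sum(p[i] * p[j] for p in points) for j in range(3)] for i in range(3)]
    Mt1 = [sum(p[i] for p in points) for i in range(3)]
    return solve3(MtM, Mt1)  # (a, b, c): the plane's scaled normal vector

def work_object_frame(pts_s1, pts_s2, pts_s3):
    """Fit the three surfaces and return (normals, origin) in the robot frame."""
    n1, n2, n3 = fit_plane(pts_s1), fit_plane(pts_s2), fit_plane(pts_s3)
    # Origin Ow: the single point satisfying all three plane equations.
    origin = solve3([n1, n2, n3], [1.0, 1.0, 1.0])
    return (n1, n2, n3), origin
```

Note that the a·X+b·Y+c·Z=1 form of the plane equation, taken from paragraph [0044], implicitly assumes no surface passes exactly through the robot origin, which holds for a work object placed away from the robot base.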
[0041] This provides all the necessary data to determine the orientation and origin of the actual work object frame of reference relative to the robot frame of reference. This enables the robot to perform machining operations accurately at locations on a work object.
[0042] Under the embodiment according to the present invention, the robot system is able to determine the orientation and origin of the work object frame of reference by touching four locations on the first surface of the work object, three locations on the second surface of the work object, and three locations on the third surface of the work object. The goal, of course, is to perform the minimum number of steps while achieving the highest degree of accuracy. In order to obtain a definite outcome from the least-squares fitting, the first number is equal to or greater than three, the second number is equal to or greater than three, and the third number is equal to or greater than three. The invention provides for quick and accurate determination of the orientation and location of a work object relative to a robot. The lack of a requirement to locate precisely defined points on the work object facilitates the overall process.
[0043] The orientations of the second and third coordinate lines of the work object frame of reference may be regulated to be normal to that of the first coordinate line of the work object frame of reference in the robot frame of reference. This procedure is disclosed in "An automatic industrial robot cell calibration method", Jingguo Ge & Hao Gu, Conference ISR ROBOTIK (p. 389), 2014, Berlin.
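One way to regulate the axes is a Gram-Schmidt orthogonalization that keeps the first fitted axis, forces the second to be normal to it, and completes a right-handed frame with the cross product. This is a minimal sketch under that assumption, not the specific procedure of the cited reference, and the names are illustrative:

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def regulate_axes(x_axis, y_axis):
    """Return an orthonormal (x, y, z) frame from two fitted axis vectors.

    x is kept, y is made normal to x by removing its projection onto x,
    and z is completed by the cross product (right-handed frame)."""
    x = normalize(x_axis)
    y = normalize([a - dot(y_axis, x) * b for a, b in zip(y_axis, x)])
    z = cross(x, y)
    return x, y, z
```

The projection step removes from the second fitted normal whatever component lies along the first, so small fitting errors that leave the two axes slightly non-perpendicular are absorbed into the second axis rather than distorting the first.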
[0044] Based on the embodiment according to the present invention, a positional relationship between the robot and a work piece may be determined even where the work piece has a shape without touchable geometric features, such as an irregular shape or a bumpy surface; for such a work piece, touching three points on it cannot form a plane in terms of AX+BY+CZ=1.
[0045] Similar to the embodiment according to
[0046] Positioning the work object and the associated work piece fairly accurately relative to a robot is a well-established procedure. Therefore, both the work object and work piece orientation and location are fairly accurately known in robot coordinates. In order to define the coordinates of the points on the work piece WP in the robot frame of reference Fr, off-line data 15, such as from a CAD database, are provided to and stored in the memory module 132 of the robot controller 13 and specify coordinates in the work object frame of reference Fw for multiple points WP1, WP2, WP3 on the work piece WP. In consideration of the orientation and origin in the robot frame of reference for the work object frame of reference and the coordinates of the points on the work piece WP in the work object frame of reference, the coordinates in the robot frame of reference for the multiple points WP1, WP2, WP3 on the work piece may be determined.
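The mapping of off-line CAD coordinates into the robot frame described above can be sketched as follows; this is an illustrative sketch with assumed names, where (u, v, w) are a point's coordinates in the work object frame Fw and Ow, Xw, Yw, Zw are the calibrated origin and unit axis vectors of Fw expressed in the robot frame Fr:

```python
def to_robot_frame(point_fw, origin, axes):
    """Transform a point from the work object frame into the robot frame.

    point_fw: (u, v, w) coordinates in the work object frame Fw.
    origin:   Ow, the work object origin in robot coordinates.
    axes:     tuple of the three orthonormal axis vectors (Xw, Yw, Zw)
              of Fw, expressed in robot coordinates.
    Returns p_r = Ow + u*Xw + v*Yw + w*Zw.
    """
    u, v, w = point_fw
    return [origin[i] + u * axes[0][i] + v * axes[1][i] + w * axes[2][i]
            for i in range(3)]
```

Applying this to each stored CAD point WP1, WP2, WP3 yields the robot-frame coordinates at which the manipulator must operate.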
[0047] In the scenario where the work piece may not have qualified geometric features and it is difficult to identify its actual location by directly touching the work piece with the manipulator, it is essential to determine the actual location and orientation of the work object relative to the robot frame of reference in the first place. This is accomplished, generally, by identifying the work object frame of reference in coordinates of the robot frame of reference as described above. Then, a coordinate transformation indicative of the position and orientation of the work piece in the work object frame of reference is determined and subsequently applied to the off-line data (work piece coordinates in the work object frame of reference) specifying the location and orientation of the work piece located thereon.
[0048] Though the present invention has been described on the basis of some preferred embodiments, those skilled in the art should appreciate that those embodiments should in no way limit the scope of the present invention. Without departing from the spirit and concept of the present invention, any variations and modifications to the embodiments should be within the apprehension of those with ordinary knowledge and skills in the art, and therefore fall within the scope of the present invention, which is defined by the appended claims.