AUTOMATED CALIBRATION SYSTEM AND METHOD FOR A WORKPIECE COORDINATE FRAME OF A ROBOT
20210187745 · 2021-06-24
Inventors
- Chwen-Yi Yang (Hualien County, TW)
- CHENG-KAI HUANG (Taichung City, TW)
- JAN-HAO CHEN (Changhua County, TW)
- Yi-Ying LIN (Taichung City, TW)
- BING-CHENG HSU (Changhua County, TW)
CPC classification
G05B19/4015
PHYSICS
International classification
Abstract
An automated calibration system for a workpiece coordinate frame of a robot includes a physical image sensor having a first image central axis, and a controller for controlling the physical image sensor adapted on a robot to rotate by an angle to set up a virtual image sensor having a second image central axis. The first and the second image central axes are intersected at an intersection point. The controller controls the robot to repeatedly move back and forth a characteristic point on the workpiece between these two axes until the characteristic point overlaps the intersection point. The controller records a calibration point including coordinates of joints of the robot, then the controller moves another characteristic point and repeats the foregoing movement to generate several other calibration points. According to the calibration points, the controller calculates relative coordinates of a virtual tool center point and the workpiece to the robot.
Claims
1. An automated calibration system for a workpiece coordinate frame of a robot, comprising: a physical image sensor, having a first image central axis, disposed on a flange at an end of the robot; and a controller, connected with the robot and the physical image sensor, used for controlling the physical image sensor and the robot to construct a virtual image sensor having a second image central axis, the second image central axis intersecting the first image central axis to form an intersection point; wherein the controller controls the robot to move a characteristic point on a workpiece repeatedly back and forth between the first image central axis and the second image central axis until the characteristic point overlaps the intersection point, the characteristic point is recorded as a calibration point, one of a plurality of calibration points, including coordinates of a plurality of joints of the robot, then a next characteristic point is introduced to repeat the aforesaid movement for overlapping the intersection point again until a quantity of the plurality of calibration points is accepted, and each of the plurality of calibration points is evaluated to calculate coordinates of a virtual tool center point and the workpiece with respect to the robot.
2. The automated calibration system for a workpiece coordinate frame of a robot of claim 1, wherein the controller controls the robot to move according to a coordinate system of the robot with respect to a transformation relationship of another coordinate system of the physical image sensor and the virtual image sensor, and a plurality of images of the physical image sensor and the virtual image sensor.
3. The automated calibration system for a workpiece coordinate frame of a robot of claim 1, wherein each of the plurality of calibration points includes rotational angles of the plurality of joints with respect to a preset point.
4. The automated calibration system for a workpiece coordinate frame of a robot of claim 3, wherein the controller calculates coordinates of the virtual tool center point according to the plurality of calibration points and a Denavit-Hartenberg parameter of the robot.
5. The automated calibration system for a workpiece coordinate frame of a robot of claim 1, wherein a quantity of the plurality of calibration points is larger than or equal to a preset value.
6. The automated calibration system for a workpiece coordinate frame of a robot of claim 5, wherein, when the quantity of the plurality of calibration points is less than the preset value, the controller applies a random-number generator to generate an Euler-angle increment for correcting an Euler angle of the robot.
7. The automated calibration system for a workpiece coordinate frame of a robot of claim 6, wherein, after the controller corrects the Euler angle of the robot, the controller controls the robot to move another characteristic point repeatedly back and forth between the first image central axis and the second image central axis so as to obtain another calibration point, until the quantity of the calibration points is larger than or equal to the preset value.
8. The automated calibration system for a workpiece coordinate frame of a robot of claim 1, wherein, when the characteristic point overlaps the intersection point of the first image central axis and the second image central axis, a distance between the characteristic point and the first image central axis and another distance between the characteristic point and the second image central axis are less than a threshold value.
9. The automated calibration system for a workpiece coordinate frame of a robot of claim 1, wherein coordinates of the virtual tool center point are coordinates with respect to a base or the flange of the robot.
10. An automated calibration method for a workpiece coordinate frame of a robot, the robot connecting a controller, comprising the steps of: (i) providing a physical image sensor, forming an image coordinate system and having a first image central axis, disposed on a flange at an end of the robot; (ii) applying the controller to rotate the physical image sensor and the robot to construct a virtual image sensor having a second image central axis, the second image central axis intersecting the first image central axis to form an intersection point; (iii) the controller controlling the robot to move a characteristic point on a workpiece repeatedly back and forth between the first image central axis and the second image central axis, until the characteristic point and the intersection point are overlapped, and recording the characteristic point as a calibration point including coordinates of a plurality of joints of the robot; (iv) the controller controlling the robot to move a next characteristic point according to the aforesaid movement to overlap the intersection point so as to generate a plurality of other calibration points; and (v) based on each of the plurality of calibration points, calculating coordinates of a virtual tool center point and the workpiece with respect to the robot.
11. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, further including a method for constructing the virtual image sensor, the method for constructing the virtual image sensor including the steps of: (a) moving the robot to bring the characteristic point into a visual field of the physical image sensor, and obtaining coordinates of an arbitrary point within the visual field with respect to the flange; (b) obtaining a first point in the coordinate system of the physical image sensor, rotating by two different angles about the first image central axis to generate a second point and a third point, and calculating a center position according to the first point, the second point and the third point; (c) calculating a vector from the center position to a tool-image center, and transforming the vector in the image coordinate system into another vector with respect to the flange, the tool-image center being an intersection point of a coordinate axis of a tool and the first image central axis; (d) correcting coordinates of the first point, going back to perform Step (a) until the center position overlaps the tool-image center, and then obtaining a tool center coordinate; and (e) rotating the physical image sensor about an arbitrary coordinate axis of the flange by an angle so as to form the virtual image sensor.
12. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein the Step (iii) further includes the steps of: providing a transformation relationship between a coordinate system of the robot and another coordinate system of the physical image sensor and the virtual image sensor; and based on the transformation relationship, the physical image sensor and a plurality of images of the virtual image sensor, controlling the robot to move.
13. The automated calibration method for a workpiece coordinate frame of a robot of claim 12, wherein the step of providing the transformation relationship further includes the steps of: controlling the robot to move the characteristic point a distance along a horizontal axis of the coordinate system of the robot from an arbitrary position within an image-overlapped region of the physical image sensor and the virtual image sensor, and obtaining a first projection coordinate from the physical image sensor and the virtual image sensor; controlling the robot to move the characteristic point the distance along a vertical axis of the coordinate system of the robot from the arbitrary position within the image-overlapped region of the physical image sensor and the virtual image sensor, and obtaining a second projection coordinate from the physical image sensor and the virtual image sensor; and controlling the robot to move the characteristic point the distance along another vertical axis of the coordinate system of the robot from the arbitrary position within the image-overlapped region of the physical image sensor and the virtual image sensor, and obtaining a third projection coordinate from the physical image sensor and the virtual image sensor.
14. The automated calibration method for a workpiece coordinate frame of a robot of claim 13, wherein the step of providing the transformation relationship further includes the steps of: providing a first spatial vector, a second spatial vector and a third spatial vector corresponding to the first projection coordinate, the second projection coordinate and the third projection coordinate, respectively; based on an orthogonal relationship among the first spatial vector, the second spatial vector and the third spatial vector, calculating the first spatial vector, the second spatial vector and the third spatial vector; and based on the first spatial vector, the second spatial vector and the third spatial vector, calculating the transformation relationship between the coordinate system of the robot and that of the physical image sensor and the virtual image sensor.
15. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein coordinates of each of the plurality of joints are rotational angles of the plurality of joints with respect to a preset point.
16. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein the Step (v) further includes a step of, based on the plurality of calibration points and a Denavit-Hartenberg parameter of the robot, calculating coordinates of the virtual tool center point and the workpiece.
17. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein a quantity of the plurality of calibration points is larger than or equal to a preset value.
18. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein the Step (iii) further includes a step of ensuring that a distance between the characteristic point and the first image central axis and another distance between the characteristic point and the second image central axis are less than a threshold value.
19. The automated calibration method for a workpiece coordinate frame of a robot of claim 10, wherein the Step (iv) further includes a step of, when a quantity of the plurality of calibration points is less than a preset value, applying a random-number generator to generate an Euler-angle increment for correcting an Euler angle of the robot.
20. The automated calibration method for a workpiece coordinate frame of a robot of claim 19, further including a following step of controlling the robot to move repeatedly another characteristic point back and forth between the first image central axis and the second image central axis so as to obtain another calibration point, until a quantity of the calibration points is larger than or equal to the preset value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The present disclosure will become more fully understood from the detailed description given herein below and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present disclosure and wherein:
DETAILED DESCRIPTION
[0031] In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
[0032] Referring to the system framework of
[0033] It shall be emphasized that, in the system and method provided by this disclosure, no physical tooling is needed during the calibration process; instead, characteristic points or designated points on the workpiece W are used for calibration. On the workpiece W, the only requirement for a qualified characteristic point is that it lie within an intersection region of the visual fields of the physical image sensor 11 and the virtual image sensor 12; for example, a center or an intersection point of lines and planes. As shown in
[0034] The physical image sensor 11 has a visual field which has a first image central axis A, and is disposed on a flange F at an end of the robot R. The flange F is defined with a coordinate system (x.sub.f-y.sub.f-z.sub.f). The visual field of the physical image sensor 11 intersects a Z axis (z.sub.f) originated at a center of the flange F, and the Z axis (z.sub.f) is perpendicular to a horizontal plane expanded by an X axis (x.sub.f) and a Y axis (y.sub.f) of the coordinate system (x.sub.f-y.sub.f-z.sub.f).
[0035]
[0036] Firstly, the robot R moves an arbitrary designated point on the workpiece W into an arbitrary position within the visual field of the physical image sensor 11, and the position of the designated point is defined as an origin O (not shown in the figure) of an image coordinate system. The image coordinate system is the coordinate system (x.sub.1C-y.sub.1C-z.sub.1C) formed by the images captured by the physical image sensor 11.
[0037] After the origin O is moved to overlap a designated point, the robot R moves in an x.sub.R direction of the coordinate system of the robot R by an arbitrary distance LR so as to obtain a projected coordinate point P′.sub.x1=(x.sub.11, y.sub.11) of the physical image sensor 11, and a spatial vector for this point P′.sub.x1=(x.sub.11, y.sub.11) is defined as {right arrow over (U.sub.1)}=(−x.sub.11, −y.sub.11, −z.sub.11).
[0038] Similarly, after the origin O is moved to overlap another designated point, the robot R moves in a y.sub.R direction of the coordinate system of the robot R by the arbitrary distance LR so as to obtain another projected coordinate point P′.sub.y1=(x.sub.21, y.sub.21) of the physical image sensor 11, and a spatial vector for this point P′.sub.y1=(x.sub.21, y.sub.21) is defined as {right arrow over (V.sub.1)}=(−x.sub.21, −y.sub.21, −z.sub.21).
[0039] Similarly, after the origin O is moved to overlap a further designated point, the robot R moves in a z.sub.R direction of the coordinate system of the robot R by the arbitrary distance LR so as to obtain a further projected coordinate point P′.sub.z1=(x.sub.31, y.sub.31) of the physical image sensor 11, and a spatial vector for this point P′.sub.z1=(x.sub.31, y.sub.31) is defined as {right arrow over (W.sub.1)}=(−x.sub.31, −y.sub.31, −z.sub.31).
[0040] According to orthogonality of the coordinate system, the following simultaneous equations can be obtained:
{right arrow over (U.sub.1)}·{right arrow over (V.sub.1)}=0  (1)
{right arrow over (V.sub.1)}·{right arrow over (W.sub.1)}=0  (2)
{right arrow over (U.sub.1)}·{right arrow over (W.sub.1)}=0  (3)
[0041] Thus, constant vectors {right arrow over (U.sub.1)}, {right arrow over (V.sub.1)}, {right arrow over (W.sub.1)} can be obtained.
[0042] Theoretically, these simultaneous equations have two sets of solutions, (z.sub.11, z.sub.21, z.sub.31) and (z.sub.12, z.sub.22, z.sub.32), in which (z.sub.11, z.sub.21, z.sub.31)=−(z.sub.12, z.sub.22, z.sub.32). Thus, the distance variation between two arbitrary designated points on the workpiece in the image can be utilized to judge whether, while moving in the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R, the designated point moves toward or away from the physical image sensor 11, such that the correct branch of the solutions can be determined.
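For illustration only (not part of the claimed disclosure), the branch structure of equations (1)-(3) can be made explicit. Writing a=x.sub.11x.sub.21+y.sub.11y.sub.21, b=x.sub.21x.sub.31+y.sub.21y.sub.31 and c=x.sub.11x.sub.31+y.sub.11y.sub.31, the constraints reduce to z.sub.11z.sub.21=−a, z.sub.21z.sub.31=−b and z.sub.11z.sub.31=−c, so z.sub.11.sup.2=−ac/b. A minimal Python sketch (function name and interface are illustrative assumptions):

```python
import math

def solve_depth_components(p_x1, p_y1, p_z1):
    """Recover the two candidate (z11, z21, z31) branches from the
    projected coordinates P'x1, P'y1, P'z1, using the pairwise
    orthogonality of U1, V1, W1 (equations (1)-(3))."""
    (x11, y11), (x21, y21), (x31, y31) = p_x1, p_y1, p_z1
    a = x11 * x21 + y11 * y21   # U1.V1 = 0  =>  z11*z21 = -a
    b = x21 * x31 + y21 * y31   # V1.W1 = 0  =>  z21*z31 = -b
    c = x11 * x31 + y11 * y31   # U1.W1 = 0  =>  z11*z31 = -c
    z11 = math.sqrt(-a * c / b)              # from z11^2 = -a*c/b
    z21, z31 = -a / z11, -c / z11
    # The mirror branch is the second solution set noted in [0042].
    return (z11, z21, z31), (-z11, -z21, -z31)
```

The correct branch is then chosen from the observed image-scale change, i.e., whether the designated point moved toward or away from the physical image sensor 11, as described above.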
[0043] Hence, the coordinates of the physical image sensor 11 with respect to the coordinate system of the flange F are:
[0044] in which .sup.flangeT.sub.CCD stands for the coordinates of the physical image sensor 11 with respect to the coordinate system of the flange F, and .sup.baseT.sub.flange stands for the coordinates of the flange F with respect to the base coordinate system (x.sub.b-y.sub.b-z.sub.b) of the robot R.
[0045] Thereupon, the transformation relationship between the coordinate system of the robot R and that of the physical image sensor 11 can be obtained as follows:
S.sub.R=.sup.baseT.sub.flange.sup.flangeT.sub.CCDS.sub.c (5)
[0046] in which S.sub.c stands for the movement in the coordinate system (x.sub.1C-y.sub.1C-z.sub.1C) of the physical image sensor 11, and S.sub.R stands for the movement in the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R.
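Equation (5) is a chain of homogeneous transforms. The following pure-Python sketch (helper names are illustrative assumptions, not from the disclosure) shows how a movement expressed in the image frame maps into the robot frame:

```python
def mat_mul(A, B):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_vec(A, v):
    """Apply a 4x4 homogeneous transform to a point [x, y, z, 1]."""
    return [sum(A[i][k] * v[k] for k in range(4)) for i in range(4)]

def camera_motion_to_robot(base_T_flange, flange_T_ccd, s_c):
    """Equation (5): S_R = baseT_flange * flangeT_CCD * S_c,
    mapping a movement S_c in the image frame into the robot frame."""
    base_T_ccd = mat_mul(base_T_flange, flange_T_ccd)
    return mat_vec(base_T_ccd, s_c)
```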
[0047] After the transformation relationship of coordinate systems between the physical image sensor 11 and the robot R has been established, any point within the visual field of the physical image sensor 11 is rotated, with respect to an arbitrary coordinate axis of the flange F, by an angle θ.sub.v to generate a second viewing angle to locate the virtual image sensor 12, as shown in
[0048] Referring now to
[0049] Step (a): Move the robot R to an arbitrary characteristic point WPi within the visual field of the physical image sensor 11, and assign [0, 0, D.sub.Z] to be the coordinates of another arbitrary point D within the visual field with respect to the flange F, i.e., D=[0, 0, D.sub.Z] as shown in
[0050] Step (b): Apply the physical image sensor 11 to obtain a coordinate C.sub.1 of the characteristic point WPi with respect to the image coordinate system, and then the physical image sensor 11 rotates twice about the Z axis (z.sub.1c axis) so as to generate two coordinates C.sub.2 and C.sub.3. Further, according to the arcs formed by coordinates C.sub.1, C.sub.2, C.sub.3, calculate a center position D.sub.1, as shown in
[0051] Step (c): Calculate a vector ({right arrow over (D.sub.1I.sub.1)}).sub.c from the center position D.sub.1 to a tool-image center I1, and transform the vector ({right arrow over (D.sub.1I.sub.1)}).sub.c into a vector with respect to the flange F, i.e., vector ({right arrow over (D.sub.1I.sub.1)}).sub.f=.sup.flangeT.sub.CCD({right arrow over (D.sub.1I.sub.1)}).sub.c. The tool-image center I1 is the intersection point of the Z axis (ztool) of the tooling coordinate system and the first image central axis A.
[0052] Step (d): Correct the coordinate of point D by D=D+L(D.sub.1I.sub.1).sub.f, in which L( ) stands for a distance function. Then, go back to perform Step (a), until the center position D.sub.1 overlaps the image center point I1. Thus, coordinate I of the virtual tool center point can be obtained. Based on each change in the vector ({right arrow over (D.sub.1I.sub.1)}).sub.c, adjust the constants of the function L( ).
[0053] Step (e): Rotate the physical image sensor 11 an angle θ.sub.v about an arbitrary coordinate axis of the flange F so as to generate the second viewing angle to locate the virtual image sensor 12.
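Steps (a)-(d) above form an iterative correction loop. A hedged Python sketch is given below; the measurement callable, the gain, and all names are illustrative assumptions standing in for the sensor operations of Steps (b)-(c):

```python
def calibrate_virtual_tcp(measure_offset_c, flange_T_ccd_rot, d_z0,
                          gain=0.5, tol=1e-3, max_iter=100):
    """Iterate Steps (a)-(d): correct point D until the fitted circle
    center D1 overlaps the tool-image center I1.

    measure_offset_c: callable returning the current vector (D1 -> I1)
        in the image frame for a given D (an assumed stand-in for the
        measurements of Steps (b)-(c)).
    flange_T_ccd_rot: 3x3 rotation mapping image-frame vectors into the
        flange frame (the rotational part of flangeT_CCD).
    d_z0: initial guess [0, 0, D_Z] for point D (Step (a)).
    """
    d = list(d_z0)
    for _ in range(max_iter):
        v_c = measure_offset_c(d)                      # (D1 -> I1), image frame
        v_f = [sum(flange_T_ccd_rot[i][k] * v_c[k] for k in range(3))
               for i in range(3)]                      # into the flange frame
        if sum(x * x for x in v_f) ** 0.5 < tol:       # D1 overlaps I1
            return d
        d = [d[i] + gain * v_f[i] for i in range(3)]   # Step (d): correct D
    raise RuntimeError("virtual TCP calibration did not converge")
```

The `gain` plays the role of the adjustable constants of the function L( ) in Step (d).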
[0054] Referring to
[0055] Step (a1): Apply the physical image sensor 11 to capture the image information including coordinates (x.sub.c1,y.sub.c1) of the designated point with respect to the physical image sensor 11.
[0056] Step (b1): Transform the movement in the image coordinate system into S.sub.R=.sup.baseT.sub.flange.sup.flangeT.sub.CCDS.sub.c in the coordinate system of the robot R, as shown in
[0057] Step (c1): Move the robot R along S.sub.R, until the designated point reaches the coordinate axis of the physical image sensor 11.
[0058] Step (d1): If the designated point does not overlap the center point of the physical image sensor 11, then go back to perform Step (a1). Otherwise, if the designated point overlaps the intersection points of axial lines, then the visual servo control process is completed.
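The visual-servo loop of Steps (a1)-(d1) can be sketched as follows; the callables for image capture, coordinate transformation and robot motion are illustrative assumptions, not part of the disclosure:

```python
def visual_servo(capture_point, move_robot, image_to_robot,
                 tol=1e-3, max_iter=200):
    """Drive the designated point onto the image central axis.

    capture_point: returns (x_c, y_c), the point's image coordinates
        relative to the image center (Step (a1)) -- an assumption.
    image_to_robot: maps an image-frame correction into a robot-frame
        movement S_R per equation (5) (Step (b1)).
    move_robot: commands the robot to move by S_R (Step (c1)).
    """
    for _ in range(max_iter):
        x_c, y_c = capture_point()
        if (x_c * x_c + y_c * y_c) ** 0.5 < tol:   # Step (d1): overlapped
            return True
        s_r = image_to_robot((-x_c, -y_c, 0.0))    # correct toward the axis
        move_robot(s_r)
    return False
```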
[0059] Referring to
[0060] The controller 13 controls the robot R as well as the physical image sensor 11 or the virtual image sensor 12 to rotate synchronously by an angle, so as to move an arbitrary characteristic point WPi on the workpiece W repeatedly back and forth between the first image central axis A and the second image central axis B, as shown in
[0061] From
[0062] Then, the controller 13 goes further to determine whether or not a quantity of the calibration points is larger than or equal to a preset value. In this embodiment, the quantity of the calibration points needs to be larger than or equal to 3. If the quantity of the calibration points is less than 3, then the controller 13 can apply a random-number generator to produce Euler-angle increments ΔR.sub.x, ΔR.sub.y, ΔR.sub.z for correcting the Euler angles of the robot R so as to vary the posture of the robot R. At this time, the Euler angles of the robot R can be expressed by (R.sub.x+ΔR.sub.x, R.sub.y+ΔR.sub.y, R.sub.z+ΔR.sub.z), in which (R.sub.x, R.sub.y, R.sub.z) stands for the original Euler angles of the robot R, R.sub.x stands for the yaw angle, R.sub.y for the pitch angle, and R.sub.z for the roll angle. If the corrected Euler angle exceeds a motion region of the robot R or the overlapped region IA, then the controller 13 applies the random-number generator again to generate a new set of Euler-angle increments.
[0063] Then, after a new Euler angle and a next characteristic point WPi are obtained, the controller 13 controls the robot R to move the virtual tool center point TCP repeatedly back and forth between the first image central axis A and the second image central axis B. When the virtual tool center point TCP overlaps the second image central axis B, a second calibration point CP2 is recorded.
[0064] Then, the controller 13 determines whether the quantity of the calibration points is larger than or equal to 3. In the case that the controller 13 judges that the quantity of the calibration points is less than 3, the controller 13 repeats the aforesaid procedures to obtain and record a third calibration point CP3. The same process is repeated until the controller 13 confirms that the quantity of the calibration points is larger than or equal to 3.
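The calibration-point collection procedure of paragraphs [0062]-[0064], including the random Euler-angle perturbation, can be sketched as follows (all routine names and the increment range are illustrative assumptions):

```python
import random

def collect_calibration_points(servo_to_intersection, read_joints,
                               within_limits, min_points=3, max_delta=5.0):
    """Gather calibration points until at least min_points are recorded,
    perturbing the posture with random Euler-angle increments
    (dRx, dRy, dRz) between attempts, as in [0062].

    servo_to_intersection: runs the back-and-forth servo motion for the
        given Euler offsets and reports whether the point overlapped the
        intersection of the two image central axes (an assumption).
    read_joints: returns the current joint coordinates of the robot.
    within_limits: checks that a candidate posture stays inside the
        motion region and the overlapped region IA.
    """
    points = []
    euler = [0.0, 0.0, 0.0]   # current (Rx, Ry, Rz) offsets: yaw, pitch, roll
    while len(points) < min_points:
        if servo_to_intersection(euler):      # point overlapped the axes
            points.append(read_joints())      # record the calibration point
        # draw a new random posture; re-draw if it leaves the motion region
        while True:
            delta = [random.uniform(-max_delta, max_delta) for _ in range(3)]
            candidate = [e + d for e, d in zip(euler, delta)]
            if within_limits(candidate):
                euler = candidate
                break
    return points
```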
[0065] As described above, in the calibration process of this disclosure, at least 3 designated points in the workpiece coordinate system are adopted as the aforesaid designated or characteristic points. For example, these three designated points can be the origin of the workpiece coordinate system, an arbitrary point on the X axis of the workpiece coordinate system, and an arbitrary point on the X-Y plane of the workpiece coordinate system. Firstly, the robot R is controlled to move an i-th designated point in the workpiece coordinate system (i.e., the i-th characteristic point WPi on the workpiece W) into an overlapped visual region of the physical image sensor 11 and the virtual image sensor 12. The aforesaid moving operation is repeated until the number i is greater than a preset number, so that the information-collection process of the designated points for calibration can be completed. Based on the plurality of calibration points, coordinates of the virtual TCP and the workpiece can be calculated.
[0066] As shown in
[0067] The coordinates of the virtual tool center point TCP can be derived by the following equation (6):
T.sub.1iT.sub.2=P  (6)
[0068] in which matrix T.sub.1i is a 4×4 transformation matrix for transforming coordinates of the i-th calibration point from the base coordinate system (x.sub.b-y.sub.b-z.sub.b) to the coordinate system (x.sub.f-y.sub.f-z.sub.f) of the flange F, column T.sub.2 stands for the coordinates of the virtual tool center point TCP in the coordinate system of the flange F, and column P stands for the coordinates of the calibration point in the base coordinate system (x.sub.b-y.sub.b-z.sub.b). Since each of the calibration points yields three linear equations through equation (6), 3n linear equations can be obtained for n calibration points. Then, a pseudo-inverse matrix can be applied to derive the coordinates of the virtual tool center point TCP. As shown below, equation (7) is derived from equation (6).
[0069] in which column (e.sub.11i, e.sub.21i, e.sub.31i) stands for a coordinate vector for the i-th calibration point at the x.sub.f axis of the base coordinate system (x.sub.b-y.sub.b-z.sub.b), column (e.sub.12i, e.sub.22i, e.sub.32i) stands for a coordinate vector for the i-th calibration point at the y.sub.f axis of the base coordinate system (x.sub.b-y.sub.b-z.sub.b), and column (e.sub.13i, e.sub.23i, e.sub.33i) stands for a coordinate vector for the i-th calibration point at the z.sub.f axis of the base coordinate system (x.sub.b-y.sub.b-z.sub.b). From equation (7), equations (8) and (9) can be derived as follows:
[0070] in which
T.sub.3.sup.t is a transpose matrix of T.sub.3, and (T.sub.3T.sub.3.sup.t).sup.−1 is an inverse matrix of (T.sub.3T.sub.3.sup.t).
[0071] If the quantity of the calibration points is met, plug the entries of matrix T with respect to the known i-th calibration point into equation (8), and, through a shift operation upon matrix T.sub.3, equation (9) can be obtained. Thus, coordinates (T.sub.x, T.sub.y, T.sub.z) of the virtual tool center point TCP in the coordinate system of the flange F and coordinates (P.sub.x, P.sub.y, P.sub.z) of the virtual tool center point TCP in the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R can be obtained. Thereby, calibration of the coordinates (T.sub.x, T.sub.y, T.sub.z) of the virtual tool center point TCP can be completed, and coordinates for the workpiece W can be calculated.
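The pseudo-inverse solution of equations (6)-(9) amounts to stacking, for each calibration point i, the three linear equations R.sub.iT.sub.2−P=−t.sub.i, where (R.sub.i, t.sub.i) are the rotation and translation of .sup.baseT.sub.flange at that pose, and solving the resulting 3n×6 system in least squares. A self-contained Python sketch follows (the function name is an illustrative assumption, and a normal-equations solve stands in for the pseudo-inverse):

```python
def solve_virtual_tcp(transforms):
    """Solve the stacked form of equation (6): for each calibration
    pose i, rows [R_i | -I] * [T2; P] = -t_i, where T2 is the virtual
    TCP in the flange frame and P the calibration point in the base
    frame. transforms: list of (R, t) with R a 3x3 rotation and t the
    flange position. Needs >= 3 poses with distinct orientations."""
    rows, rhs = [], []
    for R, t in transforms:
        for r in range(3):
            rows.append(R[r] + [-1.0 if c == r else 0.0 for c in range(3)])
            rhs.append(-t[r])
    # normal equations M u = v, then Gaussian elimination (6 unknowns)
    n = 6
    M = [[sum(row[i] * row[j] for row in rows) for j in range(n)]
         for i in range(n)]
    v = [sum(rows[k][i] * rhs[k] for k in range(len(rows))) for i in range(n)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))  # partial pivot
        M[i], M[p] = M[p], M[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
            v[r] -= f * v[i]
    u = [0.0] * n
    for i in range(n - 1, -1, -1):   # back-substitution
        u[i] = (v[i] - sum(M[i][j] * u[j] for j in range(i + 1, n))) / M[i][i]
    return u[:3], u[3:]              # (T2 in flange frame, P in base frame)
```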
[0072] As described above, in this embodiment, the automated calibration system 1 for a workpiece coordinate frame of a robot can apply the visual-servo means to automatically calibrate the coordinates of the robot with respect to the workpiece to be machined, with an acceptable calibration precision. Thus, related labor and time costs can be effectively reduced. In addition, since the automated calibration system 1 for a workpiece coordinate frame of a robot can calibrate the coordinates of the robot with respect to the workpiece in a single calibration process, the automated calibration system 1 provided by this disclosure can effectively improve existing shortcomings in the art.
[0073] Referring now to
[0074] Step S51: Control the robot R to move an arbitrary designated point within a visual field of the physical image sensor 11 a distance L.sub.R along a horizontal axis x.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from an arbitrary position in an image-overlapped region IA, and obtain a first projection coordinate P.sub.x1′ through the physical image sensor 11.
[0075] Step S52: Control the robot R to move the designated point the distance L.sub.R along a vertical axis y.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a second projection coordinate P.sub.y1′ through the physical image sensor 11.
[0076] Step S53: Control the robot R to move the designated point the distance L.sub.R along another vertical axis z.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a third projection coordinate P.sub.z1′ through the physical image sensor 11.
[0077] Step S54: Provide a first spatial vector {right arrow over (U.sub.1)}, a second spatial vector {right arrow over (V.sub.1)} and a third spatial vector {right arrow over (W.sub.1)} corresponding to the first projection coordinate P.sub.x1′, the second projection coordinate P.sub.y1′ and the third projection coordinate P.sub.z1′, respectively.
[0078] Step S55: Based on an orthogonal relationship formed by the first spatial vector {right arrow over (U.sub.1)}, the second spatial vector {right arrow over (V.sub.1)} and the third spatial vector {right arrow over (W.sub.1)}, calculate the first spatial vector {right arrow over (U.sub.1)}, the second spatial vector {right arrow over (V.sub.1)} and the third spatial vector {right arrow over (W.sub.1)}.
[0079] Step S56: Based on the first spatial vector {right arrow over (U.sub.1)}, the second spatial vector {right arrow over (V.sub.1)} and the third spatial vector {right arrow over (W.sub.1)}, calculate a transformation relationship between the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R and the coordinate system (x.sub.1C-y.sub.1C-z.sub.1C) of the physical image sensor 11, referring to equation (4).
[0080] Referring to
[0081] Step S61: Control the robot R to move an arbitrary designated point within a visual field of the virtual image sensor 12 a distance L.sub.R along a horizontal axis x.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from an arbitrary position in an image-overlapped region IA, and obtain a first projection coordinate P.sub.x2′ through the virtual image sensor 12.
[0082] Step S62: Control the robot R to move the designated point the distance L.sub.R along a vertical axis y.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a second projection coordinate P.sub.y2′ through the virtual image sensor 12.
[0083] Step S63: Control the robot R to move the designated point the distance L.sub.R along another vertical axis z.sub.R of the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a third projection coordinate P.sub.z2′ through the virtual image sensor 12.
[0084] Step S64: Provide a first spatial vector {right arrow over (U.sub.2)}, a second spatial vector {right arrow over (V.sub.2)} and a third spatial vector {right arrow over (W.sub.2)} corresponding to the first projection coordinate P.sub.x2′, the second projection coordinate P.sub.y2′ and the third projection coordinate P.sub.z2′ respectively.
[0085] Step S65: Based on an orthogonal relationship formed by the first spatial vector {right arrow over (U.sub.2)}, the second spatial vector {right arrow over (V.sub.2)} and the third spatial vector {right arrow over (W.sub.2)}, calculate the first spatial vector {right arrow over (U.sub.2)}, the second spatial vector {right arrow over (V.sub.2)} and the third spatial vector {right arrow over (W.sub.2)}.
[0086] Step S66: Based on the first spatial vector {right arrow over (U.sub.2)}, the second spatial vector {right arrow over (V.sub.2)} and the third spatial vector {right arrow over (W.sub.2)}, calculate a transformation relationship between the coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robot R and a coordinate system (x.sub.2C-y.sub.2C-z.sub.2C) of the virtual image sensor 12, referring to equation (8).
[0087] Referring now to
[0088] Step S71: Provide a physical image sensor 11 having a first image central axis A.
[0089] Step S72: Provide a virtual image sensor 12 having a second image central axis B intersecting the first image central axis A at an intersection point I.
[0090] Step S73: Control a robot R to move a characteristic point WPi of a workpiece repeatedly back and forth between the first image central axis A and the second image central axis B.
[0091] Step S74: When the characteristic point WPi and the intersection point I overlap, record a calibration point including coordinates of a plurality of joints J1-J6 of the robot R.
[0092] Step S75: Repeat the aforesaid steps to generate a plurality of other calibration points.
[0093] Step S76: Based on all of the plurality of calibration points, calculate coordinates of a virtual tool center point TCP and the workpiece with respect to the robot R.
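The overlap condition behind Steps S73 and S74 can be sketched geometrically: a point that lies (within tolerance) on both image central axes must coincide with their intersection point I. The sketch below assumes each axis is given as a base point plus a unit direction; the axes, tolerance, and coordinates are illustrative values, not from the disclosure.

```python
# Illustrative sketch of the overlap test: the characteristic point
# overlaps the intersection point I exactly when its distance to both
# image central axes falls below a tolerance.

def point_to_axis_distance(p, a0, d):
    """Distance from point p to the line through a0 with unit direction d."""
    v = [pi - ai for pi, ai in zip(p, a0)]           # vector a0 -> p
    t = sum(vi * di for vi, di in zip(v, d))         # projection onto d
    perp = [vi - t * di for vi, di in zip(v, d)]     # perpendicular residue
    return sum(x * x for x in perp) ** 0.5

def on_both_axes(p, axis_a, axis_b, tol=1e-3):
    return (point_to_axis_distance(p, *axis_a) <= tol and
            point_to_axis_distance(p, *axis_b) <= tol)

# Axis A along z through the origin; axis B tilted 45 degrees.
# Both pass through the intersection point I = (0, 0, 100).
axis_a = ([0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
axis_b = ([100.0, 0.0, 0.0], [-0.7071067811865476, 0.0, 0.7071067811865476])
```

With these sample axes, `on_both_axes` accepts only points at the intersection, which is the termination condition of the back-and-forth movement in Step S73.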
[0094] Referring now to
[0095] Step S81: Given i=1, define an i-th characteristic point WPi of a workpiece W.
[0096] Step S82: A controller 13 controls a robot R to move the characteristic point WPi into a common visual field of a physical image sensor 11 and a virtual image sensor 12.
[0097] Step S83: The controller 13 moves the characteristic point WPi repeatedly back and forth between a first image central axis A and a second image central axis B, until the characteristic point WPi hits an intersection point I of the first image central axis A and the second image central axis B.
[0098] Step S84: The controller 13 determines whether or not a distance between the characteristic point WPi and the intersection point I of the first image central axis A and the second image central axis B is smaller than a threshold value. If positive, go to perform Step S85. If negative, go to perform Step S841.
[0099] Step S841: The controller 13 applies a random-number generator to generate Euler-angle increments (ΔR.sub.x,ΔR.sub.y,ΔR.sub.z) to correct corresponding Euler angles of the robot R, and the method goes back to perform Step S83.
[0100] Step S85: The controller 13 records a first set of joint values of the characteristic point WPi, i.e., a first calibration point.
[0101] Step S86: The controller 13 determines whether or not a quantity of the first set of joint values is larger than or equal to 4. If positive, go to perform Step S87. If negative, go to perform Step S841. In this disclosure, the aforesaid quantity can be any other integer.
[0102] Step S87: The controller 13 determines whether or not a quantity of the characteristic points WPi is larger than or equal to a preset number. If positive, go to perform Step S88. If negative, go to perform Step S871.
[0103] Step S871: Set i=i+1, and proceed to the next characteristic point WPi.
[0104] Step S88: Derive coordinates of a virtual tool center point TCP and the workpiece W with respect to the robot R by a tool-center calibration method which is disclosed in the prior application.
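The control flow of Steps S81 through S88 can be summarized in a toy simulation. Everything below is an illustrative stand-in: `measure_distance` replaces the vision measurement, the thresholds and counts are sample values, and the rule of keeping only Euler-angle increments that move the point closer is a simplification added here so the toy loop terminates (the disclosure's Step S841 applies the random increments directly).

```python
import random

random.seed(0)  # reproducible toy run

THRESHOLD = 0.5          # sample threshold value for Step S84
POSES_PER_POINT = 4      # Step S86: joint-value sets per characteristic point
NUM_POINTS = 3           # sample preset number for Step S87

def measure_distance(euler):
    """Toy stand-in for the vision measurement: distance between the
    characteristic point and the intersection point I at a given pose."""
    rx, ry, rz = euler
    return (rx * rx + ry * ry + rz * rz) ** 0.5

def servo_to_intersection(euler):
    """Steps S83/S84/S841: apply random Euler-angle increments until the
    measured distance falls below the threshold."""
    d = measure_distance(euler)
    while d >= THRESHOLD:
        cand = [e + random.uniform(-0.2, 0.2) for e in euler]  # S841
        cd = measure_distance(cand)
        if cd < d:               # added simplification: keep improving steps
            euler, d = cand, cd
    return euler

calibration_points = []
for i in range(NUM_POINTS):              # S81/S871: i-th characteristic point
    for _ in range(POSES_PER_POINT):     # S86: several distinct postures
        start = [random.uniform(-2.0, 2.0) for _ in range(3)]
        pose = servo_to_intersection(start)
        calibration_points.append((i, pose))  # S85: record the joint values
# S88: calibration_points would now feed the tool-center calibration method.
```

The simulation only mirrors the branching of the flowchart; on a real system the distance comes from the two image sensors and the recorded values are the joint coordinates J1-J6.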
[0105] As described above, in the automated calibration system and method for a workpiece coordinate frame of a robot provided by this disclosure, the calibration process mainly includes four portions: (1) mounting a physical image sensor on a flange at an end of the robot; (2) obtaining a transformation relationship between the coordinate system of the flange of the robot and that of the physical image sensor, so as to transform motion information captured as an image into information convenient for the robot; (3) applying a multi-vision means to construct the virtual image sensor and the position of the virtual tool center point, so as to generate a 2.5D machine vision; and (4) applying a visual servo means to control the robot to overlap a designated point on the workpiece and an intersection point of two image axes. In addition, the aforesaid overlapping operation can be achieved by either of the following two methods: (41) controlling the robot, in four different postures for example, to overlap the virtual tool center point and the origin of the workpiece coordinate system, then overlapping the virtual tool center point of any posture of the robot with each of an arbitrary point on the X axis of the workpiece coordinate system and another arbitrary point on the X-Y plane, and recording the coordinates; or (42) controlling the robot, in four different postures for example, to overlap the virtual tool center point with each of four known coordinate points on the workpiece in the workpiece coordinate system, and recording the coordinates.
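Method (41) can be illustrated with a short geometric sketch: given a recorded origin O, a recorded point Px on the X axis, and a recorded point Pxy in the X-Y plane, the workpiece frame axes follow from normalization and cross products. The coordinates below are illustrative sample values, not data from the disclosure.

```python
# Illustrative sketch of method (41): recover the workpiece coordinate
# frame from three recorded positions (origin, X-axis point, X-Y-plane
# point). Sample coordinates are invented for demonstration.

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def norm(v):
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def workpiece_frame(o, px, pxy):
    x = norm(sub(px, o))              # X axis: from O toward Px
    z = norm(cross(x, sub(pxy, o)))   # Z axis: normal to the X-Y plane
    y = cross(z, x)                   # Y axis completes a right-handed frame
    return x, y, z

x, y, z = workpiece_frame([10.0, 0.0, 5.0], [20.0, 0.0, 5.0], [10.0, 7.0, 5.0])
```

Because Pxy only needs to lie somewhere in the X-Y plane, any recorded point off the X axis suffices, which is why the disclosure can use arbitrary points rather than precisely known ones in method (41).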
[0106] According to this disclosure, only one physical image sensor needs to be furnished on the flange at the end of the robot, and, through calibrating the characteristic points, the calibration process avoids direct contact between the workpiece and the robot and can complete the position correction of the workpiece in a single calibration process. Thereupon, the calibration precision can be effectively enhanced.
[0107] With respect to the above description then, it is to be realized that the optimum dimensional relationships for the parts of the disclosure, to include variations in size, materials, shape, form, function and manner of operation, assembly and use, are deemed readily apparent and obvious to one skilled in the art, and all equivalent relationships to those illustrated in the drawings and described in the specification are intended to be encompassed by the present disclosure.