AUTOMATED CALIBRATION SYSTEM AND METHOD FOR THE RELATION BETWEEN A PROFILE-SCANNER COORDINATE FRAME AND A ROBOT-ARM COORDINATE FRAME

20230008909 · 2023-01-12

    Abstract

    An automated calibration system for the relation between a robot-arm coordinate frame and a profile-scanner coordinate frame includes a ball probe, a distance sensor module, a profile scanner and a control module. The ball probe is attached on a flange of a robot arm. The distance sensor module includes at least three distance sensors having respective axes sharing a common sensing plane and intersecting at a common point. The profile scanner is used for detecting a 2D cross-sectional profile of the ball probe. The control module is electrically connected with the distance sensor module, the profile scanner and the robot arm so as to control the robot arm to move the ball probe to obtain calibration information. In addition, an automated calibration method for the relation between the profile-scanner coordinate frame and the robot-arm coordinate frame is also provided.

    Claims

    1. An automated calibration method for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame, comprising the steps of: (a) disposing a ball probe having a predetermined radius on a flange of a robot arm, and arranging a distance sensor module and a profile scanner, the distance sensor module including at least three distance sensors, three axes corresponding to the three distance sensors sharing a common sensing plane and intersecting at a point of intersection; wherein the ball probe, the robot arm, the flange, the distance sensor module and the profile scanner have a ball-probe coordinate frame, a robot-arm coordinate frame, a flange coordinate frame, a distance-sensor-module coordinate frame and a profile-scanner coordinate frame, respectively; (b) controlling the robot arm to move the ball probe to undergo a triaxial movement along the robot-arm coordinate frame, and thus to establish a transformation relationship between the robot-arm coordinate frame and the distance-sensor-module coordinate frame; (c) utilizing distance information detected by the distance sensor module to control the robot arm at one of different postures to move a spherical center of the ball probe to the point of intersection, so as to coincide an origin of the distance-sensor-module coordinate frame with the spherical center of the ball probe, and further to record all axial joint angles of the robot arm into calibration point information of a tool center point (TCP); (d) calculating a coordinate of the spherical center of the ball probe with respect to the flange coordinate frame as an instant coordinate of the TCP; (e) controlling repeatedly the robot arm to experience all the different postures so as to allow the distance sensor module to capture respective information of the ball probe and the profile scanner to obtain respective cross-sectional profile information of the ball probe, and to apply a circle fitting method and the Pythagorean theorem to
derive respective center coordinates into the calibration point information with respect to the profile-scanner coordinate frame; and (f) deriving the relation between the profile-scanner coordinate frame and the robot-arm coordinate frame, and inputting all the calculated coordinates into a control module for completing calibration.

    2. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein the Step (b) further includes the steps of: (a1) controlling the robot arm to move the ball probe to undergo the triaxial movement along the robot-arm coordinate frame, so as to have the three distance sensors simultaneously read corresponding distance information of the ball probe; wherein a sensing plane formed by the distance sensor module at a movement onset position is not coplanar with a cross-sectional circle containing the largest radius of the ball probe, and corresponding coordinates with respect to the distance-sensor-module coordinate frame are recorded; (b1) utilizing the distance information detected by the three distance sensors to calculate coordinates of at least three points of the ball probe on the sensing plane with respect to the distance-sensor-module coordinate frame, and further to calculate a center of the cross-sectional circle as an initial point; (c1) moving the robot arm, from the initial point, along three axial directions (X, Y, Z) of the robot-arm coordinate frame by an arbitrary length, so as to calculate a vector corresponding to the three axial directions of the robot-arm coordinate frame with respect to the distance-sensor-module coordinate frame; and (d1) utilizing the vector derived in the Step (c1) to calculate the transformation relationship between the robot-arm coordinate frame and the distance-sensor-module coordinate frame.

    3. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 2, wherein the Step (b1) further includes the steps of: (a11) utilizing the three distance sensors to calculate three circular coordinates A.sub.0, B.sub.0, C.sub.0; (b11) connecting the circular coordinate A.sub.0 and the circular coordinate B.sub.0 to form a line and the circular coordinate B.sub.0 and the circular coordinate C.sub.0 to form another line, calculating two perpendicular bisectors respectively for the line and the another line, and intersecting the two perpendicular bisectors to derive a coordinate of the center of the cross-sectional circle with respect to the distance-sensor-module coordinate frame; (c11) deriving a radius of the cross-sectional circle from the coordinate of the center obtained in the Step (b11); and (d11) according to the Pythagorean theorem, calculating a height of the spherical center of the ball probe with respect to the cross-sectional circle.

    4. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 3, wherein, in the Step (d11), the height <0 if the spherical center is located under the cross-sectional circle, and the height >0 if the spherical center is located above the cross-sectional circle.

    5. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein the Step (c) further includes the steps of: (a2) utilizing the distance information detected by the distance sensor module to obtain at least three circular coordinates on the cross-sectional circle and further to calculate a coordinate of a center of the cross-sectional circle, so as to control the center of the cross-sectional circle to coincide with a Z-axial direction of the distance-sensor-module coordinate frame; (b2) according to the transformation relationship between the robot-arm coordinate frame and the distance-sensor-module coordinate frame, controlling the robot arm to move, and having the distance sensor module to capture the at least three circular coordinates on the cross-sectional circle and to calculate a radius of the cross-sectional circle; if the radius of the cross-sectional circle is equal to the radius of the ball probe, implying that the sensing plane coincides with the spherical center of the ball probe, and recording the coordinate of the center into the calibration point information of the TCP; if a number of calibration points in the calibration point information is at least 4, then the obtaining of the calibration points is finished; if the number of calibration points in the calibration point information is fewer than 4, then going to perform Step (c2); (c2) utilizing a random number generator to generate Euler angle increments; and (d2) utilizing the Euler angle increments to calculate Euler angles of the robot arm, and then moving the robot arm to a position corresponding to the Euler angles; if the position exceeds a movement limit, then going back to the Steps (c2) and (d2) for generating another set of Euler angle increments; otherwise, going back to the Step (a2) for generating another calibration point information.

    6. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein the Step (d) utilizes information of the robot arm in link parameters, joint coordinates and the TCP with respect to the flange coordinate frame to obtain spatial coordinates of at least four calibration points, and thus the spherical center of the ball probe with respect to the flange coordinate frame is calculated to be the coordinate of the TCP.

    7. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein the Step (e) further includes the steps of: (a3) controlling the robot arm to move the ball probe into the distance sensor module so as to have the three distance sensors and the profile scanner able to simultaneously read information with respect to the ball probe, the sensing plane formed by the distance sensor module and the cross-sectional circle of the ball probe having the largest radius being coplanar or non-coplanar; (b3) recording a coordinate of the spherical center of the ball probe with respect to the robot-arm coordinate frame; (c3) utilizing the profile scanner to capture the cross-sectional profile information of the ball probe and to obtain profile-point set information with respect to the profile-scanner coordinate frame, and applying a circle equation and a least-squares error method to perform fitting so as to derive a coordinate of a center of a cross-sectional circle and a radius of the cross-sectional circle; (d3) applying the Pythagorean theorem to calculate a distance between the spherical center and the cross-sectional circle; and (e3) recording a coordinate of the spherical center of the ball probe with respect to the profile-scanner coordinate frame into the calibration point information.

    8. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 7, wherein, in the Step (d3), the spherical center is located above the cross-sectional circle of the profile scanner if the radius of the cross-sectional circle obtained by the three distance sensors is larger than the radius of the cross-sectional circle of the profile scanner, and the spherical center is located under the cross-sectional circle of the profile scanner if the radius of the cross-sectional circle obtained by the three distance sensors is smaller than the radius of the cross-sectional circle of the profile scanner.

    9. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 7, wherein, in the Step (e3), if the calibration point information includes at least four calibration points, then obtaining of the calibration point information is finished; otherwise, a random number generator is applied to generate a movement increment so as to move the robot arm accordingly to another position of the different postures; wherein, if the another position exceeds a movement limit or a detection range, another movement increment is generated; and, otherwise, going to the Step (b3) to form another calibration point information.

    10. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein, after at least four calibration point coordinates are obtained with respect to the profile-scanner coordinate frame and the robot-arm coordinate frame in the Step (f), a coordinate relation and a transformation matrix are utilized to calculate the transformation relationship between the robot-arm coordinate frame and the profile-scanner coordinate frame.

    11. The automated calibration method for the relation between the robot-arm coordinate frame and the profile-scanner coordinate frame of claim 1, wherein the robot arm, the distance sensor module and the profile scanner are all electrically connected with the control module, such that the control module is able to control the robot arm, the distance sensor module and the profile scanner to move and perform calculations in the Step (b) through the Step (f).

    12. An automated calibration system for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame, comprising: a ball probe, attached on a flange of a robot arm; a distance sensor module, including at least three distance sensors, three axes corresponding to the three distance sensors being coplanar with a sensing plane of the at least three distance sensors, the three axes being intersected at a point of intersection; a profile scanner, used for detecting a 2D cross-sectional profile of the ball probe; and a control module, electrically connected with the distance sensor module, the profile scanner and the robot arm, configured for controlling the robot arm to move the ball probe for obtaining calibration point information.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0022] The present disclosure will become more fully understood from the detailed description given herein below and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present disclosure and wherein:

    [0023] FIG. 1 is a schematic front view of an embodiment of the automated calibration system for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame in accordance with this disclosure;

    [0024] FIG. 2 is a schematic top view of the distance sensor module and the profile scanner of FIG. 1;

    [0025] FIG. 3 demonstrates schematically a transformation relationship between the robot-arm coordinate frame and the distance-sensor coordinate frame of FIG. 1;

    [0026] FIG. 4A is a schematic front view of an operation of FIG. 1;

    [0027] FIG. 4B is a schematic top view of FIG. 4A;

    [0028] FIG. 5, FIG. 6, FIG. 6A and FIG. 6B illustrate schematically how the embodiment of FIG. 1 applies detection information of the distance sensor module to calculate a coordinate of a center;

    [0029] FIG. 7 illustrates schematically how the embodiment of FIG. 1 calculates a real coordinate of the tool center point;

    [0030] FIG. 8 illustrates schematically how the embodiment of FIG. 1 applies a circle equation and the least-squares error method to fit a circle with the least radial error so as further to derive a center coordinate and a circular radius; and

    [0031] FIG. 9 is a schematic flowchart of an embodiment of the automated calibration method for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame in accordance with this disclosure.

    DETAILED DESCRIPTION

    [0032] In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.

    [0033] Referring now to FIG. 1 and FIG. 2, an automated calibration system 100 for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame provided in this disclosure includes a ball probe 10, a distance sensor module 20, a profile scanner 30 and a control module 40.

    [0034] The ball probe 10 is attached on a flange 202 of a robot arm 200, and can be made of, but not limited to, stainless steel or any metallic material of similar rigidity.

    [0035] The distance sensor module 20 includes three distance sensors 21˜23.

    [0036] The profile scanner 30, configured to detect a 2D cross-sectional profile of the ball probe 10, can be a 2D profile scanner or a 3D profile scanner.

    [0037] FIG. 1 illustrates schematically connections among the robot arm 200, the distance sensor module 20, the profile scanner 30 and the control module 40. The control module 40, omitted in FIG. 2, is configured to control motions of the robot arm 200, the distance sensor module 20 and the profile scanner 30, and to carry out calculations and analysis during a calibration process. Generally, the control module 40 is, but not limited to, a computer.

    [0038] In practical application, the robot arm 200 drives the tooling mounted on the flange 202 to complete preset tasks. In this disclosure, the distance sensor module 20 and the ball probe 10 having a predetermined radius and mounted on the flange 202 of the robot arm 200 are utilized to perform calibration of the positioning relationship between the robot arm 200 and the profile scanner 30.

    [0039] Referring also to FIG. 1 and FIG. 2, according to this disclosure, the calibration of the tool center point (TCP) requires the distance information detected by the distance sensors 21˜23, the Pythagorean theorem and circle equations. With the calibrated TCP, a circle fitting equation is applied to derive the relation between the coordinate frames of the profile scanner 30 and the robot arm 200.

    [0040] In the calibration, the ball probe 10 has a radius R.sub.s, the robot arm 200 has a robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R, the flange 202 has a flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f, the profile scanner 30 has a profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L, the ball probe 10 has a ball-probe coordinate frame X.sub.t-Y.sub.t-Z.sub.t, and the distance sensor module 20 has a distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.

    [0041] The three distance sensors 21˜23 have three axes I.sub.1, I.sub.2, I.sub.3, respectively, in which I.sub.1, I.sub.2 and I.sub.3 shall share a common sensing plane H.sub.20 and intersect at a common point of intersection O.sub.20. In addition, the angular relationship among these three axes I.sub.1, I.sub.2, I.sub.3 is given. Angles θ.sub.1, θ.sub.2, θ.sub.3 for the three axes I.sub.1, I.sub.2, I.sub.3 can be arranged in a 120-degree equiangular distribution, or in an unequal angular distribution. The point of intersection O.sub.20 is the origin of the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M, as shown in FIG. 2.

    [0042] Referring now to FIG. 3 through FIG. 6, by having the spherical center M.sub.0 of the ball probe 10 having the predetermined radius R.sub.s on the robot arm 200 slide along the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R, a transformation relationship between the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M can be derived, as shown in FIG. 3. The corresponding method can include Steps (a1)˜(f1) as follows.

    [0043] Step (a1): Control the robot arm 200 to move so as to have the ball probe 10 mounted on the flange 202 of the robot arm 200 move along the three axial directions of the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R into the distance sensor module 20, such that the three distance sensors 21˜23 can generate simultaneously corresponding distance information of the ball probe 10. In particular, at the movement onset position, the sensing plane H.sub.20 formed by the distance sensor module 20 and the cross-sectional plane H.sub.10 containing the largest radius R.sub.s of the ball probe 10 are not coplanar. As shown in FIG. 4A and FIG. 4B, the coordinate of the initial point O with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M is then recorded. It is noted that the control module 40 is omitted in FIG. 4A and FIG. 4B.

    [0044] Step (b1): Utilize the detected distance information of the distance sensors 21˜23 to derive three coordinates A.sub.0, B.sub.0, C.sub.0 of the ball probe 10 on the sensing plane H.sub.20 with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M, and thereby to calculate the center O.sub.s of the cross-sectional circle containing the three coordinates A.sub.0, B.sub.0, C.sub.0 as an initial point, as shown in FIG. 5 and FIG. 6. The corresponding computation method can include Steps (a11)˜(d11) as follows.

    [0045] Step (a11): Utilize the three distance sensors 21˜23 to obtain

    [00001] A.sub.0=[l.sub.1 cos t.sub.1 l.sub.1 sin t.sub.1 0].sup.T, B.sub.0=[l.sub.2 cos t.sub.2 l.sub.2 sin t.sub.2 0].sup.T, C.sub.0=[l.sub.3 cos t.sub.3 l.sub.3 sin t.sub.3 0].sup.T,

    in which l.sub.i is the distance, measured from the origin O.sub.20 along the corresponding axis, to the point where that axis intersects the ball probe 10, and t.sub.i is the angle of each of the three axes I.sub.1, I.sub.2, I.sub.3 with respect to the X.sub.M axis of the distance-sensor-module coordinate frame.
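The reading-to-coordinate conversion of Step (a11) can be sketched as follows; the function name and the sample 120-degree equiangular layout are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch of Step (a11): convert each sensor reading l_i (the
# distance from the origin O_20 along axis I_i to the ball surface) and the
# known axis angle t_i into a point on the sensing plane (z = 0).
import math

def sensor_points(distances, angles_deg):
    """Return the three surface points A0, B0, C0 on the sensing plane."""
    pts = []
    for l, t in zip(distances, angles_deg):
        t_rad = math.radians(t)
        pts.append((l * math.cos(t_rad), l * math.sin(t_rad), 0.0))
    return pts

# Equal readings in a 120-degree equiangular layout give symmetric points.
A0, B0, C0 = sensor_points([10.0, 10.0, 10.0], [0.0, 120.0, 240.0])
```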

    [0046] Step (b11): Connect the coordinate A.sub.0 and the coordinate B.sub.0 to form a line L.sub.1 and the coordinate B.sub.0 and the coordinate C.sub.0 to form another line L.sub.2, calculate two perpendicular bisectors V.sub.1, V.sub.2 to the line L.sub.1 and the another line L.sub.2, respectively, and intersect the two perpendicular bisectors V.sub.1, V.sub.2 to derive a coordinate F.sub.0 of the center O.sub.s of the cross-sectional circle with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.

    [0047] Step (c11): Derive a radius R.sub.0 of the cross-sectional circle C.sub.S from the coordinate F.sub.0 of the center O.sub.s.

    [0048] Step (d11): According to the Pythagorean theorem, calculate a height d.sub.0=±√(R.sub.s.sup.2−R.sub.0.sup.2) of the spherical center M.sub.0 of the ball probe 10 with respect to the cross-sectional circle C.sub.S. Referring to FIG. 6, if the spherical center M.sub.0 is located under the cross-sectional circle C.sub.S, then d.sub.0<0. Otherwise, d.sub.0>0.

    [0049] The sign of the height can be determined from an initial state of the spherical center M.sub.0. If the spherical center M.sub.0 is initially located under the cross-sectional circle C.sub.S, and the radius R.sub.0 of the cross-sectional circle C.sub.S keeps increasing or keeps decreasing during the movement, then the spherical center M.sub.0 remains located under the cross-sectional circle C.sub.S. However, during the movement, if the radius R.sub.0 of the cross-sectional circle C.sub.S decreases after an increase, then it implies that the spherical center M.sub.0 has moved to be located above the cross-sectional circle C.sub.S.
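Steps (b11)˜(d11) can be sketched in closed form: the perpendicular-bisector intersection reduces to a 2×2 linear solve. The function names are hypothetical, and the above/below decision is passed in as an input because, as explained above, it depends on the initial state.

```python
# Minimal sketch of Steps (b11)-(d11): circumcenter of the three planar
# points (the intersection of the two perpendicular bisectors), the
# cross-section radius R0, and the Pythagorean height d0.
import math

def circumcenter_2d(a, b, c):
    """Center of the circle through three non-collinear 2D points."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy)

def sphere_center_height(R_s, R_0, below_plane):
    """d0 = +/- sqrt(Rs^2 - R0^2); negative when the center is below."""
    d = math.sqrt(max(R_s**2 - R_0**2, 0.0))
    return -d if below_plane else d

# Three points on a circle of radius 5 centered at (1, 2):
F0 = circumcenter_2d((6.0, 2.0), (1.0, 7.0), (-4.0, 2.0))
R0 = math.hypot(6.0 - F0[0], 2.0 - F0[1])
d0 = sphere_center_height(R_s=13.0, R_0=R0, below_plane=True)  # -> -12.0
```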

    [0050] After the Step (b1) is performed, then keep performing Steps (c1)˜(f1). Step (c1): Move the robot arm 200, from the initial point O, along an axial direction X.sub.R of the robot-arm coordinate frame by an arbitrary length. Then, according to the aforesaid Steps (a11)˜(d11), calculate in order a coordinate F.sub.x, a radius R.sub.x, a height d.sub.x, and a vector

    [00002] U.sub.1=[F.sub.x−F.sub.0 d.sub.x−d.sub.0].sup.T

    of the robot-arm coordinate frame X.sub.R with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.

    [0051] Step (d1): Move the robot arm 200, from the initial point O, along another axial direction Y.sub.R of the robot-arm coordinate frame by an arbitrary length. Then, according to the aforesaid Steps (a11)˜(d11), calculate in order a coordinate F.sub.y, a radius R.sub.y, a height d.sub.y, and a vector

    [00003] V.sub.1=[F.sub.y−F.sub.0 d.sub.y−d.sub.0].sup.T

    of the robot-arm coordinate frame Y.sub.R with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.

    [0052] Step (e1): Move the robot arm 200, from the initial point O, along a third axial direction Z.sub.R of the robot-arm coordinate frame by an arbitrary length. Then, according to the aforesaid Steps (a11)˜(d11), calculate in order a coordinate F.sub.z, a radius R.sub.z, a height d.sub.z, and a vector

    [00004] W.sub.1=[F.sub.z−F.sub.0 d.sub.z−d.sub.0].sup.T

    of the robot-arm coordinate frame Z.sub.R with respect to the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.

    [0053] Step (f1): Obtain the transformation relationship

    [00005] S.sub.R=[U.sub.1/∥U.sub.1∥ V.sub.1/∥V.sub.1∥ W.sub.1/∥W.sub.1∥].sup.−1S.sub.M

    between the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M, in which S.sub.R is the movement along the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and S.sub.M is the movement along the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M.
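The Step (f1) relation can be sketched compactly: normalize the three measured axis vectors, assemble them as the columns of a 3×3 matrix, and invert it. NumPy and the function name are assumptions for illustration.

```python
# Sketch of Step (f1): map a displacement S_M measured in the sensor-module
# frame into the commanded robot-frame displacement S_R via the inverse of
# the matrix of normalized axis vectors [U1^ V1^ W1^].
import numpy as np

def robot_from_module(U1, V1, W1, S_M):
    B = np.column_stack([U1 / np.linalg.norm(U1),
                         V1 / np.linalg.norm(V1),
                         W1 / np.linalg.norm(W1)])
    return np.linalg.inv(B) @ S_M  # S_R = [U1^ V1^ W1^]^-1 S_M

# When the two frames happen to be aligned, the mapping is the identity:
S_R = robot_from_module(np.array([2.0, 0.0, 0.0]),
                        np.array([0.0, 3.0, 0.0]),
                        np.array([0.0, 0.0, 1.0]),
                        np.array([1.0, 2.0, 3.0]))
```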

    [0054] Referring now to FIG. 1, FIG. 2 and FIG. 6A, after the transformation relationship between the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M is obtained, control the spherical center M.sub.0 of the ball probe 10, at each of the different postures, to coincide with the origin O.sub.20 of the distance-sensor-module coordinate frame X.sub.M-Y.sub.M-Z.sub.M, so that the calibration point information of the TCP (i.e., the position information of the spherical center M.sub.0 of the ball probe 10 with the known radius R.sub.S on the robot arm 200 with respect to the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f) can be calculated. The method can include Steps (a2)˜(d2) as follows.

    [0055] Step (a2): Utilize the distance information detected by the distance sensor module 20 to obtain at least three circular coordinates A.sub.0, B.sub.0, C.sub.0 on the cross-sectional circle C.sub.S1 and further to calculate a coordinate C′ of a center of the cross-sectional circle C.sub.S1, and then utilize the movement

    [00006] S.sub.R=−[U.sub.1/∥U.sub.1∥ V.sub.1/∥V.sub.1∥ W.sub.1/∥W.sub.1∥].sup.−1C′

    to control the center O.sub.s of the cross-sectional circle C.sub.S1 to coincide with the Z-axial direction Z.sub.M of the distance-sensor-module coordinate frame.

    [0056] Step (b2): Control the robot arm 200 to move along the direction

    [00007] S.sub.R=−[U.sub.1/∥U.sub.1∥ V.sub.1/∥V.sub.1∥ W.sub.1/∥W.sub.1∥].sup.−1Z.sub.M,

    and utilize the distance sensor module 20 to capture, in a real-time manner, three circular coordinates A.sub.01, B.sub.01, C.sub.01 on the cross-sectional circle C.sub.S1 for calculating a radius R.sub.01 of the cross-sectional circle C.sub.S1. If R.sub.01 equals the radius R.sub.S of the ball probe 10, it implies that the sensing plane H.sub.20 coincides with the spherical center M.sub.0. Thus, record this point into the calibration point information of the TCP. If the number of the recorded calibration points in the calibration point information of the TCP is at least 4, then the obtaining of the calibration points is complete. Otherwise, go to perform Step (c2).

    [0057] Step (c2): Utilize a random number generator to generate Euler angle increments ΔR.sub.x, ΔR.sub.y, ΔR.sub.z.

    [0058] Step (d2): Update the Euler angles of the robot arm 200 to R.sub.x=R.sub.x+ΔR.sub.x, R.sub.y=R.sub.y+ΔR.sub.y, R.sub.z=R.sub.z+ΔR.sub.z, and then move the robot arm 200 to this new posture. If this set of Euler angles exceeds a movement limit, then go back to the Steps (c2) and (d2) to generate another set of Euler angles. Otherwise, go back to the Step (a2) to generate another calibration point.
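The posture-perturbation loop of Steps (c2)˜(d2) can be sketched as below. The increment range and the limit predicate are placeholder assumptions; a real robot would check its actual joint and workspace limits.

```python
# Illustrative sketch of Steps (c2)-(d2): add random Euler-angle increments
# and reject any candidate posture outside a movement limit.
import random

def next_posture(euler, limit_deg=45.0, max_tries=100):
    """Return a new (Rx, Ry, Rz) posture within +/-limit_deg of zero."""
    for _ in range(max_tries):
        increments = [random.uniform(-10.0, 10.0) for _ in range(3)]
        candidate = [a + d for a, d in zip(euler, increments)]
        if all(abs(a) <= limit_deg for a in candidate):  # placeholder limit
            return candidate
    raise RuntimeError("no reachable posture found")

posture = next_posture([0.0, 0.0, 0.0])
```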

    [0059] Referring now to FIG. 1, FIG. 2 and FIG. 7, after sufficient calibration point information of the TCP has been gathered, then go to perform the TCP calibration to calculate the position of the spherical center M.sub.0 of the ball probe 10 having the known radius R.sub.S on the robot arm 200 with respect to the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f; i.e., the coordinate of the TCP. A spatial coordinate of the calibration point P (equivalent to the spherical center M.sub.0 of the ball probe 10) can be obtained by the information of link parameters, joint coordinates and the TCP of the robot arm 200 with respect to the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f:


    T.sub.1iT.sub.2=P

    [0060] in which

    [00008] T.sub.1i=[R.sub.1i L.sub.1i; 0 0 0 1]

    is the 4×4 homogeneous transformation matrix for the i-th calibration point to transform coordinates from the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R, R.sub.1i is the 3×3 sub-transformation matrix at the upper left corner of the homogeneous transformation matrix, and L.sub.1i is the vector formed by the top three entries of the fourth column of the homogeneous transformation matrix. By plugging in the link parameters and the joint coordinates, this 4×4 homogeneous transformation matrix becomes a constant matrix.

    [0061] T.sub.2=[T.sub.x T.sub.y T.sub.z 1].sup.T is the coordinate of the TCP with respect to the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f, and P=[P.sub.x P.sub.y P.sub.z 1].sup.T is the spatial coordinate of the calibration point with respect to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R. After four calibration points P.sub.1, P.sub.2, P.sub.3, P.sub.4 have been collected, then:

    [00009] T.sub.2=(A.sup.TA).sup.−1A.sup.T[P.sub.1.sup.T P.sub.2.sup.T P.sub.3.sup.T P.sub.4.sup.T].sup.T, A=[R.sub.11 L.sub.11; R.sub.12 L.sub.12; R.sub.13 L.sub.13; R.sub.14 L.sub.14],

    in which A stacks the 3×4 upper blocks [R.sub.1i L.sub.1i] of the four homogeneous transformation matrices and the right-hand vector stacks the first three components of the four calibration points, can be used to calculate the coordinate of the TCP, and so the TCP calibration is complete.
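The TCP solve can be sketched as a stacked least-squares problem. The synthetic rotations and translations below are assumptions standing in for the robot's forward kinematics; `solve_tcp` is a hypothetical name.

```python
# Sketch of the TCP solve: stack the per-point relations
# [R_1i | L_1i] T_2 = P_i and solve for T_2 = [Tx Ty Tz 1]^T in the
# least-squares sense.
import numpy as np

def solve_tcp(rotations, translations, points):
    """rotations: 3x3 R_1i, translations: 3-vectors L_1i, points: 3D P_i."""
    A = np.vstack([np.hstack([R, L.reshape(3, 1)])
                   for R, L in zip(rotations, translations)])
    b = np.concatenate(points)
    T2, *_ = np.linalg.lstsq(A, b, rcond=None)
    return T2

# Synthetic check: a known tool offset t reproduced from four postures.
rng = np.random.default_rng(0)
t = np.array([0.1, -0.2, 0.3, 1.0])            # [Tx, Ty, Tz, 1]
Rs = [np.linalg.qr(rng.normal(size=(3, 3)))[0] for _ in range(4)]
Ls = [rng.normal(size=3) for _ in range(4)]
Ps = [R @ t[:3] + L for R, L in zip(Rs, Ls)]   # P_i = R_1i t + L_1i
T2 = solve_tcp(Rs, Ls, Ps)
```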

    [0062] Referring now to FIG. 1, FIG. 2, FIG. 4A, FIG. 4B, FIG. 6 and FIG. 8, after the TCP coordinate is obtained, the ball probe 10 having the predetermined radius R.sub.S on the robot arm 200 is moved to a position whose profile can be captured with respect to the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L, so as to obtain simultaneously a coordinate B.sub.j of the spherical center M.sub.0 of the ball probe 10 with respect to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and another coordinate W.sub.j thereof with respect to the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L. The method includes Steps (a3)˜(e3) as follows.

    [0063] Step (a3): Define j=1, and move the robot arm 200 to dispose the ball probe 10 mounted on the flange 202 of the robot arm 200 into the distance sensor module 20, such that all the three distance sensors 21˜23 and the profile scanner 30 can read simultaneously information related to the ball probe 10. According to this disclosure, the sensing plane H.sub.20 formed by the distance sensor module 20 and the cross-sectional plane H.sub.10 of the ball probe 10 containing the largest radius R.sub.s thereof can be either coplanar or non-coplanar.

    [0064] Step (b3): Record the coordinate B.sub.j of the spherical center M.sub.0 of the ball probe 10 having the known radius R.sub.S with respect to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R, in which B.sub.j=T.sub.1jT.sub.2 and

    [00010] T.sub.1j=[R.sub.1j L.sub.1j; 0 0 0 1].

    T.sub.1j is the 4×4 homogeneous transformation matrix to transform coordinates from the flange coordinate frame X.sub.f-Y.sub.f-Z.sub.f to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R.

    [0065] Step (c3): Utilize the profile scanner 30 to capture the cross-sectional profile information of the ball probe 10, and obtain profile-point set information (x.sub.i, y.sub.i) with respect to the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L. By introducing the circle equation (x−x.sub.c).sup.2+(y−y.sub.c).sup.2=R.sub.c.sup.2 and the least-squares method to perform fitting for minimizing the radial error, the center coordinate (x.sub.cj, y.sub.cj) and the radius R.sub.cj of the cross-sectional circle can be calculated, as shown in FIG. 8.

    [00011]

$$
\begin{bmatrix} x_{cj} \\ y_{cj} \\ R_{cj} \end{bmatrix}
= \begin{bmatrix} x_{1} & y_{1} & 1 \\ x_{2} & y_{2} & 1 \\ \vdots & \vdots & \vdots \\ x_{i} & y_{i} & 1 \end{bmatrix}^{+}
\begin{bmatrix} -(x_{1}^{2}+y_{1}^{2}) \\ -(x_{2}^{2}+y_{2}^{2}) \\ \vdots \\ -(x_{i}^{2}+y_{i}^{2}) \end{bmatrix},
$$

    in which

    [00012]

$$
\begin{bmatrix} x_{1} & y_{1} & 1 \\ \vdots & \vdots & \vdots \\ x_{i} & y_{i} & 1 \end{bmatrix}^{+}
= \left( \begin{bmatrix} x_{1} & y_{1} & 1 \\ \vdots & \vdots & \vdots \\ x_{i} & y_{i} & 1 \end{bmatrix}^{T}
\begin{bmatrix} x_{1} & y_{1} & 1 \\ \vdots & \vdots & \vdots \\ x_{i} & y_{i} & 1 \end{bmatrix} \right)^{-1}
\begin{bmatrix} x_{1} & y_{1} & 1 \\ \vdots & \vdots & \vdots \\ x_{i} & y_{i} & 1 \end{bmatrix}^{T}
$$

    is the pseudo-inverse matrix.
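The circle fit of Step (c3) can be sketched as follows (a NumPy sketch, not the disclosed implementation; note that the pseudo-inverse solves for the linearized parameters a = −2x.sub.c, b = −2y.sub.c, c = x.sub.c.sup.2+y.sub.c.sup.2−R.sub.c.sup.2, from which the center and radius then follow):

```python
import numpy as np

def fit_circle(points):
    """Least-squares circle fit of profile points (x_i, y_i).

    Expanding (x - xc)^2 + (y - yc)^2 = Rc^2 gives the linear system
    [x y 1] @ [a, b, c]^T = -(x^2 + y^2), solved with the pseudo-inverse
    (A^T A)^-1 A^T as in equations [00011]-[00012]."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    a, b, c = np.linalg.pinv(A) @ -(x**2 + y**2)
    xc, yc = -a / 2.0, -b / 2.0
    return xc, yc, np.sqrt(xc**2 + yc**2 - c)
```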

    [0066] Step (d3): Utilize the Pythagorean theorem to calculate the distance z.sub.cj=±√(R.sub.S.sup.2−R.sub.cj.sup.2) between the spherical center M.sub.0 and the cross-sectional circle C.sub.S3 captured by the profile scanner 30. If the radius R.sub.02 of the cross-sectional circle C.sub.S2 captured by the distance sensors 21˜23 is larger than the radius R.sub.03 of the cross-sectional circle C.sub.S3 captured by the profile scanner 30 (i.e., the sensing plane H.sub.20 of the distance sensors 21˜23 is located above the sensing plane H.sub.30 of the profile scanner 30, as shown in FIG. 6B), then the spherical center M.sub.0 is located above the cross-sectional circle C.sub.S3, and thus z.sub.cj>0. Conversely, if the radius R.sub.02 is smaller than the radius R.sub.03 (i.e., the sensing plane H.sub.20 is located under the sensing plane H.sub.30), then the spherical center M.sub.0 is located under the cross-sectional circle C.sub.S3, and thus z.sub.cj<0.

    [0067] Step (e3): Record

    [00013] $W_{j} = \begin{bmatrix} x_{cj} \\ y_{cj} \\ z_{cj} \end{bmatrix}$

    as the coordinate of the spherical center M.sub.0 of the ball probe 10 with respect to the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L, and then define j=j+1. If j>4, the capture of the calibration point information is complete. Otherwise, apply a random number generator to generate a movement increment, and vary the position of the robot arm according to P.sub.x=P.sub.x+ΔP.sub.x, P.sub.y=P.sub.y+ΔP.sub.y, P.sub.z=P.sub.z+ΔP.sub.z, R.sub.x=R.sub.x+ΔR.sub.x, R.sub.y=R.sub.y+ΔR.sub.y, R.sub.z=R.sub.z+ΔR.sub.z. If this position exceeds a preset movement limit or the detection range, regenerate the movement increment. Thereafter, go to Step (b3) to generate the information of the next calibration point and amend the calibration point information.
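The regenerate-on-violation loop of Step (e3) can be sketched as follows (a hedged Python sketch; the axis names Px..Rz, the step size and the retry cap are illustrative assumptions, not values from the disclosure):

```python
import random

def next_posture(pose, limits, step=5.0, max_tries=1000):
    """Generate the next calibration posture per Step (e3): add random
    increments to (Px, Py, Pz, Rx, Ry, Rz) and regenerate whenever the
    candidate exceeds the preset movement limits or detection range."""
    for _ in range(max_tries):
        cand = {k: v + random.uniform(-step, step) for k, v in pose.items()}
        if all(limits[k][0] <= cand[k] <= limits[k][1] for k in cand):
            return cand
    raise RuntimeError("no admissible increment within the movement limits")
```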

    [0068] After the coordinates of all four arbitrary calibration points with respect to the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L are obtained, the subsequent calculation can proceed. In the following description, once the coordinates of at least four calibration points with respect to both the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L and the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R are obtained, the relationship among these coordinates can be utilized to calculate the transformation relationship between the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L.

    [0069] The transformation matrix for the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L with respect to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R can be formed as:

    [00014] $T_{3} = \begin{bmatrix} B_{1} & B_{2} & B_{3} & B_{4} \\ 1 & 1 & 1 & 1 \end{bmatrix} \begin{bmatrix} W_{1} & W_{2} & W_{3} & W_{4} \\ 1 & 1 & 1 & 1 \end{bmatrix}^{-1},$

    in which B.sub.j and W.sub.j are the coordinates of the j-th calibration point with respect to the robot-arm coordinate frame X.sub.R-Y.sub.R-Z.sub.R and the profile-scanner coordinate frame X.sub.L-Y.sub.L-Z.sub.L, respectively.
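With four calibration pairs, the transformation can be computed directly (a NumPy sketch under the assumption that B.sub.j and W.sub.j are stacked as homogeneous columns with a bottom row of ones, consistent with T.sub.3 mapping each W.sub.j to B.sub.j):

```python
import numpy as np

def solve_T3(B, W):
    """Solve T3 such that T3 @ [W_j; 1] = [B_j; 1] for j = 1..4.
    B and W are 3x4 arrays whose columns are the spherical-center
    coordinates in the robot-arm and profile-scanner frames; the four
    points must not be coplanar, or the matrix is singular."""
    ones = np.ones((1, 4))
    Bh = np.vstack([np.asarray(B, dtype=float), ones])
    Wh = np.vstack([np.asarray(W, dtype=float), ones])
    return Bh @ np.linalg.inv(Wh)
```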

    [0070] Then, input all the calculated coordinates into the control module 40, thus completing the calibration process.

    [0071] Referring now to FIG. 9, as described above, the calibration method 900 for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame in accordance with this disclosure can include the steps as follows.

    [0072] Step 902: Dispose a ball probe having a predetermined radius on a flange of the robot arm, and arrange a distance sensor module and a profile scanner; wherein the distance sensor module includes at least three distance sensors, and three axes corresponding to the three distance sensors share a common sensing plane and intersect at a point of intersection; wherein the ball probe, the robot arm, the flange, the distance sensor module and the profile scanner have a ball-probe coordinate frame, a robot-arm coordinate frame, a flange coordinate frame, a distance-sensor-module coordinate frame and a profile-scanner coordinate frame, respectively.

    [0073] Step 904: Control the robot arm to move the ball probe to undergo a triaxial movement along the robot-arm coordinate frame, and thus to establish a transformation relationship between the robot-arm coordinate frame and the distance-sensor-module coordinate frame.

    [0074] Step 906: Utilize distance information detected by the distance sensor module to control the robot arm at one of different postures to move a spherical center of the ball probe to the point of intersection so as to coincide an origin of the distance-sensor-module coordinate frame with the spherical center of the ball probe, and further to record all axial joint angles of the robot arm into calibration point information of a tool center point (TCP).

    [0075] Step 908: Calculate a coordinate of the spherical center of the ball probe with respect to the flange coordinate frame as an instant coordinate of the TCP.

    [0076] Step 910: Control repeatedly the robot arm to experience all the different postures so as to allow the profile scanner to capture the respective cross-sectional profile information of the ball probe, and then apply a circle fitting method and the Pythagorean theorem to derive the respective center coordinates into the calibration point information with respect to the profile-scanner coordinate frame.

    [0077] Step 912: Derive the relation between the profile-scanner coordinate frame and the robot-arm coordinate frame, and input all the calculated coordinates into the control module for completing the calibration process.

    [0078] In summary, in the automated calibration method and system for a relation between a robot-arm coordinate frame and a profile-scanner coordinate frame provided by this disclosure, after the ball probe having a predetermined radius is mounted onto the robot arm, a plurality of coplanar distance sensors are introduced to obtain the relationship between the ball probe and the flange of the robot arm by utilizing a circle-fitting equation and the Pythagorean theorem. The profile scanner is further introduced to obtain the ball-probe profile information at a plurality of different postures, such that the relation between the profile-scanner coordinate frame and the robot-arm coordinate frame can be calculated for performing the calibration process.

    [0079] According to this disclosure, no physical feature point is needed in the coordinate frame, no fixture is required to serve as a calibration medium, no CAD model is needed for assistance, and no additional 3D measurement device is required for calibrating the spatial position of the device. With this method and system, calibration of the coordinate frames can be completed in one operation procedure with enhanced calibration precision, and the aforesaid shortcomings in the art, namely that the coordinate frame must have physical feature points and that a fixture must be applied to improve the accuracy of calibration, can be substantially resolved.

    [0080] With respect to the above description then, it is to be realized that the optimum dimensional relationships for the parts of the disclosure, to include variations in size, materials, shape, form, function and manner of operation, assembly and use, are deemed readily apparent and obvious to one skilled in the art, and all equivalent relationships to those illustrated in the drawings and described in the specification are intended to be encompassed by the present disclosure.