CALIBRATION METHOD FOR TOOL CENTER POINT, TEACHING METHOD FOR ROBOTIC ARM AND ROBOTIC ARM SYSTEM USING THE SAME
20220063104 · 2022-03-03
Inventors
- CHENG-KAI HUANG (Taichung City, TW)
- Yi-Ying LIN (Taichung City, TW)
- Bing-Cheng HSU (Hemei Township, TW)
- Jan-Hao CHEN (Hemei Township, TW)
CPC classification
B25J9/1664
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/39054
PHYSICS
B25J13/089
PERFORMING OPERATIONS; TRANSPORTING
International classification
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
Abstract
Firstly, a robotic arm drives a projection point of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane. Then, a conversion relationship is established according to the relative movement. Then, a tool axis vector relative to an installation surface reference coordinate system of the robotic arm is obtained. Then, a calibration point information group obtaining step is performed, wherein the calibration point information group obtaining step includes: (a1) the robotic arm drives a tool center point to coincide with the reference point of the test plane and records a calibration point information group; (a2) the robotic arm drives the tool to change an angle of the tool; and (a3) steps (a1) and (a2) are repeated to obtain several calibration point information groups. Then, a tool center point coordinate relative to the installation surface reference coordinate system is obtained according to the calibration point information groups.
Claims
1. A calibration method for tool center point, comprising: performing a step of establishing a first conversion relationship between a robotic arm reference coordinate system and a camera reference coordinate system, comprising: driving, by a robotic arm, a projection point of a tool axis of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; and establishing the first conversion relationship according to the relative movement; obtaining a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; performing a calibration point information group obtaining step, comprising: (a1) driving, by the robotic arm, a tool center point to coincide with the reference point of the test plane, and recording a calibration point information group of the robotic arm; (a2) driving, by the robotic arm, the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtaining a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.
2. The calibration method according to claim 1, wherein the step of driving, by the robotic arm, the projection point of the tool axis of the tool projected on the test plane to perform the relative movement relative to the reference point of the test plane further comprises: driving, by the robotic arm, the tool to move by a space vector from the reference point along a plurality of axes of the robotic arm reference coordinate system; wherein the step of establishing the first conversion relationship further comprises: capturing, by a camera, an image of the projection point moving on the test plane; wherein the step of establishing the first conversion relationship further comprises: analyzing the image captured by the camera to obtain a value of a plane coordinate of each space vector; and establishing the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the space vectors.
3. The calibration method according to claim 1, wherein the step of driving, by the robotic arm, the projection point of the tool axis of the tool projected on the test plane to perform the relative movement relative to the reference point of the test plane further comprises: driving, by the robotic arm, the tool to move by a first space vector from the reference point along a first axis of the robotic arm reference coordinate system; driving, by the robotic arm, the tool to move by a second space vector from the reference point along a second axis of the robotic arm reference coordinate system; driving, by the robotic arm, the tool to move by a third space vector from the reference point along a third axis of the robotic arm reference coordinate system; wherein the step of establishing the first conversion relationship further comprises: capturing, by a camera, an image of the projection point moving on the test plane; wherein the step of establishing the first conversion relationship further comprises: analyzing the image captured by the camera to obtain a value of a first plane coordinate of the first space vector; analyzing the image captured by the camera to obtain a value of a second plane coordinate of the second space vector; analyzing the image captured by the camera to obtain a value of a third plane coordinate of the third space vector; and establishing the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the first space vector, the second space vector and the third space vector.
4. The calibration method according to claim 1, wherein the step of obtaining the tool axis vector comprises: performing offset correction to the tool axis relative to a first axis of the camera reference coordinate system, comprising: (b1) driving the tool to move along a third axis of the camera reference coordinate system; (b2) determining whether a position of the projection point on the test plane in the first axis of the camera reference coordinate system changes according to an image, captured by a camera, of the tool moving relative to the test plane; (b3) when the position of the projection point on the test plane in the first axis changes, driving the tool to rotate by an angle around a second axis of the camera reference coordinate system; and (b4) repeating steps (b1) to (b3) until a position change amount of the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero.
5. The calibration method according to claim 4, wherein the step of obtaining the tool axis vector comprises: performing offset correction to the tool axis relative to the second axis of the camera reference coordinate system when the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero, comprising: (c1) driving the tool to move along a third axis of the camera reference coordinate system; (c2) determining whether a position of the projection point on the test plane in the second axis of the camera reference coordinate system changes according to an image, captured by the camera, of the tool moving relative to the test plane; (c3) when the position of the projection point on the test plane in the second axis changes, driving the tool to rotate by an angle around the first axis of the camera reference coordinate system; and (c4) repeating steps (c1) to (c3) until a position change amount of the projection point of the test plane in the second axis of the camera reference coordinate system is substantially equal to zero.
6. The calibration method according to claim 1, wherein the step of obtaining the tool axis vector relative to the installation surface reference coordinate system of the robotic arm comprises: driving the tool axis of the tool to be perpendicular to the test plane; and obtaining the tool axis vector according to a posture of the robotic arm when the tool axis is perpendicular to the test plane.
7. The calibration method according to claim 1, wherein the step of obtaining the tool center point coordinate comprises: adjusting an angle of a light source so that a first light emitted by the tool and a second light emitted by the light source intersect at the tool center point; obtaining a plurality of calibration point information groups where the tool center point coincides with the reference point under a plurality of different postures of the robotic arm; driving the tool to move along the tool axis vector; establishing a calibration point information group matrix according to the calibration point information groups; and obtaining the tool center point coordinate according to the calibration point information group matrix.
8. A teaching method for robotic arm, comprising: (d1) by using the calibration method as claimed in claim 1, obtaining the tool center point coordinate and driving the tool to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) translating the tool by a translation distance to a second position; (d3) obtaining a detection angle of the tool according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis; (d4) determining whether the detection angle meets a specification angle; (d5) driving the tool back to the first position when the detection angle does not meet the specification angle; and (d6) adjusting a posture of the robotic arm to perform steps (d2) to (d6) until the detection angle meets the specification angle.
9. A robotic arm system, comprising: a robotic arm configured to carry a tool, wherein the tool has a tool axis; a controller configured to: control the robotic arm to drive a projection point of the tool axis of the tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; establish a first conversion relationship between a robotic arm reference coordinate system of the robotic arm and a camera reference coordinate system according to the relative movement; obtain a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; perform a calibration point information group obtaining step, comprising: (a1) controlling the robotic arm to drive a tool center point to coincide with the reference point of the test plane and recording a calibration point information group of the robotic arm; (a2) controlling the robotic arm to drive the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtain a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.
10. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to: control the robotic arm to drive the tool to move by a space vector from the reference point along a plurality of axes of the robotic arm reference coordinate system; analyze the image captured by the camera to obtain a value of a plane coordinate of each space vector; and establish the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the space vectors.
11. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to: control the robotic arm to drive the tool to move by a first space vector from the reference point along a first axis of the robotic arm reference coordinate system; control the robotic arm to drive the tool to move by a second space vector from the reference point along a second axis of the robotic arm reference coordinate system; control the robotic arm to drive the tool to move by a third space vector from the reference point along a third axis of the robotic arm reference coordinate system; analyze the image captured by the camera to obtain a value of a first plane coordinate of the first space vector; analyze the image captured by the camera to obtain a value of a second plane coordinate of the second space vector; analyze the image captured by the camera to obtain a value of a third plane coordinate of the third space vector; and establish the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the first space vector, the second space vector and the third space vector.
12. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to perform offset correction to the tool axis relative to a first axis of the camera reference coordinate system, comprising: (b1) driving the tool to move along a third axis of the camera reference coordinate system; (b2) determining whether a position of the projection point on the test plane in the first axis of the camera reference coordinate system changes according to an image, captured by the camera, of the tool moving relative to the test plane; (b3) when the position of the projection point on the test plane in the first axis changes, driving the tool to rotate by an angle around a second axis of the camera reference coordinate system; and (b4) repeating steps (b1) to (b3) until a position change amount of the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero.
13. The robotic arm system according to claim 12, wherein the controller is further configured to perform offset correction to the tool axis relative to the second axis of the camera reference coordinate system when the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero, comprising: (c1) driving the tool to move along a third axis of the camera reference coordinate system; (c2) determining whether a position of the projection point on the test plane in the second axis of the camera reference coordinate system changes according to an image, captured by the camera, of the tool moving relative to the test plane; (c3) when the position of the projection point on the test plane in the second axis changes, driving the tool to rotate by an angle around the first axis of the camera reference coordinate system; and (c4) repeating steps (c1) to (c3) until a position change amount of the projection point of the test plane in the second axis of the camera reference coordinate system is substantially equal to zero.
14. The robotic arm system according to claim 9, wherein the controller is further configured to: drive the tool axis of the tool to be perpendicular to the test plane; and obtain the tool axis vector of the tool relative to the installation surface reference coordinate system of the robotic arm according to a posture of the robotic arm when the tool axis is perpendicular to the test plane.
15. The robotic arm system according to claim 9, wherein a first light emitted by the tool and a second light emitted by a light source intersect at the tool center point, and the controller is further configured to: obtain a plurality of calibration point information groups where the tool center point coincides with the reference point under a plurality of different postures of the robotic arm; drive the tool to move along the tool axis vector; establish a calibration point information group matrix according to the calibration point information groups; and obtain the tool center point coordinate according to the calibration point information group matrix.
16. The robotic arm system according to claim 9, wherein the controller is further configured to: (d1) drive the tool to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) translate the tool by a translation distance to a second position; (d3) obtain a detection angle of the tool according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis; (d4) determine whether the detection angle meets a specification angle; (d5) drive the tool back to the first position when the detection angle does not meet the specification angle; and (d6) adjust a posture of the robotic arm to perform steps (d2) to (d6) until the detection angle meets the specification angle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
DETAILED DESCRIPTION
[0022] Referring to
[0023] The tool 10 is shown with a luminance meter as an example. In another embodiment, the tool 10 is, for example, a machining tool.
[0024] In the present embodiment, the test plane 20 is, for example, the surface of a physical screen. The physical screen is, for example, a transparent screen or an opaque screen. In the case of the opaque screen, the test plane 20 of the physical screen is, for example, white. However, as long as the first light L1 emitted by the tool 10 and the second light L2 emitted by the light source 130 could be clearly displayed (the second light L2 is shown in
[0025] Referring to
[0026] In step S110, the robotic arm system 100 executes the step of establishing a first conversion relationship T1 between the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robotic arm 110 and the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C) of the camera 120. The step S110 includes sub-steps S111 to S117. The step of establishing the first conversion relationship T1 includes the following steps: the robotic arm 110 drives the projection point P1 of the tool axis A1 of the tool 10 projected on the test plane 20 to perform a relative movement relative to the reference point O1 of the test plane 20; then, the controller 140 establishes the first conversion relationship T1 between the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) of the robotic arm 110 and the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C) according to the relative movement.
[0027] For example, referring to
[0028] In step S111, as shown in
[0029] In step S111, the controller 140 could analyze the image M1 captured by the camera 120, and, as shown in
[0030] In step S112, the controller 140 could analyze the image M1 captured by the camera 120. As shown in
[0031] In step S113, the controller 140 controls the tool 10 to move by a second space vector {right arrow over (V.sub.1)}(x.sub.2,y.sub.2,z.sub.2) from the reference point O1 along a second axis (for example, the y.sub.R axis) of the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). The value (or length) of the second space vector {right arrow over (V.sub.1)}(x.sub.2,y.sub.2,z.sub.2) is L.sub.R, and an end point of the second space vector {right arrow over (V.sub.1)}(x.sub.2,y.sub.2,z.sub.2) is the projection point Py(x.sub.2, y.sub.2, z.sub.2) in
[0032] Similarly, in step S113, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located/coincident with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the second space vector {right arrow over (V.sub.1)}(x.sub.2,y.sub.2,z.sub.2) from the reference point O1 along the second axis (for example, the y.sub.R axis) of the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). During the moving, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the second space vector {right arrow over (V.sub.1)}(x.sub.2, y.sub.2, z.sub.2).
[0033] In step S114, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the second plane coordinate P′y(x.sub.2, y.sub.2) of the projection point P′y of the second space vector {right arrow over (V.sub.1)}(x.sub.2, y.sub.2, z.sub.2), that is, the first axis coordinate value x.sub.2 and the second axis coordinate value y.sub.2.
[0034] In step S115, the controller 140 controls the tool 10 to move by a third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3) from the reference point O1 along a third axis (for example, the z.sub.R axis) of the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). The value (or length) of the third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3) is L.sub.R, and an end point of the third space vector {right arrow over (W.sub.1)}(x.sub.3, y.sub.3, z.sub.3) is the projection point Pz(x.sub.3, y.sub.3, z.sub.3) in
[0035] Similarly, in step S115, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located/coincident with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3) from the reference point O1 along the third axis (for example, the z.sub.R axis) of the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). During the moving, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3).
[0036] In step S116, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the third plane coordinate P′z(x.sub.3, y.sub.3) of the projection point P′z of the third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3), that is, the first axis coordinate value x.sub.3 and the second axis coordinate value y.sub.3.
[0037] In step S117, the controller 140 establishes the first conversion relationship T1 between the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C) and the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) according to mutually orthogonal characteristics of the first space vector {right arrow over (U.sub.1)}(x.sub.1,y.sub.1,z.sub.1), the second space vector {right arrow over (V.sub.1)}(x.sub.2,y.sub.2,z.sub.2) and the third space vector {right arrow over (W.sub.1)}(x.sub.3,y.sub.3,z.sub.3). For example, the controller 140 could use the following equations (1) to (3) to obtain the third axis coordinate values z.sub.1, z.sub.2 and z.sub.3. As a result, the controller 140 obtains x.sub.1, x.sub.2, x.sub.3, y.sub.1, y.sub.2, y.sub.3, z.sub.1, z.sub.2 and z.sub.3. Then, the controller 140 establishes the first conversion relationship T1 according to the following formula (4).
[0038] As shown in formula (5), the controller 140 could use the first conversion relationship T1 to convert the projection point movement vector S.sub.W into the robotic arm movement vector S.sub.R, wherein the projection point movement vector S.sub.W is the movement vector of the projection point P1 on the test plane 20 relative to the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C), and the robotic arm movement vector S.sub.R is the movement vector of the robotic arm 110 relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). The robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) could be established at any position of the robotic arm 110, for example, the base 111 of the robotic arm 110. Equations (1), (2), and (3) represent that the space vectors {right arrow over (U.sub.1)}, {right arrow over (V.sub.1)} and {right arrow over (W.sub.1)} are orthogonal to each other. The first conversion relationship T1 in formula (4) is the inverse matrix of the space vectors {right arrow over (U.sub.1)}, {right arrow over (V.sub.1)} and {right arrow over (W.sub.1)} each divided by the length of the vector (the result is a unit vector). Formula (5) represents that the dot product of the first conversion relationship T1 and the projection point movement vector S.sub.W is equal to the robotic arm movement vector S.sub.R.
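Since equations (1) to (5) are not reproduced in this excerpt, the computation of steps S112 to S117 can only be sketched from the surrounding text: the camera observes only the plane coordinates (x.sub.i, y.sub.i) of each projected space vector, and the mutual orthogonality of the three equal-length space vectors recovers the missing third-axis values z.sub.1, z.sub.2, z.sub.3 up to a common sign. The sketch below assumes the +z branch of the square root; the function name `recover_first_conversion` is hypothetical.

```python
import numpy as np

def recover_first_conversion(pxy):
    # pxy: observed plane coordinates (x_i, y_i) of the three projected
    # space vectors U1, V1, W1 (steps S112, S114, S116).
    (x1, y1), (x2, y2), (x3, y3) = pxy
    a = -(x1 * x2 + y1 * y2)          # = z1*z2, from U1 . V1 = 0
    b = -(x1 * x3 + y1 * y3)          # = z1*z3, from U1 . W1 = 0
    c = -(x2 * x3 + y2 * y3)          # = z2*z3, from V1 . W1 = 0
    z1 = np.sqrt(a * b / c)           # +z branch assumed
    z2, z3 = a / z1, b / z1
    U = np.array([x1, y1, z1])
    V = np.array([x2, y2, z2])
    W = np.array([x3, y3, z3])
    # Formula (4): T1 is the inverse of the matrix whose columns are the
    # space vectors normalized to unit length.
    M = np.column_stack([U / np.linalg.norm(U),
                         V / np.linalg.norm(V),
                         W / np.linalg.norm(W)])
    return np.linalg.inv(M)           # formula (5): S_R = T1 . S_W
```

Applying the returned matrix to a projection point movement vector S.sub.W yields the robotic arm movement vector S.sub.R, which is how formula (5) is used in step S122.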
[0039] Then, in step S120, the robotic arm system 100 obtains the tool axis vector T.sub.ez of the tool 10 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f).
[0040] For example, referring to
[0041] In step S121, as shown in
[0042] In step S122, the controller 140 obtains the robotic arm movement vector S.sub.R according to the first conversion relationship T1 and the projection point movement vector S.sub.W. For example, the controller 140 could substitute the projection point movement vector S.sub.W into the above formula (5) to obtain (or calculate) the robotic arm movement vector S.sub.R required for the robotic arm 110 to move the projection point P1 to approach or coincide with the reference point O1. The purpose of steps S122 and S123 is to prevent the projection point P1′ from being out of the test plane 20 after the robotic arm moves or rotates.
[0043] In step S123, as shown in
[0044] Due to the projection point P1 approaching the reference point O1 in step S123, the moved projection point P1′ in the subsequent step S124A (the moved projection point P1′ is shown in
[0045] Then, in step S124, the controller 140 could perform the offset correction to the tool axis A1 of the tool 10 relative to the first axis (for example, the axis x.sub.C). Steps S124A to S124C are further described below.
[0046] In step S124A, as shown in
[0047] In step S124B, the controller 140 determines whether the position of the projection point P1 on the test plane 20 in the first axis (for example, the axis x.sub.C) changes according to the image captured by the camera 120. If so (for example, in a translation test of the first axis x.sub.C, the position of the projection point P1 of
[0048] In step S124C, as shown in
[0049] In detail, in step S124A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−z.sub.C of the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C). As shown in
[0050] In another embodiment, as shown in
[0051] The controller 140 repeats steps S124A to S124C until the tool axis A1 of the tool 10 projected on the plane x.sub.C-y.sub.C of the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C) (for example, the angle of viewed in
[0052] In steps S125A to S125C, the controller 140 and the robotic arm 110 could use a process similar to steps S124A to S124C to complete the offset correction to the axis y.sub.C. Hereinafter, further examples are shown in
[0053] In step S125A, as shown in
[0054] In step S125B, the controller 140 determines whether the position of the projection point P1 on the test plane 20 in the second axis (for example, the axis y.sub.C) changes according to the image M1 captured by the camera 120. If so (for example, the position of the projection point P1 of
[0055] In step S125C, as shown in
[0056] In detail, in step S125A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−z.sub.C of the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C). As shown in
[0057] The controller 140 repeats steps S125A to S125C until the tool axis A1 of the tool 10 projected on the plane y.sub.C-z.sub.C of the camera reference coordinate system (x.sub.C-y.sub.C-z.sub.C) (for example, the angle of viewed in
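The iterative correction of steps S124A to S124C and S125A to S125C can be simulated with a simple geometric model, a sketch under stated assumptions: the tool axis A1 is represented as a direction vector in camera coordinates pointing toward the test plane (positive z.sub.C component), the projection point drifts by d.sub.x/d.sub.z (or d.sub.y/d.sub.z) per unit of travel along z.sub.C, and the rotation step is capped by the remaining tilt so the loop terminates. The function and parameter names are hypothetical.

```python
import numpy as np

def correct_perpendicularity(tool_dir, step=np.radians(0.5), tol=1e-4):
    # tool_dir: direction of the tool axis A1 in camera coordinates
    # (x_C-y_C-z_C), assumed to point toward the test plane (d_z > 0).
    d = np.asarray(tool_dir, dtype=float)
    d = d / np.linalg.norm(d)

    def rot(axis, ang):  # rotation matrix about the y_C or x_C axis
        c, s = np.cos(ang), np.sin(ang)
        if axis == 'y':
            return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    # S124: x_C drift removed by rotating about y_C;
    # S125: y_C drift removed by rotating about x_C.
    for idx, axis, sgn in ((0, 'y', -1.0), (1, 'x', +1.0)):
        while True:
            drift = d[idx] / d[2]   # shift per unit z_C travel (S124A/S125A)
            if abs(drift) < tol:    # S124B/S125B: "substantially equal to zero"
                break
            # S124C/S125C: rotate by a small angle, capped so the loop converges
            ang = sgn * np.sign(drift) * min(step, abs(np.arctan(drift)))
            d = rot(axis, ang) @ d
    return d
```

After both loops finish, the returned axis is perpendicular to the test plane within the tolerance, mirroring the condition for entering step S126.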
[0058] In step S126, after the offset correction to the first axis and the second axis is completed (this means that the tool axis A1 is perpendicular to the test plane 20; the purpose of steps S124 and S125 is to correct the shift along the axis x.sub.C and the axis y.sub.C), the controller 140 establishes a second conversion relationship T.sub.2 according to the posture of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20, and obtains the tool axis vector T.sub.ez according to the second conversion relationship T.sub.2, wherein the tool axis vector T.sub.ez is, for example, parallel to or coincides with the tool axis A1. For example, the controller 140 establishes the second conversion relationship T.sub.2 according to the joint angles of the joints J1 to J6 of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20. The second conversion relationship T.sub.2 is the conversion relationship of the installation surface (or flange surface) reference coordinate system (x.sub.f-y.sub.f-z.sub.f) of an installation surface 110s of the tool 10 relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). The tool 10 could be installed on the installation surface 110s, and the tool axis A1 of the tool 10 is not limited to be perpendicular to the installation surface 110s. In an embodiment, the second conversion relationship T.sub.2 could be expressed in the following formula (6), and the elements in formula (6) could be obtained by the link parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and the tool center point WO1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f), wherein the link parameters could include link offset, joint angle, link length and link twist. In addition, the second conversion relationship T.sub.2 could be established by using a known kinematics method.
[0059] As shown in the following formula (7), the vector z.sub.w is the normal vector (i.e., the axis z.sub.C) of the test plane 20 relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R), and the vector T.sub.ez is the vector (herein referred to as the “tool axis vector”) of the tool axis A1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f). The controller 140 could convert the vector z.sub.w into the tool axis vector T.sub.ez through the inverse matrix of the second conversion relationship T.sub.2.
T.sub.ez = T.sub.2.sup.−1·z.sub.w (7)
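Formula (7) can be evaluated directly once T.sub.2 is built from the link parameters, for example by chaining Denavit-Hartenberg transforms as paragraph [0058] suggests. The sketch below assumes the standard DH convention; `dh_transform`, `tool_axis_vector` and the example link table are illustrative, not the patent's actual parameters.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    # Standard Denavit-Hartenberg transform for one link
    # (link length a, link twist alpha, link offset d, joint angle theta).
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def tool_axis_vector(dh_rows, z_w):
    # Formula (6): T2 = product of per-joint DH transforms, i.e. the pose
    # of the installation surface frame (x_f-y_f-z_f) in the base frame.
    T2 = np.eye(4)
    for row in dh_rows:
        T2 = T2 @ dh_transform(*row)
    R2 = T2[:3, :3]
    # Formula (7): T_ez = T2^-1 . z_w; for the rotation part the
    # inverse is the transpose.
    return R2.T @ np.asarray(z_w)
```

With the arm posed so that the tool axis is perpendicular to the test plane, passing the plane normal z.sub.w through this conversion yields the tool axis vector in the flange frame.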
[0060] In step S130, the robotic arm system 100 executes the step of obtaining calibration point information groups. Further examples are given below.
[0061] In step S131, referring to
[0062] In the present embodiment, the tool 10 is shown by taking the luminance meter as an example, and the tool center point WO1 is a virtual tool center point. The tool center point WO1 is, for example, the focus of the first light L1 (detection light) projected by the tool 10. In another embodiment, the tool 10 is, for example, a machining tool, and the tool center point WO1 is a physical tool center point, such as a solid tool tip point. In summary, the tool center point of the embodiment of the present disclosure could be the physical tool center point or the virtual tool center point.
[0063] In one method for adjusting the angle of the light source 130, the controller 140 could obtain, according to the following formula (8), the angle θ between the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10, measured along the direction (the dotted line from the rotation fulcrum 131 toward the tool axis A1) perpendicular to the tool axis A1; the angle of the light source 130 could then be adjusted to the angle θ manually or by a mechanism (not shown) controlled by the controller 140, so that the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. The aforementioned mechanism is, for example, any of various mechanisms that could drive the light source 130 to rotate, such as a linkage mechanism, a gear set mechanism, etc. Since the angle θ is given (known), the angle of the light source 130 could be quickly adjusted so that the second light L2 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. When the angle of the light source 130 is adjusted to the angle θ, the relative relationship between the light source 130 and the tool 10 could be fixed, thereby fixing the relative relationship between the tool center point WO1 and the tool 10.
[0064] In formula (8), H1 is the distance (for example, the focal length of the first light L1) between the tool center point WO1 and a light emitting surface 10s of the tool 10 along the tool axis A1, and H2 is the distance between the light emitting surface 10s of the tool 10 and the rotation fulcrum 131 of the light source 130 along the tool axis A1, and H3 is the vertical distance (perpendicular to the tool axis A1) between the rotation fulcrum 131 of the light source 130 and the tool axis A1.
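Formula (8) itself is not reproduced in this excerpt, so the sketch below is one plausible reconstruction from the geometry stated in [0064] (an assumption, not the patent's formula): the second light travels from the rotation fulcrum 131, offset H3 from the tool axis, to the tool center point WO1, which lies H1 + H2 along the axis from the fulcrum's axial position; measuring θ from the perpendicular dropped from the fulcrum onto the tool axis gives θ = arctan((H1 + H2)/H3). The numeric dimensions are hypothetical.

```python
import math

def light_source_angle(H1: float, H2: float, H3: float) -> float:
    """Plausible reconstruction of formula (8) (assumption): the ray
    from the rotation fulcrum 131 must reach WO1, located H1 + H2
    along the tool axis and H3 away laterally, so the tilt from the
    perpendicular line is arctan((H1 + H2) / H3)."""
    return math.atan2(H1 + H2, H3)

# Hypothetical dimensions in millimetres.
theta = light_source_angle(H1=100.0, H2=20.0, H3=60.0)
theta_deg = math.degrees(theta)
```

If the patent instead measures θ from the direction parallel to the tool axis, the complement (π/2 − θ) would apply; the right-triangle geometry is the same either way.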
[0065] As shown in
[0066] In step S132, the controller 140 executes the step of obtaining the calibration point information groups. For example, the controller 140 could control the robotic arm 110, under a plurality of different postures, to make the tool center point WO1 coincide with the reference point O1 of the test plane 20 at a plurality of calibration points, and accordingly record the calibration point information group of each calibration point. For example, the controller 140 could control the robotic arm 110 in one posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly record the calibration point information group in that posture. Then, the controller 140 controls the robotic arm 110 in another posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly records the calibration point information group in that posture. According to this principle, the controller 140 could obtain several calibration point information groups of the robotic arm 110 in several different postures. Each calibration point information group could include the coordinates of the joints J1 to J6, and the coordinate of each joint could be the rotation angle of that joint relative to its preset starting point. At least one of the rotation angles of the robotic arm 110 differs between the different postures.
[0067] For example, referring to
[0068] In step S132A, as shown in
[0069] In step S132B, as shown in
[0070] In step S132C, as shown in
[0071] In step S132D, the controller 140 obtains the robotic arm movement vector S.sub.R according to the first conversion relationship T1 and the projection point movement vector S.sub.W. For example, the controller 140 could substitute the projection point movement vector S.sub.W of
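Step S132D can be sketched as follows, under the assumption that the first conversion relationship T.sub.1 acts as a 3×3 matrix mapping displacements in the camera reference coordinate system into the robotic arm reference coordinate system; the example T.sub.1 below is hypothetical.

```python
import numpy as np

def arm_movement_vector(T1: np.ndarray, S_W: np.ndarray) -> np.ndarray:
    """Sketch of step S132D (assumption): the first conversion
    relationship T1 maps the projection-point movement vector S_W,
    expressed in the camera frame (x_C-y_C-z_C), into the robotic arm
    movement vector S_R in the base frame (x_R-y_R-z_R)."""
    return T1 @ S_W

# Hypothetical T1: camera frame rotated 180 degrees about z relative to base.
T1 = np.diag([-1.0, -1.0, 1.0])
S_W = np.array([3.0, 4.0, 0.0])   # projection-point movement, camera frame
S_R = arm_movement_vector(T1, S_W)
```

The arm is then commanded to translate by S.sub.R so that the tool center point approaches the reference point O1.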
[0072] In step S132E, referring to
[0073] In step S132F, the controller 140 determines whether the tool center point WO1 coincides with the reference point O1 of the test plane 20 according to (or analyzes) the image (for example, the image M1 shown in
[0074] Furthermore, if the tool axis A1 in
[0075] In step S132G, the controller 140 records the joint angles of the joints J1 to J6 of the robotic arm 110 in the state where the tool center point WO1 coincides with the reference point O1 of the test plane 20, and uses them as one calibration point information group.
[0076] In step S132H, the controller 140 determines whether the number of the calibration point information groups has reached a predetermined number, for example, at least three groups, though more could be used. When the number of the calibration point information groups has reached the predetermined number, the process proceeds to step S133; when it has not, the process proceeds to step S132I.
[0077] In step S132I, the controller 140 controls the robotic arm 110 to change the posture of the tool 10. For example, the controller 140 controls the robotic arm 110 to change the angle of the tool axis A1 of the tool 10 relative to at least one of the axis x.sub.C, the axis y.sub.C and the axis z.sub.C, wherein the changed angle is, for example, 30 degrees, 60 degrees or another arbitrary angle. For example, the controller 140 could generate Euler angle increments ΔR.sub.x, ΔR.sub.y, ΔR.sub.z through a random number generator to correct the azimuth angle (Euler angle) of the robotic arm 110, thereby changing the posture of the robotic arm 110. At this time, the azimuth angle of the robotic arm 110 could be expressed as (R.sub.x+ΔR.sub.x, R.sub.y+ΔR.sub.y, R.sub.z+ΔR.sub.z), wherein (R.sub.x, R.sub.y, R.sub.z) is the original azimuth angle of the robotic arm 110, R.sub.x represents the yaw angle, R.sub.y represents the pitch angle, and R.sub.z represents the roll angle. If the corrected azimuth angle exceeds the motion range of the robotic arm 110, the controller 140 could regenerate the Euler angle increments through the random number generator.
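The random reposturing with range checking described in step S132I can be sketched as follows; the increment bound and joint limits are hypothetical placeholders, not values from the patent.

```python
import random

def perturb_pose(pose, limits, max_delta_deg=60.0, rng=None):
    """Sketch of step S132I: add random Euler-angle increments
    (dRx, dRy, dRz) to the current azimuth (Rx, Ry, Rz), redrawing
    whenever the candidate exceeds the arm's motion range.
    `limits` holds hypothetical (low, high) bounds per axis, in degrees."""
    rng = rng or random.Random(0)
    while True:
        deltas = [rng.uniform(-max_delta_deg, max_delta_deg) for _ in range(3)]
        candidate = [a + d for a, d in zip(pose, deltas)]
        # Regenerate the increments if any axis leaves the motion range.
        if all(lo <= c <= hi for c, (lo, hi) in zip(candidate, limits)):
            return candidate

new_pose = perturb_pose((10.0, -20.0, 45.0),
                        limits=[(-180.0, 180.0)] * 3)
```

Redrawing rather than clamping keeps the sampled postures uniformly spread, which helps the later pseudo-inverse solve stay well conditioned.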
[0078] Then, the process returns to step S132A to record the calibration point information group of the robotic arm 110 in the new (different) posture of the tool 10. Furthermore, after the controller 140 controls the robotic arm 110 to change the posture of the tool 10, the tool center point WO1 of the tool 10 may deviate from the test plane 20. Therefore, the process returns to step S132A to make the tool center point WO1 coincide with the reference point O1 again, and in the state where the tool center point WO1 coincides with the reference point O1, another calibration point information group under different posture of the robotic arm 110 is recorded. Steps S132A to S132I are repeated until the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number.
[0079] In step S133, when the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number, the controller 140 obtains the tool center point coordinate TP of the tool center point WO1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f).
[0080] As shown in the following formula (9), the tool center point coordinate TP could be established according to a plurality of the calibration point information groups of the robotic arm 110 in a plurality of different postures. The controller 140 could obtain (calculate) the coordinates of the tool center point WO1 according to the calibration point information groups, wherein the coordinates of each calibration point information group could be obtained through the linkage parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and the information about the tool center point WO1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f), wherein the link parameters could include link offset, joint angle, link length and link twist.
[0081] The coordinates of the tool center point WO1 could be obtained (calculated) by the following formula (9):

T.sub.2i·[T.sub.x T.sub.y T.sub.z 1].sup.t=[P.sub.x P.sub.y P.sub.z 1].sup.t (9)
[0082] In formula (9), the matrix T.sub.2i is the 4×4 homogeneous conversion matrix of the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f) relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) for the i.sup.th calibration point information group. The matrix W.sub.1f includes [T.sub.x T.sub.y T.sub.z].sup.t, which is the coordinate W.sub.1f (T.sub.x, T.sub.y, T.sub.z) of the tool center point WO1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f), and the matrix [P.sub.x P.sub.y P.sub.z 1].sup.t includes the coordinate W.sub.1R (P.sub.x, P.sub.y, P.sub.z) of the tool center point WO1 relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R) in space. Each calibration point information group yields three linear equations through formula (9); therefore, n calibration point information groups yield 3n equations, and the coordinates of the tool center point WO1 could then be obtained through a pseudo-inverse matrix.
[0083] Furthermore, in formula (9), (e.sub.11i, e.sub.21i, e.sub.31i) represents direction of the vector of the i.sup.th calibration point information group in the first axis (for example, the axis x.sub.f) relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). (e.sub.12i, e.sub.22i, e.sub.32i) represents direction of the vector of the i.sup.th calibration point information group in the second axis (for example, the axis y.sub.f) relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). (e.sub.13i, e.sub.23i, e.sub.33i) represents the direction of the vector of the i.sup.th calibration point information group in the third axis (for example, the axis z.sub.f) relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R). The following formulas (10) and (11) could be obtained from formula (9).
[0084] In formula (11), T.sub.3.sup.t is the transpose matrix of T.sub.3, (T.sub.3T.sub.3.sup.t).sup.−1 is the inverse matrix of (T.sub.3T.sub.3.sup.t), and the coordinate (T.sub.x T.sub.y T.sub.z) is the tool center point coordinate TP, and the matrix T.sub.3 is a calibration point information group matrix composed of the calibration point information groups.
[0085] If the number of the calibration point information groups is sufficient, the elements of the matrix T.sub.2i corresponding to the i.sup.th calibration point information group are substituted into formula (10), and the matrix T.sub.3 is rearranged to obtain formula (11), yielding the coordinate W.sub.1f (T.sub.x, T.sub.y, T.sub.z) of the tool center point WO1 relative to the installation surface reference coordinate system (x.sub.f-y.sub.f-z.sub.f) and the coordinate W.sub.1R (P.sub.x, P.sub.y, P.sub.z) of the tool center point WO1 relative to the robotic arm reference coordinate system (x.sub.R-y.sub.R-z.sub.R).
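The pseudo-inverse solution of formulas (9) to (11) can be sketched as follows. This is a sketch under stated assumptions: each T.sub.2i has the block form [R.sub.i, p.sub.i; 0, 1], so R.sub.i·T + p.sub.i = P, and stacking the rearranged rows [R.sub.i  −I]·[T; P].sup.t = −p.sub.i over all groups gives 3n equations in six unknowns; the synthetic poses below are illustrative only.

```python
import numpy as np

def solve_tcp(T2_list):
    """Least-squares (pseudo-inverse) solve for the tool center point.
    T is WO1 in the installation-surface frame (T_x, T_y, T_z); P is
    the shared reference point O1 in the robot base frame
    (P_x, P_y, P_z).  Each pose contributes [R_i  -I][T; P] = -p_i."""
    rows, rhs = [], []
    for T2 in T2_list:
        R, p = T2[:3, :3], T2[:3, 3]
        rows.append(np.hstack([R, -np.eye(3)]))
        rhs.append(-p)
    A = np.vstack(rows)
    b = np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]

# Synthetic check with three known poses and a known TCP.
def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

T_true = np.array([1.0, 2.0, 3.0])   # TCP in the flange frame
P_true = np.array([4.0, 5.0, 6.0])   # reference point in the base frame
T2_list = []
for R in (np.eye(3), rot_x(np.pi / 2), rot_y(np.pi / 2)):
    T2 = np.eye(4)
    T2[:3, :3], T2[:3, 3] = R, P_true - R @ T_true
    T2_list.append(T2)
tcp_f, tcp_R = solve_tcp(T2_list)
```

With at least three sufficiently distinct orientations the stacked system has full rank, which is why step S132H insists on a minimum number of calibration point information groups.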
[0086] Of course, the above-mentioned tool center point calibration method is only an example, and each component and/or the calibration method of the robotic arm system 100 could be changed according to actual needs; such exemplification is not meant to be limiting.
[0087] After the tool center point coordinate TP is obtained, the controller 140 could accordingly drive the robotic arm 110 to control the tool center point WO1 to a desired position. As a result, the robotic arm system 100 could perform an automatic teaching process for the robotic arm, and the following description is submitted with
[0088] Referring to
[0089] In the following, the process of the robotic arm system 100 of
[0090] In step S210, as shown in
[0091] In step S220, as shown in
[0092] In step S230, as shown in
θ.sub.H=π/2−tan.sup.−1(ΔT.sub.Z1/L.sub.H) (13)
[0093] In step S240, the controller 140 determines whether the detection angle θ.sub.H meets a first specification angle. When the detection angle θ.sub.H does not meet the first specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the first position S1. For example, the controller 140 could control the moving element 150 to translate, thereby driving the tool 10 back to the first position S1. The first specification angle is, for example, a specification value of the product, namely the specification value required when the tool 10 detects the detection surface 30 along the first detection direction. In detail, when the detection angle θ.sub.H meets the first specification angle, the analysis result of the display by the tool 10 (for example, the luminance meter) does not exceed the acceptable range (for example, when viewing the display screen of the display at a skewed angle of view, a black screen or abnormal color will not be generated). The value of the first specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; such exemplification is not meant to be limiting.
[0094] In step S260, when the tool 10 returns to the first position S1, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θ.sub.H meets the first specification angle. For example, if the detection angle θ.sub.H does not meet the first specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the second axis (for example, the axis y.sub.d), and then the process returns to step S210. Steps S210 to S260 are repeated according to this principle until the detection angle θ.sub.H meets the first specification angle.
[0095] Similarly, as shown in
[0096] For example, in step S210, as shown in
[0097] In step S220, as shown in
[0098] In step S230, as shown in
θ.sub.V=tan.sup.−1(ΔT.sub.Z2/L.sub.V) (14)
[0099] In step S240, the controller 140 determines whether the detection angle θ.sub.V meets a second specification angle. When the detection angle θ.sub.V does not meet the second specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the second position S2. For example, the controller 140 could control the moving element 150 to translate, thereby driving the tool 10 back to the second position S2. The second specification angle is, for example, a specification value of the product, namely the specification value required when the tool 10 detects the detection surface 30 along the second detection direction. In detail, when the detection angle θ.sub.V meets the second specification angle, the analysis result of the display by the tool 10 (for example, the luminance meter) does not exceed the acceptable range (for example, when viewing the display screen of the display at a skewed angle of view, a black screen or abnormal color will not be generated). The value of the second specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; such exemplification is not meant to be limiting.
[0100] In step S260, when the tool 10 returns to the second position S2, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θ.sub.V meets the second specification angle. For example, if the detection angle θ.sub.V does not meet the second specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the first axis (for example, the axis x.sub.d) until the detection angle θ.sub.V meets the second specification angle.
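The angle checks of formulas (13) and (14) can be sketched as follows; the displacement and travel values are hypothetical readings, not figures from the patent.

```python
import math

def detection_angles(dTz1, L_H, dTz2, L_V):
    """Detection angles per formulas (13) and (14):
    theta_H = pi/2 - arctan(dTz1 / L_H)   (first detection direction)
    theta_V = arctan(dTz2 / L_V)          (second detection direction)
    dTz* is the height change of the tool center point over a travel
    of L_* along the corresponding detection direction."""
    theta_H = math.pi / 2 - math.atan2(dTz1, L_H)
    theta_V = math.atan2(dTz2, L_V)
    return theta_H, theta_V

# Hypothetical readings: equal rise and run along each direction.
theta_H, theta_V = detection_angles(dTz1=10.0, L_H=10.0, dTz2=5.0, L_V=5.0)
```

Steps S240 to S260 would then compare θ.sub.H and θ.sub.V against the first and second specification angles and re-posture the arm until both are met.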
[0101] When the detection angle θ.sub.H meets the first specification angle and the detection angle θ.sub.V meets the second specification angle, the controller 140 records the joint coordinate combination or performs detection according to the current posture of the robotic arm 110. For example, the controller 140 records the change amount of the motion parameters of the joints J1 to J6 of the robotic arm 110 during the steps S210 to S260.
[0102] In summary, according to the robotic arm system and the tool center point calibration method in the embodiments of the present disclosure, the calibration of the tool center point and the automatic teaching of the robotic arm could be performed even without additional sensors and measuring devices (for example, three-dimensional measurement equipment).
[0103] It will be apparent to those skilled in the art that various modifications and variations could be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.