ROBOT CONTROL DEVICE, ROBOT, ROBOT SYSTEM, AND CALIBRATION METHOD OF CAMERA FOR ROBOT
20190015988 · 2019-01-17
Inventors
CPC classification
G05B2219/37009 (PHYSICS)
G05B2219/39008 (PHYSICS)
International classification
Abstract
A processor moves an arm to rotate a calibration pattern around three rotation axes that are linearly independent from each other and to stop at a plurality of rotation positions. The processor causes a camera to capture a pattern image of the calibration pattern at the plurality of rotation positions. The processor estimates parameters of the camera for calculating a coordinate transformation between a target coordinate system and a camera coordinate system using the pattern images captured at the plurality of rotation positions.
Claims
1. A control device that controls a robot having an arm provided with a calibration pattern of a camera and the camera provided independently of the arm, comprising: a processor that is configured to execute computer-executable instructions so as to control the robot, wherein the processor is configured to: move the arm to rotate the calibration pattern around three rotation axes linearly independent from each other and to stop at a plurality of rotation positions, cause the camera to capture a pattern image of the calibration pattern at the plurality of rotation positions, and determine parameters of the camera for calculating a coordinate transformation between a target coordinate system having a known relative position and attitude with respect to a robot coordinate system of the robot and a camera coordinate system of the camera using the pattern image captured at the plurality of rotation positions.
2. The control device according to claim 1, wherein the three rotation axes are set around an origin point of the target coordinate system.
3. The control device according to claim 1, wherein the processor estimates three rotation vectors having a direction of each rotation axis as a vector direction and an angle of the rotation as a vector length from the pattern image captured at the plurality of rotation positions, normalizes each of the three rotation vectors to acquire three normalized rotation vectors, and determines a rotation matrix constituting a coordinate transformation matrix between the target coordinate system and the camera coordinate system by arranging the three normalized rotation vectors as a row component or a column component.
4. The control device according to claim 3, wherein the coordinate transformation matrix between the target coordinate system and the camera coordinate system is represented by a product of a first transformation matrix between the camera coordinate system and a pattern coordinate system of the calibration pattern and a second transformation matrix between the pattern coordinate system and the target coordinate system, and wherein the processor (a) estimates the first transformation matrix from the pattern image captured at one specific rotation position among the plurality of the rotation positions, (b) estimates a square sum of two translation vector components in two coordinate axis directions orthogonal to each rotation axis among three components of a translation vector constituting the second transformation matrix from the pattern image captured at the plurality of rotation positions, and calculates the translation vector constituting the second transformation matrix from the square sum of the translation vector components estimated respectively for the three rotation axes, and (c) calculates a translation vector constituting the coordinate transformation matrix from the first transformation matrix estimated at the specific rotation position and the translation vector of the second transformation matrix.
5. The control device according to claim 1, wherein the target coordinate system is a coordinate system having a relative position and attitude fixed with respect to the robot coordinate system of the robot independently of the arm.
6. The control device according to claim 1, wherein the target coordinate system is a hand coordinate system of the arm.
7. A robot connected to the control device according to claim 1.
8. A robot connected to the control device according to claim 2.
9. A robot connected to the control device according to claim 3.
10. A robot connected to the control device according to claim 4.
11. A robot connected to the control device according to claim 5.
12. A robot connected to the control device according to claim 6.
13. A robot system comprising: a robot; and the control device connected to the robot according to claim 1.
14. A robot system comprising: a robot; and the control device connected to the robot according to claim 2.
15. A robot system comprising: a robot; and the control device connected to the robot according to claim 3.
16. A robot system comprising: a robot; and the control device connected to the robot according to claim 4.
17. A robot system comprising: a robot; and the control device connected to the robot according to claim 5.
18. A robot system comprising: a robot; and the control device connected to the robot according to claim 6.
19. A method for performing camera calibration in a robot system including a robot having an arm provided with a calibration pattern of a camera and the camera provided independently of the arm, the method comprising: moving the arm to rotate a calibration pattern around three rotation axes linearly independent from each other and to stop at a plurality of rotation positions; causing the camera to capture a pattern image of the calibration pattern at the plurality of rotation positions; and determining parameters of the camera for calculating a coordinate transformation between a target coordinate system having a known relative position and attitude with respect to a robot coordinate system of the robot and a camera coordinate system of the camera using the pattern image captured at the plurality of rotation positions.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
DESCRIPTION OF EXEMPLARY EMBODIMENTS
A. Configuration of Robot System
[0037] The robot system of a first embodiment includes a robot 100 and a control device 200 that controls the robot 100.
[0038] The robot 100 is provided with a base 110, a body portion 120, a shoulder portion 130, a neck portion 140, a head portion 150, and two arms 160L and 160R. Hands 180L and 180R are detachably attached to the arms 160L and 160R. These hands 180L and 180R are end effectors for holding a workpiece or a tool. Cameras 170L and 170R are installed in the head portion 150. These cameras 170L and 170R are provided independently of the arms 160L and 160R, and are fixed cameras whose position and attitude are not changed. A calibration pattern 400 for the cameras 170L and 170R can be installed in the arms 160L and 160R.
[0039] Force sensors 190L and 190R are provided in wrist portions of the arms 160L and 160R. The force sensors 190L and 190R are sensors for detecting a reaction force or a moment with respect to a force that the hands 180L and 180R exert on the workpiece. As the force sensors 190L and 190R, it is possible to use, for example, a six-axis force sensor capable of simultaneously detecting six components, that is, force components in three translational axis directions and moment components around three rotation axes. The force sensors 190L and 190R are optional.
[0040] The letters L and R appended to the end of symbols of the arms 160L and 160R, the cameras 170L and 170R, the hands 180L and 180R, and the force sensors 190L and 190R mean left and right. In a case where these distinctions are unnecessary, explanations will be made using symbols without the letters L and R.
[0041] The control device 200 includes a processor 210, a main memory 220, a non-volatile memory 230, a display control unit 240, a display 250, and an I/O interface 260. These units are connected via a bus. The processor 210 is, for example, a microprocessor or a processor circuit. The control device 200 is connected to the robot 100 via the I/O interface 260. The control device 200 may be housed inside the robot 100.
[0042] As the configuration of the control device 200, various configurations other than the illustrated configuration can be adopted.
B. Robot Coordinate System and Coordinate Transformation
[0045] A tool center point (TCP) is set at the tip of the arm 160. Typically, control of the robot 100 is executed so as to control the position and attitude of the tool center point TCP. Here, a position and attitude means a position defined by three coordinate values in a three-dimensional coordinate system and an attitude defined by rotation around each coordinate axis.
[0046] The calibration of the camera 170 is a process for determining an intrinsic parameter and an extrinsic parameter of the camera 170. The intrinsic parameter is a parameter specific to the camera 170 and its lens system, and includes, for example, a projective transformation parameter and a distortion parameter. The extrinsic parameter is a parameter used when calculating the relative position and attitude between the camera 170 and the arm 160 of the robot 100, and includes a parameter for expressing translation or rotation between a robot coordinate system Σ_0 and a camera coordinate system Σ_C. However, the extrinsic parameter can also be configured as a parameter for expressing translation or rotation between the camera coordinate system Σ_C and a target coordinate system other than the robot coordinate system Σ_0. The target coordinate system may be a coordinate system derived from the robot coordinate system Σ_0. For example, a coordinate system having a known relative position and attitude fixed with respect to the robot coordinate system Σ_0, or a coordinate system whose relative position and attitude with respect to the robot coordinate system Σ_0 change according to the movement amounts of the joints of the arm 160, may be selected as the target coordinate system. The extrinsic parameter corresponds to a camera parameter for calculating the coordinate transformation between the target coordinate system and the camera coordinate system of the camera.
[0047] In the following description, the coordinate systems below are used.
[0048] (1) Robot coordinate system Σ_0: a coordinate system having a reference point R0 of the robot 100 as a coordinate origin point
[0049] (2) Arm coordinate system Σ_A: a coordinate system having a reference point A0 of the arm 160 as a coordinate origin point
[0050] (3) Hand coordinate system Σ_T: a coordinate system having the tool center point (TCP) as a coordinate origin point
[0051] (4) Pattern coordinate system Σ_P: a coordinate system having a predetermined position on the calibration pattern 400 as a coordinate origin point
[0052] (5) Camera coordinate system Σ_C: a coordinate system set for the camera 170
[0053] The arm coordinate system Σ_A and the hand coordinate system Σ_T are set individually for the right arm 160R and the left arm 160L.
[0054] In general, a transformation from a certain coordinate system Σ_A to another coordinate system Σ_B, or a transformation of a position and attitude between these coordinate systems, can be expressed as a homogeneous transformation matrix ^A H_B of the following form.

^A H_B = | R   T |
         | 0   1 |,   R = (R_x  R_y  R_z)    (1)

[0055] Here, R represents a 3×3 rotation matrix, T represents a translation vector, and R_x, R_y, and R_z represent the column components of the rotation matrix R. Hereinafter, the homogeneous transformation matrix ^A H_B is also referred to as the coordinate transformation matrix ^A H_B, the transformation matrix ^A H_B, or simply the transformation ^A H_B. The superscript A on the left side of the transformation symbol ^A H_B indicates the coordinate system before the transformation, and the subscript B on the right side of the transformation symbol ^A H_B indicates the coordinate system after the transformation. The transformation ^A H_B can also be considered as indicating the origin position and the basis vector components of the coordinate system Σ_B seen in the coordinate system Σ_A.
[0056] The inverse matrix (^A H_B)^-1 (= ^B H_A) of the transformation ^A H_B is given by the following expression.

(^A H_B)^-1 = | R^T   -R^T·T |
              | 0        1   |    (2)
[0057] The rotation matrix R has the following important properties.
Rotation Matrix R Property 1
[0058] The rotation matrix R is an orthonormal matrix, and its inverse matrix R^-1 is equal to its transposed matrix R^T.
Rotation Matrix R Property 2
[0059] The three column components R_x, R_y, and R_z of the rotation matrix R are equal to the three basis vector components of the coordinate system Σ_B after rotation, seen in the original coordinate system Σ_A.
[0060] In a case where the transformations ^A H_B and ^B H_C are applied sequentially to a certain coordinate system Σ_A, the combined transformation ^A H_C is acquired by multiplying the transformations ^A H_B and ^B H_C sequentially to the right.

^A H_C = ^A H_B · ^B H_C    (3)
[0061] Regarding the rotation matrix R, the same relationship as Expression (3) is established.
^A R_C = ^A R_B · ^B R_C    (4)
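As a minimal illustration of Expressions (1) to (4), the homogeneous transformations above can be handled with a few lines of NumPy; the following sketch uses illustrative function and variable names and assumes that rotation matrices and translation vectors are given as NumPy arrays.

```python
import numpy as np

def make_H(R, T):
    """Build a 4x4 homogeneous transformation ^A H_B from a 3x3 rotation R
    and a translation vector T (Expression (1))."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = T
    return H

def invert_H(H):
    """Inverse of a homogeneous transformation (Expression (2)): [R^T, -R^T T; 0, 1]."""
    R, T = H[:3, :3], H[:3, 3]
    return make_H(R.T, -R.T @ T)

# Composition (Expression (3)): ^A H_C = ^A H_B * ^B H_C
A_H_B = make_H(np.eye(3), np.array([0.1, 0.0, 0.0]))
B_H_C = make_H(np.eye(3), np.array([0.0, 0.2, 0.0]))
A_H_C = A_H_B @ B_H_C
```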
C. AX=XB Problem of Coordinate Transformation
[0062] Between the coordinate systems described above, the following four transformations are considered.
[0063] (1) Transformation ^0 H_T (calculable): a transformation from the robot coordinate system Σ_0 to the hand coordinate system Σ_T
[0064] (2) Transformation ^T H_P (unknown): a transformation from the hand coordinate system Σ_T to the pattern coordinate system Σ_P
[0065] (3) Transformation ^P H_C (estimable): a transformation from the pattern coordinate system Σ_P to the camera coordinate system Σ_C
[0066] (4) Transformation ^C H_0 (unknown): a transformation from the camera coordinate system Σ_C to the robot coordinate system Σ_0
[0067] The parameter that associates the robot coordinate system Σ_0 with the camera coordinate system Σ_C is the transformation ^C H_0. Normally, acquiring the transformation ^C H_0 corresponds to the calibration of the camera 170.
[0068] In the calibration of the camera 170 in a first embodiment, the TCP is set as the calibration target point, and the hand coordinate system Σ_T is selected as the target coordinate system of the calibration target point. Then, a transformation
[0069] ^T H_C (= ^T H_P · ^P H_C) or ^C H_T (= ^C H_P · ^P H_T) between the hand coordinate system Σ_T and the camera coordinate system Σ_C is estimated. Since the transformation ^T H_0 (or ^0 H_T) between the hand coordinate system Σ_T and the robot coordinate system Σ_0 is calculable, if the transformation ^T H_C (or ^C H_T) between the hand coordinate system Σ_T and the camera coordinate system Σ_C can be acquired, the transformation ^C H_0 (or ^0 H_C) between the robot coordinate system Σ_0 and the camera coordinate system Σ_C is also calculable. A coordinate system other than the hand coordinate system Σ_T can be selected as the target coordinate system; any coordinate system having a known relative position and attitude with respect to the robot coordinate system Σ_0 can be selected. The case of selecting a coordinate system other than the hand coordinate system Σ_T as the target coordinate system will be explained in a second embodiment.
[0070] Among the four transformations ^0 H_T, ^T H_P, ^P H_C, and ^C H_0 described above, the transformation ^0 H_T is the transformation that connects the robot coordinate system Σ_0 with the hand coordinate system Σ_T of the TCP, which is the calibration target point. The process of acquiring the position and attitude of the TCP with respect to the robot coordinate system Σ_0 is normally referred to as forward kinematics, and is calculable once the geometric shape of the arm 160 and the movement amount (rotation angle) of each joint are determined. In other words, the transformation ^0 H_T is a calculable transformation. The transformation ^0 H_A from the robot coordinate system Σ_0 to the arm coordinate system Σ_A is fixed and known.
[0071] The transformation ^T H_P is a transformation from the hand coordinate system Σ_T to the pattern coordinate system Σ_P of the calibration pattern 400. In JP-A-2010-139329, the transformation ^T H_P is required to be a known fixed transformation, but it is assumed to be unknown in the present embodiment.
[0072] The transformation ^P H_C is a transformation from the pattern coordinate system Σ_P to the camera coordinate system Σ_C; it can be estimated by capturing an image of the calibration pattern 400 with the camera 170 and performing image processing on the image. The process of estimating the transformation ^P H_C can be executed using standard software for camera calibration (for example, the camera calibration functions of OpenCV or MATLAB).
[0073] Following the above-described transformations ^0 H_T, ^T H_P, ^P H_C, and ^C H_0 in order leads back to the initial robot coordinate system Σ_0, so the following expression is established using the identity transformation I.

^0 H_T · ^T H_P · ^P H_C · ^C H_0 = I    (5)

[0074] The following expression is acquired by multiplying both sides of Expression (5) from the left, in order, by the inverse matrices (^0 H_T)^-1, (^T H_P)^-1, and (^P H_C)^-1 of each transformation.

^C H_0 = (^P H_C)^-1 · (^T H_P)^-1 · (^0 H_T)^-1    (6)
[0075] In Expression (6), the transformation ^P H_C can be estimated using a camera calibration function, and the transformation ^0 H_T is calculable. Accordingly, if the transformation ^T H_P is known, the right side of the expression is calculable and the transformation ^C H_0 can be determined. This is the reason why the transformation ^T H_P is assumed to be known in the related art.
[0076] On the other hand, if the transformation ^T H_P is unknown, the right side of Expression (6) is not calculable, and another processing is required. For example, considering two attitudes i and j of the arm 160R, the following expressions are established.

^0 H_T(i) · ^T H_P · ^P H_C(i) · ^C H_0 = I    (7a)

^0 H_T(j) · ^T H_P · ^P H_C(j) · ^C H_0 = I    (7b)

[0077] By multiplying both Expressions (7a) and (7b) from the right side by the inverse matrix (^C H_0)^-1 of the transformation ^C H_0, the following expressions are acquired.

^0 H_T(i) · ^T H_P · ^P H_C(i) = (^C H_0)^-1    (8a)

^0 H_T(j) · ^T H_P · ^P H_C(j) = (^C H_0)^-1    (8b)
[0078] Although the right sides of Expressions (8a) and (8b) are unknown, they are the same transformation, so the following expression is established.

^0 H_T(i) · ^T H_P · ^P H_C(i) = ^0 H_T(j) · ^T H_P · ^P H_C(j)    (9)

[0079] When both sides of Expression (9) are multiplied by (^0 H_T(j))^-1 from the left and by (^P H_C(i))^-1 from the right, the following expression is acquired.

((^0 H_T(j))^-1 · ^0 H_T(i)) · ^T H_P = ^T H_P · (^P H_C(j) · (^P H_C(i))^-1)    (10)

[0080] Here, when the products of transformations in the parentheses on the left and right sides of Expression (10) are written as A and B, and the unknown transformation ^T H_P is written as X, the following equation is acquired.

AX = XB    (11)
[0081] This is well known as the AX=XB problem, and a nonlinear optimization process is required to solve for the unknown matrix X. However, there is no guarantee that the nonlinear optimization process converges to the optimal solution.
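For reference, the matrices A and B of Expression (11) could be formed from two arm attitudes with a few lines of NumPy, as in the following sketch; the variable names (H0T_i, HPC_i, and so on) are illustrative assumptions, not names used in this disclosure.

```python
import numpy as np

def form_AX_XB(H0T_i, H0T_j, HPC_i, HPC_j):
    """Form A and B of Expression (11) from two attitudes i and j (Expression (10)):
    A = (^0 H_T(j))^-1 . ^0 H_T(i),  B = ^P H_C(j) . (^P H_C(i))^-1."""
    A = np.linalg.inv(H0T_j) @ H0T_i
    B = HPC_j @ np.linalg.inv(HPC_i)
    return A, B  # the unknown X = ^T H_P satisfies A X = X B
```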
[0082] As will be described in detail below, in the first embodiment, the fact that the arm 160 provided with the calibration pattern 400 can be controlled arbitrarily is used to move the calibration pattern 400 through predetermined positions and attitudes, which makes it possible to estimate the transformation ^T H_C (= ^T H_P · ^P H_C) or ^C H_T (= ^C H_P · ^P H_T) between the hand coordinate system Σ_T, which is the target coordinate system, and the camera coordinate system Σ_C. As a result, it is possible to determine the extrinsic parameter of the camera 170.
D. Processing Procedure of Embodiment
[0084] Step S110 to step S120 are processes for determining the intrinsic parameter of the camera 170. First, in step S110, the camera 170 is used to capture images of the calibration pattern 400 in a plurality of positions and attitudes. Since these positions and attitudes are used only to determine the intrinsic parameter of the camera 170, any positions and attitudes may be applied. In step S120, the camera calibration execution unit 213 estimates the intrinsic parameter of the camera 170 using the plurality of pattern images acquired in step S110. As described above, the intrinsic parameter of the camera 170 is a parameter specific to the camera 170 and its lens system, and includes, for example, a projective transformation parameter and a distortion parameter. Estimation of the intrinsic parameter can be executed using standard software for camera calibration (for example, the camera calibration functions of OpenCV or MATLAB).
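A minimal sketch of steps S110 to S120 using OpenCV's standard calibration API is shown below; the chessboard size, square pitch, and file names are assumptions for illustration and are not specified in this disclosure.

```python
import glob
import cv2
import numpy as np

pattern_size = (9, 6)   # assumed number of inner corners of a chessboard pattern
square = 0.02           # assumed square pitch in meters

# 3D coordinates of the pattern corners in the pattern coordinate system
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

obj_points, img_points, image_size = [], [], None
for path in glob.glob("pattern_*.png"):      # images captured in step S110
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

# Step S120: intrinsic parameters (projective transformation and distortion parameters)
ret, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
```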
[0085] Steps S130 to S180 are processes for determining the extrinsic parameter of the camera 170. In step S130, the calibration pattern 400 is rotated around the three rotation axes of the hand coordinate system Σ_T, and images of the calibration pattern 400 are captured at a plurality of rotation positions in the rotation around each rotation axis. Hereinafter, an image of the calibration pattern 400 captured with the camera 170 is referred to as a pattern image.
[0086] In the rotation around each rotation axis, the pattern images are captured at a basic rotation position and at rotation positions rotated from the basic rotation position to the positive side and to the negative side around the axis.
[0087] In step S140, the transformation ^P H_C or ^C H_P between the pattern coordinate system Σ_P and the camera coordinate system Σ_C is estimated for each pattern image captured in step S130. The estimation can be executed using standard software for estimating the extrinsic parameter of a camera (for example, the OpenCV function FindExtrinsicCameraParams2), together with the intrinsic parameter acquired in step S120.
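In current OpenCV versions, cv2.solvePnP is the successor of FindExtrinsicCameraParams2; the following sketch estimates ^C H_P (the pose of the calibration pattern seen in the camera coordinate system Σ_C) for one pattern image, reusing the assumed objp, pattern_size, camera_matrix, and dist_coeffs from the previous sketch.

```python
import cv2
import numpy as np

def estimate_C_H_P(gray, objp, pattern_size, camera_matrix, dist_coeffs):
    """Estimate the homogeneous transformation ^C H_P from one pattern image."""
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None
    ok, rvec, tvec = cv2.solvePnP(objp, corners, camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)    # 3x3 rotation ^C R_P
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = tvec.ravel()       # translation ^C T_P
    return H
```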
[0088] In step S150, a rotation matrix ^C R_T or ^T R_C between the camera coordinate system Σ_C and the hand coordinate system Σ_T is estimated using the transformations ^P H_C or ^C H_P acquired in step S140. Hereinafter, the rotation around the X axis will first be described as an example.
[0089] First, the rotation matrix ^P R_C of the transformation ^P H_C acquired from the pattern image at the basic rotation position is simply written as R(θ0). In addition, the rotation matrices ^P R_C of the transformations ^P H_C acquired from the pattern images in states rotated by +θx and -θx around the X axis are written as R(θ0+θx) and R(θ0-θx), respectively. At this time, the following expressions are established.
R(θ0+θx) = R(θ0) · R(θx)    (12a)

R(θx) = R(θ0)^-1 · R(θ0+θx)    (12b)

Here, the rotation matrix R(θx) is a rotation matrix that rotates the coordinate system by +θx from the basic rotation position. As expressed in Expression (12b), the rotation matrix R(θx) can be calculated as the product of the inverse matrix R(θ0)^-1 of the rotation matrix R(θ0) at the basic rotation position and the rotation matrix R(θ0+θx) at the position rotated by +θx from the basic rotation position.
[0090] In general, a rotation of a coordinate system around three axes is expressed in many cases as a rotation matrix or as three Euler angles; instead, the rotation can also be expressed by one rotation axis and a rotation angle around that rotation axis. When the latter expression is used, the rotation matrix R(θx) can be transformed into a rotation vector Rod(θx) = θx · (n_x, n_y, n_z)^T, where n_x^2 + n_y^2 + n_z^2 = 1. Here, n_x, n_y, and n_z are the three axis components indicating the direction of the rotation axis. In other words, the rotation vector Rod is a vector having the rotation axis direction as its vector direction and the rotation angle as its vector length. The transformation from the rotation matrix R(θx) to the rotation vector Rod(θx) can be performed using, for example, the OpenCV function Rodrigues2.
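As an illustration of Expression (12b) and the rotation-vector form above, the relative rotation and its rotation vector could be computed as in the following sketch; cv2.Rodrigues is the current OpenCV counterpart of Rodrigues2, and the argument names are assumptions.

```python
import cv2
import numpy as np

def rotation_vector(R_theta0, R_theta0_plus_x):
    """Compute R(theta_x) = R(theta_0)^-1 . R(theta_0 + theta_x) (Expression (12b))
    and convert it to a rotation vector Rod(theta_x)."""
    R_x = np.linalg.inv(R_theta0) @ R_theta0_plus_x
    rod, _ = cv2.Rodrigues(R_x)          # 3x1 rotation vector
    rod = rod.ravel()
    angle = np.linalg.norm(rod)          # rotation angle = vector length
    axis = rod / angle                   # normalized rotation axis direction
    return rod, axis, angle
```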
[0091] As described above, the rotation matrix R(θx) is a matrix representing the fact that the coordinate system is rotated by +θx around the X axis of the hand coordinate system Σ_T from the basic rotation position. Accordingly, the vector direction of the rotation vector Rod(θx) equivalent to the rotation matrix R(θx) indicates the rotation axis direction, that is, the X axis direction of the hand coordinate system Σ_T seen in the camera coordinate system Σ_C.
[0092] Here, consider the rotation matrix ^C R_T from the camera coordinate system Σ_C to the hand coordinate system Σ_T. As described as the rotation matrix R property 2 with respect to the general homogeneous transformation matrix of Expression (1), the three column components R_x, R_y, and R_z of an arbitrary rotation matrix R are the three basis vectors of the rotated coordinate system seen from the original coordinate system. Accordingly, the normalized rotation vector Rod*(θx) acquired by normalizing the length of the above-described rotation vector Rod(θx) to 1 is the X component (leftmost column component) of the rotation matrix ^C R_T from the camera coordinate system Σ_C to the hand coordinate system Σ_T.
[0093] By performing the same process for the Y axis and the Z axis, the three column components Rod*(θx), Rod*(θy), and Rod*(θz) of the rotation matrix ^C R_T from the camera coordinate system Σ_C to the hand coordinate system Σ_T can be acquired.

^C R_T = (Rod*(θx)  Rod*(θy)  Rod*(θz))    (15)

[0094] The inverse transformation ^T R_C of the rotation matrix ^C R_T is equal to the transposed matrix of the rotation matrix ^C R_T. Therefore, if the normalized rotation vectors Rod*(θx), Rod*(θy), and Rod*(θz) are arranged as row components instead of column components, the rotation matrix ^T R_C from the hand coordinate system Σ_T to the camera coordinate system Σ_C can be acquired directly.
[0095] In this way, in step S150, the three rotation vectors Rod(θx), Rod(θy), and Rod(θz), each having the direction of its rotation axis as the vector direction and the rotation angle as the vector length, are estimated from the pattern images captured at the plurality of rotation positions in the rotation around each rotation axis of the hand coordinate system Σ_T, which is the target coordinate system. By arranging the components of the normalized rotation vectors Rod*(θx), Rod*(θy), and Rod*(θz), acquired by normalizing these rotation vectors, as row components or column components, it is possible to determine the rotation matrix ^C R_T or ^T R_C constituting the coordinate transformation matrix ^C H_T or ^T H_C between the hand coordinate system Σ_T and the camera coordinate system Σ_C.
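A sketch of this assembly step is shown below, assuming the three normalized rotation vectors have already been estimated as unit-length NumPy arrays.

```python
import numpy as np

def assemble_rotation(rod_x_unit, rod_y_unit, rod_z_unit):
    """Arrange the normalized rotation vectors as columns to form ^C R_T
    (Expression (15)); its transpose gives ^T R_C."""
    C_R_T = np.column_stack([rod_x_unit, rod_y_unit, rod_z_unit])
    T_R_C = C_R_T.T    # rotation matrix property 1: inverse equals transpose
    return C_R_T, T_R_C
```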
[0096] There is a possibility that a detection error is included in the result of the process in step S150.
[0097] Further, there is a possibility that the rotation matrix ^C R_T acquired by the above-described process does not have orthonormality. In this case, it is preferable to orthogonalize the columns of the rotation matrix using some kind of orthogonalization means (for example, the Gram-Schmidt orthogonalization method). It is preferable to select the axis orthogonal to the image plane (the Z axis in the illustrated example) as the reference axis of the orthogonalization.
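The following is a minimal sketch of such an orthonormalization (plain Gram-Schmidt, keeping the Z column fixed and rebuilding X to preserve a right-handed frame); the column order and the use of Gram-Schmidt are assumptions rather than requirements of this disclosure.

```python
import numpy as np

def orthonormalize_keep_z(R):
    """Gram-Schmidt style orthonormalization of an estimated rotation matrix,
    keeping the Z column direction fixed."""
    y, z = R[:, 1], R[:, 2]
    z = z / np.linalg.norm(z)          # keep the Z axis direction as estimated
    y = y - (y @ z) * z                # remove the Z component from Y
    y = y / np.linalg.norm(y)
    x = np.cross(y, z)                 # X completes the right-handed frame
    return np.column_stack([x, y, z])
```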
[0098] The rotation angles θx, θy, and θz around the X, Y, and Z axes are known in advance. Therefore, in a case where the difference between a rotation angle detected in the above-described process and the known rotation angle exceeds an allowable range determined in consideration of the detection error, it may be determined that the processing result is abnormal.
[0099] In step S160, the rotation matrix ^T R_P or ^P R_T between the hand coordinate system Σ_T and the pattern coordinate system Σ_P is calculated. In step S140 described above, the transformation ^P H_C or ^C H_P between the pattern coordinate system Σ_P and the camera coordinate system Σ_C is estimated for each pattern image, so the rotation matrix ^P R_C or ^C R_P constituting that transformation is already known. For example, the rotation matrix ^T R_P between the hand coordinate system Σ_T and the pattern coordinate system Σ_P can be calculated with the following expression, using the rotation matrix ^C R_P estimated at a specific rotation position (for example, the basic rotation position) and the rotation matrix ^T R_C acquired in step S150.

^T R_P = ^T R_C · ^C R_P    (16)
[0101] In step S170, a translation vector ^T T_P or ^P T_T between the hand coordinate system Σ_T and the pattern coordinate system Σ_P is estimated. Here, first, consider the case where the calibration pattern 400 is rotated around the X axis of the hand coordinate system Σ_T.
[0102] When the calibration pattern 400 is rotated around the X axis of the hand coordinate system Σ_T, the square sum r_x^2 = T_y^2 + T_z^2 of the two components of the translation vector in the two coordinate axis directions orthogonal to the X axis can be estimated from the pattern images captured at the plurality of rotation positions (Expressions (17a) to (17c)).
[0103] Expressions similar to Expressions (17a) to (17c) are also established for the rotation around the Y axis and the rotation around the Z axis, and are given as below.

r_x^2 = T_y^2 + T_z^2    (18a)

r_y^2 = T_z^2 + T_x^2    (18b)

r_z^2 = T_x^2 + T_y^2    (18c)

[0104] When Expressions (18a) to (18c) are rearranged, the following expression can be acquired.

T_x^2 = (r_y^2 + r_z^2 - r_x^2)/2,  T_y^2 = (r_z^2 + r_x^2 - r_y^2)/2,  T_z^2 = (r_x^2 + r_y^2 - r_z^2)/2    (19)
[0106] In step S160 described above, the rotation matrix ^T R_P or ^P R_T between the hand coordinate system Σ_T and the pattern coordinate system Σ_P is acquired. If the translation vector ^T T_P from the hand coordinate system Σ_T to the pattern coordinate system Σ_P can be estimated by the process in step S170 described above, the translation vector ^P T_T from the pattern coordinate system Σ_P to the hand coordinate system Σ_T can be calculated with Expression (2) described above.
[0107] In this way, in step S170, the square sums r_x^2, r_y^2, and r_z^2 of the two translation vector components in the two coordinate axis directions orthogonal to each rotation axis, among the three components T_x, T_y, and T_z of the translation vector ^P T_T or ^T T_P constituting the transformation matrix ^P H_T or ^T H_P between the pattern coordinate system Σ_P and the hand coordinate system Σ_T, can be estimated from the pattern images captured at the plurality of rotation positions in the rotation around each rotation axis of the hand coordinate system Σ_T, which is the target coordinate system. In addition, the translation vector ^P T_T or ^T T_P constituting the transformation matrix ^P H_T or ^T H_P can be calculated from the square sums r_x^2, r_y^2, and r_z^2 of the translation vector components estimated respectively for the three rotation axes.
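The following sketch is an assumed reconstruction of this step: it recovers the magnitudes of the translation vector components from the three square sums r_x^2, r_y^2, and r_z^2; the sign of each component is not determined by the square sums alone and is assumed here to be resolved separately.

```python
import numpy as np

def translation_magnitudes(rx2, ry2, rz2):
    """Solve Expression (19): recover |T_x|, |T_y|, |T_z| from the square sums
    r_x^2 = T_y^2 + T_z^2, r_y^2 = T_z^2 + T_x^2, r_z^2 = T_x^2 + T_y^2."""
    Tx2 = max((ry2 + rz2 - rx2) / 2.0, 0.0)   # clamp small negatives from noise
    Ty2 = max((rz2 + rx2 - ry2) / 2.0, 0.0)
    Tz2 = max((rx2 + ry2 - rz2) / 2.0, 0.0)
    return np.sqrt([Tx2, Ty2, Tz2])
```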
[0109] In step S180, a translation vector ^C T_T or ^T T_C between the camera coordinate system Σ_C and the hand coordinate system Σ_T is calculated from the transformation matrix ^C H_P or ^P H_C estimated at a specific rotation position (for example, the basic rotation position) in step S140 and the translation vector ^P T_T or ^T T_P acquired in step S170. For example, the translation vector ^C T_T from the camera coordinate system Σ_C to the hand coordinate system Σ_T can be calculated by the following expression.

^C T_T = ^C H_P · ^P T_T

Here, ^C H_P is the homogeneous transformation matrix estimated from the pattern image at the specific rotation position (for example, the basic rotation position) in step S140, ^P T_T is the translation vector acquired in step S170, and the product is evaluated with ^P T_T expressed in homogeneous coordinates. The translation vector ^T T_C from the hand coordinate system Σ_T to the camera coordinate system Σ_C can also be calculated with a similar expression.
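A sketch of this final step under the same assumed names is shown below: the point ^P T_T is transformed by ^C H_P into ^C T_T, and the result ^C H_T is assembled from the rotation of step S150 and this translation.

```python
import numpy as np

def assemble_C_H_T(C_H_P_basic, P_T_T, C_R_T):
    """Apply ^C H_P (basic rotation position) to the point ^P T_T to get ^C T_T,
    then combine it with ^C R_T into the homogeneous transformation ^C H_T."""
    p = np.append(P_T_T, 1.0)            # ^P T_T in homogeneous coordinates
    C_T_T = (C_H_P_basic @ p)[:3]        # origin of the hand frame seen in the camera frame
    C_H_T = np.eye(4)
    C_H_T[:3, :3] = C_R_T
    C_H_T[:3, 3] = C_T_T
    return C_H_T
```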
[0110] By the processes in steps S130 to S180 described above, the coordinate transformation matrix ^C H_T or ^T H_C between the camera coordinate system Σ_C and the hand coordinate system Σ_T, that is, the extrinsic parameter of the camera 170, is determined.
[0111] In the present embodiment, the three rotation axes X, Y, and Z are set around the origin point of the hand coordinate system Σ_T, which is the target coordinate system, and the arm 160 is operated so as to rotate the calibration pattern 400 around each rotation axis and to stop at a plurality of rotation positions. The pattern images of the calibration pattern 400 at the plurality of rotation positions in the rotation around each rotation axis are captured by the camera 170, and the coordinate transformation matrix ^T H_C or ^C H_T between the hand coordinate system Σ_T and the camera coordinate system Σ_C can be estimated using these pattern images. In this processing procedure, the directions of the three rotation axes seen in the camera coordinate system Σ_C can be estimated using the pattern images at the plurality of rotation positions around each rotation axis. In addition, since the three rotation axes X, Y, and Z are linearly independent of each other, the coordinate transformation matrix ^T H_C or ^C H_T between the hand coordinate system Σ_T and the camera coordinate system Σ_C can be determined from the directions of these rotation axes. As a result, an extrinsic parameter for calculating the coordinate transformation between the hand coordinate system Σ_T and the camera coordinate system Σ_C is acquired, and it thereby becomes possible to detect the position of a target using the camera 170.
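Once ^C H_T is available, the extrinsic parameter relating the camera to the robot coordinate system follows from the forward kinematics, since ^C H_0 = ^C H_T · (^0 H_T)^-1; a one-line sketch with assumed variable names:

```python
import numpy as np

def camera_to_robot(C_H_T, O_H_T):
    """^C H_0 = ^C H_T . ^T H_0, where ^T H_0 is the inverse of the forward
    kinematics result ^0 H_T."""
    return C_H_T @ np.linalg.inv(O_H_T)
```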
[0112] In the above-described embodiment, the X axis, the Y axis, and the Z axis are selected as the rotation axes around the origin point of the hand coordinate system Σ_T, but any three rotation axes can be selected as long as they are linearly independent. In the case of using three rotation axes other than the X, Y, and Z axes, the estimation result for each axis may be transformed into the X-, Y-, and Z-axis components of the hand coordinate system Σ_T. However, if the directions of the three basis vectors of the hand coordinate system Σ_T (the X, Y, and Z axes) are selected as the rotation axes, there is an advantage that the above-described process is easier to perform. The three rotation axes need not be set around the origin point of the hand coordinate system Σ_T, which is the target coordinate system, and may be set at other positions. If the three rotation axes are set around the origin point of the target coordinate system, the correspondence relation between the three rotation axes and the target coordinate system is simple, so there is an advantage that the coordinate transformation matrix between the target coordinate system and the camera coordinate system can be easily determined from the directions of the rotation axes seen in the camera coordinate system.
[0113] In the above-described embodiment, the calibration pattern is rotated from the basic rotation position to both the positive side and the negative side, but it may be rotated to only one side in the rotation around each rotation axis. If it is rotated from the basic rotation position to both the positive side and the negative side, the above-described process is easier to perform. It is also preferable that the value of the rotation angle on the positive side be equal to the value on the negative side.
[0114] In a second embodiment, a target coordinate system Σ_t having a known relative position and attitude with respect to the hand coordinate system Σ_T is set at a position different from the hand coordinate system Σ_T, and the camera calibration is performed for this target coordinate system Σ_t.
[0115] In this way, by setting the calibration target coordinate system Σ_t at a position different from the hand coordinate system Σ_T, it is possible to improve the detection accuracy of an object by the camera 170 in the vicinity of the target coordinate system Σ_t. For example, there are cases where the physically large hand 180 does not fit into a small working space. On the other hand, the target coordinate system Σ_t can be set at a position apart from the hand 180, such as inside such a small working space.
[0116] The calibration process of the camera 170 in this case is a process of determining an extrinsic parameter for calculating the coordinate transformation between the target coordinate system Σ_t, which has a known relative position and attitude with respect to the robot coordinate system Σ_0, and the camera coordinate system Σ_C. The coordinate transformation matrix ^C H_t (or ^t H_C) between the target coordinate system Σ_t and the camera coordinate system Σ_C is represented by the product of a first transformation matrix ^C H_P (or ^P H_C) between the camera coordinate system Σ_C and the pattern coordinate system Σ_P and a second transformation matrix ^P H_t (or ^t H_P) between the pattern coordinate system Σ_P and the target coordinate system Σ_t. At this time, the process in step S140 and the subsequent steps described above can be executed with the target coordinate system Σ_t in place of the hand coordinate system Σ_T.
[0117] In the above-described embodiments, the calibration related to the camera 170 provided in the head portion 150 of the robot 100 is explained. However, the invention can also be applied to the calibration of a camera installed in the robot at a place other than the head portion 150, or to a camera installed separately from the robot 100. The invention can be applied not only to a double-arm robot but also to a single-arm robot.
[0118] The invention is not limited to the above-described embodiments, examples, and modifications, and can be realized in various configurations without departing from the spirit thereof. For example, the technical features in the embodiments, examples, and modifications corresponding to the technical features in each aspect described in the summary of the invention section can be replaced or combined as necessary in order to solve some or all of the above-mentioned problems or to achieve some or all of the above effects. Unless a technical feature is described as essential in the present specification, it can be deleted as appropriate.
[0119] The entire disclosure of Japanese Patent Application No. 2017-135108, filed Jul. 11, 2017 is expressly incorporated by reference herein.