Robot Control Device, Robot, Robot System, And Calibration Method Of Camera
20190015989 · 2019-01-17
CPC classification: G05B2219/39057 (PHYSICS); G05B2219/39008 (PHYSICS)
Abstract
A robot control device includes a processor that creates a parameter of a camera including a coordinate transformation matrix between a hand coordinate system of an arm and a camera coordinate system of the camera. The processor calculates a relationship between an arm coordinate system and a pattern coordinate system at the time of capturing a pattern image of a calibration pattern, and estimates the coordinate transformation matrix between the hand coordinate system of the arm and the camera coordinate system of the camera using that relationship, the position and attitude of the arm at the time of capturing the pattern image, and the pattern image.
Claims
1. A control device that controls a robot having an arm on which a camera is installed, comprising: a processor that is configured to execute computer-executable instructions so as to control the robot, wherein the processor is configured to: cause the camera to capture a pattern image of a calibration pattern of the camera, calculate a relationship between an arm coordinate system of the arm and a pattern coordinate system of the calibration pattern at the time of capturing the pattern image, and estimate a coordinate transformation matrix between a hand coordinate system of the arm and a camera coordinate system of the camera using the relationship between the arm coordinate system and the pattern coordinate system, a position and attitude of the arm at the time of capturing the pattern image, and the pattern image.
2. The control device according to claim 1, wherein the processor calculates a first transformation matrix between the arm coordinate system and the hand coordinate system from the position and attitude of the arm at the time of capturing the pattern image, calculates or estimates a second transformation matrix between the pattern coordinate system and the arm coordinate system, estimates a third transformation matrix between the camera coordinate system and the pattern coordinate system from the pattern image, and calculates the coordinate transformation matrix from the first transformation matrix, the second transformation matrix, and the third transformation matrix.
3. The control device according to claim 2, wherein the robot has a second arm provided with the calibration pattern set in a predetermined installation state, and wherein the processor calculates the second transformation matrix between the pattern coordinate system and the arm coordinate system from a position and attitude of the second arm at the time of capturing the pattern image.
4. The control device according to claim 2, wherein the processor causes a fixed camera disposed independently of the arm to capture a second pattern image of the calibration pattern, and wherein the processor estimates the second transformation matrix between the pattern coordinate system and the arm coordinate system from the second pattern image.
5. The control device according to claim 4, wherein the fixed camera is a stereo camera.
6. A robot connected to the control device according to claim 1.
7. A robot connected to the control device according to claim 2.
8. A robot connected to the control device according to claim 3.
9. A robot connected to the control device according to claim 4.
10. A robot connected to the control device according to claim 5.
11. A robot system comprising: a robot; and the control device according to claim 1, the control device being connected to the robot.
12. A robot system comprising: a robot; and the control device according to claim 2, the control device being connected to the robot.
13. A robot system comprising: a robot; and the control device according to claim 3, the control device being connected to the robot.
14. A robot system comprising: a robot; and the control device according to claim 4, the control device being connected to the robot.
15. A robot system comprising: a robot; and the control device according to claim 5, the control device being connected to the robot.
16. A robot system comprising: the robot according to claim 6; and the control device connected to the robot.
17. A calibration method of a camera for a robot having an arm on which the camera is installed, comprising: causing the camera to capture a pattern image of a calibration pattern of the camera; calculating a relationship between an arm coordinate system of the arm and a pattern coordinate system of the calibration pattern at the time of capturing the pattern image; and estimating a coordinate transformation matrix between a hand coordinate system of the arm and a camera coordinate system of the camera with a position and attitude of the arm at the time of capturing the pattern image and the pattern image.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
DESCRIPTION OF EXEMPLARY EMBODIMENTS
A. Configuration of Robot System
[0034] The robot 100 is provided with a base 110, a body portion 120, a shoulder portion 130, a neck portion 140, a head portion 150, and two arms 160L and 160R. Hands 180L and 180R are detachably attached to the arms 160L and 160R. These hands 180L and 180R are end effectors for holding a workpiece or a tool. Cameras 170L and 170R are installed in the head portion 150. These cameras 170L and 170R are provided independently of the arms 160L and 160R, and are fixed cameras whose position and attitude do not change. Hand eyes 175L and 175R, which are cameras, are provided in the wrist portions of the arms 160L and 160R. A calibration pattern 400 for the cameras 170L and 170R and the hand eyes 175L and 175R can be installed on the arms 160L and 160R. Hereinafter, to distinguish them from the hand eyes 175L and 175R, the cameras 170L and 170R provided in the head portion 150 are referred to as fixed cameras 170L and 170R.
[0035] Force sensors 190L and 190R are provided in the wrist portions of the arms 160L and 160R. The force sensors 190L and 190R detect a reaction force or a moment with respect to a force that the hands 180L and 180R exert on the workpiece. As the force sensors 190L and 190R, it is possible to use, for example, a six-axis force sensor capable of simultaneously detecting six components: the force components in the three translational axis directions and the moment components around the three rotation axes. The force sensors 190L and 190R are optional.
[0036] The letters L and R appended to the end of the symbols of the arms 160L and 160R, the cameras 170L and 170R, the hand eyes 175L and 175R, the hands 180L and 180R, and the force sensors 190L and 190R denote left and right, respectively. In cases where this distinction is unnecessary, the symbols are used without the letters L and R.
[0037] The control device 200 includes a processor 210, a main memory 220, a non-volatile memory 230, a display control unit 240, a display 250, and an I/O interface 260. These units are connected via a bus. The processor 210 is, for example, a microprocessor or a processor circuit. The control device 200 is connected to the robot 100 via the I/O interface 260. The control device 200 may be housed inside the robot 100.
[0038] As the configuration of the control device 200, various configurations other than the illustrated configuration can be adopted.
B. Robot Coordinate System and Coordinate Transformation
[0041] A tool center point (TCP) is set at an end of the arm 160. Typically, control of the robot 100 is executed to control the position and attitude of the tool center point TCP. A position and attitude means a state defined by three coordinate values in a three-dimensional coordinate system and rotations around the respective coordinate axes.
[0042] In the arms 160L and 160R, the calibration pattern 400 can be set in a predetermined installation state.
[0043] The calibration of the hand eye 175L is a process for estimating an intrinsic parameter and an extrinsic parameter of the hand eye 175L. The intrinsic parameter is a parameter specific to the hand eye 175L and its lens system, and includes, for example, a projective transformation parameter, a distortion parameter, and the like. The extrinsic parameter is a parameter used when calculating the relative position and attitude between the hand eye 175L and the arm 160L of the robot 100, and is a parameter expressing translation and rotation between a hand coordinate system Σ_T1 of the arm 160L and a hand eye coordinate system Σ_E. The extrinsic parameter can also be configured as a parameter expressing translation and rotation between the hand eye coordinate system Σ_E and a target coordinate system other than the hand coordinate system Σ_T1. The target coordinate system may be any coordinate system obtainable from a robot coordinate system Σ_0. For example, a coordinate system having a fixed, known relative position and attitude with respect to the robot coordinate system Σ_0, or a coordinate system whose relative position and attitude with respect to the robot coordinate system Σ_0 is determined according to the movement amounts of the joints of the arm 160L, may be selected as the target coordinate system. The extrinsic parameter corresponds to a parameter of a camera including a coordinate transformation matrix between a hand coordinate system of an arm and a camera coordinate system of a camera.
[0050] The arm coordinate systems Σ_A1 and Σ_A2 and the hand coordinate systems Σ_T1 and Σ_T2 are individually set for the left arm 160L and the right arm 160R. Hereinafter, the coordinate systems related to the left arm 160L are referred to as the first arm coordinate system Σ_A1 and the first hand coordinate system Σ_T1, and the coordinate systems related to the right arm 160R are referred to as the second arm coordinate system Σ_A2 and the second hand coordinate system Σ_T2. The relative positions and attitudes of the arm coordinate systems Σ_A1 and Σ_A2 with respect to the robot coordinate system Σ_0 are known. A hand eye coordinate system Σ_E is also individually set for each of the hand eyes 175L and 175R. In the description below, the hand eye 175L of the left arm 160L is set as the calibration target, and the coordinate system of the hand eye 175L of the left arm 160L is therefore used as the hand eye coordinate system Σ_E.
[0051] In general, a transformation from a certain coordinate system Σ_A to another coordinate system Σ_B, or a transformation of position and attitude in these coordinate systems, can be expressed as a homogeneous transformation matrix ^AH_B illustrated below.

^AH_B = | R  T | = | R_x R_y R_z T |
        | 0  1 |   |  0   0   0  1 |   (1)

Here, R represents a rotation matrix, T represents a translation vector, and R_x, R_y, and R_z represent the column components of the rotation matrix R. Hereinafter, the homogeneous transformation matrix ^AH_B is also referred to as the coordinate transformation matrix ^AH_B, the transformation matrix ^AH_B, or simply the transformation ^AH_B. The superscript A on the left side of the transformation symbol ^AH_B indicates the coordinate system before the transformation, and the subscript B on the right side of the transformation symbol ^AH_B indicates the coordinate system after the transformation. The transformation ^AH_B can also be regarded as expressing the origin position and the basis vector components of the coordinate system Σ_B seen from the coordinate system Σ_A.
[0052] An inverse matrix ^AH_B⁻¹ (= ^BH_A) of the transformation ^AH_B is given by the following expression.

^AH_B⁻¹ = | R^T  −R^T·T |
          |  0      1   |   (2)
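As a concrete illustration of Expressions (1) and (2), the homogeneous transformation matrix and its closed-form inverse can be sketched in Python with NumPy (the library choice and the numeric values below are illustrative assumptions, not part of the specification):

```python
import numpy as np

def make_transform(R, T):
    """Assemble a 4x4 homogeneous transformation matrix ^AH_B from a
    3x3 rotation matrix R and a translation vector T (Expression (1))."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = T
    return H

def invert_transform(H):
    """Closed-form inverse of Expression (2): [R^T, -R^T*T; 0, 1]."""
    R, T = H[:3, :3], H[:3, 3]
    return make_transform(R.T, -R.T @ T)

# Example: a 90-degree rotation around z plus a translation.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
H = make_transform(Rz, np.array([1.0, 2.0, 3.0]))

# The closed-form inverse agrees with the generic matrix inverse.
assert np.allclose(invert_transform(H), np.linalg.inv(H))
assert np.allclose(invert_transform(H) @ H, np.eye(4))
```

The closed form exploits Property 1 below (R⁻¹ = R^T), avoiding a general 4×4 inversion.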
[0053] The rotation matrix R has the following important properties.
Rotation Matrix R Property 1
[0054] The rotation matrix R is an orthonormal matrix, and its inverse matrix R⁻¹ is equal to its transposed matrix R^T.
Rotation Matrix R Property 2
[0055] The three column components R_x, R_y, and R_z of the rotation matrix R are equal to the three basis vector components of the coordinate system Σ_B after rotation, seen from the original coordinate system Σ_A.
[0056] In a case where the transformations ^AH_B and ^BH_C are applied sequentially to a certain coordinate system Σ_A, the combined transformation ^AH_C is acquired by multiplying the transformations ^AH_B and ^BH_C sequentially to the right.
^AH_C = ^AH_B · ^BH_C (3)
[0057] The same relationship as Expression (3) holds for the rotation matrix R.
^AR_C = ^AR_B · ^BR_C (4)
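The composition rules in Expressions (3) and (4) can be checked numerically; the sketch below (Python with NumPy, illustrative values only) also shows the practical meaning: the combined transform maps point coordinates from Σ_C directly to Σ_A.

```python
import numpy as np

def make_transform(R, T):
    """4x4 homogeneous transform from rotation R (3x3) and translation T."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = T
    return H

def rot_z(theta):
    """Rotation matrix for an angle theta around the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

AHB = make_transform(rot_z(0.3), np.array([1.0, 0.0, 0.0]))
BHC = make_transform(rot_z(0.5), np.array([0.0, 2.0, 0.0]))

# Expression (3): the combined transformation is the right-multiplied product.
AHC = AHB @ BHC

# Expression (4): the rotation blocks compose in exactly the same way.
assert np.allclose(AHC[:3, :3], AHB[:3, :3] @ BHC[:3, :3])

# A point given in homogeneous coordinates in Sigma_C maps to Sigma_A either
# in one step with AHC or in two steps via Sigma_B -- the results agree.
p_C = np.array([1.0, 1.0, 1.0, 1.0])
assert np.allclose(AHC @ p_C, AHB @ (BHC @ p_C))
```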
C. AX=XB Problem of Coordinate Transformation
[0058] Four transformations are considered here: the transformation ^A1H_T1, the transformation ^T1H_E, the transformation ^EH_P, and the transformation ^PH_A1.
[0063] Among the above-described four transformations ^A1H_T1, ^T1H_E, ^EH_P, and ^PH_A1, the transformation ^A1H_T1 is the transformation from the first arm coordinate system Σ_A1 to the first hand coordinate system Σ_T1. The first hand coordinate system Σ_T1 indicates the position and attitude of the TCP of the first arm 160L. The process of acquiring the position and attitude of the TCP with respect to the first arm coordinate system Σ_A1 is generally referred to as forward kinematics, and is calculable if the geometric shape of the arm 160L and the movement amount (rotation angle) of each joint are determined. In other words, the transformation ^A1H_T1 is a calculable transformation.
[0064] The transformation ^T1H_E is the transformation from the first hand coordinate system Σ_T1 to the hand eye coordinate system Σ_E. The transformation ^T1H_E is unknown, and acquiring the transformation ^T1H_E corresponds to the calibration of the hand eye 175.
[0065] The transformation ^EH_P is the transformation from the hand eye coordinate system Σ_E to the pattern coordinate system Σ_P, and can be estimated by capturing an image of the calibration pattern 400 with the hand eye 175 and performing image processing on the image. The process of estimating the transformation ^EH_P can be executed using standard software for performing camera calibration (for example, the camera calibration functions of OpenCV or MATLAB).
[0066] The transformation ^PH_A1 is the transformation from the pattern coordinate system Σ_P to the first arm coordinate system Σ_A1. The transformation ^PH_A1 is unknown.
[0067] Following the above-described transformations ^A1H_T1, ^T1H_E, ^EH_P, and ^PH_A1 in order leads back to the initial first arm coordinate system Σ_A1, so the following expression is established using an identity transformation I.
^A1H_T1 · ^T1H_E · ^EH_P · ^PH_A1 = I (5)
[0068] The following expression can be acquired by multiplying both sides of Expression (5) by the inverse matrices ^A1H_T1⁻¹, ^T1H_E⁻¹, and ^EH_P⁻¹ of the respective transformations, in order, from the left.
^PH_A1 = ^EH_P⁻¹ · ^T1H_E⁻¹ · ^A1H_T1⁻¹ (6)
[0069] In Expression (6), the transformation ^EH_P can be estimated with the camera calibration function, and the transformation ^A1H_T1 is calculable. Accordingly, if the transformation ^T1H_E is known, the right side is calculable, and the transformation ^PH_A1 on the left side can be determined.
[0070] On the other hand, if the transformation ^T1H_E is unknown, the right side of Expression (6) is not calculable, and different processing is required. For example, considering two attitudes i and j of the left arm 160L, the following expressions are established.
^A1H_T1(i) · ^T1H_E · ^EH_P(i) · ^PH_A1 = I (7a)
^A1H_T1(j) · ^T1H_E · ^EH_P(j) · ^PH_A1 = I (7b)
[0071] The following expressions are acquired by multiplying both sides of each of Expressions (7a) and (7b) by the inverse matrix ^PH_A1⁻¹ of the transformation ^PH_A1 from the right.
^A1H_T1(i) · ^T1H_E · ^EH_P(i) = ^PH_A1⁻¹ (8a)
^A1H_T1(j) · ^T1H_E · ^EH_P(j) = ^PH_A1⁻¹ (8b)
[0072] Although the right sides of Expressions (8a) and (8b) are unknown, since both equal the same transformation, the following expression is established.
^A1H_T1(i) · ^T1H_E · ^EH_P(i) = ^A1H_T1(j) · ^T1H_E · ^EH_P(j) (9)
[0073] Multiplying both sides of Expression (9) by ^A1H_T1(j)⁻¹ from the left and by ^EH_P(i)⁻¹ from the right yields the following expression.
(^A1H_T1(j)⁻¹ · ^A1H_T1(i)) · ^T1H_E = ^T1H_E · (^EH_P(j) · ^EH_P(i)⁻¹) (10)
[0074] Here, when the products of transformations in the parentheses on the left and right sides of Expression (10) are written as A and B, and the unknown transformation ^T1H_E as X, the following equation is acquired.
AX=XB (11)
[0075] This is well known as the AX=XB problem, and a nonlinear optimization process is required to solve for the unknown matrix X. However, there is no guarantee that the nonlinear optimization process will converge to an optimal solution.
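The AX=XB structure of Expressions (10) and (11) can be verified numerically. The sketch below (Python with NumPy; all transform values are hypothetical, chosen only so the kinematic loop of Expression (7) closes) simulates two arm attitudes with a known ground-truth hand-eye transform X and confirms that the constructed A and B satisfy AX = XB. It verifies the identity; it is not a solver for X.

```python
import numpy as np

def make_transform(R, T):
    """4x4 homogeneous transform from rotation R (3x3) and translation T."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = T
    return H

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical ground-truth hand-eye transform X = ^T1H_E.
X = make_transform(rot_z(0.2), np.array([0.01, 0.02, 0.05]))

# Hypothetical fixed pattern-to-arm transform ^PH_A1.
PHA1 = make_transform(rot_z(-0.7), np.array([0.3, 0.1, 0.0]))

def observed_EHP(A1HT1):
    """^EH_P implied by the closed loop of Expression (7) for one attitude."""
    return np.linalg.inv(A1HT1 @ X) @ np.linalg.inv(PHA1)

# Two arm attitudes i and j (Expressions (7a) and (7b)).
A1HT1_i = make_transform(rot_z(0.4), np.array([0.5, 0.0, 0.3]))
A1HT1_j = make_transform(rot_z(1.1), np.array([0.4, 0.2, 0.3]))
EHP_i, EHP_j = observed_EHP(A1HT1_i), observed_EHP(A1HT1_j)

A = np.linalg.inv(A1HT1_j) @ A1HT1_i   # left parenthesis of Expression (10)
B = EHP_j @ np.linalg.inv(EHP_i)       # right parenthesis of Expression (10)
assert np.allclose(A @ X, X @ B)       # Expression (11): AX = XB
```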
[0076] As will be described in detail below, in the first embodiment, the relationship between the second arm coordinate system Σ_A2 and the pattern coordinate system Σ_P is calculated from the position and attitude of the second arm 160R, using the fact that the second arm 160R provided with the calibration pattern 400 can be controlled arbitrarily. This makes it possible to estimate the transformation ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E, and as a result, to determine the extrinsic parameter of the hand eye 175.
[0077] To perform such a process, in the first embodiment, the following transformations are used in addition to the above-described transformations ^A1H_T1, ^T1H_E, ^EH_P, and ^PH_A1.
[0078] (5) Transformation ^A1H_A2 (known): a transformation from the first arm coordinate system Σ_A1 to the second arm coordinate system Σ_A2
[0079] (6) Transformation ^A2H_T2 (calculable): a transformation from the second arm coordinate system Σ_A2 to the second hand coordinate system Σ_T2
[0080] (7) Transformation ^T2H_P (known): a transformation from the second hand coordinate system Σ_T2 to the pattern coordinate system Σ_P
[0081] The transformation ^T2H_P from the second hand coordinate system Σ_T2 to the pattern coordinate system Σ_P is assumed to be known. If the tool (for example, a flange) for installing the calibration pattern 400 on the wrist portion of the arm 160R is designed and manufactured with high accuracy, it is possible to determine the transformation ^T2H_P from the design data. Alternatively, an image of the calibration pattern 400 installed on the wrist portion of the arm 160R may be captured with the fixed camera 170, a transformation ^CH_P between a camera coordinate system Σ_C and the pattern coordinate system Σ_P may be estimated from that pattern image, and the transformation ^T2H_P from the second hand coordinate system Σ_T2 to the pattern coordinate system Σ_P may be acquired using the transformation ^CH_P.
D. Processing Procedure of First Embodiment
[0083] Step S110 and step S120 are processes for determining the intrinsic parameter of the hand eye 175. First, in step S110, images of the calibration pattern 400 are captured at a plurality of positions and attitudes using the hand eye 175. Since these positions and attitudes serve only to determine the intrinsic parameter of the hand eye 175, any positions and attitudes can be used. Hereinafter, an image acquired by capturing the calibration pattern 400 with the hand eye 175 is referred to as a pattern image. In step S120, the camera calibration execution unit 213 estimates the intrinsic parameter of the hand eye 175 using the plurality of pattern images acquired in step S110. As described above, the intrinsic parameter of the hand eye 175 is a parameter specific to the hand eye 175 and its lens system, and includes, for example, a projective transformation parameter, a distortion parameter, and the like. Estimation of the intrinsic parameter can be executed using standard software for performing camera calibration (for example, the camera calibration functions of OpenCV or MATLAB).
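The projective transformation parameter mentioned above is conventionally a pinhole camera matrix K. The sketch below (Python with NumPy; the values of K and the point are illustrative assumptions, and lens distortion is deliberately omitted) shows how such an intrinsic parameter maps a pattern point, given in the camera coordinate system, onto the image plane:

```python
import numpy as np

# Hypothetical intrinsic parameter of the hand eye: a pinhole camera matrix K
# (focal lengths in pixels on the diagonal, principal point in the last column).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, p_camera):
    """Project a 3D point given in the camera coordinate system onto the
    image plane (distortion is ignored for brevity)."""
    uvw = K @ p_camera
    return uvw[:2] / uvw[2]   # perspective division

# A calibration-pattern corner 0.5 m in front of the camera, 5 cm off-axis.
uv = project(K, np.array([0.05, 0.0, 0.5]))
assert np.allclose(uv, [400.0, 240.0])
```

Intrinsic calibration in step S120 amounts to recovering K (and the distortion coefficients) from many such point-to-pixel correspondences across the captured pattern images.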
[0084] Steps S130 to S170 are processes for estimating the extrinsic parameter of the hand eye 175. In step S130, an image of the calibration pattern 400 is captured at a specific position and attitude using the hand eye 175. Since images of the calibration pattern 400 are captured at a plurality of positions and attitudes in the above-described step S110, one of those positions and attitudes may be used as the specific position and attitude; in this case, step S130 can be omitted. Hereinafter, the state of the robot 100 in which the calibration pattern 400 takes the specific position and attitude is simply referred to as the specific position and attitude state.
[0085] In step S140, the transformation ^A1H_T1 or ^T1H_A1 between the first arm coordinate system Σ_A1 and the first hand coordinate system Σ_T1 in the specific position and attitude state is calculated. The transformation ^A1H_T1 or ^T1H_A1 can be calculated by the forward kinematics of the arm 160L.
[0086] In step S150, the transformation ^A1H_P or ^PH_A1 between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P in the specific position and attitude state is calculated. For example, the transformation ^A1H_P can be calculated with the following expression.
^A1H_P = ^A1H_A2 · ^A2H_T2 · ^T2H_P (12)
[0087] Among the three transformations ^A1H_A2, ^A2H_T2, and ^T2H_P on the right side of Expression (12), the first transformation ^A1H_A2 and the third transformation ^T2H_P are constant, and the second transformation ^A2H_T2 is calculated from the position and attitude of the second arm 160R.
[0088] In this way, in step S150, the transformation ^A1H_P or ^PH_A1 between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P can be calculated from the position and attitude of the second arm 160R in the specific position and attitude state. In other words, the camera calibration execution unit 213 can calculate the relationship between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P in the specific position and attitude state.
[0089] In step S160, the transformation ^EH_P or ^PH_E between the hand eye coordinate system Σ_E and the pattern coordinate system Σ_P is estimated using the pattern image captured with the hand eye 175 in the specific position and attitude state. The estimation can be executed using standard software for estimating the extrinsic parameter of a camera from the intrinsic parameter acquired in step S120 (for example, the OpenCV function FindExtrinsicCameraParams2).
[0090] In step S170, the transformation ^T1H_E or ^EH_T1 between the first hand coordinate system and the hand eye coordinate system is calculated. For example, for the transformation ^T1H_E, the following expression is established.
^T1H_E = ^T1H_A1 · ^A1H_A2 · ^A2H_T2 · ^T2H_P · ^PH_E (13)
[0091] Among the five transformations on the right side of Expression (13), the first transformation ^T1H_A1 is calculated in step S140. The second transformation ^A1H_A2 is known. The third transformation ^A2H_T2 can be calculated by the forward kinematics of the arm 160R. The fourth transformation ^T2H_P is known. The fifth transformation ^PH_E is estimated in step S160. Thereby, the transformation ^T1H_E between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E can be calculated according to Expression (13).
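Expressions (12) and (13) can be checked end to end in simulation. The sketch below (Python with NumPy; every transform value is a hypothetical placeholder, chosen only so the kinematic loop is consistent) fixes a ground-truth hand-eye transform X, derives the pattern observation ^PH_E that the hand eye would then make, and confirms that the five-factor chain of Expression (13) recovers X exactly:

```python
import numpy as np

def make_transform(R, T):
    """4x4 homogeneous transform from rotation R (3x3) and translation T."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = T
    return H

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Illustrative, hypothetical transforms (the specification fixes none of these).
A1HT1 = make_transform(rot_z(0.9), np.array([0.5, 0.1, 0.4]))    # left-arm forward kinematics
A1HA2 = make_transform(rot_z(np.pi), np.array([0.0, 0.6, 0.0]))  # known base offset between arms
A2HT2 = make_transform(rot_z(-0.4), np.array([0.3, 0.0, 0.5]))   # right-arm forward kinematics
T2HP  = make_transform(np.eye(3), np.array([0.0, 0.0, 0.05]))    # known pattern mount
X     = make_transform(rot_z(0.1), np.array([0.02, 0.0, 0.06]))  # ground-truth ^T1H_E

# What the hand eye would observe in this configuration (simulating step S160):
A1HP = A1HA2 @ A2HT2 @ T2HP                   # Expression (12), step S150
EHP  = np.linalg.inv(A1HT1 @ X) @ A1HP        # loop closure of Expression (5)
PHE  = np.linalg.inv(EHP)

# Expression (13), step S170: the chain recovers the hand-eye transform.
T1HE = np.linalg.inv(A1HT1) @ A1HA2 @ A2HT2 @ T2HP @ PHE
assert np.allclose(T1HE, X)
```

Unlike the AX=XB formulation, this chain needs only a single pose and no nonlinear optimization, which is the point of the first embodiment.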
[0092] The acquired homogeneous transformation matrix ^T1H_E or ^EH_T1 is stored in the non-volatile memory 230 as the extrinsic parameter 233 of the hand eye 175. It is possible to perform various detection processes or control using the hand eye 175 with the extrinsic parameter 233 and the intrinsic parameter 232 of the hand eye 175. As the extrinsic parameter 233 of the hand eye 175, various parameters for calculating the coordinate transformation between the robot coordinate system Σ_0 and the hand eye coordinate system Σ_E can be used.
[0093] In this way, in the first embodiment, it is possible to estimate the coordinate transformation matrix ^T1H_E between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E using the position and attitude of the arm 160 at the time of capturing the pattern image, together with the pattern image. Specifically, in the first embodiment, the camera calibration execution unit 213 calculates the first transformation matrix ^A1H_T1 or ^T1H_A1 between the first arm coordinate system Σ_A1 and the first hand coordinate system Σ_T1 from the position and attitude of the arm 160 at the time of capturing the pattern image in step S140. In step S150, the second transformation matrix ^PH_A1 or ^A1H_P between the pattern coordinate system Σ_P and the first arm coordinate system Σ_A1 is calculated. In step S160, the third transformation matrix ^EH_P or ^PH_E between the hand eye coordinate system Σ_E and the pattern coordinate system Σ_P is estimated from the pattern image. In step S170, the coordinate transformation matrix ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E is calculated from these transformation matrices. Thereby, it is possible to easily acquire the extrinsic parameter of the hand eye 175 including the coordinate transformation matrix ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E.
E. Second Embodiment
[0095] One or both of the two cameras 170L and 170R are used as the fixed camera 170. It is possible to estimate the position and attitude of the calibration pattern 400 with higher accuracy by using the two cameras 170L and 170R as a stereo camera. In the second embodiment, the calibration of the fixed camera 170 is assumed to be completed, and its intrinsic parameter and extrinsic parameter are assumed to be determined. The transformation ^A1H_C between the first arm coordinate system Σ_A1 and the camera coordinate system Σ_C is assumed to be known.
[0097] In step S151, an image of the calibration pattern 400 is captured at the specific position and attitude using the fixed camera 170. The specific position and attitude is the same as in step S130. In step S152, the transformation ^CH_P or ^PH_C between the camera coordinate system Σ_C and the pattern coordinate system Σ_P is estimated using the pattern image (second pattern image) captured with the fixed camera 170 in the specific position and attitude state. For example, since the position and attitude of the pattern coordinate system Σ_P can be determined from the pattern image of the calibration pattern 400 captured by using the fixed camera 170 as a stereo camera, the transformation ^CH_P or ^PH_C between the camera coordinate system Σ_C and the pattern coordinate system Σ_P can be estimated. On the other hand, in the case of using a single fixed camera 170, it is possible to estimate the transformation ^CH_P or ^PH_C between the camera coordinate system Σ_C and the pattern coordinate system Σ_P using standard software for estimating the extrinsic parameter of a camera (for example, the OpenCV function FindExtrinsicCameraParams2).
[0098] In step S153, the transformation ^A1H_P or ^PH_A1 between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P in the specific position and attitude state is calculated. For example, the transformation ^A1H_P can be calculated with the following expression.
^A1H_P = ^A1H_C · ^CH_P (14)
[0099] Of the two transformations on the right side of Expression (14), the first transformation ^A1H_C is known, and the second transformation ^CH_P is estimated in step S152.
[0100] In this way, in the second embodiment, it is possible to estimate the transformation ^A1H_P or ^PH_A1 between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P from the second pattern image captured with the fixed camera 170 in step S150a. In other words, the camera calibration execution unit 213 can estimate the relationship between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P in the specific position and attitude state.
[0101] Once the transformation ^A1H_P or ^PH_A1 between the first arm coordinate system Σ_A1 and the pattern coordinate system Σ_P is determined, by processing steps S160 and S170 similarly to the first embodiment, it is possible to acquire the extrinsic parameter of the hand eye 175 including the homogeneous transformation matrix ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E.
[0102] In this way, in the second embodiment as well, it is possible to estimate the coordinate transformation matrix ^T1H_E between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E using the position and attitude of the arm 160 at the time of capturing the pattern image, together with the pattern image. In particular, in step S150a of the second embodiment, the second pattern image of the calibration pattern 400 is captured with the fixed camera 170 disposed independently of the arm 160, and the second transformation matrix ^A1H_P or ^PH_A1 between the pattern coordinate system Σ_P and the first arm coordinate system Σ_A1 is estimated from the second pattern image. In other words, since the second transformation matrix ^A1H_P or ^PH_A1 can be estimated from the second pattern image, it is possible to easily acquire the coordinate transformation matrix ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E.
F. Third Embodiment
[0104] Similarly to the second embodiment, in the third embodiment it is possible to estimate the coordinate transformation matrix ^T1H_E or ^EH_T1 between the first hand coordinate system Σ_T1 and the hand eye coordinate system Σ_E using the position and attitude of the arm 160 at the time of capturing the pattern image, together with the pattern image. In addition, it is possible to acquire the extrinsic parameter of the hand eye 175 including the coordinate transformation matrix ^T1H_E or ^EH_T1.
[0105] The invention is not limited to the above-described embodiments, examples, and modifications, and can be realized in various configurations without departing from the spirit thereof. For example, the technical features in the embodiments, examples, and modifications corresponding to the technical features in each aspect described in the summary of the invention section can be replaced or combined as necessary in order to solve some or all of the above-mentioned problems or to achieve some or all of the above-mentioned effects. Unless a technical feature is described as essential in the present specification, it can be deleted as appropriate.
[0106] The entire disclosure of Japanese Patent Application No. 2017-135107, filed Jul. 11, 2017 is expressly incorporated by reference herein.