ROBOT CONTROL DEVICE
20180050452 · 2018-02-22
Inventors
CPC classification
G05B2219/40565 · G05B2219/39391 · G05B2219/40022 (PHYSICS)
International classification
Abstract
A robot control device includes a feature-point detecting unit that detects, from an image of an object acquired by a visual sensor, the positions of a plurality of feature points on the object in a predetermined cycle; a position/orientation calculating unit that updates, in the predetermined cycle, respective equations of motion of the plurality of feature points on the basis of the detected positions of the plurality of feature points and that calculates the position or orientation of the object on the basis of the positions of the plurality of feature points calculated from the updated equations of motion; and a robot-arm-movement control unit that controls the movement of a robot arm so as to follow the object, on the basis of the calculated position or orientation of the object.
Claims
1. A robot control device comprising: a feature-point detecting unit that detects, from an image of an object acquired by a visual sensor, the positions of a plurality of feature points on the object, in a predetermined cycle; a position/orientation calculating unit that updates, in the predetermined cycle, respective equations of motion of the plurality of feature points based on the detected positions of the plurality of feature points and that calculates the position or orientation of the object based on the positions of the plurality of feature points calculated from the updated equations of motion; and a robot-arm-movement control unit that controls the movement of a robot arm so as to follow the object, based on the calculated position or orientation of the object.
2. A robot control device according to claim 1, wherein the position/orientation calculating unit weights the plurality of feature points on the object based on the frequencies of detection of the plurality of feature points.
3. A robot control device according to claim 1, wherein the robot-arm-movement control unit is provided with a singular-point avoiding unit that avoids a singular point of the robot arm.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENT(S)
[0010] A robot system according to an embodiment of the present invention will be described in detail below with reference to the drawings. Note that, in this specification, a hat disposed above a symbol is written as (symbol)^hat, and a bar disposed above a symbol is written as (symbol)^bar.
[0011] As shown in
[0012] The workpiece 7 is placed on a conveying device 6, such as a belt conveyor or a rotary table, and is moved according to the movement of the conveying device 6.
[0013] Note that, in this embodiment, although the camera 2 is supported by the stand 3, the position thereof is not limited thereto, and the camera 2 may be mounted on the robot arm 4.
[0014] The configuration of the robot control device 5, which is provided in the robot system 1, will be described in detail below with reference to
[0015] As shown in
[0016] The feature-point detecting unit 51 is connected to the camera 2 and is connected to the equation-of-motion updating unit 521 and the weight calculating unit 523, which are provided in the position/orientation calculating unit 52. The equation-of-motion updating unit 521 is connected to the feature-point-position calculating unit 522. The feature-point-position calculating unit 522 is connected to the weight calculating unit 523 and the rotation/position-matrix calculating unit 524. The weight calculating unit 523 is connected to the rotation/position-matrix calculating unit 524.
[0017] The rotation/position-matrix calculating unit 524 is connected to the trajectory generating unit 531, which is provided in the robot-arm-movement control unit 53. The trajectory generating unit 531 is connected to the singular-point avoiding unit 532. The trajectory generating unit 531 and the singular-point avoiding unit 532 are connected to the adding unit 533. The adding unit 533 is connected to the robot arm 4.
[0018] The feature-point detecting unit 51 is configured to perform pattern matching, using a preset shape, on an image of the workpiece 7 captured by the camera 2 and to detect the positions y_k(t_i^k) of a plurality of feature points on the workpiece 7 that match this shape in the image. Here, the detected position y_k(t_i^k) is, for the k-th feature point, the position vector (with x, y, and z components) at the time t_i^k at which the camera detects the workpiece.
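The patent does not disclose its matching algorithm; as an illustration only, the detection step can be sketched as exhaustive template matching by sum of squared differences on a tiny grayscale image, with the best-match position standing in for one detected feature point y_k(t_i^k). All names and values here are hypothetical.

```python
# Illustrative sketch only -- not the patent's matcher.
# Exhaustive template matching by sum of squared differences (SSD);
# returns the (x, y) image position of the best match.

def match_template(image, templ):
    ih, iw = len(image), len(image[0])
    th, tw = len(templ), len(templ[0])
    best_score, best_xy = float("inf"), (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = sum((image[y + v][x + u] - templ[v][u]) ** 2
                      for v in range(th) for u in range(tw))
            if ssd < best_score:
                best_score, best_xy = ssd, (x, y)
    return best_xy

# hypothetical 5x6 image with a bright 2x2 blob at columns 3-4, rows 2-3
image = [[0] * 6 for _ in range(5)]
for y, x in [(2, 3), (2, 4), (3, 3), (3, 4)]:
    image[y][x] = 9
templ = [[9, 9], [9, 9]]
print(match_template(image, templ))  # (3, 2)
```

In practice a library routine (e.g. normalized cross-correlation) would be used instead of this brute-force loop, but the output has the same role: per-feature image positions fed to the equation-of-motion updating unit.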
[0019] The equation-of-motion updating unit 521 is configured to update the equation of motion of each feature point by using the position y_k(t_i^k) of the feature point, which is detected by the feature-point detecting unit 51. The position of each feature point detected from the image acquired by the camera 2 can be expressed as in Equation (1) by using an equation of motion.
$$y_k(t_i^k) = f_k[y_k(t_{i-1}^k),\, t_i^k] + v_i^k \quad (1)$$
[0020] Here, f_k[y_k(t_{i-1}^k), t_i^k] represents the equation of motion of each feature point, and v_i^k represents sensor noise. Specifically, the position vector detected at the time t_i^k is expressed on the basis of an equation of motion that is updated by using the position y_k(t_{i-1}^k) detected at the time t_{i-1}^k, which is one cycle before, and the sensor noise.
[0021] Note that the equation of motion of each feature point can be expressed by Equation (2) and Equation (3), for example.
[0022] Here, x_k(t) represents the nine-dimensional vector including the position, the velocity, and the acceleration of each feature point at time t, w(t) represents an acceleration vector, which is a factor of velocity change at the feature point, and v(t) represents noise arising at the time of detection using the camera.
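Equations (2) and (3) themselves do not survive in this text. A standard constant-acceleration state-space form consistent with the description of the nine-dimensional state x_k(t) would be (the patent's exact matrices may differ):

```latex
\dot{x}_k(t) =
\begin{bmatrix} 0 & I & 0 \\ 0 & 0 & I \\ 0 & 0 & 0 \end{bmatrix} x_k(t)
+ \begin{bmatrix} 0 \\ 0 \\ I \end{bmatrix} w(t) \quad (2)
\qquad\qquad
y_k(t) = \begin{bmatrix} I & 0 & 0 \end{bmatrix} x_k(t) + v(t) \quad (3)
```

where I is the 3×3 identity, so x_k stacks the position, velocity, and acceleration vectors, w(t) enters as process noise on the acceleration, and v(t) is the camera measurement noise.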
[0023] More specifically, the equation-of-motion updating unit 521 calculates y_k^bar(t_i^k) by the following Equation (4) and Equation (5), thereby updating each equation of motion.
$$\hat{y}_k(t_i^k) = f_k[\bar{y}_k(t_{i-1}^k),\, t_i^k] \quad (4)$$
$$\bar{y}_k(t_i^k) = \hat{y}_k(t_i^k) + F_k(t_i^k)\left[y_k(t_i^k) - \hat{y}_k(t_i^k)\right] \quad (5)$$
[0024] Here, y_k^hat(t_i^k) is a calculated value obtained by using the equation of motion updated, at the time t_i^k, by using the position y_k^bar(t_{i-1}^k) at the time t_{i-1}^k, which is one cycle before. Furthermore, y_k^bar(t_i^k) in Equation (5) represents the position of each feature point, calculated in consideration of both the detected value y_k(t_i^k), which is detected by the camera 2 at the time t_i^k, and the calculated value y_k^hat(t_i^k), which is calculated by using Equation (4). Note that, in Equation (5), F_k(t_i^k) represents a gain applied to the difference between y_k(t_i^k) and y_k^hat(t_i^k).
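The update of Equations (4) and (5) can be sketched per axis as follows; the constant-velocity model standing in for f_k and the fixed scalar gain standing in for F_k(t_i^k) are assumptions for illustration, since the patent does not disclose either.

```python
# Minimal predict/correct sketch of Equations (4)-(5), assuming a
# constant-velocity motion model f_k and a fixed scalar gain F_k.

def predict(y_bar_prev, vel, dt):
    """Motion model f_k: propagate the corrected position one cycle."""
    return [p + v * dt for p, v in zip(y_bar_prev, vel)]

def update_feature(y_bar_prev, vel, y_detected, gain, dt):
    y_hat = predict(y_bar_prev, vel, dt)  # Eq. (4): model prediction
    # Eq. (5): blend prediction and camera detection with gain F_k
    y_bar = [h + gain * (d - h) for h, d in zip(y_hat, y_detected)]
    return y_hat, y_bar

# hypothetical feature moving at 1 m/s along x, cycle time 0.1 s
y_hat, y_bar = update_feature(
    y_bar_prev=[0.0, 0.0, 0.0], vel=[1.0, 0.0, 0.0],
    y_detected=[0.11, 0.0, 0.0], gain=0.5, dt=0.1)
print(y_hat, [round(v, 6) for v in y_bar])  # [0.1, 0.0, 0.0] [0.105, 0.0, 0.0]
```

The corrected value y_bar sits between the model prediction (0.1) and the detection (0.11), weighted by the gain; with a time-varying, covariance-derived gain this is the classical Kalman-filter correction step.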
[0025] The feature-point-position calculating unit 522 is configured to calculate the position y_k^hat(t_now) of each feature point at the current time t_now, as in Equation (6), by using the equation of motion updated by using the position y_k^bar(t_i^k) at the time t_i^k, which is calculated as described above.
$$\hat{y}_k(t_{now}) = f_k[\bar{y}_k(t_i^k),\, t_{now}] \quad (6)$$
[0026] The weight calculating unit 523 is configured to calculate a weight for each feature point. Specifically, the weight calculating unit 523 calculates, as the weight for each feature point, the covariance Σ_k^hat(t_now) of the position of the feature point at the current time t_now, as shown in Equation (7), such that the importance of a feature point becomes higher as the feature point is detected a larger number of times (at a higher frequency) by the camera 2.
$$\hat{\Sigma}_k(t_{now}) = g_k[\bar{\Sigma}_k(t_i^k),\, t_{now}] \quad (7)$$
[0027] Here, g_k is the result obtained by propagating the covariance through the equation of motion f_k, as shown in Equation (8).
$$g_k = \mathrm{Cov}(f_k) \quad (8)$$
[0028] Furthermore, Σ_k^bar(t_i^k) is the covariance calculated in consideration of both the covariance of the detected value y_k(t_i^k), which is detected by the camera 2 at the time t_i^k, and the covariance of the calculated value y_k^hat(t_i^k), which is calculated from the equation of motion.
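The following scalar toy model (assumed, not the patent's exact g_k) shows why the covariance works as a detection-frequency weight: a missed cycle only inflates the covariance with process noise, while a detection also shrinks it, so frequently detected feature points end up with a small covariance, i.e. a large weight (its inverse).

```python
# Scalar covariance propagation: prediction inflates, detection shrinks.

def step_covariance(sigma, detected, q=0.01, r=0.04):
    sigma = sigma + q                  # prediction: add process noise
    if detected:
        gain = sigma / (sigma + r)     # scalar Kalman-style gain
        sigma = (1.0 - gain) * sigma   # measurement update shrinks sigma
    return sigma

sigma_seen = sigma_missed = 1.0
for _ in range(10):
    sigma_seen = step_covariance(sigma_seen, detected=True)
    sigma_missed = step_covariance(sigma_missed, detected=False)

weight_seen, weight_missed = 1.0 / sigma_seen, 1.0 / sigma_missed
print(weight_seen > weight_missed)  # True
```

After ten cycles the frequently seen point's covariance has collapsed toward its steady-state value while the occluded point's covariance has only grown, so the seen point dominates the weighted pose fit of Equation (9).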
[0029] Note that, if there is a feature point that cannot be detected, the position of that feature point is calculated by using information obtained when the feature point was detected in the previous cycle.
[0030] The rotation/position-matrix calculating unit 524 is configured to use the covariance Σ_k^hat(t_now), which is calculated by the weight calculating unit 523, and the position y_k^hat(t_now) of each feature point, which is calculated by the feature-point-position calculating unit 522, to calculate, by Equation (9), a rotation matrix and a position matrix of the workpiece 7 in which each feature point is weighted.
[0031] Here, R_w represents the rotation matrix of the workpiece 7 in the robot-world coordinate system, T_w represents the position matrix of the workpiece 7 in the robot-world coordinate system, and s_k represents the position (a vector with x, y, and z components) of each feature point, viewed from the object coordinate system fixed to the workpiece 7. If there is a feature point that is not detected, the result obtained when the feature point was last detected in a previous cycle is used as its position.
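Equation (9) itself is not reproduced in this text. As an assumed planar (2D) analogue, a weighted rigid fit that maps the model-frame feature positions s_k onto the estimated positions y_k^hat has the closed form below; the patent works in 3D, where an SVD-based weighted fit is typical, and the weights w_k would come from the inverse covariances.

```python
import math

# Weighted 2D rigid fit: rotation angle (R_w analogue) and translation
# (T_w analogue) mapping model points s_k onto measured points y_k.

def weighted_rigid_fit_2d(s_pts, y_pts, w):
    wsum = sum(w)
    s_cx = sum(wi * p[0] for wi, p in zip(w, s_pts)) / wsum
    s_cy = sum(wi * p[1] for wi, p in zip(w, s_pts)) / wsum
    y_cx = sum(wi * p[0] for wi, p in zip(w, y_pts)) / wsum
    y_cy = sum(wi * p[1] for wi, p in zip(w, y_pts)) / wsum
    num = den = 0.0
    for wi, s, y in zip(w, s_pts, y_pts):
        sx, sy = s[0] - s_cx, s[1] - s_cy
        yx, yy = y[0] - y_cx, y[1] - y_cy
        num += wi * (sx * yy - sy * yx)  # weighted cross terms
        den += wi * (sx * yx + sy * yy)  # weighted dot terms
    theta = math.atan2(num, den)        # best-fit rotation angle
    c, s_ = math.cos(theta), math.sin(theta)
    tx = y_cx - (c * s_cx - s_ * s_cy)  # best-fit translation
    ty = y_cy - (s_ * s_cx + c * s_cy)
    return theta, (tx, ty)

# square rotated by 90 degrees and shifted by (1, 2), equal weights
s_pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
y_pts = [(1, 2), (1, 3), (0, 3), (0, 2)]
theta, t = weighted_rigid_fit_2d(s_pts, y_pts, [1.0] * 4)
print(round(math.degrees(theta), 3), tuple(round(v, 3) for v in t))
```

Down-weighting an occluded point (small w_k) lets the fit be dominated by the reliably detected features, which is the behavior paragraph [0048] describes.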
[0032] The trajectory generating unit 531 is configured to generate a trajectory of the robot arm 4. Specifically, the trajectory generating unit 531 calculates the differences between the position and the orientation (a rotation matrix R_r and a position matrix T_r) of the current TCP, generated by the robot-arm-movement control unit 53, and the position and the orientation (the rotation matrix R_w and the position matrix T_w) of the workpiece 7, calculated by the position/orientation calculating unit 52, and multiplies these differences by the pseudoinverse of the Jacobian, thereby calculating a velocity for each axis angle, i.e., a command velocity q^dot* for the robot arm 4, as in Equation (10).
$$\dot{q}^* = IK(\Delta R, \Delta T) = J^+(q)\, h(\Delta R, \Delta T) \quad (10)$$
[0033] Here, J^+ is the pseudoinverse of the Jacobian relating the axis (joint) velocities to the Cartesian velocities. h(ΔR, ΔT) is a function that multiplies the position and orientation differences ΔR and ΔT by a proportional gain, and the proportional gain is adjusted according to the movement frequency of the workpiece 7 to be followed.
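A hypothetical numeric illustration of Equation (10) for a planar two-link arm follows; the link lengths, the gain, and the position-only error are all assumptions (the patent's h(ΔR, ΔT) also handles orientation), and for a square nonsingular Jacobian the pseudoinverse reduces to the ordinary inverse.

```python
import math

# q_dot* = J^+(q) h(error) for an assumed planar 2-link arm.

def jacobian_2link(q1, q2, l1=1.0, l2=1.0):
    """Standard planar 2-link position Jacobian."""
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [l1 * c1 + l2 * c12, l2 * c12]]

def solve_2x2(J, b):
    """Inverse of a square, nonsingular Jacobian (here equal to J^+)."""
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return [(J[1][1] * b[0] - J[0][1] * b[1]) / det,
            (-J[1][0] * b[0] + J[0][0] * b[1]) / det]

gain = 2.0                  # proportional gain inside h(.)
q = (0.3, 0.8)              # current joint angles (hypothetical)
err = (0.05, -0.02)         # Cartesian position error of the TCP
J = jacobian_2link(*q)
q_dot_star = solve_2x2(J, [gain * e for e in err])  # command velocity
print([round(v, 4) for v in q_dot_star])
```

Applying J to the resulting joint velocities reproduces the gain-scaled Cartesian error, which is exactly the proportional tracking behavior described above.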
[0034] The singular-point avoiding unit 532 is configured to generate an interpolation velocity q^dot′ for an avoidance movement by Equation (11) when the current TCP, generated by the robot-arm-movement control unit 53, is close to a singular point.
$$\dot{q}' = \left(I - J^+(q)\,J(q)\right) p(q) \quad (11)$$
[0035] Here, p(q) is a correction velocity for each axis. The correction velocity p(q) can be obtained from Equation (12) and Equation (13).
[0036] H(q) is an expression indicating the manipulability of the robot. As shown in Equation (13), the correction velocity p(q) for each axis is calculated from the partial derivative of H(q) with respect to q.
[0037] In the above-described Equation (13), the coefficient multiplying this partial derivative is a proportional gain for avoiding a singular point. Note that this gain is 0 when the position and the orientation of the TCP are not close to a singular point, and its value is increased when the position and the orientation of the TCP are close to a singular point.
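Equations (11) through (13) can be sketched for an assumed redundant planar three-link arm: the avoidance velocity is the gradient of the manipulability H(q) = sqrt(det(J Jᵀ)) (Yoshikawa's measure, used here as a stand-in for the patent's undisclosed H(q)), scaled by a gain and projected into the null space of J so the TCP motion is undisturbed. All numbers are hypothetical.

```python
import math

L = (1.0, 1.0, 1.0)  # link lengths (assumed)

def jacobian(q):
    """2x3 position Jacobian of the planar 3-link arm."""
    c = [sum(q[:i + 1]) for i in range(3)]  # cumulative joint angles
    row_x = [-sum(L[i] * math.sin(c[i]) for i in range(j, 3)) for j in range(3)]
    row_y = [sum(L[i] * math.cos(c[i]) for i in range(j, 3)) for j in range(3)]
    return [row_x, row_y]

def manipulability(q):
    """H(q) = sqrt(det(J J^T)), cf. Equation (12)."""
    J = jacobian(q)
    a = sum(v * v for v in J[0])
    b = sum(x * y for x, y in zip(J[0], J[1]))
    d = sum(v * v for v in J[1])
    return math.sqrt(max(a * d - b * b, 0.0))

def avoidance_velocity(q, k0=0.5, eps=1e-6):
    """q_dot' = (I - J^+ J) p(q), with p(q) = k0 * dH/dq (Eq. (13))."""
    p = []
    for j in range(3):  # numerical gradient of H(q)
        qp, qm = list(q), list(q)
        qp[j] += eps
        qm[j] -= eps
        p.append(k0 * (manipulability(qp) - manipulability(qm)) / (2 * eps))
    J = jacobian(q)
    a = sum(v * v for v in J[0])
    b = sum(x * y for x, y in zip(J[0], J[1]))
    d = sum(v * v for v in J[1])
    det = a * d - b * b
    inv = [[d / det, -b / det], [-b / det, a / det]]  # (J J^T)^-1
    Jp = [[sum(J[m][i] * inv[m][r] for m in range(2)) for r in range(2)]
          for i in range(3)]                          # J^+ = J^T (J J^T)^-1
    JpJ = [[sum(Jp[i][r] * J[r][j] for r in range(2)) for j in range(3)]
           for i in range(3)]
    # null-space projection (I - J^+ J) applied to p
    return [p[i] - sum(JpJ[i][j] * p[j] for j in range(3)) for i in range(3)]

q = (0.4, 0.9, -0.5)
q_dot_avoid = avoidance_velocity(q)
J = jacobian(q)
tcp_motion = [sum(J[r][i] * q_dot_avoid[i] for i in range(3)) for r in range(2)]
print([round(v, 9) for v in tcp_motion])  # ~[0.0, 0.0]: TCP unaffected
```

The check at the end confirms the defining property of the null-space projector: the avoidance velocity drives the joints away from the singular configuration without producing any TCP motion, so it can simply be added to the tracking command of Equation (10).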
[0038] The adding unit 533 is configured to add the command velocity q^dot*, which is generated by the trajectory generating unit 531, and the interpolation velocity q^dot′, which is calculated by the singular-point avoiding unit 532, to calculate an each-axis velocity command q^dot for the robot arm 4, as in Equation (14).
$$\dot{q} = \dot{q}^* + \dot{q}' \quad (14)$$
[0039] The robot arm 4 is controlled according to the calculated each-axis velocity command q^dot.
[0040] The operation of the robot control device 5, which has the above-described configuration, will be described below with reference to
[0041] First, an image acquired by the camera 2 is input to the robot control device 5 (Step S1 in
[0042] Next, the processing for calculating the position and the orientation of the workpiece 7, which is performed in Step S3 in
[0043] First, the equation-of-motion updating unit 521 calculates the actual position y_k^bar(t_i^k) of each feature point at the time t_i^k on the basis of the detected position y_k(t_i^k) of the feature point, which is detected at the time t_i^k by the feature-point detecting unit 51, and the calculated position y_k^hat(t_i^k) of the feature point at the time t_i^k, which is calculated from the equation of motion updated at the time t_{i-1}^k (see Equation (4) and Equation (5)). Then, the calculated actual position y_k^bar(t_i^k) is used to update the equation of motion (Step S21 in
[0044] Then, the feature-point-position calculating unit 522 calculates the position y_k^hat(t_now) of the feature point at the current time t_now by using the updated equation of motion, as shown in Equation (6) (Step S22 in
[0045] In this way, because the equation of motion is updated for each interpolation cycle of the robot arm 4, even when the workpiece 7 randomly moves, it is possible to accurately calculate the current position of the workpiece 7.
[0046] Next, the weight calculating unit 523 calculates the covariance Σ_k^hat(t_now) of the position y_k^hat(t_now) of the feature point at the current time t_now by Equation (7), on the basis of both the covariance of the detected position y_k(t_i^k) of the feature point, which is detected at the time t_i^k by the feature-point detecting unit 51, and the covariance of the calculated position y_k^hat(t_i^k) of the feature point at the time t_i^k, which is calculated from the equation of motion updated at the time t_{i-1}^k (Step S23).
[0047] Then, the rotation/position-matrix calculating unit 524 calculates the rotation matrix R_w and the position matrix T_w of the workpiece 7 by Equation (9), by using the covariance Σ_k^hat(t_now), which is calculated by the weight calculating unit 523 and serves as a weight, and the position y_k^hat(t_now) of the feature point, which is calculated by the feature-point-position calculating unit 522 (Step S24 in
[0048] In this way, because each feature point is weighted such that the importance of a feature point becomes higher as the feature point is detected a larger number of times by the camera 2, even if part of the workpiece 7 is blocked by an obstacle or the like, for example, and thus there is a feature point that cannot be detected, the robot arm 4 can be made to precisely follow the workpiece 7.
[0049] Next, the robot-arm-movement control processing performed in Step S4 of
[0050] First, the trajectory generating unit 531 calculates the position and the orientation (the rotation matrix R_r and the position matrix T_r) of the TCP of the robot arm 4 (Step S31 in
[0051] Next, the singular-point avoiding unit 532 checks whether the position and the orientation of the TCP of the robot arm 4, which are calculated by the trajectory generating unit 531, are close to a singular point (Step S34 in
[0052] Finally, the command velocity q^dot* for each axis, which is generated by the trajectory generating unit 531, and the interpolation velocity q^dot′, which is generated by the singular-point avoiding unit 532, are added to calculate a velocity command for each axis of the robot arm 4 (Step S37 in
[0053] By doing so, a singular point can be avoided.
[0054] Although the embodiment of the present invention has been described above in detail with reference to the drawings, the specific configurations are not limited to those in the embodiment, and design changes etc. that do not depart from the scope of the present invention are also encompassed.
[0055] As a result, the above-described embodiment leads to the following aspect.
[0056] According to a first aspect, the present invention provides a robot control device including: a feature-point detecting unit that detects, from an image of an object acquired by a visual sensor, the positions of a plurality of feature points on the object, in a predetermined cycle; a position/orientation calculating unit that updates, in the predetermined cycle, respective equations of motion of the plurality of feature points on the basis of the detected positions of the plurality of feature points and that calculates the position or orientation of the object on the basis of the positions of the plurality of feature points calculated from the updated equations of motion; and a robot-arm-movement control unit that controls the movement of a robot arm so as to follow the object, on the basis of the calculated position or orientation of the object.
[0057] According to the robot control device of the above-described first aspect, the feature-point detecting unit detects, from an image of an object acquired by the visual sensor, the positions of a plurality of feature points on the object in a predetermined cycle. Next, the position/orientation calculating unit updates, in the predetermined cycle, equations of motion of the plurality of feature points on the basis of the positions of the plurality of feature points detected by the feature-point detecting unit. Furthermore, the position/orientation calculating unit calculates anew the positions of the plurality of feature points from the updated equations of motion and calculates the position or orientation of the object on the basis of the calculated positions of the plurality of feature points. Then, the robot-arm-movement control unit controls the robot arm so as to follow the object, on the basis of the position or orientation of the object calculated by the position/orientation calculating unit.
[0058] In this way, because the equation of motion of each feature point is updated in the predetermined cycle, an equation of motion of the object can be constructed online. As a result, the robot arm can be made to follow an object that moves randomly, with high accuracy.
[0059] In the above-described robot control device, the position/orientation calculating unit may weight the plurality of feature points on the object on the basis of the frequencies of detection of the plurality of feature points.
[0060] By doing so, even if some of the plurality of feature points are blocked by an obstacle or the like, and thus the position or orientation of the object cannot be directly detected by the visual sensor, the positions of the feature points calculated from the equations of motion and the positions of the feature points that can be detected by the visual sensor are weighted on the basis of the frequencies of detection to calculate the position of the object, thereby making it possible for the robot arm to accurately follow the object.
[0061] In the above-described robot control device, the robot-arm-movement control unit may be provided with a singular-point avoiding unit that avoids a singular point of the robot arm.
[0062] With this configuration, a singular point can be avoided, thus making it possible to avoid an operational problem, such as abnormally fast rotation of a particular joint in the robot arm.
[0063] According to the present invention, an advantageous effect is afforded in that the robot arm can be made to accurately follow a workpiece that randomly moves.