ROBOT SYSTEM
20230364812 · 2023-11-16
Assignee
Inventors
- Yuutarou TAKAHASHI (Yamanashi, JP)
- Fumikazu WARASHINA (Yamanashi, JP)
- Junichirou YOSHIDA (Yamanashi, JP)
CPC classification
B25J9/1661
PERFORMING OPERATIONS; TRANSPORTING
B25J9/10
PERFORMING OPERATIONS; TRANSPORTING
International classification
B25J9/10
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The objective of the present invention is to provide a robot system in which, even if the position of a robot becomes displaced, work can easily be performed by employing a camera or the like to apply a three-dimensional correction. This robot system is provided with: a robot 2; a robot conveying device 3 on which the robot is mounted, for moving the robot to a predetermined workspace; at least two target marks 4 installed in the workspace; a target mark position acquiring unit 5 for obtaining a three-dimensional position by using a visual sensor provided on the robot 2 to perform stereoscopic measurement of the at least two target marks 4; a displacement amount acquiring unit 6 for obtaining a displacement amount between the robot 2 and a desired relative position in the workspace from the acquired three-dimensional position; and a robot control unit 7 for operating the robot 2 using a value corrected from a prescribed operation amount, using the acquired displacement amount.
Claims
1. A robot system comprising: a robot; a robot conveying device on which the robot is mounted, for moving the robot to a predetermined workspace; at least two target marks installed in the workspace; a target mark position acquiring unit for obtaining a three-dimensional position by using a visual sensor provided on the robot to perform stereoscopic measurement of the at least two target marks; a displacement amount acquiring unit for obtaining a displacement amount between the robot and a desired relative position in the workspace, from the acquired three-dimensional position; and a robot control unit for operating the robot using a value corrected from a prescribed operation amount, using the acquired displacement amount.
2. The robot system according to claim 1, wherein the visual sensor is provided on a movable part of the robot.
3. The robot system according to claim 1, wherein the at least two target marks comprise at least three target marks installed in the workspace, the visual sensor is provided on a hand section of the robot, and the robot control unit performs a three-dimensional six-degree-of-freedom correction and operates the robot.
4. The robot system according to claim 1, wherein an operation program for the robot, an image processing program including measuring settings for the visual sensor and a program for calculating the displacement amount, and camera calibration data for the visual sensor are set and packaged in advance.
5. The robot system according to claim 1, wherein one of the target marks is measured and the position thereof is obtained immediately before or while performing work, and in a case where the acquired displacement amount exceeds a preset threshold value, all the target marks in the workspace at the current time are measured and the displacement amount is reacquired.
6. The robot system according to claim 1, wherein rough positioning is performed using the target marks provided on a machine tool that is the workspace while or immediately before the robot enters the machine tool, and then the robot enters the machine tool that is the workspace and obtains the displacement amount of the machine tool using the target marks provided in an interior of the machine tool.
7. The robot system according to claim 6, wherein before the robot enters the machine tool, an alarm is issued when a space between the robot and the machine tool becomes equal to or less than a preset threshold value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
PREFERRED MODE FOR CARRYING OUT THE INVENTION
[0020] A robot system according to an embodiment of the present invention is described below with reference to
[0021] As illustrated in
[0022] The visual sensor 51 of the target mark position acquiring unit 5 is provided on a movable part of the robot 2. Specifically, the visual sensor 51 is provided on a movable part such as a hand section, a wrist section, an arm section, or the like of the robot 2. In the present embodiment, stereoscopic measurement is performed, and therefore, a low-cost two-dimensional camera may be used as the visual sensor 51.
[0023] The robot 2 illustrated in
[0024] In the robot system 1 according to the present embodiment, for example, an operation program of the robot 2, an image processing program including the measuring settings for the visual sensor 51 and a program for calculating the displacement amount, and camera calibration data for the visual sensor 51 are set and packaged in advance, and stored in a storage unit 8. This will be described in detail below.
[0025] In addition, in the robot system 1 according to the present embodiment, one target mark 4 is measured by the visual sensor 51 and the position thereof is obtained immediately before or during the work, and a determination unit 9 determines whether or not the acquired displacement amount exceeds a preset threshold value. Then, in a case where the result of the determination indicates that the displacement amount exceeds the threshold value, all the target marks 4 in the workspace at the current time are measured and the displacement amount is reacquired.
[0026] In addition, the robot system 1 according to the present embodiment is configured to perform rough positioning using the target marks 4 provided on the machine tool 10 that is the workspace while or immediately before the robot 2 enters the machine tool 10, after which the robot 2 enters the machine tool 10 and obtains a precise displacement amount of the machine tool 10 using the target marks 4 provided in the interior of the machine tool 10.
[0027] Further, the robot system 1 according to the present embodiment is provided with a warning unit 11, and is configured such that before the robot enters the machine tool 10, the warning unit 11 issues an alarm when the space between the robot 2 and the machine tool 10 becomes equal to or less than a preset threshold value.
[0028] With the robot system 1 according to the present embodiment having the configuration described above, two or more target marks 4 are installed in the workspace by pasting or the like, and each of the target marks 4 is stereoscopically measured to obtain a three-dimensional position. Preferably, three target marks are set, in which case at least two target marks 4 are set in the interior of the workspace and at least one is set in the exterior of the workspace.
[0029] For example, as illustrated in
[0030] At this time, one target mark 4 is detected at two camera (target mark position acquiring unit 5, visual sensor 51) positions, and the three-dimensional position of the target mark 4 is calculated through a stereoscopic calculation on the basis of the two detected results. For example, lines of sight from the camera toward the target mark 4 are detected (X, Y, W′, P′, R′), and the three-dimensional position of a workpiece is detected through a stereoscopic calculation using two pieces of line-of-sight data. W′ and P′ are direction vectors representing the lines of sight, and R′ is the angle around the target.
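The specification does not give the stereoscopic calculation itself, but the intersection of two lines of sight can be sketched as below. The function name `triangulate` and the midpoint-of-the-common-perpendicular method are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def triangulate(origin1, dir1, origin2, dir2):
    """Estimate a 3D point from two lines of sight by taking the
    midpoint of the common perpendicular between the two rays."""
    d1 = dir1 / np.linalg.norm(dir1)
    d2 = dir2 / np.linalg.norm(dir2)
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|
    b = np.dot(d1, d2)
    w0 = origin1 - origin2
    d = np.dot(d1, w0)
    e = np.dot(d2, w0)
    denom = 1.0 - b * b          # rays must not be parallel
    t1 = (b * e - d) / denom
    t2 = (e - b * d) / denom
    p1 = origin1 + t1 * d1       # closest point on ray 1
    p2 = origin2 + t2 * d2       # closest point on ray 2
    return (p1 + p2) / 2.0
```

When the two rays intersect exactly, the midpoint coincides with the intersection; with measurement noise it gives a least-squares compromise between the two lines of sight.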
[0031] In a preferred aspect of the present embodiment, each of three target marks 4 installed on a surface of the machine tool 10 is stereoscopically measured to measure the three-dimensional position (X, Y, Z) of each target mark 4. By stereoscopically measuring each of the three target marks 4, a total of six detections are performed.
[0032] Next, by combining the acquired three-dimensional positions of the three target marks 4, the three-dimensional position and orientation of the machine tool 10 relative to the robot 2 are obtained. In other words, three locations on one object are three-dimensionally measured, and the measured results are combined to obtain the position and the orientation of the entire object. In the present embodiment, three locations on the surface of the machine tool 10 are measured, and the position and the orientation of the entire machine tool 10 are calculated.
[0033] For example, the three-dimensional position (X, Y, Z, W, P, R) of the entire machine tool is calculated from the three-dimensional positions (X, Y, Z) of the three target marks 4. At this time, the three-dimensional position (X, Y, Z, W, P, R) of the entire machine tool is calculated by calculating a coordinate system wherein the position of the first target mark 4 is determined as the origin, the position of the second target mark 4 is determined as an X-axis direction point, and the position of the third target mark 4 is determined as a point on an XY plane.
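The coordinate-system construction just described (first mark as the origin, second mark defining the X-axis direction, third mark on the XY plane) can be sketched as a rotation matrix plus an origin. The helper name `frame_from_marks` is hypothetical:

```python
import numpy as np

def frame_from_marks(p_origin, p_xaxis, p_xyplane):
    """Build a coordinate system from three measured mark positions:
    origin at the first mark, X axis toward the second, third mark
    constrained to lie on the XY plane."""
    x = p_xaxis - p_origin
    x = x / np.linalg.norm(x)
    v = p_xyplane - p_origin
    z = np.cross(x, v)               # normal of the XY plane
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)               # completes a right-handed frame
    R = np.column_stack([x, y, z])   # orientation (W, P, R encode this)
    return R, p_origin
```

The rotation matrix R corresponds to the orientation components (W, P, R) of the machine tool's pose, and `p_origin` to its position (X, Y, Z).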
[0034] Next, as illustrated in
[0035] In the present embodiment, the displacement amount is calculated from the actual detected three-dimensional positions and orientations and an original reference position and orientation. A prescribed operation of the robot 2 is corrected by moving and rotating the coordinate system itself such that the machine tool in the actual detected position overlaps the machine tool in the reference position, and setting the thus obtained movement amount of the coordinate system as the displacement amount (correction amount). Although
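One plausible way to compute the correction amount described above, the rigid transform that carries points taught relative to the reference pose of the machine tool onto its actually detected pose, is sketched below. The function name and the rotation-matrix representation are illustrative assumptions:

```python
import numpy as np

def correction_transform(R_ref, t_ref, R_det, t_det):
    """Rigid transform mapping the reference machine-tool frame onto the
    detected one: a taught world point p is corrected to
    R_corr @ p + t_corr."""
    # A point p relative to the reference frame: R_ref.T @ (p - t_ref);
    # re-expressed in the detected frame: R_det @ (...) + t_det.
    R_corr = R_det @ R_ref.T
    t_corr = t_det - R_corr @ t_ref
    return R_corr, t_corr
```

Applying (R_corr, t_corr) to every taught point of the prescribed operation realizes the six-degree-of-freedom correction without re-teaching the program.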
[0036] In the present embodiment, all settings are set from the start on the basis of the correction method for the robot 2 described above, and made usable as a package. The specific components of the package are the operation program of the robot 2, the image processing program, and the camera calibration data. These are stored in the storage unit 8.
[0037] The storage unit 8 stores calibration data for the camera (visual sensor 51) using a coordinate system (mechanical interface coordinate system) set at the hand section 21 of the robot 2, that is to say, calibration data for the mechanical interface system. Meanwhile, the robot control unit 7 can ascertain the position of the hand section 21 of the robot 2 in the robot coordinate system at the time of capturing an image by the camera (visual sensor 51). Thus, by associating two-dimensional points in the sensor coordinate system and three-dimensional points in the mechanical interface coordinate system with one another using the calibration data stored in the storage unit 8, and coordinate transforming the mechanical interface coordinate system into the robot coordinate system according to the position of the hand section 21 of the robot 2 ascertained by the robot control unit 7, the two-dimensional points in the sensor coordinate system and the three-dimensional points in the robot coordinate system at the time of capturing an image by the camera (visual sensor 51) can be associated with one another. In other words, the position and orientation of the sensor coordinate system as seen from the robot coordinate system can be obtained, and thus the three-dimensional position can be calculated.
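The coordinate chain just described (sensor coordinate system to mechanical interface coordinate system via the calibration data, then to the robot coordinate system via the hand-section pose at image capture) can be sketched as two chained rigid transforms. The rotation/translation representation and the function name are assumptions for illustration:

```python
import numpy as np

def sensor_point_in_robot_frame(p_sensor, R_cal, t_cal, R_hand, t_hand):
    """Transform a 3D point from the sensor coordinate system into the
    robot coordinate system."""
    # Calibration data: sensor coords -> mechanical interface coords
    p_mif = R_cal @ p_sensor + t_cal
    # Hand-section pose at capture time: mechanical interface -> robot coords
    return R_hand @ p_mif + t_hand
```

Because the hand-section pose changes between the two image captures of a stereoscopic measurement, this chain is evaluated separately for each capture before triangulation.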
[0038] In the present embodiment, while or immediately before the robot 2 performs work in the workspace, only one target mark is first measured by the visual sensor. In a case where the measured result is the same as when the above operation was performed, it is determined that the positional relationship between the robot and the workspace has not changed after the operation, and the work is resumed; in a case where the result differs, the work is interrupted and the operation is performed again.
[0039] Measuring all the target marks 4 every time would take a lot of time, but with the technique according to the present embodiment, this time can be shortened. The threshold for determining that the positions are the same can be set according to the overall required precision of the system.
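The single-mark quick check can be sketched as a simple threshold test; the name `needs_remeasure` and the Euclidean-distance criterion are illustrative assumptions:

```python
import numpy as np

def needs_remeasure(measured_pos, reference_pos, threshold):
    """Quick check with a single target mark: if the mark moved farther
    than the threshold, all marks must be re-measured; otherwise the
    positional relationship is deemed unchanged and work resumes."""
    return np.linalg.norm(measured_pos - reference_pos) > threshold
```

The threshold here corresponds to the system-precision setting mentioned above: a tighter precision requirement means a smaller threshold and hence more frequent full re-measurement.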
[0040] When the workspace is a machine tool 10 as in the present embodiment (when the workspace is set to the interior (inside) of the machine tool 10), rough positioning using the target marks 4 provided on the outside of the machine tool 10 is performed while or immediately before the robot 2 enters the machine tool 10, and then the robot 2 enters the machine tool 10 and performs precise positioning (two-step positioning) using the target marks 4 provided in the interior of the machine tool 10.
[0041] In a case where precision is required, it is desirable to perform positioning relative to a table or the like in the interior of the machine tool 10, but when the frontage of the machine tool 10 is narrow, there is a possibility that the robot 2 will collide with the entrance of the machine tool 10 unless measurement is performed. In such a case, it is sufficient that the robot 2 can be moved so as not to collide, and that an alarm can be raised if a collision is imminent.
[0042] As described above, with the robot system 1 according to the present embodiment, even when the position of the robot 2 becomes displaced due to the movement of the robot conveying device 3, such as a cart or an AGV, a three-dimensional six-degree-of-freedom correction can be applied such that the robot 2 can perform the work. By applying a three-dimensional six-degree-of-freedom correction, it is possible to apply corrections that would not be possible with a simple XYZ three-dimensional correction, such as when the floor is not flat or is irregular.
[0043] In addition, by performing stereoscopic measurement of each of two or more target marks 4, a three-dimensional correction can be applied using, for example, a low-cost two-dimensional camera. In particular, by performing stereoscopic measurement of three or more target marks 4, a six-degree-of-freedom correction can be applied even with a low-cost two-dimensional camera. When only two marks are used, the amount of rotation about the axis formed by the line connecting the two marks cannot be identified; however, in a system configured such that this amount of rotation is unlikely to change, two marks are sufficiently practical.
[0044] Further, the correction can be applied automatically and the robot 2 can perform work without the user having to pay attention to the concept of coordinate systems or vision settings.
[0045] An embodiment of the present robot system is described above, but the present invention is not limited to the embodiment described above, and may be modified as appropriate within a scope that does not depart from the gist thereof.
EXPLANATION OF REFERENCE NUMERALS
[0046] 1 Robot system
[0047] 2 Robot
[0048] 3 Robot conveying device
[0049] 4 Target mark
[0050] 5 Target mark position acquiring unit
[0051] 6 Displacement amount acquiring unit
[0052] 7 Robot control unit
[0053] 8 Storage unit
[0054] 9 Determination unit
[0055] 10 Machine tool (industrial machine)
[0056] 11 Warning unit
[0057] 21 Hand section
[0058] 51 Visual sensor