DEVICE FOR TEACHING POSITION AND POSTURE FOR ROBOT TO GRASP WORKPIECE, ROBOT SYSTEM, AND METHOD
20240342919 · 2024-10-17
CPC classification
G05B19/42 (PHYSICS)
B25J9/1669 (PERFORMING OPERATIONS; TRANSPORTING)
Abstract
A device includes: an image data acquisition unit that acquires, when a robot is grasping a workpiece by a hand, image data of the workpiece imaged by a visual sensor disposed at a known position on a control coordinate system; a workpiece position acquisition unit that acquires workpiece position data indicating a position and a posture of the workpiece on the basis of the image data; a hand position acquisition unit that acquires hand position data indicating a position and a posture of the hand obtained when the visual sensor has imaged the image data; and a teaching position acquisition unit that acquires, on the basis of the workpiece position data and the hand position data, teaching position data indicating a positional relationship between the hand and the workpiece obtained when the visual sensor has imaged the image data.
Claims
1. A device configured to teach a position and orientation at which a robot grips a workpiece with a hand in a control coordinate system for controlling the robot, the device comprising: an image data acquisition unit configured to acquire image data of the workpiece imaged by a vision sensor arranged at a known position in the control coordinate system when the robot grips the workpiece with the hand; a workpiece position acquisition unit configured to acquire workpiece position data, which indicates a position and orientation of the workpiece in the control coordinate system when the vision sensor images the image data, based on the image data; a hand position acquisition unit configured to acquire hand position data, which indicates a position and orientation of the hand in the control coordinate system when the vision sensor images the image data; and a teaching position acquisition unit configured to acquire teaching position data, which indicates a positional relationship between the hand and the workpiece in the control coordinate system when the vision sensor images the image data, based on the workpiece position data and the hand position data.
2. The device of claim 1, further comprising a robot control unit configured to operate the robot so as to repeatedly change the orientation of the hand gripping the workpiece, wherein the image data acquisition unit is configured to acquire a plurality of pieces of the image data imaged by the vision sensor every time the robot control unit changes the orientation of the hand, wherein the workpiece position acquisition unit is configured to acquire a plurality of pieces of the workpiece position data respectively when each of the plurality of pieces of the image data is imaged, based on each of the plurality of pieces of the image data acquired by the image data acquisition unit, wherein the hand position acquisition unit is configured to acquire a plurality of pieces of the hand position data respectively when each of the plurality of pieces of the image data is imaged, and wherein the teaching position acquisition unit is configured to acquire a plurality of pieces of the teaching position data respectively when each of the plurality of pieces of the image data is imaged, based on the respective pieces of the workpiece position data acquired by the workpiece position acquisition unit and on the respective pieces of the hand position data acquired by the hand position acquisition unit.
3. The device of claim 2, wherein the teaching position acquisition unit is configured to obtain new teaching position data to be used for an operation to cause the robot to grip the workpiece with the hand, based on the plurality of pieces of the acquired teaching position data.
4. The device of claim 3, wherein the teaching position data is represented as coordinates of the control coordinate system, and wherein the teaching position acquisition unit is configured to obtain the new teaching position data by: excluding the coordinates of the teaching position data, which are outside of a predetermined allowable range, among the coordinates of the plurality of pieces of the teaching position data; or obtaining an average of the coordinates of the plurality of pieces of the teaching position data.
5. The device of claim 1, further comprising an operation program generating unit configured to generate an operation program in which the teaching position data is defined.
6. The device of claim 1, wherein the workpiece position acquisition unit is configured to acquire, as the workpiece position data, data indicating a position and orientation in the control coordinate system of a workpiece model which models the workpiece when the workpiece model is matched to the workpiece shown in the image data.
7. The device of claim 1, wherein the control coordinate system includes: a robot coordinate system set to the robot; a workpiece coordinate system set to the workpiece; a tool coordinate system set to the hand and having a known positional relationship with the robot coordinate system; and a sensor coordinate system set to the vision sensor and having a known positional relationship with the robot coordinate system, wherein the vision sensor is arranged at the known position in the robot coordinate system, wherein the workpiece position acquisition unit is configured to: acquire first coordinates in the sensor coordinate system of the workpiece coordinate system indicating a position and orientation of the workpiece shown in the image data; and acquire, as the workpiece position data, second coordinates of the workpiece coordinate system in the robot coordinate system by transforming the first coordinates into the robot coordinate system, wherein the hand position acquisition unit is configured to acquire, as the hand position data, third coordinates in the robot coordinate system of the tool coordinate system indicating a position and orientation of the hand, and wherein the teaching position acquisition unit is configured to acquire the teaching position data as coordinates of the workpiece coordinate system in the tool coordinate system or as coordinates of the tool coordinate system in the workpiece coordinate system, based on the second coordinates and the third coordinates.
8. A robot system comprising: a robot including a hand configured to grip a workpiece; a vision sensor configured to image the workpiece; and the device of claim 1.
9. The robot system of claim 8, comprising a controller configured to control the robot so as to grip the workpiece with the hand, based on second image data of the workpiece imaged by the vision sensor, wherein the controller is configured to: acquire, as second workpiece position data, data indicating a position and orientation in the control coordinate system of the workpiece shown in the second image data; and determine a position and orientation of the hand in the control coordinate system when the hand grips the workpiece imaged by the vision sensor, based on the second workpiece position data and the teaching position data.
10. A method of teaching a position and orientation at which a robot grips a workpiece with a hand in a control coordinate system for controlling the robot, the method comprising: acquiring, by a processor, image data of the workpiece imaged by a vision sensor arranged at a known position in the control coordinate system when the robot grips the workpiece with the hand; acquiring, by the processor, workpiece position data indicating a position and orientation of the workpiece in the control coordinate system when the vision sensor images the image data, based on the image data; acquiring, by the processor, hand position data indicating a position and orientation of the hand in the control coordinate system when the vision sensor images the image data; and acquiring, by the processor, teaching position data indicating a positional relationship between the hand and the workpiece in the control coordinate system when the vision sensor images the image data, based on the workpiece position data and the hand position data.
Description
BRIEF DESCRIPTION OF DRAWINGS
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
[0017] Embodiments of the present disclosure are described in detail below based on the drawings. In the various embodiments described below, the same reference signs are given to the same elements and redundant descriptions are omitted. First, a robot system 10 according to an embodiment will be described with reference to
[0018] In the present embodiment, the robot 12 is a vertical articulated robot and includes a robot base 20, a swivel body 22, a lower arm 24, an upper arm 26, a wrist 28, and a hand 30. The robot base 20 is secured to the floor of a work cell. The swivel body 22 is provided on the robot base 20 so as to be rotatable around a vertical axis. The lower arm 24 is provided on the swivel body 22 so as to be rotatable around a horizontal axis, and the upper arm 26 is rotatably provided at the tip part of the lower arm 24. The wrist 28 is rotatably provided at the tip part of the upper arm 26.
[0019] A plurality of servo motors 40 (
[0020] The hand 30 is removably attached to the tip part (so-called wrist flange) of the wrist 28 and moved by the mechanical part 42 of the robot 12. Specifically, as illustrated in
[0021] The claw parts 34 and 36 are openably and closably provided at the tip part of the hand arm 32. In the present embodiment, each of the claw parts 34 and 36 is a cylindrical rod member extending in a straight line. The claw part drive part 38 includes, for example, a pneumatic or hydraulic cylinder or a servo motor to open and close the claw parts 34 and 36 in response to commands from the controller 16. The hand 30 can grip and release the workpiece W by opening and closing the claw parts 34 and 36.
[0022] Referring again to
[0023] In the present embodiment, the vision sensor 14 is a three-dimensional vision sensor that includes an image sensor (CMOS, CCD, or the like) and an optical lens (collimating lens, focusing lens, or the like) which guides the subject image to the image sensor, and is configured to image a subject along a visual line direction VL and measure the distance d to the subject.
[0024] The teaching device 18 teaches the robot 12 to grip the workpieces W that are stacked in bulk in the container A with the hand 30. Specifically, the teaching device 18 is a portable computer such as a teaching pendant or a tablet-type terminal device and includes a processor 50, a memory 52, an I/O interface 54, a display device 56, and an input device 58. The processor 50 includes a CPU or a GPU, is communicably connected to the memory 52, the I/O interface 54, the display device 56, and the input device 58 via a bus 60, and performs arithmetic processing to achieve the teaching function described below while communicating with these components.
[0025] The memory 52 includes a RAM or a ROM and temporarily or permanently stores various data. The I/O interface 54 includes, for example, an Ethernet (trade name) port, a USB port, an optical fiber connector, or an HDMI (trade name) terminal and communicates data wiredly or wirelessly with an external device under a command from the processor 50.
[0026] The display device 56 includes a liquid crystal display or an organic EL display and displays various kinds of data visually under a command from the processor 50. The input device 58 includes a push button, a keyboard, a mouse, or a touch panel and accepts input data from an operator.
[0027] The teaching device 18 is configured to send a command to the robot 12 via the controller 16 in response to input data to the input device 58 and to allow the robot 12 to jog in accordance with the command. The display device 56 and the input device 58 may be integrated into the housing of the teaching device 18 or may be attached externally, separately from the housing.
[0028] The controller 16 controls the operation of the robot 12 and the vision sensor 14. Specifically, the controller 16 is a computer with a processor 70, a memory 72, an I/O interface 74, a display device 76, and an input device 78. The configurations and functions of the processor 70, the memory 72, the I/O interface 74, the display device 76, and the input device 78 are the same as those of the processor 50, the memory 52, the I/O interface 54, the display device 56, and the input device 58, which are described above, so that overlapping descriptions are omitted.
[0029] The processor 70 is communicably connected to the memory 72, the I/O interface 74, the display device 76, and the input device 78 via a bus 80 and, while communicating with these components, performs arithmetic processing to achieve the function of operating the robot 12 and the vision sensor 14. The I/O interface 54 of the teaching device 18, the respective servo motors 40 of the robot 12, and the vision sensor 14 are connected to the I/O interface 74, and the processor 70 communicates with these components through the I/O interface 74.
[0030] As illustrated in
[0031] On the other hand, a tool coordinate system C2 is set at the hand 30 of the robot 12 as illustrated in
[0032] The positional relationship between the tool coordinate system C2 and the robot coordinate system C1 is known, and the coordinates of the tool coordinate system C2 and the coordinates of the robot coordinate system C1 can be mutually transformed via a known transformation matrix M1 (e.g., the homogeneous transformation matrix). Thus, the origin position and the direction of each axis of the tool coordinate system C2 in the robot coordinate system C1 are represented as coordinates (X.sub.RT, Y.sub.RT, Z.sub.RT, W.sub.RT, P.sub.RT, R.sub.RT) of the robot coordinate system C1. Here, the coordinates (X.sub.RT, Y.sub.RT, Z.sub.RT) indicate the origin position of the tool coordinate system C2 in the robot coordinate system C1, and the coordinates (W.sub.RT, P.sub.RT, R.sub.RT) indicate the direction (so-called yaw, pitch, roll) of each axis of tool coordinate system C2 in the robot coordinate system C1.
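For illustration only, the following minimal sketch (Python with NumPy) shows how such coordinates (X, Y, Z, W, P, R) can be assembled into a 4×4 homogeneous transformation matrix of the kind denoted M1. It assumes that W, P, R are rotations about the fixed x-, y-, and z-axes applied in that order, which may differ from the convention of an actual controller.

```python
import numpy as np

def pose_to_matrix(x, y, z, w, p, r):
    """Build a 4x4 homogeneous transform from coordinates (X, Y, Z, W, P, R).

    Assumes W, P, R are rotations in degrees about the fixed x-, y-, and
    z-axes, applied in that order (R = Rz @ Ry @ Rx); the convention of an
    actual robot controller may differ.
    """
    w, p, r = np.radians([w, p, r])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(w), -np.sin(w)],
                   [0, np.sin(w), np.cos(w)]])
    ry = np.array([[np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0],
                   [np.sin(r), np.cos(r), 0],
                   [0, 0, 1]])
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx   # direction of each axis (orientation)
    t[:3, 3] = [x, y, z]       # origin position
    return t
```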
[0033] When the hand 30 is positioned at a predetermined position and orientation by the mechanical part 42 of the robot 12, the processor 70 of the controller 16 first sets the tool coordinate system C2 representing the predetermined position and orientation in the robot coordinate system C1. Subsequently, the processor 70 generates a command to each of the servo motors 40 to arrange the hand 30 at the position and orientation determined by the set tool coordinate system C2 and moves the hand 30 by operating the mechanical part 42 in response to the command. Thus, the processor 70 can position the hand 30 at the predetermined position and orientation by the operation of the mechanical part 42 in the robot coordinate system C1.
[0034] As illustrated in
[0035] Here, in the present embodiment, the vision sensor 14 is disposed at a known position of the robot coordinate system C1 by the retention frame 44. More specifically, the positional relationship between the sensor coordinate system C3 and the robot coordinate system C1 is known by calibration, and the coordinates of the sensor coordinate system C3 and the coordinates of the robot coordinate system C1 can be mutually transformed via a known transformation matrix M2 (e.g., the homogeneous transformation matrix). Thus, the position and orientation (i.e., the origin position and the direction of each axis of the sensor coordinate system C3) of the vision sensor 14 in the robot coordinate system C1 are known.
[0036] In an actual work line, the processor 70 of the controller 16 images the workpieces W, which are stacked in bulk in the container A, with the vision sensor 14 and, based on the imaged image data ID, executes the operation to cause the robot 12 to grip the workpiece W with the hand 30 and pick the workpiece W up from the container A.
[0037] In the example illustrated in
[0038] A workpiece coordinate system C4 is set at the workpiece W to perform the operation of the robot 12 gripping the workpiece W with the hand 30. The workpiece coordinate system C4 is a control coordinate system C that determines the position and orientation of the workpiece W in the robot coordinate system C1. In the present embodiment, the workpiece coordinate system C4 is set with respect to the workpiece W such that its origin is located at the center of the shaft W1, its y-axis is parallel to (or coincides with) the axis B, and its z-axis is parallel to the central axes of the through holes H1 and H2.
[0040] The following describes a method of teaching the robot 12 an operation for gripping the workpiece W in the robot system 10. First, an operator causes the hand 30 of the robot 12 to grip the workpiece W at a gripping position that the operator wants to teach. For example, the operator controls the teaching device 18 to jog the robot 12, so that the workpiece W is gripped by the hand 30.
[0041] More specifically, the operator, while viewing the display device 56 of the teaching device 18, operates the input device 58 to move the hand 30 by the mechanical part 42 of the robot 12 and inserts the closed claw parts 34 and 36 into the through hole H1 of the workpiece W arranged at a predetermined storage location. Subsequently, the operator grips the workpiece W by operating the input device 58 to open the claw parts 34 and 36 and pressing the claw parts 34 and 36 against the inner wall surface of the through hole H1.
[0042] At this time, the operator causes the robot 12 to jog so that the hand 30 grips the workpiece W at the gripping position that the operator wants to teach. Hereafter, a case in which the operator causes the hand 30 to grip the workpiece W at the gripping position illustrated in
[0043] Subsequently, the operator operates the input device 58 to cause the vision sensor 14 to image the workpiece W gripped by the hand 30. The vision sensor 14 receives an imaging command from the teaching device 18 via the controller 16 and images image data ID.sub.1 of the workpiece W. An example of the image data ID.sub.1 is illustrated in
[0044] In the example illustrated in
[0045] The processor 50 of the teaching device 18 acquires the image data ID.sub.1 from the vision sensor 14 through the controller 16 and the I/O interface 54. Thus, in the present embodiment, the processor 50 serves as an image data acquisition unit 82 (
[0046] Subsequently, based on the image data ID.sub.1, the processor 50 acquires the workpiece position data WPD.sub.1 indicating the position and orientation of the workpiece W in the robot coordinate system C1 when the vision sensor 14 images the image data ID.sub.1. Specifically, the processor 50 first obtains a workpiece model WM which models the workpiece W. This workpiece model WM is, for example, a three-dimensional CAD model, which is stored in advance in the memory 52.
[0047] The processor 50 analyzes the three-dimensional point cloud image of the workpiece W shown in the image data ID.sub.1 using a predetermined pattern matching parameter and arranges the workpiece model WM in the image data ID.sub.1 in a simulated manner so as to match the workpiece W shown in the image data ID.sub.1.
[0048] Subsequently, the processor 50 sets the workpiece coordinate system C4 in the positional relationship illustrated in
[0049] Here, the coordinates (X.sub.SW_1, Y.sub.SW_1, Z.sub.SW_1) indicate the origin position of the workpiece coordinate system C4 in the sensor coordinate system C3, and the coordinates (W.sub.SW_1, P.sub.SW_1, R.sub.SW_1) indicate the direction (so-called yaw, pitch, roll) of each axis of the workpiece coordinate system C4 in the sensor coordinate system C3. The processor 50 acquires the coordinates Q.sub.SW_1 of the workpiece coordinate system C4 in the sensor coordinate system C3 as data indicating the position and orientation, in the sensor coordinate system C3, of the workpiece W shown in the image data ID.sub.1.
[0050] Subsequently, the processor 50 uses the above-described transformation matrix M2 to transform the acquired coordinates Q.sub.SW_1 into the robot coordinate system C1, thereby obtaining coordinates Q.sub.RW_1 (X.sub.RW_1, Y.sub.RW_1, Z.sub.RW_1, W.sub.RW_1, P.sub.RW_1, R.sub.RW_1) (second coordinates) of the workpiece coordinate system C4 illustrated in
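Reusing the pose_to_matrix sketch above, this transformation reduces to a single matrix product; the numeric poses below are illustrative placeholders, not values from the embodiment.

```python
# M2: calibrated pose of the sensor frame C3 in the robot frame C1 (illustrative).
M2 = pose_to_matrix(800.0, 0.0, 1200.0, 180.0, 0.0, 0.0)
# T_SW_1: detected pose of the workpiece frame C4 in the sensor frame C3 (illustrative).
T_SW_1 = pose_to_matrix(15.0, -40.0, 600.0, 5.0, -3.0, 90.0)
# Coordinates Q_RW_1: the workpiece frame C4 expressed in the robot frame C1.
T_RW_1 = M2 @ T_SW_1
```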
[0051] The processor 50 acquires these coordinates Q.sub.RW_1 as workpiece position data WPD.sub.1. Thus, in the present embodiment, the processor 50 serves as a workpiece position acquisition unit 84 (
[0052] On the other hand, the processor 50 acquires hand position data HPD.sub.1 indicating the position and orientation of the hand 30 in the robot coordinate system C1 when the vision sensor 14 images the image data ID.sub.1. Specifically, the processor 50 acquires, as the hand position data HPD.sub.1, the coordinates Q.sub.RT_1 (X.sub.RT_1, Y.sub.RT_1, Z.sub.RT_1, W.sub.RT_1, P.sub.RT_1, R.sub.RT_1) (third coordinates) of the tool coordinate system C2 in the robot coordinate system C1 when the vision sensor 14 images the image data ID.sub.1. Thus, in the present embodiment, the processor 50 serves as a hand position acquisition unit 86 (
[0053] Subsequently, based on the workpiece position data WPD.sub.1 and the hand position data HPD.sub.1 that are acquired, the processor 50 acquires teaching position data TPD.sub.1 indicating the positional relationship between the hand 30 and the workpiece W in the control coordinate system C when the vision sensor 14 images the image data ID.sub.1.
[0054] As an example, based on the coordinates Q.sub.RW_1 acquired as the workpiece position data WPD.sub.1 and the coordinates Q.sub.RT_1 acquired as the hand position data HPD.sub.1, the processor 50 transforms the coordinates Q.sub.RW_1 into the tool coordinate system C2 represented by the coordinates Q.sub.RT_1. Since the coordinates Q.sub.RW_1 and Q.sub.RT_1 make the positional relationship between the tool coordinate system C2 and the workpiece coordinate system C4 in the robot coordinate system C1 known, the coordinates Q.sub.RW_1 of the workpiece coordinate system C4 in the robot coordinate system C1 can be transformed into the tool coordinate system C2.
[0055] With this coordinate transformation, the processor 50 acquires the coordinates Q.sub.TW_1 (X.sub.TW_1, Y.sub.TW_1, Z.sub.TW_1, W.sub.TW_1, P.sub.TW_1, R.sub.TW_1) of the workpiece coordinate system C4 in the tool coordinate system C2 when the vision sensor 14 images the image data ID.sub.1. These coordinates Q.sub.TW_1 are data indicating the position and orientation (i.e., the origin position and direction of each axis of the workpiece coordinate system C4) of the workpiece W relative to the hand 30 (i.e., tool coordinate system C2) when the vision sensor 14 images the image data ID.sub.1. The processor 50 acquires these coordinates Q.sub.TW_1 as the teaching position data TPD.sub.1.
[0056] As another example, based on the coordinates Q.sub.RW_1 as the workpiece position data WPD.sub.1 and the coordinates Q.sub.RT_1 as the hand position data HPD.sub.1, the processor 50 transforms the coordinates Q.sub.RT_1 as the hand position data HPD.sub.1 into the coordinates of the workpiece coordinate system C4 represented by the coordinates Q.sub.RW_1 as the workpiece position data WPD.sub.1. Thus, the processor 50 acquires coordinates Q.sub.WT_1 (X.sub.WT_1, Y.sub.WT_1, Z.sub.WT_1, W.sub.WT_1, P.sub.WT_1, R.sub.WT_1) of the tool coordinate system C2 in the workpiece coordinate system C4 when the vision sensor 14 images the image data ID.sub.1.
[0057] These coordinates Q.sub.WT_1 are data indicating the position and orientation (i.e., the origin position and the direction of each axis of the tool coordinate system C2) of the hand 30 relative to the workpiece W (i.e., the workpiece coordinate system C4) when the vision sensor 14 images the image data ID.sub.1. The processor 50 acquires these coordinates Q.sub.WT_1 as teaching position data TPD.sub.1.
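A minimal sketch of this relative-pose computation, under the same conventions as the earlier sketches (4×4 homogeneous transforms; names illustrative):

```python
import numpy as np

def relative_pose(T_RT, T_RW):
    """Teaching position data from one image: Q_TW (pose of the workpiece
    frame C4 in the tool frame C2) and its inverse Q_WT (pose of the tool
    frame C2 in the workpiece frame C4). Both inputs are 4x4 homogeneous
    transforms expressed in the robot frame C1."""
    T_TW = np.linalg.inv(T_RT) @ T_RW   # coordinates Q_TW
    T_WT = np.linalg.inv(T_TW)          # coordinates Q_WT
    return T_TW, T_WT
```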
[0058] Thus, in the present embodiment, the processor 50 serves as a teaching position acquisition unit 88 (
[0059] After acquiring the teaching position data TPD.sub.1, the operator operates the input device 58 to change the orientation of the hand 30 while gripping the workpiece W by the operation of the mechanical part 42 of the robot 12. For example, the operator operates the input device 58 to enter input data for rotating the hand 30 by a predetermined angle θ (e.g., 10 degrees) around the x-axis, y-axis, or z-axis of the tool coordinate system C2, which is set in the robot coordinate system C1 at this point.
[0060] Alternatively, the operator may enter input data to rotate the hand 30 by a predetermined angle θ about the x-axis, y-axis, or z-axis of the workpiece coordinate system C4, which is set in the robot coordinate system C1 at this point. For example, when the y-axis of the workpiece coordinate system C4 coincides with the axis B of the workpiece W, the operator may enter input data to rotate the hand 30 around the y-axis (i.e., axis B) of the workpiece coordinate system C4.
[0061] In response to input data from the operator, the processor 50 sends a command to the servo motors 40 of the robot 12 via the controller 16 to operate the mechanical part 42 so as to change the orientation of the hand 30 gripping the workpiece W. Thus, in the present embodiment, the processor 50 serves as a robot control unit 90 (
[0062] When the orientation of the hand 30 is changed, the operator operates the input device 58 to cause the vision sensor 14 to image the workpiece W gripped by the hand 30, and the processor 50, which serves as the image data acquisition unit 82, acquires image data ID.sub.2 of the workpiece W. Subsequently, the processor 50 serves as the workpiece position acquisition unit 84 and acquires the coordinates Q.sub.SW_2 (X.sub.SW_2, Y.sub.SW_2, Z.sub.SW_2, W.sub.SW_2, P.sub.SW_2, R.sub.SW_2) (first coordinates) of the workpiece coordinate system C4 in the sensor coordinate system C3 by applying the workpiece model WM to the image data ID.sub.2 by the above-described method.
[0063] Subsequently, the processor 50 serves as the workpiece position acquisition unit 84 and acquires the coordinates Q.sub.RW_2 (X.sub.RW_2, Y.sub.RW_2, Z.sub.RW_2, W.sub.RW_2, P.sub.RW_2, R.sub.RW_2) (second coordinates) of the workpiece coordinate system C4 in the robot coordinate system C1, as workpiece position data WPD.sub.2 when the image data ID.sub.2 is imaged, by transforming the acquired coordinates Q.sub.SW_2 into the robot coordinate system C1 by the above-described method.
[0064] In addition, the processor 50 serves as the hand position acquisition unit 86 and acquires the coordinates Q.sub.RT_2 (X.sub.RT_2, Y.sub.RT_2, Z.sub.RT_2, W.sub.RT_2, P.sub.RT_2, R.sub.RT_2) (third coordinates) of the tool coordinate system C2 in the robot coordinate system C1 as hand position data HPD.sub.2 during imaging of the image data ID.sub.2 by the above-described method.
[0065] Subsequently, the processor 50 serves as the teaching position acquisition unit 88 and acquires the coordinates Q.sub.TW_2 (X.sub.TW_2, Y.sub.TW_2, Z.sub.TW_2, W.sub.TW_2, P.sub.TW_2, R.sub.TW_2) of the workpiece coordinate system C4 in the tool coordinate system C2 or the coordinates Q.sub.WT_2 (X.sub.WT_2, Y.sub.WT_2, Z.sub.WT_2, W.sub.WT_2, P.sub.WT_2, R.sub.WT_2) of the tool coordinate system C2 in the workpiece coordinate system C4, as the teaching position data TPD.sub.2 during imaging of the image data ID.sub.2 by the above-described method.
[0066] In this way, the operator operates the input device 58 to repeatedly change the orientation of the hand 30, the vision sensor 14 images the workpiece W gripped by the hand 30 each time the orientation of the hand 30 is changed, and the processor 50 serves as the image data acquisition unit 82 to acquire a plurality of pieces of image data ID.sub.n (n=1, 2, 3, . . . ) imaged by the vision sensor 14.
[0067] The processor 50 serves as the workpiece position acquisition unit 84 to acquire workpiece position data WPD.sub.n: coordinates Q.sub.RW_n (X.sub.RW_n, Y.sub.RW_n, Z.sub.RW_n, W.sub.RW_n, P.sub.RW_n, R.sub.RW_n) when each image data ID.sub.n is imaged, by the above-described method and serves as the hand position acquisition unit 86 to acquire hand position data HPD.sub.n: coordinates Q.sub.RT_n (X.sub.RT_n, Y.sub.RT_n, Z.sub.RT_n, W.sub.RT_n, P.sub.RT_n, R.sub.RT_n) when each image data ID.sub.n is imaged, by the above-described method.
[0068] Subsequently, the processor 50 serves as the teaching position acquisition unit 88 and acquires teaching position data TPD.sub.n: coordinates Q.sub.TW_n (X.sub.TW_n, Y.sub.TW_n, Z.sub.TW_n, W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) or coordinates Q.sub.WT_n (X.sub.WT_n, Y.sub.WT_n, Z.sub.WT_n, W.sub.WT_n, P.sub.WT_n, R.sub.WT_n) when each image data ID.sub.n is imaged, based on the corresponding workpiece position data WPD.sub.n(coordinates Q.sub.RW_n) and the corresponding hand position data HPD.sub.n(coordinates Q.sub.RT_n) that are acquired, by the above-described method. Thus, the processor 50 can acquire a plurality of pieces of teaching position data TPD.sub.n corresponding to the various orientations of the hand 30 and the workpiece W.
[0069] Subsequently, the processor 50 serves as the teaching position acquisition unit 88 and obtains, based on the plurality of pieces of teaching position data TPD.sub.n that are acquired, new teaching position data TPD.sub.0 that is to be used in the operation to cause the robot 12 to actually grip the workpiece W with the hand 30. The following describes how the processor 50 obtains new teaching position data TPD.sub.0 upon acquiring the coordinates Q.sub.TW_n (X.sub.TW_n, Y.sub.TW_n, Z.sub.TW_n, W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) of the tool coordinate system C2 as the teaching position data TPD.sub.n.
[0070] First, for each of the plurality of coordinates Q.sub.TW_n, the processor 50 performs processing PR1, which excludes coordinates outside of the predetermined allowable range. Specifically, the processor 50 performs the processing PR1, which excludes coordinates outside of the allowable range, for coordinates (X.sub.TW_n, Y.sub.TW_n, Z.sub.TW_n) representing positions in the coordinates Q.sub.TW_n.
[0071] As an example, the processor 50 obtains the distance δ.sub.n from the origin of the tool coordinate system C2 from the equation δ.sub.n=(X.sub.TW_n.sup.2+Y.sub.TW_n.sup.2+Z.sub.TW_n.sup.2).sup.1/2. Subsequently, the processor 50 determines whether or not the obtained distance δ.sub.n is within a predetermined allowable range [δ.sub.th1, δ.sub.th2]. When the distance δ.sub.n is within the allowable range (i.e., δ.sub.th1≤δ.sub.n≤δ.sub.th2), the acquired coordinates Q.sub.TW_n are registered in the memory 52 in the active coordinates group GRP, whereas when the distance δ.sub.n is outside of the allowable range (i.e., δ.sub.n<δ.sub.th1 or δ.sub.th2<δ.sub.n), the acquired coordinates Q.sub.TW_n are excluded from the active coordinates group GRP (or deleted from the memory 52).
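A minimal sketch of this distance check (the range limits and sample values are illustrative placeholders for δ.sub.th1 and δ.sub.th2):

```python
import numpy as np

def within_distance_range(q_tw, d_min, d_max):
    """Position part of processing PR1: keep coordinates Q_TW_n whose origin
    lies at a distance delta_n = (X^2 + Y^2 + Z^2)^(1/2) from the tool frame
    origin inside the allowable range [d_min, d_max]."""
    delta_n = np.linalg.norm(np.asarray(q_tw, dtype=float)[:3])
    return d_min <= delta_n <= d_max

# Illustrative use: build the active coordinates group GRP.
q_tw_list = [(60.0, 5.0, 10.0, 0.0, 0.0, 90.0),   # plausible grip, kept
             (250.0, 0.0, 0.0, 0.0, 0.0, 90.0)]   # false detection, excluded
grp = [q for q in q_tw_list if within_distance_range(q, 50.0, 80.0)]
```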
[0072] As another example, the processor 50 obtains the average coordinates Q.sub.TW_AV (X.sub.TW_AV, Y.sub.TW_AV, Z.sub.TW_AV) of the coordinates Q.sub.TW_n (X.sub.TW_n, Y.sub.TW_n, Z.sub.TW_n). Specifically, the processor 50 obtains the average coordinates Q.sub.TW_AV from the equations X.sub.TW_AV=1/n·Σ(X.sub.TW_n), Y.sub.TW_AV=1/n·Σ(Y.sub.TW_n), and Z.sub.TW_AV=1/n·Σ(Z.sub.TW_n).
[0073] Furthermore, the processor 50 obtains the standard deviations σ.sub.X, σ.sub.Y, and σ.sub.Z of the coordinates X.sub.TW_n, Y.sub.TW_n, and Z.sub.TW_n, respectively. For example, the processor 50 obtains them from the equations σ.sub.X=(1/n·Σ{X.sub.TW_n−X.sub.TW_AV}.sup.2).sup.1/2, σ.sub.Y=(1/n·Σ{Y.sub.TW_n−Y.sub.TW_AV}.sup.2).sup.1/2, and σ.sub.Z=(1/n·Σ{Z.sub.TW_n−Z.sub.TW_AV}.sup.2).sup.1/2.
[0074] Subsequently, for each of the coordinates X.sub.TW_n, Y.sub.TW_n, and Z.sub.TW_n in the coordinates Q.sub.TW_n, the processor 50 determines the allowable range using the obtained average, the obtained standard deviation σ, and a predetermined coefficient α (e.g., α is a positive integer), as follows: [X.sub.TW_AV−α·σ.sub.X, X.sub.TW_AV+α·σ.sub.X] (i.e., X.sub.TW_AV−α·σ.sub.X≤X.sub.TW_n≤X.sub.TW_AV+α·σ.sub.X), [Y.sub.TW_AV−α·σ.sub.Y, Y.sub.TW_AV+α·σ.sub.Y] (i.e., Y.sub.TW_AV−α·σ.sub.Y≤Y.sub.TW_n≤Y.sub.TW_AV+α·σ.sub.Y), and [Z.sub.TW_AV−α·σ.sub.Z, Z.sub.TW_AV+α·σ.sub.Z] (i.e., Z.sub.TW_AV−α·σ.sub.Z≤Z.sub.TW_n≤Z.sub.TW_AV+α·σ.sub.Z).
[0075] The processor 50 determines whether or not the coordinates X.sub.TW_n are within the allowable range [X.sub.TW_AV−α·σ.sub.X, X.sub.TW_AV+α·σ.sub.X], whether or not the coordinates Y.sub.TW_n are within the allowable range [Y.sub.TW_AV−α·σ.sub.Y, Y.sub.TW_AV+α·σ.sub.Y], and whether or not the coordinates Z.sub.TW_n are within the allowable range [Z.sub.TW_AV−α·σ.sub.Z, Z.sub.TW_AV+α·σ.sub.Z].
[0076] Subsequently, the processor 50 registers the acquired coordinates Q.sub.TW_n in the active coordinates group GRP when all of the coordinates X.sub.TW_n, Y.sub.TW_n, and Z.sub.TW_n are within the allowable ranges, and excludes the acquired coordinates Q.sub.TW_n from the active coordinates group GRP when at least one of the coordinates X.sub.TW_n, Y.sub.TW_n, and Z.sub.TW_n is outside of its allowable range.
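A minimal sketch of this screening of the position components (the parameter alpha stands in for the predetermined coefficient α; its default value is illustrative):

```python
import numpy as np

def filter_by_mean_std(positions, alpha=2.0):
    """Position part of processing PR1: keep samples (X, Y, Z) whose every
    component lies within [mean - alpha*std, mean + alpha*std]; alpha plays
    the role of the predetermined coefficient in the text."""
    pos = np.asarray(positions, dtype=float)   # shape (n, 3)
    mean = pos.mean(axis=0)
    std = pos.std(axis=0)                      # sigma_X, sigma_Y, sigma_Z
    keep = np.all(np.abs(pos - mean) <= alpha * std, axis=1)
    return pos[keep], keep
```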
[0077] In addition, the processor 50 performs the processing PR1, which excludes the coordinates outside of the allowable range, for the coordinates (W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) representing the orientation in the coordinates Q.sub.TW_n. Specifically, the processor 50 first represents the coordinates (W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) indicating the orientation as a known 3×3 matrix M3.sub._n.
[0078] In this matrix M3.sub._n, a vector VT1.sub._n represented by three parameters in a first column is a unit vector representing the rotation component around the x-axis of the tool coordinate system C2, a vector VT2.sub._n represented by three parameters in a second column is a unit vector representing the rotation component around the y-axis of the tool coordinate system C2, and a vector VT3.sub._n represented by three parameters in a third column is a unit vector representing the rotation component around the z-axis of the tool coordinate system C2.
[0079] For example, the processor 50 obtains the inner product IP1.sub.n of the vector VT1.sub._n of the matrix M3.sub._n representing the first coordinates Q.sub.TW_n (W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) and the vector VT1.sub._n+1 of the matrix M3.sub._n+1 representing the second coordinates Q.sub.TW_n+1 (W.sub.TW_n+1, P.sub.TW_n+1, R.sub.TW_n+1). This inner product IP1.sub.n represents an angle θ1 (specifically, cos θ1) between the vector VT1.sub._n and the vector VT1.sub._n+1, that is, the amount of change in the rotation component around the x-axis of the tool coordinate system C2.
[0080] The processor 50 also obtains the inner product IP3.sub.n of the vector VT3.sub._n of the matrix M3.sub._n and the vector VT3.sub._n+1 of the matrix M3.sub._n+1. This inner product IP3.sub.n represents an angle θ3 (specifically, cos θ3) between the vector VT3.sub._n and the vector VT3.sub._n+1, that is, the amount of change in the rotation component around the z-axis of the tool coordinate system C2.
[0081] Subsequently, the processor 50 determines whether or not the obtained inner product IP1.sub.n is equal to or more than a predetermined threshold value IP1.sub.th (IP1.sub.n≥IP1.sub.th), and also determines whether or not the obtained inner product IP3.sub.n is equal to or more than a predetermined threshold value IP3.sub.th (IP3.sub.n≥IP3.sub.th). When IP1.sub.n≥IP1.sub.th and IP3.sub.n≥IP3.sub.th, the processor 50 registers both the acquired first coordinates Q.sub.TW_n and second coordinates Q.sub.TW_n+1 in the memory 52 in the active coordinates group GRP.
[0082] On the other hand, when IP1.sub.n<IP1.sub.th or IP3.sub.n<IP3.sub.th, the processor 50 excludes either the acquired first coordinates Q.sub.TW_n or the second coordinates Q.sub.TW_n+1 from the active coordinates group GRP (or deletes them from the memory 52). The operator may decide in advance which of the first coordinates Q.sub.TW_n and the second coordinates Q.sub.TW_n+1 are to be excluded.
[0083] The processor 50 may obtain the inner product IP1.sub.i of the vector VT1.sub._n of the matrix M3.sub._n representing the first coordinates Q.sub.TW_n (W.sub.TW_n, P.sub.TW_n, R.sub.TW_n) and each of the vectors VT1.sub._i (i is a positive integer other than n) of the matrices M3.sub._i representing the coordinates Q.sub.TW_i (W.sub.TW_i, P.sub.TW_i, R.sub.TW_i) other than the first coordinates Q.sub.TW_n. Similarly, the processor 50 may obtain an inner product IP3.sub.i of the vector VT3.sub._n of the matrix M3.sub._n of the first coordinates Q.sub.TW_n and each of the vectors VT3.sub._i of the matrices M3.sub._i of the coordinates Q.sub.TW_i other than the first coordinates Q.sub.TW_n.
[0084] Subsequently, the processor 50 may determine whether or not each of the obtained inner products IP1.sub.i is equal to or more than the threshold value IP1.sub.th (IP1.sub.i≥IP1.sub.th) and also whether or not each of the obtained inner products IP3.sub.i is equal to or more than the threshold value IP3.sub.th (IP3.sub.i≥IP3.sub.th). When at least one (or all) of the obtained inner products IP1.sub.i satisfies IP1.sub.i≥IP1.sub.th and at least one (or all) of the obtained inner products IP3.sub.i satisfies IP3.sub.i≥IP3.sub.th, the processor 50 may register the acquired first coordinates Q.sub.TW_n in the memory 52 in the active coordinates group GRP.
[0085] On the other hand, the processor 50 may exclude the acquired first coordinates Q.sub.TW_n from the active coordinates group GRP when all (or at least one) of the obtained inner products IP1.sub.i satisfy IP1.sub.i<IP1.sub.th, or when all (or at least one) of the obtained inner products IP3.sub.i satisfy IP3.sub.i<IP3.sub.th. The processor 50 may repeat such processing PR1 for all of the acquired coordinates Q.sub.TW_n.
[0086] Alternatively, the processor 50 obtains the resultant vector VT1.sub.R=Σ(VT1.sub._n) of the vectors VT1.sub._1, VT1.sub._2, VT1.sub._3, . . . VT1.sub._n and obtains the inner product IP1.sub.R_n of the resultant vector VT1.sub.R and each vector VT1.sub._n. Subsequently, the processor 50 determines whether or not the obtained inner product IP1.sub.R_n is equal to or more than a predetermined threshold value IP1.sub.Rth (IP1.sub.R_n≥IP1.sub.Rth). The processor 50 registers the coordinates Q.sub.TW_n in the memory 52 in the active coordinates group GRP when IP1.sub.R_n≥IP1.sub.Rth, and excludes the coordinates Q.sub.TW_n from the active coordinates group GRP (or deletes them from the memory 52) when IP1.sub.R_n<IP1.sub.Rth.
[0087] Similarly to the vector VT1.sub._n, the processor 50 can also determine the coordinates Q.sub.TW_n to be excluded from the active coordinates group GRP for the vector VT2.sub._n, by obtaining the resultant vector VT2.sub.R=Σ(VT2.sub._n), obtaining the inner product IP2.sub.R_n of the resultant vector VT2.sub.R and each vector VT2.sub._n, and comparing the obtained inner product with a threshold value IP2.sub.Rth; and for the vector VT3.sub._n, by obtaining the resultant vector VT3.sub.R=Σ(VT3.sub._n), obtaining the inner product IP3.sub.R_n of the resultant vector VT3.sub.R and each vector VT3.sub._n, and comparing the obtained inner product with a threshold value IP3.sub.Rth.
[0088] In this way, the processor 50 performs the exclusion processing PR1 for each of the plurality of coordinates Q.sub.TW_n. This processing PR1 makes it possible to exclude coordinates Q.sub.TW_n acquired through false detection. The threshold values δ.sub.th1, δ.sub.th2, IP1.sub.th, IP3.sub.th, IP1.sub.Rth, IP2.sub.Rth, and IP3.sub.Rth (and the coefficient α), which define the various allowable ranges described above, are predetermined by the operator.
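A minimal sketch of the resultant-vector variant of this orientation screening (the threshold ip_th stands in for IP1.sub.Rth and IP3.sub.Rth; its value is illustrative):

```python
import numpy as np

def filter_by_axis_agreement(rotations, ip_th=0.98):
    """Orientation part of processing PR1: exclude coordinates whose x-axis
    (VT1_n) or z-axis (VT3_n) column deviates from the group. ip_th stands
    in for the thresholds IP1_Rth / IP3_Rth (a cosine of the allowed
    deviation angle)."""
    rs = np.asarray(rotations, dtype=float)     # shape (n, 3, 3) matrices M3_n
    keep = np.ones(len(rs), dtype=bool)
    for col in (0, 2):                          # x-axis (VT1) and z-axis (VT3)
        resultant = rs[:, :, col].sum(axis=0)
        resultant /= np.linalg.norm(resultant)  # unit resultant vector
        ips = rs[:, :, col] @ resultant         # inner products IP_R_n
        keep &= ips >= ip_th
    return keep
```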
[0089] After the exclusion processing PR1, the processor 50 performs processing PR2 that averages the coordinates Q.sub.TW_m (where m indexes the coordinates Q.sub.TW_n that remain registered in the active coordinates group GRP). Specifically, the processor 50 obtains the average coordinates (X.sub.TW_0, Y.sub.TW_0, Z.sub.TW_0) of the coordinates (X.sub.TW_m, Y.sub.TW_m, Z.sub.TW_m) representing a position in the coordinates Q.sub.TW_m registered in the active coordinates group GRP.
[0090] Specifically, the processor 50 obtains the average coordinates (X.sub.TW_0, Y.sub.TW_0, Z.sub.TW_0) from the equations X.sub.TW_0=1/k·Σ(X.sub.TW_m), Y.sub.TW_0=1/k·Σ(Y.sub.TW_m), and Z.sub.TW_0=1/k·Σ(Z.sub.TW_m). In these equations, k indicates the number of the coordinates Q.sub.TW_m registered in the active coordinates group GRP.
[0091] Additionally, the processor 50 performs the processing PR2, which averages the coordinates (W.sub.TW_m, P.sub.TW_m, R.sub.TW_m) representing the orientation in the coordinates Q.sub.TW_m registered in the active coordinates group GRP. Specifically, for the coordinates (W.sub.TW_m, P.sub.TW_m, R.sub.TW_m) representing the orientation, the processor 50 obtains the resultant vector VT1.sub.R=Σ(VT1.sub._m) of the vectors VT1.sub._m and the resultant vector VT3.sub.R=Σ(VT3.sub._m) of the vectors VT3.sub._m, as described above.
[0092] Subsequently, the processor 50 obtains an outer product OP1 of the unit vector of the resultant vector VT1.sub.R and the unit vector of the resultant vector VT3.sub.R. This outer product OP1 represents a vector in the direction perpendicular to both of these unit vectors. Subsequently, the processor 50 obtains a unit vector VT2.sub.R by normalizing the vector represented by the outer product OP1.
[0093] Subsequently, the processor 50 obtains an outer product OP2 of the unit vector VT2.sub.R and the unit vector VT3.sub.R and obtains the unit vector VT1.sub.R by normalizing the vector represented by the outer product OP2. Thus, the processor 50 obtains unit vectors VT1.sub.R, VT2.sub.R and VT3.sub.R.
[0094] Subsequently, the processor 50 obtains the orientation (W.sub.TW_0, P.sub.TW_0, R.sub.TW_0) represented by these unit vectors VT1.sub.R, VT2.sub.R and VT3.sub.R. The coordinates in this orientation indicate the direction of each axis of the workpiece coordinate system C4 in the tool coordinate system C2. The x-axis direction of the workpiece coordinate system C4 is the direction of the above-described unit vector VT1.sub.R, the y-axis direction is the direction of the unit vector VT2.sub.R, and the z-axis direction is the direction of the unit vector VT3.sub.R.
[0095] Alternatively, to obtain the coordinates (W.sub.TW_0, P.sub.TW_0, R.sub.TW_0) of the orientation, the processor 50 may obtain the unit vector of the resultant vector VT2.sub.R=Σ(VT2.sub._m) of the vectors VT2.sub._m together with the above-described unit vector VT1.sub.R, and obtain an outer product OP3 of the unit vector VT1.sub.R and the unit vector VT2.sub.R.
[0096] Subsequently, the processor 50 may obtain the unit vector VT3.sub.R by normalizing the vector represented by the outer product OP3, obtain an outer product OP4 of the unit vector VT3.sub.R and the unit vector VT1.sub.R, and obtain the unit vector VT2.sub.R by normalizing the vector represented by the outer product OP4. The processor 50 can obtain the coordinates (W.sub.TW_0, P.sub.TW_0, R.sub.TW_0) of the orientation from the thus obtained unit vectors VT1.sub.R, VT2.sub.R and VT3.sub.R.
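A minimal sketch of this orientation averaging; the cross-product order is chosen here so that the averaged frame stays right-handed, which is an assumption about the embodiment's OP1/OP2 convention:

```python
import numpy as np

def average_orientation(rotations):
    """Orientation part of processing PR2: average the x-axis (VT1_m) and
    z-axis (VT3_m) columns of the registered matrices, then rebuild an
    orthonormal right-handed frame with two cross products."""
    rs = np.asarray(rotations, dtype=float)  # shape (k, 3, 3)
    x_r = rs[:, :, 0].sum(axis=0)            # resultant vector of VT1_m
    z_r = rs[:, :, 2].sum(axis=0)            # resultant vector of VT3_m
    z_hat = z_r / np.linalg.norm(z_r)
    y_hat = np.cross(z_hat, x_r)             # perpendicular to both (cf. OP1)
    y_hat /= np.linalg.norm(y_hat)
    x_hat = np.cross(y_hat, z_hat)           # re-orthogonalized x-axis (cf. OP2)
    return np.column_stack([x_hat, y_hat, z_hat])
```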
[0097] By the above-described method, the processor 50 performs the processing PR2, which averages coordinates Q.sub.TW_m registered in the active coordinates group GRP. As a result, the processor 50 can acquire the coordinates Q.sub.TW_0 (X.sub.TW_0, Y.sub.TW_0, Z.sub.TW_0, W.sub.TW_0, P.sub.TW_0, R.sub.TW_0) as the teaching position data TPD.sub.0.
[0098] These coordinates Q.sub.TW_0 indicate the origin position (X.sub.TW_0, Y.sub.TW_0, Z.sub.TW_0) of the workpiece coordinate system C4 in the tool coordinate system C2 and the direction of each axis (W.sub.TW_0, P.sub.TW_0, R.sub.TW_0). Thus, the processor 50 serves as the teaching position acquisition unit 88 and obtains one piece of teaching position data TPD.sub.0 (coordinates Q.sub.TW_0) from the acquired plurality of coordinates Q.sub.TW_n (n=1, 2, 3, 4 . . . ).
[0099] It is to be understood that even when obtaining the coordinates Q.sub.WT_n (X.sub.WT_n, Y.sub.WT_n, Z.sub.WT_n, W.sub.WT_n, P.sub.WT_n, R.sub.WT_n) of the tool coordinate system C2 in the workpiece coordinate system C4 as the teaching position data TPD.sub.n, the processor 50 can obtain coordinates Q.sub.WT_0 (X.sub.WT_0, Y.sub.WT_0, Z.sub.WT_0, W.sub.WT_0, P.sub.WT_0, R.sub.WT_0) of the tool coordinate system C2 in the workpiece coordinate system C4 as new teaching position data TPD.sub.0 by the above-described method.
[0100] Subsequently, the processor 50 uses the obtained teaching position data TPD.sub.0 to generate an operation program OP in which the teaching position data TPD.sub.0 (i.e., coordinates Q.sub.TW_0 or Q.sub.WT_0) is defined as an instruction code. Thus, the processor 50 serves as an operation program generating unit 92 (
[0101] In the actual work line, the processor 70 of the controller 16 operates the robot 12 in accordance with the operation program OP and performs the operation of gripping and picking up, with the hand 30, the workpieces W that are stacked in bulk in the container A. Specifically, the processor 70 operates the vision sensor 14 to image the workpiece W in the container A and acquires the imaged image data ID.sub.W (second image data) from the vision sensor 14.
[0102] Subsequently, the processor 70 acquires workpiece position data WPD.sub.W (second workpiece position data) indicating the position and orientation in the robot coordinate system C1 of the workpiece W shown in the image data ID.sub.W, based on the acquired image data ID.sub.W, in the same manner as the workpiece position acquisition unit 84 described above. Specifically, the processor 70 arranges the workpiece model WM to match the workpiece W shown in the image data ID.sub.W and sets the workpiece coordinate system C4 to the arranged workpiece model WM.
[0103] Subsequently, the processor 70 acquires the coordinates Q.sub.RW_W (X.sub.RW_W, Y.sub.RW_W, Z.sub.RW_W, W.sub.RW_W, P.sub.RW_W, R.sub.RW_W) of the workpiece coordinate system C4 in the robot coordinate system C1 as workpiece position data WPD.sub.W, by acquiring the coordinates Q.sub.SW_W (X.sub.SW_W, Y.sub.SW_W, Z.sub.SW_W, W.sub.SW_W, P.sub.SW_W, R.sub.SW_W) of the set workpiece coordinate system C4 in the sensor coordinate system C3 and transforming the coordinates Q.sub.SW_W into the robot coordinate system C1. Thus, the processor 70 acquires the workpiece position data WPD.sub.W (coordinates Q.sub.RW_W), which indicates the position and orientation of the workpiece W in the robot coordinate system C1.
[0104] Subsequently, based on the acquired workpiece position data WPD.sub.W, and teaching position data TPD.sub.0 defined in the operation program OP, the processor 70 determines the position and orientation of the hand 30 in the robot coordinate system C1 when gripping the workpiece W imaged by the vision sensor 14.
[0105] Specifically, the processor 70 obtains coordinates Q.sub.RT_0 (X.sub.RT_0, Y.sub.RT_0, Z.sub.RT_0, W.sub.RT_0, P.sub.RT_0, R.sub.RT_0) of the robot coordinate system C1, which has the positional relationship indicated by the teaching position data TPD.sub.0 relative to the workpiece coordinate system C4 represented by the coordinates Q.sub.RW_W, by using the coordinates Q.sub.RW_W acquired as the workpiece position data WPD.sub.W and the teaching position data TPD.sub.0 (specifically, coordinates Q.sub.TW_0 or Q.sub.WT_0). The processor 70 determines the position and orientation of the hand 30 when gripping the workpiece W in the robot coordinate system C1 by setting the tool coordinate system C2 to the obtained coordinates Q.sub.RT_0.
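Under the conventions of the earlier sketches, this determination is again a single matrix product; T_RW_W and T_TW_0 are assumed to have been obtained as described above:

```python
import numpy as np

# T_RW_W: detected pose of the workpiece frame C4 in the robot frame C1
# (coordinates Q_RW_W); T_TW_0: teaching position data TPD_0 expressed as
# the workpiece frame in the tool frame (coordinates Q_TW_0). Both are
# 4x4 homogeneous transforms obtained as described above.
# Since T_RW_W = T_RT_0 @ T_TW_0, the target tool pose Q_RT_0 follows as:
T_RT_0 = T_RW_W @ np.linalg.inv(T_TW_0)
# Equivalently, with T_WT_0 = inv(T_TW_0):  T_RT_0 = T_RW_W @ T_WT_0
```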
[0106] Subsequently, the processor 70 moves the hand 30 by operating the mechanical part 42 so that the hand 30, keeping the claw parts 34 and 36 closed, is arranged at the position and orientation determined by the tool coordinate system C2 set to the coordinates Q.sub.RT_0 of the robot coordinate system C1. The claw parts 34 and 36 are thereby inserted into the through hole H1 of the workpiece W.
[0107] Subsequently, the processor 70 grips the large ring W2 of the workpiece W with the claw parts 34 and 36 by operating the claw part drive part 38 to open the claw parts 34 and 36. As a result, as illustrated in
[0108] Subsequently, the processor 70 can pick up the workpiece W by operating the mechanical part 42 to retract the hand 30 gripping the workpiece W from the container A. The processor 70 then performs the work of picking up the workpieces W stacked in bulk in the container A with the hand 30, by repeatedly executing the above-described sequence of operations for the respective workpieces W in the container A.
[0109] As described above, in the present embodiment, the processor 50 of the teaching device 18 serves as the image data acquisition unit 82, the workpiece position acquisition unit 84, the hand position acquisition unit 86, the teaching position acquisition unit 88, the robot control unit 90, and the operation program generating unit 92 to teach the position and orientation at which the robot 12 grips the workpiece W with the hand 30 in the robot coordinate system C1.
[0110] Thus, the image data acquisition unit 82, the workpiece position acquisition unit 84, the hand position acquisition unit 86, the teaching position acquisition unit 88, the robot control unit 90, and the operation program generating unit 92 constitute a device 100 (
[0111] In this device 100, the image data acquisition unit 82 acquires image data ID.sub.n in which the vision sensor 14 that is arranged at a known position of the control coordinate system C (robot coordinate system C1) images the workpiece W when the robot 12 grips the workpiece W with the hand 30. The workpiece position acquisition unit 84 acquires, based on the image data ID.sub.n, the workpiece position data WPD.sub.n indicating the position and orientation of the workpiece W in the control coordinate system C (robot coordinate system C1) at the time of imaging the image data ID.sub.n.
[0112] Additionally, the hand position acquisition unit 86 acquires hand position data HPD.sub.n indicating the position and orientation of the hand 30 in the control coordinate system C (robot coordinate system C1) at the time of imaging the image data ID.sub.n. The teaching position acquisition unit 88 acquires the teaching position data TPD.sub.n indicating the positional relationship between the hand 30 and the workpiece W in the control coordinate system C (tool coordinate system C2, workpiece coordinate system C4) at the time of imaging the image data ID.sub.n, based on the workpiece position data WPD.sub.n and the hand position data HPD.sub.n.
[0113] Thus, by acquiring the teaching position data TPD.sub.n, based on the image data ID.sub.n imaged when the hand 30 grips the workpiece W at the gripping position that the operator wants to teach, the gripping position that the operator wants to teach can be taught to the robot 12 with high accuracy.
[0114] Furthermore, in the device 100, the robot control unit 90 operates the robot 12 to repeatedly change the orientation of the hand 30 gripping the workpiece W. The image data acquisition unit 82 acquires a plurality of pieces of image data ID.sub.n imaged by the vision sensor 14 each time the robot control unit 90 changes the orientation of the hand 30. In addition, the workpiece position acquisition unit 84 acquires the workpiece position data WPD.sub.n based on each image data ID.sub.n, and the hand position acquisition unit 86 acquires the hand position data HPD.sub.n at the time of imaging each image data ID.sub.n.
[0115] Subsequently, the teaching position acquisition unit 88 acquires the teaching position data TPD.sub.n at the time of imaging each piece of image data ID.sub.n based on the corresponding workpiece position data WPD.sub.n and the corresponding hand position data HPD.sub.n. By collecting a plurality of pieces of teaching position data TPD.sub.n based on pieces of the image data ID.sub.n of the workpiece W in various orientations in this way, the gripping position of the workpiece W can be taught to the robot 12 with higher accuracy.
[0116] Additionally, in the device 100, the teaching position acquisition unit 88 obtains new teaching position data TPD.sub.0, which is to be used for the operation to cause the robot 12 to grip the workpiece W with the hand 30, based on the plurality of pieces of teaching position data TPD.sub.n. Thus, by obtaining the teaching position data TPD.sub.0 from the plurality of pieces of teaching position data TPD.sub.n corresponding to pieces of the image data ID.sub.n of the workpiece W in various orientations, the position and orientation of the hand 30 at the time of gripping each of the workpieces W in various orientations can be determined with higher accuracy by the teaching position data TPD.sub.0.
[0117] In the device 100, the teaching position data TPD.sub.n is represented as the coordinates Q.sub.TW_n, Q.sub.WT_n of the control coordinate system C (tool coordinate system C2, workpiece coordinate system C4).
[0118] The teaching position acquisition unit 88 obtains new teaching position data TPD.sub.0 by excluding coordinates that are outside of the predetermined allowable range among the coordinates Q.sub.TW_n, Q.sub.WT_n of the plurality of pieces of teaching position data TPD.sub.n and obtaining the average of the coordinates Q.sub.TW_m, Q.sub.WT_m.
[0119] According to this configuration, coordinates Q.sub.TW_n, Q.sub.WT_n acquired by false detection or the like can be excluded, and by averaging the coordinates Q.sub.TW_m, Q.sub.WT_m, more accurate teaching position data TPD.sub.0 can be obtained. Thus, the position and orientation of the hand 30 at the time of gripping each of the workpieces W in various orientations can be determined with higher accuracy. In the device 100, the operation program generating unit 92 generates the operation program OP in which the teaching position data TPD.sub.0 is defined. With this configuration, the operation program OP that defines the teaching position data TPD.sub.0 acquired as described above can be generated automatically.
[0120] In the device 100, the workpiece position acquisition unit 84 acquires, as workpiece position data WPD.sub.n, data indicating the position and orientation of the workpiece model WM in the control coordinate system C (robot coordinate system C1) when the workpiece model WM is matched to the workpiece W (three-dimensional point cloud image) shown in the image data ID.sub.n. With this configuration, the workpiece position data WPD.sub.n can be detected with high accuracy from the image data ID.sub.n imaged by the vision sensor 14.
[0121] Furthermore, in the device 100, the control coordinate system C includes the robot coordinate system C1, the workpiece coordinate system C4, the tool coordinate system C2 whose positional relationship with the robot coordinate system C1 is known, and the sensor coordinate system C3 whose positional relationship with the robot coordinate system C1 is known. The vision sensor 14 is disposed at a known position in the robot coordinate system C1.
[0122] Subsequently, the workpiece position acquisition unit 84 acquires the first coordinates Q.sub.SW_n of the workpiece coordinate system C4 in the sensor coordinate system C3, which indicates the position and orientation of the workpiece W shown in the image data ID.sub.n, and transforms the first coordinates Q.sub.SW_n into the robot coordinate system C1, thereby acquiring the second coordinates Q.sub.RW_n of the workpiece coordinate system C4 in the robot coordinate system C1 as the workpiece position data WPD.sub.n.
[0123] In addition, the hand position acquisition unit 86 acquires the third coordinates Q.sub.RT_n of the tool coordinate system C2 in the robot coordinate system C1 indicating the position and orientation of the hand 30 as the hand position data HPD.sub.n. The teaching position acquisition unit 88 acquires the teaching position data TPD.sub.n, as the coordinates Q.sub.TW_n of the workpiece coordinate system C4 in the tool coordinate system C2, or as the coordinates Q.sub.WT_n of the tool coordinate system C2 in the workpiece coordinate system C4, based on the second coordinates Q.sub.RW_n and the third coordinates Q.sub.RT_n. According to this configuration, the teaching position data TPD.sub.n can be acquired as the coordinates Q.sub.TW_n or Q.sub.WT_n in the control coordinate system C, based on the robot coordinate system C1, the tool coordinate system C2, the sensor coordinate system C3, and the workpiece coordinate system C4, which are used as the control coordinate system C.
[0124] In the above-described embodiment, the operator manually operates the teaching device 18 to change the orientation of the hand 30 that grips the workpiece W. However, without being limited to this, the processor 50 may automatically perform a series of operations such as changing the orientation of the hand 30, acquiring the image data ID.sub.n, acquiring the workpiece position data WPD.sub.n, acquiring the hand position data HPD.sub.n, and acquiring the teaching position data TPD.sub.n.
[0125] Such an embodiment will be described below with reference to
[0126] In step S1, the processor 50 sets the number n, which defines the number of pieces of teaching position data TPD.sub.n to be acquired, to 1. This number n corresponds to n of the image data ID.sub.n, the workpiece position data WPD.sub.n, the hand position data HPD.sub.n, and the teaching position data TPD.sub.n described above.
[0127] In step S2, the processor 50 operates the mechanical part 42 of the robot 12 to move the workpiece W gripped by the hand 30 so as to be within the field of view of the vision sensor 14. As an example, the processor 50 may move the hand 30 along a predetermined movement path. This movement path can be defined by teaching the robot 12 in advance by the operator. As another example, the processor 50 may acquire coordinates of tool coordinate system C2 and sensor coordinate system C3 in the robot coordinate system C1 and move the hand 30 based on the coordinates.
[0128] In step S3, the processor 50 serves as the image data acquisition unit 82 to acquire the image data ID.sub.n. Specifically, the processor 50 sends an imaging command to the vision sensor 14 and operates the vision sensor 14 to image the image data ID.sub.n of the workpiece W gripped by the hand 30. Subsequently, the processor 50 serves as the image data acquisition unit 82 to acquire the image data ID.sub.n from the vision sensor 14.
[0129] In step S4, the processor 50 serves as the workpiece position acquisition unit 84 to acquire, by the above-described method, the workpiece position data WPD.sub.n (e.g., coordinates Q.sub.RW_n) based on the image data ID.sub.n acquired in the most recent step S3. In step S5, the processor 50 serves as the hand position acquisition unit 86 to acquire, by the above-described method, the hand position data HPD.sub.n (e.g., coordinates Q.sub.RT_n) at the time of imaging the image data ID.sub.n acquired in the most recent step S3.
[0130] In step S6, the processor 50 serves as the teaching position acquisition unit 88 to acquire, by the above-described method, the teaching position data TPD.sub.n (e.g., coordinates Q.sub.TW_n or Q.sub.WT_n) based on the workpiece position data WPD.sub.n acquired in the most recent step S4 and the hand position data HPD.sub.n acquired in the most recent step S5.
[0131] In step S7, the processor 50 determines whether or not the number n has reached a predetermined number n.sub.MAX (n=n.sub.MAX). This number n.sub.MAX can be predetermined by the operator. When n=n.sub.MAX, the processor 50 determines YES and proceeds to step S10, whereas when n<n.sub.MAX, it determines NO and proceeds to step S8. In step S8, the processor 50 increments the number n by 1 (n=n+1).
[0132] In step S9, the processor 50 serves as the robot control unit 90 to operate the mechanical part 42 of the robot 12 to change the orientation of the hand 30 gripping the workpiece W. Specifically, the processor 50 rotates the hand 30 around the x-axis, y-axis, or z-axis of the tool coordinate system C2, which is set in the robot coordinate system C1 at this point, by a predetermined angle θ.
[0133] Alternatively, the processor 50 rotates the hand 30 around the x-axis, y-axis (i.e., the axis B of the workpiece W), or z-axis of the workpiece coordinate system C4, which is set in the robot coordinate system C1 at this point, by the angle θ. The angle θ and the direction in which the hand 30 is rotated in this step S9 may be predetermined by the operator. Alternatively, the processor 50 may determine them automatically (e.g., randomly) each time this step S9 is performed, such that the whole of the workpiece W comes within the field of view of the vision sensor 14, by taking into account the positional relationship between the tool coordinate system C2 (or workpiece coordinate system C4) and the sensor coordinate system C3 set at this point.
[0134] After step S9, the processor 50 returns to step S3 and repeats the loop of steps S3 to S9 until determining YES in step S7. Thus, the processor 50 can automatically acquire a plurality of pieces of teaching position data TPD.sub.n (n=1, 2, 3, 4 . . . ).
[0135] When determining YES in step S7, the processor 50 serves, in step S10, as the teaching position acquisition unit 88 to obtain new teaching position data TPD.sub.0 (e.g., coordinates Q.sub.TW_0 or Q.sub.WT_0) based on the plurality of pieces of teaching position data TPD.sub.n acquired by the above-described method.
[0136] In step S11, the processor 50 serves as the operation program generating unit 92 to generate the operation program OP in which the teaching position data TPD.sub.0 (i.e., coordinates Q.sub.TW_0 or Q.sub.WT_0) is defined, as in the above-described embodiment. As described above, according to the present embodiment, the processor 50 can automatically generate the operation program OP by automatically executing a series of operations of steps S1 to S11. With this configuration, the process of teaching the robot 12 the position to grip the workpiece W can be accelerated and simplified.
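The flow of steps S1 to S11 may be summarized, purely as an illustration, in the following sketch, in which every callable passed in is a hypothetical stand-in for the corresponding processing described above; none of these names appear in the source.

```python
# A minimal sketch of the automated flow of steps S1 to S11.
def acquire_and_teach(move_into_view, capture_image, find_workpiece,
                      read_hand_pose, compute_tpd, perturb_hand,
                      refine_tpd, make_program, n_max):
    teaching_data = []                      # pieces of teaching position data TPD_n
    n = 1                                   # step S1
    move_into_view()                        # step S2: bring workpiece W into view
    while True:
        image = capture_image()             # step S3: image data ID_n
        wpd = find_workpiece(image)         # step S4: workpiece position data WPD_n
        hpd = read_hand_pose()              # step S5: hand position data HPD_n
        teaching_data.append(compute_tpd(wpd, hpd))  # step S6: TPD_n
        if n == n_max:                      # step S7: YES -> proceed to step S10
            break
        n += 1                              # step S8
        perturb_hand()                      # step S9: rotate hand 30 by angle θ
    tpd_0 = refine_tpd(teaching_data)       # step S10: exclude and average
    return make_program(tpd_0)              # step S11: operation program OP
```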
[0137] In the above-described step S9, the processor 50 may serve as the robot control unit 90 to change the position of the hand 30 instead of changing its orientation. Specifically, in step S9, the processor 50 may operate the mechanical part 42 of the robot 12 to change the position of the hand 30 by translationally moving the hand 30 by a predetermined distance δ in the direction of the x-axis, y-axis, or z-axis of the tool coordinate system C2 (or workpiece coordinate system C4), which is set in the robot coordinate system C1 at this point. Here, translational movement can be defined as an operation that moves the hand 30 without changing its orientation (i.e., the direction of each axis of the tool coordinate system C2).
[0138] Alternatively, each time step S9 is performed, the processor 50 may alternate between an operation to change the orientation of the hand 30 and an operation to change the position of the hand 30, or may change the position of the hand 30 together with its orientation.
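Purely as an illustration, the two kinds of change step S9 may apply, a rotation about a tool-frame axis by the angle θ and a translation along a tool-frame axis by the distance δ, can be sketched with homogeneous matrices as follows. Post-multiplying in the tool frame leaves the rotation part of the pose untouched in the translational case, which is exactly the definition of translational movement above. The function and variable names are assumptions.

```python
# A minimal sketch of the two perturbations step S9 may apply to the hand pose
# T_RT (tool coordinate system C2 expressed in robot coordinate system C1).
import numpy as np

def rotate_about_tool_z(T_RT, theta):
    """Rotate the hand about the z-axis of the tool coordinate system C2."""
    c, s = np.cos(theta), np.sin(theta)
    Rz = np.array([[c, -s, 0, 0],
                   [s,  c, 0, 0],
                   [0,  0, 1, 0],
                   [0,  0, 0, 1]])
    return T_RT @ Rz          # post-multiply: rotation expressed in the tool frame

def translate_along_tool_z(T_RT, delta):
    """Translate the hand along the z-axis of C2 without changing orientation."""
    D = np.eye(4)
    D[2, 3] = delta
    return T_RT @ D           # rotation columns of T_RT are left unchanged
```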
[0139] In the above-described embodiment, the case in which the functions of the device 100 are implemented in the teaching device 18 is described. However, without being limited to this, the functions of the device 100 can also be implemented in the controller 16 (or the vision sensor 14). In this case, the processor 70 of the controller 16 (or a processor of the vision sensor 14) serves as the device 100 (i.e., the image data acquisition unit 82, the workpiece position acquisition unit 84, the hand position acquisition unit 86, the teaching position acquisition unit 88, the robot control unit 90, and the operation program generating unit 92).
[0140] Additionally, some of the image data acquisition unit 82, the workpiece position acquisition unit 84, the hand position acquisition unit 86, the teaching position acquisition unit 88, the robot control unit 90, and the operation program generating unit 92 may be implemented in one of the controller 16, the teaching device 18, and the vision sensor 14, while the remaining units may be implemented in another of the controller 16, the teaching device 18, and the vision sensor 14.
[0141] For example, the image data acquisition unit 82 may be implemented in the vision sensor 14, the workpiece position acquisition unit 84, the hand position acquisition unit 86, and the robot control unit 90 may be implemented in the controller 16, and the teaching position acquisition unit 88 and the operation program generating unit 92 may be implemented in the teaching device 18. In this case, the processor of the vision sensor 14, the processor 70 of the controller 16, and the processor 50 of the teaching device 18 constitute the device 100.
[0142] In the above-described embodiment, the processor 50 may serve as the hand position acquisition unit 86, acquire data indicating the position and orientation of the hand 30 in the sensor coordinate system C3 based on the image data ID.sub.n, and acquire the hand position data HPD.sub.n from the data. More specifically, as illustrated in
[0143] When serving as the image data acquisition unit 82 to acquire the image data ID.sub.n from the vision sensor 14, the processor 50 acquires a hand model 30M, which models the hand 30, together with the above-described workpiece model WM. This hand model 30M is, for example, a three-dimensional CAD model stored in advance in the memory 52.
[0144] The processor 50 analyzes the three-dimensional point cloud image of the hand 30 shown in the image data ID.sub.n using a predetermined pattern matching parameter and arranges the hand model 30M in a simulated manner to match the hand 30 shown in the image data ID.sub.n. Subsequently, the processor 50 sets the tool coordinate system C2 with the positional relationship illustrated in
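The source does not specify the pattern matching algorithm itself. Purely as one hypothetical illustration of the final fitting step: once matching has established point correspondences between the hand model 30M and the three-dimensional point cloud, the rigid transform placing the model in the sensor coordinate system C3 can be recovered with the well-known Kabsch algorithm, sketched below.

```python
# A minimal sketch of a rigid fit of model points to corresponding cloud points
# (Kabsch algorithm). Assumes correspondences were established by the matching
# step, which is not detailed in the source.
import numpy as np

def rigid_fit(model_pts, cloud_pts):
    """Return the 4x4 transform mapping model points onto cloud points."""
    mu_m, mu_c = model_pts.mean(axis=0), cloud_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (cloud_pts - mu_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against an improper (reflected) fit
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = mu_c - R @ mu_m
    return T                       # pose of the model in the sensor frame (C3)
```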
[0145] It is also possible to make the vision sensor 14 perform these functions. For example, by applying the workpiece model WM and the hand model 30M to the imaged image data ID.sub.n, the vision sensor 14 (specifically, the processor) may acquire the coordinates Q.sub.ST_n of the tool coordinate system C2 in the sensor coordinate system C3 and the coordinates Q.sub.SW_n of the workpiece coordinate system C4 in the sensor coordinate system C3 and provide them to the teaching device 18.
[0146] Alternatively, the vision sensor 14 may serve as the workpiece position acquisition unit 84 to acquire the coordinates Q.sub.SW_n of the workpiece coordinate system C4 in the sensor coordinate system C3 as the workpiece position data WPD.sub.n and serve as the hand position acquisition unit 86 to acquire the coordinates Q.sub.ST_n of the tool coordinate system C2 in the sensor coordinate system C3 as the hand position data HPD.sub.n.
[0147] Subsequently, the vision sensor 14 may serve as the teaching position acquisition unit 88 to acquire the teaching position data TPD.sub.n (e.g., coordinates Q.sub.TW_n or Q.sub.WT_n) based on the workpiece position data WPD.sub.n (coordinates Q.sub.SW_n) and the hand position data HPD.sub.n (coordinates Q.sub.ST_n). That is, in this case, the processor of the vision sensor 14 serves as the device 100.
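A minimal sketch of this sensor-side computation, again assuming 4×4 homogeneous matrices: since both Q.sub.ST_n and Q.sub.SW_n are expressed in the sensor coordinate system C3, the teaching position data can be computed without reference to the robot coordinate system C1. The identity values are placeholders.

```python
# Computing TPD_n entirely in the sensor frame, per paragraph [0147].
import numpy as np

T_ST = np.eye(4)                    # Q_ST_n: tool frame C2 in sensor frame C3
T_SW = np.eye(4)                    # Q_SW_n: workpiece frame C4 in sensor frame C3
T_TW = np.linalg.inv(T_ST) @ T_SW   # teaching position data as Q_TW_n
T_WT = np.linalg.inv(T_TW)          # or, equivalently, as Q_WT_n
```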
[0148] In the above-described embodiment, the processor 50 performs the processing PR1 for excluding coordinates outside the allowable range from the plurality of coordinates Q.sub.TW_n or Q.sub.WT_n and subsequently performs the processing PR2 for averaging the coordinates Q.sub.TW_m or Q.sub.WT_m registered in the active coordinates group GRP. However, without being limited to this, the processor 50 may execute only one of the excluding processing PR1 and the averaging processing PR2 on the plurality of coordinates Q.sub.TW_n or Q.sub.WT_n.
[0149] For example, the processor 50 may obtain new teaching position data TPD.sub.0 by executing the averaging processing PR2 on the acquired plurality of coordinates Q.sub.TW_n or Q.sub.WT_n, without performing the excluding processing PR1. Alternatively, the processor 50 may perform only the excluding processing PR1 on the plurality of coordinates Q.sub.TW_n or Q.sub.WT_n and automatically select, in accordance with predetermined conditions, one piece of teaching position data TPD.sub.0 from among the coordinates Q.sub.TW_m or Q.sub.WT_m registered in the active coordinates group GRP as a result of the processing PR1.
[0150] In the above-described embodiment, the case in which the processor 50 obtains new teaching position data TPD.sub.0 based on the acquired plurality of pieces of teaching position data TPD.sub.n is described. However, without being limited to this, the processor 50 may, for example, generate image data in which the acquired plurality of pieces of teaching position data TPD.sub.n are displayed in list form and display it on the display device 56.
[0151] Subsequently, the operator operates the input device 58 to give the processor 50 input data to select the desired teaching position data TPD.sub.0 from among the plurality of pieces of teaching position data TPD.sub.n displayed on the display device 56. On receiving the input data, the processor 50 generates the operation program OP in which teaching position data TPD.sub.0 selected by the operator is defined.
[0152] In the above-described embodiment, the case is described in which the processor 50 acquires a plurality of pieces of teaching position data TPD.sub.n by repeatedly changing the orientation of the hand 30 by serving as the robot control unit 90. However, without being limited to this, the processor 50 may acquire only the above-described teaching position data TPD.sub.1 without changing the orientation of the hand 30. In this case, the processor 50 may generate the operation program OP in which the teaching position data TPD.sub.1 is defined. That is, the robot control unit 90 can be omitted from the device 100 in this case.
[0153] In the above-described embodiment, the case is described in which the processor 50 serves as the operation program generating unit 92 to generate the operation program OP. However, without being limited to this, the operator may manually generate the operation program OP based on the teaching position data TPD.sub.n acquired by the processor 50. That is, in this case, the operation program generating unit 92 can be omitted from the device 100.
[0154] In the above-described embodiment, the case is described in which the processor 50 acquires the workpiece position data WPD.sub.n by applying the workpiece model WM to the image data ID.sub.n. However, without being limited to this, the processor 50 can also acquire the workpiece position data WPD.sub.n by analyzing the image of the workpiece W shown in the image data ID.sub.n without using the workpiece model WM.
[0155] In the above-described embodiment, the case is described in which the teaching position data TPD.sub.n is acquired based on the robot coordinate system C1, the tool coordinate system C2, the sensor coordinate system C3, and the workpiece coordinate system C4, as the control coordinate system C. However, without being limited to this, for example, teaching position data TPD.sub.n can be acquired based on a world coordinate system C5. The world coordinate system C5 is the control coordinate system C that determines the three-dimensional space of a work cell and is fixedly set in the work cell.
[0156] Note that the vision sensor 14 may be fixed to the mechanical part 42 of the robot 12 (e.g., the upper arm 26 or the lower arm 24) instead of the retention frame 44. In this case, the vision sensor 14 is attached to the mechanical part 42 so that the workpiece W gripped by the hand 30 can be imaged. In the above-described embodiment, a case is described in which the workpiece position data WPD.sub.n, the hand position data HPD.sub.n, and the teaching position data TPD.sub.n are coordinates Q in the control coordinate system C; however, without being limited to this, they may be represented as any other data.
[0157] Furthermore, instead of gripping the large ring W2, the hand 30 may grip the small ring W3 by pressing the claw parts 34 and 36 against the inner wall surface of the through hole H2. In this case, the teaching device 18 teaches the position and orientation at which the hand 30 grips the small ring W3 using a method similar to the above-described embodiment.
[0158] The workpiece W is not limited to the form illustrated in
[0159] The vision sensor 14 may be a two-dimensional camera. In this case, the robot system 10 may further include a distance sensor that is fixed to the vision sensor 14 and can measure the distance d between the vision sensor 14 and the subject (workpiece W). The teaching device 18 may also be directly connected to the robot 12 (servo motor 40) or the vision sensor 14. While the present disclosure has been described above through the embodiments, the above-described embodiments do not limit the invention according to the claims.
REFERENCE SIGNS LIST
[0160] 10 Robot system
[0161] 12 Robot
[0162] 14 Vision sensor
[0163] 16 Controller
[0164] 18 Teaching device
[0165] 30 Hand
[0166] 70, 100 Processor
[0167] 82 Image data acquisition unit
[0168] 84 Workpiece position acquisition unit
[0169] 86 Hand position acquisition unit
[0170] 88 Teaching position acquisition unit
[0171] 90 Robot control unit
[0172] 92 Operation program generating unit