CONTROL DEVICE AND CONTROL METHOD OF ROBOT, ROBOT SYSTEM, AND DEVICE AND METHOD OF GENERATING OPERATION PROGRAM OF ROBOT
20230191613 · 2023-06-22
CPC classification: B25J9/1633; G05B2219/40584
Abstract
A control device of a robot includes a robot control section configured to control the robot so as to sequentially position the robot at a plurality of target positions, which are set based on shape data representing a shape of a workpiece, and cause the robot to execute a work along a work target portion on the workpiece, and cause the robot to continue the work beyond a final target position of the plurality of target positions after the robot reaches the final target position, the final target position being set to correspond to an end of the work target portion in the shape data.
Claims
1. A control device of a robot, comprising a robot control section configured to control the robot so as to: sequentially position the robot at a plurality of target positions, which are set based on shape data representing a shape of a workpiece, and cause the robot to execute a work along a work target portion on the workpiece; and cause the robot to continue the work beyond a final target position of the plurality of target positions after the robot reaches the final target position, the final target position being set to correspond to an end of the work target portion in the shape data.
2. The control device of claim 1, further comprising a storage configured to store an operation program in which the plurality of target positions and an additional target position are defined, the additional target position being set at a position separated from the final target position by a predetermined distance in a predetermined extension direction, wherein the robot control section operates the robot in accordance with the operation program so as to cause the robot to continue the work to the additional target position after the robot reaches the final target position.
3. The control device of claim 1, wherein the robot control section causes the robot to execute the work by pressing a tool of the robot against the workpiece, wherein the control device further comprises: a force acquisition section configured to acquire data of a force applied from the workpiece to the robot when the robot executes the work; and a force determination section configured to determine whether or not the force is equal to or less than a predetermined threshold value while the robot continues the work beyond the final target position.
4. The control device of claim 3, wherein the robot control section causes the robot to end the work when the force determination section determines that the force is equal to or less than the threshold value.
5. The control device of claim 3, wherein the robot control section is configured to: execute copying control for controlling a pressing force, by which the robot presses the tool against the workpiece, to a predetermined target value during execution of the work, based on the force acquired by the force acquisition section; and end the copying control when the force determination section determines that the force is equal to or less than the threshold value.
6. The control device of claim 1, wherein the robot control section causes the robot to execute the work by pressing a tool of the robot against the workpiece, wherein the control device further comprises: a movement amount acquisition section configured to acquire a movement amount of the robot in a direction in which the tool is pressed against the workpiece when the robot executes the work; and a movement determination section configured to determine whether or not the movement amount exceeds a predetermined threshold value while the robot continues the work beyond the final target position.
7. The control device of claim 6, wherein the robot control section causes the robot to end the work when the movement determination section determines that the movement amount exceeds the threshold value.
8. The control device of claim 6, further comprising a force acquisition section configured to acquire data of a force applied from the workpiece to the robot when the robot executes the work, wherein the robot control section is configured to: execute copying control for controlling a pressing force, by which the robot presses the tool against the workpiece, to a predetermined target value during execution of the work, based on the force acquired by the force acquisition section; and end the copying control when the movement determination section determines that the movement amount exceeds the threshold value.
9. A robot system comprising: a robot; and the control device of claim 1.
10. A device configured to generate an operation program of a robot, the device comprising: a shape data acquisition section configured to acquire shape data representing a shape of a workpiece; a target position setting section configured to set a plurality of target positions at which the robot is to be sequentially positioned for a work onto a work target portion on the workpiece, based on the shape data; an additional target position setting section configured to automatically set an additional target position at a position separated from a final target position of the plurality of target positions by a predetermined distance in a predetermined extension direction, the final target position being set by the target position setting section so as to correspond to an end of the work target portion in the shape data; and a program generation section configured to generate the operation program in which the plurality of target positions and the additional target position are defined.
11. The device of claim 10, wherein the shape data acquisition section acquires, as the shape data, image data of the workpiece imaged by a vision sensor arranged in a known positional relationship with a control coordinate system for controlling the robot, wherein the device further comprises a position acquisition section configured to acquire a position of the work target portion in the control coordinate system based on the image data, wherein the target position setting section sets the plurality of target positions based on the position of the work target portion acquired by the position acquisition section.
12. The device of claim 10, further comprising a direction setting section configured to set the extension direction based on the shape data or the target position.
13. A control method of a robot, comprising controlling the robot so as to: sequentially position the robot at a plurality of target positions, which are set based on shape data representing a shape of a workpiece, to cause the robot to execute a work along a work target portion on the workpiece; and cause the robot to continue the work beyond a final target position of the plurality of target positions after the robot reaches the final target position, the final target position being set to correspond to an end of the work target portion in the shape data.
14. A method of generating an operation program of a robot, the method comprising: acquiring shape data representing a shape of a workpiece; setting a plurality of target positions at which the robot is to be sequentially positioned for a work onto a work target portion on the workpiece, based on the shape data; automatically setting an additional target position at a position separated from a final target position of the plurality of target positions by a predetermined distance in a predetermined extension direction, the final target position being set to correspond to an end of the work target portion in the shape data; and generating the operation program in which the plurality of target positions and the additional target position are defined.
Description
DESCRIPTION OF EMBODIMENTS
[0022] Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. Note that, in the embodiments described below, the same elements are denoted by the same reference signs, and redundant description is omitted. First, a robot system 10 according to an embodiment will be described.
[0023] In the present embodiment, the robot 12 is a vertical articulated robot and includes a robot base 20, a swivel body 22, a robot arm 24, a wrist 26, and an end effector 28. The robot base 20 is fixed on the floor of a work cell. The swivel body 22 is provided on the robot base 20 so as to be able to swivel around a vertical axis. The robot arm 24 includes a lower arm 30 provided on the swivel body 22 so as to be rotatable around a horizontal axis, and an upper arm 32 rotatably provided on a tip of the lower arm 30. The wrist 26 is rotatably provided on a tip of the upper arm 32, and rotatably supports the end effector 28.
[0024] Each of the robot base 20, the swivel body 22, the robot arm 24, and the wrist 26 includes a servomotor 35 inside.
[0025] In the present embodiment, the end effector 28 performs a work of scraping and removing burrs formed on a workpiece W.sub.R (so-called deburring). Specifically, the end effector 28 includes a tool 34 and a tool driver 36 that rotationally drives the tool 34 about an axis line A. The tool 34 is a deburring tool and scrapes the workpiece W.sub.R at the conical tip thereof. The tool driver 36 includes, for example, a servomotor and rotationally drives the tool 34 according to a command from the control device 18.
[0026] A robot coordinate system C1 is set for the robot 12 as a control coordinate system for controlling the operation of the robot 12.
[0027] A tool coordinate system C2 is set for the tool 34. The tool coordinate system C2 is a control coordinate system for controlling the position and orientation of the tool 34 (i.e., the end effector 28) in the robot coordinate system C1. In the present embodiment, the tool coordinate system C2 is set with respect to the tool 34 such that its origin is arranged at a tip point of the tool 34 and its z-axis matches the axis line A.
[0028] When moving the tool 34, the control device 18 sets the tool coordinate system C2 in the robot coordinate system C1, and controls each servomotor 35 of the robot 12 such that the tool 34 is arranged at the position and the orientation represented by the set tool coordinate system C2. By so doing, the control device 18 can position the tool 34 at any position and orientation in the robot coordinate system C1.
[0029] The force sensor 14 detects a force F applied from the workpiece W.sub.R to the tool 34 during the work (i.e., deburring) by the robot 12. For example, the force sensor 14 is a 6-axis force sensor including a body having a cylindrical shape and a plurality of strain gauges provided on the body. In the present embodiment, the force sensor 14 is interposed between the wrist 26 and the end effector 28 (specifically, the tool driver 36).
[0030] The vision sensor 16 is, for example, a two-dimensional camera or a three-dimensional vision sensor, and includes an optical system (a collimator lens, a focus lens, etc.), an imaging sensor (CCD, CMOS, etc.), or the like. The vision sensor 16 images an object and transmits the imaged image data to the control device 18. The vision sensor 16 is fixed at a predetermined position with respect to the end effector 28.
[0031] A sensor coordinate system C3 is set in the vision sensor 16. The sensor coordinate system C3 defines coordinates of respective pixels of the image data imaged by the vision sensor 16. In the present embodiment, the sensor coordinate system C3 is set with respect to the vision sensor 16 such that its origin is arranged at the center of a light receiving surface of the imaging sensor of the vision sensor 16, its x-axis and y-axis are arranged in parallel to the lateral direction and the longitudinal direction of the imaging sensor, and its z-axis matches an optical axis O of the vision sensor 16.
[0032] The positional relationship between the sensor coordinate system C3 and the tool coordinate system C2 is known by calibration. Thus, the coordinates of the sensor coordinate system C3 of the image data imaged by the vision sensor 16 can be transformed into coordinates of the tool coordinate system C2 via a transformation matrix M1 obtained by calibration.
[0033] Furthermore, the coordinates of the tool coordinate system C2 can be transformed into coordinates of the robot coordinate system C1 via a known transformation matrix M2 defined according to the position and orientation of the tool coordinate system C2 in the robot coordinate system C1 (i.e., the coordinates of the origin and the direction of each axis). Thus, the coordinates of the sensor coordinate system C3 can be transformed into the coordinates of the robot coordinate system C1 via the transformation matrices M1 and M2. By so doing, the vision sensor 16 is arranged in a known positional relationship with the control coordinate system (the robot coordinate system C1, the tool coordinate system C2).
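The chain of coordinate transformations described above can be sketched with 4×4 homogeneous transformation matrices. This is a minimal illustration only: the matrix values below are hypothetical placeholders rather than actual calibration results, and the function name is assumed.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# M1: sensor coordinate system C3 -> tool coordinate system C2 (obtained by calibration).
# M2: tool coordinate system C2 -> robot coordinate system C1 (defined by the current
# position and orientation of the tool coordinate system C2 in C1).
# The numeric values here are illustrative placeholders.
M1 = make_transform(np.eye(3), np.array([0.0, 0.05, 0.10]))
M2 = make_transform(np.eye(3), np.array([0.50, 0.20, 0.30]))

# A point expressed in sensor coordinates (homogeneous form: x, y, z, 1).
p_sensor = np.array([0.01, 0.02, 0.30, 1.0])

# Chain the transforms: C3 -> C2 -> C1.
p_robot = M2 @ M1 @ p_sensor
```

Because both transforms are known, a pixel-derived point in the sensor coordinate system C3 maps to the robot coordinate system C1 by a single matrix chain, which is what makes the vision sensor 16 usable for acquiring positions in the control coordinate system.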
[0034] The control device 18 controls the operations of the robot 12, the force sensor 14, and the vision sensor 16. Specifically, the control device 18 is a computer including a processor 40, a storage 42, an I/O interface 44, an input device 46, and a display device 48. The processor 40 includes a CPU, a GPU, or the like, and is communicably connected to the storage 42, the I/O interface 44, the input device 46, and the display device 48 via a bus 50. The processor 40 performs arithmetic processing for implementing various functions to be described later while communicating with the storage 42, the I/O interface 44, the input device 46, and the display device 48.
[0035] The storage 42 includes a RAM, a ROM, or the like, and stores various types of data temporarily or permanently. The I/O interface 44 includes, for example, an Ethernet (trade name) port, a USB port, an optical fiber connector, an HDMI (trade name) terminal, or the like, and communicates data to or from an external device through wireless or wired communication under a command from the processor 40. The servomotor 35, the tool driver 36, the force sensor 14, and the vision sensor 16 described above are communicably connected to the I/O interface 44 in a wireless or wired manner.
[0036] The input device 46 includes a keyboard, a mouse, a touch panel, or the like, receives an input operation of an operator, and transmits input information to the processor 40. The display device 48 includes an LCD, an organic EL display, or the like, and displays various types of information under a command from the processor 40.
[0038] Hereinafter, a method of generating the operation program OP in the robot system 10 will be described. First, the processor 40 acquires drawing data (e.g., 3D CAD data) DD of a workpiece model W.sub.M obtained by modeling the workpiece W.sub.R. For example, the operator operates a drawing device (CAD device etc.), generates the drawing data DD of the workpiece model W.sub.M, and supplies the drawing data DD from the drawing device to the control device 18.
[0039] The processor 40 acquires the drawing data DD via the I/O interface 44 and stores the drawing data DD in the storage 42. The drawing data DD corresponds to shape data representing the shape of the workpiece W.sub.R. Consequently, the processor 40 serves as a shape data acquisition section 102 configured to acquire the shape data.
[0040] Next, the processor 40 receives input information for designating a work target portion in the workpiece model W.sub.M. Specifically, the processor 40 displays the workpiece model W.sub.M on the display device 48. The operator operates the input device 46 while visually recognizing the workpiece model W.sub.M displayed on the display device 48, and inputs input information for designating, as a work target portion, an edge model F.sub.M obtained by modeling an edge F.sub.R of the workpiece W.sub.R. The processor 40 sets the edge model F.sub.M as a work target portion F.sub.M in the drawing data DD according to the input information from the operator.
[0041] Next, based on the work target portion F.sub.M set in the workpiece model W.sub.M, the processor 40 acquires the position of the work target portion F.sub.M in the robot coordinate system C1. In the present embodiment, in the real space, the workpiece W.sub.R is positioned at a known installation position in the robot coordinate system C1 such that a work target portion (edge) F.sub.R on the workpiece W.sub.R extends parallel to the x-axis of the robot coordinate system C1. The processor 40 can acquire data (specifically, coordinates) of a position P.sub.FM of the work target portion F.sub.M in the robot coordinate system C1 from the known installation position data and the workpiece model W.sub.M.
[0042] Next, the processor 40 images the work target portion (i.e., edge) F.sub.R on the actual workpiece W.sub.R by operating the vision sensor 16. Specifically, the processor 40 operates the robot 12 based on the acquired position P.sub.FM of the robot coordinate system C1, and positions the vision sensor 16 at an imaging position where the work target portion F.sub.R can be included in the field of view.
[0043] Next, the processor 40 images the workpiece W.sub.R by operating the vision sensor 16. At this time, the work target portion F.sub.R is reflected as a work target portion image ID.sub.F in image data ID imaged by the vision sensor 16. The image data ID imaged by the vision sensor 16 corresponds to the shape data representing the shape of the workpiece W.sub.R.
[0044] The processor 40 serves as the shape data acquisition section 102 configured to acquire the image data ID from the vision sensor 16. Next, the processor 40 extracts feature points reflected in the image data ID by analyzing the image data ID, and specifies the work target portion image ID.sub.F reflected in the image data ID.
[0045] Next, the processor 40 transforms the coordinates of the sensor coordinate system C3 of the work target portion image ID.sub.F into the robot coordinate system C1 by using the above-described transformation matrices M1 and M2, and acquires data (specifically, coordinates) of a position P.sub.FI of the work target portion F.sub.R in the robot coordinate system C1. In this way, in the present embodiment, the processor 40 serves as a position acquisition section 104 configured to acquire the position of the work target portion F.sub.R in the control coordinate system based on the image data ID.
[0046] An error may occur between the position P.sub.FI of the work target portion F.sub.R acquired by the processor 40 based on the image data ID and the position of the work target portion F.sub.R on the workpiece W.sub.R installed in the real space.
[0048] Next, based on the acquired position P.sub.FI, the processor 40 sets a plurality of target positions TP.sub.n at which the robot 12 (specifically, the tip point of the tool 34) is to be sequentially positioned for the work on the workpiece W.sub.R. Specifically, in the robot coordinate system C1, the processor 40 automatically sets the plurality of target positions TP.sub.n (n=1, 2, 3, . . . , m−2, m−1, m) along the work target portion F.sub.I so as to correspond to the position P.sub.FI of the work target portion F.sub.I (e.g., so as to match the acquired position P.sub.FI or so as to be separated from the position P.sub.FI by a predetermined distance).
[0049] The target positions TP.sub.n set in this way are schematically illustrated in the drawings.
[0050] In this way, in the present embodiment, the processor 40 serves as a target position setting section 106 configured to set the plurality of target positions TP.sub.n based on the shape data.
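The target-position setting described above can be sketched as sampling positions at a fixed spacing along the acquired position P.sub.FI of the work target portion, treated here as a polyline of points in the robot coordinate system C1. The function name and the fixed-spacing scheme are assumptions for illustration; the disclosure does not specify how the target positions are spaced.

```python
import numpy as np

def set_target_positions(edge_points: np.ndarray, spacing: float) -> np.ndarray:
    """Sample target positions TP_n at a fixed spacing along an edge polyline.

    edge_points: (N, 3) array of points on the work target portion in the
    robot coordinate system C1 (e.g., as acquired from the image data ID).
    The last sample becomes the final target position TP_m at the end of
    the work target portion.
    """
    # Cumulative arc length along the polyline.
    seg = np.diff(edge_points, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])
    # Sample distances at the given spacing, always including the end point.
    samples = np.arange(0.0, arc[-1], spacing)
    samples = np.append(samples, arc[-1])
    # Interpolate each coordinate over arc length.
    return np.stack(
        [np.interp(samples, arc, edge_points[:, k]) for k in range(3)], axis=1
    )

# A straight 100 mm edge along the x-axis, sampled every 20 mm (units: meters).
edge = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
tps = set_target_positions(edge, spacing=0.02)
```

The last row of `tps` plays the role of the final target position TP.sub.m, set to correspond to the end of the work target portion in the shape data.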
[0051] When the processor 40 generates an operation program in which only the target positions TP.sub.n set as described above are defined and causes the robot 12 to execute a work in accordance with that operation program, the processor 40 causes the robot 12 to end the work when the robot 12 reaches the final target position TP.sub.m. In this case, since the final target position TP.sub.m deviates from the end E.sub.R of the actual workpiece W.sub.R by an error δ, there is a possibility that a portion of the actual work target portion F.sub.R near the end E.sub.R is left unworked.
[0052] In order to avoid such a situation, in the present embodiment, the processor 40 automatically sets an additional target position TP.sub.A at a position separated from the set final target position TP.sub.m by a predetermined distance Δ in a predetermined extension direction ED. Specifically, the processor 40 first sets the extension direction ED. As an example, the processor 40 specifies the extension direction of the work target portion image ID.sub.F in the vicinity of the end E.sub.R by analyzing the image data ID. The processor 40 sets the specified extension direction of the work target portion image ID.sub.F as the extension direction ED.
[0053] As another example, the processor 40 refers to the drawing data DD of the workpiece model W.sub.M, and sets, as the extension direction ED, the extension direction of the edge model F.sub.M when the workpiece model W.sub.M is arranged at a known installation position in the robot coordinate system C1. In this way, in the case of these examples, the processor 40 sets the extension direction ED based on the shape data (image data ID and drawing data DD).
[0054] As yet another example, the processor 40 may set the extension direction ED based on the set final target position TP.sub.m and the target position TP.sub.n before the final target position TP.sub.m (n<m). For example, the processor 40 obtains a vector VD1 from a target position TP.sub.m−1 immediately before the final target position TP.sub.m to the final target position TP.sub.m.
[0055] The processor 40 sets the direction of the vector VD1 as the extension direction ED. In the case of this example, the processor 40 sets the extension direction ED based on the target position TP.sub.n. In this way, in the present embodiment, the processor 40 serves as a direction setting section 108 configured to set the extension direction ED based on the shape data or the target position TP.sub.n.
[0056] Next, in the robot coordinate system C1, the processor 40 automatically sets the additional target position TP.sub.A at a position separated from the final target position TP.sub.m by the distance Δ in the set extension direction ED. The distance Δ is predetermined by the operator and stored in the storage 42. The processor 40 stores, in the storage 42, position data (coordinates) of the set additional target position TP.sub.A in the robot coordinate system C1. In this way, the processor 40 serves as an additional target position setting section 110 configured to automatically set the additional target position TP.sub.A.
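The setting of the additional target position TP.sub.A can be sketched as follows, using the direction of the vector VD1 from the target position TP.sub.(m−1) to TP.sub.m as the extension direction ED (one of the examples described above). The function name and the numeric values are hypothetical.

```python
import numpy as np

def set_additional_target(tp_prev: np.ndarray, tp_final: np.ndarray, delta: float) -> np.ndarray:
    """Set TP_A at a distance delta beyond TP_m along the extension direction ED.

    ED is taken here as the direction of the vector VD1 from the target position
    TP_(m-1) immediately before the final target position to TP_m.
    """
    vd1 = tp_final - tp_prev
    ed = vd1 / np.linalg.norm(vd1)  # unit extension direction ED
    return tp_final + delta * ed    # TP_A = TP_m + delta * ED

# Illustrative values (meters): last two target positions along the x-axis,
# with the additional target set 5 mm beyond the final target position.
tp_m_minus_1 = np.array([0.08, 0.0, 0.0])
tp_m = np.array([0.10, 0.0, 0.0])
tp_a = set_additional_target(tp_m_minus_1, tp_m, delta=0.005)
```

Because TP.sub.A lies beyond TP.sub.m in the direction in which the work target portion extends, the robot keeps working past the nominal end even when the error δ places the real end E.sub.R beyond TP.sub.m.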
[0057] Next, the processor 40 uses the position data of the plurality of target positions TP.sub.n (n=1 to m) and the position data of the additional target position TP.sub.A, and generates the operation program OP in which the target positions TP.sub.n and the additional target position TP.sub.A are defined. The operation program OP is a computer program that causes the robot 12 to execute a series of operations for work.
[0058] In the operation program OP, the position data of the target positions TP.sub.n and the additional target position TP.sub.A, instruction statements for positioning at each target position TP.sub.n, and data on the movement speed and movement path of the robot 12 (specifically, the tool 34) between two consecutive target positions TP.sub.n and TP.sub.n+1 are defined. In this way, in the present embodiment, the processor 40 serves as a program generation section 112 configured to generate the operation program OP.
[0059] As described above, the processor 40 serves as the shape data acquisition section 102, the position acquisition section 104, the target position setting section 106, the direction setting section 108, the additional target position setting section 110, and the program generation section 112, and generates the operation program OP. Thus, these sections constitute a device 100 configured to generate the operation program OP of the robot 12.
[0060] Next, an example of an operation flow executed by the control device 18 will be described.
[0061] In step S1, the processor 40 starts a work (deburring) on the workpiece W.sub.R. Specifically, the processor 40 controls the robot 12 in accordance with the operation program OP and rotationally drives the tool 34 by the tool driver 36, thereby starting an operation of positioning the tool 34 (or the origin of the tool coordinate system C2) in the order of the target positions TP.sub.1, TP.sub.2, TP.sub.3, . . . , TP.sub.m while pressing the tool 34 against the workpiece W.sub.R. By so doing, the work of scraping the workpiece W.sub.R with the tool 34 along the work target portion F.sub.R and removing burrs is started. In this way, in the present embodiment, the processor 40 serves as a robot control section 52 configured to control the robot 12.
[0062] In step S2, the processor 40 starts copying control. Specifically, the processor 40 continuously acquires, from the force sensor 14, data of the force F continuously (for example, periodically) detected by the force sensor 14. In this way, in the present embodiment, the processor 40 serves as a force acquisition section 54 configured to acquire the data of the force F.
[0063] Then, based on the acquired force F, the processor 40 controls a pressing force F.sub.P, by which the robot 12 presses the tool 34 against the workpiece W.sub.R, to a predetermined target value F.sub.T. For example, the processor 40 determines whether or not the magnitude of the acquired force F is within a range [F.sub.th1, F.sub.th2] predetermined with the target value F.sub.T as a criterion (F.sub.th1≤F≤F.sub.th2). The threshold values F.sub.th1 and F.sub.th2 that define the range [F.sub.th1, F.sub.th2] are predetermined by the operator as values satisfying, for example, F.sub.th1≤F.sub.T≤F.sub.th2, and are stored in the storage 42.
[0064] When F.sub.th1>F, the processor 40 determines that the pressing force F.sub.P is excessively small relative to the target value F.sub.T, operates the robot 12 to displace the tool 34 in the direction toward the workpiece W.sub.R (e.g., the z-axis minus direction of the robot coordinate system C1), and thereby increases the pressing force F.sub.P acting on the workpiece W.sub.R.
[0065] Conversely, when F>F.sub.th2, the processor 40 determines that the pressing force F.sub.P is excessively large relative to the target value F.sub.T, operates the robot 12 to displace the tool 34 in the direction away from the workpiece W.sub.R (e.g., the z-axis plus direction of the robot coordinate system C1), and thereby decreases the pressing force F.sub.P acting on the workpiece W.sub.R. By so doing, the processor 40 executes copying control for controlling the pressing force F.sub.P to the target value F.sub.T.
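The band logic of the copying control in paragraphs [0063] to [0065] can be sketched as a function that maps the detected force F to a displacement command. The fixed step size and the function name are illustrative assumptions, not part of the disclosure.

```python
def pressing_correction(force: float, f_th1: float, f_th2: float, step: float) -> float:
    """Return a z-axis displacement command for copying control.

    Negative: move the tool toward the workpiece (increase pressing force F_P).
    Positive: move the tool away from the workpiece (decrease F_P).
    Zero: the detected force F is inside the band [F_th1, F_th2] around F_T.
    """
    if force < f_th1:   # pressing force too small relative to the target F_T
        return -step    # displace toward the workpiece (z-axis minus direction)
    if force > f_th2:   # pressing force too large relative to the target F_T
        return +step    # displace away from the workpiece (z-axis plus direction)
    return 0.0          # within the band: hold the current pressing depth

# Illustrative band: F_th1 = 5 N, F_th2 = 15 N, 1 mm correction step.
cmd = pressing_correction(force=2.0, f_th1=5.0, f_th2=15.0, step=0.001)
```

Each control cycle, the correction is applied and the force is re-read, so the pressing force F.sub.P is driven into the band around the target value F.sub.T rather than to an exact value.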
[0066] In step S3, the processor 40 determines whether or not the robot 12 has reached the final target position TP.sub.m. Specifically, the processor 40 receives position feedback FB.sub.T indicating the rotation position (or rotation angle) of the servomotor 35 from a rotation detector (encoder, Hall element, etc.) provided in each servomotor 35 of the robot 12.
[0067] Based on the position feedback FB.sub.T from each servomotor 35, the processor 40 can obtain a position P.sub.T of the tool 34 (or the origin of the tool coordinate system C2) in the robot coordinate system C1. In step S3, the processor 40 determines whether or not the obtained position P.sub.T of the tool 34 matches the final target position TP.sub.m (or is within the range determined with the final target position TP.sub.m as a criterion).
[0068] When it is determined that the position P.sub.T matches the final target position TP.sub.m (i.e., YES), the processor 40 proceeds to step S4, and when it is determined that the position P.sub.T has not reached the final target position TP.sub.m (i.e., NO), the processor 40 loops step S3. After determining YES in step S3, the processor 40 causes the robot 12 to continue the work by moving the tool 34, which is pressed against the workpiece W.sub.R by the robot 12, toward the additional target position TP.sub.A beyond the final target position TP.sub.m.
[0069] In step S4, the processor 40 determines whether or not the magnitude of the force F most recently acquired from the force sensor 14 is equal to or less than a threshold value F.sub.th0. The threshold value F.sub.th0 is predetermined by the operator as a value close to zero, for example (F.sub.th1>F.sub.th0≈0), and stored in the storage 42.
[0070] When it is determined that the force F is equal to or less than the threshold value F.sub.th0 (i.e., YES), the processor 40 proceeds to step S6, and when it is determined that the force F is greater than the threshold value F.sub.th0 (i.e., NO), the processor 40 proceeds to step S5. In this way, in the present embodiment, the processor 40 serves as a force determination section 56 configured to determine whether or not the force F is equal to or less than the threshold value F.sub.th0 while the robot 12 continues the work beyond the final target position TP.sub.m.
[0071] In step S5, the processor 40 determines whether or not the robot 12 has reached the additional target position TP.sub.A. Specifically, the processor 40 determines whether or not the above-described position P.sub.T obtained from the position feedback FB.sub.T matches the additional target position TP.sub.A (or is within the range determined with the additional target position TP.sub.A as a criterion).
[0072] When it is determined that the position P.sub.T matches the additional target position TP.sub.A (i.e., YES), the processor 40 proceeds to step S6, and when it is determined that the position P.sub.T has not reached the additional target position TP.sub.A (i.e., NO), the processor 40 returns to step S4. By so doing, the processor 40 loops steps S4 and S5, continuing the work, until YES is determined in step S4 or S5.
[0073] In step S6, the processor 40 ends the work. Specifically, the processor 40 moves the tool 34 by the robot 12 in a predetermined escape direction (e.g., the z-axis plus direction of the robot coordinate system C1) so as to separate the tool 34 from the workpiece W.sub.R, and stops the rotation of the tool 34 by stopping the operation of the tool driver 36. By so doing, the processor 40 ends the work on the workpiece W.sub.R.
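The flow of steps S4 to S6 — continuing the work beyond TP.sub.m until either the force drops to F.sub.th0 or below or the tool reaches TP.sub.A — can be sketched as follows. The iteration over per-cycle feedback samples and all names here are assumptions for illustration, not the actual control implementation.

```python
import numpy as np

def continue_beyond_final(positions, forces, tp_a, f_th0, tol=1e-4):
    """Sketch of steps S4-S6 after the robot reaches the final target position TP_m.

    positions: per-cycle tool positions P_T (stand-ins for position feedback FB_T).
    forces:    per-cycle sensed forces F (stand-ins for force sensor readings).
    Returns the reason the work ended.
    """
    for p_t, f in zip(positions, forces):
        if f <= f_th0:                          # step S4: YES -> tool passed the end E_R
            return "force_dropped"
        if np.linalg.norm(p_t - tp_a) <= tol:   # step S5: YES -> reached TP_A
            return "reached_additional_target"
    return "still_working"                      # keep looping steps S4 and S5

# Illustrative run: the force collapses on the second cycle, before TP_A is reached.
positions = [np.array([0.101, 0.0, 0.0]), np.array([0.103, 0.0, 0.0])]
forces = [8.0, 0.2]
result = continue_beyond_final(positions, forces, np.array([0.105, 0.0, 0.0]), f_th0=0.5)
```

Whichever condition fires first, the work then ends (step S6) by retracting the tool in the escape direction and stopping the tool driver.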
[0074] As described above, in the present embodiment, the processor 40 causes the robot 12 to continue the work beyond the final target position TP.sub.m after the robot 12 reaches the final target position TP.sub.m. According to this configuration, even when the error δ described above occurs, it is possible to complete the work up to the end E.sub.R of the work target portion F.sub.R and prevent a portion near the end E.sub.R from being left unworked. This makes it possible to improve the finish quality of the workpiece W.sub.R.
[0075] Furthermore, in the present embodiment, the processor 40 operates the robot 12 in accordance with the operation program OP in which the additional target position TP.sub.A is defined in addition to the target positions TP.sub.n, and causes the robot 12 to continue the work up to the additional target position TP.sub.A. According to this configuration, the processor 40 can reliably and quickly complete the work up to the end E.sub.R of the work target portion F.sub.R by the robot 12.
[0076] Furthermore, in the present embodiment, the processor 40 determines in step S4 described above whether or not the force F is equal to or less than the threshold value F.sub.th0, and ends the work when YES is determined (step S6). Regarding this function, while the work is being executed from the final target position TP.sub.m toward the additional target position TP.sub.A, the tool 34 may pass beyond the end E.sub.R of the work target portion F.sub.R on the workpiece W.sub.R in the x-axis plus direction of the robot coordinate system C1. In this case, since the tool 34 is separated from the workpiece W.sub.R, the force F applied from the workpiece W.sub.R to the tool 34 sharply decreases.
[0077] According to the present embodiment, the fact that the tool 34 has passed beyond the end E.sub.R is detected by monitoring the force F; when it is highly probable that the tool 34 has passed beyond the end E.sub.R (i.e., when YES is determined in step S4), the work is ended and the robot 12 is retracted. In this way, it is possible to reduce the possibility that the robot 12 moves beyond the end E.sub.R and interferes with environmental objects around the robot 12, and to end the work quickly, so that the work cycle time can be reduced.
[0078] Furthermore, in the present embodiment, the processor 40 takes on the functions of the device 100, including the shape data acquisition section 102, the target position setting section 106, the additional target position setting section 110, and the program generation section 112, and generates the operation program OP. According to the device 100, the operation program OP in which the target positions TP.sub.n and the additional target position TP.sub.A are defined can be automatically generated from the shape data (drawing data DD and image data ID).
[0079] Furthermore, in the present embodiment, the device 100 further includes the position acquisition section 104 configured to acquire the position P.sub.FI of the work target portion F.sub.R in the control coordinate system (robot coordinate system C1) based on the image data ID, and the target position setting section 106 sets the plurality of target positions TP.sub.n based on the acquired position P.sub.FI. According to this configuration, the plurality of target positions TP.sub.n can be set to correspond, with some accuracy, to the work target portion F.sub.R on the workpiece W.sub.R actually installed in the control coordinate system (robot coordinate system C1), which makes it possible to prevent the above-described error δ from becoming excessive.
[0080] Furthermore, in the present embodiment, the device 100 further includes the direction setting section 108 configured to set the extension direction ED based on the shape data (drawing data DD and image data ID) or the target position TP.sub.n. According to the configuration, the extension direction ED of a work can be set to be substantially along the extension direction of the work target portion F.sub.R, so that the work can be accurately continued up to the end E.sub.R along the work target portion F.sub.R.
[0081] Next, another example of the operation flow executed by the control device 18 will be described with reference to
[0082] In step S8, in the same manner as in step S5 described above, the processor 40 determines whether or not the robot 12 has reached the additional target position TP.sub.A, proceeds to step S6 when YES is determined, and loops step S8 when NO is determined. In this way, according to the present embodiment, when the force F becomes equal to or less than the threshold value F.sub.th0 while the robot 12 continues the work beyond the final target position TP.sub.m (i.e., when the tool 34 exceeds the end E.sub.R), the copying control is ended. With this, it is possible to prevent the processor 40 from executing unnecessary copying control, thereby reducing the amount of calculation of the processor 40 and reducing the possibility that the robot 12 interferes with surrounding environmental objects.
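The combined termination condition of this flow (end the copying control when the force drops to the threshold in step S4, or when the additional target position is reached in step S8) might be summarized as the following sketch (hypothetical names; not part of the embodiment):

```python
def copying_control_step(force, f_th0, reached_tp_a):
    """One decision of the copying-control loop: 'end' when the force F
    is at or below the threshold F_th0 (step S4 YES) or when the robot
    has reached the additional target position TP_A (step S8 YES);
    otherwise 'continue' the work."""
    if force <= f_th0 or reached_tp_a:
        return "end"       # proceed to step S6
    return "continue"      # keep executing the work

print(copying_control_step(4.0, 1.0, False))  # continue
print(copying_control_step(0.2, 1.0, False))  # end
```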
[0083] Next, other functions of the control device 18 will be described with reference to
[0084] After the flow illustrated in
[0085] Specifically, the processor 40 continuously acquires the above-described position P.sub.T based on the position feedback FB.sub.T from each servomotor 35, and obtains the movement amount ξ of the robot 12 (end effector 28) in the direction P.sub.D from the position P.sub.T. By so doing, the processor 40 acquires the movement amount ξ while the robot 12 executes the work, and stores the movement amount ξ in the storage 42. Consequently, the processor 40 serves as the movement amount acquisition section 58 (
[0086] In step S9, the processor 40 determines whether or not the most recently acquired movement amount ξ exceeds a predetermined threshold value ξ.sub.th. The threshold value ξ.sub.th is predetermined by the operator and stored in the storage 42. The processor 40 determines YES when the movement amount ξ exceeds the threshold value ξ.sub.th (ξ≥ξ.sub.th) and proceeds to step S6, and determines NO when the movement amount ξ does not exceed the threshold value ξ.sub.th (ξ<ξ.sub.th) and proceeds to step S5. In this way, the processor 40 serves as the movement determination section 60 (
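A sketch of the determination in step S9, with the movement amount ξ taken as the displacement of the tool projected onto the monitored direction P.sub.D (assumed here to be a unit vector; all names and values are hypothetical):

```python
def movement_amount(p_start, p_now, direction):
    """Displacement from p_start to p_now projected onto the (unit)
    monitoring direction P_D."""
    return sum((n - s) * d for s, n, d in zip(p_start, p_now, direction))

def movement_exceeded(p_start, p_now, direction, xi_th):
    """Step S9: YES when the movement amount xi in the direction P_D
    reaches the threshold xi_th."""
    return movement_amount(p_start, p_now, direction) >= xi_th

# The tool dropped 5 mm in the z-axis minus direction, exceeding a
# 3 mm threshold, so the work is ended (step S6).
print(movement_exceeded((0.0, 0.0, 0.100), (0.0, 0.0, 0.095),
                        (0.0, 0.0, -1.0), 0.003))  # True
```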
[0087] As described above, in the present embodiment, the processor 40 causes the robot 12 to end the work when it is determined in step S9 described above that the movement amount ξ exceeds the threshold value ξ.sub.th (step S6). Regarding this function, while the work is being executed from the final target position TP.sub.m to the additional target position TP.sub.A, when the tool 34 exceeds the end E.sub.R, the tool 34 is sharply displaced in the direction P.sub.D (for example, the z-axis minus direction of the robot coordinate system C1), whereby the movement amount ξ in the direction P.sub.D increases.
[0088] According to the present embodiment, the fact that the tool 34 has exceeded the end E.sub.R is detected by monitoring the movement amount ξ; the work is ended when it is highly probable that the tool 34 has exceeded the end E.sub.R (i.e., when YES is determined in step S9), and the robot 12 is retracted. In this way, it is possible to reduce the possibility that the robot 12 moves beyond the end E.sub.R and interferes with environmental objects around the robot 12, and to end the work quickly, so that the work cycle time can be reduced.
[0089] Next, another example of the operation flow executed by the control device 18 illustrated in
[0090] In the embodiment illustrated in
[0091] In the above-described embodiments, the case where, in step S1, the processor 40 executes the work in accordance with the operation program OP in which the target position TP.sub.n and the additional target position TP.sub.A are defined has been described. However, the present disclosure is not limited thereto, and the processor 40 may execute the work in accordance with an operation program OP′ in which the additional target position TP.sub.A is not defined.
[0092] An example of such an operation flow is illustrated in
[0093] In step S1, the processor 40 starts a work by controlling the robot 12 in accordance with the operation program OP′. In the operation program OP′, the plurality of target positions TP.sub.n (n=1, 2, 3, . . . , m−2, m−1, m) are defined but the above-described additional target position TP.sub.A is not defined. Thereafter, the processor 40 executes steps S2 to S4.
[0094] When NO is determined in step S4, the processor 40 causes the robot 12 to continue the work in step S10. Specifically, the processor 40 first determines the extension direction ED. As an example, the processor 40 obtains the movement direction of the robot 12 (tool 34) at the time when YES is determined in step S3 described above (or when NO is determined in step S4).
[0095] For example, the processor 40 can acquire the movement direction based on the above-described position P.sub.T obtained from the position feedback FB.sub.T. The processor 40 then determines the acquired movement direction as the extension direction ED. Alternatively, the processor 40 may determine the extension direction ED based on the shape data (image data ID and drawing data DD) or the target positions TP.sub.n, similarly to the above-described direction setting section 108.
[0096] Then, the processor 40 moves the robot 12 in the determined extension direction ED and continues the rotation of the tool 34 by the tool driver 36, thereby causing the robot 12 to continue the work on the workpiece W.sub.R. Then, the processor 40 returns to step S4. Even after the robot 12 has reached the final target position TP.sub.m, when the force F applied from the workpiece W.sub.R to the tool 34 at this time is greater than the threshold value F.sub.th0 (F>F.sub.th0), there is a possibility that the tool 34 has not exceeded the end E.sub.R and a portion of the work target portion F.sub.R near the end E.sub.R remains unworked.
[0097] In the present embodiment, it is determined whether to continue the work based on the magnitude of the force F at the time when the robot 12 has reached the final target position TP.sub.m (i.e., when YES is determined in step S3). According to the configuration, even though the work is executed in accordance with the operation program OP′ in which the additional target position TP.sub.A is not defined, the work can be reliably completed up to the end E.sub.R.
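The variant without an additional target position can be sketched as follows: the extension direction ED is estimated from the most recent tool motion, and the work is continued only while the reaction force indicates contact (all names and values are hypothetical; the embodiment does not prescribe an implementation):

```python
def extension_direction(p_prev, p_curr):
    """Estimate the extension direction ED as the unit vector of the
    most recent tool motion (used when the operation program defines
    no additional target position TP_A)."""
    d = [c - p for p, c in zip(p_prev, p_curr)]
    norm = sum(x * x for x in d) ** 0.5
    if norm == 0.0:
        raise ValueError("no motion between the two position samples")
    return tuple(x / norm for x in d)

def should_continue(force, f_th0):
    """Step S4 NO (F > F_th0): the tool is still in contact with the
    workpiece, so continue the work in the extension direction ED."""
    return force > f_th0

# Two recent tool positions along the work target portion.
ed = extension_direction((0.48, 0.20, 0.10), (0.50, 0.20, 0.10))
print(ed, should_continue(3.5, 1.0))
```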
[0098] In the above-described embodiments, the case where the device 100 is implemented in the control device 18 as a function executed by the processor 40 has been described. However, the present disclosure is not limited thereto, and the device 100 may be provided outside the control device 18. Such an aspect is illustrated in
[0099] The design support device 72 includes a CAD device 74 and a CAM device 76. The CAD device 74 is a drawing device that receives an operation by the operator and generates the drawing data DD of the workpiece model W.sub.M. The CAM device 76 is a device that generates a computer program based on the drawing data (3D CAD data).
[0100] In the present embodiment, the device 100 is implemented in the CAM device 76. The CAM device 76 receives the drawing data DD from the CAD device 74, and generates the operation program OP by serving as the shape data acquisition section 102, the position acquisition section 104, the target position setting section 106, the direction setting section 108, the additional target position setting section 110, and the program generation section 112.
[0101] The design support device 72 is communicably connected to the I/O interface 44 of the control device 18, acquires the position data of the robot 12, the image data ID imaged by the vision sensor 16, or the like from the control device 18, and transmits the generated operation program OP to the control device 18. The CAD device 74 and the CAM device 76 may be separate computers each including a processor (CPU, GPU, etc.) and a storage (ROM, RAM), or may be configured as one computer including a common processor and storage.
[0102] In the above-described embodiments, the case where the processor 40 sets the target position TP.sub.n based on the image data ID as the shape data has been described. However, the present disclosure is not limited thereto, and the processor 40 can also set the target position TP.sub.n based on the drawing data DD as the shape data.
[0103] For example, based on the above-described position P.sub.FM acquired from the workpiece model W.sub.M, the processor 40 may set the target position TP.sub.n in the robot coordinate system C1 so as to correspond to the position P.sub.FM (e.g., so as to match it, or so as to be separated from it by a predetermined distance). In this case, the vision sensor 16 can be omitted from the robot systems 10 and 70.
[0104] Furthermore, in the above-described embodiments, the case where the force sensor 14 is interposed between the wrist 26 and the end effector 28 has been described, but the force sensor 14 may also be installed at any part of the robot 12 (for example, the robot base 20, the lower arm 30, or the upper arm 32). Furthermore, the force sensor 14 is not limited to a 6-axis force sensor, and may include, for example, a plurality of torque sensors each provided on a corresponding servomotor 35 and detecting a torque applied to that servomotor 35. The processor 40 can acquire torque (force) data from the plurality of torque sensors and obtain the force F applied to the tool 34 from the workpiece W.sub.R based on the torques.
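Recovering the external force from joint torques rests on the static force relation τ = JᵀF between joint torques and the force at the tool. As a simplified planar (2-joint) illustration with hypothetical numbers (a real arm would use the full Jacobian and a least-squares solve; nothing here is prescribed by the embodiment):

```python
def force_from_torques(jacobian_t, torques):
    """Solve tau = J^T F for the external force F in a planar 2-joint
    case; J^T is a 2x2 matrix here for brevity, solved by Cramer's
    rule."""
    (a, b), (c, d) = jacobian_t
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("singular configuration")
    t1, t2 = torques
    return ((d * t1 - b * t2) / det, (a * t2 - c * t1) / det)

# Hypothetical measured joint torques and J^T at the current pose.
print(force_from_torques(((2.0, 0.0), (1.0, 4.0)), (6.0, 11.0)))  # (3.0, 2.0)
```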
[0105] Furthermore, the force sensor 14 can also be omitted from the robot systems 10 and 70. For example, the processor 40 can also serve as the force acquisition section 54 and acquire a current feedback (disturbance torque) FB.sub.C from each servomotor 35, and thus acquire the force F based on the current feedback FB.sub.C.
[0106] Furthermore, the force acquisition section 54 and the force determination section 56 can also be omitted from the control device 18 illustrated in
[0107] Furthermore, the position acquisition section 104 can also be omitted from the above-described device 100. For example, as described above, the processor 40 can also set the target position TP.sub.n based on the drawing data DD. In this case, the process of acquiring the position P.sub.FI of the work target portion F.sub.R in the control coordinate system from the image data ID can be omitted.
[0108] Furthermore, the direction setting section 108 can also be omitted from the device 100. For example, when the work target portion F.sub.R is arranged to extend in parallel to the x-axis of the robot coordinate system C1 as in the above-described embodiment (in other words, the extension direction of the work target portion F.sub.R is known), the extension direction ED may be predetermined by the operator as the x-axis plus direction (known extension direction) of the robot coordinate system C1. In this case, the process of setting the extension direction ED based on the shape data or the target position TP.sub.n can be omitted.
[0109] Furthermore, in the device 100, the additional target position setting section 110 may also change the above-mentioned distance Δ according to predetermined work conditions. These work conditions include, for example, the movement speed and movement path of the robot 12 (specifically, the tool 34), the specifications of the workpiece W.sub.R (dimensions, shape, material, etc.), and the specifications of the vision sensor 16 (the resolution of the image data ID, the size of the imaging sensor, etc.).
[0110] As an example, the processor 40 may also change the distance Δ according to the resolution such that the lower the resolution of the image data ID, the larger the distance Δ. Since the above-mentioned error δ may become larger as the resolution of the image data ID becomes lower, increasing the distance Δ as the resolution decreases makes it possible to more reliably avoid leaving an unworked portion at the work target portion F.sub.R.
[0111] As another example, the processor 40 may also change the distance Δ according to the movement speed such that the distance Δ is reduced as the movement speed of the robot 12 increases. When the movement speed is high, the movement distance of the robot 12 tends to increase; therefore, reducing the distance Δ according to the magnitude of the movement speed makes it possible to reduce the possibility that the tool 34 continues the work beyond the end E.sub.R.
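The two tendencies in paragraphs [0110] and [0111] (a larger Δ for lower resolution, a smaller Δ for higher movement speed) could be combined, for example, as a simple scaling rule. The scaling law below is an assumption for illustration; the embodiments state only the monotonic tendencies, and all names and reference values are hypothetical:

```python
def adjusted_distance(base_delta, resolution, ref_resolution, speed, ref_speed):
    """Scale the distance delta: increase it when the image resolution
    is below the reference (the error delta may be larger), and
    decrease it when the movement speed is above the reference
    (risk of continuing the work beyond the end)."""
    return base_delta * (ref_resolution / resolution) * (ref_speed / speed)

d_lowres = adjusted_distance(0.01, resolution=320, ref_resolution=640,
                             speed=100.0, ref_speed=100.0)   # doubled
d_fast = adjusted_distance(0.01, resolution=640, ref_resolution=640,
                           speed=200.0, ref_speed=100.0)     # halved
print(d_lowres, d_fast)
```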
[0112] Furthermore, the above-described control coordinate system is not limited to the robot coordinate system C1 and the tool coordinate system C2; for example, a world coordinate system or a workpiece coordinate system may be set. The world coordinate system is a coordinate system that defines the three-dimensional space of the work cell of the robot 12, and the robot coordinate system C1 is fixed in the world coordinate system. Furthermore, the workpiece coordinate system is a coordinate system that defines the position and orientation of the workpiece W.sub.R in the robot coordinate system C1 (or the world coordinate system).
[0113] Furthermore, in the above-described embodiments, the case where the robot 12 executes deburring on the workpiece W.sub.R has been described, but the concept of the present disclosure can also be applied to any application in which the robot 12 is positioned at the plurality of target positions TP.sub.n and caused to execute a work. For example, the robot 12 may be one that polishes a surface of the workpiece W.sub.R, may be one that performs laser processing (laser cutting, laser welding) on the workpiece W.sub.R, or may be one that performs coating on a surface of the workpiece W.sub.R. Although the present disclosure is described above through the embodiments, the above-described embodiments do not limit the invention according to the claims.
REFERENCE SIGNS LIST
[0114] 10, 70 Robot system
12 Robot
[0115] 14 Force sensor
16 Vision sensor
18 Control device
40 Processor
42 Storage
[0116] 52 Robot control section
54 Force acquisition section
56 Force determination section
58 Movement amount acquisition section
60 Movement determination section
100 Device
[0117] 102 Shape data acquisition section
104 Position acquisition section
106 Target position setting section
108 Direction setting section
110 Additional target position setting section
112 Program generation section