Robot system
10813709 · 2020-10-27
CPC classification
B (Performing Operations; Transporting): B25J9/1682; B25J9/1612; B25J13/088; B25J9/0084; B25J3/00; B25J9/1646; B25J13/06; B23P19/04; B25J9/163; B25J9/1633; B23P21/00; B25J13/006; B25J9/1653; B23Q15/12; B25J9/1664; B25J13/003; B25J9/1674; B25J9/1641; B25J9/1638; B25J13/087; B25J11/008; B25J13/08; B25J9/161; B25J9/1628; B23P21/002; B25J19/028; B25J9/1669; B25J9/1602
G (Physics): G05B2219/40142; G05B2219/40139; G05B2219/40182; G05B2219/40022; G06F3/017; G05B2219/40146; G05B2219/37297; G05B2219/40143; G05B2219/40163; G05B2219/39439; G05B2219/40134; G05B2219/40161; G05B2219/39004; G05B2219/40627; G05B2219/33007; G05B2219/40145
Y (General tagging of new technological developments and cross-sectional technologies): Y10S901/27; Y10S901/41; Y10S901/08; Y10S901/02; Y10S901/09; Y10S901/46; Y10S901/03; Y10S901/10; Y10S901/47
H (Electricity): H04N7/181; H04N23/611
International classification
B (Performing Operations; Transporting): B25J13/06; B25J9/00; B23P19/04; B25J13/08; B25J3/00; B25J11/00; B23Q15/12
G (Physics): G05B19/418
H (Electricity): H04N7/18
Abstract
A robot system includes a robotic arm having an end effector configured to perform a work to a work object, a memory part storing information that causes the end effector to move as scheduled route information, a motion controller configured to operate the robotic arm by using the scheduled route information to move the end effector, a route correcting device configured to generate, by being manipulated, manipulating information to correct a route of the end effector during movement, a camera configured to image the work object, an image generator configured to generate a synthesized image by synthesizing a scheduled route of the end effector obtained from the scheduled route information with a captured image sent from the camera, and a monitor configured to display the synthesized image.
Claims
1. A robot system, comprising: a robotic arm having an end effector configured to perform a work to a work object; a memory part storing scheduled route information; a motion controller configured to operate the robotic arm by using the scheduled route information to move the end effector; a route correcting device configured to generate, by being manipulated, manipulating information to correct a route of the end effector during movement; a camera configured to image the work object; an image generator configured to generate a synthesized image by synthesizing a scheduled route of the end effector obtained from the scheduled route information with a captured image sent from the camera; and a monitor configured to display the synthesized image, wherein the image generator acquires present positional information on the end effector, and updates, based on the present positional information, one updated scheduled route of the end effector, the one updated scheduled route being synthesized and displayed on the monitor with the captured image, wherein the one updated scheduled route of the end effector is a route that the end effector is scheduled to trace, starting from a present position indicated by the present positional information, when a state where the route correcting device is manipulated is maintained from a present time point, the present time point corresponding to the present position, and wherein the one updated scheduled route is generated based on the scheduled route information, the present positional information, and the manipulating information.
2. The robot system of claim 1, wherein the image generator further synthesizes, with the captured image, another one updated scheduled route that the end effector is scheduled to trace, starting from the present position, when the route correcting device is not manipulated from the present time point, the another one updated scheduled route being synthesized and displayed on the monitor with the captured image together with the one updated scheduled route.
3. The robot system of claim 1, wherein the image generator indicates a posture of the end effector when the end effector reaches a given position on the scheduled route.
4. The robot system of claim 2, wherein the image generator indicates a posture of the end effector when the end effector reaches a given position on the scheduled route.
Description
BRIEF DESCRIPTION OF DRAWINGS
MODE FOR CARRYING OUT THE DISCLOSURE
(6) Hereinafter, a robot system according to one embodiment of the present disclosure is described with reference to the accompanying drawings. The robot system 100 according to this embodiment includes a robotic arm 1 which repeatedly performs a given work, and a manipulating device 2 which allows the operation of the robotic arm 1 to be manipulated manually. In the robot system 100, an operator located at a position distant from a workspace of the robotic arm 1 (outside of the workspace) manipulates the manipulating device 2 to input instructions, and the robotic arm 1 performs an operation corresponding to the instructions to carry out a specific work. Moreover, in the robot system 100, the robotic arm 1 is also capable of automatically performing a given work, without the manipulation of the manipulating device 2 by the operator.
(7) An operating mode in which the robotic arm 1 is operated according to the instructions inputted through the manipulating device 2 is herein referred to as a manual mode. Note that the manual mode also includes a case where part of the operation of the robotic arm 1 under operation is automatically corrected based on the instructions inputted by the operator manipulating the manipulating device 2. Moreover, an operating mode in which the robotic arm 1 is operated according to a preset task program is referred to as an automatic mode.
(8) Further, the robot system 100 of this embodiment is configured so that the manipulation of the manipulating device 2 is reflectable in the automatic operation of the robotic arm 1 while the robotic arm 1 is operating automatically, to correct the operation carried out automatically. An operating mode in which the robotic arm 1 is operated according to the preset task program while the instructions inputted through the manipulating device 2 are reflectable is herein referred to as a correctable automatic mode. Note that the automatic mode described above is distinguished from the correctable automatic mode in that, in the automatic mode, the manipulation of the manipulating device 2 is not reflected in the operation of the robotic arm 1.
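The relationship between the three operating modes can be illustrated with a minimal Python sketch. The mode names, the scalar command values, and the additive combination in the correctable automatic mode are assumptions made for illustration, not part of the disclosed system.

```python
from enum import Enum, auto

class OperatingMode(Enum):
    """Hypothetical labels for the three modes described above."""
    MANUAL = auto()                  # arm follows operator input only
    AUTOMATIC = auto()               # arm follows the preset task program only
    CORRECTABLE_AUTOMATIC = auto()   # task program, with operator corrections mixed in

def command_source(mode, scheduled_value, manipulated_value):
    """Return the positional command a motion controller would act on."""
    if mode is OperatingMode.MANUAL:
        return manipulated_value
    if mode is OperatingMode.AUTOMATIC:
        return scheduled_value
    # Correctable automatic: scheduled motion plus the operator's correction.
    return scheduled_value + manipulated_value

print(command_source(OperatingMode.CORRECTABLE_AUTOMATIC, 10.0, 0.5))  # 10.5
```

The key distinction of paragraph (8) is visible in the last branch: the scheduled command is still used, but the manipulation is added on top of it rather than ignored.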
(9) First, with reference to
(10) (Structure of Industrial Robot 10)
(11) The industrial robot 10 includes the robotic arm 1, an end effector 16 attached to a tip end of the robotic arm 1, and a control device 3 which governs the operations of the robotic arm 1 and the end effector 16. In this embodiment, the industrial robot 10 is a welding robot which welds a work object (hereinafter, referred to as the workpiece) W, and the end effector 16 is a welding torch. The workpiece W is comprised of, for example, two members to be mutually joined by welding, which are, for example, sheet metals. The industrial robot 10 includes a welder which supplies electricity and shielding gas to the welding torch, and a filler-material feeding device which feeds a wire-like filler material to a tip end of the welding torch (none of them is illustrated). The industrial robot 10 performs the welding work to the workpiece W set onto a table 17. Note that the industrial robot 10 is not limited in particular, and may be, for example, an assembly robot, a paint robot, an application robot, an inspection robot, etc.
(12) The robotic arm 1 includes a pedestal 15, an arm part 13 supported by the pedestal 15, and a wrist part 14 which is supported by a tip end of the arm part 13 and to which the end effector 16 is attached. As illustrated in
(13) The arm part 13 of the robotic arm 1 is formed by a coupled body of the links and the joints, which is comprised of the first joint JT1, the first link 11a, the second joint JT2, the second link 11b, the third joint JT3, and the third link 11c, described above. Moreover, the wrist part 14 of the robotic arm 1 is formed by a coupled body of the links and the joints, which is comprised of the fourth joint JT4, the fourth link 11d, the fifth joint JT5, the fifth link 11e, the sixth joint JT6, and the sixth link 11f, described above.
(14) The joints JT1-JT6 are provided with drive motors M1-M6 (see
(15) The control device 3 is comprised of, for example, an arithmetic part (not illustrated), such as a microcontroller, an MPU, a PLC (Programmable Logic Controller), or a logic circuit, and a memory part (not illustrated), such as a ROM or a RAM.
(16)
(17) The memory part 32 stores information for causing the end effector 16 to move automatically, as scheduled route information 34. The scheduled route information 34 is, for example, teaching information which is stored by operating the robotic arm 1 to perform a given work according to teaching. The scheduled route information 34 may be route information containing time-series data, or may be path information indicative of poses at discrete points. Note that, in the robot system 100 according to this embodiment, the memory part 32 is provided to the control device 3, but it may be provided separately from the control device 3. The motion controller 31 controls the operation of the robotic arm 1. Details of the motion controller 31 and the image generator 33 will be described later.
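As a rough illustration of the time-series form the scheduled route information 34 may take, the following Python sketch stores a route as timestamped position samples and interpolates the scheduled position at an arbitrary time. The tuple layout, units, and two-dimensional simplification are assumptions for illustration only.

```python
# Scheduled route as time-series samples: (time_s, x_mm, y_mm).
# The values are made up for illustration.
time_series_route = [
    (0.0, 0.0, 0.0),
    (1.0, 10.0, 0.0),
    (2.0, 20.0, 5.0),
]

def position_at(route, t):
    """Linearly interpolate the scheduled position at time t."""
    for (t0, x0, y0), (t1, x1, y1) in zip(route, route[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
    raise ValueError("t is outside the scheduled route")

print(position_at(time_series_route, 0.5))  # (5.0, 0.0)
```

A path-information variant would simply drop the time column and keep a sparse list of poses at discrete points.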
(18) (Manipulating Device 2)
(19) Returning to
(20) In the robot system 100 of this embodiment, when the manipulating information is sent to the control device 3 while the operating mode in which the robotic arm 1 is operated is the manual mode, the robotic arm 1 performs operation according to the manipulating information, i.e., according to the amount and direction of manipulation of the manipulating part 21. Moreover, when the manipulating information is sent to the control device 3 while the operating mode in which the robotic arm 1 is operated is the correctable automatic mode, the route of the robotic arm 1 which is operating automatically is corrected according to the manipulating information, i.e., according to the amount and direction of manipulation of the manipulating part 21. In this embodiment, the manipulating device 2 functions as a route correcting device which corrects the route of the end effector 16 during movement, as will be described later.
(21) The manipulating part 21 is configured to return, when the operator releases it, to a neutral state where it is not operated in any direction, by a biasing member, such as a spring. Below, a state where the manipulating part 21 is operated by the operator (i.e., a state where the manipulating part 21 is not at the neutral position) is referred to as the manipulating state of the manipulating device 2, and a state where the manipulating part 21 is not operated by the operator (i.e., a state where the manipulating part 21 is at the neutral position) is referred to as the non-manipulated state of the manipulating device 2. Note that the manipulating part 21 may instead be configured not to return to the neutral state when the operator releases it, and may, for example, maintain the state it was in before the operator released it. In this case, even when the operator has released the manipulating part 21, the manipulating part 21 is considered to be operated unless it is in the neutral state.
(22) (Camera 4 and Monitor 5)
(23) The camera 4 images the workpiece W and the end effector 16 which performs the given work to the workpiece W. The camera 4 is installed in the space where the robotic arm 1 is provided, and is set so that an image captured by the camera contains the workpiece W and a tip-end part of the end effector 16 (the tip-end part of the welding torch) which directly acts on the workpiece W. In more detail, the camera 4 is set at a position from which a working part of the workpiece W, and a route of the end effector 16 which performs the work to the workpiece W, are recognizable. In this embodiment, the camera 4 is provided to image the workpiece W from above, but it is not limited in particular, and it may be provided to image the workpiece W from obliquely above. For example, when the working part of the workpiece W set on the table 17 extends vertically, and the end effector 16 performs the work to the workpiece W while moving vertically, the camera 4 may be set so as to image the workpiece W from the side. Moreover, in this embodiment, the camera 4 is provided so that its relative position with respect to the workpiece W set on the table 17 is fixed.
(24) The camera 4 is connected with the control device 3. The camera 4 and the control device 3 may be connected with each other wiredly or wirelessly.
(25) The captured image which is captured by the camera 4 is sent to the image generator 33 of the control device 3. Moreover, the scheduled route information 34 is also sent to the image generator 33 from the memory part 32. The image generator 33 generates a synthesized image which is obtained by synthesizing a scheduled route of the end effector 16 acquired from the scheduled route information 34 with the captured image sent from the camera 4. In more detail, the image generator 33 uses the captured image sent from the camera 4 as the background, and superimposes the scheduled route of the tip-end part of the end effector 16 on the captured image.
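The synthesis step of paragraph (25) — the captured image as background, with the scheduled route of the tip-end part superimposed — can be sketched as below. The image is modeled as a plain grid of grayscale pixels, and the route points are assumed to be already projected into pixel coordinates; both are simplifying assumptions, not the disclosed implementation.

```python
def synthesize(image, route_px, value=255):
    """Return a copy of `image` with the route pixels marked on it."""
    out = [row[:] for row in image]          # keep the camera frame intact
    for x, y in route_px:
        if 0 <= y < len(out) and 0 <= x < len(out[0]):
            out[y][x] = value                # draw the route over the background
    return out

captured = [[0] * 5 for _ in range(5)]       # stand-in for a camera frame
scheduled = [(0, 0), (1, 1), (2, 2), (3, 3)] # projected scheduled route
overlay = synthesize(captured, scheduled)
print(overlay[2][2])  # 255: the route is superimposed on the background
```

A real implementation would draw anti-aliased polylines over a color frame, but the structure is the same: the frame is the background and the route is rendered on top of it.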
(26) Moreover, the image generator 33 acquires the present positional information of the end effector 16 in the captured image, and updates the scheduled route of the end effector 16 which is synthesized with the captured image, based on the present positional information. In this embodiment, the image generator 33 calculates the present position in the captured image based on the rotational position information of each drive motor M sent from the rotation sensor E. Note that the method of acquiring the present positional information may be any kind of method, and, for example, the image generator 33 may acquire the present positional information from the motion controller 31.
(27) In this embodiment, the image generator 33 synthesizes two scheduled routes, a first scheduled route L.sub.1 and a second scheduled route L.sub.2, with the captured image.
(28) The first scheduled route L.sub.1 is a route where the end effector 16 is scheduled to trace when the state of the manipulating device 2 at the present time point is in the non-manipulated state. In other words, the first scheduled route L.sub.1 is the route of the end effector 16 which is planned when the manipulating device 2 is not operated from the present time point. The first scheduled route L.sub.1 is generated by the image generator 33 based on the scheduled route information 34 sent from the memory part 32 and the present positional information of the end effector 16.
(29) Moreover, the second scheduled route L.sub.2 is a route where the end effector 16 is scheduled to trace when the manipulating state of the manipulating device 2 (including the amount and direction of manipulation of the manipulating part 21) at the present time point is maintained. In other words, the second scheduled route L.sub.2 is the route of the end effector 16 which is planned when the state where the manipulating device 2 is being manipulated is maintained from the present time point. The second scheduled route L.sub.2 is generated by the image generator 33 based on the scheduled route information 34 sent from the memory part 32, the present positional information of the end effector 16, and the manipulating information sent at the present time point.
(30) When the manipulating device 2 is currently in the non-manipulated state, the second scheduled route L.sub.2 is the same as the first scheduled route L.sub.1. Thus, in this case, the image generator 33 synthesizes only the first scheduled route L.sub.1 with the captured image in order to avoid duplication.
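Paragraphs (28)-(30) can be condensed into a small sketch: the first route is the remainder of the scheduled route from the present position, and the second route assumes the current manipulation (modeled here as a constant per-step offset) is held from now on. The two-dimensional model, the index-based present position, and the accumulating offset are assumptions for illustration.

```python
def first_scheduled_route(scheduled, present_index):
    """Route L1: the remaining scheduled route if the operator does not intervene."""
    return scheduled[present_index:]

def second_scheduled_route(scheduled, present_index, offset_per_step):
    """Route L2: the route if the current manipulation is maintained from now on."""
    dx, dy = offset_per_step
    route = []
    for i, (x, y) in enumerate(scheduled[present_index:]):
        # The correction accumulates as long as the manipulation is held.
        route.append((x + dx * i, y + dy * i))
    return route

plan = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
print(first_scheduled_route(plan, 1))               # remaining scheduled route
print(second_scheduled_route(plan, 1, (0.0, 0.5)))  # drifts away as held
```

With a zero offset (the non-manipulated state), the second route reduces to the first, which is exactly why only one route needs to be drawn in that case.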
(31) The first and second scheduled routes L.sub.1 and L.sub.2 which are synthesized with the captured image may be any kind of forms, as long as the operator is visually recognizable of them. For example, as for the first and second scheduled routes L.sub.1 and L.sub.2 in the synthesized image, lines along which a representative point is expected to trace in the captured image based on the scheduled route information 34 may be expressed by solid lines, dashed lines, etc., where the representative point is the tip-end part of the end effector 16. In this embodiment, the image generator 33 displays, in the captured image, the first scheduled route L.sub.1 by a thin dashed line, and the second scheduled route L.sub.2 by a thick dashed line.
(32) Moreover, the image generator 33 synthesizes a posture image, which is an image indicative of the posture of the end effector 16, with the captured image. Specifically, the image generator 33 indicates, at a given position on the scheduled route, the posture of the end effector 16 when the end effector 16 reaches that position. In this embodiment, a posture image P.sub.1 of the end effector 16 when the end effector 16 reaches a given position on the first scheduled route L.sub.1 is synthesized at the given position of the captured image. Moreover, a posture image P.sub.2 of the end effector 16 when the end effector 16 reaches a given position on the second scheduled route L.sub.2 is synthesized at the given position of the captured image.
(33) Here, the given position on the scheduled route where the posture image of the end effector 16 is displayed is not limited in particular; for example, it may be a scheduled position of the end effector 16 when the end effector 16 moves along the scheduled route from the present position by a given distance, or a scheduled position of the end effector 16 when a given period of time has passed from the present time point. Moreover, the posture images P.sub.1 and P.sub.2 may each include a plurality of posture images. For example, a plurality of posture images P.sub.1 may be synthesized with the captured image so as to be spaced from each other along the first scheduled route L.sub.1.
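One of the options in paragraph (33), placing the posture image at the point reached after moving a given distance along the scheduled route, amounts to an arc-length walk along a polyline. The following Python sketch shows that computation; the polyline representation and clamping at the final point are assumptions for illustration.

```python
import math

def point_at_distance(route, dist):
    """Walk `dist` along the polyline `route` and return the point reached."""
    remaining = dist
    for (x0, y0), (x1, y1) in zip(route, route[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if remaining <= seg:
            a = remaining / seg
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
        remaining -= seg
    return route[-1]  # past the end of the route: clamp to the final point

route = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
print(point_at_distance(route, 5.0))  # (3.0, 2.0): 3 along x, then 2 up
```

Evaluating this at several evenly spaced distances would give the anchor points for a series of posture images spaced along the route, as the paragraph describes.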
(34) The images P.sub.1 and P.sub.2 indicative of the postures of the end effector 16 may be any kind of forms, as long as they indicate the orientation of the end effector 16 with respect to the workpiece W, and for example, they may be arrow-shaped indications with respect to the workpiece W, or may be symbols or graphic figures which imitate the end effector 16. Alternatively, the images P.sub.1 and P.sub.2 indicative of the postures of the end effector 16 may be actually-captured images of the end effector 16 imaged by a camera other than the camera 4. In this embodiment, the images P.sub.1 and P.sub.2 indicative of the postures of the end effector 16 schematically indicate the tip-end part of the welding torch which is the end effector 16, and they indicate the orientations of the welding torch with respect to the workpiece W when the welding torch is seen from the moving direction of the welding torch.
(35) Note that, in the robot system 100 according to this embodiment, although the image generator 33 is provided to the control device 3, it may be provided separately from the control device 3.
(36) The control device 3 is connected with the monitor 5, and the synthesized image generated by the image generator 33 is sent and outputted to the monitor 5. The monitor 5 and the control device 3 may be connected with each other wiredly or wirelessly. The monitor 5 is installed in the space where the manipulating device 2 is provided. The operator compares the part of the workpiece W to be worked with the scheduled route of the tip-end part of the end effector 16, while looking at the synthesized image outputted to the monitor 5, and manipulates the manipulating device 2 so that the appropriate work is performed to the workpiece W.
(37) (Input Device 7)
(38) The input device 7 is an input device which is installed outside the workspace together with the manipulating device 2, receives a manipulating instruction from the operator, and inputs the received manipulating instruction into the control device 3. In this embodiment, the input device 7 is a computer, but it is not limited in particular, and may be, for example, a switch, an adjustment knob, a control lever, or a mobile terminal, such as a tablet computer.
(39) As illustrated in
(40) Specifically, the scheduled route information 34 stored in the memory part 32 is sent to the motion controller 31 as information for operating the robotic arm 1 automatically. Moreover, the manipulating information generated by manipulating the manipulating device 2 is sent to the motion controller 31. The motion controller 31 uses one or both of the scheduled route information 34 and the manipulating information according to the operating mode selected in the mode selecting part 71.
(41) When the operating mode selected in the mode selecting part 71 is the manual mode, the motion controller 31 uses the manipulating information. In more detail, when the operating mode in which the robotic arm 1 is operated is the manual mode, the motion controller 31 controls the operation of the robotic arm 1 using the manipulating information (inputted instructions) sent by manipulating the manipulating device 2, without using the scheduled route information 34 sent from the memory part 32.
(42) Moreover, when the operating mode selected in the mode selecting part 71 is the automatic mode, the motion controller 31 uses the scheduled route information 34. In more detail, when the operating mode in which the robotic arm 1 is operated is the automatic mode, the motion controller 31 controls the operation of the robotic arm 1 using the scheduled route information 34 sent from the memory part 32 according to the preset task program, without using the manipulating information sent from the manipulating device 2.
(43) Moreover, when the operating mode selected in the mode selecting part 71 is the correctable automatic mode, the motion controller 31 uses both the scheduled route information 34 and the manipulating information. Note that, when the operating mode is the correctable automatic mode and the manipulating information has not been sent to the motion controller 31, the motion controller 31 uses only the scheduled route information 34. In more detail, when the operating mode in which the robotic arm 1 is operated is the correctable automatic mode, the motion controller 31 controls the operation of the robotic arm 1 using both the scheduled route information 34 and the manipulating information in response to the reception of the manipulating information while the robotic arm 1 is operating automatically using the scheduled route information 34. Thus, the route scheduled to be traced automatically by the robotic arm 1 based on the scheduled route information 34 is corrected.
(44) The display operating part 72 is used in order to operate the image displayed on the monitor 5. For example, the operator is able to operate the display operating part 72 to enlarge or shrink the image displayed on the monitor 5, change the way to display the scheduled route, etc.
(45) Below, the route correction of the robotic arm 1 when the operating mode in which the robotic arm 1 is operated is the correctable automatic mode is described with reference to
(46) The motion controller 31 includes an adder 31a, subtractors 31b, 31e and 31g, a position controller 31c, a differentiator 31d, and a speed controller 31f, and it controls the rotational position of the drive motor M of the robotic arm 1 according to the instruction value based on the scheduled route information 34 and the instruction value based on the manipulating information.
(47) The adder 31a generates a corrected positional instruction value by adding a correction instruction value based on the manipulating information to the positional instruction value based on the scheduled route information 34. The adder 31a sends the corrected positional instruction value to the subtractor 31b.
(48) The subtractor 31b subtracts a present position value detected by the rotation sensor E from the corrected positional instruction value to generate an angle deviation. The subtractor 31b sends the generated angle deviation to the position controller 31c.
(49) The position controller 31c generates a speed instruction value based on the angle deviation sent from the subtractor 31b by arithmetic processing based on a predefined transfer function and a predefined proportionality coefficient. The position controller 31c sends the generated speed instruction value to the subtractor 31e.
(50) The differentiator 31d differentiates the present position value information detected by the rotation sensor E to generate an amount of change in the rotational angle of the drive motor M per unit time, i.e., a present speed value. The differentiator 31d sends the generated present speed value to the subtractor 31e.
(51) The subtractor 31e subtracts the present speed value sent from the differentiator 31d, from the speed instruction value sent from the position controller 31c to generate a speed deviation. The subtractor 31e sends the generated speed deviation to the speed controller 31f.
(52) The speed controller 31f generates a torque instruction value (electric current instruction value) based on the speed deviation sent from the subtractor 31e by arithmetic processing based on a predefined transfer function and a predefined proportionality coefficient. The speed controller 31f sends the generated torque instruction value to the subtractor 31g.
(53) The subtractor 31g subtracts the present current value detected by the current sensor C, from the torque instruction value sent from the speed controller 31f to generate a current deviation. The subtractor 31g sends the generated current deviation to the drive motor M to drive the drive motor M.
(54) Thus, the motion controller 31 controls the drive motor M to control the robotic arm 1 so that the robotic arm 1 operates along a route corrected from the route related to the scheduled route information 34. Note that, when the operating mode of the robotic arm 1 is the automatic mode, the positional instruction value based on the scheduled route information 34 is sent to the subtractor 31b, and when the operating mode of the robotic arm 1 is the manual mode, the positional instruction value based on the manipulating information is sent to the subtractor 31b.
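The cascaded loop of paragraphs (46)-(53) can be sketched for a single axis as below. Plain proportional gains stand in for the "predefined transfer functions and proportionality coefficients"; the gain values and the single-step form are assumptions for illustration, and the present speed value is taken as an input rather than differentiated from the sensor signal as the differentiator 31d would do.

```python
KP_POS, KP_SPD = 2.0, 0.5   # assumed proportionality coefficients

def control_step(pos_cmd, correction, present_pos, present_spd, present_cur):
    """One pass through the cascade; returns the current deviation sent to the motor."""
    corrected_cmd = pos_cmd + correction      # adder 31a: apply the manipulation
    angle_dev = corrected_cmd - present_pos   # subtractor 31b: angle deviation
    speed_cmd = KP_POS * angle_dev            # position controller 31c
    # present_spd corresponds to the output of the differentiator 31d.
    speed_dev = speed_cmd - present_spd       # subtractor 31e: speed deviation
    torque_cmd = KP_SPD * speed_dev           # speed controller 31f
    current_dev = torque_cmd - present_cur    # subtractor 31g: drives the motor
    return current_dev

# With the arm at rest at 0 and a corrected target of 1.2 rad:
print(round(control_step(1.0, 0.2, 0.0, 0.0, 0.0), 9))  # 1.2
```

Run repeatedly against a motor model, this structure drives the axis toward the corrected positional instruction value, which is how the correction from the manipulating device bends the automatically traced route.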
(55) Below, with reference to
(56)
(57) As illustrated in
(58) In order to perform the welding appropriately to the workpieces W.sub.A and W.sub.B, an actual route of the end effector 16 needs to be aligned with the joining line. The operator looks at the monitor 5 and checks if the route is to be corrected by the manipulating device 2. Since the first scheduled route L.sub.1 of the tip-end part of the end effector 16 is aligned with the joining line, up to the intermediate location of the route from the present position as illustrated in
(59) Moreover, in
(60) The operator looks at the monitor 5, and manipulates the manipulating device 2 from a time point at which the tip-end part of the end effector 16 reaches a location where the first scheduled route L.sub.1 illustrated in
(61) As described above, in the robot system 100 according to this embodiment, the synthesized image, in which the scheduled routes of the end effector 16 are synthesized by the image generator 33 with the captured image of the workpiece W, is displayed on the monitor 5, so the operator is able to grasp beforehand whether the end effector 16 will move appropriately so as to perform the given work to the workpiece W. In addition, since the route of the end effector 16 during movement is correctable in real time by the manipulating device 2 serving as the route correcting device, the operator can correct the route of the end effector 16 at a point where the route correction is necessary while looking at the synthesized image on the monitor 5.
(62) Thus, in the robot system 100 according to this embodiment, since the point where the route correction is necessary is recognizable from the synthesized image on the monitor 5, the system is flexibly adaptable to each work, and since the route correction of the end effector 16 can be made by manipulation only at the necessary part, the operator's labor can be reduced.
(63) Moreover, in this embodiment, the scheduled route of the end effector 16 after the route correction is made by the manipulating device 2 is displayed on the monitor 5. Thus, the operator is able to grasp beforehand whether the end effector 16 moves so as to perform the given work to the workpiece W even after the route correction is made by the manipulating device 2.
(64) Moreover, in this embodiment, since the monitor 5 indicates the two scheduled routes L.sub.1 and L.sub.2, i.e., the route when the manipulating device 2 is not operated from the present time point and the route when the state where the route correcting device has been operated is maintained from the present time point, the operator is able to judge more accurately how the manipulating device 2 is to be manipulated.
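The two displayed routes can be sketched, under simplified assumptions that are not taken from the patent, as two predictions from the present position: L.sub.1 assumes the correction input is released now, while L.sub.2 assumes the current correction input is held. A one-dimensional position with a fixed per-step advance is assumed purely for illustration.

```python
# Hedged sketch of predicting the two scheduled routes shown on the
# monitor: without further correction (L1) and with the current
# correction input maintained (L2). One-dimensional toy model.

def predict_routes(position, step, correction, horizon):
    """Return (L1, L2): future positions over `horizon` steps,
    without and with the per-step correction being maintained."""
    l1 = [position + step * k for k in range(1, horizon + 1)]
    l2 = [position + (step + correction) * k for k in range(1, horizon + 1)]
    return l1, l2

l1, l2 = predict_routes(position=0.0, step=1.0, correction=0.5, horizon=3)
print(l1)  # route if the manipulating device is released now
print(l2)  # route if the current correction is maintained
```

Showing both predictions side by side is what lets the operator judge whether to hold, release, or adjust the correction input at each moment.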
(65) Moreover, in this embodiment, since the postures of the end effector 16 when it reaches the given positions on the scheduled routes L.sub.1 and L.sub.2 are displayed on the monitor 5, it is possible to grasp beforehand whether the end effector 16, when tracing the scheduled routes L.sub.1 and L.sub.2, will take a posture by which it is capable of appropriately performing the work to the workpiece W.
(66) The present disclosure is not limited to the embodiment described above, but various modifications thereof may be possible without departing from the spirit of the present disclosure.
(67) For example, in the embodiment described above, although the image generator 33 updates the scheduled route of the end effector 16 which is synthesized with the captured image based on the present positional information, the image generator 33 may instead be configured to always display, in the synthesized image, the initial scheduled route based on the scheduled route information 34, without updating the scheduled route of the end effector 16. Moreover, in the embodiment described above, although the first scheduled route L.sub.1 and the second scheduled route L.sub.2 are indicated as the scheduled routes synthesized with the captured image, the image generator 33 may be configured to synthesize only one of the first scheduled route L.sub.1 and the second scheduled route L.sub.2 with the captured image. Moreover, the image generator 33 may omit synthesizing the posture images P.sub.1 and P.sub.2, which are the images indicative of the postures of the end effector 16, with the captured image.
(68) Moreover, in the embodiment described above, although the manipulating device 2 is a joystick, it may be a manipulating device 2 having another configuration, and for example, may be a manipulating device 2 provided with a direction key as the manipulating part 21. Moreover, the manipulating device 2 may be comprised of a plurality of devices, and for example, may be comprised of two joysticks.
(69) Moreover, the robot system 100 may be a system utilizing a master-slave type robot, and the manipulating device 2 may be a master arm having a structure similar to that of the robotic arm 1 as a slave arm. In this case, when the operating mode in which the robotic arm 1 is operated is the manual mode, the robotic arm 1 operates so as to follow the motion of the master arm 2 which is operated manually. Moreover, in this case, the image generator 33 may be configured to synthesize only the first scheduled route L.sub.1 with the captured image.
(70) Moreover, in the embodiment described above, although the robot system 100 is provided with a single camera 4, the robot system 100 may be provided with a plurality of cameras. For example, the robot system 100 may be provided with a camera which images the working part of the workpiece W from the side, in addition to the camera 4 which images the working part of the workpiece W from above. In this case, the image generator 33 may synthesize the scheduled route also with a captured image sent from the camera which images from the side, and may send the synthesized image to the monitor 5. Both the synthesized image related to the camera 4 which images from above and the synthesized image related to the camera which images from the side may be displayed on the single monitor 5, or they may be displayed on separate monitors, respectively. According to this configuration, since the operator is able to grasp the relation between the workpiece W and the scheduled route three-dimensionally from the plurality of synthesized images, a more appropriate route correction can be performed.
(71) In the embodiment described above, although the input parts, such as the mode selecting part 71 and the display operating part 72, are provided to the single input device 7, they may be provided to separate input devices, respectively. Moreover, the manipulating device 2 and the input device 7 may be configured integrally.
(72) Moreover, in the embodiment described above, although the robotic arm 1 is configured to be operable in the manual mode by manipulating the manipulating device 2, the operating modes in which the robotic arm 1 is operated may not need to include the manual mode. In this case, the manipulating device 2 is utilized as the route correcting device which is used only in the correctable automatic mode, and the mode selecting part 71 may be used in order for the operator to select the operating mode in which the robotic arm 1 is operated from between the automatic mode and the correctable automatic mode.
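The restricted mode set of this modification can be sketched, with hypothetical names not taken from the patent, as an enumeration without a manual mode, in which the route correcting device is effective only in the correctable automatic mode.

```python
# Sketch (hypothetical names) of a mode selecting part for the
# modification in which no manual mode exists: only the automatic
# and correctable automatic modes are selectable, and route
# correction input is honored only in the latter.

from enum import Enum

class OperatingMode(Enum):
    AUTOMATIC = "automatic"
    CORRECTABLE_AUTOMATIC = "correctable_automatic"

def correction_enabled(mode):
    """The route correcting device is effective only in the
    correctable automatic mode."""
    return mode is OperatingMode.CORRECTABLE_AUTOMATIC

print(correction_enabled(OperatingMode.AUTOMATIC))
print(correction_enabled(OperatingMode.CORRECTABLE_AUTOMATIC))
```

Encoding the selectable modes as an enumeration makes the absence of the manual mode explicit: the mode selecting part simply has no manual entry to offer.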
DESCRIPTION OF REFERENCE CHARACTERS
(73)
1 Robotic Arm
16 End Effector
2 Manipulating Device (Route Correcting Device)
3 Control Device
31 Motion Controller
32 Memory Part
33 Image Generator
34 Scheduled Route Information
4 Camera
5 Monitor
71 Mode Selecting Part