Laser processing apparatus, control apparatus, laser processing method, and method of producing image forming apparatus
11179803 · 2021-11-23
CPC classification
B25J9/1679
PERFORMING OPERATIONS; TRANSPORTING
B23K26/0884
PERFORMING OPERATIONS; TRANSPORTING
B23K26/364
PERFORMING OPERATIONS; TRANSPORTING
G05B19/402
PHYSICS
B25J9/1684
PERFORMING OPERATIONS; TRANSPORTING
B23K37/006
PERFORMING OPERATIONS; TRANSPORTING
B23K26/0892
PERFORMING OPERATIONS; TRANSPORTING
B23K26/0344
PERFORMING OPERATIONS; TRANSPORTING
International classification
B23K26/08
PERFORMING OPERATIONS; TRANSPORTING
G05B19/402
PHYSICS
B23K26/70
PERFORMING OPERATIONS; TRANSPORTING
B23K26/03
PERFORMING OPERATIONS; TRANSPORTING
B23K37/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A laser processing apparatus includes a light source, a laser head configured to emit a laser beam, a robot configured to move the laser head, and a control apparatus configured to control start and stop of generation of the laser beam and control operation of the robot. The control apparatus controls the light source to generate the laser beam when a first time has elapsed after causing the robot to start an operation of accelerating the laser head such that a movement speed of the laser head with respect to a processing target object reaches a constant target speed.
Claims
1. A laser processing apparatus comprising: a light source configured to generate a laser beam; a laser head configured to emit the laser beam generated in the light source; a robot configured to move the laser head; and a programmed control apparatus comprising a computer including a processor, wherein the control apparatus is programmed to control start and stop of generation of the laser beam in the light source and to control operation of the robot, wherein the control apparatus is programmed to perform timekeeping to count time elapsed from a first time point, wherein the control apparatus is programmed to control the light source such that the light source starts to generate the laser beam at a second time point, the second time point being a time point when a first time as a predetermined time counted by the timekeeping has elapsed from the first time point, and wherein at or after the first time point, the control apparatus is programmed to cause the robot to start an operation of accelerating the laser head such that a movement speed of the laser head with respect to a processing target object reaches a target speed at or before the second time point.
2. The laser processing apparatus according to claim 1, wherein the control apparatus is programmed to cause the robot to operate such that the movement speed of the laser head is kept at the target speed while the processing target object is irradiated with the laser beam from the laser head.
3. The laser processing apparatus according to claim 1, wherein the control apparatus is programmed to control the light source to stop generation of the laser beam at a third time point, the third time point being a time point when a second time as a predetermined time counted by the timekeeping has elapsed from the second time point.
4. The laser processing apparatus according to claim 3, wherein the second time is a value obtained by dividing, by the target speed, a distance between a taught position at which irradiation of the laser beam is started and a taught position at which the irradiation of the laser beam is stopped.
5. The laser processing apparatus according to claim 1, wherein the control apparatus comprises: a first controller configured to control start and stop of generation of the laser beam in the light source; and a second controller configured to control operation of the robot, wherein the second controller transmits a predetermined signal to the first controller when the robot starts the operation of accelerating the laser head, and wherein the first controller controls the light source to generate the laser beam when the first time has elapsed after receiving the predetermined signal.
6. The laser processing apparatus according to claim 5, wherein a control period of the first controller is shorter than a control period of the second controller.
7. The laser processing apparatus according to claim 5, wherein the second controller controls operation of the robot in accordance with trajectory data of a predetermined section comprising a taught position at which irradiation of the laser beam is started and a taught position at which the irradiation of the laser beam is stopped.
8. The laser processing apparatus according to claim 7, wherein the second controller transmits the predetermined signal to the first controller when supply of the trajectory data is started to control the robot.
9. The laser processing apparatus according to claim 7, wherein the trajectory data comprises a plurality of pieces of trajectory data different from one another, wherein the second controller sequentially controls operation of the robot in accordance with the plurality of pieces of trajectory data, and wherein, each time the first controller receives the predetermined signal, the first controller controls the light source to generate the laser beam when the first time has elapsed.
10. The laser processing apparatus according to claim 5, wherein the first controller transmits a command to the second controller, and wherein the second controller causes the robot to start the operation of accelerating the laser head in a case where the second controller has received the command.
11. The laser processing apparatus according to claim 1, wherein the control apparatus is programmed to obtain, by using a distance in which a commanded position of the robot moves in elapse of the first time and a distance derived from response delay of the robot at a time point at which the elapse of the first time is completed, an acceleration distance in which the robot moves in the elapse of the first time, and the control apparatus is programmed to cause the robot to start the operation of accelerating the laser head after moving the robot to a position deviated by the acceleration distance from a taught position at which irradiation of the laser beam is started.
12. The laser processing apparatus according to claim 11, wherein the control apparatus is programmed to calculate the distance derived from response delay of the robot at the time point at which the elapse of the first time is completed, by using the target speed and a predetermined constant.
13. The laser processing apparatus according to claim 11, wherein the control apparatus is programmed to calculate the distance derived from response delay of the robot at the time point at which the elapse of the first time is completed, for each processing target portion.
14. The laser processing apparatus according to claim 11, wherein, in a case where an error of the position of the robot at the time point at which the elapse of the first time is completed with respect to the taught position at which the irradiation of the laser beam is started is equal to or greater than a threshold value, the control apparatus is programmed to set the acceleration distance such that the error is smaller than the threshold value.
15. The laser processing apparatus according to claim 1, wherein the control apparatus is programmed to cause the robot to perform a test operation before an actual operation of performing laser processing on the processing target object, and is further programmed to adjust, for each processing target portion and on a basis of vibration of the robot that has occurred in the test operation, an acceleration rate of movement of the robot to a position at which the operation of accelerating the laser head is started in the actual operation.
16. The laser processing apparatus according to claim 15, wherein, in the test operation, the control apparatus is programmed to cause the robot to move to the position at which the operation of accelerating the laser head is started and then to cause the robot to operate in accordance with trajectory data of a predetermined section comprising a taught position at which irradiation of the laser beam is started and a taught position at which the irradiation of the laser beam is stopped.
17. The laser processing apparatus according to claim 16, wherein the control apparatus is programmed to adjust the acceleration rate on a basis of a control point obtained from an angle of a joint of the robot in a case where the robot is operating.
18. The laser processing apparatus according to claim 15, wherein the control apparatus is programmed to adjust the acceleration rate on a basis of time required, since a time point at which a commanded position of the robot is moved to the position at which the operation of accelerating the laser head is started in the test operation, for a control point obtained from an angle of a joint of the robot to settle to the position at which the operation of accelerating the laser head is started.
19. The laser processing apparatus according to claim 1, wherein the light source comprises a laser oscillator, and the laser head comprises no galvano mirror.
20. A control apparatus comprising a computer including a processor, wherein the control apparatus is programmed to control start and stop of generation of a laser beam in a light source and to control operation of a robot supporting a laser head that emits the laser beam generated in the light source, wherein the control apparatus is programmed to switch a signal at a first time point and to perform timekeeping to count time elapsed from the first time point, wherein the control apparatus is programmed to control the light source such that the light source starts to generate the laser beam at a second time point, the second time point being a time point when a first time as a predetermined time counted by the timekeeping has elapsed from the first time point, and wherein the control apparatus is programmed to cause, at or after the first time point, the robot to start an operation of accelerating the laser head such that a movement speed of the laser head with respect to a processing target object reaches a target speed at or before the second time point.
21. A laser processing method of processing a processing target object by emitting a laser beam generated in a light source from a laser head while moving the laser head by a robot supporting the laser head, wherein the laser processing method is performed by a control apparatus programmed to control start and stop of generation of the laser beam in the light source and to control operation of the robot, the method comprising: timekeeping by the control apparatus so as to count time elapsed from a first time point, controlling the light source by the control apparatus such that the light source starts to generate the laser beam at a second time point, the second time point being a time point when a first time as a predetermined time counted by the timekeeping has elapsed from the first time point, and wherein at or after the first time point, the control apparatus causes the robot to start an operation of accelerating the laser head such that a movement speed of the laser head with respect to the processing target object reaches a target speed at or before the second time point.
22. The laser processing apparatus according to claim 1, further comprising: a plurality of laser heads configured to emit the laser beam; a robot apparatus configured to respectively move the plurality of laser heads, the robot apparatus including the robot; and a switcher configured to switch an optical path to guide the laser beam generated in the light source to one of the plurality of laser heads, wherein the control apparatus is programmed to control a switching operation of the switcher and operation of the robot apparatus, wherein the plurality of laser heads comprise a first laser head as the laser head and a second laser head, and wherein the control apparatus is programmed to cause, before finishing irradiation of the laser beam on the processing target object by the first laser head, the robot apparatus to start an operation of accelerating the second laser head, to which the laser beam is to be guided next, such that a movement speed of the second laser head with respect to the processing target object reaches a target speed.
23. A producing method of producing an image forming apparatus in which welding of a frame body is performed by using the laser processing method according to claim 21, wherein the frame body includes the processing target object.
24. The laser processing apparatus according to claim 1, wherein the control apparatus is programmed to monitor a signal, to switch the signal, and to start the timekeeping in synchronization with a switch of the signal.
25. The laser processing apparatus according to claim 22, wherein the irradiation of the laser beam on the processing target object by the first laser head is finished at a third time point, wherein the control apparatus is programmed to perform timekeeping to count time elapsed from a fourth time point, wherein the control apparatus is programmed to control the light source such that the light source starts to generate the laser beam at a fifth time point, the fifth time point being a time point when a time of the same length as the first time counted by the timekeeping has elapsed from the fourth time point, and wherein at or after the fourth time point, the control apparatus is programmed to cause the robot to start an operation of accelerating the second laser head such that a movement speed of the second laser head with respect to the processing target object reaches a target speed at or before the fifth time point.
26. The laser processing apparatus according to claim 25, wherein the fourth time point is between the first time point and the second time point, and the switcher switches the optical path between the third time point and the fifth time point.
27. The laser processing apparatus according to claim 26, wherein a time from the first time point to the fourth time point is not shorter than a time from the third time point to the fifth time point.
28. The laser processing apparatus according to claim 22, wherein the plurality of laser heads comprise no galvano mirror, and the switcher comprises a plurality of mirrors.
29. A producing method of producing a frame body comprising: preparing a first member of the frame body, wherein the first member includes a first processing target object; preparing a second member of the frame body, wherein the second member includes a second processing target object; and fixing the first member and the second member to each other by laser welding using the laser processing method according to claim 21, wherein the processing target object includes the first processing target object and the second processing target object.
30. A producing method of producing an apparatus which comprises an operation portion and a frame body which includes the processing target object, wherein laser welding to produce the frame body is performed by using the laser processing apparatus according to claim 22.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
(52) Exemplary embodiments of the present invention will be described in detail below with reference to drawings.
First Exemplary Embodiment
(54) The laser oscillator 103 and the laser head 102 are interconnected via an optical fiber cable 151 serving as an optical path for the laser beam. The laser oscillator 103 and the controller 121 are interconnected via a cable 153 such that a digital signal can be communicated therebetween. The robot arm 111 and the robot controller 122 are interconnected via a cable 155 including a power line and a signal line. The controller 121 and the robot controller 122 are interconnected via a cable 156 such that a digital signal can be communicated therebetween.
(55) The laser oscillator 103 is a continuous wave laser or a pulsed laser, and generates a laser beam by laser oscillation. The laser beam generated in the laser oscillator 103 is transmitted to the laser head 102 via the optical fiber cable 151. The laser head 102 emits the laser beam L generated in the laser oscillator 103. The laser beam L emitted from the laser head 102 is focused at a position a predetermined distance from the laser head 102. The controller 121 controls start and stop of generation of the laser beam in the laser oscillator 103. That is, the controller 121 commands the laser oscillator 103 to start or stop generation of the laser beam via the cable 153.
(56) The robot 101 is, for example, a vertically articulated robot, and includes a robot arm 111 and a robot hand 112. The robot hand 112 serves as an example of an end effector attached to the robot arm 111. The robot 101 supports the laser head 102. In the first exemplary embodiment, the robot 101 supports the laser head 102 by holding the laser head 102 by the robot hand 112. To be noted, for example, the laser head 102 may be supported by the robot 101 by attaching the laser head 102 to the distal end of the robot arm 111 or to the robot hand 112.
(57) Since the laser head 102 is supported by the robot 101, the laser head 102 can be moved to a desired position and orientation by moving the robot 101. By moving the robot 101 to move the laser head 102 to the desired position and orientation, the focal point of the laser beam L can be moved to a desired position in the space. By focusing the laser beam L on a position at which a welding bead is to be formed on a processing target object W, the processing target object W can be subjected to welding by the laser beam L. To be noted, a processed product can be obtained by processing the processing target object W.
(58) In the first exemplary embodiment, the control apparatus 120 controls the laser oscillator 103 and the robot 101 to perform laser seam welding. In the laser seam welding, there are a mode in which a continuous wave is used as the laser beam L and a mode in which a pulse wave is used as the laser beam L, and either of these modes may be selected. In the laser seam welding, the surface of the processing target object W needs to be scanned by the laser beam L. In the first exemplary embodiment, the laser beam L is emitted while moving the laser head 102 supported by the robot 101 to scan the surface of the processing target object W by the laser beam L without using a galvano mirror, and thus laser welding is performed. Since the galvano mirror is omitted, the cost can be reduced. If there are a plurality of welding target portions serving as processing target portions on the processing target object W, the welding target portions are sequentially subjected to laser welding.
(60) The controller 121 includes interfaces: I/Fs 311 and 312. The CPU 301, the ROM 302, the RAM 303, the HDD 304, the disk drive 305, and the I/Fs 311 and 312 are communicably interconnected via a bus 310. The I/F 312 is connected to the laser oscillator 103 via the cable 153.
(61) The CPU 301 is connected to a clock generation circuit 314. The CPU 301 operates in synchronization with a clock signal generated in the clock generation circuit 314. That is, the operating frequency of the CPU 301 is determined depending on the clock signal of the clock generation circuit 314.
(62) The HDD 304 stores, in other words, records a control program 321 for the CPU 301 to perform timekeeping processing and signal communication processing. The CPU 301 performs various processing such as the timekeeping processing and the signal communication processing in accordance with the control program 321. The HDD 304 stores, in other words, records various data 322 including data of time such as a first time T1 and a second time T2_j that will be described later. Here, j is a positive integer and a serial number indicating the order of welding given in correspondence with the welding target portion serving as the processing target portion. To be noted, the data 322 may be incorporated in the control program 321.
(63) The CPU 301 functions as a software timer by executing the control program 321. Specifically, the CPU 301 functions as a timer that performs timekeeping of the first time T1 and a timer that performs timekeeping of the second time T2_j. The CPU 301 controls the laser oscillator 103 by executing the control program 321 and transmitting a laser oscillation command SR1 serving as a signal to the laser oscillator 103. The control program 321 regularly performs reading from I/Fs, arithmetic processing, and output to I/Fs. The period of this regular processing will be referred to as a control period of the controller 121.
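Since the CPU 301 acts as a software timer driven by this regular processing, a time such as the first time T1 is naturally counted in units of the control period. The following is a minimal sketch of one plausible way to do this; the names `ticks_for` and `SoftwareTimer` and the round-up behavior are assumptions for illustration, not taken from the patent:

```python
import math

# Assumed sketch: a software timer that counts control periods. A requested
# time such as T1 is converted into a whole number of control periods;
# rounding up guarantees that at least the requested time elapses before
# the timer expires.

def ticks_for(time_units, control_period_units):
    """Number of control periods needed to cover time_units."""
    return math.ceil(time_units / control_period_units)

class SoftwareTimer:
    def __init__(self, duration_ticks):
        self.remaining = duration_ticks

    def tick(self):
        """Call once per control period; returns True once the time elapsed."""
        if self.remaining > 0:
            self.remaining -= 1
        return self.remaining == 0
```

A 10 ms time with a 1 ms control period thus needs 10 ticks, and a 10.5 ms time needs 11, so the timer never expires early.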
(64) To be noted, the recording medium in which the control program 321 is recorded may be any recording medium as long as the recording medium can be read by a computer. For example, as the recording medium for supplying the control program 321, the ROM 302 illustrated in
(65) The robot controller 122 is a dedicated computer for controlling the robot 101. To be noted, in
(66) The robot controller 122 includes a field-programmable gate array: FPGA 416 and a current amplifier 417. The FPGA 416 serves as an example of a servo calculation portion. The robot controller 122 includes an I/F 411. The CPU 401, the ROM 402, the RAM 403, the HDD 404, the FPGA 416, and the I/F 411 are communicably interconnected via a bus 410. The I/F 311 of the controller 121 and the I/F 411 of the robot controller 122 are interconnected via a cable 156.
(67) The CPU 401 is connected to a clock generation circuit 414, and the FPGA 416 is connected to a clock generation circuit 415. The CPU 401 operates in synchronization with a clock signal generated in the clock generation circuit 414, and the FPGA 416 operates in synchronization with a clock signal generated in the clock generation circuit 415. That is, the operating frequency of the CPU 401 is determined depending on the clock signal of the clock generation circuit 414, and the operating frequency of the FPGA 416 is determined depending on the clock signal of the clock generation circuit 415.
(68) The HDD 404 stores, in other words, records a program 421 and a robot program 422.
(69) The robot arm 111 includes a plurality of motors that drive joints thereof and a plurality of encoders serving as examples of a position sensor that detects the rotation angles or rotation positions of the motors. For example, six motors M1 to M6 and six encoders En1 to En6 are provided. The robot arm 111 includes a detection circuit 115 connected to the encoders En1 to En6 and constituted by an electronic circuit.
(70) In the configuration described above, the controller 121, specifically the CPU 301 transmits the laser oscillation command SR1 serving as a signal from the I/F 312 to the laser oscillator 103. The laser oscillator 103 that has received the laser oscillation command SR1 operates to generate a laser beam in accordance with the laser oscillation command SR1. Specifically, as the laser oscillation command SR1, the controller 121 switches the voltage of an electric signal from a low level to a high level when commanding the laser oscillator 103 to generate a laser beam, and transmits the command from the I/F 312. The switching of the voltage of the electric signal to the high level is also referred to as turning the laser oscillation command SR1 on. The controller 121 switches the voltage of the electric signal to the low level when commanding the laser oscillator 103 to stop generation of the laser beam. The switching of the voltage of the electric signal to the low level is also referred to as turning the laser oscillation command SR1 off. Therefore, the laser oscillator 103 generates the laser beam when the laser oscillation command SR1 is on, and stops generation of the laser beam when the laser oscillation command SR1 is off.
(71) The laser oscillator 103 transmits a signal SR2 indicating that a laser beam is being generated to the controller 121. Specifically, the laser oscillator 103 transmits, as the signal SR2, an electric signal whose voltage is the high level to the controller 121 when the laser beam is being generated. Setting the voltage of the electric signal to the high level is also referred to as turning the signal SR2 on. The laser oscillator 103 sets the voltage of the electric signal to the low level when the laser beam is stopped. Setting the voltage of the electric signal to the low level is also referred to as turning the signal SR2 off.
(72) The robot controller 122 controls operation of the robot arm 111 and the robot hand 112 in accordance with the robot program 422. The robot controller 122 manages a sequence in accordance with the robot program 422. That is, the time of starting operation of the robot arm 111 is managed by the robot controller 122. Operating the robot arm 111 will be expressed as operating the robot 101 hereinbelow.
(73) The robot controller 122 transmits, to the controller 121, a signal SB that is a digital signal indicating that the robot 101 is operating on the basis of trajectory data of a predetermined section including a welding target portion. Specifically, the robot controller 122 switches the voltage of the electric signal indicating the signal SB from the low level to the high level and transmits the signal SB to the controller 121 when causing the robot 101 to start an operation of accelerating the laser head 102. The robot controller 122 switches the voltage of the electric signal indicating the signal SB from the high level to the low level at a predetermined timing. Switching the voltage of the electric signal indicating the signal SB to the high level will be also referred to as turning the signal SB on hereinbelow. In addition, switching the voltage of the electric signal indicating the signal SB to the low level will be also referred to as turning the signal SB off.
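The sequence described above, in which the signal SB turning on triggers timekeeping of the first time and the laser oscillation command SR1 is then held on for the second time, can be sketched as a per-control-period simulation. All names, and the modeling of the times as whole tick counts, are assumptions for illustration rather than the patent's implementation:

```python
# Assumed sketch of the SB -> T1 -> SR1 sequence: at the rising edge of SB
# the controller starts timekeeping; after t1_ticks control periods it turns
# the laser oscillation command SR1 on, and after t2_ticks more it turns
# SR1 off again.

def run_sequence(t1_ticks, t2_ticks, sb_on_tick, total_ticks):
    """Simulate control periods; return the SR1 state (bool) for each tick."""
    sr1 = []
    prev_sb = False
    counting = False
    count = 0
    laser_on = False
    on_count = 0
    for tick in range(total_ticks):
        sb = tick >= sb_on_tick
        if sb and not prev_sb:        # rising edge of SB: start timekeeping
            counting = True
            count = 0
        prev_sb = sb
        if counting:
            if count >= t1_ticks:     # first time T1 elapsed: laser on
                counting = False
                laser_on = True
                on_count = 0
            else:
                count += 1
        if laser_on:
            if on_count >= t2_ticks:  # second time T2 elapsed: laser off
                laser_on = False
            else:
                on_count += 1
        sr1.append(laser_on)
    return sr1
```

With SB turning on at tick 2, T1 of 3 ticks, and T2 of 4 ticks, SR1 is on exactly for ticks 5 through 8: the laser starts T1 after the acceleration starts and stops T2 after that.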
(74) Orientation control of the robot arm 111, that is, position/orientation control of the laser head 102, specifically position control of the focal point of the laser beam L, is performed by means of a motor current SC1 supplied from the robot controller 122 to the motors M1 to M6 of the robot arm 111. The robot program 422 is a program described in a robot language. A user can instruct operation of the robot 101 by describing the robot language in text data. The CPU 401 of the robot controller 122 executes the program 421 to interpret the robot program 422, generate trajectory data constituted by a plurality of commands, and output the generated trajectory data to the FPGA 416. The FPGA 416 performs servo calculation in accordance with the trajectory data. That is, the FPGA 416 generates a motor current command by the servo calculation, and transmits the generated motor current command to the current amplifier 417. The current amplifier 417 generates the motor current SC1 corresponding to the motor current command, and supplies the motor current SC1 to the motors M1 to M6 at respective joints of the robot arm 111. The motors M1 to M6 of the robot arm 111 are driven by the supplied motor current SC1. The detection circuit 115 obtains detection signals from the encoders En1 to En6 when the motors M1 to M6 rotate. The detection circuit 115 converts the detection signals into a serial digital signal SC2, and transmits the digital signal SC2 to the FPGA 416 of the robot controller 122.
(75) The digital signal SC2 indicating rotation angles or rotation positions of the motors M1 to M6 is used for the servo calculation in the FPGA 416. The program 421 regularly performs reading from I/Fs, arithmetic processing, and output to I/Fs. The period of this regular processing will be referred to as a control period of the robot controller 122. The detection signals of the encoders En1 to En6 are ABZ-phase pulse signals. The detection circuit 115 converts the pulse signals of the encoders En1 to En6 into the digital signal SC2 indicating a pulse number, which can be converted into a positional coordinate, and feeds the digital signal SC2 back to the FPGA 416. To be noted, the servo mechanism, that is, the FPGA 416 and the current amplifier 417 may be disposed in the robot arm 111, and a position command, that is, the trajectory data may be transmitted to the servo mechanism in the robot arm 111 from the CPU 401 via a cable. The FPGA 416 may be omitted by imparting the function of the FPGA 416 to the CPU 401. Although a case where the pulse signals of the encoders En1 to En6 are converted into a digital signal and transmitted to the robot controller 122 has been described, the pulse signals of the encoders En1 to En6 may be directly transmitted to the robot controller 122. Resolvers may be used as position sensors instead of the encoders En1 to En6.
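As one illustration of how a pulse number fed back in the digital signal SC2 can be converted into a positional quantity, the sketch below converts an accumulated encoder pulse count into a joint angle. The pulses-per-revolution and gear-ratio values are assumed for illustration and are not taken from the patent:

```python
# Assumed sketch: converting an accumulated encoder pulse count into a joint
# angle in degrees. pulses_per_rev and gear_ratio are illustrative values;
# they are fixed properties of the motor encoder and the joint reducer.

def pulses_to_joint_angle_deg(pulse_count, pulses_per_rev=4000, gear_ratio=100):
    """Joint angle (degrees) corresponding to an accumulated pulse count."""
    motor_revolutions = pulse_count / pulses_per_rev
    joint_revolutions = motor_revolutions / gear_ratio
    return joint_revolutions * 360.0
```

With these assumed values, 400000 pulses correspond to 100 motor revolutions, hence one full joint revolution of 360 degrees.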
(76) Here, a control point of movement of the robot 101 may be a point that moves together with the tip of the hand of the robot 101, and, in the first exemplary embodiment, a focal point of the laser beam is set as the control point of movement of the robot 101. The control point is expressed by six parameters composed of three parameters X, Y, and Z indicating a position in the three-dimensional space and three parameters A, B, and C indicating an orientation in the three-dimensional space. Therefore, the control point can be regarded as one point in a six-dimensional task space. In the robot program 422, a taught point that is a movement target of the control point is described, in other words, designated by a user. The robot controller 122 interprets the robot program 422 and generates the trajectory data connecting taught points, that is, the trajectory data in which the taught points are interpolated. Examples of an interpolation method of interpolating the taught points include linear interpolation, circular interpolation, and joint interpolation, and these interpolation methods are described, in other words, designated in the robot program 422 as an interpolation command by the user.
(77) The CPU 401 of the robot controller 122 converts the trajectory data obtained by the interpolation into a command of angles of respective joints of the robot 101, and the FPGA 416 performs servo calculation. As a result of the servo calculation, the FPGA 416 determines a current command to be transmitted to the current amplifier 417. The servo calculation is performed in each cycle of the control period of the CPU 401 of the robot controller 122. The command of angles of respective joints is updated in each cycle of the control period, and the speed of the robot 101 is determined by controlling the amount of increase or decrease thereof. That is, the robot 101 moves quickly in the case where the amount of increase or decrease of commanded angles of the joints is large, and moves slowly in the case where the amount of the increase or decrease is small.
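The relation between the per-period increment of the commanded position and the resulting speed can be sketched as follows for a single axis; the function name, and the assumption that the target lies ahead of the start, are illustrative only:

```python
# Assumed sketch: the commanded position is updated once per control period,
# and the size of each update sets the speed. For a target speed v and
# control period dt, each update advances the command by v * dt.

def command_positions(start, target, speed, dt):
    """Commanded positions, one per control period, from start toward target.

    Assumes target > start; the final command lands exactly on the target.
    """
    step = speed * dt            # amount of increase per control period
    cmds = []
    pos = start
    while pos + step < target:
        pos += step
        cmds.append(pos)
    cmds.append(target)
    return cmds

# e.g. 10 mm of travel at 100 mm/s with a 4 ms control period: 0.4 mm per update
cmds = command_positions(0.0, 10.0, 100.0, 0.004)
```

A larger increment per period (a faster speed or a longer control period) reaches the target in fewer updates, which is exactly the "amount of increase or decrease" relation described above.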
(78) A path in which the control point, which is the focal point of the laser beam, actually moves by the operation of the robot 101 can be deviated from a path commanded by the robot program 422 due to response delay of position control.
(80) A case of executing a linear interpolation command serving as an example of the interpolation command described in the robot program 422 will be described below. The linear interpolation command is a command of performing interpolation such that the control point moves along a straight line connecting a first positional coordinate and a second positional coordinate, and the path of the control point becomes a line segment in the three-dimensional space. To be noted, there are two possible options, one of which is also interpolating the orientation of the robot 101 by using the first positional coordinate and the second positional coordinate and the other of which is maintaining the orientation at the first positional coordinate to the second positional coordinate, and, in the first exemplary embodiment, the orientation is also interpolated. In both of these options, the control point of the robot 101, that is, the focal point of the laser beam moves on a line segment connecting the first positional coordinate and the second positional coordinate. To be noted, in most cases, the robot program 422 uses the current commanded position of the robot 101 as the first positional coordinate, and only the second positional coordinate, which is the movement destination, is designated. For the positional coordinate, a taught point, in other words, a taught position set by a user may be used, or a positional coordinate indicating a position different from the taught point, obtained by additional calculation from the taught point, may be used. When the CPU 401 of the robot controller 122 executes the linear interpolation command, the CPU 401 generates trajectory data connecting the current commanded position and a target position that is the movement destination, and supplies the trajectory data to the FPGA 416 in each cycle of the control period.
The linear interpolation command is completed when supply of all trajectory data by the CPU 401 of the robot controller 122 is completed, and the CPU 401 executes the next command described in the robot program 422.
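The per-cycle generation of trajectory data described above can be sketched as follows. This is a minimal illustration, not the apparatus's actual implementation: the function name, the constant-speed profile, and the treatment of only the positional (not orientation) components are assumptions made for clarity.

```python
import math

def linear_interpolation(p_start, p_end, speed, control_period):
    """Yield one interpolated position per control cycle from p_start to p_end.

    Sketch of a linear interpolation command: the segment is divided into
    control-period steps, and one point per cycle would be supplied to the
    servo layer (the FPGA 416 in the text).
    """
    distance = math.dist(p_start, p_end)
    n_steps = max(1, round(distance / (speed * control_period)))
    for i in range(1, n_steps + 1):
        t = i / n_steps  # interpolation parameter in [0, 1]
        yield tuple(s + t * (e - s) for s, e in zip(p_start, p_end))

# Example: move 10 mm along X at 100 mm/s with a 2 ms control period.
points = list(linear_interpolation((0.0, 0.0, 0.0), (10.0, 0.0, 0.0),
                                   speed=100.0, control_period=0.002))
```

In this sketch the command "completes" when the generator is exhausted, mirroring the text's statement that the linear interpolation command is completed when supply of all trajectory data is completed.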
(81) In
(82) The robot controller 122 derives, in accordance with a predetermined algorithm, positions 54.sub.1 and 55.sub.1 positioned on an extension line passing through the taught positions 52.sub.1 and 53.sub.1. This algorithm is described in the robot program 422. The position 55.sub.1 is provided to the robot program 422 as a parameter to execute the linear interpolation command. To be noted, in order to cause the robot program 422 to execute the linear interpolation command to the position 55.sub.1, it is required that a movement command to the position 54.sub.1 has been executed and the commanded position of the robot 101 has reached the position 54.sub.1.
(83) The position 54.sub.1 is a commanded position at which movement of the control point is started. The position 55.sub.1 is a commanded position at which movement of the control point is finished. Further, the robot controller 122 generates trajectory data P.sub.1 of a predetermined section by linear interpolation. The predetermined section includes the section between the taught positions 52.sub.1 and 53.sub.1, and includes the position 54.sub.1 as a start point and the position 55.sub.1 as an end point. Similarly, the robot controller 122 generates trajectory data P.sub.2 of a predetermined section in accordance with a linear interpolation command. The predetermined section includes a position 54.sub.2 at which movement of the control point is started as a start point and a position 55.sub.2 at which movement of the control point is finished as an end point. To be noted, although the predetermined algorithm is described in the robot program 422 in the present exemplary embodiment, the algorithm may be implemented inside the robot controller 122 and arithmetic processing can be performed in accordance with a command.
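The predetermined algorithm that places the start and end points on the extension line through the taught positions can be sketched as below. This is a hedged illustration only; the function name and the particular offset distances are assumptions, and the real algorithm is the one described in the robot program 422.

```python
import math

def extend_segment(p52, p53, accel_dist, decel_dist):
    """Return (p54, p55) on the extension line through taught points p52, p53.

    p54 is offset by accel_dist before the welding start point p52, and
    p55 is offset by decel_dist after the welding end point p53.
    """
    length = math.dist(p52, p53)
    unit = tuple((b - a) / length for a, b in zip(p52, p53))
    p54 = tuple(a - accel_dist * u for a, u in zip(p52, unit))
    p55 = tuple(b + decel_dist * u for b, u in zip(p53, unit))
    return p54, p55

# Example: a 30 mm weld along X with a 10 mm run-up and a 5 mm run-out.
p54, p55 = extend_segment((0.0, 0.0, 0.0), (30.0, 0.0, 0.0), 10.0, 5.0)
```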
(84) As described above, the positions 52.sub.1, 53.sub.1, 52.sub.2, and 53.sub.2 that are included in the trajectory data P.sub.1 and P.sub.2 to be commanded to the robot 101 are taught points designated by the user. In contrast, the positions 54.sub.1, 55.sub.1, 54.sub.2, and 55.sub.2 that are included in the trajectory data P.sub.1 and P.sub.2 are commands obtained by the robot controller 122 by automatic calculation in accordance with the robot program 422, and are not taught points.
(85) Also for the section between the positions 55.sub.1 and 54.sub.2, the robot controller 122 performs interpolation in accordance with the interpolation command described in the robot program 422 to generate trajectory data P.sub.1-2. To be noted, since the laser head 102 is merely moved in the section between the positions 55.sub.1 and 54.sub.2, the interpolation can be performed by an arbitrary interpolation method. Therefore, an arbitrary interpolation command can be described in the robot program 422. For example, in the case where a linear interpolation command is described in the robot program 422, linear interpolation can be performed. For example, in the case where a joint interpolation command is described in the robot program 422, joint interpolation can be performed. The joint interpolation command is a command for dividing the operation amount of each joint of the robot 101 by time and performing interpolation, and the path of the control point is not linear in this case. However, in this case, the operation of the robot 101 is faster than in the case where the robot 101 is operated in accordance with the linear interpolation command.
(86) In the case of performing laser seam welding, acceleration sections for the movement speed of the control point to reach target speeds VW.sub.1 and VW.sub.2 are required. In the first exemplary embodiment, the section between the positions 54.sub.1 and 52.sub.1 and the section between the positions 54.sub.2 and 52.sub.2 are acceleration sections. The target speeds VW.sub.1 and VW.sub.2 are described, in other words, designated in the robot program 422.
(87) To be noted, although the control point needs to be moved with a high precision in the sections for welding, that is, the section between the positions 52.sub.1 and 53.sub.1 and the section between the positions 52.sub.2 and 53.sub.2, the positional precision may be low in other sections in which laser seam welding is not performed, for example, the acceleration sections. Therefore, as illustrated in
(88) Here, when the robot controller 122 commands the positions 54.sub.1 and 54.sub.2 to the robot 101, the robot 101 may be either stationary or moving; both cases are acceptable.
(90) The robot controller 122 controls operation of the robot 101 by sequentially using the trajectory data P.sub.1, the trajectory data P.sub.1-2, and the trajectory data P.sub.2 in this order. However, due to the response delay of the position control, the control point moves with a delay relative to the commanded timing as illustrated in
(91) When the robot controller 122 executes the linear interpolation command and starts supplying the trajectory data P.sub.1 for welding on the first welding target portion, the command for the angle of the robot 101 starts to change from the position 54.sub.1 that is a start point of the trajectory data P.sub.1 to the position 55.sub.1 that is an end point of the trajectory data P.sub.1. At the time point TP1.sub.1 when this change is started, the operation of the robot 101 to accelerate the control point, that is, the laser head 102 is started.
(92) When the robot controller 122 commands the position 52.sub.1 to the robot 101, the control point passes a position corresponding to the commanded position 52.sub.1 at the time point TP2.sub.1 delayed with respect to the commanded time. When the robot controller 122 commands the position 53.sub.1 to the robot 101, the control point passes a position corresponding to the commanded position 53.sub.1 at the time point TP3.sub.1 delayed with respect to the commanded time. When the robot controller 122 commands the position 55.sub.1 that is the end point of the trajectory data P.sub.1 to the robot 101, the control point passes a position corresponding to the commanded position 55.sub.1 at the time point TP5.sub.1 delayed with respect to the commanded time. The irradiation of the laser beam needs to be started when the control point actually passes the position 52.sub.1 and needs to be stopped when the control point actually passes the position 53.sub.1.
(93) Next, the robot controller 122 operates the robot 101 in accordance with the trajectory data P.sub.1-2 from the position 55.sub.1 to the position 54.sub.2 for preparation of next welding operation.
(94) When the robot controller 122 executes the linear interpolation command and starts supplying the trajectory data P.sub.2 for welding on the second welding target portion, the command for the angle of the robot 101 starts to change from the position 54.sub.2 that is a start point of the trajectory data P.sub.2 to the position 55.sub.2 that is an end point of the trajectory data P.sub.2. At the time point TP1.sub.2 when this change is started, the operation of the robot 101 to accelerate the control point, that is, the laser head 102 is started.
(95) When the robot controller 122 commands the position 52.sub.2 to the robot 101, the control point passes a position corresponding to the commanded position 52.sub.2 at the time point TP2.sub.2 delayed with respect to the commanded time. When the robot controller 122 commands the position 53.sub.2 to the robot 101, the control point passes a position corresponding to the commanded position 53.sub.2 at the time point TP3.sub.2 delayed with respect to the commanded time. When the robot controller 122 commands the position 55.sub.2 that is the end point of the trajectory data P.sub.2 to the robot 101, the control point passes a position corresponding to the commanded position 55.sub.2 at the time point TP5.sub.2 delayed with respect to the commanded time. The irradiation of the laser beam needs to be started when the control point actually passes the position 52.sub.2 and needs to be stopped when the control point actually passes the position 53.sub.2.
(96) In order to perform processing by irradiating the processing target object W with the laser beam L in a state in which the movement speed of the focal point of the laser beam L is constant at the target speed VW.sub.1 or VW.sub.2, the laser head 102 needs to be accelerated before the laser head 102 reaches the position to start irradiation of the laser beam L. The first time T1 is time for accelerating the laser head 102 such that the laser head 102 moves at the constant target speed VW.sub.1 or VW.sub.2. Second times T2.sub.1 and T2.sub.2 are respectively times for irradiating the processing target object W with the laser beam L in states in which the laser head 102 is moving at the constant target speeds VW.sub.1 and VW.sub.2.
(97) In the first exemplary embodiment, a period between the time points TP1.sub.1 and TP2.sub.1 and a period between the time points TP1.sub.2 and TP2.sub.2 are set as the first time T1 for accelerating the laser head 102 with respect to the processing target object W. In addition, by performing an experiment or calculation in advance, a period between the time points TP2.sub.1 and TP3.sub.1 is set as the second time T2.sub.1 for irradiation of the laser beam L, and a period between the time points TP2.sub.2 and TP3.sub.2 is set as the second time T2.sub.2 for irradiation of the laser beam L.
(98) Settings of the first time T1 and the second time T2.sub.j will be described in detail below. The actual speed of the laser head 102 deviates from the commanded speed thereof due to response delay of position control. Therefore, it is difficult to set the first time T1 by the robot program 422 alone. Therefore, the robot 101 is operated under various conditions with trial and error to measure the time in which the actual speed of the laser head 102 reaches the target speed Vw.sub.j in each condition, and the first time T1 is set on the basis of the measurement results.
(99) To be noted, although it is preferable that the speed of the laser head 102 reach the target speed Vw.sub.j and be constant when the first time T1 has elapsed, an error occurs in the speed due to factors such as the position and orientation of the robot 101, the target speed Vw.sub.j, and residual deviation derived from continuous operation of the robot 101. Therefore, it is preferable that, among the values obtained by the measurement under various conditions, the value with the smallest speed error, that is, the longest time, be set as the first time T1. That is, the first time T1 may be set such that the speed of the laser head 102 is within a predetermined range from the target speed Vw.sub.j when the first time T1 has elapsed after starting acceleration of the laser head 102. Although the first time T1 may vary depending on the target speed Vw.sub.j at each welding target portion, the processing of the controller 121 can be further simplified when the first time T1 is set to the same value.
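The T1 setting policy above, namely measuring the settling time under several conditions and adopting the worst case, can be sketched as follows. The condition names and sample values are hypothetical; real values would come from the experiments described in the text.

```python
# Hypothetical measured times (ms) for the actual speed to settle within
# a tolerance of the target speed, per operating condition.
measured_settle_times_ms = {
    "posture_A_100mm_per_s": 160,
    "posture_B_100mm_per_s": 185,
    "posture_A_150mm_per_s": 200,
}

# The longest measured time gives the smallest speed error, so it is
# adopted as the common first time T1 for all welding target portions.
T1_ms = max(measured_settle_times_ms.values())
```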
(101) To be noted, the algorithm for obtaining the positions 54.sub.j and 55.sub.j positioned on an extension line connecting the taught positions 52.sub.j and 53.sub.j needs to be changed between the
(102) The second times T2.sub.1 and T2.sub.2 are laser irradiation times, and are calculated by the following formula (1). Here, Tw, Ps, Pe, and Vw are used as variables in the calculation of formula (1). Tw corresponds to the second times T2.sub.1 and T2.sub.2, Ps corresponds to the positions 52.sub.1 and 52.sub.2 at which irradiation of the laser beam is started, Pe corresponds to the positions 53.sub.1 and 53.sub.2 at which the irradiation of the laser beam is stopped, and Vw corresponds to the target speed Vw.sub.j. The second time Tw is calculated for each welding target portion by using Ps, Pe, and Vw in accordance with the following formula (1).
(103) Tw=|Pe−Ps|/Vw  (1)
(104) That is, the value of Tw serving as the second time is obtained by dividing the distance between Ps and Pe by Vw. Different values of Ps, Pe, and Vw can be used for different welding target portions. Therefore, the value of Tw serving as the second time T2.sub.1 or T2.sub.2 corresponds to the length, or extent, of the welding target portion. The values of Tw calculated in this manner are set as the second times T2.sub.1 and T2.sub.2.
(105) The positions 52.sub.1 and 52.sub.2 that are taught points are each constituted by information of six degrees of freedom about position and orientation. Specifically, the information constituting the positions 52.sub.1 and 52.sub.2 includes X, Y, and Z, which are information about the position of the robot 101 with respect to the base, and A, B, and C, which are information about the holding angle of the laser head 102. The same applies to the positions 53.sub.1 and 53.sub.2. Therefore, a distance in the three-dimensional space is obtained by using only the position information X, Y, and Z as Ps and Pe.
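The calculation of formula (1) from the six-degree-of-freedom taught points can be sketched as below; only the X, Y, Z components enter the distance, and the orientation components A, B, C are ignored. The function name and the numeric values are illustrative assumptions.

```python
import math

def second_time(ps, pe, vw):
    """Tw = |Pe - Ps| / Vw, using only the positional part (X, Y, Z)."""
    distance = math.dist(ps[:3], pe[:3])
    return distance / vw

# Taught points given as (X, Y, Z, A, B, C); a 50 mm weld at 100 mm/s.
tw = second_time((0.0, 0.0, 0.0, 0, 0, 90),
                 (30.0, 40.0, 0.0, 0, 0, 90), vw=100.0)
```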
(106) To be noted, this calculation may be performed by the robot controller 122, and the robot controller 122 may transfer the value of Tw serving as the second time to the controller 121. Alternatively, the robot controller 122 may calculate only the distance between Ps and Pe and transfer information about the distance to the controller 121, and the controller 121 may perform the remaining calculation for obtaining the value of Tw serving as the second time. Which of these is used can be appropriately selected on the basis of whether the target speed Vw is described or designated in the robot controller 122 or in the controller 121.
(107) In this manner, the first time T1 and the second times T2.sub.1 and T2.sub.2 are set before actually starting to operate the robot 101 in a production line. To be noted, the first time T1 is preferably set for both of the controller 121 and the robot controller 122 in advance.
(108) Incidentally, the welding target portions to be subjected to laser seam welding need to be irradiated with the laser beam L not at timings at which the positions 52.sub.1 and 52.sub.2 are commanded to the robot 101 but at the time points TP2.sub.1 and TP2.sub.2 at which the control point actually passes the positions 52.sub.1 and 52.sub.2. Similarly, the irradiation of the laser beam L needs to be stopped not at timings at which the positions 53.sub.1 and 53.sub.2 are commanded to the robot 101 but at the time points TP3.sub.1 and TP3.sub.2 at which the control point actually passes the positions 53.sub.1 and 53.sub.2.
(109) That is, there is a case where response delay has occurred in the position control of the robot 101 at the time point TP2.sub.1 at which the control point reaches the constant speed region. The response delay of the position control is expressed by the difference between the commanded position and the actual position. In the case where there is response delay of the position control, this response delay of the position control needs to be incorporated in the calculation of the positions 54.sub.1, 54.sub.2, 55.sub.1, and 55.sub.2.
(110) Therefore, the robot controller 122 calculates the positions 54.sub.1 and 54.sub.2 at which movement is started and the positions 55.sub.1 and 55.sub.2 at which movement is stopped such that the control point reaches a target position when the first time T1 has elapsed after starting the operation of accelerating the laser head 102. However, how the positions 54.sub.1, 54.sub.2, 55.sub.1, and 55.sub.2 are calculated is different between the case where the method of
(111) This calculation needs to be performed before executing the linear interpolation command for generating the trajectory data P.sub.1 and P.sub.2. For example, the calculation can be performed immediately before the linear interpolation command or before actually operating the robot 101 in the production line. To be noted, the calculation algorithm thereof is described in the robot program 422 and executed by the robot controller 122.
(113) When automatic operation is started and the linear interpolation command including the position 55.sub.1 as the target position is executed, the robot controller 122 supplies the trajectory data P.sub.1 for the movement from the position 54.sub.1 to the position 55.sub.1 at a predetermined control period. The robot 101 operates in accordance with the trajectory data P.sub.1. That is, the robot controller 122 causes the robot 101 to start the operation of accelerating the laser head 102 such that the movement speed of the laser head 102 with respect to the processing target object W reaches the target speed VW.sub.1. According to this, the laser head 102, that is, the control point starts moving from the position 54.sub.1 to the position 55.sub.1 and accelerating such that the movement speed thereof reaches the constant target speed VW.sub.1.
(114) When the robot controller 122 starts supplying the trajectory data P.sub.1, the robot controller 122 simultaneously switches the signal SB from off to on. The rising edge produced when the signal SB is switched from off to on serves as the synchronizing signal SBA, which is a predetermined signal. That is, the robot controller 122 transmits the synchronizing signal SBA, that is, the rising of the signal SB, to the controller 121 when causing the robot 101 to start the operation of accelerating the laser head 102. This timing is shown as the time point TP1.sub.1 in
(115) In the first exemplary embodiment, the rising of the signal SB is used as the synchronizing signal SBA. Therefore, the signal SB may fall at any time before the supply of the next trajectory data P.sub.2 is started, as long as at least one control period of the controller 121 has elapsed since the signal SB rose. In the example of
(116) The controller 121 monitors the signal SB transmitted from the robot controller 122, and starts timekeeping of the first time T1 when receiving the synchronizing signal SBA that is the rising of the signal SB. For example, the first time T1 is a fixed value such as 200 msec.
(117) When the first time T1 has elapsed, the laser head 102 has reached a constant speed state at the target speed VW.sub.1 for welding, and the control point is positioned at the commanded position 52.sub.1 illustrated in
(118) As described above, the robot controller 122 causes the robot 101 to start the operation of accelerating the laser head 102 such that the movement speed of the laser head 102 with respect to the processing target object W is constant at the target speed Vw.sub.j. Then, the controller 121 controls the laser oscillator 103 to generate the laser beam when the first time T1 has elapsed after starting the acceleration of the laser head 102. The controller 121 detects the start of acceleration of the laser head 102 caused by the control of the robot controller 122 by receiving the synchronizing signal SBA.
(119) Next, the controller 121 starts timekeeping of the second time T2.sub.1 when the timekeeping of the first time T1 is finished. The robot controller 122 causes the robot 101 to operate such that the movement speed of the laser head 102 is maintained at the target speed Vw while the processing target object W is irradiated with the laser beam L from the laser head 102. The controller 121 controls the laser oscillator 103 to stop the generation of the laser beam L when the second time T2.sub.1 has further elapsed after the first time T1 has elapsed, that is, when the timekeeping of the second time T2.sub.1 is finished. This time point is indicated as the time point TP3.sub.1 in
(120) Specifically, the controller 121 switches the laser oscillation command SR1 from on to off at the same time as finishing the timekeeping of the second time T2.sub.1. That is, the controller 121 commands the laser oscillator 103 to stop the generation of the laser beam when the timekeeping of the second time T2.sub.1 is finished. The laser oscillator 103 stops the laser oscillation when receiving the switching of the laser oscillation command SR1 from on to off.
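The controller-side sequence described in the preceding paragraphs, namely receiving the synchronizing signal SBA, counting the first time T1, switching the laser on, counting the second time T2, and switching the laser off, can be sketched as follows. This is a minimal sketch assuming simple callable hooks for the laser oscillation command SR1; in the real apparatus the timekeeping is performed at the controller's own (shorter) control period, for which time.sleep merely stands in.

```python
import time

def run_welding_cycle(t1_s, t2_s, laser_on, laser_off):
    """Invoked when the rising edge of SB (synchronizing signal SBA) is seen."""
    time.sleep(t1_s)   # first time T1: laser head accelerates to the target speed
    laser_on()         # switch SR1 on (irradiation start, time point TP2)
    time.sleep(t2_s)   # second time T2: constant-speed welding section
    laser_off()        # switch SR1 off (irradiation stop, time point TP3)

# Record the on/off order using short dummy times.
events = []
run_welding_cycle(0.001, 0.001,
                  laser_on=lambda: events.append("on"),
                  laser_off=lambda: events.append("off"))
```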
(121) The robot controller 122 commands the trajectory data P.sub.1-2 to the robot 101 for the next welding after commanding the position 55.sub.1, which is the end point of the trajectory data P.sub.1, to the robot 101. To be noted, in the case of performing the next welding by commanding the trajectory data P.sub.2 to the robot 101, the above-described operation is repeated. As described above, the robot controller 122 sequentially controls the operation of the robot 101 in accordance with a plurality of pieces of trajectory data P.sub.1 and P.sub.2 different from each other. Each time the controller 121 receives the synchronizing signal SBA, the controller 121 controls the laser oscillator 103 to generate a laser beam when the first time T1 of the same value has elapsed. That is, the same value is used for the first time T1 instead of setting a different value for each welding target portion. Therefore, each time the controller 121 receives the synchronizing signal SBA, the controller 121 counts the same time as the first time T1 without recognizing which of the trajectory data P.sub.1 and P.sub.2 the robot controller 122 has started commanding, and thus the processing is simplified.
(122) As described above, according to the first exemplary embodiment, the movement speed of the laser head 102 has reached the target speed Vw when the first time T1 has elapsed, and the processing target object W is irradiated with the laser beam in the state of the constant speed at this target speed Vw. That is, by moving the laser head 102 supported by the robot 101 at a constant speed with respect to the processing target object W, the focal point of the laser beam L can be moved at a constant speed along the surface of the processing target object W. Therefore, the amount of input heat on the processing target object W is uniformized in the movement direction of the focal point of the laser beam L, and thus a uniform welding bead can be formed on the processing target object W in the movement direction of the focal point of the laser beam L. As a result of this, highly precise laser seam welding can be realized.
(123) Here, the control period of the robot controller 122 is set to a value suitable for controlling the operation of the robot 101, for example, several milliseconds. In the first exemplary embodiment, the controller 121 controls the start and stop of the laser beam by the laser oscillator 103 at a shorter control period than the control period of the robot controller 122. That is, in the first exemplary embodiment, the control period of the controller 121 to control the laser oscillator 103 is shorter than the control period of the robot controller 122 to control the robot 101. Therefore, the controller 121 can more accurately manage the first time T1 and the second times T2.sub.1 and T2.sub.2 than the robot controller 122. That is, since the controller 121 can control the laser oscillator 103 with a shorter control period, the controller 121 can more accurately manage the timing for causing the laser oscillator 103 to start and stop generation of the laser beam. As a result, variation of the length of the welding bead can be reduced, and thus variation of the welding strength can be reduced.
(124) According to the first exemplary embodiment, the timing at which the robot controller 122 commands the start point of the trajectory data P.sub.j and the timing at which the controller 121 starts the timekeeping of the first time T1 are configured to be synchronized on the basis of the synchronizing signal SBA. The timing at which the robot controller 122 commands the start point of the trajectory data P.sub.j is the timing at which the robot controller 122 starts supplying the trajectory data P.sub.j. That is, the robot controller 122 generates the synchronizing signal SBA synchronized with the operation of the robot 101, and the on/off switching of the irradiation of the laser beam is managed on the basis of the elapsed time from the time point at which the controller 121 synchronizes with the synchronizing signal SBA. Therefore, the controller 121 and the robot controller 122 synchronize the operation of the robot 101 with the on/off switching of the laser oscillation of the laser oscillator 103 without performing complicated arithmetic processing or the like. Accordingly, the deviation between the timing of the operation of the robot 101 and the timing of the laser oscillation can be reduced. As a result of this, the difference between the actual position and the target position for starting the irradiation of the laser beam can be reduced. In addition, since the laser oscillation does not have to be controlled by performing complicated arithmetic processing while the robot 101 is operating, the operation of the robot 101 can be accelerated while securing the precision of laser processing, and thus the production efficiency of the processed product can be improved.
Second Exemplary Embodiment
(125) Next, a laser welding apparatus according to a second exemplary embodiment will be described.
(126) The control program 321A configured to manage the sequence is preset for the controller 121 illustrated in
(127) To be noted, the robot program 422A is described such that, when the robot controller 122 receives a signal indicating that the operation start command SA has been switched from off to on, the robot controller 122 starts commanding the trajectory data P.sub.1 and P.sub.2.
(130) The robot controller 122 monitors the operation start command SA, and starts commanding the trajectory data P.sub.1 when the operation start command SA is switched from off to on. That is, the robot controller 122 causes the robot 101 to start the operation of accelerating the laser head 102 in the case of receiving the operation start command SA.
(131) The robot controller 122 transmits the synchronizing signal SBA to the controller 121 when starting supplying the trajectory data P.sub.1 to control the robot 101. This timing is shown as the time point TP1.sub.1 in
(132) The controller 121 monitors the signal SB transmitted from the robot controller 122, starts timekeeping of the first time T1 when receiving the synchronizing signal SBA that is the rising of the signal SB, and then turns the operation start command SA off.
(133) When the first time T1 has elapsed, the laser head 102 has reached the constant speed state at the target speed VW.sub.1 for welding, and the control point is positioned at the commanded position 52.sub.1 illustrated in
(134) Next, the controller 121 starts timekeeping of the second time T2.sub.1 when the timekeeping of the first time T1 is finished. The controller 121 controls the laser oscillator 103 to stop the generation of the laser beam L when the second time T2.sub.1 has elapsed after the first time T1 has elapsed, that is, when the timekeeping of the second time T2.sub.1 is finished. This time point is shown as the time point TP3.sub.1 in
(135) Specifically, the controller 121 switches the laser oscillation command SR1 from on to off at the same time as finishing the timekeeping of the second time T2.sub.1. That is, the controller 121 commands the laser oscillator 103 to stop the generation of the laser beam when the timekeeping of the second time T2.sub.1 is finished. The laser oscillator 103 stops the laser oscillation when receiving the switching of the laser oscillation command SR1 from on to off.
(136) The robot controller 122 commands the trajectory data P.sub.1-2 to the robot 101 for the next welding after commanding the position 55.sub.1, which is the end point of the trajectory data P.sub.1, to the robot 101. The robot controller 122 switches the signal SB from on to off at the same time as commanding the end point of the trajectory data P.sub.1-2.
(137) After the timekeeping of the second time T2.sub.1 is finished and the laser is stopped, the controller 121 monitors the signal SB turning off. The timing at which the signal SB is turned off is the timing at which the command of the end point of the trajectory data P.sub.1-2 is completed, and indicates that the trajectory data P.sub.2 for the next welding can be executed. The controller 121 turns the operation start command SA on if the signal SB has been turned off and the timekeeping of the second time T2.sub.1 has been finished. This timing is shown as a time point TP0.sub.2 in
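The handshake of the second exemplary embodiment, in which the controller raises the operation start command SA, the robot controller answers with the rising edge of SB (the synchronizing signal SBA), and the fall of SB indicates readiness for the next cycle, can be sketched as below. The class, method names, and use of plain booleans for the signal lines are assumptions for illustration only.

```python
class Handshake:
    """Minimal sketch of the SA/SB signal exchange between the two controllers."""

    def __init__(self):
        self.sa = False  # operation start command (controller -> robot controller)
        self.sb = False  # signal line carrying SBA (robot controller -> controller)
        self.log = []

    def controller_start(self):
        """Controller raises SA to request the next welding operation."""
        self.sa = True
        self.log.append("SA on")

    def robot_ack(self):
        """Robot controller sees SA and starts supplying trajectory data."""
        if self.sa:
            self.sb = True  # rising edge serves as the synchronizing signal SBA
            self.log.append("SBA")

    def controller_on_sba(self):
        """Controller starts timekeeping of T1 and drops SA."""
        if self.sb:
            self.sa = False
            self.log.append("SA off")

    def robot_ready_for_next(self):
        """End point of the inter-weld trajectory commanded; SB falls."""
        self.sb = False
        self.log.append("SB off")

h = Handshake()
h.controller_start()
h.robot_ack()
h.controller_on_sba()
h.robot_ready_for_next()
```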
(138) As described above, the robot controller 122 sequentially controls the operation of the robot 101 in accordance with the plurality of pieces of trajectory data P.sub.1 and P.sub.2 different from each other. Each time the controller 121 receives the synchronizing signal SBA, the controller 121 controls the laser oscillator 103 to generate a laser beam when the first time T1 of the same value has elapsed.
(139) As illustrated in
(140) To be noted, although the signal SB is used as a signal indicating that the robot 101 is in the standby state, a signal indicating that the robot 101 is in the standby state may be set in addition to the signal SB.
(141) As described above, according to the second exemplary embodiment, laser seam welding can be performed with a high precision similarly to the first exemplary embodiment also in the case where the controller 121 manages the sequence. In addition, similarly to the first exemplary embodiment, complicated arithmetic processing does not have to be performed while the robot 101 is operating. Therefore, the operation of the robot 101 can be accelerated, and the production efficiency of the processed product can be improved.
(142) Since the sequence is managed by the controller 121, which is constituted by a general-purpose computer with good connectivity to peripheral devices, it becomes easier for the controller 121 to access peripheral devices such as an unillustrated database. Since the controller 121 is a general-purpose computer, it is also easy to connect the controller 121 to various field buses to transmit information of the apparatus to another apparatus or to read a value of a sensor or the like. It is also easy to communicate with an external server via Ethernet (registered trademark) or the like.
(143) The control period of the robot controller 122 is longer than the control period of the controller 121. Therefore, the timing at which the robot controller 122 recognizes the operation start command SA varies. In the second exemplary embodiment, the controller 121 starts timekeeping of the first time T1 in which the laser oscillation is performed, not at the timing at which the operation start command SA is transmitted but at the timing at which the synchronizing signal SBA is received. Therefore, the difference between the timing of operation of the robot 101 and the timing of laser oscillation of the laser oscillator 103 can be reduced.
Third Exemplary Embodiment
(144) Next, a laser processing method using a laser welding apparatus according to a third exemplary embodiment will be described. In the third exemplary embodiment, a method of controlling the robot 101 such that the control point, that is, the focal point of the laser beam L passes taught points when the timekeeping of the first time T1 is completed and when the timekeeping of the second time T2.sub.j is completed, as described in the first exemplary embodiment and the second exemplary embodiment, will be described. That is, a control method of the robot 101 such that the control point passes the position 52.sub.j, which is the taught start point for the welding, when the irradiation of the laser beam is started, and a control method of the robot 101 such that the control point passes the position 53.sub.j, which is the taught end point for the welding, when the irradiation of the laser beam is stopped, will be described. In addition, in the third exemplary embodiment, a method of controlling the robot 101 such that the control point passes the position 52.sub.j and the position 53.sub.j at the target speed Vw.sub.j will also be described. To be noted, the laser processing method of the third exemplary embodiment can be applied to either of the first exemplary embodiment and the second exemplary embodiment.
(145) Calculation for causing the control point to pass the position 52.sub.j when starting the irradiation of the laser beam and pass the position 53.sub.j when stopping the irradiation of the laser beam and for causing the scanning speed of the laser beam to be the target speed Vw.sub.j will be described.
(146) The robot controller 122 illustrated in
(147) The necessity of the acceleration distance La.sub.j and the deceleration distance Ld.sub.j will be described with reference to
(148) If the robot 101 is caused to execute the linear interpolation command to move the control point toward the position 53.sub.1 after recognizing that the control point has reached the position 52.sub.1, the control point can be linearly moved between the positions 52.sub.1 and 53.sub.1. However, in this method, the speed of the control point moving along the actual path 51, which represents the actual position, is 0 when it passes through the position 52.sub.1, and thus welding cannot be started at the target speed Vw.sub.1. Therefore, the acceleration distance La.sub.1 and the deceleration distance Ld.sub.1 need to be provided.
(149)
(150) In the case of the first exemplary embodiment, when the robot controller 122 executes the linear interpolation command or the joint interpolation command and completes supplying trajectory data P.sub.0-1, the robot controller 122 immediately starts executing the next linear interpolation command. The target position of the linear interpolation command is the position 55.sub.1. The robot controller 122 starts supplying the trajectory data P.sub.1 by executing the linear interpolation command. The robot controller 122 transmits the synchronizing signal SBA of
(151) In the case of the second exemplary embodiment, when the robot controller 122 completes supplying the trajectory data P.sub.0-1, the robot controller 122 waits for the operation start command SA from the controller 121 illustrated in
(152) In both cases of the first exemplary embodiment and the second exemplary embodiment, an appropriate acceleration distance La.sub.1 is provided such that the control point passes the position 52.sub.1 at the same time with the controller 121 finishing the timekeeping of the first time T1. Similarly, an appropriate deceleration distance Ld.sub.1 is provided such that the control point passes the position 53.sub.1 at the same time with the controller 121 finishing the timekeeping of the second time T2.sub.1.
(153) The commanded position and the actual position do not coincide at some moments. This occurs due to response delay of the robot. The commanded position and the actual position at each moment will be described with reference to
(154) There is a first method in which the control point is caused to pass the position 54.sub.1 by starting the supply of the trajectory data P.sub.1 after the control point reaches the position 54.sub.1, and a second method in which the supply is started without waiting for the control point to pass the position 54.sub.1. The first method is illustrated in
(155)
(156) According to the first method, by waiting for the control point to actually reach the position 54.sub.1 as illustrated in
(157) To use the first method in the case where the operation start command is not used as in the first exemplary embodiment, the robot controller 122 performs the following processing. That is, the robot controller 122 recognizes the control point actually reaching the position 54.sub.1 after completing the supply of the trajectory data P.sub.0-1, and then transmits the synchronizing signal SBA to the controller 121 at the same time with starting supplying the trajectory data P.sub.1.
(158) To use the first method in the case where the operation start command is used as in the second exemplary embodiment, the robot controller 122 performs the following processing. That is, the robot controller 122 recognizes the control point actually reaching the position 54.sub.1 after completing the supply of the trajectory data P.sub.0-1, and then switches the signal SB from on to off to notify the controller 121 that the robot controller 122 is ready for receiving the operation start command SA. Hereinafter, a signal SBB for notifying that the signal SB has been switched from on to off and the robot controller 122 is ready for receiving the operation start command SA will be also referred to as a robot standby signal SBB.
(159) According to the second method, there is a case where the control point does not reach the position 54.sub.1 as illustrated in
(160) In the case where the operation start command is not used as in the first exemplary embodiment, the robot controller 122 transmits the synchronizing signal SBA to the controller 121 at the same time with starting supplying the trajectory data P.sub.1 after completing the supply of the trajectory data P.sub.0-1.
(161) In the case where the operation start command is used as in the second exemplary embodiment, the robot controller 122 transmits the robot standby signal SBB to the controller 121 to notify that the robot controller 122 is ready for receiving the operation start command SA after completing the supply of the trajectory data P.sub.0-1.
(162) It is often the case that the control point has not reached the position 54.sub.1 when the robot controller 122 receives the operation start command SA from the controller 121. To be noted, in the case where the robot controller 122 does not immediately receive the operation start command SA, there is a possibility that the control point reaches the position 54.sub.1 when the robot controller 122 receives the operation start command SA. In this case, the same relationship is obtained between speed and time as
(163) As illustrated in
(164) A procedure of causing the control point to pass the positions 52.sub.1 and 53.sub.1 will be described.
(165)
(166)
(167) In the case of the second method illustrated in
(168) The second criterion is that the control point passes the positions 52.sub.1 and 53.sub.1. Whether or not the control point passes the position 52.sub.1 is tested, with trial and error, while causing the robot 101 to approach the position 54.sub.1 from various positions under various conditions. Then, the lower limit value of the first time T1 is determined from the test results of these conditions. Finally, the larger value of the two lower limit values is determined as the first time T1.
(169) A specific example of how the first time T1 is determined will be described.
(170) The robot controller 122 is caused to perform the next processing in a state in which the positions 54.sub.1 and 55.sub.1 have been set. That is, the robot controller 122 performs control of moving the control point from various positions to the position 54.sub.1 in accordance with a linear interpolation movement command or a joint interpolation movement command, and then moving the control point from the moved position to the position 55.sub.1 in accordance with the linear interpolation movement command.
(171)
(172) The first time T1 is determined on the basis of whether or not the speed is within an allowable range from the target speed VW.sub.1 in the response waveform 59 of the speed, and whether or not the position is within an allowable range in the waveforms 62q, 62r, and 62d which are response waveforms of the position. The first time T1 can be determined as a time from when the signal SBA is transmitted to when all of the response waveforms 59, 62q, 62r, and 62d are within the allowable ranges.
(173) Although only the target speed VW.sub.1 is illustrated in
(174) In this manner, the first time T1 can be determined. Next, the acceleration distance La.sub.1 is calculated by using the first time T1 such that the control point passes the position 52.sub.1 at a moment at which the elapse of the first time T1 is completed.
(175) The calculation formula for calculating the acceleration distance La.sub.1 changes in accordance with the characteristic of a control system. The control system has a concept called “type”, and the response characteristic changes in accordance with the type. The type of the feedback control system of the robot controller 122 can be obtained from the results of measurement illustrated in
(176) The feedback control system is known to have deviation shown in Table 1 in the case where feed forward control is not performed.
(177) TABLE 1

  Control system type | Step input r(t) = h | Ramp input r(t) = vt | Parabola input r(t) = at.sup.2/2
  Type 0              | h/(1 + K)           | ∞                    | ∞
  Type 1              | 0                   | v/K                  | ∞
  Type 2              | 0                   | 0                    | a/K
(178) Table 1 shows the relationship between the type of the control system and the position deviation at a time t=∞. Table 1 shows the constant value that the position deviation takes at t=∞ in the case where the step input r(t)=h, the ramp input r(t)=vt, and the parabola input r(t)=at.sup.2/2 are respectively input to the control system.
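The Type 1 row of Table 1 can be checked numerically. The sketch below is illustrative only: it models a type-1 servo as the first-order system y' = K·(r − y), which is a stand-in and not the patent's control system, and shows that a ramp input r(t) = vt leaves a constant position deviation of v/K.

```python
# Minimal sketch (illustrative model, not the patent's controller):
# a type-1 servo y' = K*(r - y) tracking a ramp r(t) = v*t settles to
# a constant position deviation of v / K, matching Table 1.

def ramp_tracking_error(K, v, dt=1e-4, steps=100_000):
    """Euler-integrate y' = K*(r - y) under r(t) = v*t; return the final deviation r - y."""
    y = 0.0
    e = 0.0
    for n in range(steps):
        r = v * n * dt     # ramp input at this instant
        e = r - y          # position deviation
        y += K * e * dt    # Euler step of the first-order lag
    return e

err = ramp_tracking_error(K=20.0, v=100.0)  # e.g. v in mm/s, K in 1/s
print(round(err, 3))  # settles near v/K = 5.0
```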
(179) In
(180) The acceleration distance La.sub.j is calculated in consideration of the above.
(181) A vector 56s.sub.j is a vector indicating the commanded position with respect to the actual position at the moment at which the elapse of the first time T1 is completed. The start point of the vector 56s.sub.j corresponds to the actual position, and the end point of the vector 56s.sub.j corresponds to the commanded position. At the moment at which the elapse of the first time T1 is completed, the control point needs to actually be at the position 52.sub.j. The acceleration distance La.sub.j can be expressed by a distance Lsr.sub.j and a distance Lse.sub.j. The distance Lsr.sub.j is a distance by which the commanded position of the robot 101 moves during the elapse of the first time T1, and the distance Lse.sub.j is a distance derived from the response delay of the robot 101 at the time point at which the elapse of the first time T1 is completed. The acceleration distance La.sub.j is calculated by the following formula (2). Specifically, the acceleration distance La.sub.j is obtained by subtracting the distance Lse.sub.j from the distance Lsr.sub.j. The CPU 401 of the robot controller 122 illustrated in
La.sub.j=Lsr.sub.j−Lse.sub.j (2)
(182) If the acceleration distance La.sub.j is obtained, the position 54.sub.j can be calculated. First, in order to determine an extension direction, the CPU 401 obtains a unit vector Pdir.sub.j by using the following formula (3). In the formula (3), Ps.sub.j represents the position 52.sub.j at which the irradiation of the laser beam is started, and Pe.sub.j represents the position 53.sub.j at which the irradiation of the laser beam is stopped.
(183)
(184) The calculation of the formula (3) is calculation of dividing a vector having the position 52.sub.j as the start point and the position 53.sub.j as the end point by a distance. That is, the unit vector Pdir.sub.j is a vector of a length of 1 indicating the direction in which the formation of bead progresses. The CPU 401 calculates the position 54.sub.j by using the unit vector Pdir.sub.j and the acceleration distance La.sub.j. The following formula (4) is a calculation formula for the position 54.sub.j. Here, Pa.sub.j represents the position 54.sub.j as a position at which the acceleration is started.
Pa.sub.j = Ps.sub.j − La.sub.j·Pdir.sub.j (4), where the vector Pa.sub.j is the acceleration start position
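Formulae (3) and (4) can be sketched together: the unit vector Pdir.sub.j along the weld, and the acceleration start position extended backwards from the taught start position. The numeric positions and the acceleration distance below are hypothetical example values, not taken from the patent.

```python
# Sketch of formulae (3) and (4). Ps, Pe, and La are hypothetical values.
import math

def unit_vector(ps, pe):
    """Formula (3): (Pe - Ps) divided by its length, a unit vector along the weld."""
    d = [e - s for s, e in zip(ps, pe)]
    norm = math.sqrt(sum(c * c for c in d))
    return [c / norm for c in d]

def acceleration_start(ps, pe, la):
    """Formula (4): extend backwards from Ps by La along -Pdir."""
    pdir = unit_vector(ps, pe)
    return [s - la * c for s, c in zip(ps, pdir)]

ps = [100.0, 0.0, 50.0]   # taught irradiation start position 52_j (mm)
pe = [160.0, 80.0, 50.0]  # taught irradiation end position 53_j (mm)
pa = acceleration_start(ps, pe, la=10.0)
print(pa)  # 10 mm before Ps along the weld direction
```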
(185) Next, how the distance Lsr.sub.j by which the commanded position of the robot 101 moves during the elapse of the first time T1, and the distance Lse.sub.j derived from the response delay of the robot 101 at the time point at which the elapse of the first time T1 is completed, are actually calculated will be described.
(186) Although the distance Lsr.sub.j changes in accordance with the trajectory of the robot 101 generated by the robot controller 122, the distance Lsr.sub.j can be easily calculated in the case where a speed command has a trapezoidal shape. As an example, a case where the speed command has a trapezoidal shape and an acceleration time Ta is a fixed value will be described.
(187)
(188)
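The trapezoidal case above can be sketched numerically. Since the formula itself is not reproduced in this text, the piecewise expression below is an assumed form implied by a trapezoidal speed command with a fixed acceleration time Ta: the commanded travel is the area under the speed profile, Vw·Ta/2 during acceleration plus Vw·(T1 − Ta) at constant speed when T1 ≥ Ta.

```python
# Hedged sketch (the patent's formula (5) is not reproduced in this text):
# commanded travel during T1 under a trapezoidal speed command with a
# fixed acceleration time Ta, as the area under the speed profile.

def commanded_travel(vw, t1, ta):
    """Distance moved by the commanded position during the first time T1."""
    if t1 < ta:
        # still accelerating: linear ramp v(t) = vw * t / ta
        return vw * t1 * t1 / (2.0 * ta)
    # triangle during acceleration plus rectangle at constant speed:
    # vw*ta/2 + vw*(t1 - ta) = vw*(t1 - ta/2)
    return vw * (t1 - ta / 2.0)

lsr = commanded_travel(vw=100.0, t1=0.5, ta=0.2)  # mm/s, s, s
print(lsr)  # area under the trapezoid
```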
(189) How the distance Lse.sub.j is calculated varies depending on the type of the control system shown in Table 1. The robot 101 is controlled such that the movement speed is kept at the target speed until the timekeeping of the first time T1 and the timekeeping of the second time T2.sub.j are completed. Therefore, as shown in Table 1, the input corresponds to the ramp input r(t)=vt. Therefore, in the case of the control system of Type 2, the distance Lse.sub.j is 0. In the case of the control system of Type 1, the distance Lse.sub.j is v/K. In the case of the control system of Type 0, the distance Lse.sub.j is a function of the time t. Since the control system of Type 0 is seldom employed as the control system of the robot 101, the control system of the robot 101 is Type 2 or Type 1. In the case where the control system of the robot 101 is Type 1, the calculation formula for the distance Lse.sub.j is the following formula (6), in which the target speed Vw.sub.j is divided by a predetermined constant Kv. The CPU 401 of the robot controller 122 illustrated in
(190)
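The case analysis above can be sketched as a small helper. The function name and the placeholder Kv value are illustrative, not from the patent; only the Type 1 and Type 2 relations stated above are used.

```python
# Sketch of the response-delay distance Lse_j by control-system type
# (Table 1): Type 2 leaves no steady-state lag under a ramp command,
# while Type 1 lags by Vw / Kv (formula (6)). Kv is a placeholder value.

def response_delay_distance(system_type, vw, kv=None):
    if system_type == 2:
        return 0.0          # no lag for a type-2 system
    if system_type == 1:
        return vw / kv      # formula (6): target speed over the constant Kv
    raise ValueError("Type 0 systems are not assumed for the robot 101")

print(response_delay_distance(1, vw=100.0, kv=20.0))  # Vw / Kv
print(response_delay_distance(2, vw=100.0))           # 0.0
```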
(191) In this manner, the acceleration distance La.sub.j can be calculated, and the position 54.sub.j can be calculated. As a result of this, the control point can be caused to actually pass the position 52.sub.j at the moment at which the elapse of the first time T1 is completed.
(192) How the deceleration distance Ld.sub.j is calculated will be described in a similar manner. By calculating the deceleration distance Ld.sub.j and the position 55.sub.j, the control point can be caused to actually pass the position 53.sub.j at the moment at which the elapse of the second time T2.sub.j is completed.
(193)
(194) A vector 56e.sub.j is a vector indicating the commanded position with respect to the actual position at the moment at which the elapse of the second time T2.sub.j is completed. The start point of the vector 56e.sub.j corresponds to the actual position, and the end point of the vector 56e.sub.j corresponds to the commanded position. At the moment at which the elapse of the second time T2.sub.j is completed, the control point needs to actually be at the position 53.sub.j. The actual speed at the moment at which the elapse of the second time T2.sub.j is completed is preferably equal to the target speed Vw.sub.j. The deceleration distance Ld.sub.j can be expressed by a distance Ler.sub.j and a distance Lee.sub.j. The distance Ler.sub.j is a distance by which the commanded position moves after the time point when the elapse of the second time T2.sub.j is completed until the commanded position reaches the position 55.sub.j and stops, and the distance Lee.sub.j is a distance derived from the response delay of the robot 101 at the time point at which the elapse of the second time T2.sub.j is completed. The deceleration distance Ld.sub.j is calculated by the following formula (7). Specifically, the deceleration distance Ld.sub.j is obtained by adding the distance Ler.sub.j to the distance Lee.sub.j. The CPU 401 of the robot controller 122 illustrated in
Ld.sub.j=Ler.sub.j+Lee.sub.j (7)
(195) If the deceleration distance Ld.sub.j is obtained, the position 55.sub.j can be calculated. The CPU 401 calculates the position 55.sub.j by using the unit vector Pdir.sub.j obtained by using the formula (3). Here, Pdj represents the position 55.sub.j as a position at which the control point stops after deceleration.
Pd.sub.j = Pe.sub.j + Ld.sub.j·Pdir.sub.j (8), where the vector Pd.sub.j is the stop position after deceleration
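Formulae (7) and (8) can be sketched together, mirroring the acceleration-side calculation: the deceleration distance is the sum of the two component distances, and the stop position is extended beyond the taught end position along the weld direction. The numeric values are hypothetical.

```python
# Sketch of formulae (7) and (8). Positions and distances are hypothetical.
import math

def stop_position(ps, pe, ler, lee):
    d = [e - s for s, e in zip(ps, pe)]
    norm = math.sqrt(sum(c * c for c in d))
    pdir = [c / norm for c in d]          # formula (3): unit vector along the weld
    ld = ler + lee                        # formula (7): Ld_j = Ler_j + Lee_j
    return [e + ld * c for e, c in zip(pe, pdir)]  # formula (8)

pd = stop_position(ps=[100.0, 0.0, 50.0], pe=[160.0, 80.0, 50.0],
                   ler=6.0, lee=4.0)
print(pd)  # 10 mm beyond Pe along the weld direction
```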
(196) Next, how the distance Ler.sub.j by which the commanded position moves after the time point when the elapse of the second time T2.sub.j is completed until the commanded position reaches the position 55.sub.j, and the distance Lee.sub.j derived from the response delay of the robot 101 at the time point at which the elapse of the second time T2.sub.j is completed, are actually calculated will be described.
(197) Although the distance Ler.sub.j changes in accordance with the trajectory of the robot 101 generated by the robot controller 122, the distance Ler.sub.j can be easily calculated in the case where a speed command has a trapezoidal shape. As an example, a case where the speed command has a trapezoidal shape and a deceleration time Td is a fixed value will be described.
(198)
(199)
(200) How the distance Lee.sub.j is calculated varies depending on the type of the control system shown in Table 1. In the case of the control system of Type 2, the distance Lee.sub.j is 0. In the case of the control system of Type 1, the distance Lee.sub.j is obtained by dividing the target speed Vw.sub.j by the predetermined constant Kv. In the case of the control system of Type 1, the calculation formula is the same as the formula (6), and Lee.sub.j is equal to Lse.sub.j in calculation. The CPU 401 of the robot controller 122 illustrated in
(201)
(202) To be noted, the constant Kv used in the formulae (6) and (10) is an unknown value. Therefore, the constant Kv needs to be identified in advance and stored in a storage device of the robot controller 122 illustrated in
(203)
(204) Although an example in which the commanded position is caused to stop at the position 55.sub.j has been described as an example of calculation of the deceleration distance Ld.sub.j, the commanded position does not have to be stopped.
(205) In this manner, the deceleration distance Ld.sub.j can be calculated, and the position 55.sub.j can be calculated. As a result of this, the control point can be caused to actually pass the position 53.sub.j at the moment at which the elapse of the second time T2.sub.j is completed.
(206) To be noted, in the case where there are a plurality of welding target portions, the distance Lse.sub.j may be measured for each welding target portion and the constant Kv may be calculated from the average value thereof. That is, the constant Kv may be calculated by the following formula (12) instead of the formula (11).
(207)
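Since formulae (11) and (12) are not reproduced in this text, the sketch below is an assumed form of the averaging described above: given the Type 1 relation Lse.sub.j = Vw.sub.j/Kv, each welding target portion yields an estimate Kv = Vw.sub.j/Lse.sub.j, and the estimates are averaged to smooth measurement noise. The function name and the sample values are illustrative.

```python
# Hedged sketch (formulae (11) and (12) are not reproduced in this text):
# per-portion estimates Kv_j = Vw_j / Lse_j, averaged over the welding
# target portions. All numbers are illustrative.

def identify_kv(measurements):
    """measurements: list of (target speed Vw_j, measured lag Lse_j)."""
    estimates = [vw / lse for vw, lse in measurements]
    return sum(estimates) / len(estimates)

kv = identify_kv([(100.0, 5.1), (100.0, 4.9), (80.0, 4.0)])
print(round(kv, 2))  # close to the underlying Kv of the servo
```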
(208) As described above, according to the third exemplary embodiment, the control point of the robot 101, that is, the focal point of the laser beam L can be caused to pass taught points when the timekeeping of the first time T1 is completed and when the timekeeping of the second time T2.sub.j is completed. That is, the control point can be caused to pass, with a high position precision, the position 52.sub.j that is a taught start position for the welding when starting the laser irradiation. Similarly, the control point can be caused to pass, with a high position precision, the position 53.sub.j that is a taught end position for the welding when stopping the laser irradiation. In addition, according to the third exemplary embodiment, the control point can be caused to pass the positions 52.sub.j and 53.sub.j at the target speed Vw.sub.j.
Fourth Exemplary Embodiment
(209) Next, a laser processing method using a laser welding apparatus according to a fourth exemplary embodiment will be described. In the third exemplary embodiment, a method of obtaining the distances Lse.sub.j and Lee.sub.j derived from the response delay of the robot 101 by using the formulae (6) and (10) in accordance with the type of the control system shown in Table 1 has been described. In addition, in the third exemplary embodiment, the constant Kv in the formulae (6) and (10) is set to the same value regardless of the welding target portion.
(210) However, when the robot 101 is operated, since the inertia moment of the robot 101 varies between an orientation in which the robot arm 111 is extended and an orientation in which the robot arm 111 is contracted, the response characteristic of the robot 101 changes between these orientations. Therefore, error can occur in the case where the distances Lse.sub.j and Lee.sub.j are calculated by using the constant Kv of the same value regardless of the welding target portion. Therefore, in the fourth exemplary embodiment, a method of obtaining the distances Lse.sub.j and Lee.sub.j in advance for each welding target portion will be described.
(211)
(212) In STEPA1, the controller 121 activates “TASK1” of the robot controller 122. In STEPA2, the controller 121 stands by until receiving the signal SBB. In STEPA3, the controller 121 turns the operation start command SA on. In STEPA4, the controller 121 stands by until receiving the signal SBA. In STEPA5, the controller 121 performs timekeeping of the first time T1. In STEPA6, the controller 121 transmits a signal SDA notifying the timing for recording the distance Lse.sub.3 to “TASK2” of the robot controller 122. In STEPA7, the controller 121 performs timekeeping of the second time T2.sub.j. Again in STEPA2, the controller 121 stands by until receiving the signal SBB. “LOOP” in
(213) Processing of the robot controller 122 includes processing of “TASK1” corresponding to the processing described in the first exemplary embodiment and the second exemplary embodiment, and processing of “TASK2” newly added to record the distance Lse.sub.j.
(214) The processing of “TASK1” is as follows. In STEPB1, the robot controller 122 activates “TASK2”. In STEPB2, the robot controller 122 transmits the signal SBB. In STEPB3, the robot controller 122 stands by until receiving the switching of the operation start command SA to on. In STEPB4, the robot controller 122 executes the linear interpolation command and supplies the trajectory data P.sub.j. In STEPB5, the robot controller 122 executes the linear interpolation command or the joint interpolation command and supplies the trajectory data P.sub.j−(j+1). The trajectory data P.sub.j−(j+1) is trajectory data connecting the end point of the trajectory data P.sub.j and the start point of the trajectory data P.sub.(j+1). In STEPB6, the robot controller 122 transmits a signal SEA notifying “TASK2” that all welding target portions have been passed.
(215) The processing of “TASK2” is as follows. In STEPC1, the robot controller 122 stands by until receiving the signal SDA from the controller 121. In STEPC2, the robot controller 122 calls a function for obtaining a length of a line segment connecting the commanded position and the actual position, and stores a value of the distance Lse.sub.j in the RAM 403. Again in STEPC1, the robot controller 122 stands by until receiving the signal SDA from the controller 121. In STEPC3, the robot controller 122 records, as a file and in the HDD 404, the value of the distance Lse.sub.j for each welding target portion stored in the RAM 403.
(216) Since the signal SDA is transmitted from the controller 121 to the robot controller 122 at a timing at which the timekeeping of the first time T1 is completed, the distance Lse.sub.j can be measured. Although the control period of the robot controller 122 is longer than the control period of the controller 121, if the type of the control system is Type 1, the distance Lse.sub.j converges to a certain value as shown in the response waveform 61 illustrated in
(217) In the welding operation, the acceleration distance La.sub.j and the deceleration distance Ld.sub.j in each welding target portion can be calculated by using the distance Lse.sub.j stored in the HDD 404 and the formulae (2) and (7).
(218) According to the fourth exemplary embodiment, since the distance Lse.sub.j is actually measured for each welding target portion, the control point can be caused to, with a high position precision, pass the positions 52.sub.j and 53.sub.j without being affected by the inertia moment of the robot 101 when the first time T1 has elapsed and when the second time T2.sub.j has elapsed.
(219) To be noted, although the function for obtaining the length of the line segment connecting the commanded position and the actual position is called in STEPC2, difference between the commanded position and the actual position may be calculated by obtaining the commanded position and the actual position as long as the commanded position and the actual position can be obtained at the same timing.
(220) Although the controller 121 performs timekeeping of the second time T2.sub.j in which the laser beam is irradiated in STEPA7, since the sequence shown in
(221) According to the fourth exemplary embodiment, since the distances Lse.sub.j and Lee.sub.j are obtained, that is, measured and recorded for each welding target portion in advance, welding can be performed with a high position precision even in the case where the response characteristic changes in accordance with the orientation of the robot 101.
Fifth Exemplary Embodiment
(222) A method of measuring the distances Lse.sub.j and Lee.sub.j and recording them in an HDD for each welding target portion has been described in the fourth exemplary embodiment. However, error can occur due to the response delay of the robot controller 122. The error is an error in the position of the robot 101 serving as the control point when the first time T1 has elapsed with respect to the taught position at which the irradiation of the laser beam is started, that is, an error of the actual position with respect to the taught position.
(223) Therefore, in the fifth exemplary embodiment, to reduce the error, the acceleration distance La.sub.j and the deceleration distance Ld.sub.j are calculated by the following formulae (13) and (14) instead of the formulae (2) and (7). Since the component of the error is proportional to the movement speed of the robot 101, the component of the error can be expressed by using the target speed Vw.sub.j and a coefficient β that is a constant.
La.sub.j = Lsr.sub.j − (Lse.sub.j + β·Vw.sub.j) (13)
Ld.sub.j = Ler.sub.j − (Lee.sub.j + β·Vw.sub.j) (14)
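Formulae (13) and (14) can be sketched directly: the speed-proportional error term β·Vw.sub.j is folded into the response-delay distances. The β and the other numeric values below are hypothetical placeholders.

```python
# Sketch of formulae (13) and (14). beta and all inputs are hypothetical.

def accel_distance(lsr, lse, beta, vw):
    return lsr - (lse + beta * vw)   # formula (13)

def decel_distance(ler, lee, beta, vw):
    return ler - (lee + beta * vw)   # formula (14)

la = accel_distance(lsr=40.0, lse=5.0, beta=0.002, vw=100.0)
ld = decel_distance(ler=10.0, lee=5.0, beta=0.002, vw=100.0)
print(la, ld)
```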
(224) To be noted, the constant β used in the formulae (13) and (14) is unknown. Therefore, the constant β needs to be identified in advance and recorded in a storage device of the robot controller 122 illustrated in
(225)
(226) Identification of the constant β is performed as follows. First, the positions 52.sub.j and 53.sub.j taught by the user need to be set within a measurement range of the optical position sensors 130A to 130D. For example, the optical position sensors 130A to 130D are first disposed, and the positions 52.sub.j and 53.sub.j are taught with reference to the optical position sensors 130A to 130D. Although the taught points herein are different from actual welding target points, there is no problem because this is a test operation for identifying the constant β. The letter j indicating the order of welding is associated with the optical position sensors. For example, j=1 corresponds to the optical position sensor 130A, and j=2 corresponds to the optical position sensor 130B. When the teaching is completed, the guide light is turned on while the robot 101 is stopped at a taught position, and position data of the positions 52.sub.j and 53.sub.j is measured by using the optical position sensors 130A to 130D. Next, as described in the fourth exemplary embodiment, a sequence of obtaining the distances Lse.sub.j and Lee.sub.j for each welding target portion is executed. Next, β=0 is preliminarily set, and the welding operation is executed by using the acceleration distance La.sub.j and the deceleration distance Ld.sub.j in the formulae (13) and (14).
(227)
(228) In the present exemplary embodiment, a distance Lp.sub.j, which is the difference between the taught position 52.sub.j and the start position of the irradiation of the laser beam in the welding operation, is measured. To be noted, although the component of the positional difference can be divided into a component in the advance direction and a component perpendicular to the advance direction, only the component in the advance direction is measured as the distance Lp.sub.j. This measurement is performed for a plurality of orientations, a plurality of speeds, and a plurality of laser scanning directions, and the constant β is identified by using these measurement results.
(229)
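The identification formula itself is not reproduced in this text, so the sketch below is one plausible approach consistent with the stated proportionality: a least-squares fit of the measured advance-direction offsets Lp.sub.j against the target speeds Vw.sub.j, forcing the model Lp = β·Vw through the origin. The function name and sample measurements are illustrative.

```python
# Hedged sketch (the patent's identification formula is not reproduced
# here): since Lp_j is stated to be proportional to Vw_j, beta can be
# identified by a through-origin least-squares fit of Lp against Vw.

def identify_beta(measurements):
    """measurements: list of (Vw_j, measured offset Lp_j); fit Lp = beta * Vw."""
    num = sum(vw * lp for vw, lp in measurements)
    den = sum(vw * vw for vw, _ in measurements)
    return num / den

beta = identify_beta([(50.0, 0.11), (100.0, 0.19), (150.0, 0.31)])
print(round(beta, 4))  # slope of offset vs. target speed
```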
(230) According to the fifth exemplary embodiment, the error derived from the response delay of the robot controller 122 can be reduced, and thus welding can be performed with a high position precision.
Sixth Exemplary Embodiment
(231) A case where the acceleration distance La.sub.j is calculated by using the distance Lsr.sub.j in which the commanded position moves during the elapse of the first time T1 and the distance Lse.sub.j derived from the response delay has been described in the third exemplary embodiment to the fifth exemplary embodiment. Similarly, a case where the deceleration distance Ld.sub.j is calculated by using the distance Ler.sub.j in which the commanded position moves after the elapse of the second time T2.sub.j until the commanded position reaches the position 55.sub.j and stops and the distance Lee.sub.j derived from the response delay at the moment at which the elapse of the second time T2.sub.j is completed has been described. However, if the robot controller 122 can accurately measure the actual position of the control point when the timekeeping of the first time T1 is completed, the acceleration distance La.sub.j can be obtained by using the taught position 52.sub.j and the actual position of the control point when the timekeeping of the first time T1 is completed.
(232)
(233) A method of calculating the acceleration distance La.sub.j illustrated in
(234) In STEPD2, the CPU 401 calculates the position 54.sub.j by, for example, the formula (4). Then, the CPU 401 performs the welding operation, and obtains the actual position Psm.sub.j of the control point when the first time T1 has elapsed.
(235) In STEPD3, the CPU 401 calculates the difference Errs.sub.j between a position Ps.sub.j corresponding to the taught position 52.sub.j and the actual position Psm.sub.j. The difference Errs.sub.j is an error of the position Psm.sub.j, which is the position of the robot 101 serving as the control point when the first time T1 has elapsed, from the position Ps.sub.j corresponding to the position 52.sub.j, which is the taught position at which the irradiation of the laser beam is started. There are two methods for obtaining the difference Errs.sub.j: a method of using only the component in the welding direction, and a method of also using the component in the direction perpendicular to the welding direction. The method of using only the component in the welding direction will be described. The calculation formula for the difference Errs.sub.j is the formula (15). That is, the difference Errs.sub.j is obtained by calculating the inner product of the unit vector Pdir.sub.j and a difference vector between the actual position Psm.sub.j and the position Ps.sub.j corresponding to the taught position 52.sub.j.
Errs.sub.j = (Psm.sub.j − Ps.sub.j)·Pdir.sub.j (15), where Psm.sub.j, Ps.sub.j, and Pdir.sub.j are vectors and · denotes the inner product
(236) In STEPD4, the CPU 401 determines whether or not all the differences Errs.sub.j are equal to or below a threshold value. In the case where all the differences Errs.sub.j are equal to or below the threshold value, that is, where the result of determination in STEPD4 is YES, the CPU 401 executes STEPD6 and finishes the sequence. The threshold value used in STEPD4 is a value preset in a storage device, for example, the HDD 404, and may be set by the user. In STEPD6, the CPU 401 stores the value of the acceleration distance La.sub.j in each welding target portion as a file in the HDD 404.
(237) In the case where any of the difference Errs.sub.j is greater than the threshold value, that is, where the result of determination in STEPD4 is NO, the CPU 401 executes STEPD5. In STEPD5, the CPU 401 multiplies the difference Errs.sub.j that is greater than the threshold value by a constant Ks, and adds the product to the value of the original acceleration distance La.sub.j to update the value of the acceleration distance La.sub.j. This arithmetic processing is expressed by the following formula (16). After finishing the calculation of the acceleration distance La.sub.j in STEPD5, the CPU 401 performs the processing of STEPD2 again.
La.sub.j ← La.sub.j + Ks · Errs.sub.j (16)
(238) By the processing described above, the CPU 401 of the robot controller 122 sets the acceleration distance La.sub.j such that the difference Errs.sub.j becomes smaller than the threshold value.
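The STEPD2 to STEPD5 loop described above can be sketched as follows. This is a minimal illustration, not the actual implementation in the apparatus: the function and parameter names, the use of the absolute value of Errs.sub.j in the STEPD4 check, and the `run_welding_test` callback that returns the measured positions Psm.sub.j are all assumptions.

```python
import numpy as np

def tune_acceleration_distances(la, ps, pdir, run_welding_test,
                                threshold=0.05, ks=0.5, max_iter=20):
    """Iteratively correct the acceleration distances La_j.

    la   : acceleration distances La_j, one per welding target portion
    ps   : (J, 3) positions Ps_j corresponding to the taught positions 52_j
    pdir : (J, 3) unit vectors Pdir_j along each welding direction
    run_welding_test(la) -> (J, 3) measured positions Psm_j when the
        timekeeping of the first time T1 completes (hypothetical interface)
    """
    la = np.asarray(la, dtype=float).copy()
    for _ in range(max_iter):
        psm = run_welding_test(la)                 # STEPD2: test operation
        # STEPD3: Errs_j = (Psm_j - Ps_j) . Pdir_j  -- formula (15)
        errs = np.einsum('ij,ij->i', psm - ps, pdir)
        if np.all(np.abs(errs) <= threshold):      # STEPD4 (abs is an assumption)
            return la                              # STEPD6: store the result
        over = np.abs(errs) > threshold
        la[over] += ks * errs[over]                # STEPD5: formula (16)
    return la
```

A simple simulated `run_welding_test` whose position error shrinks as La.sub.j approaches an ideal value converges in a few iterations.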
(239) A method for calculating the deceleration distance Ld.sub.j illustrated in
(240) In STEPE2, the CPU 401 calculates the position 55.sub.j by, for example, the formula (8). Then, the CPU 401 performs the welding operation, and obtains an actual speed Vem.sub.j of the control point when the second time T2.sub.j has elapsed.
(241) In STEPE3, the CPU 401 calculates the difference Errve.sub.j between the target speed Vw.sub.j and the actual speed Vem.sub.j. The calculation formula for the difference Errve.sub.j is the following formula (17). Since the speed of the robot 101 serving as the control point is a three-dimensional vector, the target speed also needs to be evaluated as a three-dimensional vector. To vectorize the target speed, the target speed Vw.sub.j, which is a scalar, may be multiplied by the unit vector Pdir.sub.j. By subtracting the vector of the target speed from the vector of the actual speed Vem.sub.j, the vector of the difference Errve.sub.j, which is the error of the actual speed with respect to the target speed, can be obtained.
Errve.sub.j = Vem.sub.j − (Vw.sub.j · Pdir.sub.j) (17)
(242) In STEPE4, the CPU 401 determines whether or not the magnitude, that is, the absolute value of each difference Errve.sub.j is equal to or smaller than a threshold value. In the case where all the magnitudes of the differences Errve.sub.j are equal to or smaller than the threshold value, that is, where the result of determination in STEPE4 is YES, the CPU 401 executes STEPE6 and finishes the sequence. The threshold value used in STEPE4 is a value preset in a storage device, for example, the HDD 404, and may be set by the user. In STEPE6, the CPU 401 stores the value of the deceleration distance Ld.sub.j for each welding target portion as a file in the HDD 404.
(243) In the case where any of the magnitudes of the differences Errve.sub.j is greater than the threshold value, that is, where the result of determination in STEPE4 is NO, the CPU 401 executes STEPE5. In STEPE5, the CPU 401 increases the deceleration distance Ld.sub.j by a constant Ke only for the welding target portions in which the difference Errve.sub.j exceeds the threshold value. This arithmetic processing is expressed by the following formula (18). After finishing the calculation of the deceleration distance Ld.sub.j in STEPE5, the CPU 401 executes the processing of STEPE2 again.
Ld.sub.j←Ld.sub.j+Ke (18)
(244) By the processing described above, the CPU 401 of the robot controller 122 sets the deceleration distance Ld.sub.j such that the magnitude of the difference Errve.sub.j becomes smaller than the threshold value.
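The STEPE2 to STEPE5 loop described above can likewise be sketched in a few lines. As before, the names and the `run_welding_test` callback returning the actual speeds Vem.sub.j are illustrative assumptions, not the actual implementation.

```python
import numpy as np

def tune_deceleration_distances(ld, vw, pdir, run_welding_test,
                                threshold=1.0, ke=0.5, max_iter=50):
    """Iteratively extend the deceleration distances Ld_j.

    ld   : deceleration distances Ld_j, one per welding target portion
    vw   : target speeds Vw_j (scalars)
    pdir : (J, 3) unit vectors Pdir_j along each welding direction
    run_welding_test(ld) -> (J, 3) measured speeds Vem_j when the
        timekeeping of the second time T2_j completes (hypothetical)
    """
    ld = np.asarray(ld, dtype=float).copy()
    for _ in range(max_iter):
        vem = run_welding_test(ld)                   # STEPE2
        # STEPE3: Errve_j = Vem_j - Vw_j * Pdir_j  -- formula (17)
        errve = vem - vw[:, None] * pdir
        mag = np.linalg.norm(errve, axis=1)
        if np.all(mag <= threshold):                 # STEPE4
            return ld                                # STEPE6: store the result
        ld[mag > threshold] += ke                    # STEPE5: formula (18)
    return ld
```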
(245) As described above, the acceleration distance La.sub.j and the deceleration distance Ld.sub.j can be calculated by the sequence illustrated in
(246) The robot 101 includes the encoders En1 to En6 illustrated in
(247) It is preferable that the determination sequence for the deceleration distance Ld.sub.j of
(248) According to the sixth exemplary embodiment, welding can be performed with a high position precision because the control point is directly aligned with the position 52.sub.j represented by the position Ps.sub.j. In addition, since the deceleration distance Ld.sub.j is determined by confirming the actual speed Vem.sub.j, welding can be efficiently performed without wasting time.
(249) In addition, in the case where the calculation of the formulae (5) and (9) of the third exemplary embodiment is difficult, that is, where the trajectory data generated by the robot controller 122 is not a speed command having a trapezoidal shape or changes in accordance with the condition, the acceleration distance La.sub.j and the deceleration distance Ld.sub.j may be set as in the sixth exemplary embodiment. In this case, the actual position Psm.sub.j and the actual speed Vem.sub.j do not have to be used, and the commanded position when the timekeeping of the first time T1 is completed and the commanded speed when the timekeeping of the second time T2.sub.j is completed may be used in calculation instead of these.
Seventh Exemplary Embodiment
(250) In the third exemplary embodiment to the sixth exemplary embodiment, a method of calculating the acceleration distance La.sub.j and a method of obtaining the position 54.sub.j to cause the robot 101 serving as the control point to pass the position 52.sub.j when the timekeeping of the first time T1 is completed have been described. Similarly, a method of calculating the deceleration distance Ld.sub.j and a method of obtaining the position 55.sub.j to cause the robot 101 serving as the control point to pass the position 53.sub.j when the timekeeping of the second time T2 is completed have been described. However, in the case where movement from one welding target portion to another welding target portion is fast, vibration occurs in the robot, and problems sometimes occur, such as the bead becoming curved or the control point being unable to pass the welding target portion at a constant speed. In the seventh exemplary embodiment, a method of suppressing this vibration will be described.
(251)
(252)
(253) A specific method of adjusting the acceleration rate α will be described.
(254) In STEPF1, the CPU 401 of the robot controller 122 illustrated in
(255) In STEPF3, the CPU 401 checks whether all the acceleration rates α in the list have been tested. In the case where the CPU 401 has not tested all the acceleration rates α in the list, that is, in the case where the result of STEPF3 is NO, the CPU 401 returns to the processing of STEPF2. For example, in the case where the list of acceleration rates α includes five values of 100%, 80%, 60%, 40%, and 20%, the processing of STEPF2 is performed five times by performing the test operation while sequentially changing the acceleration rate from 100% to 20%. In the case where the CPU 401 has tested all the acceleration rates α in the list in STEPF3, that is, in the case where the result of STEPF3 is YES, the CPU 401 executes the processing of STEPF4.
(256) In STEPF4, the CPU 401 executes a sequence of evaluating the vibration of the robot, and makes an OK/NG determination for the acceleration rate α. The OK/NG determination is made for each welding target portion. Table 2 shows an example of a determination result. In the example of
(257) TABLE 2

  j \ α   100%   80%   60%   40%   20%
  1       NG     OK    OK    OK    OK
  2       NG     NG    NG    OK    OK
  3       NG     NG    OK    OK    OK
  ...     ...    ...   ...   ...   ...
(258) In STEPF5, the CPU 401 determines the value of the acceleration rate α for each welding target portion. That is, for each welding target portion, the CPU 401 selects a value of acceleration rate α that is determined as OK in the determination of STEPF4. It is preferable that, for each welding target portion, the largest value among values determined as OK in the determination of STEPF4 is selected as the value of the acceleration rate α. For example, in the case of Table 2, the CPU 401 selects 80% for the first welding target portion, 40% for the second welding target portion, and 60% for the third welding target portion.
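The selection in STEPF5 can be illustrated with a short sketch. The data layout, and the fallback to the gentlest rate when no rate is judged OK, are assumptions made for illustration.

```python
def select_acceleration_rates(results, rates):
    """For each welding target portion, pick the largest acceleration
    rate alpha judged OK in the vibration evaluation of STEPF4.

    results : dict mapping portion index j -> {rate: 'OK' or 'NG'}
    rates   : list of tested rates, e.g. [100, 80, 60, 40, 20] (percent)
    """
    chosen = {}
    for j, verdicts in results.items():
        ok = [a for a in rates if verdicts.get(a) == 'OK']
        # Fall back to the gentlest rate when nothing passed (assumption)
        chosen[j] = max(ok) if ok else min(rates)
    return chosen
```

Applied to the data of Table 2, this selects 80% for the first welding target portion, 40% for the second, and 60% for the third.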
(259) In STEPF6, the CPU 401 stores the value of acceleration rate α associated with each welding target portion in the HDD 404. As described above, the CPU 401 adjusts, for each welding target portion and on the basis of the vibration of the robot 101 that has occurred during the test operation, the acceleration rate α used when moving the robot 101 serving as the control point to a position at which the operation of accelerating the laser head 102 is performed in the actual operation.
(260) To be noted, in the test operation in STEPF2 of the sequence illustrated in
(261) As described above, according to the seventh exemplary embodiment, since the acceleration rate α for deceleration at the time of moving the control point from a welding target portion to the next welding target portion is adjusted for each welding operation before the actual operation such that the vibration of the robot 101 is within an allowable range, the efficiency of processing is improved. That is, the control point can be moved from a welding target portion to the next welding target portion quickly while satisfying the position precision and speed precision required for processing.
Eighth Exemplary Embodiment
(262) In the seventh exemplary embodiment, a method for adjusting the acceleration rate α for deceleration at the time of moving the control point from a welding target portion to another welding target portion such that the vibration of the robot 101 is within the allowable range has been described. In the eighth exemplary embodiment, an example of an evaluation method for the vibration of the robot 101 will be described in detail.
(263) As illustrated in
(264) Therefore, the CPU 401 illustrated in
(265) The CPU 401 of the robot controller 122 obtains a plurality of positional coordinates over time while the control point moves in the second time T2.sub.j. Then, the CPU 401 determines whether or not the plurality of positional coordinates are within a predetermined region. The CPU 401 makes “OK” determination in the case where all the positional coordinates are within the predetermined region, and makes “NG” determination in the case where not all the positional coordinates are within the predetermined region. The predetermined region can be arbitrarily set by the user such that the predetermined region includes the line segment LS.sub.j. In the present exemplary embodiment, an allowable region R1, R2, or R3 serving as the predetermined region is set as illustrated in
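One way to implement the positional judgement described above is sketched below: each positional coordinate sampled during the second time T2.sub.j is tested against an allowable region of a given radius around the line segment LS.sub.j. The cylindrical region shape and the parameter names are assumptions standing in for the user-set regions R1 to R3.

```python
import numpy as np

def within_allowable_region(points, seg_start, seg_end, radius):
    """Return True ('OK') when every sampled control-point position lies
    within `radius` of the line segment LS_j from position 52_j
    (seg_start) to position 53_j (seg_end), False ('NG') otherwise."""
    p = np.asarray(points, dtype=float)
    a = np.asarray(seg_start, dtype=float)
    b = np.asarray(seg_end, dtype=float)
    ab = b - a
    # Projection parameter of each point onto the segment, clamped to [0, 1]
    t = np.clip((p - a) @ ab / (ab @ ab), 0.0, 1.0)
    nearest = a + t[:, None] * ab
    dist = np.linalg.norm(p - nearest, axis=1)
    return bool(np.all(dist <= radius))
```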
(266) To be noted, in the case where the direction of the vibration of the robot 101 approximately matches the movement direction of the control point of the robot 101, the vibration appears as vibration of the speed of the control point. In the case where the period of the vibration is long, the speed of the control point during the second time T2.sub.j in which irradiation of the laser beam is performed corresponds to only a part of the vibration of the speed. In this case, the speed in the section corresponding to that part of the vibration can be slower than the target speed Vw.sub.j. In the case where the speed of the control point is slow, the welding bead becomes short even if the positional coordinates of the control point are within the predetermined region described above. Therefore, the CPU 401 calculates the differences between successive positional coordinates of the control point obtained over time during the second time T2.sub.j, and divides the differences by the sampling time to obtain temporal speed data of the control point. Then, the CPU 401 calculates the average value of the temporal speed data to obtain the average speed of the control point. The CPU 401 may make the “OK” determination in the case where the absolute value of the difference between the average speed of the control point and the target speed Vw.sub.j is equal to or smaller than a predetermined value, and make the “NG” determination in the case where the absolute value of the difference is larger than the predetermined value. That is, as the determination of occurrence of vibration of the robot, whether or not the absolute value of the difference between the average speed of the control point during the timekeeping of the second time T2.sub.j and the target speed Vw.sub.j is equal to or smaller than the predetermined value may be determined.
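The average-speed judgement just described can be sketched as follows; the function signature and tolerance handling are assumptions.

```python
import numpy as np

def average_speed_ok(positions, sampling_time, target_speed, tol):
    """OK/NG judgement from sampled positions: differentiate the positions
    obtained over time during the second time T2_j, average the resulting
    speeds, and compare the average with the target speed Vw_j."""
    p = np.asarray(positions, dtype=float)
    # Temporal speed data: ||difference of consecutive positions|| / dt
    speeds = np.linalg.norm(np.diff(p, axis=0), axis=1) / sampling_time
    return bool(abs(speeds.mean() - target_speed) <= tol)
```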
(267) The adjustment of the acceleration rate α is performed in a similar manner to the seventh exemplary embodiment. As described above, the CPU 401 adjusts the acceleration rate α such that the plurality of positional coordinates are within the predetermined region including the line segment LS.sub.j in advance. Since the acceleration rate α is adjusted for each welding target portion before the actual operation, the processing trace of the laser beam is within the predetermined region in the actual operation. That is, the control point can be moved quickly from a welding target portion to another welding target portion while satisfying the position precision and speed precision required for processing.
(268) To be noted, in the robot 101, the reduction gears elastically deform. Error derived from this elastic deformation can be prevented from being superimposed on the output values of the encoders by providing the encoders at the output shafts of the reduction gears. Therefore, the positional coordinate of the control point may be calculated from the output values of encoders provided at the output shafts of the reduction gears.
(269) As a method for the CPU 401 to obtain the timing at which the CPU 301 causes the laser beam to be radiated or stopped, there is a method of transmitting a signal from the controller 121 when the timekeeping of the first time T1 is completed and when the timekeeping of the second time T2.sub.j is completed. In addition to this, for example, there are a method in which the robot controller 122 executes a special command and a method in which the robot controller 122 itself performs the timekeeping of the first time T1 and the second time T2.sub.j. Further, it is possible to determine the occurrence of vibration of the robot by performing image processing on an image captured by a camera. In this case, the robot 101 supports the camera. First, the sequence of the seventh exemplary embodiment is performed. When the occurrence of vibration of the robot is determined through image processing, the test operation in the seventh exemplary embodiment is performed as test machining with actual laser irradiation. After performing the test machining, the robot moves to and captures an image of each welded portion, and the captured image of each welded portion is stored in the HDD 404. The vibration of the robot is suppressed when the acceleration rate α is low. Accordingly, the occurrence of vibration of the robot is determined by comparing, at each welded portion, images of the welding bead corresponding to the acceleration rates α on the list. For example, the occurrence of vibration of the robot is determined by comparing, at each welded portion, an image of the welding bead captured with the minimum acceleration rate α and an image of the welding bead captured with another acceleration rate α. More specifically, the CPU 401 binarizes each captured image to separate the welding bead portion from the rest of the image. Then the CPU 401 determines the XY coordinate data of the center of gravity of the welding bead portion. Here, the XY coordinate system is a coordinate system in which the coordinate origin (0, 0) is located at the lower left corner of the image. Then, the CPU 401 determines the distance between the center of gravity in the image of the welding bead captured with the minimum acceleration rate α and the center of gravity in the image of the welding bead captured with another acceleration rate α. If the distance is equal to or shorter than a predetermined threshold value, the CPU 401 makes the “OK” determination, and if the distance is longer than the predetermined threshold value, the CPU 401 makes the “NG” determination. As described above, the occurrence of vibration of the robot is determined by performing image processing on the image captured by the camera.
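The image-based judgement can be sketched as below, under the assumption that the bead centroid at the minimum acceleration rate α serves as the vibration-free reference and that a small centroid displacement indicates suppressed vibration; the binarization step is taken as already done, with the images passed in as 0/1 arrays.

```python
import numpy as np

def bead_centroid(binary_image):
    """Center of gravity (x, y) of the bead pixels in a binarized image,
    with the origin (0, 0) at the lower-left corner of the image."""
    ys, xs = np.nonzero(binary_image)
    height = binary_image.shape[0]
    # Flip the row index so that y grows upward from the lower-left corner
    return xs.mean(), (height - 1 - ys).mean()

def vibration_ok(img_min_alpha, img_test_alpha, threshold):
    """Compare the bead centroid for the minimum acceleration rate (taken
    as vibration-free) with that for another rate; 'OK' (True) when the
    two centroids nearly coincide."""
    cx0, cy0 = bead_centroid(img_min_alpha)
    cx1, cy1 = bead_centroid(img_test_alpha)
    dist = ((cx1 - cx0) ** 2 + (cy1 - cy0) ** 2) ** 0.5
    return bool(dist <= threshold)
```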
Ninth Exemplary Embodiment
(270) In a ninth exemplary embodiment, an example of a method of evaluating the vibration of the robot 101 different from the eighth exemplary embodiment will be described in detail.
(271) In STEPB5 of
(272) In the present exemplary embodiment, occurrence of vibration of the robot 101 can be determined by, for example, describing a command for waiting for the settlement in the robot program 422A illustrated in
(273) As described above, the CPU 401 adjusts the acceleration rate α on the basis of the time required for the control point of the robot 101 to settle after the time point at which the commanded position of the robot 101 is moved to the position 54.sub.j that is a position at which the operation of accelerating the laser head 102 is started as the test operation.
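This settle-time-based adjustment can be sketched briefly: each acceleration rate α in the list is tested, the settlement after the commanded position reaches the position 54.sub.j is timed, and the largest rate that settles within an allowed time is kept. The `measure_settle_time` callback and the time limit are hypothetical names introduced for illustration.

```python
def pick_alpha_by_settle_time(rates, measure_settle_time, limit):
    """Return the largest acceleration rate whose measured settle time is
    within `limit`; fall back to the gentlest rate when none qualifies."""
    ok = [a for a in rates if measure_settle_time(a) <= limit]
    return max(ok) if ok else min(rates)
```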
(274) According to the ninth exemplary embodiment, occurrence of vibration of the robot 101 can be determined by a simple method without requiring complicated processing, and thus the acceleration rate α can be easily adjusted.
(275) To be noted, in the first to ninth exemplary embodiments described above, an abnormality sometimes occurs while the robot 101 is operating in accordance with the linear interpolation command. The robot controller 122 may periodically transmit a signal indicating the state of the robot 101 to the controller 121. The controller 121 may control the laser oscillator 103 not to oscillate the laser beam in the case of receiving a signal indicating that the robot 101 is in an abnormal state.
(276) In the first to ninth exemplary embodiments described above, the robot controller 122 may transmit a permission signal that permits laser oscillation to the controller 121. For example, the permission signal is turned on while the robot controller 122 is supplying the trajectory data P.sub.j in accordance with the linear interpolation command. The controller 121 may perform an AND calculation on the permission signal and the laser oscillation command and transmit the calculation result to the laser oscillator 103. As a result, the laser beam is not oscillated unless the robot controller 122 turns the permission signal on. Although this AND calculation is performed by the controller 121 here, it may instead be performed by another electronic circuit.
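The two safeguards described in the preceding paragraphs, the abnormal-state signal and the oscillation-permission signal, amount to a simple interlock, sketched here with illustrative signal names; combining the two checks into one function is an assumption.

```python
def laser_output_enable(oscillation_command, permission_signal, robot_abnormal):
    """The laser oscillation command reaches the laser oscillator 103 only
    while the permission signal is on (the AND calculation described above)
    and no abnormal-state signal has been received."""
    return oscillation_command and permission_signal and not robot_abnormal
```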
(277) In the first to ninth exemplary embodiments described above, it is preferable that the robot 101 is installed in an unillustrated light-shielding booth so that no person is exposed to the laser beam. The laser oscillator 103 is configured to stop the laser oscillation when a person opens a door to enter the booth. In the case where the optical fiber cable 151 is not correctly connected to the laser head 102 and the laser oscillator 103, the wiring in the optical fiber cable 151 remains disconnected, and the laser oscillation of the laser oscillator 103 is stopped. In addition, in the case where the optical fiber cable 151 is bent beyond a certain degree, the wiring therein is broken and the laser oscillation of the laser oscillator 103 is stopped. It is also possible to configure the laser oscillator 103 to stop the laser oscillation when a person is detected by a safety laser scanner, a safety light curtain, or the like. It is also possible to stop the laser oscillation by monitoring the booth with external hardware in case the robot controller 122 or the controller 121 stops responding for some reason. For example, a signal that is turned on and off at regular intervals may be output from the robot controller 122 and the controller 121, and the laser oscillation may be stopped when the output signal does not change for a certain period.
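The heartbeat monitoring just described can be sketched as a watchdog that samples the toggled signal and reports whether oscillation may continue; the class shape and the timing interface are assumptions.

```python
class HeartbeatWatchdog:
    """Stop laser oscillation when the signal toggled at regular intervals
    by the controllers stays unchanged for longer than `timeout`."""

    def __init__(self, timeout):
        self.timeout = timeout
        self.last_level = None
        self.last_change = None

    def sample(self, level, now):
        """Feed one sample of the signal; return True while the signal is
        still changing within the timeout, False when the laser should stop."""
        if self.last_level is None or level != self.last_level:
            self.last_level = level   # signal toggled: record the change time
            self.last_change = now
        return (now - self.last_change) <= self.timeout
```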
(278) Although a case where the control apparatus 120 is constituted by the controller 121 and the robot controller 122 has been described in the first to ninth exemplary embodiments described above, the configuration of the control apparatus 120 is not limited to this. The control apparatus may be realized by one computer as long as the control apparatus has both functions of the controller 121 and the robot controller 122. For example, the control apparatus can be realized by one computer if parallel processing can be performed by a plurality of processors or a plurality of cores included in a processor.
(279) Although a case where the robot 101 is a vertically articulated robot has been described in the first to ninth exemplary embodiments described above, the configuration of the robot 101 is not limited to this. For example, the robot 101 may be a horizontally articulated robot, a parallel link robot, or a cartesian coordinate robot.
(280) Although a case where a laser processing apparatus performs laser welding has been described in the first to ninth exemplary embodiments described above, the configuration is not limited to this. For example, the laser processing apparatus may perform laser grooving processing or laser cutting processing.
(281) Although a case where a laser welding apparatus includes one laser head 102 has been described in the first to ninth exemplary embodiments described above, the configuration is not limited to this. That is, the laser welding apparatus may include a plurality of laser heads 102. In this case, the laser welding apparatus may include a plurality of robots 101 such that the plurality of laser heads 102 can be individually moved.
Tenth Exemplary Embodiment
(282)
(283) The switcher 104 and the laser heads 102.sub.1 to 102.sub.N are interconnected via optical fiber cables 151.sub.1 to 151.sub.N serving as optical paths for laser beams. The switcher 104 and the laser oscillator 103 are interconnected via an optical fiber cable 152. The laser oscillator 103 and the controller 121B are interconnected via the cable 153 such that a digital signal can be communicated therebetween. The switcher 104 and the controller 121B are interconnected via a cable 154 such that a digital signal can be communicated therebetween.
(284) Robot arms 111.sub.1 to 111.sub.N and the robot controllers 122.sub.1 to 122.sub.N are interconnected via cables 155.sub.1 to 155.sub.N each including a power line and a signal line. The controller 121B and the robot controllers 122.sub.1 to 122.sub.N are interconnected via cables 156.sub.1 to 156.sub.N such that digital signals can be communicated therebetween.
(285) The laser oscillator 103 is a continuous wave laser or a pulsed laser, and generates a laser beam by laser oscillation. The laser beam generated in the laser oscillator 103 is transmitted to the switcher 104 via the optical fiber cable 152. The switcher 104 switches the optical path such that the laser beam generated in the laser oscillator 103 is guided to one of the plurality of laser heads 102.sub.1 to 102.sub.N through the corresponding one of the optical fiber cables 151.sub.1 to 151.sub.N. Specifically, the switcher 104 includes a plurality of mirrors 114.sub.1 to 114.sub.N, and switches the optical path by operating the mirrors 114.sub.1 to 114.sub.N so as to guide the laser beam into a different one of the laser heads 102.sub.1 to 102.sub.N at different times. Therefore, a plurality of laser oscillators 103 do not need to be provided, and thus the cost can be reduced.
(286) The laser heads 102.sub.1 to 102.sub.N respectively emit laser beams L.sub.1 to L.sub.N guided by the switcher 104. The laser beams L.sub.1 to L.sub.N emitted from the laser heads 102.sub.1 to 102.sub.N are respectively focused at positions at predetermined distances from the laser heads 102.sub.1 to 102.sub.N. The controller 121B controls start and stop of generation of the laser beam in the laser oscillator 103 and the switching operation of the switcher 104. That is, the controller 121B commands the laser oscillator 103 to start or stop generation of the laser beam via the cable 153.
(287) In the present exemplary embodiment, the robots 101.sub.1 to 101.sub.N have the same configuration. The robot 101.sub.i of the robot apparatus 110 is, for example, a vertically articulated robot. The letter i used herein represents an integer from 1 to N, and serves as a serial number assigned to each of the robots. The robot 101.sub.i includes a robot arm 111.sub.i and a robot hand 112.sub.i. The robot hand 112.sub.i serves as an example of an end effector attached to the robot arm 111.sub.i. The robot 101.sub.i supports the laser head 102.sub.i. In the present exemplary embodiment, the robot 101.sub.i supports the laser head 102.sub.i by holding the laser head 102.sub.i by the robot hand 112.sub.i. To be noted, for example, the laser head 102.sub.i may be supported by the robot 101.sub.i by attaching the laser head 102.sub.i to the distal end of the robot arm 111.sub.i or to the robot hand 112.sub.i.
(288) Since the laser head 102.sub.i is supported by the robot 101.sub.i, the laser head 102.sub.i can be moved to a desired position and orientation by moving the robot 101.sub.i. By moving the robot 101.sub.i to move the laser head 102.sub.i to the desired position and orientation, the focal point of the laser beam L.sub.i can be moved to a desired position in the space. By focusing the laser beam L.sub.i on a position at which a welding bead is to be formed on the processing target object W, the processing target object W can be subjected to welding by the laser beam L.sub.i. As described above, the robot apparatus 110 including the plurality of robots 101.sub.1 to 101.sub.N can individually move the plurality of laser heads 102.sub.1 to 102.sub.N. To be noted, a processed product can be obtained by processing the processing target object W.
(289) In the present exemplary embodiment, the control apparatus 120B controls the laser oscillator 103, the robot apparatus 110, and the switcher 104 to perform laser seam welding. In the laser seam welding, there are a mode in which a continuous wave is used as the laser beam L.sub.i and a mode in which a pulse wave is used as the laser beam L.sub.i, and either of these modes may be selected. In the laser seam welding, the surface of the processing target object W needs to be scanned by the laser beam L.sub.i. In the present exemplary embodiment, the laser beam L.sub.i is emitted while moving the laser head 102.sub.i supported by the robot 101.sub.i to scan the surface of the processing target object W by the laser beam L.sub.i without using a galvano mirror, and thus laser welding is performed. Since the galvano mirror is omitted, the cost can be reduced.
(290) Although welding may be performed on one welding target portion on the processing target object W by one robot 101.sub.i, that is, one laser head 102.sub.i, description will be given on the premise that welding is performed on a plurality of welding target portions by one laser head 102.sub.i. The serial number of i=1 to N is given to the plurality of robots 101.sub.1 to 101.sub.N, that is, the plurality of laser heads 102.sub.1 to 102.sub.N for the sake of convenience of description. A case where laser welding is sequentially performed on one welding target portion at a time while switching the laser heads 102.sub.1 to 102.sub.N in the order from 1 to N will be described below. For example, in the case where the robot 101.sub.1 is referred to as the first robot and the laser head 102.sub.1 is referred to as the first laser head, the robot 101.sub.2 is the second robot to be operated next, and the laser head 102.sub.2 is the second laser head into which the laser beam is guided next.
(291)
(292) The controller 121B includes I/Fs 311.sub.1 to 311.sub.N, 312, and 313. The CPU 301, the ROM 302, the RAM 303, the HDD 304, the disk drive 305, and the I/Fs 311.sub.1 to 311.sub.N, 312, and 313 are communicably interconnected via a bus 310. The I/F 312 is connected to the laser oscillator 103 via the cable 153. The I/F 313 is connected to the switcher 104 via the cable 154.
(293) The CPU 301 is connected to a clock generation circuit 314. The CPU 301 operates in synchronization with a clock signal generated in the clock generation circuit 314. That is, the operating frequency of the CPU 301 is determined depending on the clock signal of the clock generation circuit 314.
(294) The HDD 304 stores, in other words, records a control program 321 configured for managing the overall sequence of the apparatus such that the CPU 301 performs timekeeping processing and signal communication processing. The CPU 301 performs various processing such as the timekeeping processing and the signal communication processing in accordance with the control program 321. The HDD 304 stores, in other words, records various data 322 including data of time such as the first time T1 and a second time T2.sub.j,i, and a third time T3. To be noted, the data 322 may be incorporated in the control program 321.
(295) Here, the first time T1 is the time required for accelerating the laser head 102.sub.i toward a welding target portion such that the laser head 102.sub.i is at a constant speed at the welding target portion. The second time T2.sub.j,i is the time required for the laser head 102.sub.i to irradiate the welding target portion with the laser beam. As described above, the letter i represents a positive integer of 1 to N that is a serial number given to the laser heads and so forth, and corresponds to the order of operation. The letter j represents a positive integer that is a serial number given to the welding target portions and corresponds to the order of welding. That is, the second time T2.sub.j,i is the time required for irradiating, with the laser beam, the j-th welding target portion welded by the laser head 102.sub.i supported by the robot 101.sub.i. The third time T3 is the time required for the switching operation in the switcher 104. Hereinafter, the subscript (j, i) added to the reference signs indicates the j-th welding target portion welded by the i-th robot 101.sub.i, that is, the i-th laser head 102.sub.i.
(296) The CPU 301 functions as a software timer by executing the control program 321. Specifically, the CPU 301 functions as a timer that measures the first time T1 and a timer that measures the second time T2.sub.j,i. In addition, the CPU 301 functions as a timer that measures a total time T2.sub.j,i+T3 of the second time T2.sub.j,i and the third time T3. The CPU 301 controls the laser oscillator 103 by executing the control program 321 and transmitting a laser oscillation command SR1 serving as a signal to the laser oscillator 103. The CPU 301 controls the switcher 104 by executing the control program 321 to transmit a switching signal SS to the switcher 104. Further, the CPU 301 executes the control program 321 to transmit an operation start command SA.sub.i serving as a predetermined command that commands start of operation of the robot 101.sub.i to the robot controller 122.sub.i.
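The timers and commands described above can be illustrated with a timing sketch for one welding target portion. The object interfaces and the exact ordering of the switching operation relative to the robot motion are assumptions made for illustration, not the actual control sequence of the apparatus.

```python
import time

def weld_one_portion(robot, laser, switcher, head_index, t1, t2, t3):
    """Sequence the switching signal SS, the operation start command SA_i,
    and the laser oscillation command SR1 using software timers for the
    first time T1, the second time T2_{j,i}, and the third time T3."""
    switcher.select(head_index)      # switching signal SS to the switcher 104
    time.sleep(t3)                   # third time T3: mirror switching
    robot.start_motion(head_index)   # operation start command SA_i
    time.sleep(t1)                   # first time T1: accelerate to target speed
    laser.on()                       # turn laser oscillation command SR1 on
    time.sleep(t2)                   # second time T2_{j,i}: irradiation
    laser.off()                      # turn laser oscillation command SR1 off
```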
(297) To be noted, the recording medium in which the control program 321 is recorded may be any recording medium as long as the recording medium can be read by a computer. For example, as the recording medium for supplying the control program 321, the ROM 302 illustrated in
(298) The robot controller 122.sub.i is a dedicated computer for controlling the robot 101.sub.i. To be noted, in
(299) The robot controller 122.sub.i includes an FPGA 416.sub.i and a current amplifier 417.sub.i. The FPGA 416.sub.i serves as an example of a servo calculation portion. The robot controller 122.sub.i includes an I/F 411.sub.i. The CPU 401.sub.i, the ROM 402.sub.i, the RAM 403.sub.i, the HDD 404.sub.i, the FPGA 416.sub.i, and the I/F 411.sub.i are communicably interconnected via a bus 410.sub.i. The I/F 311.sub.i of the controller 121B and the I/F 411.sub.i of the robot controller 122.sub.i are interconnected via a cable 156.sub.i.
(300) The CPU 401.sub.i is connected to a clock generation circuit 414.sub.i, and the FPGA 416.sub.i is connected to a clock generation circuit 415.sub.i. The CPU 401.sub.i operates in synchronization with a clock signal generated in the clock generation circuit 414.sub.i, and the FPGA 416.sub.i operates in synchronization with a clock signal generated in the clock generation circuit 415.sub.i. That is, the operating frequency of the CPU 401.sub.i is determined depending on the clock signal of the clock generation circuit 414.sub.i, and the operating frequency of the FPGA 416.sub.i is determined depending on the clock signal of the clock generation circuit 415.sub.i.
(301) The HDD 404.sub.i stores, in other words, records a program 421.sub.i and a robot program 422.sub.i.
(302) The robot arm 111.sub.i includes a plurality of motors that drive joints thereof and a plurality of encoders serving as examples of a position sensor that detects the rotation angles or rotation positions of the motors. For example, six motors M1.sub.i to M6.sub.i and six encoders En1.sub.i to En6.sub.i are provided. The robot arm 111.sub.i includes a detection circuit 115.sub.i connected to the encoders En1.sub.i to En6.sub.i and constituted by an electronic circuit.
(303) In the configuration described above, the controller 121B, specifically the CPU 301 transmits the laser oscillation command SR1 serving as a signal from the I/F 312 to the laser oscillator 103. The laser oscillator 103 that has received the laser oscillation command SR1 operates to generate a laser beam in accordance with the laser oscillation command SR1. Specifically, as the laser oscillation command SR1, the controller 121B switches the voltage of an electric signal from a low level to a high level when commanding the laser oscillator 103 to generate a laser beam, and transmits the command from the I/F 312. The switching of the voltage of the electric signal to the high level is also referred to as turning the laser oscillation command SR1 on. The controller 121B switches the voltage of the electric signal to the low level when commanding the laser oscillator 103 to stop generation of the laser beam. The switching of the voltage of the electric signal to the low level is also referred to as turning the laser oscillation command SR1 off. Therefore, the laser oscillator 103 generates the laser beam when the laser oscillation command SR1 is on, and stops generation of the laser beam when the laser oscillation command SR1 is off.
(304) The laser oscillator 103 transmits a signal SR2 indicating that a laser beam is being generated to the controller 121B. Specifically, the laser oscillator 103 transmits, as the signal SR2, an electric signal whose voltage is the high level to the controller 121B when the laser beam is being generated. Setting the voltage of the electric signal to the high level is also referred to as turning the signal SR2 on. The laser oscillator 103 sets the voltage of the electric signal to the low level when the laser beam is stopped. Setting the voltage of the electric signal to the low level is also referred to as turning the signal SR2 off.
(305) Further, the controller 121B, specifically the CPU 301 transmits a switching signal SS from the I/F 313 to the switcher 104. The switching signal SS is a digital signal composed of a plurality of bits, in other words, a bit stream, serving as a unique sign assigned to each laser head 102.sub.i. The switcher 104 that has received the switching signal SS switches the optical path in accordance with the bit stream of the switching signal SS.
(306) Furthermore, the controller 121B, specifically the CPU 301 transmits the operation start command SA.sub.i that commands the start of operation of the robot 101.sub.i from the I/F 311.sub.i to the robot controller 122.sub.i. Specifically, when transmitting the operation start command SA.sub.i to start the operation of the robot 101.sub.i, the controller 121B transmits an electric signal having a voltage of a high level to the robot controller 122.sub.i. Setting the voltage of the electric signal to the high level is also referred to as turning the operation start command SA.sub.i on. After setting the voltage of the electric signal indicating the operation start command SA.sub.i to the high level, the controller 121B sets the voltage to a low level at a predetermined timing. Setting the voltage of the electric signal to the low level is also referred to as turning the operation start command SA.sub.i off.
(307) The robot controller 122.sub.i receives the operation start command SA.sub.i, and controls the operation of the robot 101.sub.i in accordance with the robot program 422.sub.i. That is, the robot controller 122.sub.i monitors the operation start command SA.sub.i, and causes the robot 101.sub.i to start the operation of accelerating the laser head 102.sub.i to perform laser seam welding by the laser head 102.sub.i when the operation start command SA.sub.i is switched from off to on.
(308) The robot controller 122.sub.i transmits, to the controller 121B, a signal SB.sub.i that is a digital signal indicating that the robot 101.sub.i is operating on the basis of trajectory data of a predetermined section including a welding target portion. Specifically, the robot controller 122.sub.i switches the voltage of the electric signal indicating the signal SB.sub.i from the low level to the high level and transmits the signal SB.sub.i to the controller 121B when causing the robot 101.sub.i to start the operation of accelerating the laser head 102.sub.i. The robot controller 122.sub.i switches the voltage of the electric signal indicating the signal SB.sub.i from the high level to the low level at a predetermined timing. Switching the voltage of the electric signal indicating the signal SB.sub.i to the high level will be also referred to as turning the signal SB.sub.i on hereinbelow. In addition, switching the voltage of the electric signal indicating the signal SB.sub.i to the low level will be also referred to as turning the signal SB.sub.i off.
(309) Orientation control of the robot arm 111.sub.i, that is, position/orientation control of the laser head 102.sub.i, specifically position control of the focal point of the laser beam L.sub.i, is performed with a motor current SC1.sub.i supplied from the robot controller 122.sub.i to the motors M1.sub.i to M6.sub.i of the robot arm 111.sub.i. The robot program 422.sub.i is a program described in a robot language. A user can instruct operation of the robot 101.sub.i by describing the robot language in text data. The CPU 401.sub.i of the robot controller 122.sub.i executes the program 421.sub.i to interpret the robot program 422.sub.i, generate trajectory data constituted by a plurality of commands, and output the generated trajectory data to the FPGA 416.sub.i. The FPGA 416.sub.i performs servo calculation in accordance with the trajectory data. That is, the FPGA 416.sub.i generates a motor current command by the servo calculation, and transmits the generated motor current command to the current amplifier 417.sub.i. The current amplifier 417.sub.i generates the motor current SC1.sub.i corresponding to the motor current command, and supplies the motor current SC1.sub.i to the motors M1.sub.i to M6.sub.i at respective joints of the robot arm 111.sub.i. The motors M1.sub.i to M6.sub.i of the robot arm 111.sub.i are driven by the supplied motor current SC1.sub.i. The detection circuit 115.sub.i obtains detection signals from the encoders En1.sub.i to En6.sub.i when the motors M1.sub.i to M6.sub.i rotate. The detection circuit 115.sub.i converts the detection signals into a serial digital signal SC2.sub.i, and transmits the digital signal SC2.sub.i to the FPGA 416.sub.i of the robot controller 122.sub.i.
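The chain of commanded position, servo calculation, and current command described above can be sketched, as an illustration only, with a pure proportional control law. The gain `kp` and the crude plant model are assumptions; the actual servo calculation performed by the FPGA 416.sub.i is not disclosed in this detail.

```python
def servo_current_command(commanded_pos, measured_pos, kp=2.0):
    """One cycle of a highly simplified servo calculation: map the
    position error (command minus encoder feedback) to a motor
    current command. Proportional-only control, assumed gain kp;
    illustrative sketch, not the disclosed FPGA implementation."""
    error = commanded_pos - measured_pos
    return kp * error

# Simulate a few control periods with a crude plant model in which
# the position change per period is proportional to the current.
pos, cmd = 0.0, 1.0
for _ in range(50):
    current = servo_current_command(cmd, pos)
    pos += 0.1 * current  # assumed plant response, for illustration
# pos converges toward the commanded position cmd = 1.0
```

In the apparatus, the commanded position is updated in each control period from the trajectory data, and the measured position comes from the encoder feedback via the detection circuit.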
(310) The digital signal SC2.sub.i indicating the rotation angles, or rotation positions, of the motors M1.sub.i to M6.sub.i is used for the servo calculation in the FPGA 416.sub.i. The program 421.sub.i regularly performs reading from I/Fs, arithmetic processing, and output to I/Fs. The period of this regular processing will be referred to as a control period of the robot controller 122.sub.i. The detection signals of the encoders En1.sub.i to En6.sub.i are pulse signals of ABZ phase. The detection circuit 115.sub.i converts the pulse signals of the encoders En1.sub.i to En6.sub.i into the digital signal SC2.sub.i indicating a pulse number, which can be converted into a positional coordinate, and feeds the digital signal SC2.sub.i back to the FPGA 416.sub.i. To be noted, the servo mechanism, that is, the FPGA 416.sub.i and the current amplifier 417.sub.i, may be disposed in the robot arm 111.sub.i, and a position command, that is, the trajectory data, may be transmitted to the servo mechanism in the robot arm 111.sub.i from the CPU 401.sub.i via a cable. The FPGA 416.sub.i may be omitted by imparting the function of the FPGA 416.sub.i to the CPU 401.sub.i. Although the case where the pulse signals of the encoders En1.sub.i to En6.sub.i are converted into a digital signal and transmitted to the robot controller 122.sub.i has been described, the pulse signals of the encoders En1.sub.i to En6.sub.i may be directly transmitted to the robot controller 122.sub.i. Resolvers may be used as position sensors instead of the encoders En1.sub.i to En6.sub.i.
(311) Here, a control point of movement of the robot 101.sub.i may be a point that moves together with the tip of the hand of the robot 101.sub.i, and, in the present exemplary embodiment, a focal point of the laser beam is set as the control point of movement of the robot 101.sub.i. The control point is expressed by six parameters composed of three parameters X, Y, and Z indicating a position in the three-dimensional space and three parameters A, B, and C indicating an orientation in the three-dimensional space based on the base of the robot 101.sub.i. Therefore, the control point can be regarded as one point in a six-dimensional task space. In the robot program 422.sub.i, a taught point that is a movement target of the control point is described, in other words, designated by a user. The robot controller 122.sub.i interprets the robot program 422.sub.i and generates the trajectory data connecting taught points, that is, the trajectory data in which the taught points are interpolated. Examples of an interpolation method of interpolating the taught points include linear interpolation, circular interpolation, and joint interpolation, and these interpolation methods are described, in other words, designated in the robot program 422.sub.i as an interpolation command by the user.
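As an illustrative sketch only, the six-parameter control point described above can be represented as a simple record. The class and field layout are assumptions for illustration; the parameter names X, Y, Z, A, B, and C follow the text.

```python
from dataclasses import dataclass

@dataclass
class ControlPoint:
    """A point in the six-dimensional task space described above:
    X, Y, Z locate the control point (the focal point of the laser
    beam) relative to the base of the robot, and A, B, C give its
    orientation. Illustrative sketch only."""
    X: float
    Y: float
    Z: float
    A: float
    B: float
    C: float

# A hypothetical taught point designated by a user (values assumed):
taught_point = ControlPoint(X=100.0, Y=50.0, Z=200.0, A=0.0, B=90.0, C=0.0)
```

Trajectory generation would then interpolate between such taught points according to the interpolation command designated in the robot program.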
(312) The CPU 401.sub.i of the robot controller 122.sub.i converts the trajectory data obtained by the interpolation into a command of angles of respective joints of the robot 101.sub.i, and the FPGA 416.sub.i performs servo calculation. As a result of the servo calculation, the FPGA 416.sub.i determines a current command to be transmitted to the current amplifier 417.sub.i. The servo calculation is performed in each cycle of the control period of the CPU 401.sub.i of the robot controller 122.sub.i. The command of angles of respective joints is updated in each cycle of the control period, and the speed of the robot 101.sub.i is determined by controlling the amount of increase or decrease thereof. That is, the robot 101.sub.i moves quickly in the case where the amount of increase or decrease thereof is large, and moves slowly in the case where the amount of increase or decrease thereof is small.
(313) A path in which the control point, which is the focal point of the laser beam, actually moves by the operation of the robot 101.sub.i can deviate from a path commanded by the robot program 422.sub.i due to response delay of position control.
(314)
(315) A case of executing a linear interpolation command serving as an example of the interpolation command described in the robot program 422.sub.i will be described below. The linear interpolation command is a command of performing interpolation such that the control point moves along a straight line connecting a first positional coordinate and a second positional coordinate, and the path of the control point becomes a line segment in the three-dimensional space. To be noted, there are two possible options: one is also interpolating the orientation of the robot 101.sub.i by using the first positional coordinate and the second positional coordinate, and the other is maintaining the orientation of the first positional coordinate up to the second positional coordinate. In the present exemplary embodiment, the orientation is also interpolated. In both of these options, the control point of the robot 101.sub.i, that is, the focal point of the laser beam moves on a line segment connecting the first positional coordinate and the second positional coordinate. To be noted, in most cases, the robot program 422.sub.i uses a current commanded position of the robot 101.sub.i as the first positional coordinate, and only the second positional coordinate, which is the movement destination, is designated. For the positional coordinate, a taught point, in other words, a taught position set by a user may be used, or a positional coordinate indicating a position different from the taught point, obtained by additional calculation from the taught point, may be used. When the CPU 401.sub.i of the robot controller 122.sub.i executes the linear interpolation command, the CPU 401.sub.i generates trajectory data connecting the current commanded position and a target position that is the movement destination, and supplies the trajectory data to the FPGA 416.sub.i in each cycle of the control period.
The linear interpolation command is completed when supply of all trajectory data by the CPU 401.sub.i of the robot controller 122.sub.i is completed, and the CPU 401.sub.i executes the next command described in the robot program 422.sub.i.
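As an illustration only, and not the disclosed trajectory generator, the linear interpolation described above can be sketched as follows. The function name and the uniform-rate profile are assumptions; the actual generator would also shape acceleration along the segment.

```python
def linear_interpolation(p_start, p_end, n_periods):
    """Sketch of a linear interpolation command: produce one commanded
    position per control period along the straight line from p_start
    to p_end. Orientation parameters could be interpolated the same
    way. Uniform rate is assumed for illustration."""
    trajectory = []
    for k in range(1, n_periods + 1):
        t = k / n_periods  # fraction of the segment covered so far
        point = tuple(s + t * (e - s) for s, e in zip(p_start, p_end))
        trajectory.append(point)
    return trajectory

# Five control periods from (0, 0, 0) to (10, 0, 0):
traj = linear_interpolation((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), 5)
# the final commanded position is the target position (10.0, 0.0, 0.0)
```

The command completes when the last point of the trajectory has been supplied, after which the next command in the robot program would be executed.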
(316) In
(317) The robot controller 122.sub.i derives, in accordance with a predetermined algorithm, positions 54.sub.j,i and 55.sub.j,i positioned on an extension line passing through the taught positions 52.sub.j,i and 53.sub.j,i. This algorithm is described in the robot program 422.sub.i. The position 55.sub.j,i is provided to the robot program 422.sub.i as a parameter to execute the linear interpolation command. To be noted, in order to cause the robot program 422.sub.i to execute the linear interpolation command to the position 55.sub.j,i, it is required that a movement command to the position 54.sub.j,i has been executed and the commanded position of the robot 101.sub.i has reached the position 54.sub.j,i.
(318) The position 54.sub.j,i is a commanded position at which movement of the control point is started. The position 55.sub.j,i is a commanded position at which movement of the control point is finished. Further, the robot controller 122.sub.i generates trajectory data P.sub.j,i of a predetermined section by linear interpolation. The predetermined section includes the section between the taught positions 52.sub.j,i and 53.sub.j,i, and includes the position 54.sub.j,i as a start point and the position 55.sub.j,i as an end point.
(319) As described above, the positions 52.sub.j,i, and 53.sub.j,i that are included in the trajectory data P.sub.j,i to be commanded to the robot 101.sub.i are taught points designated by the user. In contrast, the positions 54.sub.j,i and 55.sub.j,i that are included in the trajectory data P.sub.j,i commanded to the robot 101.sub.i are commands obtained by the robot controller 122.sub.i by automatic calculation in accordance with the robot program 422.sub.i, and are not taught points.
(320) Also for the section between the positions 55.sub.1,i and 54.sub.2,i, the robot controller 122.sub.i performs interpolation in accordance with the interpolation command described in the robot program 422.sub.i to generate trajectory data P.sub.1-2,i. To be noted, since the laser head 102.sub.i is merely moved in the section between the positions 55.sub.1,i and 54.sub.2,i, the interpolation can be performed by an arbitrary interpolation method. Therefore, an arbitrary interpolation command can be described in the robot program 422.sub.i. For example, in the case where a linear interpolation command is described in the robot program 422.sub.i, linear interpolation is performed, and in the case where a joint interpolation command is described, joint interpolation is performed. The joint interpolation command is a command for dividing the operation amount of each joint of the robot 101.sub.i by time and performing interpolation, and the path of the control point is not linear in this case. However, in this case, the operation of the robot 101.sub.i is faster than in the case where the robot 101.sub.i is operated in accordance with the linear interpolation command.
(321) In the case of performing laser seam welding, an acceleration section for the movement speed of the control point to reach a target speed Vw.sub.j,i is required. In the present exemplary embodiment, the section between the positions 54.sub.j,i and 52.sub.j,i is an acceleration section. The target speed Vw.sub.j,i of each welding target portion is described, in other words, designated in the robot program 422.sub.i.
(322) To be noted, although the control point needs to be moved with high precision in the section for welding, that is, the section between the positions 52.sub.j,i and 53.sub.j,i, the position precision may be low in sections in which laser seam welding is not performed, for example, the acceleration section. Therefore, as illustrated in
(323) Here, when the robot controller 122.sub.i commands the position 54.sub.j,i to the robot 101.sub.i, the robot 101.sub.i may be either stationary or moving; both states are acceptable.
(324) The robot controller 122.sub.i that has received the operation start command SA.sub.i starts commanding the trajectory data P.sub.j,i for performing welding in accordance with the operation start command SA.sub.i. That is, the robot controller 122.sub.i starts commanding the trajectory data P.sub.j,i in response to the operation start command SA.sub.i switching from off to on. In the example of
(325) To be noted, the robot program 422.sub.i is described such that, when the robot controller 122.sub.i receives a signal indicating that the operation start command SA.sub.i has been switched from off to on, the robot controller 122.sub.i starts commanding the trajectory data P.sub.j,i.
(326)
(327) The robot controller 122.sub.i controls operation of the robot 101.sub.i by sequentially using the trajectory data P.sub.1,i, the trajectory data P.sub.1-2,i, and the trajectory data P.sub.2,i. However, due to the response delay of the position control, the control point moves behind the commanded time as illustrated in
(328) When the robot controller 122.sub.i executes the linear interpolation command and starts supplying the trajectory data P.sub.1,i for welding of the first portion, the command for the angle of the robot 101.sub.i starts to change from the position 54.sub.1,i that is a start point to the position 55.sub.1,i, that is an end point of the trajectory data P.sub.1,i. At the time point TP1.sub.1,i when this change is started, the operation of the robot 101.sub.i to accelerate the control point, that is, the laser head 102.sub.i is started.
(329) When the robot controller 122.sub.i commands the position 52.sub.1,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 52.sub.1,i at the time point TP2.sub.1,i delayed with respect to the commanded time. When the robot controller 122.sub.i commands the position 53.sub.1,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 53.sub.1,i at the time point TP3.sub.1,i delayed with respect to the commanded time. When the robot controller 122.sub.i commands the position 55.sub.1,i that is the end point of the trajectory data P.sub.1,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 55.sub.1,i at the time point TP5.sub.1,i delayed with respect to the commanded time. The irradiation of laser beam needs to be started when the control point actually passes the position 52.sub.1,i and needs to be stopped when the control point actually passes the position 53.sub.1,i.
(330) Next, the robot controller 122.sub.i operates the robot 101.sub.i in accordance with the trajectory data P.sub.1-2,i from the position 55.sub.1,i to the position 54.sub.2,i for preparation for next welding operation.
(331) When the robot controller 122.sub.i executes the linear interpolation command and starts supplying the trajectory data P.sub.2,i for welding of the second portion, the command for the angle of the robot 101.sub.i starts to change from the position 54.sub.2,i that is a start point of the trajectory data P.sub.2,i to the position 55.sub.2,i that is an end point of the trajectory data P.sub.2,i. At the time point TP1.sub.2,i when this change is started, the operation of the robot 101.sub.i to accelerate the control point, that is, the laser head 102.sub.i is started.
(332) When the robot controller 122.sub.i commands the position 52.sub.2,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 52.sub.2,i at the time point TP2.sub.2,i delayed with respect to the commanded time. When the robot controller 122.sub.i commands the position 53.sub.2,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 53.sub.2,i at the time point TP3.sub.2,i delayed with respect to the commanded time. When the robot controller 122.sub.i commands the position 55.sub.2,i that is the end point of the trajectory data P.sub.2,i to the robot 101.sub.i, the control point passes a position corresponding to the commanded position 55.sub.2,i at the time point TP5.sub.2,i delayed with respect to the commanded time. The irradiation of laser beam needs to be started when the control point actually passes the position 52.sub.2,i and needs to be stopped when the control point actually passes the position 53.sub.2,i.
(333) In order to perform processing by irradiating the processing target object W with the laser beam L.sub.i in a state in which the movement speed of the focal point of the laser beam L.sub.i is constant at the target speed Vw.sub.j,i, the laser head 102.sub.i needs to be accelerated before the laser head 102.sub.i reaches the position to start irradiation of the laser beam L.sub.i. The first time T1 is time for accelerating the laser head 102.sub.i such that the laser head 102.sub.i moves at the constant target speed Vw.sub.j,i. The second time T2.sub.j,i is time for irradiating the processing target object W with the laser beam L.sub.i in a state in which the laser head 102.sub.i is moving at the constant target speed Vw.sub.j,i.
(334) In the present exemplary embodiment, a period between the time points TP1.sub.j,i and TP2.sub.j,i is set as the first time T1 for accelerating the laser head 102.sub.i with respect to the processing target object W. In addition, by performing an experiment or calculation in advance, a period between the time points TP2.sub.j,i and TP3.sub.j,i is set as the second time T2.sub.j,i for irradiation of the laser beam L.sub.i.
(335) Settings of the first time T1, the second time T2.sub.j,i, and the third time T3 will be described in detail below. The actual speed of the laser head 102.sub.i deviates from the commanded speed thereof due to response delay of position control. Therefore, it is difficult to set the first time T1 by the robot program 422.sub.i alone. Instead, the robot 101.sub.i is operated under various conditions with trial and error to measure the time in which the actual speed of the laser head 102.sub.i reaches the target speed Vw.sub.j,i in each condition, and the first time T1 is set on the basis of the measurement results.
(336) To be noted, although it is preferable that the speed of the laser head 102.sub.i reach the target speed Vw.sub.j,i and be constant when the first time T1 has elapsed, an error occurs in the speed due to factors such as the position and orientation of the robot 101.sub.i, the target speed Vw.sub.j,i, and residual deviation derived from continuous operation of the robot 101.sub.i. Therefore, it is preferable that, among the values obtained by measurement under various conditions, the value with the smallest speed error, that is, the longest time, is set as the first time T1. That is, the first time T1 may be set such that the speed of the laser head 102.sub.i is within a predetermined range based on the target speed Vw.sub.j,i when the first time T1 has elapsed after starting acceleration of the laser head 102.sub.i. Although the first time T1 may be varied depending on the target speed Vw.sub.j,i at each welding target portion, the processing of the controller 121B can be simplified when the first time T1 is set to the same value. Further, the same first time T1 is used for acceleration of all of the laser heads 102.sub.1 to 102.sub.N.
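The selection rule described above, taking the longest measured acceleration time as T1, can be sketched as follows. This is an illustration only; the function name and the sample measurement values are assumptions, and the optional margin is not something the text prescribes.

```python
def choose_first_time(measured_times, margin=0.0):
    """Set the first time T1 from acceleration times measured under
    various conditions: per the text, the longest measured time (the
    one with the smallest speed error) is used. The optional margin
    is an assumption, not prescribed by the text."""
    return max(measured_times) + margin

# Illustrative measured times (seconds) for the laser head to reach
# the target speed under four different operating conditions:
measurements = [0.42, 0.47, 0.45, 0.50]
T1 = choose_first_time(measurements)  # the longest time, 0.50 s
```

The same T1 would then be used for all laser heads 102.sub.1 to 102.sub.N, keeping the processing of the controller simple.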
(337)
(338) To be noted, the algorithm for obtaining the positions 54.sub.j,i and 55.sub.j,i positioned on an extension line connecting the taught positions 52.sub.j,i and 53.sub.j,i needs to be changed between the
(339) The second time T2.sub.j,i is the laser irradiation time, and is calculated by the following formula (19). Here, as variables for the calculation of formula (19), Tw, Ps, Pe, and Vw are used. Tw corresponds to the second time T2.sub.j,i, Ps corresponds to the positions 52.sub.1,i and 52.sub.2,i at which irradiation of the laser beam is started, Pe corresponds to the positions 53.sub.1,i and 53.sub.2,i at which the irradiation of the laser beam is stopped, and Vw corresponds to the target speed Vw.sub.j,i. The second time Tw is calculated for each welding target portion by using Ps, Pe, and Vw in the following formula (19).
(340) Tw=|Pe−Ps|/Vw  (19)
(341) That is, the value of Tw serving as the second time is obtained by dividing the distance between Ps and Pe by Vw. Different values of Ps, Pe, and Vw can be used for different welding target portions. Therefore, the value of Tw serving as the second time T2.sub.1,i or T2.sub.2,i corresponds to the length or area of the welding target portion. Values of Tw calculated in this manner are set as the second times T2.sub.1,i and T2.sub.2,i.
(342) The positions 52.sub.1,i and 52.sub.2,i that are taught positions are each constituted by information of six degrees of freedom about position and orientation. Specifically, the information constituting the positions 52.sub.1,i and 52.sub.2,i includes X, Y, and Z, which are information about the position of the robot 101.sub.i with respect to the base, and A, B, and C, which are information about the holding angle of the laser head 102.sub.i. The same applies to the positions 53.sub.1,i and 53.sub.2,i. Therefore, a distance in the three-dimensional space is obtained by using only the position information X, Y, and Z as Ps and Pe.
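Formula (19) together with the paragraph above can be sketched, as an illustration only, by a short calculation that keeps only the X, Y, Z components of the six-degree-of-freedom taught positions. The function name and the example values are assumptions.

```python
import math

def second_time(Ps, Pe, Vw):
    """Formula (19): the second time Tw is the distance between the
    irradiation start position Ps and stop position Pe divided by the
    target speed Vw. Only the X, Y, Z components of the taught
    positions enter the distance; A, B, C are ignored."""
    dist = math.sqrt(sum((e - s) ** 2 for s, e in zip(Ps[:3], Pe[:3])))
    return dist / Vw

# Taught positions given as (X, Y, Z, A, B, C); orientation ignored.
# Values are illustrative: distance 50 at target speed 100.
Ps = (0.0, 0.0, 0.0, 0.0, 90.0, 0.0)
Pe = (30.0, 40.0, 0.0, 0.0, 90.0, 0.0)
Tw = second_time(Ps, Pe, Vw=100.0)
```

As the following paragraph notes, this calculation could equally be split between the robot controller and the controller, depending on where the target speed Vw is designated.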
(343) To be noted, this calculation may be performed by the robot controller 122.sub.i, and the robot controller 122.sub.i may transfer the value of Tw serving as the second time to the controller 121B. In addition, the robot controller 122.sub.i may calculate only the distance between Ps and Pe and transfer information about the distance to the controller 121B, and the controller 121B may perform the remaining calculation for obtaining the value of Tw serving as the second time. Which of these is used can be appropriately selected on the basis of whether the target speed Vw is described or designated in the robot controller 122.sub.i or in the controller 121B.
(344) The third time T3 is a time required for switching operation of the switcher 104, and is obtained by conducting an experiment in advance. For example, time required for the switching operation is measured by performing the switching operation of the switcher 104 a plurality of times, and a value obtained by adding a margin to the maximum value of measured values is set as the third time T3. The reason why the value obtained by adding the margin to the maximum value is set is because the switching needs to be certainly completed when the third time T3 has elapsed since the switching is commanded. To be noted, in the case where data of switching time is available in advance, a value obtained by adding a margin to a value of the data may be set as the third time T3. As described above, the first time T1, the second time T2.sub.j,i, and the third time T3 are set in advance before actually operating the robot 101.sub.i in a production line. To be noted, it is preferable that the first time T1 is set for both of the controller 121B and the robot controller 122.sub.i in advance.
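The rule for setting the third time T3 described above can be sketched, as an illustration only, as the maximum of several measured switching times plus a margin. The measurement values and the margin size are assumptions for illustration.

```python
def choose_third_time(measured_switching_times, margin):
    """Set the third time T3 as the maximum of the measured switching
    times of the switcher plus a margin, so that switching is
    certainly complete once T3 has elapsed after the switching
    command. Values used below are illustrative assumptions."""
    return max(measured_switching_times) + margin

# Illustrative switching times (seconds) from repeated measurements:
switch_times = [0.008, 0.011, 0.009, 0.010]
T3 = choose_third_time(switch_times, margin=0.002)  # 0.011 + 0.002
```

The margin is what guarantees that the optical path has certainly been redirected before the laser beam is generated again.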
(345) Incidentally, the welding target portion to be subjected to laser seam welding needs to be irradiated with the laser beam L.sub.i not at the timing at which the position 52.sub.j,i is commanded to the robot 101.sub.i but at the time point TP2.sub.j,i at which the control point actually passes the position 52.sub.j,i. Similarly, the irradiation of the laser beam L.sub.i needs to be stopped not at the timing at which the position 53.sub.j,i is commanded to the robot 101.sub.i but at the time point TP3.sub.j,i at which the control point actually passes the position 53.sub.j,i.
(346) That is, there is a case where response delay has occurred in the position control of the robot 101.sub.i at the time point TP2.sub.j,i at which the control point reaches the constant speed region. The response delay of the position control is expressed by the difference between the commanded position and the actual position. In the case where there is response delay of the position control, this response delay of the position control needs to be incorporated in the calculation of the positions 54.sub.j,i and 55.sub.j,i.
(347) Therefore, the robot controller 122.sub.i calculates the position 54.sub.j,i at which movement is started and the position 55.sub.j,i at which movement is stopped such that the control point reaches a target position when the first time T1 has elapsed after starting the operation of accelerating the laser head 102.sub.i. However, how the positions 54.sub.j,i and 55.sub.j,i are calculated is different between the case where the method of
(348)
(349) To be noted, as a precondition, which welding target portion one robot 101.sub.i takes charge of is predetermined before starting the automatic operation. In addition, in what order the robots 101.sub.1 to 101.sub.N operate is also predetermined before the automatic operation is started. That is, the order in which start of movement for welding is commanded to the robots 101.sub.1 to 101.sub.N is preset in the controller 121B. Hereinafter, a case where the robots 101.sub.1 to 101.sub.N are operated in the order of the robot 101.sub.1, the robot 101.sub.2, . . . the robot 101.sub.N, the robot 101.sub.1, the robot 101.sub.2, . . . will be described as an example. That is, the laser beam is guided to the laser heads 102.sub.1 to 102.sub.N by the switcher 104 in the order of the laser head 102.sub.1, the laser head 102.sub.2, . . . the laser head 102.sub.N, the laser head 102.sub.1, the laser head 102.sub.2, . . . .
(350) The controller 121B mainly manages two sequences SM1 and SM2. Specifically, as management of the first sequence SM1, the controller 121B commands timings at which the laser heads 102.sub.1 to 102.sub.N start moving to welding target portions to the robot controllers 122.sub.1 to 122.sub.N.
(351) As management of the second sequence SM2, the controller 121B commands the timing of on/off switching of the laser oscillator 103 and the timing of the switching operation of the switcher 104. Specifically, the controller 121B turns the laser oscillation command SR1 on when the first time T1 has elapsed after receiving the rising of the signal SB.sub.i from the robot controller 122.sub.i. Then, the controller 121B turns the laser oscillation command SR1 off when the second time T2.sub.j,i has elapsed from the time point at which the first time T1 elapsed. Further, after turning the laser oscillation command SR1 off, the controller 121B changes the switching signal SS to switch the optical path of the laser beam for the next irradiation of the laser beam.
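The timeline of sequence SM2 described above can be summarized, as an illustration only, by computing the three event time points relative to the rising of the signal SB.sub.i. The function name, the dictionary keys, and the numeric values are assumptions for illustration.

```python
def sm2_event_times(t_sb_rise, T1, T2, T3):
    """Timeline of sequence SM2: the laser oscillation command SR1 is
    turned on T1 after the rising of the signal SB, turned off T2
    later, and the switcher is then allowed T3 to redirect the
    optical path. Illustrative sketch; names are assumptions."""
    t_laser_on = t_sb_rise + T1
    t_laser_off = t_laser_on + T2
    t_switch_done = t_laser_off + T3
    return {"laser_on": t_laser_on,
            "laser_off": t_laser_off,
            "switch_done": t_switch_done}

# Illustrative values: T1 = 0.5 s, T2 = 0.8 s, T3 = 0.013 s,
# with the signal SB rising at t = 0.
events = sm2_event_times(t_sb_rise=0.0, T1=0.5, T2=0.8, T3=0.013)
```

The next laser head in the predetermined order can only be irradiated after the `switch_done` time point, which is why T3 must cover the worst-case switching time.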
(352) The management of the two sequences SM1 and SM2 described above is performed in synchronization with the timing at which the signal SB.sub.i of the robot 101.sub.i is turned on. Hereinafter, the management of the sequences SM1 and SM2 will be described in detail. When the automatic operation is started, the controller 121B turns the operation start command SA.sub.1 on in accordance with the control program 321 in the sequence SM1. The timing at which the operation start command SA.sub.1 is turned on is indicated as a time point TP0.sub.1,1 in
(353) The robot controller 122.sub.1 monitors the operation start command SA.sub.1, and executes the linear interpolation command including the position 55.sub.1,1 as the target position after receiving the switching of the operation start command SA.sub.1 from off to on. Then, trajectory data P.sub.1,1 from the position 54.sub.1,1 to the position 55.sub.1,1 is supplied to the robot 101.sub.1 at a predetermined control period. That is, the robot controller 122.sub.1 causes the robot 101.sub.1 to start the operation of accelerating the laser head 102.sub.1 such that the movement speed of the laser head 102.sub.1 with respect to the processing target object W reaches the target speed VW.sub.1,1. According to this, the laser head 102.sub.1, that is, the control point starts moving from the position 54.sub.1,1 to the position 55.sub.1,1 and accelerates such that the movement speed thereof reaches the constant target speed VW.sub.1,1.
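The accelerating operation described above can be pictured as a speed profile that reaches the constant target speed exactly when the first time T1 has elapsed. The following Python sketch assumes a simple linear ramp; the function name, the ramp shape, and the numeric values are illustrative assumptions, since the embodiment does not specify the interpolation performed by the robot controller.

```python
def speed_profile(t, t1, v_target):
    """Movement speed of the control point at time t after the start of
    acceleration: a linear ramp (assumed) that reaches the constant
    target speed V_W when the first time T1 has elapsed."""
    if t >= t1:
        return v_target          # constant speed state for welding
    return v_target * t / t1     # accelerating toward the target speed

# With T1 = 200 msec and a target speed of 100 mm/s (illustrative values)
v_mid = speed_profile(0.1, 0.2, 100.0)   # halfway through the acceleration
v_end = speed_profile(0.2, 0.2, 100.0)   # at the end of T1
```

Any profile that reaches the target speed by T1 would serve; the linear ramp is only the simplest assumption.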
(354) When the robot controller 122.sub.1 starts supplying the trajectory data P.sub.1,1, the robot controller 122.sub.1 simultaneously transmits a signal SBA.sub.1 obtained by switching the signal SB.sub.1 from off to on to the controller 121B.
(355) The rising occurring when switching the signal SB.sub.1 from off to on serves as a synchronizing signal SBA.sub.1 serving as a predetermined signal. That is, the robot controller 122.sub.1 transmits the synchronizing signal SBA.sub.1 that is the rising of the signal SB.sub.1 to the controller 121B when causing the robot 101.sub.1 to start the operation of accelerating the laser head 102.sub.1. This timing is shown as the time point TP1.sub.1,1 in
(356) In the present exemplary embodiment, the rising of the signal SB.sub.1 is used as the synchronizing signal SBA.sub.1. Therefore, the signal SB.sub.1 may fall at any time before the supply of the next trajectory data P.sub.2,1 is started, as long as at least one control period of the controller 121B has elapsed since the signal SB.sub.1 rose. In addition, although the rising of the signal SB.sub.1 is used as the synchronizing signal SBA.sub.1, the synchronizing signal SBA.sub.1 is not limited to this, and the falling of the signal SB.sub.1 may be used as the synchronizing signal SBA.sub.1. To be noted, in the present exemplary embodiment, turning the signal SB.sub.i off is used as a signal indicating that the robot 101.sub.i is in a preparation complete state. The preparation complete state is a state in which the robot controller 122.sub.i has completed supplying the trajectory data P.sub.j−(j+1),i between the trajectory data P.sub.j and the trajectory data P.sub.j+1 and is waiting for the signal SA.sub.i.
(357) The controller 121B monitors the signal SB.sub.1 transmitted from the robot controller 122.sub.1, and starts timekeeping of the first time T1 corresponding to the laser head 102.sub.1 in the sequence SM2 when receiving the synchronizing signal SBA.sub.1 that is the rising of the signal SB.sub.1. For example, the first time T1 is a fixed value such as 200 msec.
(358) When the first time T1 has elapsed, the laser head 102.sub.1 has reached a constant speed state at the target speed VW.sub.1,1 for welding, and the control point is positioned at the commanded position 52.sub.1,1 illustrated in
(359) As described above, the robot controller 122.sub.i causes the robot 101.sub.i to start the operation of accelerating the laser head 102.sub.i such that the movement speed of the laser head 102.sub.i with respect to the processing target object W is constant at the target speed Vw.sub.j,i in accordance with the operation start command SA.sub.i. Meanwhile, the controller 121B controls the laser oscillator 103 to generate the laser beam when the first time T1 has elapsed after starting the acceleration of the laser head 102.sub.i. The controller 121B detects the start of acceleration of the laser head 102.sub.i caused by the control of the robot controller 122.sub.i by receiving the synchronizing signal SBA.sub.i.
(360) The controller 121B starts timekeeping of the second time T2.sub.1,1 when the timekeeping of the first time T1 is finished. The robot controller 122.sub.1 causes the robot 101.sub.1 to operate such that the movement speed of the laser head 102.sub.1 is maintained at the target speed VW.sub.1,1 while the processing target object W is irradiated with the laser beam L.sub.1 from the laser head 102.sub.1. The controller 121B controls the laser oscillator 103 to stop the generation of the laser beam L.sub.1 when the second time T2.sub.1,1 has further elapsed after the first time T1 has elapsed, that is, when the timekeeping of the second time T2.sub.1,1 is finished. This time point is indicated as the time point TP3.sub.1,1 in
(361) Specifically, the controller 121B switches the laser oscillation command SR1 from on to off at the same time with finishing the timekeeping of the second time T2.sub.1,1. That is, the controller 121B commands the laser oscillator 103 to stop the generation of the laser beam when the timekeeping of the second time T2.sub.1,1 is finished. The laser oscillator 103 stops the laser oscillation when receiving the switch of the laser oscillation command SR1 from on to off.
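The portion of the sequence SM2 described in paragraphs (357) to (361) reduces to scheduling three events relative to the reception of the synchronizing signal SBA.sub.i. A minimal Python sketch; the function and variable names are illustrative, and absolute timestamps stand in for the controller's internal timekeeping.

```python
def sm2_events(t_sync, t1, t2):
    """Event timestamps of the sequence SM2, measured from the time point
    t_sync at which the synchronizing signal SBA_i is received.

    t1: first time T1 (acceleration time, e.g. 0.2 s)
    t2: second time T2_{j,i} (laser irradiation time for the portion)
    """
    laser_on = t_sync + t1      # laser oscillation command SR1 turned on
    laser_off = laser_on + t2   # SR1 turned off when T2 has further elapsed
    switch = laser_off          # switching signal SS changed at the same time
    return laser_on, laser_off, switch

# Sync received at t = 0, with T1 = 200 msec and T2 = 150 msec (illustrative)
on, off, sw = sm2_events(0.0, 0.2, 0.15)
```

The key point is that every event is timed from the synchronizing signal alone, which is why no other synchronization between the controller and the robot controller is needed.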
(362) The controller 121B changes the switching signal SS at the same time with turning the laser oscillation command SR1 off, and commands the switcher 104 such that the laser head to which the laser beam is guided is switched from the first laser head 102.sub.1 to the second laser head 102.sub.2.
(363) The switcher 104 monitors the switching signal SS, and performs the switching operation of changing the optical path by moving the mirrors 114.sub.1 to 114.sub.N in accordance with the command of the switching signal SS. This switching operation takes the third time T3, and a time point at which the switching operation is completed is indicated as a time point TP4.sub.1,1 in
(364) Here, the laser head to which the laser beam is guided is switched from the laser head 102.sub.1 to the laser head 102.sub.2 by the switcher 104. If the operation of the next robot 101.sub.2 were started only after this switching operation, the operation rate of the laser oscillator 103, that is, the production efficiency of the processed product, would be low.
(365) Therefore, the control apparatus 120B causes the robot 101.sub.2 to start the operation of accelerating the laser head 102.sub.2 such that the movement speed of the laser head 102.sub.2 reaches the target speed VW.sub.1,2 before the irradiation of the laser beam L.sub.1 on the processing target object W by the laser head 102.sub.1 is finished. The period “before the irradiation of the laser beam L.sub.1 on the processing target object W by the laser head 102.sub.1 is finished” includes a period before the start of irradiation. According to this, the operation rate of the laser oscillator 103, that is, the production efficiency of the processed product improves.
(366) Hereinafter, management at a timing at which the controller 121B turns on the operation start command SA.sub.2 to be transmitted to the robot controller 122.sub.2, that is, management of the sequence SM1 by the controller 121B will be described in detail.
(367) In the present exemplary embodiment, the first time T1 counted each time the laser head 102.sub.i is accelerated is the same time, that is, a fixed value such as 200 msec. The controller 121B causes the next laser head 102.sub.2 to radiate the laser beam after the third time T3, which is the time required for the switcher 104 to perform the switching operation, further elapses from the time point when the second time T2.sub.1,1, which is the time in which the laser head 102.sub.1 radiates the laser beam, has elapsed. Therefore, it is preferable that the operation start command SA.sub.2 is transmitted to the next robot controller 122.sub.2 after the total time T2.sub.1,1+T3, that is, the sum of the second time T2.sub.1,1 and the third time T3, has elapsed since the acceleration of the laser head 102.sub.1 was started. To be noted, the start of the acceleration of the laser head 102.sub.1 is detected via the synchronizing signal SBA.sub.1 by the controller 121B. The second time T2.sub.1,1 is the time for irradiating the first welding target portion with the laser beam from the laser head 102.sub.1.
(368) In the present exemplary embodiment, as the sequence SM1, the controller 121B starts timekeeping of the total time T2.sub.1,1+T3, that is, the sum of the second time T2.sub.1,1 and the third time T3, when receiving the synchronizing signal SBA.sub.1 from the robot controller 122.sub.1. The timing at which this timekeeping is started is the same as the timing at which the timekeeping of the first time T1 for accelerating the laser head 102.sub.1 is started in the sequence SM2.
(369) When the timekeeping of the total time T2.sub.1,1+T3 is completed, the controller 121B checks whether or not the robot 101.sub.2 to be operated next is in the preparation complete state. In the present exemplary embodiment, the signal SB.sub.i being off indicates the preparation complete state. In the case where the signal SB.sub.i is not off, the controller 121B waits for the signal SB.sub.i to be turned off. In the case where the signal SB.sub.i is off, the controller 121B turns on the operation start command SA.sub.2 to be transmitted to the robot controller 122.sub.2 that controls the robot 101.sub.2 to be operated next. The timing at which the operation start command SA.sub.2 is turned on is indicated as the time point TP0.sub.1,2 in
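The decision made by the sequence SM1 when the timekeeping of the total time completes can be sketched as a small state check. The names and return values are hypothetical stand-ins for the controller's internal actions.

```python
def sm1_step(elapsed, total_time, sb_off):
    """Action of the sequence SM1 for the next robot.

    elapsed:    time since the synchronizing signal SBA_i was received
    total_time: T2_{j,i} + T3 (irradiation time plus switching time)
    sb_off:     True if the next robot's signal SB is off
                (the preparation complete state)
    """
    if elapsed < total_time:
        return "keep_timekeeping"        # total time has not yet elapsed
    if not sb_off:
        return "wait_for_preparation"    # wait for SB to be turned off
    return "turn_on_operation_start"     # turn on the command SA

# Illustrative values: T2 + T3 = 0.3 s
a = sm1_step(0.1, 0.3, True)
b = sm1_step(0.3, 0.3, False)
c = sm1_step(0.3, 0.3, True)
```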
(370) The robot controller 122.sub.2 monitors the operation start command SA.sub.2, and commands the trajectory data P.sub.1,2 to the robot 101.sub.2 at a predetermined control period when the operation start command SA.sub.2 is switched from off to on. That is, the robot controller 122.sub.2 causes the robot 101.sub.2 to start the operation of accelerating the laser head 102.sub.2 such that the movement speed of the laser head 102.sub.2 with respect to the processing target object W reaches the target speed VW.sub.1,2. As a result of this, the laser head 102.sub.2, that is, the control point starts moving and accelerating such that the movement speed becomes constant at the target speed VW.sub.1,2.
(371) When the robot controller 122.sub.2 starts supplying the trajectory data P.sub.1,2, the robot controller 122.sub.2 simultaneously transmits a signal SBA.sub.2 obtained by switching the signal SB.sub.2 from off to on to the controller 121B. That is, the robot controller 122.sub.2 transmits the synchronizing signal SBA.sub.2 that is the rising of the signal SB.sub.2 to the controller 121B when causing the robot 101.sub.2 to start the operation of accelerating the laser head 102.sub.2. This timing is shown as the time point TP1.sub.1,2 in
(372) The controller 121B monitors the signal SB.sub.2 transmitted from the robot controller 122.sub.2, and starts timekeeping of the first time T1 for accelerating the laser head 102.sub.2 when receiving the synchronizing signal SBA.sub.2 that is the rising of the signal SB.sub.2. At the same time, the controller 121B starts timekeeping of the total time T2.sub.1,2+T3, that is, the sum of the second time T2.sub.1,2 and the third time T3. When the timekeeping of the total time T2.sub.1,2+T3 is completed, the controller 121B checks whether the robot 101.sub.3 is in the preparation complete state. In the case where the robot 101.sub.3 is in the preparation complete state, the controller 121B turns on the next operation start command SA.sub.3. Similar processing is repeated for the robot 101.sub.1, . . . the robot 101.sub.N, the robot 101.sub.1, . . . . As described above, the controller 121B manages the timing of operation of the robot 101.sub.i via the synchronizing signal SBA.sub.i in the sequence SM1.
(373) Meanwhile, when the controller 121B receives the synchronizing signal SBA.sub.2 that is the rising of the signal SB.sub.2, timekeeping of the first time T1 or the second time T2.sub.1,1 for the laser head 102.sub.1 is being performed. In the example of
(374) The controller 121B changes the switching signal SS at the same timing as turning the laser oscillation command SR1 off, and commands the switcher 104 such that the laser head to which the laser beam is guided is switched from the second laser head 102.sub.2 to the third laser head 102.sub.3. The switcher 104 performs the switching operation in accordance with the switching signal SS. This switching operation takes the third time T3, and a time point at which the switching operation is completed is indicated as the time point TP4.sub.1,2 in
(375) In each of the laser heads 102.sub.1 to 102.sub.N, after the laser welding on the first welding target portion is finished, the same sequences SM1 and SM2 are performed on the second welding target portion and subsequent welding target portions.
(376) As described above, the controller 121B performs the sequences SM1 and SM2 independently from each other. The operation of the sequence SM1 and the operation of the sequence SM2 are synchronized by using the synchronizing signal SBA.sub.i, and are not synchronized by any other means. That is, the controller 121B performs the sequences SM1 and SM2 on the basis of time elapsed since the time point at which the synchronizing signal SBA.sub.i is received. As described above, by managing the sequences SM1 and SM2 by using the synchronizing signal SBA.sub.i, the management does not become complicated, and the sequences SM1 and SM2 can be stably performed.
(377) According to the present exemplary embodiment, the robots 101.sub.i are sequentially operated by the sequences SM1 and SM2 without waiting for the welding to be completed. Therefore, time loss can be reduced, and the operation rate of the laser oscillator 103, that is, the production efficiency of the processed product, can be improved.
(378) In the present exemplary embodiment, since the same first time T1 is counted each time the controller 121B receives the synchronizing signal SBA.sub.i, the processing is simplified. That is, in the operation in which the controller 121B commands the timing of starting the operation of the robot 101.sub.i to move the laser head 102.sub.i to a welding target portion, the timing can be commanded by using only the second time T2.sub.j,i and the third time T3 without using the first time T1.
(379) In
(380)
(381) To be noted, in the case of waiting for the preparation of the robot 101.sub.i to be completed, that is, waiting for the signal SB.sub.i to be turned off, the operation rate of the laser oscillator 103 decreases, but the sequence does not fail. If the controller 121B waits for the preparation of the robot 101.sub.i to be completed, activation of the sequence SM2 is delayed. In this case, the next laser oscillation in accordance with the sequence SM2, performed after the switcher 104 switches the optical path to guide the laser beam to the laser head 102.sub.i, is delayed, and therefore the timing of the laser oscillation becomes late. Therefore, an opportunity for laser oscillation is missed in correspondence with the delay in the activation of the sequence SM2. In the case where the controller 121B waits for the preparation of the robot 101.sub.i to be completed and waiting time actually occurs, the laser oscillator 103 and the switcher 104 cannot operate during the waiting time. Still, this processing is necessary for realizing the sequences. To be noted, even if the time from the time point TP0.sub.2,1 to the time point TP1.sub.2,1 in the
(382) The waiting time of the laser oscillator 103 and the switcher 104 will be described. The robot controller 122.sub.i that controls the robot 101.sub.i starts supplying the trajectory data P.sub.j when receiving the operation start command SA.sub.i from the controller 121B. Although the robot controller 122.sub.i does not perform timekeeping, the robot controller 122.sub.i completes the supply of the trajectory data P.sub.j when the controller 121B completes the timekeeping of the first time T1 and the second time T2.sub.j,i. Next, the robot controller 122.sub.i starts supplying the trajectory data P.sub.j−(j+1). The time in which the trajectory data P.sub.j−(j+1) is supplied is defined as a rough movement time T4.sub.j,i. In addition, the waiting time of the laser oscillator 103 and the switcher 104 is defined as a time T5.sub.j,i. The waiting time T5.sub.j,i is the time for which the laser oscillator 103 and the switcher 104 wait for the operation of a robot to complete. Next, the time in which the robot 101.sub.i moves from the position 53.sub.j,i, at which irradiation of the laser beam on the welding target portion is finished, to the position 52.sub.j,i, at which irradiation of the laser beam on the next welding target portion is started, will be described. This time is the time from the completion of timekeeping of the second time T2.sub.j,i to the start of timekeeping of the second time T2.sub.j+1,i. This time is defined as a non-welding time T6.sub.j,i of the robot 101.sub.i.
(383)
(384)
(385) Similarly to
(386) This can be expressed by the following formula (20) in a generalized form. A set A is defined for use in the formula (20). The set A is the set of the numbers of the robots 101.sub.i that operate during the non-welding time T6.sub.j,i. For example, in the case of
(387) Ttemp.sub.j,i=(T4.sub.j,i+T1)−{ΣT2+(n(A)+1)×T3}  (20)
Here, ΣT2 is the total of the second times T2 of the robots belonging to the set A, and n(A) is the number of elements of the set A.
(388) In the formula (20), T4.sub.j,i+T1 is time required for the control point of the robot 101.sub.i to move from the j-th welding target portion to the (j+1)-th welding target portion.
(389) In the case where the value of Ttemp.sub.j,i in the formula (20) is a positive value, a waiting time T5.sub.j,i equal to the calculated value of Ttemp.sub.j,i occurs. In the case where the value of Ttemp.sub.j,i in the formula (20) is a negative value or 0, the waiting time T5.sub.j,i does not occur. That is, the waiting time T5.sub.j,i can be reduced by reducing the rough movement time T4.sub.j,i or by increasing the total time of the second times T2.sub.j,i or the total time of the third times T3 during the non-welding time T6.sub.j,i.
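Reading the description of formula (20) in paragraphs (388) and (389), with Ttemp.sub.j,i equal to the travel time T4.sub.j,i+T1 minus the total irradiation time of the robots in the set A and the accompanying switching operations, the waiting time can be computed as in this sketch. The numeric values are illustrative, and the assumption that n(A)+1 switching operations occur during the non-welding time is taken from the worked comparison with two and three assisting robots.

```python
def ttemp(t4, t1, t2_others, t3):
    """Ttemp_{j,i} of formula (20): the travel time T4_{j,i} + T1 minus
    the time covered by the other robots' welds and the switching
    operations during the non-welding time.

    t2_others: irradiation times T2 of the robots in the set A
    t3:        third time T3 (one switching operation of the switcher)
    """
    return (t4 + t1) - (sum(t2_others) + (len(t2_others) + 1) * t3)

def waiting_time(t4, t1, t2_others, t3):
    """T5_{j,i}: a waiting time occurs only when Ttemp is positive."""
    return max(0.0, ttemp(t4, t1, t2_others, t3))

# Illustrative times (seconds)
t4, t1, t3 = 2.0, 0.2, 0.1
two = waiting_time(t4, t1, [0.4, 0.5], t3)         # two other robots weld
three = waiting_time(t4, t1, [0.4, 0.5, 0.3], t3)  # a third robot joins
```

Consistent with the comparison in paragraph (393), adding the third robot shortens the waiting time by exactly T3 plus that robot's T2.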
(390) To reduce the waiting time T5.sub.j,i, the welding target portions which each robot 101.sub.i takes charge of can be devised. The robots 101.sub.i can be disposed at such positions that part of the welding target portions of the processing target object W can be welded by a plurality of robots 101.sub.i. Then, which welding target portions the robots 101.sub.i each perform welding on is determined such that the waiting time T5.sub.j,i becomes shorter. A specific example of determining the welding target portions of the plurality of robots 101.sub.i to reduce the waiting time T5.sub.j,i in this manner will be described.
(391) As illustrated in
(392) In addition, it is assumed that the robot 101.sub.1 is moved from the fourth welding target portion to the fifth welding target portion, that the robot 101.sub.2 and the robot 101.sub.3 respectively perform welding on their fourth welding target portions, and that the robot 101.sub.4 does not perform welding. In this case, the waiting time T5.sub.5,1 is (T4.sub.4,1+T1)−{(3×T3)+(T2.sub.4,2+T2.sub.4,3)}.
(393) The waiting time T5.sub.5,1 is compared between the example in which the other three robots 101.sub.i perform welding while the robot 101.sub.1 is moved from the fourth welding target portion to the fifth welding target portion and the example in which the other two robots 101.sub.i perform welding during the same movement. The waiting time T5.sub.5,1 is longer by T3+T2.sub.4,4 in the case where the other two robots 101.sub.i perform welding. That is, the waiting time T5.sub.j,i can be reduced more in the case where the number of other robots 101.sub.i that perform welding during the non-welding time T6.sub.j,i of a robot 101.sub.i is larger.
(394) To be noted, in both of these examples, the waiting time T5.sub.5,1 is a positive value. When the waiting time T5.sub.5,1 is 0, it remains 0 even in the case where the number of robots is increased.
(395) Next, it is assumed that the processing target object W illustrated in
(396) In the case where the difference between the numbers of welding target portions on which the four robots 101.sub.1, 101.sub.2, 101.sub.3, and 101.sub.4 respectively perform welding is larger, the number of other robots 101.sub.i that perform welding during the non-welding time T6.sub.j,i becomes smaller, and the waiting time T5.sub.j,i becomes longer. Therefore, the welding target portions on the processing target object W are distributed among the four robots 101.sub.1, 101.sub.2, 101.sub.3, and 101.sub.4 such that the difference between the numbers of welding target portions on which they respectively perform welding is smallest. In the example of
(397) As described above, by disposing the robots 101.sub.i such that part of the plurality of welding target portions on the processing target object W can be subjected to welding by two or more of the robots 101.sub.i, the difference between the numbers of welding target portions that the robots 101.sub.i respectively take charge of can be reduced. Therefore, the number of other robots 101.sub.i that perform welding during the non-welding time T6.sub.j,i can be increased, and thus the waiting time T5.sub.j,i can be reduced.
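One way to realize the distribution described above, assigning each welding target portion to one of the robots able to reach it while keeping the per-robot counts as even as possible, is a greedy assignment. This is only a sketch: the embodiment does not prescribe a particular algorithm, and the reachability input is an assumed representation.

```python
def distribute(portion_reach):
    """Assign each welding target portion to a robot that can reach it,
    always choosing the robot with the fewest portions so far.

    portion_reach: portion_reach[p] is the list of robot indices able to
    weld portion p (two or more robots may share a portion).
    """
    counts = {}
    for reach in portion_reach:
        for r in reach:
            counts.setdefault(r, 0)
    assignment = []
    # Handle the most constrained portions (fewest candidate robots) first.
    for p, reach in sorted(enumerate(portion_reach), key=lambda x: len(x[1])):
        robot = min(reach, key=lambda r: counts[r])
        counts[robot] += 1
        assignment.append((p, robot))
    return counts, assignment

# Five portions, two robots; portions 2 and 4 are reachable by both robots
counts, assignment = distribute([[0], [0], [0, 1], [1], [0, 1]])
```

The shared portions are what allow the counts to be balanced; with no overlap, the per-robot counts would be fixed in advance.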
(398) In addition, in the case where the processing target object W is a polyhedron such as a rectangular parallelepiped as in the example described above, interference between the robots 101.sub.i can be easily prevented by a simple measure such as assigning each robot 101.sub.i to operation on a different surface among the side surfaces and the top surface.
(399) Here, although a case where which surface, that is, which welding target portion of the processing target object W each robot 101.sub.i takes charge of is determined such that operation areas of the four robots 101.sub.i do not overlap with one another has been described in the example of
(400)
(401) The robot 101.sub.1 takes charge of 68 welding target portions in total including all 51 welding target portions on the side surface W.sub.1 and 17 welding target portions on the side surface W.sub.2. The robot 101.sub.2 takes charge of 68 welding target portions in total including 33 welding target portions on the side surface W.sub.2 and 35 welding target portions on the side surface W.sub.3. The robot 101.sub.3 takes charge of 67 welding target portions in total including 16 welding target portions on the side surface W.sub.3, all 47 welding target portions on the side surface W.sub.4, and all 4 welding target portions on the top surface W.sub.5. That is, two robots 101.sub.1 and 101.sub.2 take charge of the side surface W.sub.2, and two robots 101.sub.2 and 101.sub.3 take charge of the side surface W.sub.3.
(402) The order of welding is determined so as to avoid interference between two robots 101.sub.i, that is, such that the two robots 101.sub.i do not perform welding on the same side surface W.sub.2 or W.sub.3. For example, the robot 101.sub.1 is configured to start welding from the side surface W.sub.1 and perform welding on the side surface W.sub.2 after finishing the welding on the side surface W.sub.1. The robot 101.sub.2 is configured to start welding from the side surface W.sub.2 and perform welding on the side surface W.sub.3 after finishing the welding on the 33 welding target portions on the side surface W.sub.2 that the robot 101.sub.2 takes charge of. The robot 101.sub.3 is configured to start welding from the side surface W.sub.3, perform welding on the side surface W.sub.4 after finishing the welding on the 16 welding target portions on the side surface W.sub.3 that the robot 101.sub.3 takes charge of, and finally perform welding on the top surface W.sub.5. The number of welding target portions on the surface that each of the robots 101.sub.1, 101.sub.2, and 101.sub.3 performs welding first among the side surfaces W.sub.1, W.sub.2, and W.sub.3 is configured to be different therebetween. That is, the robot 101.sub.1 performs welding on 51 welding target portions on the side surface W.sub.1, the robot 101.sub.2 performs welding on 33 welding target portions on the side surface W.sub.2, which are part of the welding target portions on the side surface W.sub.2, and the robot 101.sub.3 performs welding on 16 welding target portions on the side surface W.sub.3, which are part of the welding target portions on the side surface W.sub.3. 
Therefore, the robot 101.sub.3 can complete welding on the 16 welding target portions on the side surface W.sub.3, which are part of the welding target portions on the side surface W.sub.3, and move to the side surface W.sub.4 before the robot 101.sub.2 completes welding on the 33 welding target portions on the side surface W.sub.2, which are part of the welding target portions on the side surface W.sub.2, and moves to the side surface W.sub.3. In addition, the robot 101.sub.2 can complete welding on the 33 welding target portions on the side surface W.sub.2, which are part of the welding target portions on the side surface W.sub.2, and move to the side surface W.sub.3 before the robot 101.sub.1 completes welding on all the 51 welding target portions on the side surface W.sub.1 and moves to the side surface W.sub.2. As described above, interference between the robots 101.sub.i can be avoided by considering the order of welding of the robots 101.sub.i even in the case where the operation areas of the robots 101.sub.i overlap with one another.
(403) As described above, in the case where the number N of the robots 101.sub.i is larger, the total time of the second times T2.sub.j,i and the total time of the third times T3 during the non-welding time T6.sub.j,i can be increased, and the waiting time T5.sub.j,i can be reduced. Among the N robots 101.sub.i, at most the other (N−1) robots 101.sub.i can perform welding during the non-welding time T6.sub.j,i of a robot 101.sub.i. Therefore, the waiting time T5.sub.j,i can be shortened by determining the number N such that the total time of the second times T2.sub.j,i of (N−1) robots and the third times T3 of N robots is longer than the time T4.sub.j,i+T1.
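The condition stated above for choosing the number N of robots, that the second times of the other (N−1) robots plus N switching operations cover the time T4.sub.j,i+T1, can be checked directly. The sketch below uses a single representative irradiation time per weld, which is an assumption; in practice the T2 values differ per welding target portion.

```python
def min_robots(t4, t1, t2_typ, t3):
    """Smallest number N of robots for which the other (N-1) robots'
    welds plus N switching operations cover the travel time T4 + T1:
    (N-1)*T2 + N*T3 >= T4 + T1, using a representative T2 per weld."""
    n = 1
    while (n - 1) * t2_typ + n * t3 < t4 + t1:
        n += 1
    return n

# Illustrative times: T4 = 2.0 s, T1 = 0.2 s, T2 = 0.5 s, T3 = 0.1 s
n = min_robots(2.0, 0.2, 0.5, 0.1)
```

As the text notes, this lower bound on N trades off against the size and cost of the apparatus.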
(404) As the number N of the robots 101.sub.i increases, the operation rate of the laser oscillator 103 increases and the production efficiency of the processed product improves. In contrast, the size and cost of the apparatus can be reduced more in the case where the number N of the robots 101.sub.i is smaller. Therefore, the number N of robots can be determined in consideration of these factors. To increase the operation rate of the laser oscillator 103, the order of welding on the welding target portions that each robot 101.sub.i takes charge of can be devised, and the order of operation of the plurality of robots can be devised. If the waiting time T5.sub.j,i is shortened by devising these orders, the operation rate of the laser oscillator 103 is increased, and thus the production efficiency of the processed product can be improved.
(405) To be noted, since the timing of the laser oscillation is managed by using the synchronizing signal SBA.sub.i, the length of the welding bead does not vary even in the case where the time between the time points TP0.sub.2,1 and TP1.sub.2,1 changes, and the welding strength does not vary either. Therefore, welding can be performed stably.
(406) Each time the controller 121B receives the synchronizing signal SBA.sub.i transmitted from the robot controller 122.sub.i, the controller 121B controls the laser oscillator 103 to generate a laser beam when the same time of the first time T1 has elapsed. That is, the same time is used as the first time T1 instead of setting a different time for each welding target portion. Therefore, each time the controller 121B receives the synchronizing signal SBA.sub.i, the controller 121B counts the same time as the first time T1 without recognizing which trajectory data P.sub.j,i the robot controller 122.sub.i has started commanding, and thus the processing is simplified.
(407) The movement speed of the laser head 102.sub.i has reached the target speed Vw.sub.j,i when the first time T1 has elapsed, and the processing target object W is irradiated with the laser beam in the state of the constant speed at this target speed Vw.sub.j,i. That is, by moving the laser head 102.sub.i at a constant speed with respect to the processing target object W, the focal point of the laser beam L.sub.i can be moved at a constant speed along the surface of the processing target object W. Therefore, the amount of input heat on the processing target object W is uniformized in the movement direction of the focal point of the laser beam L.sub.i, and thus a uniform welding bead can be formed on the processing target object W in the movement direction of the focal point of the laser beam L.sub.i. As a result of this, highly precise laser seam welding can be realized.
(408) Here, the control period of the robot controller 122.sub.i is set to a value suitable for controlling the operation of the robot 101.sub.i, for example, several milliseconds. In the present exemplary embodiment, the controller 121B controls on/off switching of laser oscillation of the laser oscillator 103 and the switching operation of the switcher 104 at a shorter control period than the control period of the robot controller 122.sub.i. That is, in the present exemplary embodiment, the control period of the controller 121B to control the laser oscillator 103 and the switcher 104 is shorter than the control period of the robot controller 122.sub.i to control the robot 101.sub.i. Therefore, the controller 121B can more accurately manage the first time T1, the second time T2.sub.j,i, and the third time T3 than the robot controller 122.sub.i. That is, since the controller 121B can control the laser oscillator 103 and the switcher 104 with a shorter control period, the controller 121B can more accurately manage the timing for causing the laser oscillator 103 to start and stop generation of the laser beam and the timing of the switching of the switcher 104. As a result, variation of the length of the welding bead can be reduced, and thus variation of the welding strength can be reduced.
(409) According to the present exemplary embodiment, the timing at which the robot controller 122.sub.i commands the start point of the trajectory data P.sub.j,i and the timing at which the controller 121B starts the timekeeping of the first time T1 are configured to be synchronized on the basis of the synchronizing signal SBA.sub.i. The timing at which the robot controller 122.sub.i commands the start point of the trajectory data P.sub.j,i is the timing at which the robot controller 122.sub.i starts supplying the trajectory data P.sub.j,i. That is, the robot controller 122.sub.i generates the synchronizing signal SBA.sub.i synchronized with the operation of the robot 101.sub.i, and the on/off switching of the irradiation of the laser beam is managed by the elapsed time from the time point at which the controller 121B is synchronized with the synchronizing signal SBA.sub.i. Therefore, the controller 121B and the robot controller 122.sub.i synchronize the operation of the robot 101.sub.i with the on/off switching of the laser oscillation of the laser oscillator 103 without performing complicated arithmetic processing. Therefore, the deviation between the timing of the operation of the robot 101.sub.i and the timing of the laser oscillation can be reduced. As a result of this, the difference between the actual position and the target position for starting the irradiation of the laser beam can be reduced. In addition, since the laser oscillation does not have to be controlled by performing complicated arithmetic processing while the robot 101.sub.i is operating, the operation of the robot 101.sub.i can be accelerated while securing the precision of laser processing, and thus the production efficiency of the processed product can be improved.
(410) The control period of the robot controller 122.sub.i is longer than the control period of the controller 121B. Therefore, the timing at which the robot controller 122.sub.i recognizes the operation start command SA.sub.i varies. In the present exemplary embodiment, the controller 121B starts timekeeping of the first time T1, after which the laser oscillation is performed, not at the timing at which the operation start command SA.sub.i is transmitted but at the timing at which the synchronizing signal SBA.sub.i is received. Therefore, the difference between the timing of operation of the robot 101.sub.i and the timing of laser oscillation of the laser oscillator 103 can be reduced.
(411) To be noted, the robot controller 122.sub.i may be configured to turn on an unillustrated preparation completion signal indicating that the preparation is completed. In this case, the controller 121B may turn on the operation start command SA.sub.i after confirming that the preparation completion signal has been turned on, and time points after the time point TP0.sub.2,1 are accordingly delayed.
(412) The first time T1 may be a count value that varies for each robot 101.sub.i, although this makes the arithmetic processing for obtaining the timing at which the operation start command SA.sub.i is turned on more complex. For example, a variation of the first time T1 may be added to or subtracted from the sum of the second time T2.sub.j,i and the third time T3. This calculation can be performed backward, that is, by determining the timings in order from the timing of laser irradiation on the last welding target portion back to the timing of laser irradiation on the first welding target portion. In the case where the controller 121B performs this calculation, the controller 121B needs to obtain information about the order of all the welding target portions and the robots.
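The backward calculation described above can be sketched as follows. The function name, list structure, and the assumption that consecutive portions are separated by T2 plus the fixed times T1 and T3 are illustrative only.

```python
def schedule_backward(t_last_start, t2_list, t1, t3):
    """Return irradiation start times for each welding target portion,
    computed backward from the start time of the last portion.

    t_last_start: irradiation start time of the last portion
    t2_list:      second times T2[j] per portion, in processing order
    t1, t3:       fixed first and third times
    """
    starts = [0.0] * len(t2_list)
    starts[-1] = t_last_start
    # Work backward from the last portion: each earlier portion must leave
    # room for its own irradiation time T2 plus the fixed intervals T1 and
    # T3 before the next portion starts.
    for j in range(len(t2_list) - 2, -1, -1):
        starts[j] = starts[j + 1] - (t2_list[j] + t1 + t3)
    return starts
```

Fixing the last irradiation timing and stepping backward in this way yields the timing at which each operation start command must be turned on, which is why the controller needs the full order of the welding target portions.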
(413) In the present exemplary embodiment, the order in which the operation start command SA.sub.i is transmitted to the robots 101.sub.1 to 101.sub.N is determined in advance. However, a configuration in which the robot controller 122.sub.i transmits the second time T2.sub.j,i to the controller 121B together with the synchronizing signal SBA.sub.i, such that the operation start command SA.sub.i is transmitted to a robot 101.sub.i that is in the preparation complete state, may also be employed. The times used for timekeeping in the sequences SM1 and SM2 managed by the controller 121B are the first time T1, the second time T2.sub.j,i, and the third time T3. Among these, the first time T1 and the third time T3 are fixed values and can therefore be stored in the controller 121B in advance. Since the second time T2.sub.j,i varies depending on the welding target portion, the robot controller 122.sub.i may transmit the second time T2.sub.j,i to the controller 121B together with the synchronizing signal SBA.sub.i, and the transmitted second time T2.sub.j,i is used in the sequences SM1 and SM2 activated in accordance with the synchronizing signal SBA.sub.i. With this method, the order in which the operation start command SA.sub.i is transmitted to the robots does not have to be determined in advance, and thus the independence of the robot apparatus 110 can be enhanced.
(414) To be noted, in the tenth exemplary embodiment described above, an abnormality can occur while the robot 101.sub.i is operating in accordance with the linear interpolation command. Therefore, the robot controller 122.sub.i may periodically transmit a signal indicating the state of the robot 101.sub.i to the controller 121B, and the controller 121B may control the laser oscillator 103 not to oscillate the laser beam in the case where a signal indicating that the robot 101.sub.i is in an abnormal state is received.
(415) In the tenth exemplary embodiment, the robot controller 122.sub.i may transmit a permission signal permitting laser oscillation to the controller 121B. For example, the permission signal is turned on while the robot controller 122.sub.i is supplying the trajectory data P.sub.j,i in accordance with the linear interpolation command. The controller 121B may perform an AND calculation of the permission signal and the laser oscillation command and transmit the calculation result to the laser oscillator 103. As a result, the laser beam is not oscillated unless the robot controller 122.sub.i turns on the permission signal. Although this AND calculation is performed by the controller 121B in this example, it may instead be performed by another electronic circuit.
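The interlock amounts to a simple AND gate; the function and signal names below are assumptions for illustration.

```python
def gated_oscillation(laser_command: bool, permission: bool) -> bool:
    """Output sent to the laser oscillator: the oscillation command is
    effective only while the permission signal is on, so the beam cannot
    be oscillated without the robot controller's permission."""
    return laser_command and permission
```

The same logic could equally be realized by a hardware AND gate outside the controller, which is the alternative the text mentions.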
(416) In the tenth exemplary embodiment described above, it is preferable that the robot 101.sub.i is installed in an unillustrated light-shielding booth such that no person is exposed to the laser beam. The laser oscillator 103 is configured to stop the laser oscillation when a person opens a door to enter the booth. In addition, in the case where the optical fiber cable 151.sub.i is not correctly connected to the laser head 102.sub.i and the switcher 104, the wiring in the optical fiber cable 151.sub.i is not connected, and the laser oscillation of the laser oscillator 103 is stopped. Similarly, in the case where the optical fiber cable 152 is not correctly connected to the laser oscillator 103 and the switcher 104, the wiring in the optical fiber cable 152 is not connected, and the laser oscillation of the laser oscillator 103 is stopped. In the case where the optical fiber cables 151.sub.i and 152 are bent to a certain degree or more, the wiring therein is broken and the laser oscillation of the laser oscillator 103 is stopped. It is also possible to configure the laser oscillator 103 to stop the laser oscillation when a person is detected by a safety laser scanner, a safety light curtain, or the like. Further, the laser oscillation can be stopped by monitoring the booth with external hardware in case the robot controller 122.sub.i or the controller 121B stops responding for some reason. For example, a signal that is turned on and off at regular intervals may be output from the robot controller 122.sub.i and the controller 121B, and the laser oscillation may be stopped when the output signal does not change for a certain period.
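The last safeguard, a signal that must keep toggling, can be sketched as a heartbeat watchdog. The class name, sampling interface, and timeout value are assumptions for illustration; the actual monitoring would be done by external hardware.

```python
class HeartbeatWatchdog:
    """Track a heartbeat signal and disallow laser oscillation once the
    signal has been static for longer than the timeout."""

    def __init__(self, timeout):
        self.timeout = timeout       # maximum allowed static period (s)
        self.last_value = None       # last observed heartbeat level
        self.last_change = None      # time of the last level change

    def observe(self, value, now):
        """Record the heartbeat level sampled at time `now` (seconds)."""
        if value != self.last_value:
            self.last_value = value
            self.last_change = now

    def laser_allowed(self, now):
        """Return False if the heartbeat has not changed for too long,
        e.g. because a controller has stopped responding."""
        if self.last_change is None:
            return False
        return (now - self.last_change) <= self.timeout
```

In this sketch, a controller that stops responding stops toggling its output, the watchdog times out, and the laser oscillation is stopped.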
(417) Although a case where the control apparatus 120B is constituted by the controller 121B and the robot controllers 122.sub.1 to 122.sub.N has been described in the tenth exemplary embodiment described above, the configuration of the control apparatus 120B is not limited to this. The control apparatus may be realized by one computer as long as the control apparatus has functions of the controller 121B and the robot controllers 122.sub.1 to 122.sub.N. For example, the control apparatus can be realized by one computer if parallel processing can be performed by a plurality of processors or a plurality of cores included in a processor.
(418) Although a case where the robots 101.sub.1 to 101.sub.N are each a vertically articulated robot has been described in the tenth exemplary embodiment described above, the configuration of the robots 101.sub.1 to 101.sub.N is not limited to this. For example, the robots 101.sub.1 to 101.sub.N may each be a horizontally articulated robot, a parallel link robot, or a cartesian coordinate robot. Further, the robots 101.sub.1 to 101.sub.N may each have a different configuration.
(419) Although a case where a laser processing apparatus performs laser welding has been described in the tenth exemplary embodiment described above, the configuration is not limited to this. For example, the laser processing apparatus may perform laser grooving processing or laser cutting processing.
Eleventh Exemplary Embodiment
(420) Next, a method of producing an image forming apparatus by a laser processing method using the laser processing apparatus according to any one of the first to tenth exemplary embodiments described above will be described.
(421) The image forming apparatus 800 illustrated in
(422) In the image forming apparatus 800, a recording material is conveyed, and an image is formed on the recording material. Image defects or malfunction can therefore occur in the case where the frame body of the image forming apparatus 800 is distorted, and suppressing the distortion of the frame body of the image forming apparatus 800 is important for suppressing such image defects and malfunction.
(423)
(424) First, a prop 904 and a stay 701 that constitute the frame body 900 are prepared. The prop 904 has a first side wall 904A and a second side wall 904B that are each parallel to the vertical direction and cross each other at a right angle. The stay 701 is disposed such that an end portion thereof comes into contact with the first side wall 904A and the second side wall 904B. The first side wall 904A is disposed to extend parallel to the front-rear direction, and the second side wall 904B is disposed to extend parallel to the right-left direction. Therefore, the stay 701 is disposed to be vertically movable in a state in which its position in the front-rear direction and the right-left direction is defined by the first side wall 904A and the second side wall 904B.
(425) After adjusting the position of the stay 701 in the vertical direction, welding target portions 941, 942, 943, and 944 are subjected to laser seam welding. The stay 701 and the prop 904 are fixed to each other by laser welding to produce the frame body 900.
(426) As described above, the frame body 900 of the image forming apparatus 800 includes many welding target portions. According to the laser processing method using the laser processing apparatus according to any one of the first to tenth exemplary embodiments, welding can be performed efficiently, in a short time, and with high precision. As a result, distortion of the frame body 900 can be suppressed, and thus defects of images formed on a sheet and malfunction of the image forming apparatus 800 can be suppressed.
(427) The present invention is not limited to the exemplary embodiments described above, and can be modified in many ways within the technical concept of the present invention. In addition, the effects described in the exemplary embodiments are merely most preferable effects that can be achieved by the present invention, and the effect of the present invention is not limited to the effects described in the exemplary embodiments.
Other Embodiments
(428) Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
(429) While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
(430) This application claims the benefit of Japanese Patent Application No. 2017-162865, filed Aug. 25, 2017, Japanese Patent Application No. 2017-162864, filed Aug. 25, 2017, and Japanese Patent Application No. 2018-134972, filed Jul. 18, 2018, which are hereby incorporated by reference herein in their entirety.