PRODUCTION SYSTEM
20220314455 · 2022-10-06
CPC classification
B25J9/1664 · B23Q17/249 · B25J9/162 · B25J9/1653 · B23Q17/24 · G05B2219/40604 · G05B2219/39394 · B25J9/163
Abstract
A production system includes a machine tool (10), a robot (25) having a camera (31), an automatic guided vehicle (35) having the robot (25) mounted thereon, and a controller (40) controlling the automatic guided vehicle (35) and the robot (25), and has an identification figure arranged in a machining area of the machine tool (10). The controller (40) stores, as a reference image, an image of the identification figure captured by the camera (31) with the robot (25) in an image capturing pose in a teaching operation. When repeatedly operating the automatic guided vehicle (35) and the robot (25), the controller (40) estimates an amount of error between a pose of the robot (25) in the teaching operation and a current pose of the robot (25) based on the reference image and an image of the identification figure captured by the camera (31) with the robot (25) in the image capturing pose, and corrects operating poses of the robot (25) based on the estimated amount of error.
Claims
1. A production system comprising: a machine tool performing predetermined machining on a workpiece; a robot having a camera for image capturing and configured to perform an operation with respect to the machine tool; an automatic guided vehicle having the robot mounted thereon, and configured to move to an operation position set with respect to the machine tool; and a controller configured to, in accordance with an operation program containing a preset operation command, move the automatic guided vehicle to the operation position, then bring the robot from an operation starting pose into an image capturing pose allowing an image of an identification figure for pose correction provided on the machine tool to be captured by the camera, and then bring the robot into operating poses in sequence, the operation starting pose, the image capturing pose, and the operating poses being set in advance by performing a teaching operation to the robot, the identification figure being displayed on a structure provided to be able to change its position within a machining area of the machine tool, or being projected on the structure by means of a projector, the controller previously storing, as a reference image, an image of the identification figure captured by the camera with the robot in the image capturing pose in the teaching operation, and the controller being configured to, when repeatedly operating the automatic guided vehicle and the robot in accordance with the operation program, estimate an amount of error between a pose of the robot in the teaching operation and a current pose of the robot based on the reference image and an image of the identification figure captured by the camera with the robot brought into the image capturing pose from the operation starting pose, and correct the operating poses based on the estimated amount of error.
2. (canceled)
3. The production system of claim 1, wherein the machine tool includes a tool presetter provided to be movable into and out of the machining area, and the structure is the tool presetter.
4. The production system of claim 1, wherein the machine tool includes a tool spindle configured to hold a tool, and the structure is a holder provided to be attachable to and detachable from the tool spindle.
5. A production system comprising: a machine tool performing predetermined machining on a workpiece; a robot having a camera for image capturing and configured to perform an operation with respect to the machine tool; an automatic guided vehicle having the robot mounted thereon, and configured to move to an operation position set with respect to the machine tool; and a controller configured to, in accordance with an operation program containing a preset operation command, move the automatic guided vehicle to the operation position, then bring the robot from an operation starting pose into an image capturing pose allowing an image of an identification figure for pose correction provided on the machine tool to be captured by the camera, and then bring the robot into operating poses in sequence, the operation starting pose, the image capturing pose, and the operating poses being set in advance by performing a teaching operation to the robot, the identification figure being displayed on a display provided to be arrangeable in a machining area of the machine tool, the controller previously storing, as a reference image, an image of the identification figure captured by the camera with the robot in the image capturing pose in the teaching operation, and the controller being configured to, when repeatedly operating the automatic guided vehicle and the robot in accordance with the operation program, estimate an amount of error between a pose of the robot in the teaching operation and a current pose of the robot based on the reference image and an image of the identification figure captured by the camera with the robot brought into the image capturing pose from the operation starting pose, and correct the operating poses based on the estimated amount of error.
6. The production system of claim 5, wherein the machine tool includes a tool presetter provided to be movable into and out of the machining area, and the display is attached to the tool presetter.
7. The production system of claim 5, wherein the machine tool includes a tool spindle configured to hold a tool, and the display is attached to a holder provided to be attachable to and detachable from the tool spindle.
8. The production system of claim 1, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
9. The production system of claim 3, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
10. The production system of claim 4, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
11. The production system of claim 5, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
12. The production system of claim 6, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
13. The production system of claim 7, wherein the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0048] Hereinafter, specific embodiments of the present invention will be described with reference to the drawings.
First Embodiment
[0049] Firstly, a first embodiment of the present invention is described. As illustrated in
[0050] As illustrated in
[0051] The material storage 20 is disposed on the left of the machine tool 10 in
[0052] As illustrated in
[0053] Further, the automatic guided vehicle 35 has a sensor (for example, a distance measurement sensor using a laser beam) which enables recognition of the position of the automatic guided vehicle 35 in a plant, and the automatic guided vehicle 35 is configured to travel tracklessly in the plant, including the area where the machine tool 10, the material storage 20, and the product storage 21 are disposed, under control by the controller 40. The automatic guided vehicle 35 in this embodiment moves to operation positions respectively set with respect to the machine tool 10, the material storage 20, and the product storage 21.
[0054] The robot 25 is an articulated robot having three arms, namely, a first arm 26, a second arm 27, and a third arm 28. The third arm 28 has a hand 29 as an end effector attached to a distal end thereof, and also has two cameras 31 attached to the distal end thereof through a support bar 30.
[0055] As illustrated in
[0056] Note that the controller 40 is composed of a computer including a CPU, a RAM, and a ROM. The manual-operation control unit 46, the automatic-operation control unit 47, the map information generator 48, the position recognition unit 49, the correction amount calculator 50, and the input and output interface 51 are functionally implemented by a computer program to carry out the processes described later. The operation program storage 41, the moving position storage 42, the operation pose storage 43, the map information storage 44, and the reference image storage 45 are composed of an appropriate storage medium, such as a RAM.
[0057] The manual-operation control unit 46 is a functional unit that operates the automatic guided vehicle 35, the robot 25, and the cameras 31 in accordance with operation signals input through the operation panel 37 by an operator. That is to say, an operator can manually operate the automatic guided vehicle 35, the robot 25, and the cameras 31 through the operation panel 37 under control by the manual-operation control unit 46.
[0058] The operation program storage 41 is a functional unit that stores therein an automatic operation program for causing the automatic guided vehicle 35 and the robot 25 to operate automatically during production, and a map generation program for causing the automatic guided vehicle 35 to operate during generation of map information of the plant, which is described later. The automatic operation program and the map generation program are stored into the operation program storage 41, for example, by being input through the input and output unit of the operation panel 37.
[0059] The automatic operation program contains command codes regarding a moving position as a target position to which the automatic guided vehicle 35 is moved, a moving speed of the automatic guided vehicle 35, and an orientation of the automatic guided vehicle 35. The automatic operation program further contains command codes regarding operations to be carried out in sequence by the robot 25 and command codes regarding operation of the cameras 31. The map generation program contains command codes for causing the automatic guided vehicle 35 to travel tracklessly all over the plant to cause the map information generator 48 to generate map information.
[0060] The map information storage 44 is a functional unit that stores therein map information including information on arrangement of machines, devices, instruments, etc. (hereinafter, collectively referred to as “devices”) arranged in the plant where the automatic guided vehicle 35 travels. The map information is generated by the map information generator 48.
[0061] The map information generator 48 obtains spatial information of the plant from distance data detected by the sensor when the automatic guided vehicle 35 travels in accordance with the map generation program stored in the operation program storage 41 under control by the automatic-operation control unit 47, which is described in detail later, of the controller 40. The map information generator 48 also recognizes planar shapes of the devices arranged in the plant, and, for example, based on previously registered planar shapes of the devices, recognizes the positions, planar shapes, etc. of particular devices (in this embodiment, the machine tool 10, the material storage 20, and the product storage 21) arranged in the plant (arrangement information). The map information generator 48 stores the obtained spatial information and arrangement information as map information of the plant into the map information storage 44.
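The mapping step described above can be sketched as follows. This is an illustrative sketch only: the occupancy-grid representation, the cell size, and the scan format are assumptions for the example, not details taken from the embodiment.

```python
import math

# Toy sketch of how the map information generator might turn distance
# (range) readings from the vehicle's sensor into spatial information.
# The grid cell size and the (bearing, distance) scan format are assumed.

CELL = 0.5  # cell size in metres (assumed)

def mark_scan(grid, pose, scan):
    """Mark cells hit by a laser scan as occupied.

    grid  -- dict mapping (ix, iy) -> "occupied"
    pose  -- (x, y, heading) of the automatic guided vehicle
    scan  -- list of (bearing, distance) readings from the sensor
    """
    x, y, heading = pose
    for bearing, dist in scan:
        # Convert each range reading to a point in plant coordinates.
        px = x + dist * math.cos(heading + bearing)
        py = y + dist * math.sin(heading + bearing)
        grid[(int(px // CELL), int(py // CELL))] = "occupied"
    return grid

# Two readings taken with the vehicle at the plant origin, facing +x.
grid = mark_scan({}, (0.0, 0.0, 0.0), [(0.0, 2.0), (math.pi / 2, 1.0)])
```

The arrangement-information step (matching previously registered planar shapes of the machine tool, material storage, and product storage against the grid) would run on top of such a representation.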
[0062] The position recognition unit 49 is a functional unit that recognizes the position of the automatic guided vehicle 35 in the plant based on distance data detected by the sensor and the map information of the plant stored in the map information storage 44. The position of the automatic guided vehicle 35 recognized by the position recognition unit 49 is used in control of operation of the automatic guided vehicle 35 by the automatic-operation control unit 47.
[0063] The moving position storage 42 is a functional unit that stores therein specific moving positions. The moving positions are specific target positions to which the automatic guided vehicle 35 is moved, and correspond to the above-mentioned command codes contained in the operation program. The moving positions include the above-mentioned operation positions set with respect to the machine tool 10, the material storage 20, and the product storage 21. Note that the moving positions are set, for example, as follows: the automatic guided vehicle 35 is manually operated through the operation panel 37 such that it is moved to each targeted position under control by the manual-operation control unit 46, and position data recognized by the position recognition unit 49 at each targeted position is stored into the moving position storage 42. This operation is generally called “teaching operation”.
[0064] The operation pose storage 43 is a functional unit that stores therein data regarding poses (operation poses) of the robot 25, into which the robot 25 is brought in sequence when it is operated in a predetermined sequence. The operation poses correspond to the command codes contained in the operation program. This operation pose data is composed of rotational angle data of joints (motors) of the robot 25 in each targeted pose. This rotational angle data is obtained by, in the teaching operation using the operation panel 37, manually operating the robot 25 such that the robot 25 is brought into each targeted pose under control by the manual-operation control unit 46. The obtained rotational angle data is stored as operation pose data into the operation pose storage 43.
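One plausible shape for the operation pose storage is sketched below: each taught pose is recorded under a name as the rotational angles read back from the robot's joints during the teaching operation. The class and field names are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the operation pose storage 43: taught poses
# stored as joint rotational angle data. Names here are assumptions.

@dataclass
class TaughtPose:
    name: str            # e.g. "gripping pose"
    joint_angles: list   # rotational angle (degrees) of each joint motor

@dataclass
class OperationPoseStorage:
    poses: dict = field(default_factory=dict)

    def teach(self, name, joint_angles):
        # In the teaching operation, the angles reached under manual
        # control are read from the joint encoders and stored by name.
        self.poses[name] = TaughtPose(name, list(joint_angles))

    def recall(self, name):
        return self.poses[name].joint_angles

storage = OperationPoseStorage()
storage.teach("extraction starting pose", [0.0, -45.0, 90.0, 0.0, 45.0, 0.0])
```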
[0065] Specific operation poses of the robot 25 are set with respect to each of the material storage 20, machine tool 10, and product storage 21. For example, a set of extraction poses is set with respect to the material storage 20, the set of extraction poses consisting of an operation starting pose (extraction starting pose) for starting an operation with respect to the material storage 20, operating poses (extracting poses) for causing the hand 29 to grip an unmachined workpiece stored in the material storage 20 and extract the unmachined workpiece from the material storage 20, and a pose for finishing the extraction (extraction finishing pose; in this embodiment, this pose is identical to the extraction starting pose).
[0066] A set of workpiece-removal poses for removing a machined workpiece from the machine tool 10 and a set of workpiece-attachment poses for attaching an unmachined workpiece to the machine tool 10 are set with respect to the machine tool 10.
[0067] Specifically, the set of workpiece-removal poses consists of, for example, an operation starting pose preceding entrance into the machine tool 10 (see
[0068] The set of workpiece-attachment poses consists of, for example, an operation starting pose preceding insertion into the machine tool 10 (see
[0069] A set of storage poses is set with respect to the product storage 21, the set of storage poses consisting of an operation starting pose for starting an operation with respect to the product storage 21 (storage starting pose), operating poses for storing a machined workpiece gripped by the hand 29 into the product storage 21 (storing poses), and a pose for finishing the storage (storage finishing pose; in this embodiment, this pose is identical to the storage starting pose).
[0070] The automatic-operation control unit 47 is a functional unit that operates the automatic guided vehicle 35, the robot 25, and the cameras 31 in accordance with the automatic operation program or map generation program stored in the operation program storage 41. In this process, the data stored in the moving position storage 42 and the operation pose storage 43 are used as necessary.
[0071] The reference image storage 45 is a functional unit that stores therein, as reference images, images of the identification figure provided on the support bar 15 of the tool presetter 13 captured by the two cameras 31 when the automatic guided vehicle 35 is at the operation position set with respect to the machine tool 10 and the robot 25 is in the image capturing pose in the teaching operation.
[0072] Once images of the identification figure are captured by the two cameras 31 with the robot 25 in the image capturing pose when the robot 25 operates automatically in accordance with the automatic operation program stored in the operation program storage 41 under control by the automatic-operation control unit 47, the correction amount calculator 50 estimates an amount of error between the image capturing pose in this automatic operation and the image capturing pose in the teaching operation based on the images of the identification figure captured in this automatic operation and the reference images stored in the reference image storage 45, and calculates a correction amount for the set of workpiece-removal poses and the set of workpiece-attachment poses of the robot 25 based on the estimated amount of error. An example of the images of the identification figure captured in the automatic operation is shown in
[0073] The cameras 31 are of the type called “stereo camera”; therefore, it is possible to calculate a relative positional relation between the cameras 31 and the identification figure and angles of rotation, e.g., angles of rotation around three orthogonal axes, of the cameras 31 with respect to the identification figure based on images captured by the cameras 31. Accordingly, it is possible to estimate an amount of error between the image capturing pose in the teaching operation and the image capturing pose in the automatic operation based on the positional relation and angles of rotation calculated based on the reference images and the positional relation and angles of rotation calculated based on images captured in the automatic operation.
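The estimation just described can be sketched with homogeneous transforms. Assuming the stereo images have already been reduced to 4x4 camera-to-figure transforms (the reduction itself, and the composition convention below, are assumptions for illustration), the error between the taught and current image capturing poses is the transform relating the two:

```python
import numpy as np

# Minimal sketch of the error estimation in the correction amount
# calculator 50, assuming each stereo capture yields a 4x4 homogeneous
# camera-to-figure transform. Units and conventions are illustrative.

def pose_error(T_ref, T_cur):
    """Transform taking the taught camera pose to the current one."""
    return T_cur @ np.linalg.inv(T_ref)

def translation(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

# Example: the vehicle stopped 5 mm off in x relative to the teaching
# operation; the estimated error transform recovers that offset.
T_ref = translation(0.0, 0.0, 300.0)   # taught: 300 mm from the figure
T_cur = translation(5.0, 0.0, 300.0)   # current run: shifted 5 mm in x
T_err = pose_error(T_ref, T_cur)
```

In practice the transforms would also carry the rotations around the three orthogonal axes mentioned above, and the same composition applies unchanged.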
[0074] In the production system 1 according to this embodiment having the above-described configuration, unmanned and automated production is performed in the following manner.
[0075] That is to say, the automatic operation program stored in the operation program storage 41 is executed under control by the automatic-operation control unit 47 of the controller 40, so that, for example, the automatic guided vehicle 35 and the robot 25 operate in the following manner in accordance with the automatic operation program.
[0076] First, the automatic guided vehicle 35 moves to the operation position set with respect to the machine tool 10, and the robot 25 assumes the operation starting pose for the workpiece removal. At this time, the machine tool 10 has finished a predetermined machining operation, and a door cover thereof has been opened so that the robot 25 can enter the machining area. Further, the support bar 15 of the tool presetter 13 has been moved into the machining area upon receipt of a command from the automatic-operation control unit 47.
[0077] Subsequently, the robot 25 shifts to the image capturing pose to cause the cameras 31 to capture images of the identification figure provided on the support bar 15. Once the images of the identification figure are captured by the cameras 31, the correction amount calculator 50 estimates an amount of error between the current image capturing pose of the robot 25 and the image capturing pose of the robot 25 in the teaching operation based on the captured images of the identification figure and the reference images stored in the reference image storage 45, and calculates a correction amount for the rest of the set of workpiece-removal poses of the robot 25 based on the estimated amount of error.
[0078] Based on the correction amount calculated by the correction amount calculator 50, the automatic-operation control unit 47 controls the rest of the set of workpiece-removal poses, namely, the removal preparing pose, the gripping pose, the pulling pose, and the operation finishing pose, so that a machined workpiece clamped by the chuck of the machine tool 10 is gripped by the hand 29 and removed from the machine tool 10. Note that, after bringing the robot 25 into the gripping pose, the automatic-operation control unit 47 transmits a chuck open command to the machine tool 10 to open the chuck.
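One way to apply the calculated correction amount to the remaining taught poses is to compose each target with the inverse of the estimated error transform, so that the positional offset cancels. The sketch below assumes the targets are expressed as 4x4 homogeneous transforms in the robot base frame; that representation is an illustrative assumption, not the embodiment's actual data format.

```python
import numpy as np

# Hedged sketch: correcting the remaining workpiece-removal poses (removal
# preparing pose, gripping pose, pulling pose, operation finishing pose)
# with an estimated error transform. Representation is assumed.

def correct_poses(taught_poses, T_err):
    """Shift each taught target so the estimated error cancels out."""
    T_fix = np.linalg.inv(T_err)
    return [T_fix @ T for T in taught_poses]

def translation(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

T_err = translation(5.0, 0.0, 0.0)          # robot sits 5 mm off in x
gripping = translation(100.0, 0.0, 200.0)   # taught gripping target
corrected, = correct_poses([gripping], T_err)
```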
[0079] Subsequently, the automatic-operation control unit 47 moves the automatic guided vehicle 35 to the operation position set with respect to the product storage 21 and brings the robot 25 in sequence into the storage starting pose for starting storage with respect to the product storage 21, the storing poses for storing the machined workpiece gripped by the hand 29 into the product storage 21, and the storage finishing pose for finishing the storage. Thereby, the machined workpiece gripped by the hand 29 is stored into the product storage 21.
[0080] Subsequently, the automatic-operation control unit 47 moves the automatic guided vehicle 35 to the operation position set with respect to the material storage 20 and brings the robot 25 in sequence into the extraction starting pose for starting extraction with respect to the material storage 20, the extracting poses for causing the hand 29 to grip an unmachined workpiece stored in the material storage 20 and extract the unmachined workpiece from the material storage 20, and the extraction finishing pose for finishing the extraction. Thereby, an unmachined workpiece is gripped by the hand 29.
[0081] Subsequently, the automatic-operation control unit 47 moves the automatic guided vehicle 35 to the operation position set with respect to the machine tool 10 again, and brings the robot 25 into the operation starting pose for the workpiece attachment. Subsequently, the automatic-operation control unit 47 brings the robot 25 into the image capturing pose to cause the cameras 31 to capture images of the identification figure provided on the support bar 15. Once the images of the identification figure are captured by the cameras 31, the correction amount calculator 50 estimates an amount of error between the current image capturing pose of the robot 25 and the image capturing pose of the robot 25 in the teaching operation based on the captured images of the identification figure and the reference images stored in the reference image storage 45, and calculates a correction amount for the rest of the set of workpiece-attachment poses of the robot 25 based on the estimated amount of error.
[0082] Based on the correction amount calculated by the correction amount calculator 50, the automatic-operation control unit 47 controls the rest of the set of workpiece-attachment poses, namely, the attachment preparing pose, the attaching pose, the moving-away pose, and the operation finishing pose, of the robot 25, thereby causing the robot 25 to attach the unmachined workpiece gripped by the hand 29 to the chuck of the machine tool 10 and then move out of the machine tool 10. Thereafter, the automatic-operation control unit 47 transmits a machining start command to the machine tool 10 to cause the machine tool 10 to perform a machining operation. Note that, after bringing the robot 25 into the attaching pose, the automatic-operation control unit 47 transmits a chuck close command to the machine tool 10 to close the chuck, so that the unmachined workpiece is clamped by the chuck.
[0083] In the production system 1 according to this embodiment, unmanned and automated production is continuously performed by repeating the above-described series of processes.
[0084] As described above, the production system 1 according to this embodiment is configured to correct the operating poses of the robot 25 using the identification figure that is arranged in the machining area of the machine tool 10 where the robot 25 actually performs the operations; therefore, the operating poses are accurately corrected. This enables the robot 25 to accurately perform even an operation which requires high operating accuracy.
[0085] Since the robot 25 accurately performs the operations, the production system 1 operates with high availability without unnecessary interruption. Consequently, the production system 1 achieves an unmanned production system with high reliability and high production efficiency.
[0086] Further, the identification figure in this embodiment is provided on the support bar 15 of the tool presetter 13 that is stored outside the machining area while machining is performed by the machine tool 10; therefore, the identification figure is prevented from being soiled by chips or the like produced during machining. Consequently, the correction is carried out accurately.
[0087] Note that the identification figure in this example is provided by adhering a sheet with the identification figure drawn thereon to the display board 16 provided on the support bar 15; however, the present invention is not limited thereto and the identification figure may be drawn directly on the display board 16.
Second Embodiment
[0088] Next, a second embodiment of the present invention is described. In this second embodiment, as shown in
[0089] Similarly to the tool presetter 13, the holder 17 can be stored outside the machining area while machining is performed by the machine tool 10. Therefore, the identification figure can be prevented from being soiled by chips or the like produced during machining; consequently, the correction is carried out accurately.
[0090] As for the manner of providing the identification figure on the holder 17, the identification figure may be provided by adhering a sheet with the identification figure drawn thereon to the holder 17 or may be provided by drawing the identification figure directly on the holder 17.
Third Embodiment
[0091] Next, a third embodiment of the present invention is described. In this third embodiment, as shown in
[0092] Alternatively, as a variation of this third embodiment, as shown in
Fourth Embodiment
[0093] Next, a fourth embodiment of the present invention is described. In this fourth embodiment, as shown in
[0094] Alternatively, as a variation of this fourth embodiment, as shown in
[0095] Hereinbefore, the first through fourth embodiments of the present invention have been described. However, it should be understood that the present invention is not limited to these embodiments and can be implemented in other manners.
[0096] For example, in the above-described embodiments, the identification figure has a matrix structure having a plurality of pixels arranged two-dimensionally. However, the identification figure is not limited to such a figure and may be any other type of figure which is such that a captured image thereof is usable for calculation of a correction amount for poses of the robot 25. Further, in the above-described embodiments, two cameras 31 are provided. However, the present invention is not limited thereto and the production system may include only one camera 31 if it is possible to calculate a correction amount for poses of the robot 25 based on an image captured by the camera 31. Further, in the above-described embodiments, the object on which the identification figure is placed, projected or displayed is a holder or a tool presetter. However, the present invention is not limited thereto and the object may be any other object which is arranged in the machining area, such as a tailstock, a bed, or a table.
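A figure of the matrix type described above (comparable to ArUco-style fiducial markers) encodes a value in a two-dimensional grid of dark and light cells. The toy encoder and decoder below illustrate the idea only; the grid size and bit layout are assumptions and not the figure actually used in the embodiments.

```python
# Toy illustration of a "matrix structure" identification figure: a value
# encoded as bits in a 4x4 grid of cells and read back. Grid size and
# bit ordering are assumptions made for this example.

SIZE = 4  # cells per side (assumed)

def encode(marker_id):
    """Lay the low SIZE*SIZE bits of marker_id out row by row."""
    bits = [(marker_id >> i) & 1 for i in range(SIZE * SIZE)]
    return [bits[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

def decode(grid):
    """Recover the encoded value from the cell grid."""
    marker_id = 0
    for r, row in enumerate(grid):
        for c, bit in enumerate(row):
            marker_id |= bit << (r * SIZE + c)
    return marker_id

grid = encode(2022)
```

Real matrix markers add an orientation border and error-detecting bits so the figure can be located and read at an angle, which is what makes them suitable for the pose estimation described above.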
[0097] As already mentioned above, the foregoing description of the embodiments is not limitative but illustrative in all aspects. One skilled in the art would be able to make variations and modifications as appropriate. The scope of the invention is not defined by the above-described embodiments, but is defined by the appended claims. Further, the scope of the invention encompasses all modifications made from the embodiments within the scope equivalent to the scope of the claims.
REFERENCE SIGNS LIST
[0098]
1 Production system
10 Machine tool
11 Workpiece spindle
12 Tool spindle
13 Tool presetter
14 Contactor
15 Support bar
16 Display board
20 Material storage
21 Product storage
25 Robot
29 Hand
31 Camera
35 Automatic guided vehicle
37 Operation panel
40 Controller
41 Operation program storage
42 Moving position storage
43 Operation pose storage
44 Map information storage
45 Reference image storage
46 Manual-operation control unit
47 Automatic-operation control unit
48 Map information generator
49 Position recognition unit
50 Correction amount calculator
51 Input and output interface