WORKER TERMINAL FOR ROBOT OPERATION
20170136627 · 2017-05-18
Assignee
Inventors
- Ryuichiro TAKAICHI (Otsu-shi, SHIGA, JP)
- Yasushi KAWASHIMA (Kusatsu-shi, SHIGA, JP)
- Masayoshi ABE (Kyoto-shi, Kyoto, JP)
- Takayuki EDA (Joyo-shi, KYOTO, JP)
CPC classification
B25J9/1694
PERFORMING OPERATIONS; TRANSPORTING
B25J9/0084
PERFORMING OPERATIONS; TRANSPORTING
B25J9/0096
PERFORMING OPERATIONS; TRANSPORTING
B25J13/06
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1656
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/40413
PHYSICS
International classification
Abstract
A worker terminal that sends operation commands to a robot used in work includes: a first sensor that detects a muscle potential of a worker; a second sensor that detects a head movement of the worker; a processing unit that determines, on the basis of detection results of the first sensor and the second sensor, whether or not an operation instruction, defined by a combination of the head movement and a change in the muscle potential, has been input by the worker; and a communications unit that sends an operation command to the robot if a determination has been made that an operation instruction has been input by the worker.
Claims
1. A worker terminal that sends operation commands to a robot used in work, the worker terminal comprising: a first sensor that detects a muscle potential of a worker; a second sensor that detects a head movement of the worker; a processing unit that determines, on the basis of detection results of the first sensor and the second sensor, whether or not an operation instruction, defined by a combination of the head movement and a change in the muscle potential, has been input by the worker; and a communications unit that sends an operation command to the robot if a determination has been made that an operation instruction has been input by the worker.
2. The worker terminal according to claim 1, further comprising a prompt unit that prompts the change in the muscle potential sensed by the first sensor and the head movement sensed by the second sensor to the worker.
3. The worker terminal according to claim 2, wherein when the change in the muscle potential satisfies a first condition and the head movement satisfies a second condition, the processing unit judges that the operation instruction has been input, and the prompt unit prompts the first condition and the second condition to the worker.
4. The worker terminal according to claim 3, wherein the processing unit judges that the operation instruction has been input when the head movement satisfies the second condition within a specified limit time after the muscle potential satisfies the first condition, and when the muscle potential satisfies the first condition, the prompt unit further prompts a countdown of the limit time for inputting the head movement.
5. The worker terminal according to claim 3, further comprising a setting change unit that allows the worker to change the first condition and/or the second condition.
6. The worker terminal according to claim 2, wherein the worker terminal comprises a see-through head-mounted display, and the prompt unit is a graphic image displayed on the head-mounted display.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025]
[0026]
[0027]
[0028]
[0029]
[0030]
[0031]
[0032]
[0033]
[0034]
DESCRIPTION OF THE EMBODIMENTS
[0035] The invention relates to a technology that enables a human to reliably send an operation command to a robot used in work at the required timing, and is especially suitable for a new form of production in which humans and robots cooperate on a single piece of work while each plays to its own strengths. In the following embodiment, an example in which the invention is applied to a cell production line for product assembly, inspection and packing is described.
[0036] (Cell Production Line)
[0037] In the cell production mode, working platforms for various procedures such as machining, assembling, inspection and packing are arranged so as to surround a work space for the workers, and one or several workers manufacture products while moving between the working platforms. The arrangement of the working platforms varies, but the most common form arranges the working platforms in a U shape.
[0038]
[0039] The working platforms 11-16 are places where various operation procedures such as (1) parts assembling, (2) cable bundling, (3) screwing, (4) inspection, (5) packing preparation and (6) packing and moving out are performed respectively. (1) parts assembling, (2) cable bundling and (6) packing and moving out are undertaken by the worker 10a, and (3) screwing, (4) inspection and (5) packing preparation are undertaken by the worker 10b.
[0040] (1) Parts Assembling
[0041] On the working platform 11, the robot 11a picks up parts from a part rack according to an instruction (signal) of the worker 10a, and transmits parts to the worker 10a (or configures the parts in specified positions). The worker 10a assembles the parts inside the housing of the workpiece, and moves towards the working platform 12 along with the workpiece.
[0042] (2) Cable Bundling
[0043] On the working platform 12, the robot 12a picks up a bundling band from a stocker according to an instruction (signal) of the worker 10a, and transmits the bundling band to the worker 10a. The worker 10a clusters cables inside the housing of the workpiece and bundles them with a bundling band. Then, the worker 10a hands over the workpiece to the next working platform 13.
[0044] (3) Screwing
[0045] On the working platform 13, the worker 10b sends an instruction (signal) to the robot 13a in a state in which the workpiece is placed in a specified position and the parts to be screwed are pressed or the cables are held fixed. Then, the robot 13a vertically lowers an electric driver and performs screwing. When there are multiple screwing positions, the same work is repeated.
[0046] (4) Inspection
[0047] On the working platform 14, the worker 10b visually inspects the screwed workpiece. In this example, the assembly of the parts, the bundling of the cables, the fastening of the screws, and stains and scratches on the appearance of the workpiece are checked, and if there is no problem, the workpiece is placed on a finishing stand between the working platform 14 and the working platform 15.
[0048] (5) Packing Preparation
[0049] On the working platform 15, after the worker 10b assembles a packing box and places it in a specified position, when an instruction (signal) is sent to the robot 15a, the robot 15a picks up the workpiece from the finishing stand, places it in the packing box, and places a packing component into the packing box.
[0050] (6) Packing and Moving Out
[0051] On the working platform 16, after the worker 10a folds the upper cover of the packing box and fixes it, when an instruction (signal) is sent to the robot 16a, the robot 16a, after clamping the upper cover of the packing box, places the box on a specified move-out rack.
[0052] As described above, the two workers 10a and 10b, while moving between the working platforms, work cooperatively with the robots as needed to assemble and pack the products. In addition, the composition of the cell production line, the number of working platforms or workers, the work contents, and the division of work between the workers and the robots described herein are merely one example.
[0053] However, when this production mode is implemented, there are several issues to be solved regarding the interaction between the workers and the robots. The first issue is to provide a mechanism that conveys a worker's instruction (signal) to the robot at the required timing (that is, the timing at which the worker has completed preparations). The second issue is to provide an operating interface that sends instructions to the robot within the worker's natural flow of motion (that is, with almost no loss of the worker's movement or time). These mechanisms are important for the workers and the robots to cooperate tacitly and achieve efficient and accurate production.
[0054] In the robot control system of the embodiment, to solve these issues, a configuration is employed in which the worker uses a worker terminal with a wireless communication function and sends an operation command (operation trigger) to the robot through wireless communication. The specific configuration is described below.
[0055] (Working Platform and Robot)
[0056] The composition of the working platforms (11, 12, 13, 15 and 16 in
[0057] The working platform 20 is formed by connecting a metal tube with a joint and assembling a top plate 21 or a required frame plate. A horizontal rail 22 is disposed on an upper unit of the working platform 20, and a robot 23 is mounted to the horizontal rail 22.
[0058] The robot 23 of the embodiment does not need functions as advanced as those of a dual-arm robot, as long as it can perform simple assistive operations such as transferring objects or screwing as described above. Therefore, a simple and low-cost robot (e.g., a single-arm multi-joint robot) can be used. To realize cooperative work between the worker and the robot, the robot 23 is preferably disposed on an upper part or the top plate of the working platform 20 or the like, according to the worker's movement route and the need to secure a work space. The robot 23 can also be made lightweight by simplifying its functions and structure, which makes it easy to dispose the robot 23 on the horizontal rail 22 or the top plate (not shown) of the working platform 20.
[0059] The robot 23 has an identification color display unit 24. Identification colors (e.g., red, green, blue, dark red, yellow) different from each other for identifying robots are assigned to the five robots (11a, 12a, 13a, 15a, 16a) shown in
[0060] Moreover, an integrated circuit (IC) tag 25 as a robot identification mark is mounted to the top plate 21 of the working platform 20. A robot identification (ID) assigned to the robot 23 is recorded on the IC tag 25. For the robot ID, like the identification color, different IDs are assigned to respective robots to identify the five robots.
[0061] Moreover, the working platform 20 is provided with a human-sensing sensor 26. The human-sensing sensor 26 is a sensor for sensing whether the worker is near the working platform 20 (that is to say, within an available range of the robot 23). For example, an infrared sensor, a scattered reflection sensor or the like can be used.
[0062] (Worker Terminal)
[0063] Next, the composition of the worker terminal used by the worker is described. In the embodiment, a wearable worker terminal available for the worker to wear is used. Specifically, the worker terminal includes a head-mounted head unit (
[0064] (1) Head Unit
[0065] As shown in
[0066] The head unit body 31 is provided with a power switch 33, a front camera 34, and a gyro sensor 35. Moreover, a computer (control unit) with functions of a signal processing/image processing unit, a central processing unit, a storage unit, a wireless communications unit and so on is disposed in the head unit body 31, which will be described in detail in
[0067] The power switch 33 is a switch for switching power ON/OFF of the head unit 30, and is configured in a position that may not be incorrectly touched by the worker in work such as an edge of a helmet. The front camera 34 is a camera that shoots an image in a gazing direction (the direction that the head faces directly) of the worker. When it is of a video see through type, the image acquired from the front camera 34 is displayed on the see-through display 32. The gyro sensor 35 is an angular velocity sensor for sensing a head movement of the worker and is mounted to the top of the head. In addition, a three-axis acceleration sensor may also be disposed in place of the gyro sensor 35 or disposed together with the gyro sensor 35. The acceleration sensor may also be used for sensing the head movement.
[0068] The see-through display 32 is provided with a target robot prompt unit 36 along an edge thereof. The target robot prompt unit 36 is a luminous body, for example an LED, that lights up in the identification color of the robot that is the cooperation target (the target robot). In the embodiment, the see-through display 32 and the target robot prompt unit 36 are separate devices, but the see-through display 32 may also serve as the target robot prompt unit by making a part of it display an image in the same color as the identification color.
[0069] (2) Arm Unit
[0070] As shown in
[0071] The arm unit body 41 is provided with a power switch 43, a muscle potential sensor 44, a Radio Frequency Identification (RFID) reader 45 and a target robot prompt unit 46. Moreover, a computer (control unit) with functions of a signal processing unit, a central processing unit, a storage unit, a wireless communications unit and so on is disposed in the arm unit body 41, which will be described in detail in
[0072] The power switch 43 is a switch for switching power ON/OFF of the arm unit 40, and configured in a position that may not be incorrectly touched by the worker in work such as on an inner side surface of the arm unit body 41. The muscle potential sensor 44 is a sensor that senses a muscle potential of the front arm of the worker, and mounted to a part of the arm unit body 41 in contact with a skin surface of the front arm. The RFID reader 45 is a sensor for reading a robot ID from the IC tag 25 (refer to
[0073] The fixing band 42 includes a front arm band 42a wound around the front arm, an upper arm band 42b wound around the upper arm, and a connecting band 42c that elastically connects the front arm band 42a with the upper arm band 42b. The structure of the fixing band 42 serves to fix the arm unit body 41 so that it does not detach from the front arm, to mount the arm unit body 41 on the front arm in the correct orientation, and to ensure that only one arm unit 40 is mounted. The fixing band 42 thus physically prevents the arm unit 40 from being mounted incorrectly, improving the safety of cooperative work with the robot in this system.
[0074] (Functional Composition)
[0075]
[0076] The robot 23 includes a central processing unit 230, a storage unit 231, a driving unit 232, a sensor unit 233, an actuating unit 234 and a wireless communications unit 235. The central processing unit 230 is a processor that performs various operational processing or makes control over blocks of the robot 23 by reading and executing a program stored in the storage unit 231. The storage unit 231 includes a non-volatile memory that stores a program such as firmware or various set parameters, and a volatile memory that can be used as a working memory of the central processing unit 230.
[0077] The driving unit 232 is a control circuit including a circuit that inputs a sensor signal from the sensor unit 233 and a circuit that outputs a driving signal to the actuating unit 234. The sensor unit 233 is an input device for acquiring information used in the control of the robot 23. The actuating unit 234 is an output device that drives the arm, the hand, the tool and the like of the robot 23. The sensor unit 233 may include various types of sensors, such as a light sensor, a sound sensor, a vibration sensor, a temperature sensor, a force sensor (tactile sensor) and a distance sensor, and the number and types of sensors can be chosen according to the configuration or operation content of the robot 23. Likewise, the actuating unit 234 may include various types of actuators, such as a servo motor, a linear actuator and a solenoid, and the number and types of actuators can be chosen according to the configuration or operation content of the robot 23. The wireless communications unit 235 is a module for conducting wireless communication with the head unit 30 of the worker terminal.
[0078] The head unit 30 of the worker terminal includes the see-through display 32, the front camera 34, the gyro sensor 35, a signal processing/image processing unit 300, a central processing unit 301, a storage unit 302, a wireless communications unit 303 and the target robot prompt unit 36. The signal processing/image processing unit 300 is a circuit that inputs the sensor signal of the gyro sensor 35 and the image signal of the front camera 34 and performs amplification, filtering, analog-digital (AD) conversion and the like. The central processing unit 301 is a processor that performs various operational processing and controls the see-through display 32, the wireless communications unit 303, the target robot prompt unit 36 and the like by reading and executing a program stored in the storage unit 302. The storage unit 302 includes a non-volatile memory that stores a program such as firmware, set parameters such as threshold values set by the worker, and reference image data for gesture identification, and a volatile memory that can be used as a working memory of the central processing unit 301. The wireless communications unit 303 is a module for conducting wireless communication with the arm unit 40 and the robot 23.
[0079] The arm unit 40 of the worker terminal includes the muscle potential sensor 44, the RFID reader 45, a signal processing unit 400, a central processing unit 401, a storage unit 402, a wireless communications unit 403 and the target robot prompt unit 46. The signal processing unit 400 is a circuit that inputs the sensor signal of the muscle potential sensor 44 and performs amplification, filtering, AD conversion and the like. The central processing unit 401 is a processor that performs various operational processing and controls the wireless communications unit 403, the target robot prompt unit 46 and the like by reading and executing a program stored in the storage unit 402. The storage unit 402 includes a non-volatile memory that stores a program such as firmware or various set parameters, and a volatile memory that can be used as a working memory of the central processing unit 401. The wireless communications unit 403 is a module for conducting wireless communication with the head unit 30.
[0080] The wireless communication between the robot 23, the head unit 30 and the arm unit 40 may be conducted in any manner. For example, IEEE 802.11 (Institute of Electrical and Electronics Engineers), IEEE 802.15, infrared communication and the like are suitable.
[0081] (Identification and Control Over Target Robots)
[0082] Next, a flow of identifying and controlling target robots in the robot control system in the embodiment is described with reference to the sequence diagram of
[0083] Firstly, the worker moves to the working platform 20 where a screwing operation is performed, and touches the IC tag 25 of the working platform 20 with the arm unit 40 (S10). Then, the RFID reader 45 of the arm unit 40 reads the robot ID of the robot 23 recorded in the IC tag 25 (S40). The central processing unit 401 of the arm unit 40 sends the read robot ID to the head unit 30 through the wireless communications unit 403 (S41). The central processing unit 301 of the head unit 30 stores the robot ID received from the arm unit 40 in the storage unit 302 (S30).
[0084] Then, the central processing unit 301 of the head unit 30 reads out the identification color corresponding to the robot ID from the storage unit 302 and causes the target robot prompt unit 36 to blink several times in the identification color and then remain lit (S31). Moreover, the central processing unit 301 notifies the arm unit 40 of the robot ID or the identification color through the wireless communications unit 303 (S32). Then, the central processing unit 401 of the arm unit 40 causes the target robot prompt unit 46 provided on the arm unit 40 to blink several times in the identification color and then remain lit (S42). Afterwards, the worker terminal enters the operation mode and becomes able to receive an operation instruction from the worker.
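The handshake above (the worker touches the IC tag, the arm unit reads and forwards the robot ID, and the head unit stores it and lights the prompt unit) can be sketched as follows. The class names, the tag/message format and the color table are illustrative assumptions, not taken from the patent.

```python
# Hypothetical robot-ID-to-color table; the patent only says each of the
# five robots is assigned a distinct identification color.
IDENTIFICATION_COLORS = {"robot-13a": "blue", "robot-15a": "yellow"}

class HeadUnit:
    """Head unit 30: stores the robot ID (S30) and lights the prompt unit (S31)."""
    def __init__(self):
        self.stored_robot_id = None   # stands in for storage unit 302
        self.prompt_color = None      # stands in for target robot prompt unit 36

    def receive_robot_id(self, robot_id):
        self.stored_robot_id = robot_id                      # S30: store the ID
        self.prompt_color = IDENTIFICATION_COLORS[robot_id]  # S31: light prompt unit

class ArmUnit:
    """Arm unit 40: reads the IC tag (S40) and forwards the robot ID (S41)."""
    def __init__(self, head_unit):
        self.head_unit = head_unit

    def on_tag_touched(self, ic_tag):
        robot_id = ic_tag["robot_id"]              # S40: RFID reader reads the tag
        self.head_unit.receive_robot_id(robot_id)  # S41: notify the head unit

head = HeadUnit()
arm = ArmUnit(head)
arm.on_tag_touched({"robot_id": "robot-13a"})  # S10: worker touches the IC tag
```

The blinking and the wireless transport are elided; the point is only the flow of the robot ID from tag to arm unit to head unit.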
[0085]
[0086] Back to
[0087] The worker places a workpiece in a specified position on the working platform 20, and presses and fixes the parts or cables to be screwed with his or her hands (S11). Then, when the worker inputs a specified operation instruction (S12), the worker terminal (the head unit 30 and the arm unit 40) receives the operation instruction (S43, S33), and the central processing unit 301 of the head unit 30 sends an operation command to the robot 23 (S34). At this point, the command message carries the robot ID stored in S30. The operation command may be received by multiple robots in the cell, but because the command message specifies a robot ID, each robot can judge whether the command is addressed to itself and ignore operation commands that are not.
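The per-robot filtering described above can be sketched in a few lines; the dictionary message format and function name are assumptions for illustration only.

```python
def should_execute(own_robot_id, message):
    """A robot acts on a broadcast command only if the message's robot ID
    matches its own ID; otherwise it refuses (ignores) the command."""
    return message.get("robot_id") == own_robot_id

# A command addressed to the screwing robot (IDs are hypothetical):
command = {"robot_id": "robot-13a", "command": "drive_screw"}
```

With this check, robot 13a executes the command while, for example, robot 15a in the same cell discards it.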
[0088] In this example, after receiving the operation command, the robot 23 vertically lowers the electric driver on the front end of its arm to screw the parts or cables pressed by the worker (S20). When there are multiple screwing positions, the processing of S11-S20 is repeated. In this way, cooperative work between the worker and the robot 23 can be performed smoothly.
[0089] When the worker finishes the operation and leaves the working platform 20 (S13), the human-sensing sensor 26 senses the absence of the worker (that no worker is within the available range of the robot 23) (S21). Then, the central processing unit 230 of the robot 23 notifies the worker terminal to cut off communication (S22). After receiving the notification, the worker terminal cuts off communication with the robot 23 and turns off the target robot prompt units 36 and 46 (S35, S44).
[0090] (Input of Operation Instructions)
[0091] Next, a specific example of operation instruction input and processing reception in S12, S33 and S44 of
[0092] The worker terminal of the embodiment accepts an operation instruction input as a combination of a change in the muscle potential and a head movement. Specifically, when two conditions are satisfied, i.e., the muscle potential exceeds a threshold value (a first condition) and a nodding operation (shaking the head longitudinally) is performed X or more times within a specified limit time T (a second condition), it is judged that the worker has input an operation instruction to the robot. The threshold value, the limit time T and the number X can be set arbitrarily by the worker (the set mode is described later). The description below assumes the limit time T=5 s and X=3 times.
[0093]
[0094] The central processing unit 301 monitors the gyro signals (angular velocity) output from the gyro sensor 35; when it detects a gyro signal exceeding a certain threshold value, it judges that a nodding operation has been performed and increments the nodding detection count (S83). While the nodding detection count is less than X within the limit time T (S85; YES), the gyro signals continue to be monitored. When the nodding detection count reaches X (3 times) (S84; YES), the central processing unit 301 judges that the worker has input an operation instruction, and sends an operation command to the robot 23 (S86) (refer to D33 in
[0095] Then, the central processing unit 301 resets the timer value and the nodding detection count (S87), and ends processing. When the nodding detection count does not reach X within the limit time T (S85; NO), the timer value and the nodding detection count are likewise reset (S87), and processing ends.
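The two-condition judgment described in the preceding paragraphs (muscle potential arming a limit timer, then X nods within T seconds triggering the command) can be sketched as a small detector. The function names and the timestamp-injection style are illustrative assumptions; the patent describes the flow only at the level of the S80-S87 steps.

```python
def make_detector(mp_threshold, limit_time_t, x_times):
    """Build the two handlers of the hypothetical input detector."""
    state = {"armed_at": None, "nod_count": 0}

    def on_muscle_potential(value, now):
        """First condition: the muscle potential exceeds the threshold."""
        if value > mp_threshold:
            state["armed_at"] = now      # start the limit timer
            state["nod_count"] = 0

    def on_nod(now):
        """Second condition: return True when the X-th nod lands within T."""
        if state["armed_at"] is None:
            return False
        if now - state["armed_at"] > limit_time_t:       # S85: NO, timed out
            state["armed_at"], state["nod_count"] = None, 0  # S87: reset
            return False
        state["nod_count"] += 1                          # S83: count the nod
        if state["nod_count"] >= x_times:                # S84: YES
            state["armed_at"], state["nod_count"] = None, 0  # S87: reset
            return True   # operation instruction recognized: send command (S86)
        return False

    return on_muscle_potential, on_nod

# With T=5 s and X=3 as assumed in the text:
muscle, nod = make_detector(mp_threshold=0.5, limit_time_t=5.0, x_times=3)
muscle(0.9, now=0.0)                    # force applied to the arm: timer starts
fired = [nod(1.0), nod(2.0), nod(3.0)]  # three nods within 5 s
```

Here only the third nod fires, and a nod arriving after the limit time simply resets the detector, matching the reset path of S85/S87.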
[0096]
[0097] In the embodiment, as shown in
[0098] In the example of
[0099] In addition, the worker can move the monitor screen 70 to any position on the see-through display 32 (by default, the monitor screen is placed in the center of the display). Moreover, the display size and transparency of the monitor screen 70 can be changed freely. Therefore, the worker can adjust the display style of the monitor screen 70 so that it does not interfere with the operation.
[0100] (Set Mode)
[0101] Next, the set mode of the worker terminal is described. Switching between the operation mode and the set mode is performed by a gesture input to the front camera 34; a mode switching switch may also be provided on the head unit 30 or the arm unit 40. In the set mode, the set parameters (e.g., the threshold value of the muscle potential signal, the nodding detection threshold value, the limit time T, the nodding number X, etc.) stored in the storage unit of the head unit 30 or the arm unit 40 can be changed.
[0102]
[0103] A graph of the measurement value of the muscle potential sensor 44 (the muscle potential value) is displayed in real time in the setting GUI 102 for the muscle potential sensor. The current threshold value is displayed as a line on the graph. There are two ways to change the threshold value: auto set and manual set. When auto set is selected by a hand gesture, the guide text "Please apply a force to the front arm" appears on the set screen 100, and the worker inputs the muscle potential. After the input is performed multiple times (e.g., 5 to 10 times) according to the guide text, the mean and dispersion of the inputs are calculated, and an appropriate threshold value is computed from the result. When manual set is selected, the threshold value on the graph is raised or lowered by hand gestures.
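One plausible auto-set rule can be sketched as follows. This is an assumption: the patent says only that a threshold is computed from the mean and dispersion of the 5 to 10 trial inputs, without giving the formula. Setting the threshold a couple of standard deviations below the mean exerted level would make a deliberate exertion reliably cross it.

```python
import statistics

def auto_threshold(trial_values, k=2.0):
    """Hypothetical auto-set rule: mean of the trial exertions minus
    k standard deviations, so intentional force reliably exceeds it."""
    mean = statistics.mean(trial_values)
    spread = statistics.pstdev(trial_values)  # the "dispersion" of the trials
    return mean - k * spread

# Five trial inputs of muscle potential (arbitrary units, illustrative):
threshold = auto_threshold([0.80, 0.90, 1.00, 0.85, 0.95])
```

The margin factor k is the tuning knob: a larger k lowers the threshold, trading fewer missed inputs for more accidental triggers.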
[0104] Although not shown, the nodding detection threshold value may also be set on a similar setting GUI. Moreover, the limit time T and the nodding number X may be input by hand gestures or nodding operations. A changed set parameter is stored in the storage unit of the head unit 30 or the arm unit 40 by overwriting the old value.
Advantages of the Embodiment
[0105] According to the configuration of the embodiment, the worker must intentionally input two entirely different types of operation, i.e., a change in muscle potential and a head movement, so malfunctions of the robot caused by accidental movement or misoperation can be avoided as much as possible. Moreover, the muscle potential (applying a force to the arm) and the head movement (nodding) can be input within the worker's natural flow of motion even when both hands are holding things, so operation efficiency is not affected.
[0106] Moreover, in the embodiment, the monitor screen 70 displayed on the see-through display 32 shows the change in the muscle potential, the nodding detection count, the limit time and so on, so the worker can check for himself or herself whether an input operation was proper, which helps to make input operations reliable, improve input skill, suppress incorrect input and so on. Moreover, owing to the see-through display 32, the worker can check the monitor screen 70 without taking his or her eyes off the workpiece or the robot, so input operations can be performed more safely and reliably.
[0107] As body size and body composition differ between individuals, the muscle potential value and the body movement pattern vary greatly from worker to worker. Therefore, rather than applying the same condition (threshold value or the like) to all workers, the set mode is provided so that the conditions can be adjusted to each worker's physique and movement characteristics, allowing input operations to be performed more safely and reliably.
Other Embodiments
[0108] The embodiment described above is a specific example of the invention and is not intended to limit the scope of the invention to this specific example.
[0109] For example, the muscle potential sensor 44 may be mounted at a place other than the front arm. Moreover, in the embodiment, a nodding operation must be performed multiple times within a specified time after the muscle potential exceeds a threshold value, but this is only an example; any operation may be used as long as it is an operation instruction defined by a combination of a change in the muscle potential and a head movement. Also, in the embodiment, a worker terminal including a head-mounted head unit 30 and an arm unit 40 is illustrated, but this composition is also only an example; the worker terminal may have any composition as long as it includes a sensor that senses the muscle potential and a sensor that senses the head movement. For example, it is not limited to a wearable form; a handheld worker terminal may be used, and a portable computer such as a smartphone or tablet terminal may serve as the worker terminal.
DESCRIPTION OF SYMBOLS
[0110] 10a, 10b: worker [0111] 11-16, 20: working platform [0112] 11a, 12a, 13a, 15a, 16a, 23: robot [0113] 24: identification color display unit [0114] 25: IC tag [0115] 26: human-sensing sensor [0116] 30: head unit [0117] 32: see-through display [0118] 34: front camera [0119] 35: gyro sensor [0120] 36: target robot prompt unit [0121] 40: arm unit [0122] 44: muscle potential sensor [0123] 45: RFID reader [0124] 46: target robot prompt unit