Terminal device

12617090 · 2026-05-05

Abstract

A terminal device is configured to photograph a first image including a work target and a marker for the work target, and a second image including a robot marker, the work target, and the marker for the work target, which maintain a positional relation in the first image, set a user coordinate system using the marker for the work target as a reference, set an operation path based on a work spot of the work target included in the first image, create a work program based on the set operation path of the industrial robot, calculate a robot coordinate system based on the robot marker included in the second image and position attitude information of the industrial robot, and modify the work program by converting a coordinate system of the operation path of the industrial robot from the user coordinate system to the robot coordinate system.

Claims

1. A terminal device, comprising: a photographing portion configured to photograph a first image that includes a work target and a marker for the work target, and a second image that includes a robot marker that is attached to an industrial robot, the work target, and the marker for the work target, wherein the work target and the marker for the work target are configured to maintain a positional relation in the first image, and wherein the photographing portion is configured to capture the first image at a first location so that the first image does not include the robot marker, and to capture the second image at a second location that is different from the first location so that the robot marker is included in the second image; a user-coordinate system setting portion configured to set a user coordinate system, using the marker for the work target that is included in the first image as a reference; an operation-path setting portion configured to set an operation path of the industrial robot on the user coordinate system on the basis of a work spot of the work target included in the first image; a program creating portion configured to create a work program for causing the industrial robot to operate on the basis of the set operation path of the industrial robot; a robot-coordinate system calculating portion configured to calculate a robot coordinate system on the basis of the robot marker included in the second image and position attitude information of the industrial robot; and a program modifying portion configured to modify the work program by converting a coordinate system of the operation path of the industrial robot from the user coordinate system to the robot coordinate system.

2. The terminal device according to claim 1, wherein the program modifying portion is configured to convert the coordinate system of the operation path of the industrial robot on the basis of a difference between an origin of the user coordinate system and an origin of the robot coordinate system.

3. The terminal device according to claim 1, wherein the user-coordinate system setting portion is configured to calculate an origin of the user coordinate system and coordinate axes using the marker for the work target as a reference.

4. The terminal device according to claim 1, wherein the robot-coordinate system calculating portion is configured to calculate an origin and coordinate axes of the robot coordinate system on the basis of the robot marker and position attitude information of the industrial robot.

5. The terminal device according to claim 1, wherein the position attitude information of the industrial robot includes angle information of each axis of the industrial robot obtained from a robot control device which controls an operation of the industrial robot.

Description

BRIEF DESCRIPTION OF DRAWINGS

(1) FIG. 1 is a diagram exemplifying a system configuration of a welding robot system including a terminal device according to an Embodiment;

(2) FIG. 2 is a diagram exemplifying a physical configuration of the terminal device and a robot control device shown in FIG. 1;

(3) FIG. 3 is a schematic diagram for explaining a first image photographed by a photographing portion;

(4) FIG. 4 is a schematic diagram for explaining a second image photographed by a photographing portion;

(5) FIG. 5 is a diagram exemplifying a functional configuration in a control portion of the terminal device shown in FIG. 2; and

(6) FIG. 6 is a flowchart for explaining an example of an operation of the terminal device.

DETAILED DESCRIPTION

(7) A preferred embodiment of this disclosure will be explained with reference to the attached drawings. It is to be noted that, in each drawing, elements denoted by the same reference signs have the same or similar configurations. Moreover, since the drawings are schematic, the dimensions and ratios of the constituent elements differ from the actual ones.

(8) FIG. 1 is a diagram exemplifying a system configuration of a welding robot system 100 including a terminal device 1 according to an Embodiment. The welding robot system 100 includes, for example, a terminal device 1, a robot control device 2, and a manipulator (industrial robot) 3. The terminal device 1 and the robot control device 2, as well as the robot control device 2 and the manipulator 3, are connected to each other via a network. The terminal device 1 and the robot control device 2 are connected via wireless communication such as WiFi (Wireless Fidelity), for example, and the robot control device 2 and the manipulator 3 are connected via a communication cable, for example. It is to be noted that each network may be wired (including a communication cable) or wireless.

(9) It is to be noted that the welding robot system 100 may include a teach pendant. The teach pendant is an operating device which can be connected to the robot control device 2 and is used when a worker teaches an operation of the manipulator 3.

(10) The manipulator 3 is a welding robot which performs arc welding on a work W, which is a welding target, in accordance with execution conditions set in the robot control device 2. The manipulator 3 has, for example, an articulated arm provided on a base member fixed to a floor surface or the like of a plant, and a welding torch (tool) coupled with a distal end of the articulated arm.

(11) A marker Mr for robot is attached to a distal end part of the articulated arm of the manipulator 3, and a marker Mw for work is disposed on the work W.

(12) The marker Mr for robot and the marker Mw for work need only be identifiers whose presence can be recognized by the terminal device 1, but AR markers are preferably used. With an AR marker, once the marker is recognized, the user coordinate system with the AR marker as an origin can easily be displayed superimposed on an actual image.

(13) With reference to FIG. 2, the configurations of the robot control device 2 and the terminal device 1 will be explained in order below.

(14) The robot control device 2 is a control unit that controls an operation of the manipulator 3 and includes a control portion 21, a storage portion 22, and a communication portion 23, for example.

(15) The control portion 21 is a processor and controls the manipulator 3 by executing a welding program or the like stored in the storage portion 22.

(16) The communication portion 23 is a communication interface and controls communication with the terminal device 1 and the manipulator 3 connected via the network.

(17) It is to be noted that the robot control device 2 may further include a welding power-source portion. The welding power-source portion supplies a welding current, a welding voltage, and the like to the manipulator 3 in accordance with welding execution conditions set in advance, in order to generate an arc between the distal end of the welding wire and the work W, for example. The welding power-source portion may be provided separately from the robot control device 2.

(18) The terminal device 1 is a portable terminal with a digital camera, for example. Portable terminals include, for example, carriable terminals such as a tablet terminal, a smartphone, a personal digital assistant (PDA), and a notebook PC (personal computer). The terminal device 1 includes, for example, a control portion 11, a storage portion 12, a communication portion 13, a photographing portion 14, a distance measuring portion 15, and a display portion 16.

(19) The control portion 11 is a processor and controls each part of the terminal device 1 by executing a program stored in the storage portion 12.

(20) The storage portion 12 is a computer-readable recording medium and stores programs for realizing various functions of the terminal device 1 and various types of data used in the programs and the like.

(21) The communication portion 13 is a communication interface and controls communication with the robot control device 2 connected via the network.

(22) The photographing portion (a first photographing portion, a second photographing portion) 14 is, for example, a 2D camera including a lens and an image sensor (image pickup element), and converts light from a subject received through the lens into an electric signal (digital image data).

(23) Here, in this Embodiment, the photographing portion 14 photographs two types of images: a first image and a second image. The first image and the second image will be explained with reference to FIG. 3 and FIG. 4.

(24) As shown in FIG. 3, the first image photographed by the photographing portion 14 includes the work W, which is a welding target of the manipulator 3, and a marker for work (marker for welding target) Mw disposed on the work W. The first image does not have to include the manipulator 3. Therefore, while the first image is being photographed, the manipulator 3 can perform welding work at a work place (a robot cell, for example) different from the place where the work W is present, for example.

(25) As shown in FIG. 4, the second image photographed by the photographing portion 14 includes the work W and the marker Mw for work, which maintain the positional relation in the first image, and the marker Mr for robot attached to the manipulator 3. When the second image is to be photographed, it is only necessary to move the work W and the marker Mw for work included in the first image to the work place of the manipulator 3. It is to be noted that the manipulator 3, mounted on a cart, may instead be moved to the place where the work W and the marker Mw for work are present.

(26) The distance measuring portion 15 in FIG. 2 is, for example, a 3D camera on which a distance measuring sensor is mounted. The distance measuring sensor is a sensor capable of measuring a distance to a target. As the distance measuring sensor, a LIDAR (Light Detection and Ranging) sensor, a millimeter-wave sensor, an ultrasonic sensor, or the like can be used, for example.

(27) It is to be noted that the distance measuring portion 15 may include only one of the 3D camera and the distance measuring sensor. When only the 3D camera is included, it is preferable that three-dimensional coordinate data corresponding to an object be calculated on the basis of a plurality of images obtained by photographing the object from a plurality of different positions. In this case, a three-dimensional measurement method by a publicly known stereo method can be used.
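As a minimal illustrative sketch (not part of the original disclosure), the publicly known stereo method the preceding paragraph refers to reduces, for a rectified stereo pair, to the disparity-to-depth relation Z = f·B/d. The function and parameter names below are assumptions chosen for illustration:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair.

    focal_px:     camera focal length in pixels
    baseline_m:   distance between the two camera positions in meters
    disparity_px: horizontal pixel shift of the point between the two images

    Returns the depth Z = f * B / d in meters.
    """
    return focal_px * baseline_m / disparity_px


# Example: a 700 px focal length, a 0.1 m baseline, and a 35 px disparity
# give a depth of about 2 m.
depth = stereo_depth(700.0, 0.1, 35.0)
```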

(28) Here, by including the distance measuring sensor in the terminal device 1, the positional relation between the image sensor and the distance measuring sensor can be fixed, and the timing at which each sensor obtains data can be matched. As a result, the efficiency of setting a specified position of the marker Mw for work, which will be described later, on the point-group data can be improved, for example.

(29) The display portion 16 is, for example, a display having a touch panel, which displays an image of a subject photographed by the photographing portion 14 and receives an input of an operation instruction or the like from a worker. The display portion 16 may be provided as a display device having a touch panel separate from the terminal device 1, for example.

(30) FIG. 5 is a diagram exemplifying a functional configuration in the control portion 11 of the terminal device 1 according to the Embodiment. The control portion 11 of the terminal device 1 has, as a functional configuration, a user-coordinate system setting portion 111, a coordinate giving portion 112, an operation-path setting portion 113, a program creating portion 114, a robot-coordinate system calculating portion 115, and a program modifying portion 116.

(31) The user-coordinate system setting portion 111 sets a three-dimensional user coordinate system with the marker Mw for work included in the first image photographed by the photographing portion 14 as a reference. The first image includes the work W and the marker Mw for work as shown in FIG. 3.

(32) Specifically, the user-coordinate system setting portion 111 calculates the three-dimensional coordinate axes (directions) with a specified position of the marker Mw for work (a corner of the marker, the center of the marker, or the like, for example) as the origin, and sets the user coordinate system.
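As a minimal sketch (not part of the original disclosure), the construction described above can be illustrated as follows: given three marker corner positions in the camera frame, take one corner as the origin and build orthonormal axes from the two marker edges. The input layout and all names are illustrative assumptions:

```python
import numpy as np

def set_user_coordinate_system(corners: np.ndarray):
    """Build a user coordinate system from three marker corner points.

    `corners` is a 3x3 array whose rows are the 3D camera-frame positions of
    the marker's origin corner and the two corners adjacent to it along the
    marker edges (hypothetical input layout). Returns the origin and a 3x3
    rotation matrix whose columns are the user-frame X, Y, Z axes.
    """
    origin = corners[0]
    x_axis = corners[1] - origin
    x_axis = x_axis / np.linalg.norm(x_axis)
    # Remove the X component from the second edge to get an orthogonal Y axis.
    y_raw = corners[2] - origin
    y_axis = y_raw - np.dot(y_raw, x_axis) * x_axis
    y_axis = y_axis / np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)  # right-handed Z axis
    return origin, np.column_stack([x_axis, y_axis, z_axis])
```

For a marker lying flat in the camera's XY plane, this yields the identity rotation, i.e. a user frame aligned with the camera frame.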

(33) It is to be noted that the origin of the user coordinate system is not limited to the specified position of the marker Mw for work. For example, it may be the origin of a provisional robot coordinate system determined with the specified position of the marker Mw for work as a reference. The provisional robot coordinate system can be set as a coordinate system of a virtual model of the manipulator placed beside the work W, for example. By fixing the position of the marker Mw for work with respect to the virtual model of the manipulator, the specified position of the marker Mw for work can be specified on the provisional robot coordinate system. The user coordinate system in this case can be handled as the provisional robot coordinate system.

(34) The position where the virtual model of the manipulator is placed may be arbitrarily specified by the worker, for example by touching an image of the work space in accordance with guidance given by sound, a text message, or the like.

(35) The coordinate giving portion 112 in FIG. 5 gives coordinates on the user coordinate system to the point-group data obtained by the distance measuring portion 15, which measures distances to objects included in the image photographed by the photographing portion 14. A specific explanation follows.

(36) The coordinate giving portion 112 detects the specified position of the marker Mw for work on the basis of the first image and sets the detected specified position of the marker Mw for work on the point-group data obtained by the distance measuring portion 15. The coordinate giving portion 112 then gives coordinates on the user coordinate system to the point-group data, with the specified position of the marker Mw for work set on the point-group data as the origin (reference).

(37) As a result, the point-group data can be drawn as data of the user coordinate system. The specified position of the marker Mw for work on the point-group data may be recognized automatically by data analysis, or may be specified by the worker pointing out the specified position of the marker Mw for work on the point-group data, for example.
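As an illustrative sketch (not part of the original disclosure), giving user-frame coordinates to the point-group data amounts to a rigid change of frame: each camera-frame point is shifted by the marker origin and projected onto the user-frame axes, p_user = Rᵀ(p_camera − origin). Names are assumptions:

```python
import numpy as np

def assign_user_coordinates(points, origin, axes):
    """Express camera-frame point-group data in the user coordinate system.

    points: Nx3 array of camera-frame points (the point-group data)
    origin: 3-vector, the marker's specified position in the camera frame
    axes:   3x3 rotation whose columns are the user-frame X, Y, Z axes,
            expressed in the camera frame

    Returns the Nx3 array of the same points on the user coordinate system.
    For an orthonormal `axes` matrix R, (p - origin) @ R equals R.T @ (p - origin).
    """
    return (np.asarray(points) - np.asarray(origin)) @ np.asarray(axes)
```

A point coinciding with the marker origin maps to (0, 0, 0) on the user coordinate system, as expected.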

(38) The operation-path setting portion 113 sets the operation path of the manipulator 3 on the user coordinate system on the basis of a welding spot of the work W included in the first image.

(39) For example, the operation-path setting portion 113 sets the operation path of the virtual model of the manipulator so that the welding torch coupled with the distal end of the virtual model of the manipulator operates along the welding spot of the work W in the first image. The operation path may be set by the user's manual selection, or set automatically by calculating the distance between the welding spot of the work W and the welding torch.

(40) Moreover, the welding spot of the work W may be selected by detecting candidate welding spots on the basis of the point-group data, or by detecting candidate welding spots through image processing of the first image, and then having the user select from the detected candidates. When a welding spot is selected, the welding order of the welding spots, the welding direction, and the like may also be selected or specified by the user.

(41) The program creating portion 114 creates a welding program for welding the welding spot of the work W included in the first image on the basis of the operation path of the manipulator 3 on the user coordinate system set by the operation-path setting portion 113.

(42) The robot-coordinate system calculating portion 115 calculates the robot coordinate system on the basis of the marker Mr for robot included in the second image photographed by the photographing portion 14 and the position attitude information of the manipulator 3. The second image includes, as shown in FIG. 4, the work W and the marker Mw for work which maintain the positional relation in the first image and the marker Mr for robot attached to the manipulator 3.

(43) Specifically, the robot-coordinate system calculating portion 115 calculates the robot coordinate system by calculating the origin of the robot coordinate system on the basis of the marker Mr for robot and by calculating the coordinate axis (direction) of the robot coordinate system on the basis of the position attitude information of the manipulator 3.

(44) Here, the robot control device 2 holds the coordinate on the robot coordinate system corresponding to the attached position of the marker Mr for robot and the position attitude information of the manipulator 3 controlled by the robot control device 2. The position attitude information of the manipulator 3 includes the angle information of each axis of the manipulator 3, for example.
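As an illustrative sketch (not part of the original disclosure), once the axes of the robot coordinate system are known from the position attitude information, the origin can be located from the observed robot marker Mr and the marker's attachment coordinate held by the robot control device 2: origin = p_cam − R·p_robot. All names and values are assumptions:

```python
import numpy as np

def calculate_robot_origin(marker_pos_cam, marker_pos_robot, robot_axes_cam):
    """Locate the robot coordinate system's origin in the camera frame.

    marker_pos_cam:   3-vector, where the marker Mr is observed (camera frame)
    marker_pos_robot: 3-vector, the marker attachment coordinate on the robot
                      coordinate system, held by the robot control device
    robot_axes_cam:   3x3 rotation giving the robot frame's axes in the camera
                      frame, derived from the position attitude information

    Since p_cam = origin + R @ p_robot, the origin is p_cam - R @ p_robot.
    """
    return (np.asarray(marker_pos_cam)
            - np.asarray(robot_axes_cam) @ np.asarray(marker_pos_robot))
```

With aligned axes (R = identity), the origin is simply the observed marker position minus the marker's robot-frame coordinate.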

(45) The contents of the position attitude information of the manipulator 3 might change depending on the type of the manipulator 3; thus, when the position attitude information of the manipulator 3 is obtained from the robot control device 2, the type of the manipulator 3 may be obtained together with it. In this case, the robot-coordinate system calculating portion 115 preferably calculates the coordinate axes (directions) of the robot coordinate system on the basis of both the position attitude information of the manipulator 3 and the type of the manipulator 3.

(46) The program modifying portion 116 in FIG. 5 modifies the welding program by converting the coordinate system of the operation path of the manipulator 3 from the user coordinate system to the robot coordinate system.

(47) For example, the program modifying portion 116 calculates a difference between the origin of the user coordinate system and the origin of the robot coordinate system and moves the coordinates of the operation path of the manipulator 3 on the user coordinate system on the basis of the calculated difference.

(48) As a result, the coordinates of the operation path can be converted from the user coordinate system to the robot coordinate system. The program modifying portion 116 modifies the welding program created by the program creating portion 114 so that the operation path is expressed by coordinates on the robot coordinate system after the conversion.
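As a minimal sketch (not part of the original disclosure), when the two coordinate systems' axes are aligned, the conversion described above reduces to shifting every path point by the difference between the two origins. Names are illustrative assumptions:

```python
import numpy as np

def convert_path(path_user, user_origin, robot_origin):
    """Convert operation-path points from the user frame to the robot frame.

    path_user:    Nx3 array of path points on the user coordinate system
    user_origin:  3-vector, origin of the user coordinate system (world frame)
    robot_origin: 3-vector, origin of the robot coordinate system (world frame)

    A point p on the user frame sits at user_origin + p in the world, so its
    robot-frame coordinate is (user_origin + p) - robot_origin, i.e. the path
    is shifted by the origin difference (axes assumed aligned).
    """
    offset = np.asarray(user_origin) - np.asarray(robot_origin)
    return np.asarray(path_user) + offset
```

If the two frames also differ in orientation, a rotation would have to be applied before the shift; the origin-difference case shown here matches claim 2.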

(49) The program modifying portion 116 transmits the modified welding program to the robot control device 2 and causes it to be stored in the storage portion 22 of the robot control device 2. As a result, the control portion 21 of the robot control device 2 can control the manipulator 3 so that the welding spot of the work W included in the second image is welded in accordance with the modified welding program.

(50) With reference to FIG. 6, an example of the operation of the terminal device 1 will be explained.

(51) First, the photographing portion 14 of the terminal device 1 photographs the first image including the work W, which is a welding target of the manipulator 3, and the marker Mw for work disposed on the work W (Step S101).

(52) Subsequently, the user-coordinate system setting portion 111 of the terminal device 1 sets the three-dimensional user coordinate system with the marker Mw for work included in the first image photographed at the aforementioned Step S101 as a reference (Step S102).

(53) Subsequently, the operation-path setting portion 113 of the terminal device 1 sets the operation path of the manipulator 3 on the user coordinate system on the basis of the welding spot of the work W included in the first image (Step S103).

(54) Subsequently, the program creating portion 114 of the terminal device 1 creates the welding program for welding the welding spot of the work W included in the first image on the basis of the operation path of the manipulator 3 on the user coordinate system set at the aforementioned Step S103 (Step S104).

(55) Subsequently, the photographing portion 14 of the terminal device 1 photographs the second image including the work W and the marker Mw for work which maintain the positional relation in the first image and the marker Mr for robot attached to the manipulator 3 (Step S105).

(56) Subsequently, the robot-coordinate system calculating portion 115 of the terminal device 1 calculates the robot coordinate system on the basis of the marker Mr for robot included in the second image photographed at the aforementioned Step S105 and the position attitude information of the manipulator 3 (Step S106).

(57) Subsequently, the program modifying portion 116 of the terminal device 1 converts the coordinate system of the operation path of the manipulator 3 set at the aforementioned Step S103 from the user coordinate system to the robot coordinate system so as to modify the welding program created at the aforementioned Step S104 (Step S107). This operation is then finished.
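As an end-to-end sketch (not part of the original disclosure), Steps S101 to S107 can be chained as follows. All concrete values are hypothetical stand-ins for data that would come from the camera, the distance measuring portion, and the robot control device:

```python
import numpy as np

def teaching_pipeline():
    """Minimal sketch of Steps S101-S107 with stubbed sensing.

    Returns the operation path expressed on the robot coordinate system.
    """
    # S101-S102: photograph the first image and set the user coordinate
    # system from the marker Mw (stub: marker origin in a shared world frame).
    user_origin = np.array([0.5, 0.2, 0.0])
    # S103: set the operation path along the welding spot, on the user frame.
    path_user = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
    # S104: the welding program is this path plus welding commands (omitted).
    # S105-S106: photograph the second image and calculate the robot
    # coordinate system from the marker Mr (stub: robot-frame origin).
    robot_origin = np.array([0.3, 0.2, 0.0])
    # S107: modify the program by shifting the path by the origin difference
    # (axes assumed aligned between the two frames).
    return path_user + (user_origin - robot_origin)
```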

(58) As described above, with the terminal device 1 according to the Embodiment, the user coordinate system with the marker Mw for work as a reference is set by photographing the first image including the work W and the marker Mw for work, and the welding program of the manipulator 3 can be created on the basis of the operation path of the manipulator 3 set on the user coordinate system from the welding spot of the work W. Then, by photographing the second image including the work W and the marker Mw for work, which maintain the positional relation in the first image, and the marker Mr for robot attached to the manipulator 3, the robot coordinate system is calculated on the basis of the marker Mr for robot and the position attitude information of the manipulator 3, and the welding program can be modified by converting the coordinate system of the operation path of the manipulator 3 from the user coordinate system to the robot coordinate system.

(59) As a result, the welding program of the manipulator 3 on the user coordinate system is created in advance on the basis of the first image, which does not include the manipulator 3. The coordinate system indicating the operation path of the manipulator 3 is then converted from the user coordinate system to the robot coordinate system on the basis of the second image, photographed afterward, which additionally includes the marker Mr for robot attached to the manipulator 3, so that the welding program based on the user coordinate system can be modified into a welding program based on the robot coordinate system.

(60) Thus, with the terminal device 1 according to the Embodiment, the time during which the manipulator 3 is occupied for creating the welding program can be reduced, and the welding program can therefore be created without lowering the operating rate of the manipulator 3.

(61) It is to be noted that this disclosure is not limited to the aforementioned Embodiment and can be implemented in various other forms without departing from the gist of this disclosure. Therefore, the aforementioned Embodiment is merely an example and should not be interpreted restrictively.

(62) For example, in the aforementioned Embodiment, the explanation was made using a welding robot, but this is not limiting. For example, this disclosure can be applied to industrial robots including a handling robot that performs picking or the like. In this case, the welding program, the welding target, the welding spot, and the welding operation used in the aforementioned Embodiment can be replaced with a work program, a work target, a work spot, and a work operation, respectively.