METHOD OF CLOSED-LOOP POINT TO POINT ROBOT PATH PLANNING BY ONLINE CORRECTION AND ALIGNMENT VIA A DUAL CAMERA VISION SYSTEM

20200262065 · 2020-08-20

    Abstract

    A method for point-to-point path planning of a manipulator robot with up to 6 degrees of freedom via a dual vision system aims at generating a rest-to-rest path and correcting it with high precision through closed-loop pick and place path planning. The path is corrected and aligned with the desired path as soon as different visual feedbacks, such as the position and orientation of the pick nests, the placement nests, and the workpiece (part), are observed via the dual vision system. The introduced path planning method is a comprehensive online approach that benefits from: (i) an advantageous path definition based on multiple coordinate systems, and (ii) an online path planner with three correction procedures that correct the pose of the workpiece with respect to the robot, to the desired path, and to the placement nest.

    Claims

    1. An online path planner with three correction and alignment procedures comprising: i) look before picking (LBP); ii) correct pose on path (CPP); and iii) correct pose on nest (CPN), wherein each procedure respectively corrects and aligns a pose of: a) a workpiece with respect to a robot; b) the workpiece with respect to a desired path; and c) the workpiece with respect to a placement nest.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0012] FIG. 1 is a 3D model of a gantry robot suitable to be used to implement an embodiment of a method of the present invention;

    [0013] FIG. 2 is a schematic view of a robot path and dual camera vision system in accordance with an aspect of the present invention;

    [0014] FIG. 3 is a schematic view showing the flexibility of workpiece poses using an embodiment of the system and method in accordance with an aspect of the present invention;

    [0015] FIG. 4 is a schematic representation of an exemplary method in accordance with an aspect of the present invention using three pick and placement nests;

    [0016] FIG. 5 is an exemplary flowchart of the exemplary method shown in FIG. 4; and

    [0017] FIG. 6 is a table compiling the operations utilized within the exemplary method shown in FIGS. 4 and 5.

    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

    System Description

    [0018] FIG. 1 shows the 3D model of the gantry robot used to implement the proposed method. As shown in FIG. 1, the robot has two end-effectors (tools) and is equipped with a dual camera vision system, where the downward looking camera (DLC) is fixed to the end-effector of the robot and moves as the end-effector moves. The upward looking camera (ULC), however, is fixed to the base of the machine as shown in FIG. 1. The tools on the end-effectors are configurable, and therefore one or both tools can be used for pick and place applications by installing different grippers, such as vacuum grippers. The reference numerals in FIG. 1 are:

    [0019] [10] Robot X axis
    [0020] [20] Robot Y axis
    [0021] [30] Robot Z and Rotation axes, End Effector 1
    [0022] [40] Robot Z and Rotation axes, End Effector 2
    [0023] [50] Downward Looking Camera (DLC)
    [0024] [60] Upward Looking Camera (ULC)
    [0025] [70] Pick/Placement Nests and the Workpiece (Part)
    Convention Used to Represent the Pose of Objects and the Transformation Between Them

    [0026] FIG. 2 illustrates the schematic of the robot path and the dual camera vision system. It also includes some basic definitions of the coordinate systems (poses of different physical components).

    [0027] The pose of each object is uniquely represented by a coordinate system fixed to that object, where the origin of the coordinate system is located at a predefined point on the object. A linear (affine) transformation ${}^{a}T^{b}$ is used to transform from one pose (coordinate system $a$) to another (coordinate system $b$):

    [00001] ${}^{a}T^{b} = \begin{bmatrix} {}^{a}R^{b}_{3\times 3} & {}^{a}d^{b}_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix}$ (1)

    where ${}^{a}R^{b}_{3\times 3}$ and ${}^{a}d^{b}_{3\times 1}$ are respectively the rotation matrix and the translation vector used to transform from one pose (coordinate system $a$) to another pose (coordinate system $b$). This way, both the transformation from one object ($a$) to another object ($b$), and the pose of the latter object ($b$) with respect to the former one ($a$), can be represented by a single transformation matrix (${}^{a}T^{b}$).
    For example, in order to transform from pose $a$ to pose $c$, one can transform from pose $a$ to pose $b$ and then from pose $b$ to pose $c$:


    ${}^{a}T^{c} = {}^{a}T^{b}\,{}^{b}T^{c}$ (2)

    According to the property of the rotation matrix, one can find the inverse transformation using the following equation:

    [00002] ${}^{b}T^{a} = ({}^{a}T^{b})^{-1} = \begin{bmatrix} ({}^{a}R^{b}_{3\times 3})^{T} & -({}^{a}R^{b}_{3\times 3})^{T}\,({}^{a}d^{b}_{3\times 1}) \\ 0_{1\times 3} & 1 \end{bmatrix}$ (3)

    where $(\cdot)^{-1}$ and $(\cdot)^{T}$ denote the inverse and the transpose of a matrix, respectively.
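
    As a concrete illustration of equations (1) through (3), the short Python/NumPy sketch below (a hypothetical helper, not part of the claimed method; the numeric values are arbitrary) builds a homogeneous transform from a rotation matrix and a translation vector, composes two transforms as in equation (2), and inverts one using the rotation-matrix property of equation (3).

```python
import numpy as np

def make_transform(R, d):
    """Build the 4x4 homogeneous transform of equation (1) from a 3x3
    rotation matrix R and a 3x1 translation vector d."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(d).reshape(3)
    return T

def invert_transform(T):
    """Invert a homogeneous transform using equation (3):
    the inverse rotation is R^T and the inverse translation is -R^T d."""
    R = T[:3, :3]
    d = T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ d
    return Ti

# Rotation about the Z axis, used only to generate example poses.
Rz = lambda th: np.array([[np.cos(th), -np.sin(th), 0.0],
                          [np.sin(th),  np.cos(th), 0.0],
                          [0.0,         0.0,        1.0]])

# Compose a->b and b->c to obtain a->c, as in equation (2).
T_a_b = make_transform(Rz(np.pi / 2), [0.10, 0.00, 0.05])    # a_T_b
T_b_c = make_transform(Rz(-np.pi / 4), [0.02, 0.03, 0.00])   # b_T_c
T_a_c = T_a_b @ T_b_c                                        # equation (2)

# Check the inverse of equation (3).
assert np.allclose(invert_transform(T_a_c) @ T_a_c, np.eye(4))
```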

    [0028] The world coordinate system (W), the camera coordinate system (C), and the end-effector coordinate system (E) are the three most useful poses used in the path planning, where W is the main reference coordinate system fixed at the corner of the base of the robot, E represents the active end-effector of the robot, and C represents either the DLC or the ULC.

    [0029] A useful application of equation (2) is to obtain the pose of an arbitrary object (O) with respect to the world coordinate system through the camera, once the object is observed via the DLC or ULC:


    ${}^{W}T^{O} = {}^{W}T^{C}\,{}^{C}T^{O}$ (4)

    [0030] Another advantageous application of equation (2) is to obtain the pose of an arbitrary object (O) with respect to the world coordinate system through the end-effector, once the object needs to be picked by the active end-effector:


    ${}^{W}T^{O} = {}^{W}T^{E}\,{}^{E}T^{O}$ (5)
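
    For example, equations (4) and (5) each reduce to a single matrix product once the camera pose and the end-effector pose are known. The sketch below continues the earlier NumPy sketch (reusing the hypothetical make_transform and Rz helpers); all numeric values are illustrative assumptions, not values from the patent.

```python
# Pose of an object O in the world frame W, obtained through a camera C
# (equation (4)) or through the end-effector E holding it (equation (5)).
T_W_C = make_transform(np.eye(3), [0.50, 0.40, 0.30])    # W_T_C, camera pose at the observation instant (illustrative)
T_C_O = make_transform(Rz(0.1), [0.01, -0.02, 0.20])     # C_T_O, reported by the vision system (illustrative)
T_W_O_via_camera = T_W_C @ T_C_O                          # equation (4)

T_W_E = make_transform(Rz(0.0), [0.45, 0.38, 0.25])      # W_T_E, from the robot kinematics (illustrative)
T_E_O = make_transform(np.eye(3), [0.00, 0.00, -0.05])   # E_T_O, gripping offset (illustrative)
T_W_O_via_robot = T_W_E @ T_E_O                           # equation (5)
```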

    An Advantageous Path Definition Based on Different Coordinate Systems

    [0031] The proposed method is based on a specific convention to define the spatial trajectory for the pose (both the position and orientation) of the workpiece (part), which is called the path definition. This path is introduced as a sequence of $q$ key points (poses) $\{{}^{\square_1}T^{P_1},\, {}^{\square_2}T^{P_2},\, \ldots,\, {}^{\square_m}T^{P_m},\, \ldots,\, {}^{\square_{q-1}}T^{P_{q-1}},\, {}^{\square_q}T^{P_q}\}$, where each pose can be defined with respect to an arbitrary coordinate system. These arbitrary coordinate systems can be selected afterwards such that time-to-time variations of the relative poses of the nests with respect to each other, as well as of the workpiece with respect to the robot, have no side effects on the values defined for the poses of the key points of the path. This advantageous property will be used afterwards by the online path planner. FIG. 3 shows how flexibly the poses of the workpiece can be defined on the path using this general approach; the user can define the pose of the part on the path at a certain moment $t = k$, $1 \le k \le q$, with respect to an arbitrary coordinate system $\square_k$ as ${}^{\square_k}T^{P_k}$.

    [0032] To provide a simple visualization, without loss of generality, only three pick and placement nests are considered in FIGS. 4-6. Generalizing the three pick and placement nests shown in FIG. 6 to $r_1$ pick nests $(A_1, A_2, \ldots, A_{j_1}, \ldots, A_{r_1-1}, A_{r_1})$ or $r_2$ placement nests $(B_1, B_2, \ldots, B_{j_2}, \ldots, B_{r_2-1}, B_{r_2})$, respectively, we can use the unified notation of FIG. 6, where $r$ nests are defined as $(N_1, N_2, \ldots, N_j, \ldots, N_{r-1}, N_r)$ for both types of nests. For any specific application, the nests can be grouped into two groups with one overlapping member ($N_j$): a) the Varying group $(N_1, N_2, \ldots, N_j)$, and b) the Fixed group $(N_j, \ldots, N_{r-1}, N_r)$, where the relative poses between consecutive nests do not change within the Fixed group and the values of these relative poses are known and determined with an acceptable accuracy, while such relative poses can change within the Varying group.

    [0033] A useful customization of this path definition will be used throughout the rest of the description, where the following assumptions are made to obtain it:

    [0034] [1] The picking pose ${}^{\square_1}T^{P_1}$ (position and orientation), as the very first point of the path, is defined with respect to the coordinate system attached to the pose-varying pick nest ($A_{j_1}$) as ${}^{A_{j_1}}T^{P_1}$.

    [0035] [2] The placing pose ${}^{\square_q}T^{P_q}$, as the very last point of the path, is defined with respect to the coordinate system attached to the last pose-varying placement nest ($B_{j_2}$) as ${}^{B_{j_2}}T^{P_q}$.

    [0036] [3] The immediate poses that the robot needs to move to, immediately after the pick pose or immediately before the placement pose, which are respectively added to the path after the very first key point as the second key point ${}^{\square_2}T^{P_2}$ and before the very last key point as the second-to-last key point ${}^{\square_{q-1}}T^{P_{q-1}}$, are easier for the user to define in terms of the approaching offset with respect to the first and last key points as ${}^{P_1}T^{P_2}$ and ${}^{P_q}T^{P_{q-1}}$. Therefore, their poses with respect to the last Varying nests are obtained as follows:

    [00005] ${}^{A_{j_1}}T^{P_2} = {}^{A_{j_1}}T^{P_1}\,{}^{P_1}T^{P_2}$,  ${}^{B_{j_2}}T^{P_{q-1}} = {}^{B_{j_2}}T^{P_q}\,{}^{P_q}T^{P_{q-1}}$ (6)

    [0037] [4] Up to this step, the very first two key points at the beginning of the path as well as the very last two key points at the end of the path are defined. The rest of the key points on the path can be arbitrarily defined with respect to the world coordinate system, one of the pick and placement nests among the Fixed group, the workpiece at the first or last key points, the cameras, etc.

    [0038] [5] In cases where the ULC is required to be used, the only requirement in the definition of the path is to ensure that the workpiece can be seen via the ULC somewhere in the middle of the path at a speed lower than a predefined maximum speed. For example, for the middle point ($t = m$), the pose can be defined with respect to the ULC as ${}^{C}T^{P_m}$. The maximum speed is defined by the vision system requirement that the camera has enough time to properly capture a single-frame image of the workpiece's visual features.

    [0039] [6] Another assumption made here is that the position of the workpiece (part) with respect to the active end-effector of the robot is fixed; therefore, the user can define ${}^{P}T^{E}_{\mathrm{desired}}$, which represents the location on the part where the robot is required to pick the part, or, in other words, ${}^{P}T^{E}_{\mathrm{desired}}$ is the desired pose of the end-effector with respect to the part during the pick and place process. It should be noted that although the robot tries to pick the part such that ${}^{P}T^{E}_{\mathrm{desired}}$ is realized, there is some possible inaccuracy in the position of the object compared to theory, and some slippage may occur when the part is being picked; thus, the actual value of ${}^{P}T^{E}$ will most likely differ from its desired value and needs to be corrected afterwards.
    Therefore, the path definition used by the online path planner throughout the next section is as follows:

    [00006] $\mathrm{Path}_{\mathrm{desired}} = \{{}^{A_{j_1}}T^{P_1},\, {}^{P_1}T^{P_2},\, {}^{W}T^{P_3},\, \ldots,\, {}^{W}T^{P_{m-1}},\, {}^{C}T^{P_m},\, {}^{W}T^{P_{m+1}},\, \ldots,\, {}^{W}T^{P_{q-2}},\, {}^{P_q}T^{P_{q-1}},\, {}^{B_{j_2}}T^{P_q}\}$ (7)
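
    One possible way to hold the path definition of equation (7) in software (a hypothetical representation, not prescribed by the patent) is to store each key point as a pair of a reference-frame label and a 4x4 transform, so the planner can later re-resolve any key point into world coordinates once the varying frames (the pick nest, the placement nest, or the camera observation) are corrected. The labels and placeholder transforms below are illustrative only.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class KeyPoint:
    frame: str        # label of the reference coordinate system, e.g. "A_j1", "W", "C", "P_q"
    T: np.ndarray     # 4x4 transform of the part pose P_k with respect to that frame

# Illustrative desired path following the structure of equation (7);
# the identity transforms are placeholders for user-defined poses.
path_desired = [
    KeyPoint("A_j1", np.eye(4)),   # A_j1_T_P1: pick pose relative to the varying pick nest
    KeyPoint("P_1",  np.eye(4)),   # P1_T_P2: approach offset after picking
    KeyPoint("W",    np.eye(4)),   # W_T_P3 ... intermediate poses in world coordinates
    KeyPoint("C",    np.eye(4)),   # C_T_Pm: middle point defined with respect to the ULC
    KeyPoint("W",    np.eye(4)),   # ... W_T_P(q-2): further intermediate poses
    KeyPoint("P_q",  np.eye(4)),   # Pq_T_P(q-1): approach offset before placing
    KeyPoint("B_j2", np.eye(4)),   # B_j2_T_Pq: placement pose relative to the varying placement nest
]
```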

    Online Path Planner with Three Correction Procedures

    [0040] The proposed method gets feedback from a dual camera vision system to correct the dynamic pose of the workpiece 1) with respect to the robot, 2) with respect to the desired pick-and-place path, and 3) with respect to the pose-varying placement nests, in an online manner through three procedures:

    [0041] 1) Correct Before Pick (CBP): Via the CBP procedure, the online path planner can correct the pose at which the end-effector of the robot picks the workpiece at the beginning of the motion through the DLC. In this procedure, the pose of the last varying pick nest ($A_{j_1}$) must be observed as

    [00007] ${}^{C}T^{A_{j_1}}$

    via the DLC, and then all poses that depend on this nest are corrected accordingly. It should be noted that, after CBP, the relative pose of the workpiece with respect to the robot is not always guaranteed to be determined, due to possible slippage or other types of physical interaction between the workpiece and the robot gripper (end-effector). However, this possible undesired dislocation is often controllable within a desired range with an acceptable accuracy for placement. Otherwise, this discrepancy can be corrected in the next procedure.

    [0042] 2) Correct After Pick (CAP): Via the CAP procedure, the online path planner can correct the pose of the workpiece with respect to the desired path in the middle of the path through the ULC. In this procedure, the pose of the workpiece (part) is observed as ${}^{C}T^{P_m}$ via the ULC, and the relative pose of the part with respect to the robot is corrected accordingly. This procedure can be done only after the workpiece is picked and the motion has started; it is assumed that this is done at some point in the middle of the path, referred to as the middle point and indexed by the subscript $m$. Since the workpiece will not be dislocated after it is picked, this correction procedure is used to align the pose of the workpiece on the path as expected and, accordingly, to correct the placement error that otherwise could not be corrected.

    [0043] 3) Correct Earlier than Placement (CEP): Via the CEP procedure, the online path planner can correct the pose of the workpiece with respect to the pose of the placement nests through the DLC. In this procedure, the pose of the last varying placement nest ($B_{j_2}$) must be observed as

    [00008] ${}^{C}T^{B_{j_2}}$

    via the DLC, and then all poses that depend on this nest are corrected accordingly. The CEP procedure can be done either immediately before the placement or at any time earlier than the placement, for example before the motion starts and before the part has been fed to start the pick and place process. It should be noted that CEP must be done only once the poses of the placement nests can be guaranteed not to change afterwards, while the pick and place process starts and continues.

    Correction Procedures

    [0044] The three correction procedures, CBP, CAP, and CEP, can be considered in two groups based on the camera used for the correction.

    [0045] 1) Correction via DLC: To perform the CBP and CEP procedures, ${}^{C}T^{A_{j_1}}$ and ${}^{C}T^{B_{j_2}}$ are assumed to be observed via the DLC, respectively. Using the general notation $N_j$ for the nests $A_{j_1}$ and $B_{j_2}$, one can obtain the pose of the nest through the camera using:


    ${}^{W}T^{N_j} = {}^{W}T^{C}\,{}^{C}T^{N_j}$ (8)

    A similar equation can be written to obtain the pose of the part through the nest:


    ${}^{W}T^{P_k} = {}^{W}T^{N_j}\,{}^{N_j}T^{P_k}$ (9)

    One can obtain equation (10) by combining (8) and (9):


    ${}^{W}T^{P_k} = {}^{W}T^{C}\,{}^{C}T^{N_j}\,{}^{N_j}T^{P_k}$ (10)

    Equation (10) can be used as a general formula to obtain the pose of the workpiece (part) in the world coordinate system, in which the camera observation is taken into account. However, the value of ${}^{C}T^{N_j}$ is not always available. While the actual value of ${}^{C}T^{N_j}$ becomes available once the nest is observed via the DLC as ${}^{C}T^{N_j}_{\mathrm{observed}}$, it initially needs to be approximated as ${}^{C}T^{N_j}_{\mathrm{approx}}$ using equation (11), which itself follows from (8).


    ${}^{C}T^{N_j}_{\mathrm{approx}} = ({}^{W}T^{C})^{-1}\,{}^{W}T^{N_j}$ (11)

    In equation (11), ${}^{W}T^{N_j}$ is obtained from an initial estimate according to the robot CAD model.
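
    Continuing the earlier NumPy sketch (reusing the hypothetical make_transform and invert_transform helpers), the DLC correction of equations (8) through (11) amounts to the matrix products below; the nest and camera poses here are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Quantities assumed known beforehand (illustrative values).
T_W_C      = make_transform(np.eye(3), [0.50, 0.40, 0.80])   # W_T_C: DLC pose at the observation instant
T_W_Nj_cad = make_transform(np.eye(3), [0.60, 0.10, 0.00])   # W_T_Nj: initial estimate from the robot CAD model
T_Nj_Pk    = make_transform(np.eye(3), [0.02, 0.02, 0.01])   # Nj_T_Pk: part pose relative to the nest, from the path definition

# Equation (11): initial approximation of the nest pose in the camera frame.
T_C_Nj_approx = invert_transform(T_W_C) @ T_W_Nj_cad

# Once the nest is actually observed via the DLC (CBP or CEP), the observation
# replaces the approximation; a small offset stands in for a real observation here.
T_C_Nj_observed = T_C_Nj_approx @ make_transform(np.eye(3), [0.003, -0.002, 0.0])

# Equation (10): pose of the part in world coordinates using the observation.
T_W_Pk = T_W_C @ T_C_Nj_observed @ T_Nj_Pk
```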

    [0046] 2) Correction via ULC: To perform the CAP procedure, ${}^{C}T^{P_m}$ is assumed to be observed via the ULC at the middle point (the $m$-th moment) of the path. Therefore, one can obtain the pose of the part in the world coordinate system using:


    ${}^{W}T^{P_m} = {}^{W}T^{C}\,{}^{C}T^{P_m}$ (12)

    Moreover, the pose of the end-effector at the middle point is assumed to be ${}^{W}T^{E_m}$, and thus the pose of the part can be obtained through the robot's end-effector using:


    ${}^{W}T^{P_m} = {}^{W}T^{E_m}\,{}^{E_m}T^{P_m}$ (13)

    Now, let us use equation (2) to obtain the pose of the end-effector in the world coordinate system through the part at some arbitrary moment $k$:


    ${}^{W}T^{E_k} = {}^{W}T^{P_k}\,{}^{P_k}T^{E_k}$ (14)

    Since the pose of the end-effector with respect to the part must remain fixed during the pick and place, we can conclude that ${}^{P_k}T^{E_k} = {}^{P_m}T^{E_m} = {}^{P}T^{E}$, and therefore ${}^{P}T^{E}$ can be written as equation (15) by combining (12) and (13):


    ${}^{P}T^{E} = ({}^{W}T^{C}\,{}^{C}T^{P_m})^{-1}\,({}^{W}T^{E_m})$ (15)

    Substituting ${}^{P_k}T^{E_k}$ in (14) with ${}^{P}T^{E}$ of (15), we obtain the general transformation of (16) for every $k > m$:


    ${}^{W}T^{E_k} = ({}^{W}T^{P_k})\,({}^{W}T^{C}\,{}^{C}T^{P_m})^{-1}\,({}^{W}T^{E_m})$ (16),

    where ${}^{C}T^{P_m}$ is obtained from the ULC observation (${}^{C}T^{P_m}_{\mathrm{observed}}$) once the CAP procedure is performed. However, it can be initially approximated using the desired value ${}^{C}T^{P_m}_{\mathrm{desired}}$ assigned by the user in the path definition of (7). In the case where the middle point of the path is defined by the user with respect to the world coordinate system as ${}^{W}T^{P_m}$ instead of the camera, an initial approximation can be calculated by inverting (12) as:


    ${}^{C}T^{P_m}_{\mathrm{approx}} = ({}^{W}T^{C})^{-1}\,{}^{W}T^{P_m}$ (17)

    Moreover, ${}^{W}T^{E_m}$ in equation (16) is obtained from (18), where ${}^{P}T^{E}_{\mathrm{desired}}$ is defined by the user and ${}^{W}T^{P_m}$ is obtained either directly from the path defined by the user or by first computing it via (12) with ${}^{C}T^{P_m}$ taken from the path definition.


    ${}^{W}T^{E_m} = {}^{W}T^{P_m}\,{}^{P}T^{E}_{\mathrm{desired}}$ (18)

    Finally, it should be noted that after the new observation of ${}^{C}T^{P_m}$ is obtained via the ULC, the corrected pose of the end-effector with respect to the part can be calculated using:


    ${}^{P}T^{E}_{\mathrm{corrected}} = ({}^{W}T^{C}\,{}^{C}T^{P_m}_{\mathrm{observed}})^{-1}\,({}^{W}T^{E_m})$ (19)
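
    Again continuing the same NumPy sketch (reusing the hypothetical make_transform, invert_transform, and Rz helpers), the ULC correction of equations (12), (16), (18), and (19) can be written as the products below; the ULC pose, the desired middle-point pose, and the simulated observation are illustrative assumptions only.

```python
import numpy as np

# Fixed ULC pose and quantities from the path definition (illustrative values).
T_W_C          = make_transform(np.eye(3), [0.30, 0.20, 0.00])    # W_T_C: ULC pose, fixed to the base
T_C_Pm_desired = make_transform(np.eye(3), [0.00, 0.00, 0.25])    # C_T_Pm: middle point from the path definition (7)
T_P_E_desired  = make_transform(np.eye(3), [0.00, 0.00, -0.04])   # P_T_E_desired: picking pose defined by the user

# Equations (12) and (18): desired part and end-effector poses at the middle point.
T_W_Pm = T_W_C @ T_C_Pm_desired
T_W_Em = T_W_Pm @ T_P_E_desired

# CAP: the ULC observation of the part; a small offset stands in for real pick slippage.
T_C_Pm_observed = T_C_Pm_desired @ make_transform(Rz(0.01), [0.002, -0.001, 0.0])

# Equation (19): corrected pose of the end-effector with respect to the part.
T_P_E_corrected = invert_transform(T_W_C @ T_C_Pm_observed) @ T_W_Em

# Equation (16): corrected end-effector target for a later key point k > m,
# given the desired part pose W_T_Pk for that key point (illustrative value).
T_W_Pk = make_transform(np.eye(3), [0.70, 0.50, 0.10])
T_W_Ek = T_W_Pk @ invert_transform(T_W_C @ T_C_Pm_observed) @ T_W_Em
```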

    [0047] The introduced point-to-point path planning method is a comprehensive online approach based on a highly flexible path definition method, by use of which an online correction and alignment method is applied to the pose of both the robot's end-effector and the workpiece with respect to the path and to the pick and/or placement nests.

    [0048] An advantageous path definition based on multiple coordinate systems is introduced and used in the proposed method, which allows hierarchical transformation from the world coordinate system to the final varying coordinate systems defined as poses of the workpiece (part) on the path. This beneficial definition is the basis for online correction and alignment on the path as new observations from the dual camera vision system are obtained.

    [0049] An online path planner with three correction and alignment procedures is introduced: i) LBP (look before picking), ii) CPP (correct pose on path), and iii) CPN (correct pose on nest), corresponding to the CBP, CAP, and CEP procedures described above. These procedures respectively correct and align the pose of a) the workpiece with respect to the robot, b) the workpiece with respect to the desired path, and c) the workpiece with respect to the placement nest. Each of the three procedures can be used according to the requirements of the specific application that the robot is tasked to perform.

    [0050] From the foregoing, it will be seen that this invention is one well adapted to attain all the ends and objects hereinabove set forth together with other advantages which are obvious and which are inherent to the method and apparatus. It will be understood that certain features and sub combinations are of utility and may be employed without reference to other features and sub combinations. This is contemplated by and is within the scope of the claims. Since many possible embodiments of the invention may be made without departing from the scope thereof, it is also to be understood that all matters herein set forth or shown in the accompanying drawings are to be interpreted as illustrative and not limiting.

    [0051] The constructions described above and illustrated in the drawings are presented by way of example only and are not intended to limit the concepts and principles of the present invention. As used herein, the terms having and/or including and other terms of inclusion are terms indicative of inclusion rather than requirement.

    [0052] While the invention has been described with reference to preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof to adapt to particular situations without departing from the scope of the invention. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope and spirit of the appended claims.