CALIBRATION ARTICLE FOR A 3D VISION ROBOTIC SYSTEM
20180222055 · 2018-08-09
Abstract
A calibration article is provided for calibrating a robot and 3D camera. The calibration article includes side surfaces that are angled inward toward a top surface. The robot and camera are calibrated by capturing positional data of the calibration article relative to the robot and the camera. The captured data is used to generate correlation data between the robot and the camera. The correlation data is used by the controller to align the robot with the camera during operational use of the robot and camera.
Claims
1. A method of calibrating a 3D vision robotic system, comprising: positioning a calibration article within a work space of a robot, wherein the calibration article comprises at least four side surfaces and a top or bottom surface, the side surfaces being angled inward toward the top or bottom surface with at least two side surfaces being angled inward at different angles from each other; positioning a 3D camera in a viewing position, wherein at least two of the side surfaces angled at different angles and the top or bottom surface are viewable in the viewing position; capturing a 3D image in the viewing position with the 3D camera, the 3D image comprising positional data corresponding to the viewable side surfaces and top or bottom surface in the viewing position; generating calibration data from the 3D image, wherein the calibration data aligns the camera and the robot; storing the calibration data in memory; and transforming between positions of the robot and the camera during use thereof based upon the calibration data.
2. The method according to claim 1, wherein the 3D camera is mounted on the robot, the 3D camera being positioned in the viewing position with the robot.
3. The method according to claim 1, further comprising storing 3D shape data of the calibration article in memory, wherein the calibration data is generated from the 3D image and the 3D shape data.
4. The method according to claim 1, wherein the 3D camera is positioned in at least two viewing positions, one of the 3D images being captured in each of the two viewing positions, and the calibration data is generated from the 3D images from each viewing position.
5. The method according to claim 1, wherein at least three of the side surfaces are angled inward at different angles from each other.
6. The method according to claim 5, wherein at least four of the side surfaces are angled inward at different angles from each other.
7. The method according to claim 1, wherein the calibration article comprises no more than four side surfaces.
8. The method according to claim 7, wherein the top surface is rectangular and a base of the side surfaces is rectangular.
9. The method according to claim 1, wherein the side surfaces are angled 30° to 75° from a vertical plane.
10. The method according to claim 1, wherein the side surfaces and the top surface are each planar.
11. The method according to claim 10, wherein the side surfaces and the top surface lack recesses or protrusions thereon.
12. The method according to claim 1, wherein the side surfaces are adjoined to the top surface and each side surface is adjoined to another side surface on opposite sides thereof.
13. The method according to claim 1, wherein the side surfaces are adjoined to a bottom surface, the bottom surface being flat and adapted to rest on a flat surface.
14. The method according to claim 1, wherein the calibration article further comprises a mounting plate extending outward from a bottom of the side surfaces.
15. The method according to claim 14, wherein the mounting plate defines the largest plane of the calibration article.
16. The method according to claim 1, wherein the calibration article comprises no more than four side surfaces, and each of the four side surfaces is angled inward at a different angle from the others.
17. The method according to claim 16, wherein the side surfaces are angled 30° to 75° from a vertical plane, and the side surfaces and the top surface are each planar.
18. The method according to claim 17, wherein the side surfaces are adjoined to the top surface and each side surface is adjoined to another side surface on opposite sides thereof.
19. The method according to claim 18, wherein the side surfaces and the top surface lack recesses or protrusions thereon.
20. The method according to claim 19, wherein the top surface is rectangular and a base of the side surfaces is rectangular, and the side surfaces are adjoined to a bottom surface, the bottom surface being flat and adapted to rest on a flat surface.
21. The method according to claim 1, wherein at least one of the side surfaces and the top or bottom surface has a pattern thereon.
22. The method according to claim 1, wherein at least one of the side surfaces and the top or bottom surface has a distinguishable color thereon.
Description
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
[0005] The invention may be more fully understood by reading the following description in conjunction with the drawings, in which:
DETAILED DESCRIPTION
[0014] Referring now to the figures, and particularly
[0015] A vision system 14 is also provided. The vision system 14 includes a 3D camera 14 that is capable of viewing objects in three dimensions within the camera space of the camera 14. Numerous types of 3D cameras 14 exist and may be used with the robot 12. Conventional 3D cameras 14 can use infrared beams, laser beams or stereoscopy. While photography cameras typically output a human viewable image of a scene, the 3D camera 14 need not produce a human readable image. Typically, the output of the camera 14 will be point cloud data that records three dimensional data for a finite number of points on the various surfaces of the objects in the camera space. As shown in
[0016] In order to provide accurate transformation between the robot 12 work space and the camera space of the 3D camera 14, a calibration article 20 is provided. The calibration article 20 is shown generally in
[0017] A coordinate frame {OXYZ} may be attached to the robot base 11 (b), the robot end effector 13 (e), the camera 14 (c), and the calibration article 20 (a), respectively. The coordinate frames may be designated as {OXYZ}.sub.b, {OXYZ}.sub.e, {OXYZ}.sub.c and {OXYZ}.sub.a. The relationship between any two coordinate frames can be represented by a 4 by 4 transformation matrix T.sub.r.sup.s, which is essentially the position and orientation of coordinate frame s in coordinate frame r. With this transformation matrix, the coordinates of any point P in coordinate frame r can be obtained from its coordinates in coordinate frame s with the equation:

P.sub.r=T.sub.r.sup.s*P.sub.s (1)

where P.sub.r and P.sub.s are the homogeneous coordinates of the point P in coordinate frames r and s, respectively.
[0018] In
T.sub.b.sup.p=T.sub.b.sup.e*T.sub.e.sup.c*T.sub.c.sup.p (2)
where T.sub.b.sup.p and T.sub.c.sup.p are the positions and orientations of an object (p) in relation to the robot base 11 frame b and the camera 14 frame c, respectively, and T.sub.b.sup.e is the position and orientation of the robot end effector 13 frame e in relation to the robot base 11 frame b. T.sub.b.sup.e will typically be programmed into and provided by the robot 12 controller.
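The chaining of transforms in Equations (1) and (2) can be illustrated with a short numeric sketch. Python/numpy and the specific transform values below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation of theta radians about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical poses: end effector in base frame (T_b^e) and camera in
# end-effector frame (T_e^c, the hand-eye transform).
T_b_e = make_T(rot_z(np.pi / 2), [0.5, 0.0, 1.0])
T_e_c = make_T(rot_z(0.1), [0.0, 0.05, 0.10])

# An object pose as measured by the camera (T_c^p).
T_c_p = make_T(np.eye(3), [0.0, 0.0, 0.8])

# Equation (2): chain the transforms to get the object pose in the base frame.
T_b_p = T_b_e @ T_e_c @ T_c_p

# Equation (1): map a point from the object frame into the base frame,
# using homogeneous coordinates [x, y, z, 1].
P_p = np.array([0.02, 0.0, 0.0, 1.0])
P_b = T_b_p @ P_p
```

Because each transform is rigid, the chain can be extended or inverted with ordinary matrix products and matrix inverses.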
[0019] The calibration article 20 is designed to allow the vision system 14 to easily and accurately determine the position and orientation of the calibration article 20 from captured images with respect to the camera 14 frame c. In one embodiment, the calibration article 20 may be thought of as a truncated pyramid or a frustum. However, because a conventional frustum is defined as the portion of a pyramid between two parallel planes, some embodiments of the calibration article 20 may not be considered to be a frustum since it may be desirable for the top surface 24 to be non-parallel with the bottom surface 22. Alternatively, the bottom surface 28 may be used herein in place of the top surface 24.
[0020] While the calibration article 20 may have four or more side surfaces 26 adjoined to adjacent side surfaces 26 at opposing sides of each side surface 26, it may be preferable for the calibration article 20 to have only four side surfaces 26. That is, in some embodiments, the calibration article 20 may have five, six, seven or eight side surfaces 26, but the calibration article 20 preferably has at least four side surfaces 26 and may have no more than four side surfaces 26. As shown, the side surfaces 26 may be adjoined along the bottom edges to the bottom surface 22 and may be adjoined along the top edges to the top surface 24.
[0021] As shown, the side surfaces 26 are preferably angled inward from the bottom surface 22 to the top surface 24. Thus, the side surfaces 26 are angled inward toward the top surface 24. One feature of the calibration article 20 is that at least some of the side surfaces 26 are angled inward at different angles relative to each other. Thus, the calibration article 20 is asymmetrical in at least some respect. For example, when at least two of the side surfaces 26 are compared to each other, the side surfaces 26 are oriented at different angles relative to the top surface 24 and/or the bottom surface 22. In order to provide further distinctions between each of the side surfaces 26, it may also be desirable for at least three of the side surfaces 26 to be angled at different angles relative to each other, or for at least four of the side surfaces 26 to be angled at different angles relative to each other. Where the top surface 24 and the bottom surface 22 are parallel to each other, the angles may be measured from the top surface 24 or the bottom surface 22. Where the top surface 24 and the bottom surface 22 are non-parallel, it is preferable to measure the angles from the bottom surface 22; however, the angles may alternately be measured from the top surface 24.
[0022] As shown in
[0024] In use, the shape of the calibration article 20 may be utilized by the vision system 14 to determine the position and orientation of the calibration article 20 from captured images. An example of such a process is shown in
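Recognizing the dominant planes of the calibration article 20 in a captured point cloud is commonly done with a RANSAC-style fit followed by a least squares refinement. The sketch below assumes that approach (the patent does not prescribe a particular plane detector), and the synthetic point cloud stands in for a captured 3D image:

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_plane(points, n_iters=200, tol=0.005):
    """Fit the dominant plane [nx, ny, nz, d] (convention n.p + d = 0)
    to a point cloud by RANSAC, then refine with an SVD plane fit."""
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine over the inlier set: the plane normal is the singular vector
    # of the centered points with the smallest singular value.
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return np.append(n, -n @ centroid), best_inliers

# Synthetic cloud: a noisy tilted plane plus scattered outliers.
u, v = rng.uniform(-0.1, 0.1, (2, 500))
plane_pts = np.c_[u, v, 0.3 * u + 0.1 * v + 0.05] + rng.normal(0, 0.001, (500, 3))
outliers = rng.uniform(-0.2, 0.2, (100, 3))
cloud = np.vstack([plane_pts, outliers])

plane, inliers = ransac_plane(cloud)
```

Running the detector repeatedly, each time on the points not yet assigned to a plane, yields the set of dominant planes used in the fitting step below.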
[0026] Computing the position and orientation of the calibration article 20 after recognizing the dominant planes in the captured 3D images of the calibration article 20 can be done in numerous ways. For example, one method is least squares fitting. In this method, the pre-stored shape data of the calibration article 20 may include the plane equation parameters [n.sub.x,n.sub.y,n.sub.z,d].sub.a for each surface with respect to the coordinate frame of the calibration article 20:

[n.sub.xn.sub.yn.sub.zd].sub.a*P.sub.a=0 (3)

where <n.sub.x,n.sub.y,n.sub.z> is a unit vector normal to the plane, d is the distance of the origin of the calibration article 20 frame a to the identified dominant plane, and P.sub.a is any point on the plane in homogeneous coordinates. If the position and orientation of the calibration article 20 is defined as T.sub.c.sup.a with respect to the camera 14 frame c, then the plane fitting algorithm in step (51) yields the corresponding plane parameters with respect to the camera 14 frame c:

[n.sub.xn.sub.yn.sub.zd].sub.c*P.sub.c=0 (4)

Using the relationship:

P.sub.c=T.sub.c.sup.a*P.sub.a (5)

the plane equations (3) and (4) may be combined as:

[n.sub.xn.sub.yn.sub.zd].sub.cT.sub.c.sup.a=[n.sub.xn.sub.yn.sub.zd].sub.a (6)
Since the number of recognized surfaces is preferably at least three, there will be at least three instances of Equation (6) available to use in a linear least squares fitting to solve for T.sub.c.sup.a. Since the bottom surface 28 or the mounting plate 29 may also be recognized, this surface could also be added into the linear least squares fitting to solve for T.sub.c.sup.a.
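As a sketch of this least squares step: the rotation part of T.sub.c.sup.a can be recovered from the matched unit normals (a Kabsch/SVD fit of n.sub.c=R*n.sub.a) and the translation from the scalar equations n.sub.c·t=d.sub.a−d.sub.c. Python/numpy, the article geometry, and the pose values below are illustrative assumptions:

```python
import numpy as np

def unit(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def rot(axis, angle):
    """Rotation matrix from an axis and angle (Rodrigues' formula)."""
    a = unit(axis)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def solve_pose_from_planes(planes_a, planes_c):
    """Solve [n d]_c * T_c^a = [n d]_a for T_c^a, given matched plane
    parameters [nx, ny, nz, d] (convention n.p + d = 0, unit normals)
    in the article frame (planes_a) and camera frame (planes_c)."""
    n_a, d_a = np.asarray(planes_a)[:, :3], np.asarray(planes_a)[:, 3]
    n_c, d_c = np.asarray(planes_c)[:, :3], np.asarray(planes_c)[:, 3]
    # Rotation: n_c,i = R n_a,i for each plane; Kabsch/SVD least squares fit.
    H = n_a.T @ n_c
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    # Translation: n_c,i . t = d_a,i - d_c,i; stacked linear least squares.
    t, *_ = np.linalg.lstsq(n_c, d_a - d_c, rcond=None)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical article geometry: a top plane and three differently tilted sides.
planes_a = np.array([
    np.append(unit([0.0, 0.0, 1.0]), -0.05),
    np.append(unit([0.6, 0.0, 0.8]), -0.04),
    np.append(unit([0.0, 0.7, 0.714]), -0.03),
    np.append(unit([-0.5, 0.0, 0.866]), -0.045),
])

# Hypothetical ground-truth pose T_c^a, used to synthesize camera-frame planes.
T_true = np.eye(4)
T_true[:3, :3] = rot([0.2, 0.3, 1.0], 0.7)
T_true[:3, 3] = [0.1, -0.05, 0.6]
planes_c = planes_a @ np.linalg.inv(T_true)   # [n d]_c = [n d]_a * (T_c^a)^-1

T_est = solve_pose_from_planes(planes_a, planes_c)
```

With the article's four side surfaces angled differently, the four normals are linearly independent, which is what makes both linear solves well conditioned.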
[0027] Using the algorithms described above for computing the position and orientation of the calibration article 20 from captured images, there are various ways to use the calibration article 20 to calibrate the geometric relationship between the robot 12 space and the camera 14 space.
[0028] The mathematical foundation of camera 14 to robot 12 calibration may be based on a closed kinematic chain:

T.sub.b.sup.a=T.sub.b.sup.e(Θ)*T.sub.e.sup.c*T.sub.c.sup.a (7)

where Θ is the measurement of the robot joint angles, T.sub.b.sup.e is the position and orientation of the robot end effector 13 in relation to the robot base 11 frame b and is provided by the robot controller through a forward kinematics calculation as:

T.sub.b.sup.e(Θ)=f(Θ,M) (8)

with M being the robot kinematic model parameters, including the arm link 18 lengths, the initial joint positions, and other factors; T.sub.c.sup.a is the position and orientation of the calibration article 20 in relation to the camera 14 frame c as computed above.
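As a toy illustration of Equation (8), the sketch below evaluates f(Θ,M) for a planar two-link arm, with M reduced to the two link lengths; a real industrial robot has six joints and a far richer parameter set M, and the calculation is supplied by the robot controller:

```python
import numpy as np

def fk_planar_2link(theta, M):
    """Toy forward kinematics T_b^e = f(theta, M) for a planar two-link arm.
    theta: the two joint angles; M: the two link lengths. Returns the 4x4
    pose of the end effector in the base frame."""
    t1, t2 = theta
    l1, l2 = M
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    phi = t1 + t2   # heading of the end effector in the base frame
    c, s = np.cos(phi), np.sin(phi)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:3, 3] = [x, y, 0.0]
    return T

# With both joints at zero the arm is fully stretched along the base x axis.
T_home = fk_planar_2link((0.0, 0.0), (0.3, 0.2))
```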
[0029] It is noted that Equation (7) is valid for a robotic system shown in
[0030] If the position and orientation of the calibration article 20 in robot 12 space, T.sub.b.sup.a, is unknown, then Equation (7) will have two unknown matrices, T.sub.e.sup.c and T.sub.b.sup.a. To solve for them, at least two instances of Equation (7) are required, which means at least two camera 14 view angles are needed. As shown in
T.sub.b.sup.a=T.sub.b.sup.e(Θ.sup.(i))*T.sub.e.sup.c*T.sub.c.sup.a(i), i=1 . . . N (9)

where Θ.sup.(i) is the vector of robot joint angles at the i.sup.th view position and T.sub.c.sup.a(i) is the detected position and orientation of the calibration article 20 in the camera 14 frame c at the i.sup.th view position. Equation (9) has two unknowns T.sub.e.sup.c and T.sub.b.sup.a, which can be solved using a standard linear least squares algorithm, or other known algorithms. This method is also known as hand-eye calibration.
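One common way to solve Equation (9) is to reduce pairs of views to the relative-motion form A*X=X*B and then solve the rotation and translation in two linear steps (a Tsai/Lenz-style reduction; the patent does not mandate this particular algorithm). The sketch below uses synthetic poses as a stand-in for measured data:

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix from an axis and angle (Rodrigues' formula)."""
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_axis(R):
    """Unnormalized rotation axis from the skew-symmetric part of R."""
    return np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def hand_eye(T_b_e_list, T_c_a_list):
    """Solve Equation (9) for X = T_e^c from matched robot poses A_i = T_b^e(i)
    and camera observations B_i = T_c^a(i), via the reduction A X = X B."""
    As, Bs = [], []
    for i in range(1, len(T_b_e_list)):
        # Relative motions w.r.t. view 0 satisfy (A_i^-1 A_0) X = X (B_i B_0^-1).
        As.append(np.linalg.inv(T_b_e_list[i]) @ T_b_e_list[0])
        Bs.append(T_c_a_list[i] @ np.linalg.inv(T_c_a_list[0]))
    # Rotation: motion axes satisfy axis(A) = R_X axis(B); Kabsch/SVD fit.
    a_axes = np.array([rot_axis(A[:3, :3]) for A in As])
    b_axes = np.array([rot_axis(B[:3, :3]) for B in Bs])
    H = b_axes.T @ a_axes
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_X = Vt.T @ S @ U.T
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all motions.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    v = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(M, v, rcond=None)
    X = np.eye(4)
    X[:3, :3] = R_X
    X[:3, 3] = t_X
    return X

# Synthetic ground truth: hand-eye transform X = T_e^c and article pose Y = T_b^a.
X_true = make_T(rodrigues([0.0, 1.0, 0.2], 0.3), [0.0, 0.05, 0.12])
Y_true = make_T(rodrigues([1.0, 0.5, 0.0], -0.4), [0.8, 0.1, 0.0])

# Four robot view poses; each camera observation follows from B = (A X)^-1 Y.
T_b_e_list = [make_T(rodrigues(ax, ang), t) for ax, ang, t in [
    ([0, 0, 1], 0.5, [0.4, 0.0, 0.5]),
    ([1, 0, 0], 0.9, [0.3, 0.2, 0.6]),
    ([0, 1, 0], -0.7, [0.5, -0.1, 0.4]),
    ([1, 1, 0], 0.6, [0.2, 0.1, 0.7]),
]]
T_c_a_list = [np.linalg.inv(A @ X_true) @ Y_true for A in T_b_e_list]

X_est = hand_eye(T_b_e_list, T_c_a_list)
Y_est = T_b_e_list[0] @ X_est @ T_c_a_list[0]   # recover T_b^a from Equation (9)
```

The rotation step needs at least two relative motions with non-parallel axes, which is why at least two, and preferably more, camera view positions are used.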
[0031] After calibration, the correlation data is used in the form of Equation (2). That is, after the correlation data has been generated (76, 86), positional data captured by the camera 14 through viewing a work piece can be transformed with the correlation data into a precise location of the work piece relative to the robot base 11 frame b. Thus, after the calibration routine is complete, the correlation data is preferably stored in memory (77, 87) and used by the system (78, 88) during regular operation of the robot 12 and vision 14 systems for transforming positions between camera space and robot space.
[0032] While preferred embodiments of the inventions have been described, it should be understood that the inventions are not so limited, and modifications may be made without departing from the inventions herein. While each embodiment described herein may refer only to certain features and may not specifically refer to every feature described with respect to other embodiments, it should be recognized that the features described herein are interchangeable unless described otherwise, even where no reference is made to a specific feature. It should also be understood that the advantages described above are not necessarily the only advantages of the inventions, and it is not necessarily expected that all of the described advantages will be achieved with every embodiment of the inventions. The scope of the inventions is defined by the appended claims, and all devices and methods that come within the meaning of the claims, either literally or by equivalence, are intended to be embraced therein.