Stereo calibration method for movable vision system
11663741 · 2023-05-30
Inventors
- Kaifang Wang (Shanghai, CN)
- Dongdong Yang (Shanghai, CN)
- Xiaolin Zhang (Shanghai, CN)
- Jiamao Li (Shanghai, CN)
- Wenhao Wang (Shanghai, CN)
Abstract
A stereo calibration method for a movable vision system. The movable vision system involved in the method comprises at least two photography components, at least one calculation component, and at least one control component. The method comprises: placing a calibration template in front of the photography components; rotating the photography components about their degrees of freedom of motion; obtaining one or more groups of images including calibration template features, together with the corresponding position information at the degrees of freedom of motion when the images are obtained; obtaining, by calculation, a calibration result for the photography components and the degrees of freedom of motion, i.e., a rotation matrix and a translation matrix of the rotation axis at each degree of freedom of motion with respect to the photography components; and then obtaining a stereo calibration result in real time by combining the obtained calibration result with the position information at the degrees of freedom of motion in the vision system. The method implements stereo calibration of the photography components of a movable multiocular system in a motion state, offers good real-time computational performance, eliminates errors caused by machining or assembly, and has broad application prospects.
Claims
1. A stereo calibration method of a movable multi-ocular vision system, wherein the movable multi-ocular vision system comprises: at least two imaging components, each comprising at least one imaging element capable of capturing consecutive images and having an arbitrary number of rotation axes, wherein each rotation axis is provided with a position acquisition device capable of acquiring rotation or translation information; at least one calculation component capable of calculating and processing information of images and information of movement in association with each rotation axis; and at least one control component capable of controlling movement in association with each rotation axis, and wherein the stereo calibration method comprises: acquiring position information for each rotation axis at a corresponding reference position; for each imaging element, placing a calibration template in front of the imaging element; rotating an a-th rotation axis a plurality of times, and for each rotation, capturing image information and recording position information of each rotation axis, where a=1 . . . N, a represents a serial number of a rotation axis, and N represents the number of rotation axes; calculating rotation and translation transformation matrices for each captured image with respect to the position of the calibration template by using a calibration algorithm and a sequence of images captured at respective rotations; and calculating calibration results for the imaging element and each rotation axis by the calculation component, wherein the calibration results comprise a rotation matrix and a translation matrix of each rotation axis relative to the imaging element.
2. The stereo calibration method of claim 1, wherein the position acquisition device is provided on each rotation axis for obtaining information of movement of that rotation axis.
3. The stereo calibration method of claim 1, wherein at least one of the imaging components has one or more rotation axes.
4. The stereo calibration method of claim 1, wherein the calibration template includes one of a natural static scene and an artificial standard target, wherein the natural static scene when serving as the calibration template provides invariant image features and known information of position relationship between the image features, and wherein the artificial standard target includes at least one of a two-dimensional (2D) planar target and a three-dimensional (3D) stereoscopic target.
5. The stereo calibration method of claim 1, comprising the steps of: for each imaging component, recording position information of the imaging component in each rotation axis at a reference position; obtaining precise parameters of the relative positions between imaging elements by performing a stereo calibration at the reference position; and performing a calibration process for each rotation axis and calculating, by the calculation component, a rotation matrix and a translation matrix of each rotation axis relative to an imaging element coordinate system.
6. The stereo calibration method of claim 1, wherein the stereo calibration results include reference position information, stereo calibration results of the imaging element at the reference position, and rotation and translation relationships of each rotation axis relative to an imaging element coordinate system, the stereo calibration results of the imaging element at the reference position including rotation and translation relationships between imaging elements.
7. The stereo calibration method of claim 1, further comprising calculating and obtaining rotation and translation matrices indicating relative positional relationships between the imaging elements by feeding the calibration results and position information of each rotation axis into a relationship model between an imaging element coordinate system and a coordinate system of respective rotation axes.
8. The stereo calibration method of claim 7, wherein the relationship model is
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(8) The present application will be described in greater detail below with reference to particular embodiments illustrated in the accompanying drawings. For the sake of brevity, when describing the various processes, conditions and experimental methods of those embodiments, some details well known in the art are omitted, and the present application is not particularly limited to such details.
(9) A more detailed description of the stereo calibration methods of a movable multi-ocular vision system of the present application will be set forth below with reference to
(10) The movable multi-ocular vision system mentioned in the present application includes at least two sets of imaging components, at least one calculation component and at least one control component. Each imaging component may be individually provided with one calculation component and one control component. Alternatively, the imaging components may share a single calculation component and a single control component. The imaging components, calculation components, and control components are coupled by signal connections. As shown in
(11) As shown in
(12) (S11) obtaining initial information of the vision system;
(13) (S12) for a system in need of binocular stereo calibration results, capturing several sets of images of the calibration template at a reference position for binocular stereo calibration;
(14) (S13) calculating and obtaining rotation and translation matrices between the two imaging components at the reference position using an existing calculation algorithm for a binocular fixed vision system;
(15) (S14) rotating each motion axis, capturing corresponding information of images of the calibration template and recording position information of each motion axis;
(16) (S15) based on the information captured in step S14, calculating rotation and translation matrices of each motion axis relative to an imaging component coordinate system; and
(17) (S16) obtaining calibration results for the movable multi-ocular vision system and ending the calibration process.
(18) Steps S12 and S13 may be interchanged with steps S14 and S15 in the order in which they are performed. That is, it is also possible for the steps to be performed in the following order: S11, S14, S15, S12, S13 and S16.
(19) The steps of calculating extrinsic parameters may include:
(20) (S21) acquiring the calibration results for the movable vision system;
(21) (S22) acquiring position information of the two imaging components (two eyes) in need of stereo calibration in each motional degree of freedom; and
(22) (S23) calculating rotation and translation matrices between the two imaging components by feeding related information to the solving model for extrinsic parameters of the movable vision system.
EXAMPLES
(24) The present application will be described in detail with reference to the six-DOF movable binocular vision system as shown in
(25) 1. Calibration of Initial Data
(26) In the present application, one or more calibration templates possessing extractable invariant features with known distances between them may be used. In the present embodiment, the camera calibration algorithm based on a single planar checkerboard proposed by Professor Zhang Zhengyou in 1998 is employed. In particular, a printed photo of a planar checkerboard, adhered to a flat plate, may be used as the calibration template, as shown in
(27) In this embodiment, in order to reduce the computational difficulty, each degree of freedom is calibrated independently. Taking the calibration of the roll, pitch and yaw degrees of freedom of the left “eye” (corresponding to motors Nos. 1, 2 and 3) as an example, the calibration template is fixed in position in front of the movable binocular vision system in such a manner that the complete calibration template can be seen by the left “eyeball” imaging element. Then, each of the motors Nos. 1, 2 and 3 is rotated several times, and following each rotation, an image M.sub.i is captured by the left “eyeball” imaging element, and the position information {θ.sub.1i,θ.sub.2i,θ.sub.3i} of each motion axis corresponding to the respective motor at the time of capture is recorded. Combining the images with the position information yields {M.sub.i,θ.sub.1i,θ.sub.2i,θ.sub.3i}. Based on the sequence of images (M.sub.1,M.sub.2,M.sub.3, . . . ,M.sub.P) captured after the respective rotations, rotation and translation transformation matrices {R.sub.Ci,T.sub.Ci} of each captured image with respect to the position of the calibration template can be calculated by using the planar checkerboard calibration template shown in
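As a minimal numpy sketch (not the patent's own implementation) of how the per-image pose {R.sub.Ci,T.sub.Ci} can be recovered from a single planar checkerboard view, the following assumes the camera intrinsic matrix K is already known; Zhang's full method also estimates K and lens distortion, which are omitted here:

```python
import numpy as np

def homography_dlt(world_xy, img_xy):
    """Direct Linear Transform: fit the homography H mapping planar
    target coordinates (X, Y) to pixel coordinates (u, v)."""
    rows = []
    for (X, Y), (u, v) in zip(world_xy, img_xy):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)          # null vector, up to scale and sign

def pose_from_homography(K, H):
    """Decompose H ~ K [r1 r2 t] into the target pose (R_C, T_C)."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])  # recover the unknown scale
    if (lam * M[:, 2])[2] < 0:           # target must lie in front of the camera
        lam = -lam
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # project onto the nearest rotation
    return U @ Vt, t
```

Repeating this decomposition for each image M.sub.i in the sequence produces the set {R.sub.Ci,T.sub.Ci} used in the axis calibration below.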
(28) A position allowing cameras of the binocular vision system to have a large common field of view is then chosen as a reference position for stereo calibration, and the recorded position information of the six motors in the binocular system is denoted as (θ.sub.I1,θ.sub.I2,θ.sub.I3,θ.sub.I4,θ.sub.I5,θ.sub.I6), where the subscripts 1 to 6 respectively represent the roll, pitch and yaw degrees of freedom of the left and right eyes (this also applies to any subsequently captured image).
(29) In practice, in order to improve the precision of stereo calibration at the reference position, it is preferred that a set of calibration images is captured while the cameras are held in the same position. That is, with the cameras kept stationary, the pose of the planar checkerboard is changed in front of the cameras and multiple sets of images are captured. In general, 20 sets of images are captured.
(30) 2. Stereo Calibration of Movable Binocular Vision System
(31) Using several sets of images of the stereo calibration template captured at the reference position (θ.sub.I1,θ.sub.I2,θ.sub.I3,θ.sub.I4,θ.sub.I5,θ.sub.I6) in the previous step, and Zhang's stereo calibration algorithm well-known in the art, a positional relationship (including a rotation matrix R.sub.I and a translation vector T.sub.I) between the two cameras at the reference position (reference can be made to “A Flexible New Technique for Camera Calibration” published by Zhang Zhengyou in 1998 for more details in calculation steps) can be calculated.
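The core relationship behind the stereo step can be sketched in numpy: once each camera's pose with respect to the same template view is known, the inter-camera extrinsics follow by composing the two transforms. This is only the single-view relation; the full stereo calibration refines it jointly over the many sets of images captured at the reference position.

```python
import numpy as np

def to_hom(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def stereo_extrinsics(R_lw, t_lw, R_rw, t_rw):
    """Relative pose (R_I, T_I) mapping left-camera coordinates to
    right-camera coordinates, given each camera's pose of the same
    template view: P_cam = R x_w + t."""
    T = to_hom(R_rw, t_rw) @ np.linalg.inv(to_hom(R_lw, t_lw))
    return T[:3, :3], T[:3, 3]
```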
(32) 3. Calibration of Motion Axis
(33) Limited by the existing mechanical machining techniques, it is difficult to either ensure that the optical center of the movable binocular vision system is positioned on the rotation axes or ensure that the rotation axis coordinate system is parallel to the camera coordinate systems. Therefore, it is necessary to calculate pose variation of each camera coordinate system based on outputs of the rotation axis encoders, which requires knowing positional relationships between each rotation axis and camera coordinate system. For a rigid body, positional relationships between each rotation axis and the camera are kept unchanged. In order to determine positional relationships between each rotation axis and camera coordinate system, a mathematical model is created as shown in
(34) If O.sub.cO.sub.b=d (a constant value that may be determined by the mechanical machining technique), then the coordinate of O.sub.c in the system B can be represented as t=(−d,0,0).sup.T. Let R.sub.BC be a rotation matrix from the camera coordinate system C to the rotation axis coordinate system B. For any point P in space, its coordinate P.sub.C in the system C and coordinate P.sub.B in the system B satisfy the transformation equation P.sub.B=R.sub.BCP.sub.C+t, expressed in homogeneous coordinates as:
(35)
$$\begin{pmatrix} P_B \\ 1 \end{pmatrix} = T_{BC} \begin{pmatrix} P_C \\ 1 \end{pmatrix} \qquad (1)$$
(36) where
(37)
$$T_{BC} = \begin{pmatrix} R_{BC} & t \\ 0 & 1 \end{pmatrix}$$
is the desired calibration result, i.e., the transformation matrix from the system C to the system B for any point P in space.
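The equivalence of the direct form P.sub.B=R.sub.BCP.sub.C+t and its 4×4 homogeneous form can be checked numerically; the offset d, rotation angle and test point below are purely illustrative values, not values from the embodiment:

```python
import numpy as np

# Hypothetical example values: a rotation R_BC about z and the offset
# t = (-d, 0, 0) of the optical center O_c in the axis system B.
d = 0.04
t = np.array([-d, 0.0, 0.0])
th = 0.3
R_BC = np.array([[np.cos(th), -np.sin(th), 0.0],
                 [np.sin(th),  np.cos(th), 0.0],
                 [0.0, 0.0, 1.0]])

T_BC = np.eye(4)                 # homogeneous form [R_BC t; 0 1]
T_BC[:3, :3] = R_BC
T_BC[:3, 3] = t

P_C = np.array([0.5, -0.2, 1.3])             # a point in the camera system C
P_B_direct = R_BC @ P_C + t                  # P_B = R_BC P_C + t
P_B_hom = (T_BC @ np.append(P_C, 1.0))[:3]   # same point via the 4x4 matrix
assert np.allclose(P_B_direct, P_B_hom)
```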
(38) After a rotation about the rotation axis by an angle θ, the rotation axis coordinate system B and the camera coordinate system C become new rotation axis coordinate system B′ and new camera coordinate system C′. The rotation about the rotation axis is equivalent to a corresponding rotation of the system B to B′ about the z.sub.b axis by the angle θ. For the same point P, its coordinate P.sub.B in the system B and coordinate P.sub.B′ in system B′ satisfy:
(39)
$$\begin{pmatrix} P_{B'} \\ 1 \end{pmatrix} = T_{B'B} \begin{pmatrix} P_B \\ 1 \end{pmatrix} \qquad (2)$$
(40) where
(41)
$$T_{B'B} = \begin{pmatrix} R_z(\theta) & 0 \\ 0 & 1 \end{pmatrix}, \qquad R_z(\theta) = \begin{pmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
(this value depends on the angle of rotation).
(42) Similarly, the camera coordinate system rotates by the angle θ into the new camera coordinate system C′. During calibration, the transformation of the camera coordinate system can be calculated using a fixed checkerboard. Assuming a point P is represented by a coordinate x.sub.w in a world coordinate system of the checkerboard, the calculated extrinsic parameters of the checkerboard in the systems C and C′ will be T.sub.CW and T.sub.C′W, respectively. Since P.sub.C=T.sub.CWx.sub.w and P.sub.C′=T.sub.C′Wx.sub.w, the following equation can be obtained:
$$P_{C'} = T_{C'W} T_{CW}^{-1} P_C = T_{C'C} P_C \qquad (3)$$
(43) Because the system is a rigid body, the relative positional relationship T.sub.BC between the systems B and C remains unchanged during rotation. Thus, the same point in space still satisfies Eqn. (1) in the new coordinate systems B′ and C′, and the following equation is obtained:
(44)
$$\begin{pmatrix} P_{B'} \\ 1 \end{pmatrix} = T_{BC} \begin{pmatrix} P_{C'} \\ 1 \end{pmatrix} \qquad (4)$$
(45) From Eqns. (2), (3) and (4), the following equation is obtained:
(46)
$$T_{B'B}\, T_{BC} = T_{BC}\, T_{C'C} \qquad (5)$$
(47) where
(48) $T_{BC}$
is the matrix that needs to be solved in the rotation axis calibration process,
(49) $T_{B'B}$
is a matrix output from position sensors for each rotation, and
(50) $T_{C'C}$
is a matrix T.sub.C′C calculated by the camera for each rotation. The T.sub.BC calculated for each rotation is used to calibrate the relationship between the rotation axis and the camera coordinate system.
(51) For a given set of data {R.sub.Ci,T.sub.Ci,θ.sub.1i,θ.sub.2i,θ.sub.3i} (i=1 . . . P), the rotation angles for this set of data relative to the reference position can be calculated as {θ.sub.1i−θ.sub.I1,θ.sub.2i−θ.sub.I2,θ.sub.3i−θ.sub.I3}(i=1 . . . P), and the rotation matrices can be calculated as:
(52)
$$R_{ai} = \begin{pmatrix} \cos(\theta_{ai}-\theta_{Ia}) & \sin(\theta_{ai}-\theta_{Ia}) & 0 \\ -\sin(\theta_{ai}-\theta_{Ia}) & \cos(\theta_{ai}-\theta_{Ia}) & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad a = 1 \ldots 3, \; i = 1 \ldots P \qquad (6)$$
(53) Substituting these into Eqn. (5) gives the following sets of equations:
(54)
$$\begin{pmatrix} R_{ai} & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} R_{BCa} & T_{BCa} \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} R_{BCa} & T_{BCa} \\ 0 & 1 \end{pmatrix} T_{C'C,i} \qquad (7)$$
(55) P sets of equations can be obtained when all the data are substituted into Eqn. (7), and the optimal solutions {R.sub.BCa,T.sub.BCa} (a=1 . . . 3) can be obtained by solving those equations.
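Eqn. (5) has the form of the classical hand-eye calibration equation AX = XB, with A = T.sub.B′B supplied by the encoders, B = T.sub.C′C supplied by the camera, and X = T.sub.BC the unknown. The patent does not specify its solver, so the numpy sketch below shows one standard strategy under that assumption: align the rotation axes by orthogonal Procrustes, then solve a stacked linear least-squares problem for the translation. Note that the demonstration uses generic motions; rotations about a single axis leave the translation component along that axis unconstrained, which is why the known machining offset d is relevant in practice.

```python
import numpy as np

def rotvec(R):
    """Axis-angle vector (log map) of a rotation matrix."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    th = np.arccos(c)
    if th < 1e-12:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(th))
    return th * axis

def rotmat(v):
    """Rodrigues' formula (exp map): axis-angle vector -> rotation matrix."""
    th = np.linalg.norm(v)
    if th < 1e-12:
        return np.eye(3)
    k = v / th
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def solve_ax_xb(As, Bs):
    """Solve A_i X = X B_i for X = [R_X t_X; 0 1], all inputs 4x4 homogeneous.
    Rotation axes satisfy a_i = R_X b_i, fitted by orthogonal Procrustes;
    translation then follows from (R_Ai - I) t_X = R_X t_Bi - t_Ai."""
    M = sum(np.outer(rotvec(A[:3, :3]), rotvec(B[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    R_X = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)
    return R_X, t_X
```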
(56) This process is repeated for motors Nos. 4-6 for the right “eyeball”. As a result, calibration results {R.sub.BCa,T.sub.BCa} (a=1 . . . 6) for motion axes of the six motors are obtained.
(57) 4. Calculation of Real-Time Calibration Results
(58) The overall calibration results include the reference position information (θ.sub.I1,θ.sub.I2,θ.sub.I3,θ.sub.I4,θ.sub.I5,θ.sub.I6), stereo calibration results {R.sub.I,T.sub.I} and motion axis calibration results {R.sub.BCa,T.sub.BCa}(a=1 . . . 6). When each camera component in the movable binocular vision system has moved, position information (θ.sub.p1,θ.sub.p2,θ.sub.p3,θ.sub.p4,θ.sub.p5,θ.sub.p6) in the 6 motional degrees of freedom is obtained. Rotation angles in the six degrees of freedom are obtained from the position information as (θ.sub.p1−θ.sub.I1, θ.sub.p2−θ.sub.I2, θ.sub.p3−θ.sub.I3, θ.sub.p4−θ.sub.I4, θ.sub.p5−θ.sub.I5, θ.sub.p6−θ.sub.I6), which are then converted into rotation matrices (R.sub.p1,R.sub.p2,R.sub.p3,R.sub.p4,R.sub.p5,R.sub.p6). These are fed into the following equation to obtain the post-movement extrinsic parameters (a rotation matrix R′ and a translation matrix T′) of the six-DOF movable binocular vision system, i.e., the real-time calibration results:
(59)
$$\begin{pmatrix} R' & T' \\ 0 & 1 \end{pmatrix} = T_r \begin{pmatrix} R_I & T_I \\ 0 & 1 \end{pmatrix} T_l^{-1}, \qquad T_l = \prod_{a=1}^{3} T_{BCa}^{-1} T_{B'Ba} T_{BCa}, \quad T_r = \prod_{a=4}^{6} T_{BCa}^{-1} T_{B'Ba} T_{BCa}$$
where $T_{B'Ba} = \begin{pmatrix} R_{pa} & 0 \\ 0 & 1 \end{pmatrix}$.
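Under the model of Eqn. (5), each axis's encoder reading induces a camera-frame motion T.sub.C′C=T.sub.BC.sup.−1T.sub.B′BT.sub.BC; composing these motions per eye and conjugating the reference extrinsics gives the real-time result. The numpy sketch below assumes a particular composition order of the three axes per eye, which in practice must match the mechanical chain of the gimbal:

```python
import numpy as np

def hom(R, t):
    """4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t)
    return T

def rz(theta):
    """Encoder angle -> rotation about the axis's own z_b."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def camera_motion(axis_calibs, rel_angles):
    """Accumulate T_C'C over one eye's axes via T_BCa^-1 T_B'Ba T_BCa.
    The composition order of the axes is an assumption here."""
    T = np.eye(4)
    for (R_BC, t_BC), dth in zip(axis_calibs, rel_angles):
        T_BC = hom(R_BC, t_BC)
        T = np.linalg.inv(T_BC) @ hom(rz(dth), np.zeros(3)) @ T_BC @ T
    return T

def realtime_extrinsics(R_I, T_I, left_axes, right_axes, d_left, d_right):
    """Post-movement extrinsics [R' T'; 0 1] = T_r [R_I T_I; 0 1] T_l^-1,
    from the stored calibration results and angles relative to reference."""
    T_l = camera_motion(left_axes, d_left)
    T_r = camera_motion(right_axes, d_right)
    Tp = T_r @ hom(R_I, T_I) @ np.linalg.inv(T_l)
    return Tp[:3, :3], Tp[:3, 3]
```

A quick sanity check of the model: when every axis is at its reference position (all relative angles zero), the result reduces to the reference extrinsics {R.sub.I,T.sub.I}.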
(60) The movable binocular vision system and stereo calibration method mentioned in the present application are not limited to the foregoing embodiments. Additional steps may be added to the embodiments for increased accuracy or reduced computational complexity in practical use, and various other variations and modifications can be made without departing from the principles of the present application.
(61) The scope of the present application is not limited to the embodiments disclosed hereinabove. Rather, it embraces any and all changes or advantages that can be devised by those skilled in the art without departing from the spirit and scope of conception of this application. Thus, the actual protection scope of the application is defined by the appended claims.