METHOD OF HIGH-PRECISION 3D RECONSTRUCTION OF EXISTING RAILWAY TRACK LINES BASED ON UAV MULTI-VIEW IMAGES

Abstract

Disclosed is a method of high-precision 3D reconstruction of existing railway track lines based on UAV multi-view images, including: acquiring initial data, acquiring a UAV image rail top centerline, calculating the rail top centerline based on a nonlinear least squares method, and calculating three-dimensional coordinates of the rail centerline. Based on the multi-view geometry principles of computer vision and photogrammetry, object space coordinates of the line can be calculated directly from image information, so that no field workers need to work on the operating line, which effectively improves the safety of surveying and mapping on railway operation lines. The method therefore has significant engineering application value and broad application prospects.

Claims

1. A method of high-precision three-dimensional (3D) reconstruction of existing railway track lines based on unmanned aerial vehicle (UAV) multi-view images, comprising: S1, acquiring initial data comprising original images from UAV multi-view, external azimuth elements of the images, internal parameters of a camera, and initial coordinates of a rail top centerline; S2, back-projecting the initial coordinates of the rail top centerline to the original images using the image external azimuth elements and the camera internal parameters, and adjusting a location of an image straight segment to obtain a precise image rail top centerline observation value; S3, optimizing the image rail top centerline observation value using a nonlinear least squares method to obtain an object space coordinate parameter of a rail top straight segment, and connecting adjacent straight segments in sequence using the object space coordinate parameter to obtain complete 3D coordinates of the rail top centerline; and S4, distinguishing between rail straight and curved segments according to the obtained 3D coordinates of the rail top centerline, and calculating 3D centerline coordinates of each segment in turn to obtain high-precision 3D coordinates of the rail top centerline.

2. The method according to claim 1, wherein the back-projecting the initial coordinates of the rail top centerline to the original image in S2 comprises: segmenting an initial rail top centerline input according to a preset length threshold to obtain a plurality of straight segments of the rail top centerline; and back-projecting the rail top straight segments as segmented to the original image from UAV multi-view according to a collinear condition equation using the precise image external azimuth elements and internal parameters of camera to obtain a rough location of each rail top straight segment on the image.

3. The method according to claim 2, wherein in S2, the adjusting the location of the image straight segment comprises finely adjusting an endpoint location of the image straight segment so that each image rail straight segment is accurately located on the rail top centerline.

4. The method according to claim 2, wherein in S3, the optimizing the image rail top centerline observation value using the nonlinear least squares method to obtain the object space coordinate parameter of the rail top straight segment comprises: for the straight segments of the rail top centerline, calculating an included angle between each two image projection planes as an intersection angle, and taking an object space straight segment formed by an intersection of two image projection planes with the largest intersection angle as an initial value of least squares adjustment of the straight segment; taking an Euclidean distance between a back-projected straight segment of the rail top straight segment as the initial value on the image and a corresponding image line observation value as a cost to form a cost equation; calculating an overall cost function of least squares optimization of any object space rail top straight segment according to the cost equation; performing Taylor series expansion on the overall cost function, and omitting higher-order terms to obtain a linearized error equation; and solving the object space coordinate parameter of the rail top straight segment using the linearized error equation according to least squares adjustment criterion.

5. The method according to claim 4, wherein, for any object space rail top straight segment L.sub.i, if it has image line observation values on a plurality of images, several cost equations are formed, and an overall cost function of least squares optimization of the straight segment L.sub.i is: C=Σ.sub.k dist(proj(L.sub.i, T.sub.k), l.sub.ik), where C represents an overall back-projection cost of the least squares optimization of the straight segment L.sub.i, dist(·) is a Euclidean distance function from an observed value of the image line to the back-projection straight segment of the rail top straight segment, proj(·) represents a back-projection function based on perspective imaging, T.sub.k is internal and external azimuth elements of the k-th image on which L.sub.i is observable, and l.sub.ik represents an image straight segment observation value corresponding to L.sub.i on this image.

6. The method according to claim 2, wherein, in S3, the connecting adjacent straight segments in sequence using the object space coordinate parameter to obtain the complete 3D coordinates of the rail top centerline comprises: taking a point on the obtained rail top straight segment that is closest to a projected ray of an image line endpoint as a homologue point of that endpoint on the rail top straight segment, calculating coordinates of homologue points of all image line endpoints on the rail top straight segment, and taking an average value thereof as an endpoint of the rail top straight segment; and determining a corresponding connection sequence according to starting and ending coordinates of each segment, and calculating an average value of coordinates of endpoints of adjacent rail top straight segments that are close to each other as coordinates of a rail node, so as to realize connection of the adjacent straight segments and obtain the complete rail top centerline.

7. The method according to claim 6, wherein in S4, the distinguishing between the rail straight and curved segments according to the obtained 3D coordinates of the rail top centerline comprises: for the obtained rail top centerline, calculating an azimuth angle of each segment by taking the rail node as a distinguishing point, and counting minimum and maximum azimuth angles; forming a rectangular slice space by taking a preset threshold δ as a search width, and counting a number N of rail nodes falling into the rectangular slice space; if N is greater than a preset number threshold, determining that the rail nodes in the rectangular slice space are all straight segment points; otherwise, determining that the rail nodes in the rectangular slice space are curved segment points; and moving the rectangular slice space upward by a distance of δ/2 by taking the minimum azimuth angle as a starting point, and continuing to determine the straight/curved segment points until the rectangular slice space reaches the maximum azimuth angle.

8. The method according to claim 1, wherein in S4, the calculating the 3D centerline coordinates of each segment in turn to obtain the high-precision 3D coordinates of the rail top centerline comprises: where two rails of the track are denoted G.sub.1 and G.sub.2, if it is a straight segment, for each node P.sub.G1 on the straight segment, calculating a point P.sub.G2 on G.sub.2 closest to P.sub.G1, and calculating a midpoint of P.sub.G1 and P.sub.G2 as a node of the 3D centerline; and performing such operation for all nodes on the rail G.sub.1 to obtain a 3D centerline of the straight segment.

9. The method according to claim 1, wherein in S4, the calculating the 3D centerline coordinates of each segment in turn to obtain the high-precision 3D coordinates of the rail top centerline comprises: if it is a curved segment, for each inner rail node P.sub.N(X.sub.p, Y.sub.p, Z.sub.p), first calculating azimuth angles α.sub.1 and α.sub.2 in a normal direction of two straight segments before and after the node on a two-dimensional plane, and offsetting a plane point P.sub.T(X.sub.p, Y.sub.p) along a direction of (α.sub.1+α.sub.2)/2 to the inside of the rail by a distance d=(1.435+ϑ)/2, where ϑ is a width of the rail top, to obtain coordinates (X.sub.S, Y.sub.S) as follows: X.sub.S=X.sub.P+d·sin((α.sub.1+α.sub.2)/2), Y.sub.S=Y.sub.P+d·cos((α.sub.1+α.sub.2)/2); and an elevation Z.sub.p of the corresponding inner rail point is used as an elevation value of a 3D centerline node of the curved segment to obtain coordinates (X.sub.S, Y.sub.S, Z.sub.p) of a 3D centerline node corresponding to the point P.sub.N.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0022] FIG. 1 is a flowchart of a method of high-precision 3D reconstruction of existing railway track lines based on UAV multi-view images according to an embodiment of the present application.

[0023] FIG. 2 is a flowchart showing a process of obtaining a UAV image rail top centerline according to an embodiment of the present application.

[0024] FIG. 3 is a flowchart showing a process of calculating the rail top centerline based on a nonlinear least squares method according to an embodiment of the present application.

[0025] FIG. 4 is a flowchart showing a process of performing least squares optimization on the rail top centerline according to an embodiment of the present application.

[0026] FIG. 5 is a flowchart showing a process of calculating high-precision 3D coordinates of the rail top centerline according to an embodiment of the present application.

[0027] FIG. 6 is a flowchart showing a process of distinguishing rail straight and curved segments according to an embodiment of the present application.

[0028] FIG. 7 is a schematic diagram of least squares optimization of a rail top centerline of the method of high-precision 3D reconstruction of existing railway track lines based on UAV multi-view images according to an embodiment of the present application.

[0029] FIG. 8 is a schematic diagram of rail straight/curved segment determination of the method of high-precision 3D reconstruction of existing railway track lines based on UAV multi-view images according to an embodiment of the present application.

DETAILED DESCRIPTION OF THE EMBODIMENTS

[0030] The present application will be further described in detail below through the accompanying drawings and specific embodiments.

[0031] As shown in FIG. 1, an embodiment of the present application provides a method of high-precision 3D reconstruction of existing railway track lines based on UAV multi-view images. The method includes the following steps.

[0032] In S1, initial data is acquired. The initial data can include original images from UAV multi-view photography, image external azimuth elements, camera internal parameters (the results of aerial triangulation orientation), and initial coordinates of a rail top centerline.

[0033] In S2, a UAV image rail top centerline is acquired. The initial coordinates of the rail top centerline are back-projected to the original image using the image external azimuth elements and the internal parameters of camera, and fine location adjustment is performed on an image straight segment in a way of man-machine interaction to obtain a precise image rail top centerline observation value. As shown in FIG. 2, the step can be implemented as follows.

[0034] In S2.1, the rail top centerline is automatically segmented. An initial rail top centerline as input is segmented according to a preset length threshold (usually set as 10˜15 m) to obtain a plurality of straight segments of the rail top centerline.
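The segmentation of S2.1 can be sketched as a simple accumulation of chord lengths along the input polyline. The following is only an illustrative Python sketch, not the disclosed implementation; the function name `segment_centerline` and the representation of the centerline as an ordered list of (X, Y, Z) vertices are assumptions:

```python
import math

def segment_centerline(points, seg_len=12.0):
    """Split an initial rail-top centerline (ordered (X, Y, Z) vertices) into
    straight segments whose accumulated chord length reaches seg_len metres
    (the preset length threshold, usually 10-15 m per the description)."""
    segments, current, acc = [], [points[0]], 0.0
    for p, q in zip(points, points[1:]):
        acc += math.dist(p, q)
        current.append(q)
        if acc >= seg_len:           # close the current segment, start a new one
            segments.append(current)
            current, acc = [q], 0.0  # adjacent segments share the split vertex
    if len(current) > 1:             # keep any remaining tail as a final segment
        segments.append(current)
    return segments
```

Adjacent segments share the split vertex, so no gap is introduced along the centerline.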

[0035] In S2.2, back-projection prediction is performed on the rail top centerline. Based on a collinear condition equation, the rail top straight segments as segmented in step 2.1 are back-projected to the UAV image using the precise image external azimuth elements and internal parameters of camera to obtain rough locations of the rail top straight segments on the image.
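The back-projection of S2.2 follows the standard collinearity equations of photogrammetry. A minimal sketch, under the assumption that the rotation matrix R is stored with rows [a.sub.i, b.sub.i, c.sub.i] (i=1, 2, 3), f is the focal length and (x.sub.0, y.sub.0) the principal point:

```python
import numpy as np

def backproject(X, Xs, R, f, x0, y0):
    """Project an object-space point X onto the image via the collinearity
    equations, given the camera centre Xs, rotation matrix R, focal length f
    and principal point (x0, y0)."""
    d = R @ (np.asarray(X, float) - np.asarray(Xs, float))  # camera-frame vector
    x = x0 - f * d[0] / d[2]
    y = y0 - f * d[1] / d[2]
    return x, y
```

Back-projecting both endpoints of a segmented rail-top straight segment gives its rough location on each image.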

[0036] In S2.3, precision measurement is performed on the image rail top straight segments by means of man-machine interaction. Locations of endpoints of image straight segments are finely adjusted by means of man-machine interaction according to the rough locations of the rail top straight segments obtained in S2.2 to ensure that each image rail straight segment is accurately located on the rail top centerline.

[0037] In S3, the rail top centerline is calculated based on a nonlinear least squares method. Object-space coordinate parameters of the rail top straight segments are optimized using the nonlinear least squares method by taking the image rail top straight segments obtained in S2 as observation values, and adjacent straight segments are connected in sequence to form a complete rail top centerline. As shown in FIG. 3, the step can be implemented as follows.

[0038] In S3.1, initial adjustment values of the straight segments are calculated. For the rail top straight segments, an included angle between each two image projection planes (a plane formed by the photography center and the image rail line) is calculated as an intersection angle, and an object-space straight segment formed by an intersection of two image projection planes with the largest intersection angle is used as an initial value of least squares adjustment of the straight segment. Object-space straight segments are described in a point-direction manner in the present application. If the coordinates of a point on a straight segment L are [X, Y, Z].sup.T, and the unit direction vector of the straight segment is [u, v, √(1−u.sup.2−v.sup.2)].sup.T, then the object-space parameter of the straight segment is L=[X, Y, Z, u, v].sup.T.
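The construction above can be illustrated as follows: each image line, together with its photography centre, spans a projection plane; the pair of planes with the largest intersection angle is selected, and their intersection yields the initial object-space line. This is a hedged sketch (function names and the plane representation as unit normal plus point are assumptions):

```python
import numpy as np

def projection_plane_normal(ray_p, ray_q):
    """Unit normal of the projection plane spanned by the two object-space rays
    from the photography centre through the endpoints of an image line; the
    plane itself passes through the photography centre."""
    n = np.cross(ray_p, ray_q)
    return n / np.linalg.norm(n)

def intersection_angle(n1, n2):
    """Included angle (degrees) between two projection planes, used to pick
    the best-conditioned image pair."""
    c = abs(n1 @ n2)
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

def plane_intersection(n1, p1, n2, p2):
    """Intersect two non-parallel planes (unit normal n, point p) into a
    point-direction line: returns (point, unit direction)."""
    u = np.cross(n1, n2)
    u = u / np.linalg.norm(u)
    # One point on the line: n1.x = n1.p1, n2.x = n2.p2, u.x = 0
    A = np.vstack([n1, n2, u])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    return np.linalg.solve(A, b), u
```

The returned (point, direction) pair maps directly onto the L=[X, Y, Z, u, v].sup.T parameterisation described above.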

[0039] In S3.2, least squares optimization is performed on the rail top centerline. As shown in FIG. 4, in an embodiment, the step can be implemented as follows.

[0040] In S3.2.1, taking the rail top straight segment obtained in S3.1 as an initial value, a Euclidean distance between a back-projected straight segment thereof on the image and a corresponding image line observation value is taken as a cost to form a cost equation. The calculation method of the cost value is as follows: if the parameter of the straight segment is L=[X, Y, Z, u, v, w].sup.T (where w=√(1−u.sup.2−v.sup.2)), the coordinates of the projection center are [X.sub.S, Y.sub.S, Z.sub.S].sup.T, a.sub.i, b.sub.i and c.sub.i (i=1, 2, 3) are element values in a photogrammetry rotation matrix, f is a focal length, x.sub.0 and y.sub.0 are coordinates of the principal point of the image, and the two endpoints of the rail top image line l are respectively p=[x.sub.p, y.sub.p].sup.T and q=[x.sub.q, y.sub.q].sup.T, then the calculation equation of the cost value of point p is as follows:

[00003]
\[
d_p=\frac{\left|\,x_pA_1-y_pA_2+\dfrac{f^2\bigl(B_3\bigl(a_2(X-X_S)+b_2(Y-Y_S)+c_2(Z-Z_S)\bigr)-B_1\bigl(a_1(X-X_S)+b_1(Y-Y_S)+c_1(Z-Z_S)\bigr)\bigr)}{B_2B_4}\,\right|}{\sqrt{A_1^2+A_2^2}}\tag{1}
\]

[0041] In equation (1), variable symbols are defined as follows:

[00004]
\[
\begin{aligned}
A_1&=f\left(\frac{a_2(X-X_S)+b_2(Y-Y_S)+c_2(Z-Z_S)}{B_4}-\frac{B_1}{B_2}\right)\\
A_2&=f\left(\frac{a_1(X-X_S)+b_1(Y-Y_S)+c_1(Z-Z_S)}{B_4}-\frac{B_3}{B_2}\right)\\
B_1&=a_2C_3+b_2C_2+c_2C_1\\
B_2&=a_3C_3+b_3C_2+c_3C_1\\
B_3&=a_1C_3+b_1C_2+c_1C_1\\
B_4&=a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)\\
C_1&=Z-Z_S+w\\
C_2&=Y-Y_S+v\\
C_3&=X-X_S+u
\end{aligned}\tag{2}
\]

[0042] Similarly, a cost value d.sub.q of point q can be calculated, and then the cost value of the rail top image line is (d.sub.p+d.sub.q)/2. As shown in FIG. 7, L.sub.O is an image line observation value, L.sub.p is a back-projection line of a corresponding object space line on the image, and then the cost value is (d.sub.1+d.sub.2)/2. For the object space rail top straight segment L.sub.i, if it has image line observation values on a plurality of images, several cost equations described in equations (1) and (2) can be formed, and then the overall cost function of the least squares optimization of the straight segment L.sub.i is:

[00005]
\[
C=\sum_{k}\mathrm{dist}\bigl(\mathrm{proj}(L_i,\,T_k),\,l_{ik}\bigr)\tag{3}
\]

[0043] In equation (3), C represents an overall back-projection cost of the least squares optimization of the straight segment L.sub.i, dist(·) is a Euclidean distance function from an observed value of the image line to the back-projection straight segment of the rail top line, proj(·) represents a back-projection function based on perspective imaging, T.sub.k is internal and external azimuth elements of the k-th image on which L.sub.i can be observed, and l.sub.ik represents an image straight segment observation value corresponding to L.sub.i on the image.
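The per-image cost (d.sub.p+d.sub.q)/2 and the summation of equation (3) can be sketched as below. For brevity the closed forms of equations (1) and (2) are replaced here by an equivalent generic point-to-line distance in image space, and the `project` callback standing in for proj(·) is an assumption:

```python
import numpy as np

def line_to_segment_cost(proj_p, proj_q, obs_a, obs_b):
    """Cost contributed by one image: mean perpendicular distance (d_p+d_q)/2
    from the observed endpoints obs_a, obs_b to the back-projected line
    through proj_p and proj_q."""
    d = np.subtract(proj_q, proj_p)
    n = np.array([-d[1], d[0]])
    n = n / np.linalg.norm(n)                     # unit normal of projected line
    dp = abs(n @ np.subtract(obs_a, proj_p))
    dq = abs(n @ np.subtract(obs_b, proj_p))
    return (dp + dq) / 2.0

def overall_cost(L, observations, project):
    """Equation (3): sum of per-image costs over all images where L is observed.
    `observations` is a list of (T_k, (obs_a, obs_b)); `project(L, T_k)` must
    return the two endpoints of the back-projected line on image k."""
    return sum(line_to_segment_cost(*project(L, T), a, b)
               for T, (a, b) in observations)
```

The sum mirrors equation (3) term by term: one dist(proj(L.sub.i, T.sub.k), l.sub.ik) per image on which the segment is observable.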

[0044] In S3.2.2, Taylor series expansion is performed on terms of equation (3), and higher-order terms are omitted. An error equation after linearization is as follows:


\[
V_L=A_L\,l-b_L,\qquad P_L\tag{4}
\]

[0045] In equation (4), V.sub.L is a back-projection residual distance of the rail top straight segment, l=[ΔX.sub.S, ΔY.sub.S, ΔZ.sub.S, Δu, Δv].sup.T is a correction vector of the parameters of the rail top straight segment, A.sub.L is a first-order partial derivative matrix of the objective function with respect to the straight segment parameter vector, b.sub.L is a constant vector, and P.sub.L is a unit weight matrix. According to the least squares adjustment criterion, the parameters of the rail top straight segment are accurately solved.
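The linearize-and-solve loop implied by equation (4) is essentially a Gauss-Newton iteration. A generic sketch with a numeric central-difference Jacobian standing in for A.sub.L (the closed-form partial derivatives are not reproduced here; the function name and iteration limits are assumptions):

```python
import numpy as np

def gauss_newton_line(L0, residuals, iters=10, eps=1e-4):
    """Refine the five line parameters [X, Y, Z, u, v] by repeated Taylor
    linearisation (higher-order terms dropped) and a least squares solve
    under a unit weight matrix."""
    L = np.asarray(L0, float)
    for _ in range(iters):
        r = residuals(L)                       # stacked per-image residuals
        J = np.empty((r.size, L.size))
        for j in range(L.size):                # numeric first-order partials
            step = np.zeros_like(L)
            step[j] = eps
            J[:, j] = (residuals(L + step) - residuals(L - step)) / (2 * eps)
        delta = np.linalg.lstsq(J, -r, rcond=None)[0]
        L = L + delta
        if np.linalg.norm(delta) < 1e-10:      # converged
            break
    return L
```

With the cost of equation (3) supplying the residuals, each iteration is one pass of the error equation (4) and its least squares solution.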

[0046] In S3.3, an endpoint of the rail top straight segment is calculated. Specifically, the point on the rail top straight segment obtained in S3.2 that is closest to the projected ray of an image line endpoint is calculated as the homologue point of that endpoint on the rail top straight segment. Coordinates of the homologue points of all image line endpoints on the rail top straight segment are calculated, and their average value is taken as an endpoint of the rail top straight segment.
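The homologue point is the point on the object-space line nearest to the ray from the camera centre through the image endpoint. A small sketch using the standard closest-point formula between two lines (function name assumed):

```python
import numpy as np

def closest_point_on_line_to_ray(P0, u, C, v):
    """Point on the line X(t) = P0 + t*u nearest to the ray R(s) = C + s*v,
    with u, v unit direction vectors. Derived from the usual two-line
    closest-point system: t = (b*e - c*d) / (a*c - b^2)."""
    w0 = np.subtract(P0, C)
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # line and ray (nearly) parallel
        return np.asarray(P0, float)
    t = (b * e - c * d) / denom
    return P0 + t * np.asarray(u, float)
```

Averaging the homologue points of corresponding image-line endpoints over all images yields the segment endpoint described above.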

[0047] In S3.4, the rail top straight segments are fused. For the rail top straight segments obtained in S3.3, a corresponding connection sequence is determined according to starting and ending coordinates of each segment. An average value of coordinates of endpoints of adjacent rail top straight segments that are close to each other is calculated as coordinates of a rail node, so as to realize connection of adjacent straight segments and obtain the complete rail top centerline.

[0048] In S4, rail straight and curved segments are distinguished according to the obtained 3D coordinates of the rail top centerline, and 3D centerline coordinates of each segment are calculated in turn to obtain high-precision 3D coordinates of the rail top centerline. As shown in FIG. 5, in an embodiment, the step can be realized as follows.

[0049] In S4.1, rail straight and curved segments are distinguished as follows:

[0050] Referring to FIG. 6, in S4.1.1, an azimuth angle of the rail top centerline is calculated. For the rail top centerline obtained in S3, an azimuth angle of each segment is calculated with the rail node as a distinguishing point, and the minimum azimuth angle β.sub.min and the maximum azimuth angle β.sub.max are counted.

[0051] In S4.1.2, the rail straight/curved segments are determined. Taking β.sub.min obtained in S4.1.1 as the starting point and a preset threshold δ as a search width, a rectangular slice space is formed, and a number N of rail nodes falling into the rectangular slice space is counted. If N>N.sub.min, it is determined that the rail nodes in the rectangular slice space are all straight segment points; otherwise, it is determined that the rail nodes in the rectangular slice space are curved segment points. As shown in FIG. 8, the horizontal axis represents the rail straight segment, and the vertical axis represents the azimuth angle corresponding to the straight segment. The points in areas S.sub.1 and S.sub.3 are straight segment points, and the points in area S.sub.2 are curved segment points. The rectangular slice space is then moved up by a distance of δ/2, and straight/curved segment points continue to be determined according to the above method until the lower edge of the rectangular slice space reaches the maximum azimuth angle, ending this step.
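The sliding-slice determination can be sketched as follows. The threshold values below are placeholders (the description does not fix δ or N.sub.min numerically), and the function name is an assumption:

```python
def classify_nodes(azimuths, delta=0.5, n_min=5):
    """Sliding rectangular slice over azimuth values (degrees): any slice of
    height delta containing more than n_min nodes marks those nodes as
    straight-segment points; remaining nodes are curved-segment points.
    The window advances by delta/2 from the minimum to the maximum azimuth."""
    straight = set()
    lo, hi = min(azimuths), max(azimuths)
    start = lo
    while start <= hi:
        inside = [i for i, a in enumerate(azimuths) if start <= a < start + delta]
        if len(inside) > n_min:
            straight.update(inside)
        start += delta / 2.0
    return ["straight" if i in straight else "curved" for i in range(len(azimuths))]
```

Straight runs cluster at a near-constant azimuth (areas S.sub.1 and S.sub.3 in FIG. 8), so their slices are densely populated, while curve transition nodes (area S.sub.2) fall into sparse slices.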

[0052] In S4.2, 3D centerline coordinates of rail nodes are calculated. In S4.1, the rail nodes are divided into several straight segments and curved segments, and for the nodes of each rail segment, the 3D centerline coordinates of each segment are calculated in turn. Assuming that the two rails of the track are G.sub.1 and G.sub.2 respectively, the 3D centerline is calculated as follows:

[0053] (a) If it is a straight segment, for each node P.sub.G1 on the straight segment, a point P.sub.G2 closest to P.sub.G1 on G.sub.2 is calculated, and a midpoint of P.sub.G1 and P.sub.G2 is calculated as a node of the 3D centerline; and all nodes on the rail G.sub.1 are traversed and the above operations are performed to obtain a 3D centerline of the straight segment.
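Step (a) can be sketched as below. For simplicity the nearest point on G.sub.2 is approximated by the nearest node rather than the nearest point on the polyline, which is an assumption of this sketch:

```python
import numpy as np

def straight_centerline(G1_nodes, G2_nodes):
    """For each node on rail G1, find the closest node on rail G2 and take
    the midpoint as a node of the 3D track centreline (straight segments)."""
    G2 = np.asarray(G2_nodes, float)
    centre = []
    for p in np.asarray(G1_nodes, float):
        q = G2[np.argmin(np.linalg.norm(G2 - p, axis=1))]  # nearest G2 node
        centre.append((p + q) / 2.0)
    return centre
```

Traversing all nodes of G.sub.1 in this way yields the 3D centerline of the straight segment.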

[0054] (b) If it is a curved segment, for each inner rail node P.sub.N(X.sub.p, Y.sub.p, Z.sub.p), azimuth angles α.sub.1 and α.sub.2 in a normal direction of two straight segments before and after the node on a two-dimensional plane are first calculated, and a plane point P.sub.T (X.sub.p, Y.sub.p) is offset along a direction of (α.sub.1+α.sub.2)/2 to the inside of the rail by a distance d=(1.435+ϑ)/2, where ϑ is a width of the rail top, to obtain coordinates (X.sub.S, Y.sub.S) as follows:

[00006]
\[
\begin{cases}
X_S=X_P+d\cdot\sin\left(\dfrac{\alpha_1+\alpha_2}{2}\right)\\[6pt]
Y_S=Y_P+d\cdot\cos\left(\dfrac{\alpha_1+\alpha_2}{2}\right)
\end{cases}\tag{5}
\]

[0055] In step 4.2 (b), an elevation Z.sub.p of a corresponding inner rail point is used as an elevation value of the 3D centerline node of the curved segment to obtain the 3D centerline node coordinates (X.sub.S, Y.sub.S, Z.sub.p) corresponding to the point P.sub.N.
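Equation (5) together with the elevation rule of step 4.2 (b) can be implemented directly. The rail top width ϑ is left as a parameter since the description does not fix its value; the default of 0.07 m below is a placeholder, and the function name is an assumption:

```python
import math

def curve_centerline_node(Xp, Yp, Zp, alpha1, alpha2, gauge=1.435, railhead=0.07):
    """Equation (5): offset an inner-rail node towards the track centre along
    the mean normal azimuth of the two adjacent straight segments; the node's
    own elevation Zp is kept as the centreline elevation. alpha1, alpha2 are
    normal-direction azimuths in radians."""
    d = (gauge + railhead) / 2.0          # d = (1.435 + railhead) / 2
    a = (alpha1 + alpha2) / 2.0
    Xs = Xp + d * math.sin(a)
    Ys = Yp + d * math.cos(a)
    return Xs, Ys, Zp
```

Applying this to every inner-rail node of a curved segment produces the 3D centerline nodes (X.sub.S, Y.sub.S, Z.sub.p).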

[0056] The 3D centerline coordinates of the entire rail can be obtained by processing the straight and curved segments of the rail according to the above methods (a) and (b). Combining the 3D coordinates of the rail top centerline obtained in step 3, the complete 3D coordinates of the rail can be obtained.

[0057] Obviously, the above-mentioned embodiments are merely examples given for clarity of description, and are not intended to be limiting. Those skilled in the art can make other changes or alterations in different forms on the basis of the above description. It is neither necessary nor possible to exhaust all embodiments here. Obvious changes or alterations derived therefrom still fall within the scope of protection of the present application.