Auxiliary berthing method and system for vessel
10378905 · 2019-08-13
Inventors
- Xi Zhu (Jiangsu, CN)
- Yuanyuan Li (Jiangsu, CN)
- Feng Yan (Jiangsu, CN)
- Xiang Li (Jiangsu, CN)
- Xun Cao (Jiangsu, CN)
- Weisong Pan (Jiangsu, CN)
- Jianwen Ding (Jiangsu, CN)
- Jibin Wang (Jiangsu, CN)
- Jun Wang (Jiangsu, CN)
- Chen Chen (Jiangsu, CN)
- Dapeng Li (Jiangsu, CN)
- Wei Li (Jiangsu, CN)
- Wenzhu Wang (Jiangsu, CN)
CPC classification
B63B2213/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The present invention provides an auxiliary berthing method and system for a vessel. In the berthing method, by a solar blind ultraviolet imaging method, the position and attitude of a vessel relative to the shoreline of a port berth during berthing are calculated by at least two solar blind ultraviolet imaging modules according to light signals received from a solar blind ultraviolet light source array arranged in advance on the shore. Further, when three or more solar blind ultraviolet imaging modules are used, a normalized correlation algorithm and a data fusion algorithm are used in the method and system of the present invention to improve the accuracy of the position and attitude data of the vessel.
Claims
1. A method for berthing a vessel, comprising: providing at least two solar blind ultraviolet imaging modules on the vessel; receiving, by the at least two solar blind ultraviolet imaging modules, light signals transmitted by a solar blind ultraviolet light source comprising an array of solar blind ultraviolet lamps deployed on shore and defining a target lattice coordinate system; processing the received light signals by a signal processor to obtain a translation vector between each of the at least two solar blind ultraviolet imaging modules and a berth; calculating a vector representation of a line connecting the at least two solar blind ultraviolet imaging modules in the target lattice coordinate system; obtaining, according to an arrangement of the at least two solar blind ultraviolet imaging modules on the vessel, a vector representation of the vessel in the target lattice coordinate system; calculating, according to a preset or measured vector representation of the berth in the target lattice coordinate system, an attitude angle of an axis of the vessel relative to the berth; and determining a position and an attitude of the vessel relative to the berth, wherein a number of the at least two solar blind ultraviolet imaging modules is three or more, and position and/or angle information of the vessel relative to the berth measured by each solar blind ultraviolet imaging module is integrated to obtain a numerical value representing the position relationship between the vessel and the berth, and wherein the signal processor integrates position data of a reference point on the vessel relative to the berth or azimuth angle data of the vessel by a normalized correlation algorithm, then performs global error analysis to obtain a threshold for an average confidence value of a detection system consisting of the at least two solar blind ultraviolet imaging modules and the confidence of each solar blind ultraviolet imaging module, filters positioning data with a
lower confidence by using the threshold to obtain a final confidence weight for each solar blind ultraviolet imaging module, and performs weighted averaging on each solar blind ultraviolet imaging module by using the confidence weight.
2. The method for berthing a vessel according to claim 1, wherein a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent the i.sup.th group of position data among N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N and x, y and z are three-axis coordinates of the N solar blind ultraviolet imaging modules; and, the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; the normalized correlation algorithm used by the signal processor comprises the following specific steps: using a Normalized Correlation Coefficient (NCC) to represent the confidence of the position data returned by the N detection subsystems:
3. The method for berthing a vessel according to claim 1, wherein three components α, β and γ of a direction vector of any attitude angle are integrated by a normalized correlation algorithm, and a vector q.sub.i(α.sub.i,β.sub.i,γ.sub.i) is used to represent N groups of vector angle data returned by the N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; and, the confidence of the position data returned by each solar blind ultraviolet imaging module system is represented as follows by a normalized correlation coefficient:
4. The method for berthing a vessel according to claim 1, wherein the signal processor integrates the position data or the attitude angle data by a data fusion method, respectively; and, the data fusion method comprises the following steps: (I) when the data to be integrated is the position data, a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; a) the confidence of the data returned by each subsystem is judged by using a root-mean-square-error rmse actually calculated by the measured data of each subsystem, where the formula for calculating the root-mean-square-error of the measured data of each subsystem is as follows:
5. The method for berthing a vessel according to claim 1, characterized in that the signal processor integrates the position data or the attitude angle data by a data fusion method, respectively; and, the data fusion method comprises the following specific steps: (I) when the data to be integrated is the position data, a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; a) a standard deviation of each coordinate sequence in the position data is calculated: the standard deviation of each coordinate sequence in the N groups of position data returned by the N solar blind ultraviolet imaging module detection subsystems is calculated as the basis for judging outliers in each coordinate sequence in the N groups of data, where the standard deviation of each coordinate sequence is as follows:
σ.sub.index={square root over (Σ.sub.i=1.sup.N(X.sub.index,i-{overscore (X)}.sub.index).sup.2/N)}
outliers=|X.sub.index,i-{overscore (X)}.sub.index|>σ.sub.index
6. A system for berthing a vessel, comprising: a solar blind ultraviolet light source comprising an array of solar blind ultraviolet lamps deployed on shore; at least three solar blind ultraviolet imaging modules arranged on the vessel, each solar blind ultraviolet imaging module comprising a solar blind ultraviolet receiver configured to receive light signals from the solar blind ultraviolet light source and convert the light signals into corresponding electrical signals; and a signal processor that is electrically connected to the at least three solar blind ultraviolet imaging modules and, in use, receives the electrical signals transmitted by the solar blind ultraviolet imaging modules, processes the electrical signals to obtain a translation vector of each solar blind ultraviolet imaging module relative to the solar blind ultraviolet light source, further obtains a distance from the vessel to a target lattice from the translation vector, and calculates the position of each solar blind ultraviolet imaging module on the vessel relative to a berth, wherein the signal processor integrates the obtained coordinate and/or attitude angle data to obtain corresponding coordinate data and/or attitude angle data representing the position of the vessel relative to the berth, wherein the signal processor is designed to use a normalized correlation algorithm; a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent the i.sup.th group of position data among N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . .
N, and x, y and z are three-axis coordinates of the N solar blind ultraviolet imaging modules; and, the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; the normalized correlation algorithm used by the signal processor comprises the following specific steps: using a Normalized Correlation Coefficient (NCC) to represent the confidence of the position data returned by the N detection subsystems:
7. The system for berthing a vessel according to claim 6, characterized in that the signal processor is designed to use a normalized correlation algorithm; three components α, β and γ of a direction vector of any attitude angle are integrated by the normalized correlation algorithm; a vector q.sub.i(α.sub.i,β.sub.i,γ.sub.i) is used to represent N groups of vector angle data returned by the N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; and, the confidence of the position data returned by each solar blind ultraviolet imaging module system is represented as follows by a normalized correlation coefficient:
8. The system for berthing a vessel according to claim 6, wherein the signal processor is designed to integrate the coordinate and/or attitude angle data of the vessel by a data fusion algorithm, and the data fusion algorithm comprises the following specific steps: (I) when the data to be integrated is the position data, a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; a) the confidence of the data returned by each subsystem is judged by using a root-mean-square-error rmse actually calculated by the measured data of each subsystem, where the formula for calculating the root-mean-square-error of the measured data of each subsystem is as follows:
9. The system for berthing a vessel according to claim 6, wherein the signal processor is designed to integrate the coordinate and/or attitude angle data of the vessel by a data fusion algorithm, and the data fusion algorithm comprises the following specific steps: (I) when the data to be integrated is the position data, a vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent N groups of position data, which are subjected to angular and spatial transformation, returned by N solar blind ultraviolet imaging module detection subsystems, where i=1, 2, 3 . . . N; the position data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation; a) a standard deviation of each coordinate sequence in the position data is calculated: the standard deviation of each coordinate sequence in the N groups of position data returned by the N solar blind ultraviolet imaging module detection subsystems is calculated as the basis for judging outliers in each coordinate sequence in the N groups of data, where the standard deviation of each coordinate sequence is as follows:
σ.sub.index={square root over (Σ.sub.i=1.sup.N(X.sub.index,i-{overscore (X)}.sub.index).sup.2/N)}
outliers=|X.sub.index,i-{overscore (X)}.sub.index|>σ.sub.index
10. The system for berthing a vessel according to claim 6, wherein a power control system of the vessel receives a berthing distance signal transmitted by the signal processor, and automatically adjusts the attitude of the vessel for berthing based on the berthing distance signal.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Some details of the present invention are illustrated in the accompanying drawings.
DETAILED DESCRIPTION OF THE INVENTION
(12) According to an example of the present invention, a system for improving a vessel's close-distance navigation capability may be provided. The system displays a schematic diagram of the vessel and the shoreline together with position information on a display device, so that a pilot can berth the vessel even in low visibility.
(13) For this purpose, the present invention will be further described below with reference to the accompanying drawings by way of embodiments. The following embodiments are merely illustrative, and the present invention is not limited to the solutions in the embodiments. Additionally, all technical solutions obtained by those skilled in the art through simple transformation within the scope of the prior art shall fall within the protection scope of the present invention.
Embodiment 1
(14) In this embodiment, as shown in the accompanying drawings, the auxiliary berthing system comprises a group of solar blind ultraviolet lamps 101 arranged on the shore, two solar blind ultraviolet imaging modules 102 and 103 mounted on the vessel, a signal processor 104 and a display device 105.
(15) The two solar blind ultraviolet imaging modules 102 and 103 are mounted on a same side of the vessel's rail at a certain interval. One solar blind ultraviolet imaging module 102 is mounted on a deck at a position closer to the head portion of the vessel, and has a distance L.sub.head to the head portion of the vessel and a distance L.sub.tail to the tail portion of the vessel; while the other solar blind ultraviolet imaging module 103 is mounted on the deck at a position farther away from the head portion of the vessel. The specific mounting positions are roughly shown in the accompanying drawings.
(16) The solar blind ultraviolet imaging modules 102, the signal processor 104 and the display device 105 may be integrated together. The signal processor includes an information acquisition module, a calculation module and a storage module. The information acquisition module acquires image information generated by the solar blind ultraviolet imaging modules 102 and 103 in real time, and transmits all the data to the calculation module. This embodiment includes the following main steps.
(17) 1. Ultraviolet cameras of the two solar blind ultraviolet modules are calibrated to obtain internal parameters. There are many methods for calibrating the camera and algorithms for obtaining the internal parameters. Here, a conventional calibration technology or Zhengyou Zhang's calibration algorithm is preferably selected. The calibration flow is shown in the accompanying drawings.
(18) 2. Berth information is measured: an included angle between each berth shoreline in a port for berthing and the north direction is measured in advance.
(19) 3. The ultraviolet light source lamp array is arranged and the related position information of the lamp array is measured: within half an hour before berthing the vessel, a target lamp array is arranged near the berth by using the group of solar blind ultraviolet lamps 101. The target lamp array is in the shape of a square grid. The size of the target lamp array and the number of lamps are not limited; in this embodiment, the arrangement shown in the accompanying drawings is used.
(20) 4. The course of the vessel and the attitude and position information of the berth shoreline are calculated. Specifically:
(21) Firstly, in accordance with the principle of the present invention, since the two solar blind ultraviolet imaging modules 102 and 103 may obtain their respective translation vectors (x.sub.1,y.sub.1,z.sub.1) and (x.sub.2,y.sub.2,z.sub.2) relative to the target lattice, a vector representation of the connection line between the two modules in the target lattice coordinate system may be obtained. Since the two solar blind ultraviolet imaging modules 102 and 103 are mounted on one side of the vessel's rail, this connection line is a vector representation of the rail in the target lattice coordinate system during the berthing of the vessel. Further, since a vector representation of the shoreline of the wharf in the target lattice coordinate system may be measured in advance and its vector direction is the same as the X-axis of the group of solar blind ultraviolet lamps 101, the included angle between the course of the vessel and the berth shoreline may be obtained:
(22) if the vessel is berthed to the right,
(23) a=arctan((y.sub.2-y.sub.1)/(x.sub.2-x.sub.1))
(i.e., the degree of parallelism between the vessel and the shoreline of the port); and
(24) if the vessel is berthed to the left,
(25) a=-arctan((y.sub.2-y.sub.1)/(x.sub.2-x.sub.1))
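As a concrete illustration, the included angle can be computed from the two modules' translation vectors. The sketch below is a hypothetical helper (not part of the patent text), assuming the shoreline lies along the X-axis of the target lattice coordinate system as stated above:

```python
import math

def course_angle_to_shoreline(p1, p2):
    """Included angle between the line through the two imaging-module
    positions (i.e., the vessel's rail) and the shoreline, which is
    assumed to lie along the X-axis of the target lattice frame.
    p1, p2: (x, y, z) translation vectors of modules 102 and 103."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    # atan2 keeps the sign, so the result also tells on which side
    # the bow is swung relative to the berth shoreline.
    return math.degrees(math.atan2(dy, dx))

# Module 103 sits 0.5 m farther from the berth than module 102
# over a 20 m baseline, i.e., the vessel is slightly off parallel.
angle = course_angle_to_shoreline((0.0, 10.0, 0.0), (20.0, 10.5, 0.0))
```

A result near zero indicates that the rail is parallel to the berth shoreline.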
(26) Since the angle of roll of the vessel is generally small, the angle of roll of the vessel is ignored in this embodiment.
(27) In another embodiment, a level meter for measuring the angle of roll of the vessel may also be mounted on the vessel. The level meter may be placed at the bottom of a holder, in the same orientation as the horizontal plane of the vessel.
(28) Secondly, position information of the vessel relative to the shoreline is determined. At this moment, the solar blind ultraviolet imaging module 102 closer to the head portion of the vessel is used as reference.
(29) When the vessel is close to the shoreline, the solar blind ultraviolet imaging modules can clearly identify all signals. The signal processor 104 then performs image processing and coordinate transformation on the images taken by the solar blind ultraviolet imaging modules 102 and 103, to obtain position information X, Y and Z of the solar blind ultraviolet imaging modules 102 and 103 in the lamp array coordinate system:
(30) (x, y, z).sup.T=R(X, Y, Z).sup.T+T
(31) where R is a rotation matrix, and T is a translation vector.
(32) The algorithm includes the following specific steps:
(33) since the internal parameters of the cameras, the lattice coordinates of the target lattice coordinate system (referring to the accompanying drawings) and the corresponding image plane coordinates are known, the imaging relationship is as follows:
(34) s[u, v, 1].sup.T=[[f.sub.x, 0, c.sub.x], [0, f.sub.y, c.sub.y], [0, 0, 1]][R T][X, Y, Z, 1].sup.T(1)
(35) where (f.sub.x,f.sub.y,c.sub.x,c.sub.y) represent internal matrix parameters, R is a rotation matrix, T is a translation vector, (u,v) is image plane coordinates (in pixels), and (X,Y,Z) represents lattice coordinates in the target lattice coordinate system; and, the formula may be simplified as follows:
(36) (x, y, z).sup.T=R(X, Y, Z).sup.T+T, u=f.sub.x·x/z+c.sub.x, v=f.sub.y·y/z+c.sub.y(2)
(37) where (x, y, z) represents the coordinates of a target lattice point in the camera coordinate system (referring to the accompanying drawings).
(38) During the calculation of the coordinates of the camera in the target lattice coordinate system, since both the internal parameters (f.sub.x,f.sub.y,c.sub.x,c.sub.y) and the lattice coordinates (X,Y,Z) in the target lattice coordinate system are fixed values and the image plane coordinates (u,v) are acquired from an image in real time, the rotation matrix R.sub.0 and the translation vector T.sub.0 at the same moment (u.sub.0,v.sub.0) may be correspondingly obtained in real time. Then, if it is required to obtain the lattice coordinates of the camera in the target lattice coordinate system, the origin (0,0,0) of the camera coordinate system is substituted into the left side of the formula 2 to solve (X.sub.0,Y.sub.0,Z.sub.0) on the right side, thus:
(39) (X.sub.0, Y.sub.0, Z.sub.0).sup.T=-R.sub.0.sup.-1T.sub.0(3)
(40) The inverse matrix R.sub.0.sup.-1 of the rotation matrix represents the rotation of the camera coordinate system relative to the target lattice coordinate system, and may be simplified into a rotation vector through transformation. This vector represents the Euler angles of rotation of the camera relative to the target lattice coordinate system.
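The back-substitution of paragraphs (38)-(40) can be sketched in a few lines. Since a rotation matrix is orthogonal, its inverse is simply its transpose, so no general matrix inversion is required; the helper below is a pure-Python illustration and its names are hypothetical:

```python
def camera_position_in_lattice(R, T):
    """Camera origin expressed in the target lattice frame.
    Substituting the camera-frame origin (0,0,0) into
    (x,y,z)^T = R (X,Y,Z)^T + T yields (X0,Y0,Z0)^T = -R^{-1} T,
    and for a rotation matrix R^{-1} = R^T.
    R: 3x3 rotation given as a list of rows; T: length-3 translation."""
    # Compute -(R^T T) component by component.
    return [-sum(R[i][j] * T[i] for i in range(3)) for j in range(3)]

# Sanity check: a 90-degree rotation about Z with translation (1, 2, 3).
R = [[0, -1, 0],
     [1,  0, 0],
     [0,  0, 1]]
T = [1.0, 2.0, 3.0]
pos = camera_position_in_lattice(R, T)
```

Substituting `pos` back through R and adding T returns the origin, which confirms the inversion.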
(41) Among the fixed values mentioned during the calculation of the camera coordinates, the target lattice coordinates are results measured after manual arrangement, and the internal parameters represent intrinsic parameters of the camera, where f.sub.x and f.sub.y are focal lengths expressed with the horizontal and vertical pixel pitch as the unit of measurement, and c.sub.x and c.sub.y are the pixel coordinates of the principal point, i.e., the point where the optical axis of the camera lens intersects the image plane.
(42) The distance from the connection line between the solar blind ultraviolet imaging modules 102 and 103 to the shoreline in a vertical direction, i.e., along the vessel's rail, is set as Y.sub.rail. In this case, without considering the angle of roll of the vessel, Y.sub.rail=Y-L, where L is the distance from the first row of the lamp array to the anti-collision fender. If the distance from the head portion of the vessel to the shoreline in the vertical direction is set as Y.sub.head and the distance from the tail portion of the vessel to the shoreline in the vertical direction is set as Y.sub.tail, then Y.sub.head=Y.sub.rail-L.sub.head*sin a, and Y.sub.tail=Y.sub.rail+L.sub.tail*sin a, where L.sub.head and L.sub.tail are the distances from the solar blind ultraviolet imaging module 102 to the head portion of the vessel and to the tail portion of the vessel, respectively, and a is the included angle between the course of the vessel and the berth shoreline.
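Numerically, the bow and stern clearances follow directly from Y, L, the module offsets and the course angle. The sketch below is a hypothetical illustration; the sign convention (head term subtracted, tail term added, matching the text above) is an assumption about the module 102 mounting:

```python
import math

def head_tail_clearance(Y, L, L_head, L_tail, a_deg):
    """Vertical distances from bow (head) and stern (tail) to the shoreline.
    Y: module distance to the lamp array in the vertical direction,
    L: distance from the first lamp row to the anti-collision fender,
    L_head, L_tail: distances from module 102 to bow and stern,
    a_deg: included angle between course and berth shoreline, in degrees.
    Assumed signs: Y_head = Y_rail - L_head*sin(a),
                   Y_tail = Y_rail + L_tail*sin(a)."""
    a = math.radians(a_deg)
    y_rail = Y - L
    return y_rail - L_head * math.sin(a), y_rail + L_tail * math.sin(a)

# A 100 m vessel 13 m off the fender line, 2 degrees off parallel:
y_head, y_tail = head_tail_clearance(Y=15.0, L=2.0, L_head=30.0, L_tail=70.0, a_deg=2.0)
```

With a positive angle the bow is closer to the berth than the stern, as expected during a bow-first approach.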
(43) After the data is obtained, the schematic diagram and position information of the vessel and the shoreline may be displayed on the display device 105, making it convenient for a pilot to maneuver the vessel, as shown in the accompanying drawings.
(44) 5. Scene simulation is performed to output a schematic navigation diagram and the position coordinate information to the display device 105.
(45) According to the position information X and Y of the vessel in the lamp array coordinate system, the direction information (the included angle a) of the vessel relative to the shoreline, the position information L.sub.head and L.sub.tail of the solar blind ultraviolet imaging modules relative to the vessel, and the width B of the vessel, the schematic diagram and the position information Y.sub.head and Y.sub.tail of the vessel and the shoreline may be displayed on the display device 105, as shown in the accompanying drawings.
(46) In heavy seas, the vessel swings and shakes fiercely, and the course angle, the angle of pitch and the angle of roll all need to be taken into consideration when calculating the attitude angle of the vessel. In calm seas, however, the vessel does not swing fiercely and remains approximately horizontal. In this case, the course angle of the vessel may represent the direction of the vessel relative to the shoreline, and the angle of pitch and the angle of roll need not be taken into account.
(47) In this embodiment, the solar blind ultraviolet imaging module 102 closer to the head portion of the vessel is used as reference. If both solar blind ultraviolet imaging modules are used as reference, the coordinates of the two solar blind ultraviolet imaging modules 102 and 103 relative to the berth in the lamp array coordinate system are assumed as A.sub.1(X.sub.1,Y.sub.1,Z.sub.1) and A.sub.2(X.sub.2,Y.sub.2,Z.sub.2), respectively, and the distances from the solar blind ultraviolet imaging module 103 to the head portion of the vessel and to the tail portion of the vessel are assumed as L′.sub.head and L′.sub.tail, respectively; the distances obtained from A.sub.1 and A.sub.2 are then as follows:
(48)
(49) The differences between the both are as follows:
(50)
(51) where the distance between A.sub.1 and A.sub.2 is as follows:
L′.sub.head-L.sub.head=L.sub.tail-L′.sub.tail(6)
(52) Since {square root over ((X.sub.1-X.sub.2).sup.2+(Y.sub.1-Y.sub.2).sup.2)} also represents the distance between A.sub.1 and A.sub.2, the two errors ΔY.sub.head and ΔY.sub.tail are less than the error between the measured value L′.sub.head-L.sub.head and {square root over ((X.sub.1-X.sub.2).sup.2+(Y.sub.1-Y.sub.2).sup.2)}; and, if L′.sub.head is not measured, and instead L′.sub.head and L′.sub.tail are obtained from {square root over ((X.sub.1-X.sub.2).sup.2+(Y.sub.1-Y.sub.2).sup.2)} and L.sub.head, both the errors ΔY.sub.head and ΔY.sub.tail are zero.
Embodiment 2
(53) When one solar blind ultraviolet imaging module is used, a group of three-dimensional position coordinate values of the vessel may be measured.
(54) When two or more solar blind ultraviolet imaging modules are used, each of the solar blind ultraviolet imaging modules generates a group of coordinate data of the vessel. Thus, multiple groups of coordinate data of the vessel will be obtained.
(55) The redundant original coordinate data of the vessel may be integrated by a normalized correlation algorithm.
(56) The redundant original coordinate data of the vessel may also be integrated by a data fusion algorithm.
(57) When two solar blind ultraviolet imaging modules are used, the attitude angle of the vessel may be measured.
(58) When more than two solar blind ultraviolet imaging modules are used, the solar blind ultraviolet imaging modules are combined pairwise, and each combination generates a group of attitude angle data of the vessel. Thus, multiple groups of attitude angle data of the vessel will be obtained.
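The pairwise combination can be sketched as follows (hypothetical helper; the angle formula assumes the shoreline lies along the X-axis of the lamp-array frame, as in Embodiment 1):

```python
import math
from itertools import combinations

def pairwise_course_angles(positions):
    """Every pair of module positions (lamp-array frame) yields one
    estimate of the included angle between the vessel and the
    shoreline; n modules give C(n, 2) estimates."""
    return [
        math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
        for p, q in combinations(positions, 2)
    ]

# Three modules along the rail give C(3,2) = 3 redundant angle estimates.
mods = [(0.0, 10.0, 0.0), (10.0, 10.2, 0.0), (20.0, 10.5, 0.0)]
estimates = pairwise_course_angles(mods)
```

These redundant groups are what the normalized correlation or data fusion step then integrates.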
(59) The redundant original attitude angle data of the vessel may be integrated by a normalized correlation algorithm.
(60) In a further embodiment, multiple groups of coordinates (X,Y,Z) of a camera in a target lattice coordinate system are obtained by a plurality of solar blind ultraviolet imaging modules. To obtain the optimal positioning data, the algorithm is as follows.
(61) A vector p.sub.i(x.sub.i,y.sub.i,z.sub.i) is used to represent the positioning data, which is subjected to angular and spatial transformation, returned by N systems, where i=1, 2, 3 . . . N. The positioning data, which is subjected to angular and spatial transformation, is obtained by the following method: when the relative positions of all the solar blind ultraviolet imaging modules and the attitude angle of the vessel are known, the measured position data of different measurement modules are converted into the measured position data of a same measurement module based on a spatial position relationship and through spatially geometric transformation. The specific transformation method is as follows:
(62) (1) a reference point is determined, wherein the reference point may be the position of any measurement module among the solar blind ultraviolet receiving modules, or may be another point;
(63) (2) the distance from each other measurement module to the reference point and a direction angle (a parameter in the light-source reference frame, determined by superposing the attitude angle of the vessel) are measured, so that a corresponding transformation vector is obtained; and
(64) (3) the transformation vector is added to the relative position coordinate parameters obtained by each measurement module to obtain the transformed positioning data.
(65) As shown in the accompanying drawings, the transformation vector {right arrow over (A)} of measurement module 2 relative to the reference position is obtained from its measured distance and direction angle:
(66)
(67) Then, the coordinate of p.sub.2 after its transformation to the reference position is p′.sub.2=p.sub.2+{right arrow over (A)}.
(68) The transformed coordinates of other measurement modules may be obtained by the same method.
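Steps (1)-(3) above can be sketched as follows. The planar form of the transformation vector (zero vertical component) is an assumption for illustration, as is every name in the sketch:

```python
import math

def transformation_vector(d, theta_deg):
    """Transformation vector A for one measurement module, from its
    measured distance d and direction angle theta to the reference
    point (light-source reference frame, vessel attitude already
    superposed). The modules are assumed to lie in one horizontal
    plane, so the vertical component is taken as zero."""
    th = math.radians(theta_deg)
    return (d * math.cos(th), d * math.sin(th), 0.0)

def to_reference(p, A):
    """Step (3): p' = p + A shifts a module's measured position into
    the reference module's frame."""
    return tuple(pi + ai for pi, ai in zip(p, A))

# A module 20 m astern of the reference point (direction angle 180 deg):
A = transformation_vector(20.0, 180.0)
p2_ref = to_reference((5.0, 12.0, 0.0), A)
```
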
(69) In the normalized correlation algorithm, a Normalized Correlation Coefficient (NCC) is used to represent the confidence of the positioning data returned by each system, which is expressed as follows:
(70)
(71) If a threshold is set to be 80% of the average confidence value of all systems, the threshold G may be expressed as follows:
(72) G=0.8×(NCC.sub.1+NCC.sub.2+ . . . +NCC.sub.N)/N
(73) The positioning data with a lower NCC is filtered according to the threshold G, to obtain a final system confidence weight w, which is expressed as follows:
(74) w.sub.i=NCC.sub.i/ΣNCC.sub.j, where the sum runs over the systems with NCC.sub.j≥G, if NCC.sub.i≥G; otherwise, w.sub.i=0
(75) Thus, the final fitted positioning data is obtained:
(76) p=Σw.sub.ip.sub.i
(77) The algorithm flow is shown in the accompanying drawings.
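The thresholding and weighted averaging described above can be sketched as follows. The NCC values are assumed to be precomputed by the signal processor; only the 80% threshold, the filtering and the weighted average are shown (all names hypothetical):

```python
def fuse_positions(points, ncc, ratio=0.8):
    """Fuse N position fixes: threshold G = ratio * mean(NCC);
    fixes below G get weight 0; the surviving NCCs are renormalized
    to sum to 1 and used as weights in a per-axis weighted average."""
    G = ratio * sum(ncc) / len(ncc)
    kept = [c if c >= G else 0.0 for c in ncc]
    total = sum(kept)
    w = [c / total for c in kept]
    return tuple(
        sum(w[i] * points[i][axis] for i in range(len(points)))
        for axis in range(3)
    )

# Two consistent fixes and one outlier with low confidence:
points = [(10.0, 5.0, 0.0), (10.2, 5.1, 0.0), (14.0, 9.0, 0.0)]
ncc = [0.95, 0.90, 0.40]
fused = fuse_positions(points, ncc)
```

Here G = 0.8 × 0.75 = 0.6, so the third fix is discarded and the fused position stays near the two consistent fixes.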
(78) When the data to be integrated is azimuth angle data, a vector q.sub.i(α.sub.i,β.sub.i,γ.sub.i) is used to represent the N groups of attitude angle data returned by the N measurement subsystems, where i=1, 2, 3 . . . N; and, the optimal attitude angle data is likewise calculated by the normalized correlation algorithm.
(79) The original coordinate data and attitude angle data of the vessel may also be integrated by a data fusion algorithm.
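A minimal sketch of that alternative, judging each subsystem by the root-mean-square error of its recent samples as in the claims; the inverse-rmse weighting is an assumption standing in for the patent's exact weighting scheme, and all names are hypothetical:

```python
import math

def rmse(samples, ref):
    """Root-mean-square error of one subsystem's samples against a
    reference value (here the grand mean across subsystems)."""
    return math.sqrt(sum((s - ref) ** 2 for s in samples) / len(samples))

def fuse_by_rmse(sequences):
    """Each subsystem's confidence is judged by its rmse; the fused
    value weights each subsystem mean by inverse rmse.
    sequences: one list of recent coordinate samples per subsystem."""
    means = [sum(s) / len(s) for s in sequences]
    grand = sum(means) / len(means)
    errs = [rmse(s, grand) for s in sequences]
    weights = [1.0 / max(e, 1e-9) for e in errs]  # guard against rmse == 0
    return sum(w * m for w, m in zip(weights, means)) / sum(weights)

# Two tight subsystems and one noisy one; the noisy one is down-weighted.
fused = fuse_by_rmse([[10.0, 10.1, 9.9], [10.2, 10.1, 10.3], [12.0, 8.0, 10.0]])
```
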
Embodiment 3
(80) The specific steps of performing ultraviolet camera calibration and solving internal parameters in a system with an enhanced close-distance vessel navigation capability according to the present invention will be described below.
(81) There are many methods for calibrating a camera and algorithms for solving the internal parameters. Here, a conventional calibration technology or Zhengyou Zhang's calibration algorithm is preferably selected. In Zhengyou Zhang's calibration algorithm, a checkerboard-shaped calibration template is used, and the corner points between black and white checkers on the calibration template are used as feature points of the calibration target. Calibration targets are placed at different positions, and synchronous acquisition is performed by the camera to obtain the internal and external parameters of the camera. This method does not require any expensive instruments, is robust and easy to operate, and is more accurate than self-calibration. Nevertheless, any calibration method and internal-parameter solving algorithm applicable to this embodiment shall be included in the present invention.
(83) Step 1-1-1: An ultraviolet light source array is arranged. The ultraviolet light source array is a planar and rectangular grid-shaped ultraviolet light source array.
(84) Step 1-1-2: Geometrical information of the ultraviolet light source array is measured, and coordinates c.sub.w={X.sub.1,Y.sub.1,Z.sub.1},{X.sub.2,Y.sub.2,Z.sub.2} . . . {X.sub.30,Y.sub.30,Z.sub.30} of specific ultraviolet points in a coordinate system o-xyz are measured. The geometrical information of the ultraviolet light source array refers to the coordinates of the specific ultraviolet points or angular points in the world coordinate system.
(85) Step 1-2: The ultraviolet light source array is shot by a solar blind ultraviolet imaging module 104. The selected shooting positions A should fulfill the following condition: the orientations of the optical axis OA at the different shooting positions are not parallel. n groups of images are shot, where n should be greater than 3 in this embodiment.
(86) Step 2-1: The signal processor 105 performs software processing on the shot digital images to obtain image plane coordinate groups ci.sub.1,ci.sub.2,ci.sub.3 . . . ci.sub.n of the specific ultraviolet points, n groups in total.
(87) Step 2-2: c.sub.w and ci.sub.1,ci.sub.2 . . . ci.sub.n are processed by Zhengyou Zhang's calibration algorithm to obtain the internal parameters (f.sub.x, f.sub.y, c.sub.x, c.sub.y, k.sub.x, k.sub.y, etc.) of the camera.
(88) The principle of Zhengyou Zhang's calibration algorithm is as follows:
(89) 1) Correspondence Between Angular Points of the Calibration Target and Corresponding Image Points
(90) If the plane of the calibration target is assumed as Z.sub.w=0, then:
(91) s[u, v, 1].sup.T=A[r.sub.1 r.sub.2 T][X.sub.w, Y.sub.w, 1].sup.T=H[X.sub.w, Y.sub.w, 1].sup.T(12)
(92) where A is determined by f.sub.x, f.sub.y, v.sub.0, u.sub.0 and s, i.e., the internal parameters of the camera, and is related only to the internal structure of the camera; H is an external parameter of the camera, and directly reflects the position of the camera in the space. (u,v) represents the pixel coordinates in the image coordinate system, and (X.sub.w, Y.sub.w, Z.sub.w) is the world coordinate system. s is a skew factor, where s=f.sub.x cot θ, with θ the angle between the image axes; f.sub.x=f/d.sub.x and f.sub.y=f/d.sub.y, where f is the focal length of the lens and d.sub.x and d.sub.y are the physical pixel sizes. [X.sub.w, Y.sub.w, Z.sub.w, 1].sup.T represents the world coordinates of any object point in the space, and [u, v, 1].sup.T represents the pixel coordinates of the image of this object point.
(93) The translation vector T=[T.sub.x,T.sub.y,T.sub.z].sup.T is a 3×1 vector, the rotation matrix R is a 3×3 orthogonal matrix, and both the translation vector T and the rotation matrix R=(r.sub.1 r.sub.2 r.sub.3) are external parameters.
(94) If it is assumed that H=[h.sub.1 h.sub.2 h.sub.3] then:
H=[h.sub.1 h.sub.2 h.sub.3]=λA[r.sub.1 r.sub.2 T](13)
(95) where λ is an arbitrary scaling factor and r.sub.1 is orthogonal to r.sub.2, so that two constraints on A can be obtained:
(96) h.sub.1.sup.TA.sup.-TA.sup.-1h.sub.2=0(14) h.sub.1.sup.TA.sup.-TA.sup.-1h.sub.1=h.sub.2.sup.TA.sup.-TA.sup.-1h.sub.2(15)
(97) 2) Solution of Parameters
(98) B=A.sup.-TA.sup.-1
(99) It can be seen from the formula that B is a positive definite symmetric matrix, which is defined as follows:
b=[B.sub.11B.sub.12B.sub.22B.sub.13B.sub.23B.sub.33].sup.T(16)
(100) If it is assumed that the i.sup.th column of H is h.sub.i, then:
h.sub.i.sup.TBh.sub.j=v.sub.ij.sup.Tb(17)
and:
v.sub.ij=[h.sub.1ih.sub.1j, h.sub.1ih.sub.2j+h.sub.2ih.sub.1j, h.sub.2ih.sub.2j, h.sub.3ih.sub.1j+h.sub.1ih.sub.3j, h.sub.3ih.sub.2j+h.sub.2ih.sub.3j, h.sub.3ih.sub.3j].sup.T(18)
hence:
(101) [v.sub.12.sup.T, (v.sub.11-v.sub.22).sup.T].sup.Tb=0(19)
(102) That is:
Vb=0(20)
(103) where V is a 2n×6 matrix; when n>2, b has a unique solution (up to a scale factor), that is, at least three pictures need to be collected. The internal parameters are then recovered from B by Cholesky decomposition:
(104)
(105) Thus, the external parameters are solved:
(106) r.sub.1=λA.sup.-1h.sub.1, r.sub.2=λA.sup.-1h.sub.2, r.sub.3=r.sub.1×r.sub.2, T=λA.sup.-1h.sub.3, where λ=1/∥A.sup.-1h.sub.1∥=1/∥A.sup.-1h.sub.2∥
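The rows v.sub.ij of formulas (17) and (18) can be built mechanically from the homography columns; the sketch below is a hypothetical pure-Python helper following the component ordering of formula (18):

```python
def v_ij(H, i, j):
    """Row vector v_ij of the linear system Vb = 0, built from
    columns h_i and h_j of the homography H (given as a list of
    rows; i and j are 1-indexed to match the text)."""
    hi = [H[r][i - 1] for r in range(3)]
    hj = [H[r][j - 1] for r in range(3)]
    return [hi[0] * hj[0],
            hi[0] * hj[1] + hi[1] * hj[0],
            hi[1] * hj[1],
            hi[2] * hj[0] + hi[0] * hj[2],
            hi[2] * hj[1] + hi[1] * hj[2],
            hi[2] * hj[2]]

# Each calibration view contributes two rows of V: v_12 and v_11 - v_22.
H = [[1.0, 0.1, 5.0],
     [0.2, 1.1, 3.0],
     [0.0, 0.0, 1.0]]
row1 = v_ij(H, 1, 2)
row2 = [a - b for a, b in zip(v_ij(H, 1, 1), v_ij(H, 2, 2))]
```

Stacking these two rows per view for n views gives the 2n×6 matrix V.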
(107) 3) Non-Linear Optimization
(108) Parameter optimization is performed according to a maximum-likelihood criterion, and the target function is as follows:
(109) Σ.sub.i=1.sup.nΣ.sub.j=1.sup.m∥m.sub.ij-{circumflex over (m)}(A, R.sub.i, T.sub.i, M.sub.j)∥.sup.2(21)
(110) where m.sub.ij is the observed image point of the j.sup.th lattice point M.sub.j in the i.sup.th image, and {circumflex over (m)}(A, R.sub.i, T.sub.i, M.sub.j) is the reprojection of M.sub.j computed from the calibration parameters.