HIGHLY RELIABLE AND HIGH-PRECISION NAVIGATION AND POSITIONING METHOD AND SYSTEM FOR UAV UNDER GPS-DENIED CONDITIONS

20250172687 · 2025-05-29

Abstract

A highly reliable and high-precision navigation and positioning method and system for UAV under GPS-DENIED conditions includes: obtaining measurements of an inertial sensor and an image of a binocular camera, extracting and tracking point features of the image, and obtaining an altitude value; calculating a position and attitude of the camera and a depth of a landmark point based on the parallax and baseline of the binocular camera; fusing the measurement data of the inertial sensor and the binocular camera, and optimizing the data to obtain high-precision position and attitude data; obtaining the altitude value of the ranging radar and performing four-degree-of-freedom position and attitude graph optimization after adding the detected keyframes when a repeated occurrence of the UAV at the same location is detected; and encapsulating the optimized position and attitude data to form a pseudo-GPS signal, and inputting the pseudo-GPS signal to the UAV for positioning.

Claims

1-10. (canceled)

11. A navigation and positioning method for an unmanned aerial vehicle (UAV) under GPS-DENIED conditions, wherein the UAV comprises: an inertial sensor, a binocular camera, a ranging radar and a loopback detection module; the method comprising: obtaining measurements of the inertial sensor and an image of the binocular camera, extracting and tracking point features of the image, and obtaining an altitude value measured by the ranging radar; initializing the binocular camera, and calculating a position and attitude of camera and a depth of landmark point based on a parallax and a baseline of the binocular camera; fusing measurement data of the inertial sensor and the binocular camera, using sliding-window based nonlinear graph optimization to obtain high-precision position and attitude data; obtaining the altitude value of the ranging radar and performing four-degree-of-freedom position and attitude graph optimization after adding detected keyframes when the loopback detection module detects a repeated occurrence of the UAV at a same location; encapsulating optimized position and attitude data to form a pseudo-GPS signal, inputting the pseudo-GPS signal to the UAV for positioning, and performing route planning and setting landmark point tasks based on the current positioning.

12. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 11, further comprising: integrating the measurements collected by the inertial sensor to obtain position, attitude and velocity of the UAV, wherein the integration formula is:

$$\begin{bmatrix} \alpha_{b_{k+1}}^{b_k} \\ \beta_{b_{k+1}}^{b_k} \\ \gamma_{b_{k+1}}^{b_k} \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} R_w^{b_k}\left(p_{b_{k+1}}^w - p_{b_k}^w + \frac{1}{2} g^w t_k^2 - v_{b_k}^w t_k\right) \\ R_w^{b_k}\left(v_{b_{k+1}}^w + g^w t_k - v_{b_k}^w\right) \\ \left(q_{b_k}^w\right)^{-1} \otimes q_{b_{k+1}}^w \\ b_{a_{b_{k+1}}} - b_{a_{b_k}} \\ b_{w_{b_{k+1}}} - b_{w_{b_k}} \end{bmatrix};$$

wherein $b_k$ denotes the kth keyframe moment in the IMU coordinate system, $\alpha_{b_{k+1}}^{b_k}$ indicates an amount of change in position between the $b_k$ and $b_{k+1}$ frame moments, $\beta_{b_{k+1}}^{b_k}$ indicates an amount of change in velocity between the $b_k$ and $b_{k+1}$ frame moments, $\gamma_{b_{k+1}}^{b_k}$ indicates an amount of change in attitude between the $b_k$ and $b_{k+1}$ frame moments, $R_w^{b_k}$ indicates a rotation from the world coordinate system to the IMU coordinate system at the $b_k$ frame moment, $p_{b_{k+1}}^w$ indicates a position in the world coordinate system at the $b_{k+1}$ frame moment, $p_{b_k}^w$ indicates a position in the world coordinate system at the $b_k$ frame moment, $g^w$ indicates the acceleration of gravity in the world coordinate system, $t_k$ indicates the time interval from the $b_k$ frame moment to the $b_{k+1}$ frame moment, $v_{b_k}^w$ indicates a velocity in the world coordinate system at the $b_k$ frame moment, $v_{b_{k+1}}^w$ indicates a velocity in the world coordinate system at the $b_{k+1}$ frame moment, $q_{b_k}^w$ indicates an attitude in the world coordinate system at the $b_k$ frame moment, $q_{b_{k+1}}^w$ indicates an attitude in the world coordinate system at the $b_{k+1}$ frame moment, $b_{a_{b_{k+1}}}$ indicates a bias of the IMU accelerometer at the $b_{k+1}$ frame moment, $b_{a_{b_k}}$ indicates a bias of the IMU accelerometer at the $b_k$ frame moment, $b_{w_{b_{k+1}}}$ indicates a bias of the IMU gyroscope at the $b_{k+1}$ frame moment, and $b_{w_{b_k}}$ indicates a bias of the IMU gyroscope at the $b_k$ frame moment.

13. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 12, further comprising: obtaining state variables in the sliding-window, the state variables comprising: position, velocity and attitude of the IMU coordinate system, accelerometer bias, gyroscope bias, external parameter from camera to IMU, and inverse depths of the m+1 3D landmark points during the n+1 keyframe moments within the sliding-window; wherein the state variables to be solved are expressed as:

$$\mathcal{X} = \left[x_0, x_1, \ldots, x_n, x_c^b, \lambda_0, \lambda_1, \ldots, \lambda_m\right]$$
$$x_k = \left[p_{b_k}^w, v_{b_k}^w, q_{b_k}^w, b_a, b_g\right], \quad k \in [0, n]$$
$$x_c^b = \left[p_c^b, q_c^b\right];$$

wherein $\mathcal{X}$ denotes a set of state variables to be optimized during the n+1 keyframe moments and over the m+1 landmark points, $x_k$ denotes a set of position $p_{b_k}^w$, velocity $v_{b_k}^w$, attitude $q_{b_k}^w$, bias $b_a$ of the IMU accelerometer and bias $b_g$ of the IMU gyroscope in the world coordinate system at the $b_k$ frame moment, $\lambda_0, \ldots, \lambda_m$ denote the inverse depths of the landmark points, and $x_c^b$ denotes a translation $p_c^b$ and a rotation $q_c^b$ from a camera coordinate system to the IMU coordinate system.

14. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 13, wherein the graph optimization cost function is:

$$\min_{\mathcal{X}} \left\{ \left\| r_p - H_p \mathcal{X} \right\|^2 + \sum_{k \in B} \left\| r_B\left(\hat{z}_{b_{k+1}}^{b_k}, \mathcal{X}\right) \right\|_{P_{b_{k+1}}^{b_k}}^2 + \sum_{(f,j) \in C} \left\| r_C\left(\hat{z}_f^{c_j}, \mathcal{X}\right) \right\|_{P_f^{c_j}}^2 + \sum_{(l,v) \in L} \left\| r_L\left(\hat{z}_l^{c_v}, \mathcal{X}\right) \right\|_{P_l^{c_v}}^2 \right\};$$

wherein $r_p$ denotes a marginalized prior error, $H_p$ denotes a Hessian matrix, $B$ denotes a set of IMU measurements, $\hat{z}_{b_{k+1}}^{b_k}$ denotes measurements of the IMU between the $b_k$ and $b_{k+1}$ frame moments, $P_{b_{k+1}}^{b_k}$ denotes a covariance of the IMU pre-integration from the $b_k$ to the $b_{k+1}$ frame moment, $r_B$ denotes an error of the pre-integration, $C$ denotes a set of point features, $\hat{z}_f^{c_j}$ denotes an observation value of the fth point feature in the $c_j$th image, $P_f^{c_j}$ denotes a covariance of the observation of the fth point feature in the $c_j$th image, $r_C$ denotes a reprojection error of the point features, $L$ denotes a set of line features, $\hat{z}_l^{c_v}$ denotes an observation value of the lth line feature in the $c_v$th image, $P_l^{c_v}$ denotes a covariance of the observation of the lth line feature in the $c_v$th image, and $r_L$ denotes a reprojection error of the line features.

15. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 11, wherein the parallax of the binocular camera is a difference in orientation between two cameras of the binocular camera observing a same target, and an angle between the two cameras as viewed from the target indicates a parallax angle of the two cameras, and a line connecting the two cameras is the baseline.

16. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 11, wherein the position and attitude of camera is used for UAV navigation, and the depth of landmark point is used to build a map of the environment.

17. The navigation and positioning method for UAV under GPS-DENIED conditions according to claim 12, wherein the fusing measurement data of the inertial sensor and the binocular camera comprises: fusing position, attitude and velocity obtained by integration of the inertial sensor, features extracted by the binocular camera, and image features matched by the binocular camera, to obtain optimized position, attitude and velocity.

18. A navigation and positioning system for an unmanned aerial vehicle (UAV) under GPS-DENIED conditions, comprising: a data obtaining module, configured to obtain measurements of an inertial sensor and an image of a binocular camera, extract and track point features of the image, and obtain an altitude value measured by a ranging radar; a data calculating module, configured to initialize a binocular camera, and calculate a position and attitude of camera and a depth of the landmark point based on a parallax and a baseline of the binocular camera; a fusing and optimal module, configured to fuse measurement data of the inertial sensor and the binocular camera, use sliding-window based nonlinear graph optimization to obtain high-precision position and attitude data; an optimal position and attitude module, configured to obtain the altitude value of the ranging radar and perform four-degree-of-freedom position and attitude graph optimization after adding detected keyframes when a loopback detection module detects a repeated occurrence of the UAV at the same location; and a navigation and positioning module, configured to encapsulate optimized position and attitude data to form a pseudo-GPS signal, input the pseudo-GPS signal to the UAV for positioning, and perform route planning and set landmark point tasks based on the current positioning.

19. An unmanned aerial vehicle (UAV), comprising: a memory, a processor, and a navigation and positioning program for UAV under GPS-DENIED conditions stored on the memory and runnable on the processor, wherein the program, when executed by the processor, implements the steps of the navigation and positioning method for UAV under GPS-DENIED conditions according to claim 11.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0035] FIG. 1 shows a flow chart of an embodiment of a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0036] FIG. 2 shows a flow chart of an embodiment of entire positioning execution step in a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0037] FIG. 3 shows a tightly coupled visual-inertial graph optimization based on a sliding-window of an embodiment of a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0038] FIG. 4 shows a schematic diagram of position and attitude graph optimization for loopback detection of an embodiment of a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0039] FIG. 5 shows a schematic diagram of a UAV flight trajectory of an embodiment of a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0040] FIG. 6 shows a schematic diagram of a UAV flight trajectory based on point-line features of an embodiment of a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions provided by the present disclosure;

[0041] FIG. 7 shows a schematic diagram of the principle of an embodiment of a highly reliable and high-precision navigation and positioning system for UAV under GPS-DENIED conditions provided by the present disclosure;

[0042] FIG. 8 shows a schematic diagram of an operating environment of an embodiment of a UAV provided by the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS

[0043] In order to make the objects, technical solutions and effects of the present disclosure clearer and more explicit, the present disclosure is described in further detail hereinafter with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only for explaining the present disclosure and are not intended to limit the present disclosure.

[0044] For the highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions described in an embodiment of the present disclosure, as shown in FIG. 1 and FIG. 2, the highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions includes the following steps:

[0045] Step 10, obtaining the measurements of the inertial sensor and the image of the binocular camera, extracting and tracking the point features of the image, and obtaining the altitude value measured by the ranging radar.

[0046] In the present embodiment, the hardware computing platform of the present disclosure is: NVIDIA AGX Xavier, 8-Core Carmel ARM, 512-Core Volta, 32 GB 256-bit LPDDR4x. IMU (Inertial Measurement Unit, an inertial sensor sampled at 200 Hz, mainly used to detect and measure acceleration and rotational motion) measurements are obtained, and images of the binocular cameras (e.g., the Intel RealSense D455 depth camera, with a depth field of view of approximately 86°×57° (±3°)) are acquired. The IMU of the D455 is calibrated using IMU_utils, and the intrinsic parameters of the D455 binocular camera and the extrinsic parameters between the camera and the IMU are calibrated using kalibr and Apriltag. The core problem of visual odometry is how to estimate camera motion from images. However, the image itself is a matrix composed of luminance and color, and it would be very difficult to consider motion estimation directly at the matrix level. Therefore, it is more convenient to extract and track representative point features and line features from the image. The point features are tracked using Harris corner points with KLT optical flow; the line features are detected using a modified LSD (Line Segment Detector) algorithm, described by the LBD (Line Band Descriptor), and matched by KNN (K-Nearest Neighbor). The IMU part uses the pre-integration technique, which integrates multiple IMU measurements to obtain the position p, attitude q, and velocity v of the system, as shown in Eq. 1. Also, the altitude value measured by the ranging radar is obtained.
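The corner score behind the point-feature front end can be illustrated without the full OpenCV pipeline. Below is a minimal NumPy sketch of the Harris response R = det(M) − k·trace(M)², assuming a simple 3×3 box window and the conventional k = 0.04; a real tracker would smooth with a Gaussian window and then feed the selected corners to KLT optical flow.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Per-pixel Harris corner response R = det(M) - k*trace(M)^2.

    A sketch only: gradients via central differences and a plain 3x3 box
    window instead of the Gaussian weighting used in production trackers.
    """
    img = img.astype(float)
    Iy, Ix = np.gradient(img)          # d/drow, d/dcol
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):
        # 3x3 box sum via zero padding and shifted-window accumulation
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
```

On a synthetic image containing one bright square, the response is positive at the square's corner, negative along its edges, and zero in flat regions, which is exactly the property the front end exploits when selecting features to track.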

[0047] That is to say, the plurality of measurements collected by the inertial sensor are integrated to obtain the position, attitude and velocity of the UAV; the integration formula is as follows:

[00008]
$$\begin{bmatrix} \alpha_{b_{k+1}}^{b_k} \\ \beta_{b_{k+1}}^{b_k} \\ \gamma_{b_{k+1}}^{b_k} \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} R_w^{b_k}\left(p_{b_{k+1}}^w - p_{b_k}^w + \frac{1}{2} g^w t_k^2 - v_{b_k}^w t_k\right) \\ R_w^{b_k}\left(v_{b_{k+1}}^w + g^w t_k - v_{b_k}^w\right) \\ \left(q_{b_k}^w\right)^{-1} \otimes q_{b_{k+1}}^w \\ b_{a_{b_{k+1}}} - b_{a_{b_k}} \\ b_{w_{b_{k+1}}} - b_{w_{b_k}} \end{bmatrix}. \quad (\text{Eq. } 1)$$

[0048] Among them, $b_k$ denotes the kth keyframe moment in the IMU coordinate system, $\alpha_{b_{k+1}}^{b_k}$ indicates an amount of change in position between the $b_k$ and $b_{k+1}$ frame moments, $\beta_{b_{k+1}}^{b_k}$ indicates an amount of change in velocity between the $b_k$ and $b_{k+1}$ frame moments, $\gamma_{b_{k+1}}^{b_k}$ indicates an amount of change in attitude between the $b_k$ and $b_{k+1}$ frame moments, $R_w^{b_k}$ indicates a rotation from the world coordinate system to the IMU coordinate system at the $b_k$ frame moment, $p_{b_{k+1}}^w$ indicates a position in the world coordinate system at the $b_{k+1}$ frame moment, $p_{b_k}^w$ indicates a position in the world coordinate system at the $b_k$ frame moment, $g^w$ indicates the acceleration of gravity in the world coordinate system, $t_k$ indicates the time interval from the $b_k$ frame moment to the $b_{k+1}$ frame moment, $v_{b_k}^w$ indicates a velocity in the world coordinate system at the $b_k$ frame moment, $v_{b_{k+1}}^w$ indicates a velocity in the world coordinate system at the $b_{k+1}$ frame moment, $q_{b_k}^w$ indicates an attitude in the world coordinate system at the $b_k$ frame moment, $q_{b_{k+1}}^w$ indicates an attitude in the world coordinate system at the $b_{k+1}$ frame moment, $b_{a_{b_{k+1}}}$ indicates a bias of the IMU accelerometer at the $b_{k+1}$ frame moment, $b_{a_{b_k}}$ indicates a bias of the IMU accelerometer at the $b_k$ frame moment, $b_{w_{b_{k+1}}}$ indicates a bias of the IMU gyroscope at the $b_{k+1}$ frame moment, and $b_{w_{b_k}}$ indicates a bias of the IMU gyroscope at the $b_k$ frame moment.
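The pre-integration of Eq. 1 can be sketched numerically. The following is a simplified first-order (Euler) integration over hypothetical bias-corrected IMU samples; a real implementation such as VINS would additionally propagate covariances and bias Jacobians, and use quaternions rather than rotation matrices.

```python
import numpy as np

def preintegrate(accels, gyros, dt, ba, bg):
    """Integrate bias-corrected IMU samples between two keyframes.

    Returns the position increment alpha, velocity increment beta, and
    attitude increment gamma (as a rotation matrix), all expressed in the
    b_k frame, matching the left-hand side of Eq. 1. A first-order sketch.
    """
    alpha = np.zeros(3)   # position increment
    beta = np.zeros(3)    # velocity increment
    gamma = np.eye(3)     # attitude increment
    for a, w in zip(accels, gyros):
        a_corr = gamma @ (a - ba)   # rotate bias-corrected accel into b_k
        alpha += beta * dt + 0.5 * a_corr * dt ** 2
        beta += a_corr * dt
        # first-order rotation update from the bias-corrected gyro rate
        wx, wy, wz = w - bg
        Omega = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
        gamma = gamma @ (np.eye(3) + Omega * dt)
    return alpha, beta, gamma
```

Because the increments are expressed in the $b_k$ frame, they do not depend on the (still unknown) global pose, which is what makes the pre-integrated terms reusable when the optimizer later adjusts the keyframe states.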

[0049] Step 20, initializing the binocular camera, calculating the position and attitude of camera and the depth of landmark point based on the parallax and baseline of the binocular camera.

[0050] In the present embodiment, the binocular camera is first initialized; parallax exists between the two cameras of the binocular camera. The parallax is the difference in orientation when observing the same target from two points, such as two cameras separated by a certain distance; the parallax angle is the angle subtended at the target by the two points, and the baseline is the line connecting the two points. Once the parallax angle and the length of the baseline are obtained, the distance between the target and the observer can be calculated. Based on the parallax and the baseline of the binocular camera, the position and attitude of the camera and the landmark points, i.e., the depths of image features, can be calculated. The calculated position and attitude of the camera can then be utilized for UAV navigation, and the depths of the landmark points are adopted to build the map of the environment. Since the gyroscope can also integrate the camera's rotation, the bias of the IMU can be calculated from the constraint that the rotation calculated by the camera equals the rotation integrated by the IMU. The calculated bias is then used to re-integrate the previous IMU data. Subsequently, the position and attitude calculated by the IMU and the camera are further optimized. Ultimately, a relatively accurate position and attitude of the UAV at the current moment is provided as the initial value for the back-end nonlinear optimization.
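For a rectified stereo pair, the parallax-and-baseline relation above reduces to the familiar z = f·b/d, where d is the horizontal disparity in pixels. A minimal sketch follows; the focal length is a hypothetical calibration value, and the baseline is taken as roughly the D455's 95 mm stereo baseline.

```python
import numpy as np

FX = 640.0        # focal length in pixels (illustrative calibration value)
BASELINE = 0.095  # stereo baseline in metres (the D455 baseline is ~95 mm)

def stereo_depth(u_left, u_right, fx=FX, baseline=BASELINE):
    """Depth of a landmark from its horizontal disparity: z = fx * b / d."""
    disparity = u_left - u_right
    if disparity <= 0:
        return None  # feature not triangulable from this pair
    return fx * baseline / disparity

def landmark_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel with known depth into the camera frame."""
    return np.array([(u - cx) * depth / fx,
                     (v - cy) * depth / fy,
                     depth])
```

A ten-pixel disparity under these values corresponds to a landmark about 6 m away, which illustrates why a longer baseline or higher resolution extends the usable depth range.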

[0051] The back-end of the present disclosure adopts a sliding-window form to optimize the point features and line features together. The state variables in the sliding-window include the position of the IMU coordinate system, velocity, attitude (rotation), accelerometer bias, gyroscope bias, external parameter from camera to IMU, and the inverse depths of the m+1 3D landmark points during the n+1 keyframe moments within the sliding-window, and the state variables to be solved are expressed as Eq. 2:

[00013]
$$\mathcal{X} = \left[x_0, x_1, \ldots, x_n, x_c^b, \lambda_0, \lambda_1, \ldots, \lambda_m\right]$$
$$x_k = \left[p_{b_k}^w, v_{b_k}^w, q_{b_k}^w, b_a, b_g\right], \quad k \in [0, n]$$
$$x_c^b = \left[p_c^b, q_c^b\right]. \quad (\text{Eq. } 2)$$

[0052] Among them, $\mathcal{X}$ denotes a set of state variables to be optimized during the n+1 keyframe moments and over the m+1 landmark points, $x_k$ denotes a set of position $p_{b_k}^w$, velocity $v_{b_k}^w$, attitude $q_{b_k}^w$, bias $b_a$ of the IMU accelerometer and bias $b_g$ of the IMU gyroscope in the world coordinate system at the $b_k$ frame moment, and $x_c^b$ denotes a translation $p_c^b$ and a rotation $q_c^b$ from a camera coordinate system to the IMU coordinate system.

[0053] The graph optimization cost function is shown in Eq. 3:

[00014]
$$\min_{\mathcal{X}} \left\{ \left\| r_p - H_p \mathcal{X} \right\|^2 + \sum_{k \in B} \left\| r_B\left(\hat{z}_{b_{k+1}}^{b_k}, \mathcal{X}\right) \right\|_{P_{b_{k+1}}^{b_k}}^2 + \sum_{(f,j) \in C} \left\| r_C\left(\hat{z}_f^{c_j}, \mathcal{X}\right) \right\|_{P_f^{c_j}}^2 + \sum_{(l,v) \in L} \left\| r_L\left(\hat{z}_l^{c_v}, \mathcal{X}\right) \right\|_{P_l^{c_v}}^2 \right\}. \quad (\text{Eq. } 3)$$

[0054] Among them, $r_p$ denotes a marginalized prior error, $H_p$ denotes a Hessian matrix, $B$ denotes a set of IMU measurements, $\hat{z}_{b_{k+1}}^{b_k}$ denotes measurements of the IMU between the $b_k$ and $b_{k+1}$ frame moments, $P_{b_{k+1}}^{b_k}$ denotes a covariance of the IMU pre-integration from the $b_k$ to the $b_{k+1}$ frame moment, $r_B$ denotes an error of the pre-integration, $C$ denotes a set of point features, $\hat{z}_f^{c_j}$ denotes an observation value of the fth point feature in the $c_j$th image, $P_f^{c_j}$ denotes a covariance of the observation of the fth point feature in the $c_j$th image, $r_C$ denotes a reprojection error of the point features, $L$ denotes a set of line features, $\hat{z}_l^{c_v}$ denotes an observation value of the lth line feature in the $c_v$th image, $P_l^{c_v}$ denotes a covariance of the observation of the lth line feature in the $c_v$th image, and $r_L$ denotes a reprojection error of the line features. The structure of the graph optimization is illustrated in FIG. 3.
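The shape of the Eq. 3 objective — a marginalized prior term plus covariance-weighted residual blocks for IMU, point-feature and line-feature measurements — can be illustrated with a toy evaluator. This is a sketch only: the residual values are placeholders, and a real back-end would hand this cost to an iterative nonlinear solver such as Ceres rather than merely evaluate it.

```python
import numpy as np

def mahalanobis_sq(r, P):
    """Squared Mahalanobis norm ||r||^2_P = r^T P^{-1} r used in Eq. 3."""
    return float(r @ np.linalg.solve(P, r))

def total_cost(prior, imu_terms, point_terms, line_terms):
    """Evaluate the Eq. 3 objective for one candidate state (toy version).

    `prior` is the already-evaluated marginalized residual r_p - H_p * X;
    each remaining term is a (residual, covariance) pair.
    """
    cost = float(prior @ prior)  # marginalized prior term
    for r, P in imu_terms + point_terms + line_terms:
        cost += mahalanobis_sq(r, P)
    return cost
```

The covariance weighting is the point of the formulation: a noisy measurement (large P) contributes little to the cost, so the optimizer trusts it less than a precise one.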

[0055] Step 30, fusing the measurement data of the inertial sensor and the binocular camera, using sliding-window-based nonlinear graph optimization to obtain high-precision position and attitude data.

[0056] Specifically, the front-end of the system adopts a tightly coupled method to fuse the position, velocity, and attitude obtained from the integration of the IMU data with the image features extracted and matched by the binocular camera, to obtain an optimized position, velocity, and attitude. The back-end then adopts sliding-window-based nonlinear graph optimization for further refinement to obtain the high-precision position and attitude data.

[0057] Step 40, obtaining the altitude value of the ranging radar and performing four-degree-of-freedom position and attitude graph optimization after adding the detected keyframes, when the loopback detection module detects a repeated occurrence of the UAV at the same location.

[0058] Specifically, the loopback detection module performs loop detection in a separate thread, utilizing a bag-of-words model with BRIEF descriptors to detect whether the UAV has reappeared at the same location. When a loop is detected, a connection is established between the new keyframe and its loop candidate by means of the retrieved features. As shown in FIG. 4, once relocalization is detected, the height data measured by the radar is first obtained, and then the keyframes from the loopback detection are added to the graph optimization for four-degree-of-freedom (x, y, z, yaw) position and attitude graph optimization. Because the data from the radar is more accurate, the other three degrees of freedom can also be optimized during the global optimization to obtain high-precision positioning results.
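Only four degrees of freedom need correcting at a loop closure because roll and pitch are made observable by the IMU's gravity direction. A minimal sketch of applying such a 4-DOF (x, y, z, yaw) correction rigidly to a trajectory follows; in practice the per-keyframe corrections come from the pose graph solver rather than a single rigid update.

```python
import numpy as np

def apply_4dof_correction(positions, yaws, dyaw, dt_xyz):
    """Rigidly correct a trajectory with a 4-DOF update (x, y, z, yaw).

    Rotates all positions about the z-axis by `dyaw`, shifts them by
    `dt_xyz`, and offsets every yaw angle; roll and pitch are untouched
    because gravity keeps them observable without loop closures.
    """
    c, s = np.cos(dyaw), np.sin(dyaw)
    Rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    corrected = (Rz @ np.asarray(positions, dtype=float).T).T + np.asarray(dt_xyz, dtype=float)
    return corrected, [y + dyaw for y in yaws]
```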

[0059] Step 50, encapsulating the optimized position and attitude data to form a pseudo-GPS signal, inputting the pseudo-GPS signal to the UAV for positioning, performing route planning and setting landmark point tasks based on the current positioning.

[0060] Specifically, the flight controller on the UAV encapsulates the position and attitude data output from the SLAM module via the serial port to form a pseudo-GPS signal using the MavLink (Micro Air Vehicle Message Marshalling Library) protocol, and inputs the pseudo-GPS signal to the flight controller to position the UAV. Based on this localization, corresponding route planning and landmark point tasks can be carried out.
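One part of forming such a pseudo-GPS payload is mapping the local SLAM position onto latitude/longitude offsets around a chosen datum. The sketch below uses a small-offset spherical-Earth approximation; the datum values are hypothetical, and the actual MavLink message packing is omitted.

```python
import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius in metres

def enu_to_pseudo_gps(east, north, up, lat0_deg, lon0_deg, alt0):
    """Convert a local ENU SLAM position into pseudo-GPS coordinates.

    A small-offset approximation around the take-off datum (lat0, lon0,
    alt0); adequate over the short ranges a UAV covers near its origin.
    """
    lat = lat0_deg + math.degrees(north / EARTH_RADIUS)
    lon = lon0_deg + math.degrees(
        east / (EARTH_RADIUS * math.cos(math.radians(lat0_deg))))
    return lat, lon, alt0 + up
```

The resulting latitude, longitude and altitude are what would be serialized into the pseudo-GPS message so the flight controller can consume SLAM output through its ordinary GPS interface.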

[0061] As shown in FIG. 5, the position output by the SLAM module is given to the flight controller, which controls the UAV to return to the origin after flying a circle in the air. The trajectory diagram demonstrates that the take-off point and the landing point of the UAV are the same.

[0062] As shown in FIG. 6, the results of running the EuRoC dataset with the VINS-based algorithm using point and line features are shown. Compared with VINS-FUSION, which uses point features alone, the algorithm of the present disclosure achieves higher localization accuracy.

[0063] The present disclosure integrates line features into the binocular visual-inertial SLAM and, at the same time, fuses the laser data for joint optimization, which can effectively reduce the data error on the other axes and improve the localization accuracy of the system in scenes having sparse texture or large illumination changes. In addition, a restart mechanism for SLAM failure is developed, which can greatly enhance the safety of a UAV adopting SLAM for positioning.

[0064] Further, line features extracted from the binocular camera are fused into VINS, and other altimetry modules are used to measure the height of the UAV above the ground; in feasible scenarios, multiple axes can be measured at the same time to jointly participate in the joint optimization of the position and attitude graph of the UAV.

[0065] The technical effects achieved by the technical solution of the present disclosure are as follows:

(1) The measurement data of the binocular camera and the IMU are fused to form an efficient and high-precision position and attitude estimation algorithm: the data of the IMU has the characteristic of sensitive response; however, the position, attitude and velocity calculated by long-term integration of the angular velocity and acceleration obtained from the IMU may diverge, the accuracy may be reduced, and the IMU has a zero bias. A section of angular velocity data is collected in the stationary state; in the ideal case, (ω.sub.x, ω.sub.y, ω.sub.z)=(0,0,0), but in reality the output of the gyroscope is a slowly varying curve of a complex white noise signal, and the average of the curve is the zero-bias value.

[0066] Because of the existence of the zero bias, the estimated position error will be enlarged. Therefore, the IMU is suitable for estimating short-duration, fast motion. The visual localization module adopts the binocular camera, which has better static accuracy but suffers from a measurement dead zone and slow response, and is therefore suitable for estimating long-duration, slow motion. The present disclosure adopts a novel sensor fusion framework to fuse the IMU and the binocular camera, taking the advantages of each to form an efficient and stable localization algorithm.
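The zero-bias estimation described in item (1) — averaging a stationary gyroscope log — can be sketched as follows. The bias and noise magnitudes are illustrative only; a real calibration (e.g., with IMU_utils) would also characterize the noise densities, not just the mean.

```python
import numpy as np

rng = np.random.default_rng(0)
true_bias = np.array([0.01, -0.02, 0.005])  # rad/s, illustrative values

# Simulated stationary gyroscope log: white noise around a constant bias.
# In the ideal case the stationary output would be exactly (0, 0, 0).
samples = true_bias + rng.normal(scale=0.05, size=(20000, 3))

# The zero bias is the average of the slowly varying output curve.
est_bias = samples.mean(axis=0)
```

Subtracting the estimated bias before integration is what keeps the short-term IMU dead-reckoning usable between camera updates.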

(2) The current mainstream SLAM algorithms are almost entirely based on point features and perform poorly in scenes having sparse or repeated textures. The present disclosure introduces a line feature mechanism in which the SLAM front-end extracts line features while extracting point features. Because line features carry more geometric information and are more robust to illumination changes, the accuracy of the SLAM algorithm can be greatly improved. At the same time, better back-end optimization methods are adopted for multi-sensor fusion, for example, using a higher-precision laser ranging sensor to measure height, and then using the height information on the z-axis to optimize the data on the other axes.
(3) A unique safety mechanism is developed: taking into account the common failure problem of SLAM algorithms, the system saves the state variables before a failure, reinitializes quickly, and calculates the position change at the time of failure for compensation, so that the safety and robustness of the whole positioning system are greatly improved.

[0067] Further, as shown in FIG. 7, based on the above-described highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions, the present disclosure also provides a highly reliable and high-precision navigation and positioning system for UAV under GPS-DENIED conditions, the highly reliable and high-precision navigation and positioning system for UAV under GPS-DENIED conditions includes: [0068] data obtaining module 51, used for obtaining the measurements of the inertial sensor and an image of the binocular camera, extracting and tracking the point features of the image, and obtaining the altitude values measured by the ranging radar; [0069] data calculating module 52, used for initializing the binocular camera, calculating the position and attitude of camera and the depth of the landmark point based on the parallax and baseline of the binocular camera; [0070] fusing and optimal module 53, used for fusing the measurement data of the inertial sensor and the binocular camera, using sliding-window based nonlinear graph optimization to obtain high-precision position and attitude data; [0071] optimal position and attitude module 54, used for obtaining the altitude value of the ranging radar and perform four-degree-of-freedom position and attitude graph optimization after adding the detected keyframes, when the loopback detection module detects a repeated occurrence of the UAV at the same location; [0072] navigation and positioning module 55, used for encapsulating the optimized position and attitude data to form a pseudo-GPS signal, inputting the pseudo-GPS signal to the UAV for positioning, performing route planning and setting landmark point tasks based on the current positioning.

[0073] Further, as shown in FIG. 8, based on the above-described highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions, the present disclosure also provides a UAV, the UAV includes a processor 10, a memory 20, and a display 30. FIG. 8 only illustrates some of the components of the UAV, but it should be understood that the implementation of all the illustrated components is not required, and more or fewer components may alternatively be implemented.

[0074] The memory 20 in some embodiments may be an internal storage unit of the UAV, such as a hard disk or internal memory of the UAV. The memory 20 in some other embodiments may also be an external storage device of the UAV, such as a plug-in hard disk equipped on the UAV, an SMC (Smart Media Card), an SD (Secure Digital) card, a Flash Card, etc. Further, the memory 20 may also include both an internal storage unit of the UAV and an external storage device. The memory 20 is used to store the application software installed in the UAV and various types of data, such as the program code installed on the UAV, etc. The memory 20 may also be used to temporarily store data that has been output or will be output. In an embodiment, a highly reliable and high-precision navigation and positioning program 40 for UAV under GPS-DENIED conditions is stored in the memory 20, and the program 40 can be executed by the processor 10, thereby implementing the highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions in the present disclosure.

[0075] The processor 10 may in some embodiments be a CPU (Central Processing Unit), microprocessor or other data processing chip for running program code stored in the memory 20 or processing data, such as executing a highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions.

[0076] The display 30 in some embodiments may be an LED display, an LCD display, a touchscreen LCD display, or an OLED (Organic Light-Emitting Diode) touchscreen, among others. The display 30 is used for displaying information in the UAV as well as for displaying a visual user interface. The components 10-30 of the UAV communicate with each other via a system bus.

[0077] In an embodiment, the processor 10 implements the steps of the highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions when the highly reliable and high-precision navigation and positioning program 40 for UAV under GPS-DENIED conditions in the memory 20 is executed by the processor 10.

[0078] The present disclosure also provides a computer-readable storage medium. The computer-readable storage medium stores a highly reliable and high-precision navigation and positioning program for UAV under GPS-DENIED conditions, and when the program is executed by a processor, the steps of the highly reliable and high-precision navigation and positioning method for UAV under GPS-DENIED conditions as described above are implemented.

[0079] In summary, the present disclosure provides a highly reliable and high-precision navigation and positioning method and system for UAV under GPS-DENIED conditions, a UAV and a computer-readable storage medium. The method includes steps of: obtaining the measurements of the inertial sensor and the image of the binocular camera, extracting and tracking point features of the image, and obtaining the altitude value measured by the ranging radar; initializing the binocular camera, calculating position and attitude of camera and depth of landmark point based on the parallax and baseline of the binocular camera; fusing the measurement data of the inertial sensor and the binocular camera, using sliding-window-based nonlinear graph optimization to obtain high-precision position and attitude data; obtaining the altitude value of the ranging radar and performing four-degree-of-freedom position and attitude graph optimization after adding the detected keyframes, when the loopback detection module detects a repeated occurrence of the UAV at the same location; encapsulating the optimized position and attitude data to form a pseudo-GPS signal, inputting the pseudo-GPS signal to the UAV for positioning, performing route planning and setting landmark point tasks based on the current positioning. The present disclosure provides positioning signals efficiently and accurately by fusing the data from the inertial sensor and binocular camera, so that the safety and robustness of the UAV positioning system are greatly improved.

[0080] It should be noted that, as used herein, the terms "including", "includes", or any other variant thereof, are intended to encompass non-exclusive inclusion, such that a process, method, article, or UAV that comprises a set of elements not only comprises those elements, but also includes other elements that are not explicitly listed, or that are inherent to such process, method, article or UAV. Without further limitation, the fact that an element is defined by the phrase "includes a . . ." does not preclude the existence of another identical element in the process, method, article or UAV that includes that element.

[0081] Of course, those skilled in the art may understand that all or part of the process of realizing the method of the above embodiments may be accomplished by a computer program to instruct the relevant hardware (e.g., a processor, a controller, etc.), and the program may be stored in a computer-readable storage medium, and the program may include the process of the above embodiments of the method in the execution thereof. The computer-readable storage medium may be a memory, a disk, a CD-ROM, and the like.

[0082] It should be understood that for those skilled in the art, equivalent substitutions or changes may be made in accordance with the technical solution of the present disclosure and its inventive conception, and all such changes or substitutions shall fall within the scope of protection of the claims appended to the present disclosure.