Method and device for in-flight terrain identification for microdrone
10061319 · 2018-08-28
Assignee
Inventors
CPC classification
G01F1/00
PHYSICS
B64U2101/55
PERFORMING OPERATIONS; TRANSPORTING
B64U10/80
PERFORMING OPERATIONS; TRANSPORTING
B64U2201/10
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0094
PHYSICS
B64U50/19
PERFORMING OPERATIONS; TRANSPORTING
International classification
G01C23/00
PHYSICS
G01F1/00
PHYSICS
G05D1/00
PHYSICS
Abstract
The invention relates to a surface identification device for the movement of a vehicle at a distance from that surface, the device comprising a detection head, the head including at least one sensor of a property depending on the distance of the center of the head from the surface, each sensor covering a detection zone centered on a line of sight, an orientation system for the detection zone of each sensor, and a controller processing the signals from each sensor and controlling the system based on said signals. The controller estimates the direction of the perpendicular to the surface, and uses said system to rotate the line of sight of each sensor into a direction separated by a determined reorientation angle from the direction of said perpendicular.
Claims
1. A surface identification device for the autonomous movement of a moving vehicle at a distance from that surface, the surface identification device comprising: a) a surface detection head defining a coordinate system fixed to the detection head with an origin at a center of the detection head, an axis of advance, a drift axis, an ascent axis, the detection head including at least one sensor for a quantity depending on the distance of the center of the detection head from the surface, each sensor covering a detection zone centered on a line of sight; b) an orientation system for orienting the detection zone of each sensor by rotating the detection zone around the drift axis; and c) a controller suitable for receiving and processing signals from each sensor and controlling the system for orienting the detection zone based on said signals, wherein the controller is configured for estimating the direction of the perpendicular to the surface in said coordinate system based on said signals, and for using said orientation system to rotate the line of sight of the viewing zone of each sensor into a direction separated by a determined reorientation angle from the direction of said perpendicular.
2. The device according to claim 1, wherein the reorientation angle is between 0° and 90°.
3. The device according to claim 1, wherein each sensor is an optical flow sensor, and wherein the controller is configured for estimating the direction of the perpendicular to the surface by determining a maximum optical flow direction based on the signals delivered by each optical flow sensor.
4. The device according to claim 3, wherein the controller is configured for determining the maximum optical flow direction by carrying out the following steps: a) determining a function of the optical flow from the orientation of the line of sight of the detection zone in said coordinate system by regression analysis of the optical flow signals provided by each optical flow sensor; and b) determining the orientation of the line of sight of the detection zone for which said function has a maximum.
5. The device according to claim 4, wherein step a) is carried out using a least squares method.
6. The device according to claim 4, wherein step b) is carried out by differentiation of said function.
7. The device according to claim 1, wherein the controller is further configured for computing a confidence index based on said signals, and validating or rejecting the estimate of the direction of the perpendicular to the surface based on the value of the confidence index.
8. The device according to claim 1, wherein the detection head comprises four optical flow sensors, the four optical flow sensors being a front ventral sensor, a rear ventral sensor, a front dorsal sensor, and a rear dorsal sensor.
9. The device according to claim 1, wherein the detection zone orientation system comprises an actuator able to rotate the surface detection head around the drift axis.
10. The device according to claim 9, wherein the actuator is a stepping motor.
11. The device according to claim 1, provided with an optical flow measuring assembly comprising a matrix of photodetectors covering a detection field of substantially 360°×60° around the drift axis, and wherein the orientation system of the detection zone is able to select one or more subsets of said photodetectors to represent each of said sensors.
12. The device according to claim 1, wherein the detection head is further equipped with at least one gyro-stabilization sensor of the detection head to compensate for the rotational movements of said moving vehicle during flight.
13. An aerial microdrone with autonomous movement comprising a device for the autonomous movement of a moving vehicle at a distance from that surface, the device comprising: a) a surface detection head defining a coordinate system fixed to the detection head with an origin at a center of the detection head, an axis of advance, a drift axis, an ascent axis, the detection head including at least one sensor for a quantity depending on the distance of the center of the detection head from the surface, each sensor covering a detection zone centered on a line of sight; b) an orientation system for orienting the detection zone of each sensor by rotating the detection zone around the drift axis; and c) a controller suitable for receiving and processing signals from each sensor and controlling the system for orienting the detection zone based on the signals, wherein the controller is configured for estimating the direction of a perpendicular to the surface in the coordinate system based on the signals, and for using the orientation system to rotate the line of sight of the viewing zone of each sensor into a direction separated by a determined reorientation angle from the direction of the perpendicular, wherein the detection zone orientation system comprises an actuator able to rotate the surface detection head around the drift axis, the controller being configured for controlling the actuator so as to align the axis of ascent of the detection head in the estimated direction of the perpendicular to the surface and thus to keep the axis of advance of the detection head parallel to the local slope of the surface overflown by the microdrone.
14. The microdrone according to claim 13, further comprising a gyrometer or an airspeed indicator or both for controlling the linear advance speed of the microdrone.
15. A surface identification method for the autonomous movement of a moving vehicle at a distance from said surface, the method comprising: defining a coordinate system fixed to a surface detection head of a device with an origin at a center of the surface detection head, an axis of advance, a drift axis, and an ascent axis, the surface detection head including at least one sensor; providing a detection zone centered on a line of sight of the at least one sensor; orienting an orientation system for the detection zone of the at least one sensor by rotating the detection zone around the drift axis; measuring a quantity with the at least one sensor depending on said distance from the surface in the provided detection zone; estimating, with a controller of the device suitable for receiving and processing signals from the at least one sensor, the direction of a perpendicular to the surface in the coordinate system based on the signals, and thus the minimum distance direction from the surface based on the measured quantity; and reorienting with the controller the line of sight of the detection zone in a direction separated by a determined reorientation angle from the direction of said perpendicular.
16. The method according to claim 15, wherein the measuring step consists of measuring the optical flow, and the estimating step comprises: i) determining the function of the optical flow of the orientation of the line of sight of the detection zone by regression analysis of the optical flow measurements; and ii) determining the orientation of the line of sight of the detection zone for which said function has a maximum.
17. The method according to claim 16, wherein step i) is carried out using a least squares method.
18. The method according to claim 16, wherein step ii) is carried out by differentiation of said function.
19. The method according to claim 15, wherein the detection zone orientation system comprises an actuator able to rotate the surface detection head around the drift axis, the controller being configured for controlling the actuator so as to align the axis of ascent of the detection head in the estimated direction of the perpendicular to the surface and thus to keep the axis of advance of the detection head parallel to a local slope of the surface.
Description
(1) The invention will be better understood upon reading the following detailed description, made with reference to the appended drawings.
GENERAL CONSIDERATIONS
(10) The present invention preferably applies to large flying drones, microdrones and nanodrones equipped with optical flow sensors. Nevertheless, the invention is not limited thereto. It can advantageously be applied to other vehicles moving at a distance from a surface, such as a robotic fish, a space vessel, etc.
(11) Aside from optical flow sensors, distance sensors such as ultrasound sensors, laser sensors, radar sensors, sonar sensors, etc., or sensors measuring a property depending on distance, such as a sensor based on the electric sense, can also be used in the context of the invention.
First Embodiment
(13) Microdrones or nanodrones of this type are applicable in search, reconnaissance or rescue missions, in particular when the relief is steep or in a setting with buildings and/or when GPS navigation is not possible or desirable.
(14) In addition to the two rotors 4, 6, the microdrone 2 includes two electric motors 8, 10 for driving the rotors, a main body 12, and a terrain identification device 14. The main body 12 defines a longitudinal axis X-X. It includes a printed circuit 16 and a reinforcing bar 18 for the printed circuit 16, preferably made from carbon fibers. The reinforcing bar 18 preferably has a length of about 25 cm. Preferably, a gyrometer 20 measuring the pitch speed of the microdrone 2 is positioned on the printed circuit 16.
(15) The terrain identification device 14 is defined in
(16) The terrain detection head 22 defines a local coordinate system L connected to the detection head, preferably orthonormal, with an origin E (cf.
(17) In the example illustrated in
(18) It will be noted that the detection head 22 can also have lateral optical flow sensors.
(19) In one alternative, the detection head 22 also has an airspeed indicator 30 measuring the airspeed during flight. Furthermore, the detection head 22 can be equipped with gyro-stabilization sensors, such as GPS sensors or gyrometers. These sensors make it possible to take the frequent tilting movements of the microdrone 2 during flight into account, and thus to improve the measurements of the optical flow sensors 28.
(20) In another alternative, the detection head 22 can have optical flow sensors with a camera having a small or a very wide field of view. The optical flow sensors can also comprise a combination of a camera and a mirror in order to create a panoramic viewing field.
(21) In the embodiment according to
(22) The controller 26 is a microcontroller positioned on the printed circuit 16.
(23) In reference to
(24) It will be noted that, in the context of the invention, the term terrain refers to any type of relief or interface that the microdrone is called upon to follow during flight. It may in particular involve a ground, a ceiling, the surface of an expanse of water, a wall, a seabed, etc.
(25) Reference R denotes a land inertial reference with origin O and three axes Xo (transverse), Yo (longitudinal) and Zo (vertical). Reference φ denotes the elevation angle between the ascent axis Ze of the detection head 22 and the line of sight ZC of the detection zone ZV of an optical flow sensor 28. Preferably, the elevation angle φ is approximately 23°, and the detection zone ZV covers approximately 24°×12°.
(26) The vector V represents the instantaneous linear speed of the microdrone 2. Ψ is the angle between the vector V and the axis of advance Xe of the detection head 22. Reference Vx denotes the component of the linear speed V along the axis of advance Xe, Vy denotes the component of the linear speed V along the drift axis Ye, and Vz denotes the component of the linear speed V along the ascent axis Ze.
(27) The angle α is the angle between the local slope P of the terrain T flown over by the microdrone 2 and the horizontal plane Xo-Yo of the land reference R.
(28) The distance D.sub.surface is the distance between the surface S of the terrain T and the center E of the detection head 22 along the normal N to the slope P of the terrain T.
(29) The angle θ.sub.pitch is the angle between the longitudinal axis X-X of the microdrone 2 and the horizontal plane Xo-Yo of the land reference R.
(30) The angle θ.sub.head is the angle between the axis of advance Xe of the detection head 22 and the horizontal plane Xo-Yo of the land reference R.
(31) The angle θ.sub.EiR is the angle between the longitudinal axis X-X of the microdrone 2 and the axis of advance Xe of the detection head 22.
(32) The angle θ.sub.head/slope is the angle between the axis of advance Xe of the detection head 22 and the local slope P of the terrain T.
(34) Each feedback loop B1, B2, B3 is delimited by dotted lines.
(35) The feedback loop B2 according to the invention is identical to the corresponding feedback loop described in the IROS article. For a detailed explanation of this feedback loop, reference is made to this article, the corresponding content of which is incorporated into the present description.
(36) The feedback loop B1 according to the invention is comparable to the corresponding feedback loop described in the IROS article. It in particular also contains an advance controller 35. For a detailed explanation of the identical elements of these two feedback loops, reference is made to this article, the corresponding content of which is incorporated into the present description.
(37) Compared to the feedback loop of the linear speed of advance known from the IROS article, the feedback loop B1 according to the invention further takes into account the measurements from the airspeed indicator 30 and from the gyrometer 20 using two additional feedback loops B4 and B5 interleaved in feedback loop B1 and respectively using an airspeed controller 36 and a pitch speed controller 38.
(38) The loops B1, B4 and B5 cooperate as follows:
(39) The advance controller 35 generates an airspeed setpoint value based on the result of the comparison between the sum of the measured ventral and dorsal optical flows ω.sub.vtrl+ω.sub.drsl (with ω.sub.vtrl proportional to (ω.sub.va+ω.sub.vd) and ω.sub.drsl proportional to (ω.sub.da+ω.sub.dd)) and a setpoint value ω.sub.setpoint for the sum of the optical flows. The airspeed setpoint value is next compared to the airspeed measured by the airspeed indicator 30. As a result of this comparison, the airspeed controller 36 generates a setpoint value for the pitch speed of the microdrone 2. The pitch speed setpoint value is next compared to the pitch speed measured by the gyrometer 20. Based on the result of this comparison, the pitch speed controller 38 determines the difference in thrust ΔΩ.sub.rotors between the two rotors that is necessary to obtain the desired linear speed of advance.
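As an illustration only, the cooperation of the loops B1, B4 and B5 described above can be sketched as a cascade of proportional controllers. The control laws and all gain values below are hypothetical: the description fixes only which signals each controller compares and which setpoint it produces.

```python
# Illustrative sketch of the cascaded feedback loops B1 (advance),
# B4 (airspeed) and B5 (pitch speed). Proportional laws and gains
# are assumptions, not the patented control laws.

def advance_controller(flow_sum_setpoint, flow_ventral, flow_dorsal, k=2.0):
    """Loop B1: optical flow sum error -> airspeed setpoint."""
    return k * (flow_sum_setpoint - (flow_ventral + flow_dorsal))

def airspeed_controller(airspeed_setpoint, airspeed_measured, k=1.5):
    """Loop B4: airspeed error -> pitch speed setpoint."""
    return k * (airspeed_setpoint - airspeed_measured)

def pitch_speed_controller(pitch_rate_setpoint, pitch_rate_measured, k=0.8):
    """Loop B5: pitch speed error -> thrust difference between the rotors."""
    return k * (pitch_rate_setpoint - pitch_rate_measured)

def rotor_thrust_difference(flow_setpoint, flow_vtrl, flow_drsl,
                            airspeed, pitch_rate):
    """Chain the three loops as in the cooperation described above."""
    airspeed_set = advance_controller(flow_setpoint, flow_vtrl, flow_drsl)
    pitch_rate_set = airspeed_controller(airspeed_set, airspeed)
    return pitch_speed_controller(pitch_rate_set, pitch_rate)
```

At equilibrium (measured flow sum equal to its setpoint, zero airspeed and pitch-rate errors) the commanded thrust difference is zero, which is the expected behavior of such a cascade.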
(40) We will now describe the new feedback loop B3 of the orientation of the detection zones ZV of the optical flow sensors 28.
(41) The basic idea of this feedback loop B3 is illustrated by
(42) In
(43) The applicant has discovered that the detection of discontinuities in the terrain, and thus the navigation of the microdrone 2, is substantially improved by rotating the detection head 22, and thus the detection zones ZV of the optical flow sensors, as a function of the direction, in the local plane of reference L, of the maximum optical flow detected by the optical flow sensors. Indeed, the maximum optical flow direction corresponds, in particular in the presence of the feedback loop B2 leading the microdrone 2 to follow the terrain T, to the direction in which the distance between the center E of the detection head 22 and the surface S of the terrain T is minimal. In other words, the maximum optical flow direction M coincides with the direction of the normal N to the surface S of the terrain T passing through the center E of the detection head 22. One could also say that the maximum optical flow direction is the direction in which one looks straight at the surface S of the terrain T being flown over.
(44) The maximum optical flow direction M is subsequently identified in the local plane of reference L by the angle {circumflex over (θ)}.sub.head/slope that it makes with the ascent axis Ze.
(45) According to the invention, the detection head 22 is rotated using the motor 32 such that the line of sight ZC of the detection zone ZV of the front ventral sensor 28.1 is separated by a determined reorientation angle δ from the maximum optical flow direction M. The reorientation angle δ is set such that the front ventral sensor 28.1 looks far enough forward when the microdrone 2 follows a terrain T. Preferably, the reorientation angle δ is between 0° and 90°, and particularly preferably, the reorientation angle δ is substantially equal to 23°.
(46) It is advantageous to choose the elevation angle φ (i.e., the fixed angle between the ascent axis Ze of the detection head 22 and the line of sight ZC of the detection zone ZV of the front ventral optical flow sensor 28.1) as the reorientation angle δ. Indeed, in this case, the feedback loop B3 keeps the axis of advance Xe of the detection head 22 parallel to the local slope P of a terrain T flown over by the microdrone 2.
(49) Mathematical Foundation
(50) We will now outline the mathematical foundation of the feedback loop B3 according to the invention.
(51) According to the article "Blur zone" by T. C. D. Whiteside and G. D. Samuel, published in the journal Nature, 225: 94-95 (1970), it is known that the optical flow ω varies with the orientation θ of the line of sight of each optical flow sensor 28 relative to the local plane of reference L of the detection head 22 according to the following equation:
(52) ω(θ)=(V/D(θ))·cos(θ−Ψ)  Equation 1
(53) Assuming that the optical flow sensor 28 is oriented downward, it is possible to demonstrate geometrically that the distance D(θ) depends on the angle α of the slope P of the terrain T flown over, the distance D.sub.surface, the angle θ.sub.head/slope, and the elevation angle φ:
(54) D(θ)=D.sub.surface/cos(θ−θ.sub.head/slope)  Equation 2
(55) Using equations 1 and 2, one deduces:
(56) ω(θ)=(V/D.sub.surface)·cos(θ−Ψ)·cos(θ−θ.sub.head/slope)  Equation 3
(57) Since we are looking for the direction of the maximum optical flow in order to determine θ.sub.head/slope, we differentiate equation 3:
(58) dω/dθ=−(V/D.sub.surface)·sin(2θ−Ψ−θ.sub.head/slope)  Equation 4
(59) The maximum of the cosine function is then obtained for:
(60) {circumflex over (θ)}.sub.head/slope=(θ.sub.head/slope+Ψ)/2  Equation 5
(61) As can be seen in equation 5, when the microdrone 2 is in motion, the maximum of the optical flow does not appear when the detection head is parallel to the followed surface, but at an angle that depends on the angle θ.sub.head/slope and the angle Ψ.
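This observation can be checked numerically under a reconstructed flow model of the form ω(θ) = (V/D)·cos(θ−Ψ)·cos(θ−θ_head/slope). The model, the symbol names and the numeric values below are assumptions made for illustration only:

```python
import math

# Numerical check: under the reconstructed cos*cos flow model, the flow
# maximum sits midway between psi (speed direction) and theta_hs (head/
# slope angle), not at the surface-parallel orientation. Values arbitrary.

def optic_flow(theta, v=1.0, d=1.0, psi=0.3, theta_hs=0.1):
    return (v / d) * math.cos(theta - psi) * math.cos(theta - theta_hs)

def argmax_flow(psi=0.3, theta_hs=0.1, steps=20001):
    """Brute-force scan of [-1, 1] rad for the maximum-flow direction."""
    best_t, best_w = -1.0, float("-inf")
    for i in range(steps):
        t = -1.0 + 2.0 * i / (steps - 1)
        w = optic_flow(t, psi=psi, theta_hs=theta_hs)
        if w > best_w:
            best_t, best_w = t, w
    return best_t
```

With psi = 0.3 and theta_hs = 0.1 the scan finds the maximum near 0.2 rad, the midpoint of the two angles, in agreement with the statement above.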
(62) Hypothesis: the speed vector V is always parallel to the surface followed:
Ψ=θ.sub.head/slope  Equation 6
(63) Indeed, the feedback loop B2 uses the optical flow measurements from the optical flow sensors 28 to maintain a constant ratio between the speed of advance of the microdrone 2 and the distance D.sub.surface from the surface S. This results in terrain following, and the speed vector therefore aligns with the followed surface.
(64) In this case, equation 5 becomes:
{circumflex over (θ)}.sub.head/slope=θ.sub.head/slope  Equation 7
(65) The estimate {circumflex over (θ)}.sub.head/slope can therefore be used to reorient the detection head 22 parallel to the followed surface, because the update θ.sub.EiR←θ.sub.EiR+{circumflex over (θ)}.sub.head/slope leads to θ.sub.head/slope=0.
(66) Near the position of the maximum optical flow, according to equation 3, the optical flow varies according to:
(67) ω(θ)≈(V/D.sub.surface)·cos.sup.2(θ−θ.sub.head/slope)  Equation 8
(68) Consequently, measurements should be taken with all of the optical flow sensors 28, which are separated from one another by known angles, and these measurements should be used to identify the angle θ.sub.head/slope.
(69) One looks for the coefficients
(70) V/D.sub.surface and θ.sub.head/slope
in the function:
(71) ω(θ)=(V/D.sub.surface)·cos.sup.2(θ−θ.sub.head/slope)
(72) which give the best approximation, within the meaning of the least squares method, of the optical flow measurements. This is easily implemented in the microcontroller 26 because it is possible to approximate the square cosine function by a second-degree polynomial using a Taylor series expansion in the vicinity of 0:
(73) ω(θ)≈(V/D.sub.surface)·(1−(θ−θ.sub.head/slope).sup.2)  Equation 9
(74) This is equivalent to:
ω(θ)≈a·θ.sup.2+b·θ+c  Equation 10
(75) X=[θ.sup.2, θ, 1] is defined for each measurement and, using the set of optical flow measurements ω, the coefficients [a, b, c] are determined with the least squares method:
[a,b,c]=inv(X.sup.T·X)·X.sup.T·ω  Equation 11
(76) In this expression, only ω depends on the optical flow measurements, while the rest is constant and depends only on the fixed orientation of each optical flow sensor in the local plane of reference L.
(77) A multiplication of matrices thus suffices to determine the coefficients [a, b, c], after which equations 9 and 10 can be used to determine the angle {circumflex over (θ)}.sub.head/slope:
{circumflex over (θ)}.sub.head/slope=−b/(2a)  Equation 12
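A dependency-free sketch of the computation of equations 10 to 12 (build the normal equations, solve for [a, b, c], take the vertex −b/(2a)) could look as follows. The 3×3 solver and the example gaze angles are illustrative, not the firmware of the microcontroller 26.

```python
# Sketch of Equations 10 to 12: fit the measured flows with a parabola
# a*theta^2 + b*theta + c via the normal equations, then take the vertex
# -b/(2a) as the head/slope estimate. Solver and names are illustrative.

def estimate_head_slope(thetas, flows):
    """Return ([a, b, c], -b/(2a)) from gaze angles and flow readings."""
    n = len(thetas)
    X = [[t * t, t, 1.0] for t in thetas]
    # Normal equations: (X^T X) coeffs = X^T flows   (cf. Equation 11)
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
         for r in range(3)]
    y = [sum(X[i][r] * flows[i] for i in range(n)) for r in range(3)]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        y[col], y[piv] = y[piv], y[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            y[r] -= f * y[col]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coeffs[r] = (y[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, 3))) / A[r][r]
    a, b, c = coeffs
    return coeffs, -b / (2.0 * a)   # cf. Equation 12
```

Flows sampled exactly on a parabola with vertex at 0.2 rad, for instance, return that vertex as the head/slope estimate.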
(78) During simulations, it was shown that, with noise added to the optical flow measurements, it is possible to arrive at erroneous estimates of {circumflex over (θ)}.sub.head/slope, which creates oscillations of the detection head 22 during the feedback.
(79) In order to eliminate the erroneous measurements, a confidence index is calculated:
(80) I.sub.conf=Σ.sub.i(ω.sub.i−(a·θ.sub.i.sup.2+b·θ.sub.i+c)).sup.2  Equation 13
(81) A decrease in the confidence index corresponds to a higher similarity between the optical flow measurements and the approximated square cosine function. The angle {circumflex over (θ)}.sub.head/slope is only validated if the confidence index is below a threshold.
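The confidence gating described above can be sketched as follows, assuming (as an illustration) that the confidence index is the sum of squared residuals of the quadratic fit; the threshold value is arbitrary.

```python
# Hedged sketch of the confidence gating: a residual-based confidence
# index and acceptance test for the head/slope estimate. The residual
# form and the threshold are assumptions, not the patented formula.

def confidence_index(thetas, flows, a, b, c):
    """Lower values mean the measurements better match the fitted
    (approximated square cosine) profile."""
    return sum((f - (a * t * t + b * t + c)) ** 2
               for t, f in zip(thetas, flows))

def validated_estimate(thetas, flows, a, b, c, threshold=0.05):
    """Return the angle -b/(2a) if the fit is trusted, else None."""
    if confidence_index(thetas, flows, a, b, c) < threshold:
        return -b / (2.0 * a)
    return None
```

Rejected estimates (None) would simply leave the detection head at its previous orientation, which is what suppresses the oscillations mentioned above.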
(82) Experimental Results
(84) The task of the two microdrones K1 and K2 was to navigate in a steep environment Q comprising a pointed relief H. It was noted that the microdrone K2 detected the pointed relief H late and therefore crashed against it. Using the detection zone reorientation method according to the invention, however, the microdrone K1 managed to negotiate the pointed relief H.
Second Embodiment
(86) In the second embodiment, the detection head 22 is attached to the main body 12 by a rod 40. No relative movement between the detection head 22 and the main body 12 is therefore possible. The detection head 22 is mounted on the main body 12 such that the axis of advance Xe is substantially parallel to the longitudinal axis X-X of the main body 12.
(87) The optical flow sensor 28 here is made up of a matrix 42 of 84×15 photodetectors covering a detection field of substantially 360°×60° around the drift axis Ye. Among the multitude of photodetectors, a subset is chosen with four groups G1 to G4 that cover detection zones ZV similar to those of the first embodiment according to
(88) The feedback loop B3 is installed in this second microdrone so as to select different subsets of photodetectors in real time to measure optical flows as a function of the estimated reorientation angle {circumflex over (θ)}.sub.head/slope.
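A hypothetical sketch of this real-time subset selection: four virtual sensors are formed by picking column windows of the 84×15 matrix, all shifted together by the estimated angle. The 5-column window width and the base azimuths (front/rear lines of sight at ±23° from the vertical axes) are assumptions made for illustration.

```python
# Illustrative subset selection for the second embodiment. Only the
# 84-column, 360 deg geometry comes from the text; window width and
# base azimuths are hypothetical.

N_COLS = 84
DEG_PER_COL = 360.0 / N_COLS   # angular pitch of one photodetector column

def sensor_columns(center_deg, width_cols=5):
    """Column indices (mod N_COLS) of one virtual sensor centered on
    the given azimuth in the head frame."""
    center = int(round(center_deg / DEG_PER_COL))
    half = width_cols // 2
    return [(center + k) % N_COLS for k in range(-half, half + 1)]

def select_groups(estimated_angle_deg):
    """Groups G1 to G4, all rotated together by the estimated angle."""
    bases = {"front_ventral": 23.0, "rear_ventral": -23.0,
             "front_dorsal": 157.0, "rear_dorsal": 203.0}
    return {name: sensor_columns(azimuth + estimated_angle_deg)
            for name, azimuth in bases.items()}
```

Because the selection is purely an indexing operation, the detection zones can be "rotated" at every control step without any moving part, which is the point of this embodiment.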
(89) Preferred Alternatives of the Feedback Loop B3 of the Orientation of the Detection Zones ZV of the Sensors