UAV POSITIONING SYSTEM AND METHOD FOR CONTROLLING THE POSITION OF AN UAV
20230069480 · 2023-03-02
Inventors
CPC classification
B64U2201/104
PERFORMING OPERATIONS; TRANSPORTING
G06T7/277
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
International classification
G05D1/10
PHYSICS
Abstract
The invention relates to a method of controlling a UAV along a predefined path and to a UAV positioning system, the system comprising: —at least one flexible positioning stripe (30) comprising markers (32) distributed along said positioning stripe (30) to form different configurations of patterns, each of said configurations of patterns defining a reference position along the flexible positioning stripe (30), wherein said positioning stripe (30) may be positioned along a predefined path, —a UAV (12), —a position estimation module (14) mounted on the UAV and comprising a camera (16) configured to capture images in real-time of said configurations of patterns along the flexible positioning stripe (30), and —a control unit (20) configured to control the velocity of the UAV. The position estimation module (14) is configured to position the UAV (12) above, below or next to the positioning stripe (30) and along said positioning stripe (30) based on successive configurations of patterns captured by the camera (16) of the position estimation module (14). The at least one flexible positioning stripe (30) comprises a controller (34) configured to dynamically control active markers (32) based on the velocity of the UAV (12) and the positions of said active markers (32) along the positioning stripe (30) in order to generate said successive configurations of patterns.
Claims
1. An Unmanned Aerial Vehicle (UAV) positioning system, comprising: at least one flexible positioning stripe comprising markers distributed along said positioning stripe to form different configurations of patterns, each of said configurations of patterns defining a reference position along the flexible positioning stripe, wherein said positioning stripe may be positioned along a predefined path, a UAV, a position estimation module mounted on the UAV and comprising a camera configured to capture images in real-time of said configurations of patterns along the flexible positioning stripe, and a control unit configured to control the velocity of the UAV, wherein the position estimation module is configured to position the UAV above, below or next to the positioning stripe and along said positioning stripe based on successive configurations of patterns captured by the camera of the position estimation module, and wherein said at least one positioning stripe comprises a controller configured to dynamically control active markers based on the velocity of the UAV and the positions of said active markers along the positioning stripe in order to generate said successive configurations of patterns.
2. The UAV positioning system according to claim 1, wherein said at least one positioning stripe is made of several removably coupled segments in order to provide a length-adjustable positioning stripe.
3. The UAV positioning system according to claim 1, wherein the UAV or the position estimation module further comprises an Inertial Measurement Unit (IMU).
4. The UAV positioning system according to claim 1, wherein said active markers are arranged along said at least one positioning stripe at constant intervals.
5. The UAV positioning system according to claim 1, wherein said markers are Light Emitting Diodes (LEDs).
6. The UAV positioning system according to claim 5, wherein said LEDs are Near-IR LEDs, preferably in the spectral range from 920 nm to 960 nm, and most preferably around 940 nm.
7. The UAV positioning system according to any preceding claim, wherein said at least one positioning stripe is flexible, preferably made of a PVC base material.
8. The UAV positioning system according to any preceding claim, comprising two flexible positioning stripes adapted to be arranged in parallel along said predefined path.
9. A method of controlling an Unmanned Aerial Vehicle (UAV) along a predefined path using a dynamically controlled positioning stripe, a position estimation module mounted on the UAV and a control unit configured to control the velocity of the UAV, wherein active markers are distributed along said positioning stripe, each marker being configured to be switched between an ON state and an OFF state to form different configurations of patterns, and wherein the position estimation module comprises a camera configured to capture images of portions of said positioning stripe, the method comprising the steps of: computing the ground truth position p.sub.i of the active markers in terms of world coordinates [x.sub.i.sup.W, y.sub.i.sup.W, z.sub.i.sup.W] in order to obtain a 3D representation of the markers; capturing successive images of portions of said positioning stripe by the camera of the position estimation module while the UAV is moving along the positioning stripe to obtain successive image planes each comprising a distinctive pattern formed by a set of detected active markers; measuring the 2D coordinates z.sub.i of each detected active marker in each image plane; assigning a unique label to each detected marker in each image plane as a function of its position relative to the other detected markers in said image plane; matching the 2D coordinates z.sub.i of each image plane to said 3D representation of the markers in order to track the camera pose over subsequent intervals of time; and controlling the orientation of the UAV as a function of the camera pose.
10. The method according to claim 9, wherein the UAV is positioned above the positioning stripe at a height which is either controlled manually by the remote-control unit, kept constant along the entire length of the positioning stripe, or set as a function of the x-y position of successive portions of the positioning stripe.
11. The method according to claim 9, wherein the pose of the camera is fine-tuned based on information about yaw, pitch and roll angles sent to the UAV by the control unit.
12. The method according to claim 9, wherein the UAV or the position estimation module comprises an Inertial Measurement Unit (IMU), the pose of the camera being fine-tuned based on the measurements of the IMU.
13. The method according to claim 9, wherein only the active markers of said positioning stripe in the vicinity of the UAV are controlled based on said image planes.
14. The method according to claim 9, wherein the UAV or the position estimation module comprises a GPS sensor, and wherein a unique binary pattern is located near an end portion of the positioning stripe to send information to the UAV in order to switch from the positioning stripe to a GPS navigation system.
15. The method according to claim 9, wherein the step of computing the ground truth position p.sub.i of the markers in terms of world coordinates [x.sub.i.sup.W, y.sub.i.sup.W, z.sub.i.sup.W] is achieved with a structure-from-motion (SfM) algorithm.
16. The method according to claim 9, wherein the ego-motion of the camera is estimated using a non-linear estimator, for example an extended Kalman filter, in order to add temporal dependency between successive images captured by the camera and camera poses (C.sub.1, C.sub.2, C.sub.3).
17. A vehicle positioning kit, for example for an Unmanned Aerial Vehicle (UAV), comprising: at least one flexible positioning stripe comprising markers distributed along said positioning stripe to form different configurations of patterns, each of said configurations of patterns defining a reference position along the positioning stripe, wherein said positioning stripe may be positioned along a predefined path, and a position estimation module adapted to be mounted on a vehicle and comprising a camera configured to capture images in real-time of said configurations of patterns along the flexible positioning stripe, wherein the position estimation module is configured to position the vehicle above and along the flexible positioning stripe based on successive configurations of patterns captured by the camera of the position estimation module.
18. The vehicle positioning kit according to claim 17, further comprising a remote-control unit configured to control the velocity of the vehicle.
19. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0044] The invention will be better understood with the aid of the description of several embodiments given by way of examples and illustrated by the figures, in which:
DETAILED DESCRIPTION OF SEVERAL EMBODIMENTS OF THE INVENTION
[0056] As shown in
[0057] In an embodiment, active markers 32 are evenly distributed over the entire length of a dynamically controlled positioning stripe 30. The set of markers 32 may preferably be in the form of LEDs with a fixed inter-LED distance. By having real-time control over the LEDs 32, a unique binary pattern may be generated. Unique binary patterns allow the position estimation module 14 to recognize specific patterns formed by different groups of LEDs 32 in order to locate itself relative to the flexible positioning stripe 30.
[0058] Specific patterns may also convey additional information to the position estimation module 14 through decoding of the patterns. For example, a unique binary pattern may be located near one end portion of the dynamically controlled positioning stripe 30 to send information to the UAV in order to switch from the positioning stripe to another type of navigation system such as GPS.
[0059] The light of each LED on the dynamically controlled positioning stripe 30 may be controlled based on its position on the stripe. If for example the LEDs at index positions 100-102 and 105-106 are turned on as shown in
[0060] The LEDs 32 may be attached for example to a PVC base material, which makes the positioning stripe 30 flexible and durable. The flexibility of the positioning stripe 30 makes it possible to create curved drone trajectories and makes the handling during setup very intuitive and easy. The positioning stripe 30 can also be attached to moving objects, walls, ceilings, etc. The LEDs may advantageously be silicone coated, which makes the positioning stripe 30 waterproof for outdoor applications.
[0061] In an advantageous embodiment, the flexible positioning stripe 30 comprises near-infrared LEDs emitting light at a wavelength ranging from 920 nm to 960 nm and preferably around 940 nm. Near-infrared LEDs are advantageously not visible to the human eye while increasing the signal-to-noise ratio on the detection side. This makes the positioning stripe 30 particularly robust for outdoor applications.
[0062] According to this embodiment, the camera sensor 16 is an infrared sensor. For efficient detection of the near-infrared LEDs 32, a band-pass filter in the same spectral range is used in order to increase the signal-to-noise ratio drastically, which results in an image of mostly dark background with blobs of higher intensity corresponding to the LEDs as shown in
[0063] The camera comprises a fisheye lens in order to increase the field of view for the detection of the near-infrared LEDs 32. This increases the possible dynamic range of the UAV 12 as well as the robustness of the position estimation itself. The camera sensor 16 may be of the type of a global shutter camera sensor to avoid distortions in the image when taken at high speed. The position estimation module 14 is configured to be powered directly by the UAV and includes a single-board computer
[0064] The (intensity weighted) centers of these bright blobs give the image coordinates of the corresponding LEDs 32. A single LED is however not distinguishable from the others in the image plane. However, by detecting a group of markers, the unique binary pattern to which the points belong may be recognized. A single pattern entity may for example be recognized by a distinct preamble, e.g. four subsequent LEDs 32 which are collectively turned on as shown in
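The intensity-weighted blob-center step can be sketched as follows. This is a minimal sketch, assuming a grayscale image with a mostly dark background, a simple global threshold, and `scipy.ndimage` for connected-component labelling; the threshold value and the helper library are illustrative assumptions, not the document's exact pipeline.

```python
import numpy as np
from scipy import ndimage  # assumed helper for connected-component labelling

def blob_centroids(img: np.ndarray, threshold: float = 200.0):
    """Intensity-weighted centers of bright blobs in a mostly dark image."""
    mask = img > threshold                  # keep only high-intensity pixels
    labels, n = ndimage.label(mask)         # one label per connected blob
    # center_of_mass weights each pixel by its intensity within a labelled blob
    centers = ndimage.center_of_mass(img, labels, list(range(1, n + 1)))
    # center_of_mass returns (row, col); convert to (x, y) image coordinates
    return [(c, r) for r, c in centers]
```

Each returned (x, y) pair is the image coordinate of one LED, to be grouped and matched against the known binary patterns.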
[0065] With the additional knowledge of the fixed inter-LED distance d
[0066] How the binary patterns are mapped onto the positioning stripe 30 is known from the specific configuration of the positioning stripe. By recognizing a group of LEDs as a distinct pattern, the position of each single LED 32 relative to the positioning stripe 30 may be determined. Additionally, the ground truth position of each LED in terms of world coordinates [x.sub.i.sup.W, y.sub.i.sup.W, z.sub.i.sup.W] is computed in a preliminary mapping step. As a result, not only the position of each LED 32 along the positioning stripe 30 is known, but also its actual spatial 3D information. From the set of LED positions, the shape of the positioning stripe 30 may be reconstructed with a spline interpolation.
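The spline reconstruction of the stripe shape can be sketched with SciPy's parametric spline routines. A minimal sketch, assuming the mapped LED world positions are given as an (N, 3) array; the smoothing factor and sample count are illustrative choices.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def reconstruct_stripe(led_positions: np.ndarray, n_samples: int = 200) -> np.ndarray:
    """Fit a parametric spline through mapped 3D LED positions and resample it.

    led_positions: (N, 3) array of world coordinates [x_W, y_W, z_W].
    Returns an (n_samples, 3) densified approximation of the stripe shape.
    """
    # s=0.0 -> interpolating spline passing through every LED position
    tck, _ = splprep(led_positions.T, s=0.0)
    u = np.linspace(0.0, 1.0, n_samples)
    return np.array(splev(u, tck)).T
```

The densified curve can then serve directly as the geometric reference for the flight path.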
[0067] With reference to
[0068] A variation of a structure-from-motion (SfM) algorithm makes the mapping more robust to the partially degenerate setting as shown in
[0069] In order to recover the full camera pose, the 2D coordinates z.sub.i of the detected LEDs in the image plane are matched to their corresponding LED indexes. Considering that the 3D world position p.sub.i of each LED index is known, the set of 2D-3D (z.sub.i, p.sub.i) correspondences is solved as a Perspective-n-Point (PnP) problem. The goal of this step is to estimate the full six-degree-of-freedom rigid body transformation from the positioning stripe 30 to the camera coordinate system.
[0070] Referring to
z.sub.i=f(R.sub.W.sup.C, t.sub.W.sup.C, p.sub.i, α)
where f is a non-linear measurement function, R.sub.W.sup.C and t.sub.W.sup.C are the unknown camera orientation and position respectively, and α is a set of camera calibration parameters that fit the fisheye camera model. Additionally, an extended Kalman filter (EKF) is used to track the camera pose [R.sub.W.sup.C|t.sub.W.sup.C] over subsequent time steps.
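The temporal-filtering idea can be illustrated with a minimal constant-velocity filter over the camera position only. This is a sketch under the assumption that the PnP position fix t_W^C serves as the measurement; since that measurement model is linear, this reduces to a plain Kalman filter, whereas a full EKF over the complete pose [R|t], as in the description, would additionally track orientation. Noise parameters are illustrative.

```python
import numpy as np

class ConstantVelocityFilter:
    """Adds temporal dependency between successive PnP position fixes.

    State: [x, y, z, vx, vy, vz]; measurement: camera position t_W^C from PnP.
    """

    def __init__(self, dt: float, q: float = 1e-2, r: float = 1e-2):
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                  # position integrates velocity
        self.Q = q * np.eye(6)                           # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.R = r * np.eye(3)                           # measurement noise (assumed)

    def step(self, z: np.ndarray) -> np.ndarray:
        # Predict with the constant-velocity motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the PnP position measurement z
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]    # filtered camera position estimate
```

Feeding successive PnP fixes through `step` smooths the pose track and bridges short detection dropouts via the prediction step.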
[0071] According to another embodiment, the UAV positioning system comprises two flexible positioning stripes 30a, 30b adapted to be arranged in parallel along a predefined path as shown in
[0072] As shown in
[0073] More particularly, as the position/velocity of the UAV and the positions of the LEDs are known, the LED which will be the closest to the image centre, hereafter the "middle LED", captured by the camera sensor 16 in the next time step may be determined. As each LED belongs to a group of LEDs, the group to which the middle LED belongs, hereafter the "middle group", may be determined. One or more groups of LEDs trailing the middle group and one or more groups of LEDs ahead of the middle group may then be selectively controlled.
[0074] For example, assuming that a group of LEDs comprises 10 LEDs and that LED 437 is the "middle LED" in the next time step, group 430-439 is the middle group and LEDs 400-469 are selectively turned on.
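The middle-group windowing above can be sketched directly. The group size of 10 and three groups on each side reproduce the 400-469 example; the number of trailing/leading groups is an illustrative parameter of this sketch.

```python
GROUP_SIZE = 10      # LEDs per pattern group (per the example above)
GROUPS_AROUND = 3    # trailing and leading groups kept lit around the middle group

def active_window(middle_led: int):
    """Inclusive LED index range to keep lit around the predicted middle LED."""
    group_start = (middle_led // GROUP_SIZE) * GROUP_SIZE     # start of middle group
    first = group_start - GROUPS_AROUND * GROUP_SIZE          # first trailing LED
    last = group_start + (GROUPS_AROUND + 1) * GROUP_SIZE - 1 # last leading LED
    return first, last
```

Only the LEDs inside this window need to be driven, so power and control bandwidth scale with the camera's field of view rather than the stripe length.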
[0075] In an advantageous embodiment, the positioning stripe 30 is made of several removably coupled segments in order to provide a length-adjustable positioning stripe. The positioning stripe 30 is therefore scalable with no theoretical upper bound on the positioning stripe length. The total length of the positioning stripe may therefore be adapted according to the application.
[0076] The positioning stripe 30 also serves as a user interface for controlling the flight path of the drone. The information about the 3D shape of the stripe may be used for planning the desired drone trajectory. The x-y dimension of the stripe is mapped one-to-one to the desired flight path, while the height (z-dimension) can be a function of x and y, resulting in z=f(x, y), controlled manually, z=f(u), or kept constant, z=C.
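The three height-control modes named above can be sketched as a single dispatch. A minimal illustration in which the mode names, the default constant C, and the manual setpoint u are assumptions made for the sketch.

```python
def target_height(mode, x, y, u=None, C=1.5, f=None):
    """Desired flight height z over the stripe, per the three modes described."""
    if mode == "constant":
        return C            # z = C: constant height over the whole stripe
    if mode == "manual":
        return u            # z = f(u): operator-controlled setpoint
    if mode == "mapped":
        return f(x, y)      # z = f(x, y): height as a function of stripe x-y
    raise ValueError(mode)
```

The "mapped" mode lets the stripe's planar layout drive a full 3D trajectory without any extra infrastructure.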
[0077] The invention is not limited to the above-described embodiments and may comprise alternatives within the scope of the appended claims. For example, active markers in the form of LEDs may be replaced by passive markers (e.g. reflective markers). Although passive markers would not offer the possibility to actively communicate with the UAV, they would still encode position information and therefore fulfil the main purpose of the positioning stripe, which is enabling self-localization of the UAV.
[0078] Without the ability to communicate through the positioning stripe, another communication channel may be used, such as a radio channel, in order to control the UAV interactively along the positioning stripe. Alternatively, a flight itinerary may be pre-programmed: for example, the UAV is instructed to fly to the end of the stripe, hover for a given period of time, and return to the start. In this case, no communication channel to the UAV is needed as all computations to fulfil the flight itinerary can be done onboard.
REFERENCE LIST
[0079] UAV positioning system 10
[0080] Unmanned Aerial Vehicle (UAV) 12
[0081] Position estimation module 14
[0082] Camera 16
[0083] Infrared sensor
[0084] Global shutter camera sensor
[0085] Fisheye lens
[0086] Inertial Measurement Unit (IMU) 18
[0087] Processing unit 19
[0088] Remote control unit 20
[0089] UAV velocity
[0090] Camera framing
[0091] Dynamically controlled positioning stripe 30
[0092] Flexible stripe
[0093] PVC
[0094] Active markers 32
[0095] LEDs
Water-resistant coating
Near-IR LEDs (˜940 nm)
[0096] Stripe controller 34