METHOD AND DEVICE FOR DOCKING AN UNMANNED AERIAL VEHICLE WITH A MOVEABLE DOCKING STATION
20200290753 · 2020-09-17
Inventors
- Edward David Nicholas ANASTASSACOS (Bedford, Bedfordshire, GB)
- Robin Francois Humbert GOJON (Bedford, Bedfordshire, GB)
- Ciaran Michael S. MCMASTER (Clyde, GB)
CPC classification
Y02T90/16
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
B64F1/18
PERFORMING OPERATIONS; TRANSPORTING
B64U10/14
PERFORMING OPERATIONS; TRANSPORTING
Y02T10/70
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
B60L53/66
PERFORMING OPERATIONS; TRANSPORTING
B64U80/25
PERFORMING OPERATIONS; TRANSPORTING
Y02T90/12
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y02T10/7072
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
International classification
B60L53/66
PERFORMING OPERATIONS; TRANSPORTING
B64F1/18
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A device for docking an unmanned vehicle with a moveable docking station is provided, the device comprising: an interface surface configured to cooperatively engage a counterpart interface surface of a counterpart device; at least one identifying means arranged to be detected by a counterpart device; and at least one sensor configured to detect at least one identifying means of a counterpart device. Also provided is a method of docking an unmanned vehicle with a moveable docking station, the method comprising: performing a cooperative docking procedure; wherein the cooperative docking procedure comprises each of the unmanned vehicle and docking station capturing the attitude and/or position of the other and, responsive to the captured attitude and/or position, adjusting the respective attitude and/or position of the unmanned vehicle and docking station to a mutually cooperative docking configuration so as to provide docking of the unmanned vehicle with the docking station.
Claims
1. A device for docking an unmanned vehicle with a moveable docking station, the device comprising: an interface surface configured to cooperatively engage a counterpart interface surface of a counterpart device; at least one identifying means arranged to be detected by a counterpart device; and at least one sensor configured to detect at least one identifying means of a counterpart device.
2. A device according to claim 1, wherein the device comprises a sensor for detecting the attitude of the docking device or a counterpart device.
3. A device according to claim 1 or 2, wherein the device interface surface or counterpart interface surface has a male connector configuration.
4. A device according to claim 3, wherein at least a portion of the device interface surface or counterpart interface surface has a convex surface.
5. A device according to claim 1 or 2, wherein the device interface surface or counterpart interface surface has a female connector configuration.
6. A device according to claim 5, wherein at least a portion of the device interface surface or counterpart interface surface has a concave surface.
7. A device according to any one of claims 1 to 6, wherein the identifying means comprises a visual marker.
8. A device according to any one of claims 1 to 7, wherein the at least one sensor comprises a camera.
9. A device according to claim 8, wherein the camera is an infrared camera.
10. A device according to any one of claims 1 to 9, wherein the identifying means comprises a plurality of visual markers and/or identifying signal emitters.
11. A device according to claim 10, comprising three visual markers and/or identifying signal emitters arranged in a triangular configuration.
12. A device according to any one of claims 1 to 11, wherein the sensor comprises a plurality of sensors, such as a plurality of image sensors.
13. A device according to any one of claims 1 to 12, further comprising a coupling to hold the unmanned vehicle in position on the docking station once docking has been achieved.
14. A device according to claim 13, wherein the coupling is electromechanical.
15. A device according to claim 13 or 14, wherein the coupling is electromagnetic, permanently magnetic, or semi-permanently magnetic.
16. A device according to any one of claims 13 to 15, wherein the device comprises one or more connectors for performing one or more of the following: refuelling or recharging the unmanned vehicle; powering the unmanned vehicle; and transferring data between the unmanned vehicle and the docking station.
17. A device according to any one of claims 1 to 16, further comprising a transmitter and a receiver for communicating with a counterpart device.
18. A method of docking an unmanned vehicle with a moveable docking station, the method comprising: performing a cooperative docking procedure; wherein the cooperative docking procedure comprises each of the unmanned vehicle and docking station capturing the attitude and/or position of the other and, responsive to the captured attitude and/or position, adjusting the respective attitude and/or position of the unmanned vehicle and docking station to a mutually cooperative docking configuration so as to provide docking of the unmanned vehicle with the docking station.
19. A method of docking an unmanned vehicle with a moveable docking station, using a device according to any one of claims 1-17, the method comprising: performing a cooperative docking procedure; wherein the cooperative docking procedure comprises each of the unmanned vehicle and docking station capturing the attitude and/or position of the other and, responsive to the captured attitude and/or position, adjusting the respective attitude and/or position of the unmanned vehicle and docking station to a mutually cooperative docking configuration so as to provide docking of the unmanned vehicle with the docking station.
20. A method according to claim 18 or 19, wherein capturing the attitude and/or position comprises at least one of the unmanned vehicle and docking station identifying at least one visual marker located on the other or receiving at least one identifying signal from the other and, responsive to the location of the at least one visual marker or to the identifying signal, calculating a relative position between the unmanned vehicle and docking station.
21. A method according to any one of claims 18 to 20, wherein capturing the attitude and/or position comprises at least one of the unmanned vehicle and docking station receiving an identifying signal and determining its attitude based on the identifying signal.
22. A method according to any one of claims 18 to 21, wherein capturing the attitude and/or position comprises the docking station identifying at least one visual marker on the unmanned vehicle, or the docking station receiving at least one identifying signal from the unmanned vehicle, and, responsive to the location of the at least one visual marker or to the identifying signal, calculating a relative position between the unmanned vehicle and docking station and determining an attitude.
23. A method according to any one of claims 18 to 22, wherein adjusting the respective attitude and/or position of the unmanned vehicle and docking station comprises: each of the unmanned vehicle and docking station calculating a respective vector in which to move themselves to reduce the relative position between the unmanned vehicle and docking station; and sending an actuation signal to at least one actuator of each of the unmanned vehicle and docking station respectively to move the unmanned vehicle and docking station along their respective calculated vectors to reduce the relative position between the unmanned vehicle and docking station.
24. A method according to any one of claims 21 to 23, wherein adjusting the respective attitude and/or position of the unmanned vehicle and docking station comprises: each of the unmanned vehicle and docking station calculating a manoeuvre based on the identifying signal to move themselves to a mutually corresponding attitude; and sending an actuation signal to at least one actuator of each of the unmanned vehicle and docking station respectively to perform the calculated manoeuvre to move themselves into the mutually corresponding attitude.
25. A method according to claim 24, wherein the mutually corresponding attitude comprises a horizontal attitude.
26. A method according to any one of claims 20 to 25, wherein identifying at least one visual marker or identifying signal comprises capturing an image or location of the at least one visual marker or identifying signal.
27. A method according to claim 26, wherein adjusting the respective attitude and/or position of the unmanned vehicle and docking station comprises: each of the unmanned vehicle and docking station calculating a respective vector in which to move themselves to move the at least one visual marker or identifying signal towards the centre of the image or captured location; and sending an actuation signal to at least one actuator of each of the unmanned vehicle and docking station respectively to move the unmanned vehicle and docking station along their respective calculated vectors to move the at least one visual marker or identifying signal towards the centre of the image or captured location.
28. A method according to claim 26 or 27, wherein adjusting the respective attitude and/or position of the unmanned vehicle and docking station comprises performing a manoeuvre such that the at least one visual marker is captured in the image in a certain orientation.
29. A method according to any one of claims 18 to 28, wherein capturing the attitude and/or position of the other is performed as part of an iterative loop.
30. A method according to claim 29, wherein the iterative loop iterates 200 times per second.
31. A method according to any one of claims 18 to 30, wherein the method comprises: identifying an available docking station; approaching the available docking station; sending a docking request; and receiving a signal from an available docking station.
32. A method according to any one of claims 18 to 31, the method further comprising engaging a coupling to hold the unmanned vehicle in position on the docking station once docking has been achieved.
33. A method according to claim 32, the method further comprising: determining when the unmanned vehicle and docking station are within a threshold proximity range and a threshold relative position range of each other such that docking can be successfully achieved; and engaging the coupling to complete the cooperative docking procedure.
34. A method according to any one of claims 18 to 33, the method further comprising performing a task selected from one or more of the following once the unmanned vehicle has docked with the docking station: refuelling or recharging the unmanned vehicle; transferring data between the unmanned vehicle and the docking station; diagnosing mechanical or electrical faults; carrying out maintenance on the unmanned vehicle; retrieving a payload which has been carried by the unmanned vehicle; and loading a new payload onto the unmanned vehicle.
35. A kit of parts comprising two or more devices according to any one of claims 1 to 17.
36. An unmanned vehicle comprising a device according to any one of claims 1 to 17.
37. An unmanned vehicle according to claim 36, wherein the unmanned vehicle comprises an unmanned aerial vehicle.
38. A moveable docking station comprising a device according to any one of claims 1 to 17.
39. A moveable docking station according to claim 38, wherein the moveable docking station comprises a robotic arm having an end effector, wherein the end effector comprises a device according to any one of claims 1 to 17.
40. A system comprising an unmanned vehicle according to claim 36 or 37 and a moveable docking station according to claim 38 or 39.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0061] One or more specific embodiments in accordance with aspects of the present invention will be described, by way of example only, and with reference to the following drawings in which:
DETAILED DESCRIPTION
[0077] A description of examples in accordance with the disclosure will now be provided with reference to the drawings and in which like reference numerals refer to like parts.
[0078] In the first example of the disclosure illustrated in
[0079] The moveable docking station shall be understood to be a docking station which may be manipulated to alter its position, attitude and/or orientation. Such manipulation may be by any means, and may be autonomous or manually controlled or any combination thereof. Optionally, manual control may be by an extraneous controller or signal. Optionally, autonomous control may be in response to an observed visual marker signal or received signal. In some options, an extraneous controller or signal may trigger a manipulation of the docking station, whereby the precise manipulation is then autonomously controlled. Movement may be achieved by the provision of one or more actuators. For example, movement may be achieved by the provision of robotic arm actuation, CNC-driven actuation, vehicle-driven actuation or the like.
[0080] Preferably, the movement may have multiple degrees of freedom. For example the manipulation may have three degrees of freedom, or more than three degrees of freedom. The degrees of freedom may include any combination of pitch, roll, yaw and movement in the X, Y and/or Z planes.
[0081] The controller 110 may be a desktop computer running an operating system, similar to Windows, tailored for applications with unmanned vehicles. The instructions are provided to the motors 106 by controller 110. The inputs of the controller 110 may be the information captured by the camera 114, and the outputs of the controller 110 are the speeds of the motors 106. Control of the speeds of motors 106 will change the position and relative position, allowing the unmanned vehicle to change its pitch, roll, and yaw (referred to hereinafter as attitude).
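[0081a] The mapping from camera observation to motor speeds described above can be sketched as follows. This is an illustrative example only, not the patent's implementation: the proportional gain, base speed, and quadrotor mixing arrangement are assumptions introduced for the sketch.

```python
# Illustrative sketch (not the patent's controller): map a visual
# marker's pixel offset, as seen by camera 114, to the speeds of four
# motors 106. The gain, base speed, and mixing layout are hypothetical.
def motor_speeds(offset_x, offset_y, base_speed=1000.0, kp=0.5):
    """Return four motor speeds that steer the vehicle toward the marker."""
    pitch_cmd = kp * offset_y   # pitch forward/back to centre the marker
    roll_cmd = kp * offset_x    # roll left/right to centre the marker
    # Simple quadrotor mixing: the front/rear pair reacts to pitch,
    # the left/right pair reacts to roll; total thrust stays constant.
    return [
        base_speed - pitch_cmd + roll_cmd,  # front-left
        base_speed - pitch_cmd - roll_cmd,  # front-right
        base_speed + pitch_cmd - roll_cmd,  # rear-right
        base_speed + pitch_cmd + roll_cmd,  # rear-left
    ]
```

Because the commands enter the four motors with opposite signs in pairs, the mean speed, and hence total thrust, is unchanged while the attitude is adjusted.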
[0082] In the first example of this disclosure illustrated in
[0084] An illustrative example of frames 200, captured by camera 114 responsive to the instructions provided by the controller 110 to motors 106, is shown in
[0085] The camera 114 then subsequently captures another frame 200 which depicts the second position of the unmanned vehicle 100 after the initial instructions provided by controller 110, which is shown in
[0086] The final frame 200 which is captured by camera 114 as shown in
[0088] Such an image captured with camera 302 is shown by frame 500 of
[0089] An illustrative example of frames 500, captured by camera 302 responsive to the instructions provided by the controller 304 to servos 308, is shown in
[0090] The camera 302 then subsequently captures another frame 500 which depicts the second position of the moveable docking station 300 after the initial instructions provided by controller 304, which is shown in
[0091] The final frame 500 which is captured by camera 302 as shown in
[0094] The complementary interface device 900 is illustratively presented in
[0096] During the docking procedure, a corresponding process is performed by interface device 900. The image sensor 906 also captures an image of the visual markers 802 which are situated on interface surface 801 of interface device 800. Based on the spatial relationship between the visual markers 802, interface device 900 calculates the position and attitude of the unmanned vehicle 100 to which interface device 800 is attached, and therefore the relative position between the two interface devices 800 and 900, as described in more detail below.
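[0096a] One way the spatial relationship between three triangularly arranged markers (as in claim 11) could yield a coarse relative position and roll estimate is sketched below. This is an assumption for illustration, not the patent's algorithm: the marker labels, image resolution, and triangle geometry are hypothetical.

```python
import math

# Illustrative sketch (not the patent's algorithm): recover a coarse
# relative position and roll angle from three visual markers 802
# arranged in a triangle, as seen in pixel coordinates by image
# sensor 906. Marker labels and image geometry are assumptions.
def relative_pose(apex, base_left, base_right, image_centre=(320, 240)):
    """Return (dx, dy, roll) from the pixel positions of three markers."""
    # Lateral/vertical offset: triangle centroid versus image centre.
    cx = (apex[0] + base_left[0] + base_right[0]) / 3.0
    cy = (apex[1] + base_left[1] + base_right[1]) / 3.0
    dx, dy = cx - image_centre[0], cy - image_centre[1]
    # Roll: tilt of the triangle's base edge relative to horizontal.
    roll = math.atan2(base_right[1] - base_left[1],
                      base_right[0] - base_left[0])
    return dx, dy, roll
```

A full six-degree-of-freedom solution would additionally use the apparent size and shear of the triangle to estimate range, pitch and yaw; the sketch shows only the in-plane components.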
[0098] Upon subsequent compensation of the observing body, a further image 1100 is captured by its image sensor as shown in
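[0098a] The capture-compensate-recapture cycle described above, run as an iterative loop (per claim 30, at around 200 iterations per second), can be sketched as follows. The sensor and actuator callables are hypothetical stand-ins introduced for the sketch, not elements of the patent.

```python
# Illustrative sketch (not the patent's implementation): iteratively
# nudge the observing body so the detected marker moves toward the
# centre of the captured image. read_offset and actuate are
# hypothetical callables standing in for the image sensor and actuators.
def centre_marker(read_offset, actuate, kp=0.2, tol=1.0, max_iters=2000):
    """Loop until the marker's offset from image centre is within tol pixels."""
    offset_x, offset_y = read_offset()
    for _ in range(max_iters):
        if abs(offset_x) < tol and abs(offset_y) < tol:
            return True                          # centred; docking may proceed
        actuate(-kp * offset_x, -kp * offset_y)  # move toward the marker
        offset_x, offset_y = read_offset()       # capture the next frame
    return False
```

With a proportional gain below one, each iteration shrinks the residual offset, so the marker converges toward the image centre over successive frames.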
[0101] The interface devices therefore maintain the unmanned vehicle 1200 and moveable docking station 1300 in alignment, as shown in
[0102] In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.
[0103] For example, instead of using computer vision to determine the attitude of the body being observed, the interface device of the observing body may comprise an inertial measurement unit which adjusts the attitude of the observing body to a predetermined mutually corresponding attitude, for example, a horizontal attitude. The image sensors therefore only need to correct for relative position, which reduces the amount of computing power required.
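[0103a] The inertial-measurement-unit variant above can be sketched as follows. This is an assumption for illustration: it estimates pitch and roll from a static accelerometer reading and returns the correction that drives the body to the horizontal attitude, leaving only relative position for the image sensors.

```python
import math

# Illustrative sketch (an assumption, not the patent's implementation):
# estimate pitch and roll from an inertial measurement unit's
# accelerometer reading (m/s^2, body frame) and return the correction
# needed to reach the predetermined horizontal attitude.
def level_correction(ax, ay, az):
    """Return (pitch_correction, roll_correction) in radians."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Driving both angles to zero yields the horizontal attitude, so the
    # image sensors need only correct relative position.
    return -pitch, -roll
```

In practice an accelerometer-only estimate is noisy under acceleration, so such a unit would typically fuse gyroscope and accelerometer data; the sketch shows only the static case.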
[0104] As used herein, any reference to "one embodiment" or "an embodiment" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" or the phrase "in an embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0105] As used herein, the terms "comprises", "comprising", "includes", "including", "has", "having" or any other variation thereof are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition "A or B" is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
[0106] In addition, use of "a" or "an" is employed to describe elements and components of the invention. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0107] The scope of the present disclosure includes any novel feature or combination of features disclosed herein either explicitly or implicitly or any generalisation thereof, irrespective of whether or not it relates to the claimed invention or mitigates any or all of the problems addressed by the present invention. The applicant hereby gives notice that new claims may be formulated to such features during prosecution of this application or of any further application derived therefrom. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims, and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the claims.