UNMANNED ASSET CONTROL SYSTEM
20250068168 · 2025-02-27
Inventors
CPC classification: G05D1/646 (PHYSICS); G05D1/69 (PHYSICS); G06F3/04845 (PHYSICS)
International classification: G05D1/223 (PHYSICS); G05D1/224 (PHYSICS); G05D1/646 (PHYSICS)
Abstract
A system for controlling operation of unmanned assets. The system includes a graphical user interface displaying a map. The graphical user interface accepts graphical inputs drawn on the map. The system also includes a calculation unit having an input interpretation module that recognizes the graphical inputs and translates the graphical inputs into tasks and/or commands. The calculation unit also includes an operation planning module that generates operation instructions based on the tasks and/or commands; and a journey planning module that generates a journey plan for the unmanned assets based on the operation instructions. The system also includes a communications module that communicates with the unmanned assets to instruct the unmanned assets to operate according to the generated journey plan.
Claims
1. A system for controlling operation of one or more unmanned assets, the system comprising: a graphical user interface displaying a map, wherein the graphical user interface is configured to accept one or more graphical inputs which are drawn on the map by an unmanned asset commander; a calculation unit comprising: an input interpretation module configured to: recognize the graphical inputs; and translate the graphical inputs into tasks and/or commands; an operation planning module configured to: generate operation instructions for one or more of the unmanned assets based at least on the tasks and/or commands; and a journey planning module configured to: generate a journey plan for one or more of the unmanned assets based at least on the operation instructions; and a communications module configured to communicate with the one or more unmanned assets to instruct the unmanned assets to operate according to the generated journey plan.
2. The system according to claim 1, wherein the operation planning module is configured to: obtain data indicative of external and/or environmental constraints; and generate operation instructions based on the data indicative of external and/or environmental constraints.
3. The system according to claim 2, wherein the data indicative of external and/or environmental constraints comprises data indicative of one or more of: terrain information, weather, obstacles, air traffic, restricted and hazardous areas, published navigation information.
4. The system according to claim 1, wherein the graphical user interface is configured to accept hand-drawn inputs.
5. The system according to claim 1, wherein the system is for controlling operation of one or more unmanned assets including one or more unmanned aerial vehicles (UAVs).
6. The system according to claim 1, wherein the system is for controlling operation of a plurality of unmanned assets comprising a plurality of different unmanned asset types.
7. The system according to claim 1, wherein the input interpretation module is configured to recognize the graphical inputs by: identifying the shapes of the graphical inputs; and identifying geographical locations corresponding to the portions of the map on which the graphical inputs were drawn.
8. The system according to claim 1, wherein the input interpretation module is configured to translate the graphical inputs by: comparing the graphical inputs to a library of known input symbols and their corresponding commands.
9. The system according to claim 8, wherein the input interpretation module comprises a local memory, wherein the library of known input symbols and corresponding commands is stored in the local memory, or wherein the library of known input symbols and corresponding commands is stored in a remote memory, and the input interpretation module is configured to: obtain the library of known input symbols and corresponding commands from the remote memory.
10. The system according to claim 1, wherein the operation planning module is configured to: receive position and/or status data from the one or more unmanned assets.
11. The system according to claim 1, wherein the operation planning module is configured to: allocate tasks associated with graphical inputs to different ones of a plurality of unmanned assets.
12. The system according to claim 11, wherein the operation planning module is configured to: receive position and/or status data from the one or more unmanned assets; and wherein the allocation of tasks is based at least partially on the position and/or status data.
13. The system according to claim 1, wherein the graphical inputs correspond to commands including one or more of: take off from this location, land in this location, ditch in this location in case of emergency, rendezvous at this location at this time, deliver payload to this location, pick up payload from this location, inspect (one time) this portion of highway/border/railway/powerline, patrol (several times) this portion of highway/border/railway/powerline from this location, patrol/scan this area, avoid flying over this area, loiter in this location, monitor this point of interest in orbit, monitor this point of interest in a figure of eight, search this area for targets, search along this border for targets, gather at this location, and split from this location.
14. A system comprising: one or more unmanned assets; and a system for controlling one or more unmanned assets according to claim 1.
15. A method for controlling operation of one or more unmanned assets, the method comprising: displaying a map on a graphical user interface; accepting, via the graphical user interface, one or more graphical inputs which are drawn on the map by an unmanned asset commander; recognizing the graphical inputs; translating the inputs into tasks and/or commands; generating operation instructions for one or more of the unmanned assets based at least on the tasks and/or commands; generating a journey plan for one or more of the unmanned assets based at least on the operation instructions; and communicating with the one or more unmanned assets to instruct the unmanned assets to operate according to the generated journey plan(s).
Description
BRIEF DESCRIPTION OF DRAWINGS
[0059] One or more non-limiting examples will now be described, by way of example only, and with reference to the accompanying figures in which:
DETAILED DESCRIPTION
[0066] The examples described below will be understood to be exemplary only. It will be understood that, where used herein, terms such as up and down refer to directions as viewed in the reference frame of the appended Figures.
[0069] Operation of the system 1 will now be explained with reference to the flowchart of
[0071] At step 23, the graphical user interface 3 accepts one or more graphical inputs from the commander. In the illustrated example, the graphical user interface 3 is a touchscreen, and so the commander provides their graphical inputs by drawing on the touchscreen with a finger or a stylus. In some examples, step 23 may also comprise accepting a further input from the commander confirming that they have finished providing graphical inputs and that the system 1 should proceed to analysing the graphical inputs.
[0072] At step 25, the input interpretation module 7 analyses the graphical inputs, and translates the graphical inputs into tasks and/or commands. This analysis comprises three steps: [0073] a) recognize the shape which has been drawn; [0074] b) determine the location or locations which correspond to the position on the map at which the graphical input has been drawn; [0075] c) compare the recognized shape to a library of known shapes and, if the shape matches a known shape, output the task and/or command which is associated with that shape to be performed at the location or locations which were determined at step (b).
[0076] It will be understood that, in examples, the order of steps (b) and (c) may be reversed.
[0077] In examples, the input interpretation module 7 may be configured to output an error message, for example to be displayed to a commander (e.g. using the graphical user interface) if at step (a) the shape is not recognized, or at step (c) the recognized shape does not match a known shape.
[0078] The shape recognition from step (a) can use any known shape recognition tool, for example, Adobe's shaper tool. Example shapes, and their associated tasks and/or commands, are discussed below in relation to
[0079] It will be understood that the recognition step (a) comprises recognising the shape which has been drawn, without yet processing its significance, e.g. recognising that the commander has drawn a circle with a cross through it. It is at step (c) that the significance of this shape is determined. There may be scenarios where a shape is recognized at step (a) by the shape recognition tool, but the shape does not correspond to a known shape which has an associated task or command. For example, if the commander drew an octagon, the shape recognition tool may recognize (at step (a)) that an octagon had been drawn, but at step (c) no task and/or command is found which is associated with an octagon, and so an error message may be output at step (c).
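By way of a non-limiting illustration only (none of the following names appear in the disclosure, and a real implementation may differ), the three-step interpretation described above could be organised along the following lines in Python, with a placeholder shape recogniser standing in for a tool such as the shaper tool mentioned above:

from dataclasses import dataclass

@dataclass
class Task:
    command: str       # e.g. "patrol_area"
    locations: list    # geographical points covered by the drawing

# Hypothetical library of known input symbols and their associated commands.
SYMBOL_LIBRARY = {
    "circle_with_cross": "land_here",
    "zigzag": "patrol_area",
}

class InputInterpretationModule:
    def __init__(self, shape_recogniser, to_geographic):
        self.recognise_shape = shape_recogniser  # step (a): any known shape recognition tool
        self.to_geographic = to_geographic       # screen coordinates -> geographical location

    def interpret(self, stroke_points):
        # (a) recognise the drawn shape, without yet processing its significance
        shape = self.recognise_shape(stroke_points)
        if shape is None:
            raise ValueError("shape not recognised")  # error message to the commander
        # (b) determine the locations corresponding to where the input was drawn on the map
        locations = [self.to_geographic(x, y) for (x, y) in stroke_points]
        # (c) compare the recognised shape to the library of known shapes
        command = SYMBOL_LIBRARY.get(shape)
        if command is None:
            raise ValueError(f"no task or command associated with shape '{shape}'")
        return Task(command=command, locations=locations)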
[0080] The tasks and/or commands are then forwarded by the input interpretation module 7 to the operation planning module 9.
[0081] At step 27, external and environmental constraint data is obtained by the operation planning module 9. This data may comprise a plurality of different data types which may be relevant. In some examples, the external and environmental constraint data comprises terrain elevation data, weather data, obstacle data, and air traffic data. This data can be obtained in any suitable and desired way, for example via a ground network and/or via wireless communications. In some examples, the terrain elevation data may be stored in local memory.
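As a purely illustrative sketch (the field names and data sources here are assumptions, not part of the disclosure), the constraint data gathered at step 27 might be collected into a single structure such as:

from dataclasses import dataclass, field

@dataclass
class ConstraintData:
    terrain_elevation: dict = field(default_factory=dict)  # may be held in local memory
    weather: dict = field(default_factory=dict)            # e.g. obtained via a ground network
    obstacles: list = field(default_factory=list)
    air_traffic: list = field(default_factory=list)
    restricted_areas: list = field(default_factory=list)

def gather_constraints(terrain_store, weather_service, traffic_feed):
    # Assemble whichever constraint data sources are available to the operation planning module.
    return ConstraintData(
        terrain_elevation=terrain_store.load(),
        weather=weather_service.current(),
        air_traffic=traffic_feed.poll(),
    )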
[0082] At step 29, unmanned asset position and status data is obtained. The communication module 17 of each unmanned asset is configured to send position and status data to the communication module 13 of the control system 1. This data is then forwarded by the system's communication module 13 to the operation planning module 9. The unmanned asset status data may comprise details such as availability, health status, charge/fuel level, payload status, etc.
[0083] Obtaining unmanned asset position and status data may be performed by means other than the communication module 17 of each asset. For example, the system may be configured to interface with an unmanned asset tracking database which contains the current position and status of all unmanned assets in a fleet. The information in the unmanned asset tracking database may come from the unmanned assets themselves, or the position of unmanned assets may be determined using a system external from the unmanned assets, e.g. a GPS or RADAR tracking system. In another example, the information in the unmanned asset tracking database may come from logs. For example, it may be determined, from a usage log, that a fleet of unmanned assets are currently available for use and being stored at a base or in a hangar.
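Again as an illustration only (the record layout is an assumption), a position and status report of the kind forwarded to the operation planning module might look like the following, whether it comes from the asset itself or from a tracking database; the values are placeholders that mirror the worked example later in the description:

from dataclasses import dataclass

@dataclass
class AssetStatus:
    asset_id: str
    position: tuple           # latitude, longitude, altitude
    available: bool
    health_ok: bool
    charge_or_fuel_pct: float
    payload_onboard: bool
    capabilities: set         # names of the commands the asset can carry out

# Placeholder fleet mirroring assets A and B from the worked example below.
fleet = [
    AssetStatus("A", (0.0, 0.0, 0.0), True, True, 95.0, False, {"patrol_area"}),
    AssetStatus("B", (0.0, 0.0, 0.0), True, True, 90.0, True, {"deliver_payload"}),
]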
[0084] At step 31, the operation planning module 9 generates operation instructions based on the tasks and/or commands, the external and environmental constraint data, and the unmanned asset position and status data. The operation instructions may be considered macro-operations which outline a mission strategy and a sequence of operations to be performed by the one or more unmanned assets 15. The operation planning module is configured to sequence and allocate (e.g. optimally sequence and allocate) the tasks and/or commands to the one or more unmanned assets. As such, it may not be necessary for the commander to sketch the mission in the exact order in which the operations will be executed. The commander may start from the main objective (e.g. search that area), and then define where to take off from, where to land, no-fly zones, emergency landing zones, etc. As such, once step 31 is complete, a set of operation instructions has been generated for each of the unmanned assets which has been allocated any tasks and/or commands.
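To illustrate the sequencing and allocation idea (and only to illustrate it: the greedy matching below is not the disclosed planning method, and it reuses the hypothetical Task and AssetStatus sketches above), operation instructions could be assembled along these lines:

def plan_operations(tasks, fleet, takeoff_location, landing_location):
    # Start every available asset's instruction list with a take-off macro-operation.
    instructions = {a.asset_id: [f"take off from {takeoff_location}"]
                    for a in fleet if a.available}
    for task in tasks:
        # Allocate each task to the first available asset able to perform it.
        for asset in fleet:
            if asset.available and task.command in asset.capabilities:
                instructions[asset.asset_id].append(f"{task.command} at {task.locations}")
                break
    # Close every sequence with a landing macro-operation.
    for sequence in instructions.values():
        sequence.append(f"land at {landing_location}")
    return instructions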
[0085] At step 33, the operation instructions are passed to the journey planning module 11. The journey planning module 11 uses the macro-operations outlined by the operation instructions to generate specific journey plans for one or more of the individual unmanned assets 15. These journey plans include sufficient detail for an unmanned asset to carry out its assigned mission, e.g. take-off location, exact route, altitude and groundspeed instructions, and payload commands.
[0086] At step 35, the communication module 13 sends the respective journey plans to the one or more unmanned assets 15 via the communications modules 17 of the unmanned assets 15. The guidance, navigation, control, and payload manager 19 of the or each unmanned asset 15 can then operate the unmanned asset 15 to follow the assigned journey plan.
[0087] The method 20 will now be explained in relation to an example.
[0088] At step 21, a map is displayed on a graphical user interface, and at step 23, the unmanned asset commander draws graphical inputs corresponding to tasks. At step 25, the inputs are analysed and translated into a list of tasks and commands: [0089] patrol this area, [0090] deliver a payload at this location, [0091] take off from this location, [0092] land at this location.
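Expressed with the hypothetical Task structure sketched earlier (locations shown as placeholders, since the actual coordinates come from where the commander drew on the map), this list might be represented as:

example_tasks = [
    Task(command="patrol_area",     locations=[(0.0, 0.0)]),  # patrol this area
    Task(command="deliver_payload", locations=[(0.0, 0.0)]),  # deliver a payload at this location
    Task(command="take_off",        locations=[(0.0, 0.0)]),  # take off from this location
    Task(command="land",            locations=[(0.0, 0.0)]),  # land at this location
]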
[0093] At steps 27 and 29, external and environmental constraint data and unmanned asset position and status data are obtained.
[0094] At step 31, operation instructions are generated based on the list of tasks and commands, the external and environmental constraint data, and the unmanned asset position and status data. The operation instructions are specific to each unmanned asset and result from the sequencing and allocation of the tasks and/or commands. In this example, the unmanned asset position and status data indicates that there are two unmanned assets, A and B, available at the instructed take-off location, and that unmanned asset A has surveillance capabilities and unmanned asset B has payload capabilities. As such, the operation instructions are as follows:
[0095] Asset A operation instructions: [0096] 1. take off from the specified location [0097] 2. patrol the specified area [0098] 3. land at the specified landing location
[0099] Asset B operation instructions: [0100] 1. take off from the specified location [0101] 2. deliver the payload at the specified delivery location [0102] 3. land at the specified landing location
[0103] At step 33, a journey plan is generated for unmanned asset A and unmanned asset B, based on their respective operation instructions. The journey plan contains detailed instructions for the journey in a format which the assets understand. As such, the journey plan instructions are as follows:
[0104] Asset A journey plan: [0105] 1. take off from the specified location [0106] 2. climb to an altitude of 500 m [0107] 3. fly on heading 40 degrees for 10 km [0108] 4. fly in a zig-zag pattern over these coordinates at an altitude of 500 m for 30 minutes [0109] 5. fly on heading 160 degrees for 15 km [0110] 6. land at the specified landing location
[0111] Asset B journey plan: [0112] 1. take off from the specified location [0113] 2. climb to an altitude of 300 m [0114] 3. fly on heading 160 degrees for 15 km [0115] 4. descend to an altitude of 100 m and drop the payload at these coordinates [0116] 5. fly on heading 40 degrees for 10 km [0117] 6. land at the specified landing location
[0118] At step 35, assets A and B are sent their journey plans.
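As one possible (and purely hypothetical) wire representation, Asset B's journey plan could be encoded as structured commands and handed to the control system's communication module 13 for transmission; the message format and send() interface below are assumptions rather than a disclosed protocol, and the delivery coordinates are left as a placeholder:

import json

asset_b_plan = [
    {"cmd": "take_off"},
    {"cmd": "climb", "altitude_m": 300},
    {"cmd": "fly_heading", "heading_deg": 160, "distance_km": 15},
    {"cmd": "descend_and_drop", "altitude_m": 100, "coords": None},  # coordinates taken from the graphical input
    {"cmd": "fly_heading", "heading_deg": 40, "distance_km": 10},
    {"cmd": "land"},
]

def send_journey_plan(comms_link, asset_id, plan):
    # Serialise the plan and pass it to the communication module for transmission to the asset.
    comms_link.send(asset_id, json.dumps({"asset": asset_id, "plan": plan}))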
[0132] An operation may be interrupted by an unmanned asset 15, for example in a search and rescue mission where the instruction is to scan a certain area until the missing person is located. Upon locating the person, the unmanned asset 15 would then interrupt its own scanning operation. In the defined vocabulary, commander-interruptible operations are denoted by an upwardly pointing triangle, whilst unmanned asset 15 interruptible operations are denoted by a downwardly pointing triangle.
[0136] It will be understood that for operations to be interrupted by the commander, the system may provide functionality for the commander to provide dynamic instructions such that the calculation unit 5 can make alterations to the journey plan of an unmanned asset 15 even after the asset has been deployed.
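One way (among many, and purely as a sketch under assumed interfaces) that a commander-interruptible operation could be handled on the asset side is to check for a dynamic instruction before each step of the journey plan and, if one has arrived, to switch to a newly generated plan, for example towards a rendezvous point:

def execute_plan(asset, plan, check_for_interrupt, replan):
    for step in plan:
        if check_for_interrupt(asset):
            # A dynamic instruction has arrived: abandon the remaining steps and
            # follow the altered journey plan generated by the calculation unit 5.
            asset.follow(replan(asset))
            return
        asset.execute(step)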
[0145] Input 51 corresponds to the symbol of
[0146] In the case of the unmanned assets 15 assigned to perform the interruptible operations associated with inputs 52 and 53, they will continue to monitor the points of interest until the commander interrupts them, at which point they will move to the rendezvous point at the location of input 56. Meanwhile, the unmanned asset 15 assigned to deliver the payload at the location of input 55 will drop its payload and then proceed directly to the rendezvous point. Likewise, the unmanned asset 15 assigned to perform the patrol according to input 54 will do so, and then proceed to the rendezvous point. Depending on the unmanned asset 15 position and status data, the calculation unit may determine that the same unmanned asset 15 should perform the tasks associated with inputs 55 and 54 since, as can be seen from
[0147] Whilst the above described examples are primarily concerned with unmanned aerial vehicles (UAVs), the unmanned assets 15 being controlled could comprise any unmanned assets 15, e.g. unmanned ground vehicles (UGVs), unmanned surface vehicles (USVs) for operation on the surface of water, and unmanned underwater vehicles (UUVs). Further, the disclosed systems and methods could be applied to fleets of unmanned assets 15 comprising a plurality of different types of unmanned assets 15.
[0149] It will be understood that although the above examples are mainly focussed on the case of UAVs, the present disclosure is applicable to the control of many types of unmanned assets.
[0150] It will be seen that the system of the present disclosure has the potential to reduce the level of detail which needs to be input by an unmanned asset commander in order to effectively command one or more unmanned assets. As such, unmanned-manned teamwork can be provided more seamlessly, since the pilot or driver of a manned asset can command their unmanned teammates whilst still having the capacity to operate their own vehicle.