Controller for an unmanned aerial vehicle
11573565 · 2023-02-07
CPC classification
B64U2201/104
PERFORMING OPERATIONS; TRANSPORTING
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
B64U2101/00
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0033
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0094
PHYSICS
International classification
G05D1/00
PHYSICS
G05D1/10
PHYSICS
Abstract
A controller for an unmanned aerial vehicle (UAV) comprising an image capture means, the controller comprising: inputs arranged to receive positional data relating to the UAV, a vehicle and a user device, and image data captured by the image capture means; a processor arranged to process the received positional data to determine the relative locations of the UAV, vehicle and user device; and an output arranged to output a control signal for controlling the UAV and to output an image signal comprising captured image data; wherein the processor is arranged to generate the control signal for the UAV such that the image data captured by the image capture means comprises at least an image of an obscured portion of the vehicle that is obscured from a field of view of a user of the user device.
Claims
1. A controller for an unmanned aerial vehicle (UAV) comprising an image capture means, the controller comprising: inputs arranged to receive: positional data relating to the UAV, a vehicle and a user device; and image data captured by the image capture means; a processor arranged to process the received positional data to determine relative locations of the UAV, vehicle and user device; and an output arranged to output a control signal for controlling the UAV and to output an image signal comprising captured image data; wherein the processor is arranged to: determine obscured portions of the vehicle that are obscured from a direct line of sight from a user of the user device based at least in part on the determined relative location of the vehicle and the user device; and generate the control signal for the UAV such that the image data captured by the image capture means comprises at least an image of the determined obscured portions of the vehicle.
2. A method of controlling an unmanned aerial vehicle (UAV) comprising an image capture means, the method comprising: receiving positional data relating to the UAV, a vehicle and a user device, and image data captured by the image capture means; processing, at a processor, the received positional data to determine the relative locations of the UAV, vehicle and user device; outputting a control signal for controlling the UAV and outputting an image signal comprising captured image data; and determining obscured portions of the vehicle that are obscured from a direct line of sight from a user of the user device based at least in part on the determined relative location of the vehicle and the user device; wherein the processor generates the control signal for the UAV such that the image data captured by the image capture means comprises at least an image of the determined obscured portions of the vehicle.
3. The controller as claimed in claim 1, wherein UAV positional data comprises data from an inertial navigation system on the UAV.
4. The controller as claimed in claim 3, wherein the processor is arranged to use positional data from the UAV inertial navigation system to correct GPS positioning errors.
5. The controller as claimed in claim 1, wherein positional data comprises time of flight measurement data between one or more of: the UAV and the vehicle; the UAV and the user device; and the vehicle and the user device.
6. The controller as claimed in claim 5, wherein the inputs are arranged to receive vehicle sensor data and the processor is arranged to determine relative locations of the UAV, vehicle and user or user device from the vehicle sensor data and time of flight measurement data.
7. The controller as claimed in claim 1, wherein the positional data received at the inputs comprises data from a vehicle entry system.
8. The controller as claimed in claim 1, wherein the processor is arranged to use an image recognition algorithm to determine the relative location of the vehicle user and vehicle from image data received from the image capture means.
9. The controller as claimed in claim 1, wherein the processor is arranged to generate a control signal to control the UAV position such that the image capture means is directed toward at least some of the obscured portions of the vehicle that are obscured from the line of sight of the vehicle user.
10. The controller as claimed in claim 1, wherein the processor is arranged to generate a control signal that changes the orientation of the image capture means relative to the UAV in order to direct the image capture means toward at least some of the obscured portions of the vehicle that are obscured from the line of sight of the vehicle user.
11. The controller as claimed in claim 1, wherein the processor is arranged to generate a control signal to control the position of the UAV.
12. The controller as claimed in claim 1, wherein the inputs are arranged to receive vehicle sensor data relating to the proximity of the vehicle to an object and the processor is arranged to generate a control signal for the UAV such that the image data captured by the image capture means comprises a portion of the vehicle in proximity to the object.
13. The controller as claimed in claim 1, wherein image data is output to the user device for display on a display screen of the user device.
14. The controller as claimed in claim 1, wherein the processor is arranged to generate a driving control signal for manoeuvring the vehicle and the output is arranged to output the driving control signal to the vehicle.
15. An unmanned aerial vehicle comprising the controller as claimed in claim 1.
16. A vehicle comprising the controller as claimed in claim 1.
17. A remote control device for remotely controlling a vehicle comprising the controller as claimed in claim 1.
18. A non-transitory computer-readable medium having stored thereon instructions that, when executed by a processing device, cause the processing device to carry out the method of claim 2.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings.
DETAILED DESCRIPTION
(12) The drone 60 comprises an embedded image capture means such as a camera (not shown).
(13) As described below, the drone may be controlled to fly around the vehicle 40 and to maintain a hovering position generally opposite the vehicle user's position, such that the embedded camera can provide images (to the device 70) of the portions of the vehicle 40 that the driver cannot see because the vehicle is obstructing them.
(14) A line of sight 72 of the vehicle user 50 is shown in the accompanying figures.
(16) The processor is arranged to generate the drone control signal such that the image data captured by the image capture means comprises at least some of the obscured portions of the vehicle 40 that are obscured from the line of sight 72/field of view 73 of the vehicle user 50.
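By way of illustration only, the positioning logic of paragraphs (13) and (16) can be sketched in a few lines of Python. This is a minimal sketch under a flat-ground, two-dimensional approximation; the function and parameter names are illustrative assumptions and do not appear in the patent.

```python
import math

def opposite_hover_position(user_xy, vehicle_xy, standoff_m):
    """Return a 2D hover target on the far side of the vehicle from the user.

    The drone is placed on the line running from the user through the
    vehicle centre, 'standoff_m' metres beyond the vehicle, so that its
    camera faces the portions of the vehicle obscured from the user's
    line of sight 72 / field of view 73.
    """
    ux, uy = user_xy
    vx, vy = vehicle_xy
    dx, dy = vx - ux, vy - uy
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        raise ValueError("user and vehicle positions coincide")
    nx, ny = dx / dist, dy / dist      # unit vector: user -> vehicle
    return (vx + nx * standoff_m, vy + ny * standoff_m)

# Example: user 5 m south of the vehicle; drone hovers 4 m north of it.
print(opposite_hover_position((0.0, -5.0), (0.0, 0.0), 4.0))  # (0.0, 4.0)
```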
(17) It is noted that the image capture device may be fixed relative to the drone 60, in which case the control signal for the drone 60 may comprise a flight-related control signal only. Alternatively, the image capture device may be moveable relative to the drone 60, in which case the control signal for the drone 60 may comprise a directional control signal for the image capture device in addition to a flight-related control signal.
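For the moveable image capture device case, the directional component of the control signal could be derived from the relative geometry of the drone and the vehicle. The following is a sketch only, assuming a simple pan/tilt gimbal whose frame is aligned with the world frame; none of these names come from the patent.

```python
import math

def gimbal_angles(drone_pos, target_pos):
    """Pan and tilt (radians) that point a drone-mounted camera at a target.

    Positions are (x, y, z) in a shared world frame; pan is measured in
    the horizontal plane and tilt downwards from horizontal. Any drone
    yaw offset is ignored for simplicity.
    """
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    pan = math.atan2(dy, dx)
    tilt = math.atan2(-dz, math.hypot(dx, dy))  # positive tilt looks down
    return pan, tilt

# Drone hovering 3 m up, 4 m behind the vehicle's rear bumper:
pan, tilt = gimbal_angles((4.0, 0.0, 3.0), (0.0, 0.0, 0.5))
print(math.degrees(pan), math.degrees(tilt))  # 180.0, ~32.0
```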
(18) The controller 80 is a computing device that can comprise a control unit or computational device having one or more electronic processors (e.g., a microprocessor, a microcontroller, an application specific integrated circuit (ASIC), etc.), and may comprise a single device or multiple devices that collectively operate as the controller 80. The term “controller,” “control unit,” or “computational device” may include a single controller, control unit, or computational device, or a plurality of controllers, control units, or computational devices collectively operating to provide the required control functionality. In some embodiments a set of instructions is provided which, when executed, causes the controller to implement the control techniques mentioned in this description (including some or all of the functionality required for the described method). The set of instructions could be embedded in one or more electronic processors of the computing device; alternatively, the set of instructions could be provided as software to be executed in the computing device. Given this description, those skilled in the art will realize what type of hardware, software, firmware, or a combination of these will best suit their particular needs. It is noted that in embodiments of the present invention the controller 80 may be located within the vehicle, the drone or the user device. For example, the controller may be located in the user device, receive inputs from the drone and the vehicle, and then generate control signals to be sent to the drone. Alternatively, the controller 80 might be located within the vehicle (or the drone) and receive inputs from the user device (the user device in this instance essentially operating as a remote peripheral device to the controller).
(20) In step 90, positional and image data are received at the inputs 82 of the controller. In step 92, the processor 84 determines the relative positions of the user 50, vehicle 40 and drone 60. In step 94, the processor generates a control signal for the drone 60 commanding it to take up a position such that the vehicle 40 is between the drone 60 and the user 50. In step 96, the control signal is output to the drone 60 and image data is output to the user device 70 via the outputs 86.
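Steps 90 to 96 amount to a sense-plan-act loop. The following sketch shows one way a single cycle of that loop could be organised; every method name here is a hypothetical stand-in for the behaviour of the inputs 82, processor 84 and outputs 86, not an interface defined by the patent.

```python
def control_cycle(inputs, processor, outputs):
    # Step 90: positional and image data arrive at the inputs 82.
    positions = inputs.read_positional_data()   # UAV, vehicle, user device
    frame = inputs.read_image_data()            # from the drone camera

    # Step 92: the processor 84 determines relative positions.
    rel = processor.relative_positions(positions)

    # Step 94: command the drone so the vehicle lies between it and the user.
    target = processor.opposite_hover_position(
        rel.user_xy, rel.vehicle_xy, standoff_m=4.0)
    command = processor.make_control_signal(target)

    # Step 96: control signal to the drone, image data to the user device.
    outputs.send_drone_command(command)
    outputs.send_image_to_user_device(frame)
```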
(23) It is noted that the vehicle 40 may additionally provide image data from image capture means 106 to the inputs 82 of the controller 80. The controller may output the image data from the image capture means 106 to the user device 70, or may additionally provide the user 50 with a choice to view such image data. Image data from the vehicle 40 may be displayed side by side with, or above/below, image data from the drone 60, or may be displayed in a “picture in picture” format. In alternative arrangements the image data from the drone 60 could be output to a wearable device such as virtual reality glasses in order to provide a first person view to the vehicle user 50.
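The “picture in picture” format mentioned above can be composited with plain array operations. A minimal sketch using NumPy, assuming both feeds arrive as RGB arrays of the same dtype (the patent does not specify a display pipeline, so this is illustrative only):

```python
import numpy as np

def picture_in_picture(main_rgb, inset_rgb, scale=0.25, margin=10):
    """Overlay a shrunken inset feed in the top-right corner of the main feed."""
    h, w = main_rgb.shape[:2]
    ih, iw = int(h * scale), int(w * scale)
    # Nearest-neighbour resize of the inset, keeping the sketch dependency-free.
    ys = np.arange(ih) * inset_rgb.shape[0] // ih
    xs = np.arange(iw) * inset_rgb.shape[1] // iw
    small = inset_rgb[ys][:, xs]
    out = main_rgb.copy()
    out[margin:margin + ih, w - margin - iw:w - margin] = small
    return out

# E.g. drone feed (camera 100) as the main image, vehicle camera 106 inset:
drone_feed = np.zeros((480, 640, 3), dtype=np.uint8)
vehicle_feed = np.full((480, 640, 3), 255, dtype=np.uint8)
combined = picture_in_picture(drone_feed, vehicle_feed)
```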
(24) As shown in the accompanying figures, the drone 60 may operate at a default height 112 and at a default distance 116 from the vehicle 40.
(25) In the event that the drone 60 senses 118 an obstacle 120 while operating at the default height 112 and distance 116 values, the drone may modify the default values in order to avoid a collision. Such modified values may be relayed to the controller 80.
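One simple policy for modifying the default values when an obstacle 120 is sensed is sketched below; the minimum clearance figure and the function name are assumptions for illustration, not values taken from the patent.

```python
def adjust_setpoints(default_height_m, default_distance_m,
                     obstacle_range_m, min_clearance_m=1.5):
    """Raise the hover height if a sensed obstacle is inside the clearance.

    Returns possibly-modified (height, distance) setpoints; any modified
    values would then be relayed back to the controller 80.
    """
    if obstacle_range_m is not None and obstacle_range_m < min_clearance_m:
        # Climb by the shortfall to restore the minimum clearance.
        shortfall = min_clearance_m - obstacle_range_m
        return default_height_m + shortfall, default_distance_m
    return default_height_m, default_distance_m

print(adjust_setpoints(3.0, 4.0, obstacle_range_m=0.8))  # (3.7, 4.0)
```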
(29) The vehicle 40 comprises a location determining means 134. In practice the vehicle may employ a number of methods for determining its location, e.g. via a GPS system, a simultaneous localization and mapping (SLAM) algorithm and/or an inertial navigation system. One or more of such systems are arranged to output positional data 135 via the vehicle communication means 130 to the controller 80 (such data being received at the controller inputs 82 via the communication means 128 of the user device 70).
(30) The vehicle 40 comprises a number of sensors 136, for example radar, ultrasonic sensors, lidar, image capture devices, etc. The vehicle 40 may further supply sensor related data 137 via the vehicle communication means 130 to the controller 80.
(31) The drone 60 comprises a location determining means 138. In practice the drone may employ a number of methods for determining its location, e.g. via a GPS system, a SLAM algorithm and/or an inertial navigation system. One or more of such systems are arranged to output positional data 139 via the drone communication means 132 to the controller 80 (such data being received at the controller inputs 82 via the communication means 128 of the user device 70).
(32) The drone 60 comprises a number of sensors 140, for example radar, ultrasonic sensors, lidar, image capture devices, etc. The drone 60 may further supply sensor related data 141 via the drone communication means 132 to the controller 80. In particular, the drone 60 may supply image data from an image capture means 100 to the controller 80.
(33) The user device 70 also comprises a location determining means 142. In practice, the user device may employ a number of methods for determining its location, e.g. via a GPS system, a SLAM algorithm and/or an inertial navigation system. One or more of such systems are arranged to output positional data 143 to the controller 80. The user device may also comprise an image capture means 144 which may send image data 146 to the controller 80.
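The data flows of paragraphs (29) to (33), i.e. positional data 135/139/143, sensor data 137/141 and image data 146, could be modelled as simple typed records. The field names below are assumptions chosen for the sketch:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Position = Tuple[float, float, float]   # x, y, z in a shared world frame

@dataclass
class PositionalReport:
    source: str            # "vehicle", "drone" or "user_device"
    position: Position     # e.g. a fused GPS/SLAM/inertial estimate
    timestamp_s: float

@dataclass
class SensorReport:
    source: str
    obstacle_range_m: Optional[float]   # nearest sensed obstacle, if any
    timestamp_s: float

@dataclass
class ImageFrame:
    source: str            # e.g. drone camera 100 or vehicle camera 106
    rgb_bytes: bytes
    width: int
    height: int
    timestamp_s: float
```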
(34) The data received at the controller 80 may be used to generate a control signal 148 for the drone 60 as described above.
(35) The user device 70 further comprises a display screen 150 and the controller 80 additionally outputs a control signal 152 to the display screen, the control signal 152 including image data received from the image capture means 100 of the drone 60.
(36) The controller may be further configured to allow the vehicle user to manoeuvre the vehicle while they are outside it, via a suitable control interface on the user device (the controller outputting suitable driving control signals to the vehicle). By providing image data from the drone 60 to the display screen of the user device 70, the user is able both to manoeuvre the vehicle and to assess and avoid obstacles that are outside their field of view 73/line of sight 72.
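A control interface of the kind described could translate user-device input into low-speed driving control signals, with the drone's video feed closing the loop through the user. The sketch below is a deliberately conservative illustration; all names, limits and the dead-man behaviour are assumptions rather than features recited in the patent.

```python
def driving_control_signal(joystick_x, joystick_y, dead_man_pressed,
                           max_speed_mps=1.0, max_steer_deg=30.0):
    """Map a user-device joystick to a creep-speed manoeuvring command.

    The command is only non-zero while the dead-man control is held, so
    releasing the user device immediately brings the vehicle to a stop.
    """
    if not dead_man_pressed:
        return {"speed_mps": 0.0, "steer_deg": 0.0, "brake": True}
    speed = max(-1.0, min(1.0, joystick_y)) * max_speed_mps
    steer = max(-1.0, min(1.0, joystick_x)) * max_steer_deg
    return {"speed_mps": speed, "steer_deg": steer, "brake": False}

print(driving_control_signal(0.5, 1.0, dead_man_pressed=True))
```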
(39) The controller 80 may send a control signal 148 to the drone 60 to take up a position 166 directly behind the vehicle 40. Due to the GPS positioning error, however, the drone 60 may instead take up position at location 168.
(40) The drone 60 may be arranged to detect and correct for the positioning error between locations 166 and 168 in a number of ways. In the event that the drone takes off from a location in or on the vehicle 40, an inertial navigation system within the drone 60 may detect the positioning error between locations 166 and 168. Additionally or alternatively, a time of flight measurement may be made between the vehicle 40 and the drone 60 from which the drone can determine the positioning error. Additionally or alternatively, a pattern matching method may be used on image data captured by the image capture means 100 of the drone 60 to identify features on the vehicle 40. The positioning error may be determined by identifying specific features on the vehicle, such as the roof of the vehicle or the wheels of the vehicle, and adjusting the position of the drone 60.
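The time-of-flight correction could, for example, be realised with two ranging anchors at known points on the vehicle 40 and a two-circle intersection; the patent names the measurement but not the mathematics, so the following is a sketch under that assumption.

```python
import math

def tof_position_fix(anchor_a, anchor_b, range_a, range_b, gps_estimate):
    """Estimate the drone's true 2D position from two time-of-flight ranges.

    anchor_a and anchor_b are known points on the vehicle (e.g. front and
    rear ranging transceivers). The two-circle intersection yields two
    candidates; the one nearer the GPS estimate is taken as the fix.
    """
    ax, ay = anchor_a
    bx, by = anchor_b
    d = math.hypot(bx - ax, by - ay)
    a = (range_a**2 - range_b**2 + d**2) / (2 * d)
    h = math.sqrt(max(0.0, range_a**2 - a**2))
    mx, my = ax + a * (bx - ax) / d, ay + a * (by - ay) / d
    px, py = -(by - ay) / d, (bx - ax) / d      # unit perpendicular
    candidates = [(mx + h * px, my + h * py), (mx - h * px, my - h * py)]
    return min(candidates,
               key=lambda c: math.hypot(c[0] - gps_estimate[0],
                                        c[1] - gps_estimate[1]))

# GPS says the drone is at (0.0, 4.5); ranging to two anchors puts it at (0.0, 4.0).
fix = tof_position_fix((-1.0, 0.0), (1.0, 0.0),
                       math.hypot(1.0, 4.0), math.hypot(1.0, 4.0),
                       (0.0, 4.5))
error = (fix[0] - 0.0, fix[1] - 4.5)   # correction to apply, as in (42)
print(fix, error)
```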
(41) The controller 80 may initiate a positioning calibration step in which the user 50 is directed to take up a certain position relative to the vehicle 40. For example, the user may be requested to stand with the user device 70 directly in front of the vehicle 40. The drone 60 may then be sent a control signal 148 to take up position directly behind the vehicle 40 based on the GPS data received from the drone 60. A calibration process may then be initiated in which any GPS positioning errors are compensated for using any of the methods described above, e.g. using inertial navigation system data from the drone 60, using time of flight measurements between the drone 60 and vehicle 40, using pattern matching methods to determine a relative location of the drone and vehicle.
(42) Once the positioning error is known the drone 60 may correct for the error by moving to location 166. The correction applied to its position may be supplied to both the user device 70 and the vehicle 40 so that they can correct for similar GPS positioning errors. The correction to the position of the drone 60 may be supplied directly between the user device 70, drone 60 and vehicle 40. Alternatively, the drone may inform the controller 80 which then updates the locations of the user device 70 and vehicle 40.
(43) Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.