A Method for Validating Sensor Units in a UAV, and a UAV
20210082147 · 2021-03-18
Inventors
CPC classification
H04N23/81
ELECTRICITY
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
G06T7/80
PHYSICS
B64U10/16
PERFORMING OPERATIONS; TRANSPORTING
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
B64U50/30
PERFORMING OPERATIONS; TRANSPORTING
G06T3/4038
PHYSICS
International classification
G06T7/80
PHYSICS
G06T3/40
PHYSICS
Abstract
The present invention relates to a method for validating sensor units in a UAV. The UAV comprises a first sensor unit and a second sensor unit, each sensor unit being configured to create an image of the surroundings. The method comprises the steps of: taking a first image by the first sensor unit, taking a second image by the second sensor unit, wherein the second image and the first image at least partly overlap, and comparing the overlapping portions of the first image and the second image. Based on a result in which the overlapping portions of the first image and the second image do not correlate to each other, it is determined that at least one of the first sensor unit and the second sensor unit is dysfunctional.
Claims
1. A method for validating sensor units in a UAV comprising a first sensor unit and a second sensor unit, said method comprising: taking a first image by said first sensor unit; taking a second image by said second sensor unit, wherein said second image and said first image at least partly overlap; and comparing overlapping portions between the first image and the second image, and based on a result in which said overlapping portions of said first image and said second image do not correlate to each other, determining that at least one of said first sensor unit and said second sensor unit is dysfunctional.
2. The method according to claim 1, wherein in an airborne state, said first sensor unit is in a first position, and wherein the first image is taken in said first position, said method further comprises: arranging said UAV such that said second sensor unit is positioned in said first position, wherein the second image is taken when said second sensor unit is in said first position.
3. The method according to claim 1, further comprising: processing at least one of said first image and said second image before comparing the overlapping portions.
4. The method according to claim 1, wherein the UAV further comprises a third sensor unit, and said method further comprises: taking a third image by said third sensor unit, wherein said third image at least partly overlaps with the overlapping portions of said first image and said second image, said comparing also includes said third image, and based on a result in which overlapping portions of said first image, said second image, and said third image do not correlate to each other, determining which of the first sensor unit, the second sensor unit, and the third sensor unit is dysfunctional.
5. The method according to claim 1, wherein said first sensor unit and said second sensor unit are angularly offset in relation to each other.
6. The method according to claim 1, further comprising: directly landing the UAV when at least one of said first sensor unit and said second sensor unit is determined to be dysfunctional.
7. The method according to claim 1, further comprising: launching the UAV to an airborne state, wherein the UAV is hovering when performing the steps of taking said first image and taking said second image.
8. The method according to claim 1, wherein said first sensor unit and said second sensor unit each comprise at least two sensors.
9. The method according to claim 8, wherein any one of said at least two sensors is one of: an RGB camera, an IR camera, a radar receiver, or a hyperspectral camera.
10. A UAV, comprising: a first sensor unit and a second sensor unit, each of the first sensor unit and the second sensor unit being configured to create an image of surroundings; and a control unit configured to: instruct the first sensor unit to take a first image; instruct the second sensor unit to take a second image, wherein said second image and said first image at least partly overlap; and compare overlapping portions between the first image and the second image, and based on a result in which said overlapping portions of said first image and said second image do not correlate to each other, determine that at least one of said first sensor unit and said second sensor unit is dysfunctional.
11. The UAV according to claim 10, further comprising a third sensor unit, and wherein the control unit is further configured to: instruct the third sensor unit to take a third image, wherein said third image at least partly overlaps with the overlapping portions of said first image and said second image, perform a comparison with said first image, said second image, and said third image, and based on a result in which overlapping portions of said first image, said second image, and said third image do not correlate to each other, determine which of the first sensor unit, the second sensor unit, and the third sensor unit is dysfunctional.
12. The UAV according to claim 10, wherein said first sensor unit and said second sensor unit are angularly offset in relation to each other.
13. The UAV according to claim 10, wherein said control unit is further configured to instruct the UAV to launch to an airborne state, and to hover while taking the first image and while taking the second image.
14. A method of using a first sensor unit and a second sensor unit comprised by a UAV, to carry out validation of said sensor units, the method comprising: taking a first image by said first sensor unit; taking a second image by said second sensor unit, wherein said second image and said first image at least partly overlap; and comparing overlapping portions between the first image and the second image, and based on a result in which said overlapping portions of said first image and said second image do not correlate to each other, determining that at least one of said first sensor unit and said second sensor unit is dysfunctional.
15. The UAV according to claim 10, wherein in an airborne state, said first sensor unit is in a first position, and wherein the first image is taken in said first position, and said control unit is further configured to: control said second sensor unit to take said second image when said second sensor unit is positioned in said first position.
16. The UAV according to claim 10, wherein said control unit is further configured to: process at least one of said first image and said second image before comparing the overlapping portions.
17. The UAV according to claim 10, wherein said control unit is further configured to: instruct the UAV to land directly when at least one of said first sensor unit and said second sensor unit is determined to be dysfunctional.
18. The UAV according to claim 10, wherein said first sensor unit and said second sensor unit each comprise at least two sensors.
19. The UAV according to claim 18, wherein any one of said at least two sensors is one of: an RGB camera, an IR camera, a radar receiver, or a hyperspectral camera.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0052] These and other features and advantages of the present invention will in the following be further clarified and described in more detail, with reference to the appended drawings showing exemplary embodiments of the present invention.
[0053]
[0054]
[0055]
[0056]
DETAILED DESCRIPTION OF EMBODIMENTS
[0057] In the following detailed description, some embodiments of the present invention will be described. However, it is to be understood that features of the different embodiments are exchangeable between the embodiments and may be combined in different ways, unless anything else is specifically indicated. Even though in the following description, numerous specific details are set forth to provide a more thorough understanding of the present invention, it will be apparent to one skilled in the art that the present invention may be practiced without these specific details. In other instances, well known constructions or functions are not described in detail, so as not to obscure the present invention.
[0058]
[0059] The UAV 1 comprises a body 2 having two leg portions 21. The body 2 is adapted to carry all of the other components comprised by the UAV 1, and the leg portions 21 are adapted to support the UAV 1 when it is not airborne. The UAV 1 further comprises six actuators 3 arranged on six arm portions 22 extending from the body 2. The actuators 3 are connected to six propellers 31. The actuators 3 may suitably be electric motors or combustion engines. By controlling the actuators 3, the rotation of the propellers 31, and hence the movement of the UAV 1, may be controlled. This is preferably done by a control unit 4. The control unit 4 may be connected to the actuators 3 wirelessly or by wires. The control unit 4 will be further described below.
[0060] The actuators 3 and the control unit 4 are powered by a power supply unit 5, which may suitably be some type of battery, e.g. a lithium-polymer battery, or an electrical generator of some type. The power supply unit 5 may comprise a plurality of subunits, e.g. a plurality of batteries. The size and capacity of the power supply unit 5 may be adapted to the size/weight of the UAV 1, the size/weight of potential goods that the UAV 1 is to carry, and the length of the flights that the UAV 1 is intended to perform. In some embodiments, the power supply unit may not be a part of the UAV, but the UAV may be connected to an external power supply unit, e.g. by wiring the UAV to the mains electricity.
[0061] The UAV 1 further comprises a first sensor unit 61 and a second sensor unit 62 which are angularly offset in relation to each other. In this exemplary embodiment, the UAV 1 further comprises a third sensor unit 63, a fourth sensor unit 64, a fifth sensor unit 65, and a sixth sensor unit 66, angularly offset in relation to each other. Each one of the sensor units is configured to create an image of the surroundings. All of the sensor units are mounted around the circumference of the UAV, angularly offset in relation to each other. In some embodiments, a seventh sensor unit may be mounted at the centre of the UAV, facing downwards. Although only the first sensor unit 61, the second sensor unit 62 and the third sensor unit 63 are described in the following detailed description, any features and method steps described in relation to the first, second and third sensor units 61, 62, 63 may also be applied to the fourth, fifth and sixth sensor units 64, 65, 66. The sensor units 61-66 will be further described in relation to
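As a rough illustration of the overlap implied by this arrangement, the sketch below computes the shared viewing angle of two adjacent sensor units. The six-fold spacing follows the exemplary embodiment; the 90-degree field of view is an assumption for illustration only, as the description does not specify the sensor optics.

```python
def overlap_angle_deg(offset_deg, fov_deg):
    """Angular overlap (in degrees) between two sensor units whose
    optical axes are offset_deg apart, each with a horizontal field of
    view of fov_deg. No overlap remains once the offset exceeds the
    field of view."""
    return max(0.0, fov_deg - offset_deg)

# Six sensor units spaced evenly around the body are 60 degrees apart.
offset = 360.0 / 6
# Assumed 90-degree field of view: 30 degrees of shared view remain.
shared = overlap_angle_deg(offset, 90.0)
```

With a wider assumed field of view the shared region grows, which in turn gives the comparison step more image content to work with.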
[0062] The UAV 1 further comprises a control unit 4. The control unit 4 may for example be manifested as a general-purpose processor, an application specific processor, a circuit containing processing components, a group of distributed processing components, a group of distributed computers configured for processing, a field programmable gate array (FPGA), etc. The control unit 4 may further include a microprocessor, microcontroller, programmable digital signal processor or another programmable device. The control unit 4 may also, or instead, include an application specific integrated circuit, a programmable gate array or programmable array logic, a programmable logic device, or a digital signal processor. Where the control unit 4 includes a programmable device such as the microprocessor, microcontroller or programmable digital signal processor mentioned above, the processor may further include computer executable code that controls operation of the programmable device.
[0063] The UAV 1 according to the illustrated exemplary embodiment further comprises a GPS module 7, for navigation of the UAV 1. Other embodiments may not comprise a GPS module, or may comprise a GPS module but may not use it for navigation. In this exemplary embodiment however, correspondingly to the control unit 4, the GPS module 7 may for example include a GPS receiver, a microprocessor, microcontroller, programmable digital signal processor or another programmable device. The GPS module 7 may also, or instead, include an application specific integrated circuit, a programmable gate array or programmable array logic, a programmable logic device, or a digital signal processor arranged and configured for digital communication with the control unit 4. Where the control unit 4 includes a programmable device such as the microprocessor, microcontroller or programmable digital signal processor mentioned above, the GPS module 7 may simply comprise a GPS receiver and circuits for digital communication with the control unit 4.
[0064] The processor (of the control unit 4 and/or the GPS module 7) may be or include any number of hardware components for conducting data or signal processing or for executing computer code stored in memory. The memory may be one or more devices for storing data and/or computer code for completing or facilitating the various methods described in the present description. The memory may include volatile memory or non-volatile memory. The memory may include database components, object code components, script components, or any other type of information structure for supporting the various activities of the present description. According to an exemplary embodiment, any distributed or local memory device may be utilized with the systems and methods of this description. According to an exemplary embodiment the memory is communicably connected to the processor (e.g., via a circuit or any other wired, wireless, or network connection) and includes computer code for executing one or more processes described herein.
[0065] The control unit 4 is connected to the various described features of the UAV 1, such as e.g. the GPS module 7, the sensor units 61-66 and the actuators 3, and is configured to control system parameters. Moreover, the control unit 4 may be embodied by one or more control units, where each control unit may be either a general purpose control unit or a dedicated control unit for performing a specific function.
[0066] The present disclosure contemplates methods, devices and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor.
[0067] By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data that cause a general-purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
[0068] It should be understood that the control unit 4 may comprise a digital signal processor arranged and configured for digital communication with an off-site server or cloud based server. Thus data may be sent to and from the control unit 4.
[0069]
[0070] In
[0071]
[0072] The process may be initiated by the UAV receiving instructions to perform a validation of the sensor units. These instructions may be given directly, or the control unit may for example be programmed to perform a validation every time the UAV is instructed to initiate a flight. The process may then start with step a, wherein the UAV is launched to an airborne state, similar to what is described in relation to
[0073] After step d, three optional steps follow. In step e, the UAV may be arranged such that a third sensor unit is in the first position. This is suitably followed by a step f in which a third image is taken by the third sensor unit, in the first position, such that the third image at least partly overlaps with the overlapping portions of the first image and the second image.
[0074] All of the steps related to taking images, i.e. steps b, d and f, are suitably performed by the control unit giving instructions to the sensor units to take such images. All of the steps of arranging the UAV are suitably performed by the control unit giving instructions to the actuators which control the propellers of the UAV.
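The ordering of steps a-f described above can be sketched as a simple control routine. The `launch`, `rotate_to`, and `capture` methods on the `uav` object are hypothetical placeholders rather than a real flight-controller API; the sketch only illustrates the sequence in which the control unit issues instructions to the actuators and sensor units.

```python
def capture_validation_images(uav):
    """Illustrative ordering of steps a-f: launch, take the first image
    in a first position, rotate the UAV so the second (and optionally
    third) sensor unit occupies that same position, and capture again
    so the images at least partly overlap."""
    uav.launch()                    # step a: reach an airborne, hovering state
    first = uav.capture(sensor=1)   # step b: first image, taken in a first position
    uav.rotate_to(sensor=2)         # step c: place sensor unit 2 in the first position
    second = uav.capture(sensor=2)  # step d: second image, overlapping the first
    uav.rotate_to(sensor=3)         # step e (optional): sensor unit 3 in position
    third = uav.capture(sensor=3)   # step f (optional): third overlapping image
    return first, second, third
```

Because each sensor unit is moved into the same first position before capturing, the overlapping portions depict the same scene, which is what makes the later comparison meaningful.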
[0075] After the first image, the second image, and optionally the third image have been taken, the process may include a step g of processing any one, several, or all of the images. The processing step may include any suitable type of image or data processing. This is followed by the step h of comparing the overlapping portions of the first image, the second image and optionally the third image. The comparison is suitably performed by the control unit. Based on a result from the comparison in which the overlapping portions of the first image, the second image and optionally the third image do not correlate to each other, in step i it may be determined that at least one of the sensor units is dysfunctional. If the process includes step f of taking a third image, step i may be followed by a step j of determining which one of the sensor units is dysfunctional. This may be desired to simplify the process of repairing the UAV.
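The description does not tie the comparison in step h to a particular metric. One minimal sketch, assuming the normalized cross-correlation (Pearson) coefficient with an illustrative threshold of 0.8, could look as follows; both the metric and the threshold are assumptions made here for illustration.

```python
import numpy as np

def regions_correlate(region_a, region_b, threshold=0.8):
    """Compare the overlapping portions of two images using the
    normalized cross-correlation (Pearson) coefficient. Returns True
    when the portions agree, False when at least one sensor unit is
    suspected to be dysfunctional."""
    a = np.asarray(region_a, dtype=float).ravel()
    b = np.asarray(region_b, dtype=float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:  # a flat (e.g. all-black) portion never correlates
        return False
    return float(np.dot(a, b) / denom) >= threshold
```

In practice the overlapping portions would first be aligned (step g), since even small misregistration lowers the correlation score of two functional sensor units.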
[0076] In some embodiments, the UAV may be instructed by the control unit to land, in step k, if it is determined that at least one of the sensor units is dysfunctional. If instead it is determined in step l that both the first sensor unit and the second sensor unit, and optionally the third sensor unit, are functional, the UAV may be allowed to continue to fly.
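The determination of which sensor unit is dysfunctional when three overlapping images are available can be illustrated as a pairwise vote: the unit whose image disagrees with both of the others is singled out. The function below is a sketch of that logic under the assumption that each pairwise comparison yields a boolean; this encoding is not taken from the description.

```python
def identify_dysfunctional(corr_12, corr_13, corr_23):
    """Given pairwise agreement between the overlapping portions of
    three images (True = the portions correlate), single out the sensor
    unit whose image disagrees with both of the others. Returns the
    sensor index (1, 2 or 3), 0 if all units are validated, or None if
    the pattern is ambiguous (more than one unit may be dysfunctional)."""
    if corr_12 and corr_13 and corr_23:
        return 0      # all three sensor units validated
    if corr_23 and not corr_12 and not corr_13:
        return 1      # image 1 is the odd one out
    if corr_13 and not corr_12 and not corr_23:
        return 2      # image 2 is the odd one out
    if corr_12 and not corr_13 and not corr_23:
        return 3      # image 3 is the odd one out
    return None       # ambiguous: land rather than guess
```

This mirrors why a third sensor unit is useful: with only two images, a mismatch shows that at least one unit is dysfunctional but not which one.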
[0077] Although
[0078] The person skilled in the art realizes that the present invention is by no means limited to the embodiments described above. The features of the described embodiments may be combined in different ways, and many modifications and variations are possible within the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of other elements or steps than those listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.