Abstract
The invention relates to an unmanned aerial vehicle (UAV), the operation of a UAV, and the control of a UAV. Aspects of the invention relate to a UAV including a directional distance measuring module for inspecting/surveying/measuring/digitizing the UAV's environment.
Claims
1. An unmanned aerial vehicle, UAV, for flying in a physical environment including: a body extending along an axis from a front end to a back end and having a housing, a first mounting structure attached to the body and extending away from the body in a direction to a left side of the axis, a second mounting structure attached to the body and extending away from the body in a direction to a right side of the axis being an opposite direction to the direction to the left side, four propulsion units, in particular rotor assemblies, two of which are mounted to the first mounting structure and two of which are mounted to the second mounting structure, a directional distance measuring module including: a measuring field of view with a main view direction, within which measuring field of view directions and distances to surfaces in the physical environment are measurable by directionally emitting distance measurement radiation into the field of view, a detector unit for detecting distance measurement radiation reflected from a surface, and a distance measurement radiation source, wherein: the directional distance measuring module is integrated in the front end of the body inside the housing, and the distance measurement radiation is directionally emittable by the directional distance measuring module through the housing out of the front end of the body.
2. The UAV according to claim 1, wherein the directional distance measuring module has a deflector unit deflecting distance measurement radiation from the distance measurement radiation source through the housing into the field of view.
3. The UAV according to claim 2, the deflector unit deflecting distance measurement radiation, reflected from a surface through the housing, to the detector unit.
4. The UAV according to claim 2, the deflector unit being mounted to rotate around a first rotation axis and around a second rotation axis transverse to the first rotation axis.
5. The UAV according to claim 4, the first rotation axis being aligned with, or parallel to, the axis along which the body extends.
6. The UAV according to claim 1, the distance measurement radiation source including an array of single emitting radiation sources.
7. The UAV according to claim 6, the distance measurement radiation source being configured to emit, by the single emitting radiation sources, radiation combining into the distance measurement radiation according to the phased array principle.
8. The UAV according to claim 1, wherein: the UAV includes at least one sensor module generating and/or providing environment data, and/or the directional distance measuring module is configured to provide directional distance information relating to measured distances and directions to an object in the physical environment.
9. The UAV according to claim 1, wherein the directional distance measuring module measures distances and directions based on the light detection and ranging (lidar) principle.
10. The UAV according to claim 1, wherein: the first mounting structure includes a mounting part to which the propulsion units, in particular the rotor assemblies, are mounted, a first protective frame at least partly running curved around a portion of an outer edge of the propulsion units, in particular of the rotor assemblies, is attached to the first mounting structure, the second mounting structure includes a mounting part to which the propulsion units, in particular the rotor assemblies, are mounted, and a second protective frame at least partly running curved around a portion of an outer edge of the propulsion units, in particular of the rotor assemblies, is attached to the second mounting structure.
11. The UAV according to claim 10, the mounting parts including at least one strut element having a hollow interior, wherein the hollow interior forms a hidden cable routing from a propulsion unit to the body.
12. The UAV according to claim 11, each of the propulsion units, or of the rotor assemblies, being mounted to the mounting structure at a location where three strut elements connect.
13. The UAV according to claim 10, wherein a protective frame includes a foamed core being surrounded by a fiber-reinforced shell.
14. The UAV according to claim 13, wherein at least one of the protective frames includes an antenna integrated therein, wherein the antenna is embedded between the foamed core and the fiber-reinforced shell.
15. The UAV according to claim 13, wherein at least one of the protective frames includes a radar sensor integrated therein, in particular wherein the radar sensor is joined with the fiber-reinforced shell.
16. The UAV according to claim 13, wherein a protective frame provides a hidden cable routing inside the protective frame by embedding a cable in the foamed core.
17. The UAV according to claim 1, wherein a mounting structure includes a shell forming an outer surface of the mounting structure, wherein the shell is formed as a monolithic part.
18. The UAV according to claim 17, wherein the shell is formed by a fiber-reinforced polymer, in particular a carbon-fiber-reinforced polymer.
19. The UAV according to claim 1, wherein each of the mounting structures is attached to the body such that the mounting structure is rotatable around the axis along which the body extends, from a first snap-in position to a second position, in particular to a second snap-in position.
20. The UAV according to claim 19, wherein: with the mounting structures in the first snap-in position, the first mounting structure is extending away from the body in a direction to the left side of the axis, the second mounting structure is extending away from the body in a direction to the right side of the axis being an opposite direction to the direction to the left side, and with the mounting structures in the second position, in particular second snap-in position, both mounting structures are extending away from the body in a same direction.
21. The UAV according to claim 19, wherein landing support structures are: located at the mounting structures and/or the protective frames, and protruding from the mounting structures and/or protective frames in a direction transverse to a plane in which a mounting structure mainly extends, wherein the landing support structures are located such that with the mounting structures in the second position, the landing support structures intertwine.
22. The UAV according to claim 1, the UAV including a camera system.
23. The UAV according to claim 22, the camera system including a plurality of cameras arranged peripherally at the UAV, with: each camera having a field of view with a fixed orientation in relation to the UAV and directed away from the UAV, one front camera facing forward, one top camera facing up, one bottom camera facing down, and at least one side camera facing sideways, wherein the cameras are arranged such that: each field of view overlaps to a predefined degree at least one adjacent field of view, and the camera system provides an all-round view to the physical environment.
24. The UAV according to claim 23, wherein: the front camera is mounted to one of the mounting structures, and the at least one side camera is mounted to one of the mounting structures of the UAV.
25. The UAV according to claim 23, wherein at least one of the cameras is mounted at a mounting structure and a protective frame at a location where the protective frame is attached to the mounting structure.
26. The UAV according to claim 23, wherein the directional distance measuring module is configured to measure a distance and direction to an object surface of the physical environment of the UAV, at least part of which object surface is within at least one field of view of a camera.
27. The UAV according to claim 1, the UAV including a UAV powering system supplying the UAV with power, the UAV powering system being configured to provide: battery charge level information of a battery powering the UAV, a switchability between a battery powered and capacitor powered supply mode, and a selective deactivatability to selectively deactivate predetermined power consuming units of the UAV, and alternatively power the UAV both by battery power or by capacitor power, such that an uninterrupted power supply is provided, while the battery is being replaced, or while switching between the battery powered and capacitor powered supply mode, and the UAV powering system including: a capacitor for enabling to alternatively power the UAV both by battery power or by capacitor power, and a battery charge level information generator.
28. The UAV according to claim 10, the UAV including a UAV indicator light system, wherein: the first protective frame forms a front left corner section and a rear left corner section, and the second protective frame forms a front right corner section and a rear right corner section, wherein the indicator light system includes a first linear indicator for emitting light and running curved around a portion of an outer edge of the propulsion units and along the first protective frame in the front left corner section, a second linear indicator for emitting light and running curved around a portion of an outer edge of the propulsion units and along the first protective frame in the rear left corner section, a third linear indicator for emitting light and running curved around a portion of an outer edge of the propulsion units and along the second protective frame in the front right corner section, and a fourth linear indicator for emitting light and running curved around a portion of an outer edge of the propulsion units and along the second protective frame in the rear right corner section, wherein each linear indicator is arranged, with the UAV in a flying state, to emit light away from the UAV and towards the ground into a confined emission sector (67, 67, 67, 67), and enables a variable emission of light, such that an orientation-specific user perception of the UAV and an indication of a UAV status to a user are enabled.
29. The UAV according to claim 8, wherein the UAV includes: a GNSS receiver module for receiving GNSS positioning signals, a local navigation sensor module generating local navigation sensor signals, and an autonomous navigation control unit, communicatively connected to the GNSS receiver module, at least one sensor module and the local navigation sensor module, and being configured to: continuously receive: GNSS positioning signals, environment data, and local navigation sensor signals, and based thereon, autonomously navigate the UAV.
30. The UAV according to claim 22, the camera system being configured to provide image data, and the directional distance measuring module being configured to provide directional distance information.
31. The UAV according to claim 22, the UAV including a multipurpose sensor system including: the camera system, an inertial measurement unit (IMU), and a GNSS receiver module, wherein the multipurpose sensor system is configured to generate sensor raw data in the form of: image data from the camera system, motion data from the inertial measurement unit, measurement data, in particular 3D point data, from the directional distance measuring module of the UAV, in particular wherein the directional distance measuring module measures distances and directions to object surfaces based on the light detection and ranging (lidar) principle, and global position data from the GNSS receiver module of the UAV.
32. The UAV according to claim 1, the UAV being configured to communicatively connect to a battery management terminal, and transmit battery management data related to a UAV-battery on board the UAV to the battery management terminal.
33. The UAV according to claim 22, the UAV being configured to receive instructions related to performing a measurement task, autonomously fly, supported by an autonomous navigation control unit, in a physical environment based on the instructions, while autonomously flying: scan and thereby measure the physical environment by the directional distance measuring module, generate measurement data in the form of 3D point data, view the physical environment by the camera system and generate image data, sense the physical environment by at least one sensor module and/or by a multipurpose sensor system of the UAV and generate sensor data, and provide measurement data, image data and sensor data: for generating 3D point cloud data representing the physical environment of the UAV, and to the autonomous navigation control unit for supporting the autonomous flying of the UAV.
34. The UAV according to claim 1, the UAV being a rotary wing drone.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0616] The first to fifteenth aspects of the invention are described below in more detail purely by way of example with the aid of concrete exemplary embodiments of the first to fifteenth aspect of the invention illustrated schematically in the figures, further advantages of the first to fifteenth aspects of the invention also being examined. In detail:
[0617] FIG. 1 shows a mobile control device and a stroke-based touch input, and in a top view a UAV, an environment, and an object surface;
[0618] FIG. 2 shows a top view to an object surface and a movement of the UAV;
[0619] FIG. 3 relates to a stroke-based touch input;
[0620] FIG. 4 shows a top view to a UAV;
[0621] FIG. 5 shows a touch sensitive display and a two-finger pinch touch input and in a top view a UAV, an environment, and an object;
[0622] FIG. 6 shows a digitally scaled view in a live-view;
[0623] FIG. 7 shows an un-scaled view;
[0624] FIG. 8 shows a touch sensitive display and a two-finger pinch touch input and in a top view a UAV, an environment, and an object;
[0625] FIG. 9 shows a digitally scaled view in a live-view;
[0626] FIG. 10 shows an un-scaled view;
[0627] FIG. 11 relates to a two-finger pinch touch input;
[0628] FIG. 12 relates to a two-finger stroke touch input;
[0629] FIG. 13 shows in a top view a UAV, an environment and an object in the environment;
[0630] FIG. 14 shows a touch sensitive display and a single tap touch input and in a top view a UAV, an environment, and an object;
[0631] FIG. 15 shows a digitally rotated view in a live-view;
[0632] FIG. 16 shows a view after a rotational movement of a UAV;
[0633] FIG. 17 shows a UAV with its principal axes;
[0634] FIG. 18 shows a touch sensitive display including a plurality of touch zones;
[0635] FIG. 19 shows a touch sensitive display including a plurality of touch zones;
[0636] FIG. 20 shows a virtual camera position with view directions;
[0637] FIG. 21 shows a touch sensitive display receiving a one-finger stroke touch input;
[0638] FIG. 22 shows an adapted live-view;
[0639] FIG. 23 shows in a live-view, a view to the physical environment;
[0640] FIG. 24 relates to a one-finger stroke touch input;
[0641] FIG. 25 shows a virtual camera position with view directions;
[0642] FIG. 26 shows a touch sensitive display receiving a double tap touch input;
[0643] FIG. 27 shows an adapted live-view;
[0644] FIG. 28 shows a flying UAV with a mobile control device;
[0645] FIG. 29 shows a UAV flying in a physical environment;
[0646] FIG. 30 shows a UAV with an indicator light system;
[0647] FIG. 31 shows a cross section of a band-shaped protective frame;
[0648] FIG. 32 shows a linear indicator comprising a light source;
[0649] FIG. 33 shows a linear indicator comprising a plurality of light sources;
[0650] FIG. 34 shows a schematic of a workflow related to an autonomous navigation control unit;
[0651] FIG. 35 shows a top view to a UAV in a physical environment and different views, which can be generated and displayed in a live-view;
[0652] FIG. 36 shows a front view to a UAV in a physical environment and different views, which can be generated and displayed in a live-view;
[0653] FIG. 37 shows a schematic of a workflow related to displaying a view in a live-view;
[0654] FIG. 38 shows a schematic of a workflow related to the raw data conditioning unit;
[0655] FIG. 39 shows a schematic of a workflow related to the tracking functionality;
[0656] FIG. 40 shows a situation with a UAV scanning a façade of a building and being referenced to a reference coordinate system;
[0657] FIG. 41 shows a situation with a UAV moving propulsion free around a corner of a building and being referenced to a reference coordinate system;
[0658] FIG. 42 shows a situation with a UAV scanning a façade of a building and being referenced to a reference coordinate system;
[0659] FIG. 43 shows a system for managing a set of UAV batteries;
[0660] FIG. 44 shows a UAV having a body, propulsion units and mounting structures;
[0661] FIG. 45 shows a schematic of a directional distance measuring module;
[0662] FIG. 46 shows a UAV with mounting structures having mounting parts with strut elements; and
[0663] FIG. 47 shows a UAV with rotatable mounting structures.
DETAILED DESCRIPTION
[0664] FIG. 1 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example stroke-based touch inputs 8, and to display a view of the physical environment 2 in a live view 7. The view of the physical environment 2 is generated based on environment data, which are received from at least one sensor module of the UAV 1. The sensor module typically relates to a camera of a camera system of the UAV 1.
[0665] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0666] The UAV 1 includes a directional distance measuring module as sensor module. The directional distance measuring module is configured to measure distances and directions to surfaces/points on surfaces of the physical environment 2. Directions can be measured, for example, by determining a horizontal and vertical angle under which a distance measurement radiation is emitted towards a surface/point on a surface of the physical environment 2. A distance can be measured, for example, based on the time-of-flight principle using the emitted distance measurement radiation.
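The measurement principle described above (a horizontal angle, a vertical angle, and a time-of-flight distance) can be sketched in code. The following is a minimal illustrative sketch, not the patented implementation; all function and variable names are assumptions, and the vertical angle is assumed to be measured from the horizontal plane.

```python
import math

# Speed of light in m/s, used by the time-of-flight distance calculation.
SPEED_OF_LIGHT = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the surface from the round-trip time of the
    emitted distance measurement radiation (time-of-flight principle)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def polar_to_cartesian(distance: float, horizontal_angle_rad: float,
                       vertical_angle_rad: float) -> tuple:
    """Convert a measured direction (horizontal and vertical emission
    angle) and distance into a 3D point in the module's own frame."""
    # Project the distance onto the horizontal plane first.
    horizontal = distance * math.cos(vertical_angle_rad)
    x = horizontal * math.cos(horizontal_angle_rad)
    y = horizontal * math.sin(horizontal_angle_rad)
    z = distance * math.sin(vertical_angle_rad)
    return (x, y, z)
```

Points computed this way can be accumulated into the 3D point cloud data mentioned later in the description.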
[0667] Measuring surfaces/points on surfaces of the physical environment 2 enables measuring/surveying physical environments and objects in the physical environments. Physical environments 2 can relate, for example, to landscapes, acres, fields, forests, hillsides etc. Objects can relate, for example, to buildings, roads, bridges, humans, airplanes, objects of a construction site, tunnels etc.
[0668] Measuring/surveying physical environments 2 and objects in the physical environments relates to generating and recording measurement data, which enable a digital reconstruction of the measured/surveyed physical environment and/or objects. Such measurement data typically relate to 3D point cloud data, which enable the reconstruction in the form of a 3D point cloud.
[0669] The distance measuring module has a field of view, wherein physical environments 2 and objects, which are within the field of view, are measurable. The field of view has a main view direction 4. For a specific, for example optimized, measuring/surveying of physical environments 2 and objects, the field of view with its main view direction is aligned in a predefined way to the physical environment 2 or to an object, which is to be measured/surveyed. Thereby the alignment of the field of view of the distance measuring module is controlled by controlling the movement of the UAV 1.
[0670] FIG. 1 shows on the right side the UAV 1, which is flying in the physical environment 2. The UAV 1 is in front of a building with its distance measuring module facing the building. Thereby, the building is the object to be measured/surveyed. The façade of the building is the object surface 6. At least a part of the façade 5 is within the field of view of the distance measuring module. This at least a part of the façade 5 can be determined by measuring/surveying.
[0671] As shown in FIG. 2, if the touch sensitive display receives, within the constant distance mode/the constant distance mode being activated, a stroke-based touch input 8, the UAV 1 positions and orients itself such that the main view direction 4 of the distance measuring unit is aligned in a specific/predetermined way 10 with respect to the at least a part of the façade. Furthermore, the UAV 1 moves along the façade and maintains, during moving, a constant distance 11 to the façade and the specific/predetermined alignment with respect to the façade.
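The constant distance behaviour can be illustrated with a simple proportional controller that turns the measured distance into a velocity command along the main view direction. This is a hypothetical sketch for illustration only; the gain, the speed limit, and all names are assumptions and not part of the claimed UAV.

```python
def constant_distance_correction(measured_distance: float,
                                 target_distance: float,
                                 gain: float = 0.5,
                                 max_speed: float = 1.0) -> float:
    """Velocity command in m/s along the main view direction:
    positive moves the UAV towards the façade, negative away from it,
    so that the measured distance converges to the target distance."""
    error = measured_distance - target_distance
    command = gain * error
    # Clamp to the UAV's allowed approach/retreat speed.
    return max(-max_speed, min(max_speed, command))
```

For example, if the UAV measures 11 m to the façade while the constant distance 11 is set to 10 m, the controller commands a gentle approach; at the target distance the command is zero.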
[0672] It can be provided that the UAV 1 is moving only as long as the stroke-based touch input 8 is being received. In other words, as soon as a touching of the touch sensitive display, related to the stroke-based touch input 8, is interrupted or terminated, the UAV 1 is stopped.
[0673] FIG. 3 shows a stroke-based touch input 8. The touch sensitive display is touched by a finger which, while maintaining contact with the touch sensitive display, is stroked across the touch sensitive display. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0674] The stroke-based touch input 8 can also relate to touching the touch sensitive display with two fingers in analogy to touching the touch sensitive display with one finger.
[0675] The movement of the UAV 1 along the façade, while maintaining during moving a constant distance 11 to the façade and the specific/predetermined alignment with respect to the façade, is further based on the stroke progression 9. Based on the stroke progression 9, control commands can be derived for controlling the movement of the UAV 1. A control command can relate to, for example, a stroke direction 14, in which the UAV 1 moves along the façade, a velocity with which the UAV 1 moves along the façade etc.
[0676] The stroke-based touch input 8, shown in FIG. 3, includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the stroke-based touch input 8 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the stroke-based touch input 8.
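Deriving a control command from one segment of the stroke progression (a start state and an end state) can be sketched as follows. This is an illustrative sketch under assumed conventions, not the patented method; the pixel-to-velocity scale and all names are hypothetical, and only the horizontal stroke component is considered.

```python
def stroke_to_lateral_command(start: tuple, end: tuple,
                              elapsed_s: float,
                              speed_scale: float = 0.01) -> tuple:
    """Map one stroke segment, given as start and end display
    coordinates in pixels, to a lateral direction along the façade
    (+1 right, -1 left) and a velocity in m/s."""
    dx = end[0] - start[0]
    direction = 1 if dx >= 0 else -1
    # Stroke speed in pixels per second drives the UAV velocity.
    pixels_per_s = abs(dx) / elapsed_s if elapsed_s > 0 else 0.0
    return direction, speed_scale * pixels_per_s
```

Evaluating successive segments while the touch input is being received lets the UAV follow the stroke continuously and stop as soon as touching is interrupted.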
[0677] It can be provided that a selectability of the constant distance mode is provided to a user. The selectability of the constant distance mode can be based on a selectability of the at least a part of the façade 5 which is determined by the UAV 1. For example, the at least a part of the façade is captured by the distance measuring module of the UAV 1. As soon as the at least a part of the façade is determined, its selectability by touching the determined at least a part of the façade can be provided. If the determined at least a part of the façade is selected/touched, the constant distance mode can automatically be activated or a selectability of the constant distance mode can be provided to a user.
[0678] A selectability of the constant distance mode can also be triggered by, for example, geometric properties of the at least a part of the façade 5. For example, if the façade is the façade of a lighthouse, the façade is curved. Without moving, only a small part of the façade will be capturable by the distance measuring module. Based on a determined curvature of the at least a part of the façade, the constant distance mode can be selectable or automatically activated.
[0679] A geometric property of the façade can also relate to a discontinuity along the façade. A discontinuity can relate, for example, to a corner, which includes an abrupt direction-change in the course of the façade.
[0680] A selectability of the constant distance mode can be provided, for example, by overlaying a symbol such as a ruler or distance indicator on the determined at least a part of the façade 5.
[0681] FIG. 4 shows a top view to a UAV 1, having a body 15 with a front end 16 and a back end 17. The directional distance measuring module is integrated in the front end 16 of the UAV 1. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units, two of which are arranged on the left side and two on the right side of the body 15.
[0682] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example a two-finger pinch touch input 23, a stroke-based touch input, a two-finger stroke touch input, a one-finger stroke touch input, a single tap touch input, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0683] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0684] The touch sensitive display shown in FIG. 5 is receiving a two-finger pinch touch input 23 with two pinch points 24. While receiving the two-finger pinch touch input 23, the two pinch points 24 are moving across the touch sensitive display away from each other, which relates to generating a pinch point progression 25.
[0685] FIG. 6 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 5. The live-view in FIG. 6 results from digitally scaling the live-view of FIG. 5 based on the pinch point progression 25.
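The digital scaling step can be illustrated by deriving a zoom factor from the distance between the two pinch points at the start and at the end of the gesture. This is an illustrative sketch only; the function name and coordinate conventions are assumptions, not the patented implementation.

```python
import math

def pinch_scale_factor(start_points, end_points) -> float:
    """Digital zoom factor from the separation of the two pinch
    points: spreading the fingers gives a factor > 1 (zoom in),
    pinching them together gives a factor < 1 (zoom out)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    d0 = dist(*start_points)
    d1 = dist(*end_points)
    if d0 == 0:
        # Degenerate gesture: leave the view unchanged.
        return 1.0
    return d1 / d0
```

The live-view can then be scaled by this factor around the midpoint between the two pinch points.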
[0686] FIG. 7 shows in a live-view a view, which is generated by the UAV after the UAV has moved towards the object in the physical environment, wherein the movement is based on the pinch point progression 25 and the pinch direction.
[0687] The touch sensitive display shown in FIG. 8 is receiving a two-finger pinch touch input 23 with two pinch points 24. While receiving the two-finger pinch touch input 23, the two pinch points 24 are moving across the touch sensitive display towards each other, which relates to generating a pinch point progression 25.
[0688] FIG. 9 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 8. The live-view in FIG. 9 results from digitally scaling the live-view of FIG. 8 based on the pinch point progression 25.
[0689] FIG. 10 shows in a live-view a view, which is generated by the UAV after the UAV has moved away from the object in the physical environment, wherein the movement is based on the pinch point progression 25 and the pinch direction.
[0690] FIG. 11 relates to a two-finger pinch touch input 23. The touch sensitive display is touched by two fingers which, while maintaining contact with the touch sensitive display, are moved across the touch sensitive display away from each other.
[0691] The movement of the UAV 1 towards or away from an object is based on the pinch point progression 25. Based on the pinch point progression 25, control commands can be derived for controlling the movement of the UAV 1. A control command can relate to, for example, a pinch direction, in which the UAV 1 moves towards or away from an object, a velocity with which the UAV 1 moves towards or away from an object etc.
[0692] The two-finger pinch touch input 23, to which FIG. 11 is relating, includes a pinch point progression start state 26 and a pinch point progression end state 27. It can be provided that between the pinch point progression start state 26 and the pinch point progression end state 27 a plurality of further pinch point progression start states 26 and pinch point progression end states 27 are included, which add up to the two-finger pinch touch input 23 and can be used to derive, for example, the pinch point progression 25 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the two-finger pinch touch input 23.
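Turning such a progression segment into a motion command along the main view direction can be sketched as follows; spreading the fingers moves the UAV towards the object, pinching them in moves it away. This is a hypothetical illustration; the scale factor convention and all names are assumptions, not the claimed control method.

```python
def pinch_to_motion_command(scale_factor: float,
                            speed_scale: float = 2.0) -> tuple:
    """Translate a pinch scale factor (ratio of pinch-point
    separation at the end vs. the start of a segment) into a
    direction label and a velocity in m/s along the view direction."""
    if scale_factor > 1.0:
        # Fingers spread apart: approach the object.
        return ("towards", speed_scale * (scale_factor - 1.0))
    if scale_factor < 1.0:
        # Fingers pinched together: retreat from the object.
        return ("away", speed_scale * (1.0 - scale_factor))
    return ("hold", 0.0)
```

Evaluating this per segment while the gesture is being received yields the continuous movement described for FIGS. 7 and 10.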
[0694] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input, a single tap touch input, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0696] The touch sensitive display shown in FIG. 5 is receiving a two-finger stroke touch input 30.
[0697] FIG. 12 relates to a two-finger stroke touch input 30. The touch sensitive display is touched by two fingers which, while maintaining contact with the touch sensitive display, are stroked across the touch sensitive display. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0698] The two-finger stroke touch input 30, of FIG. 12, includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the two-finger stroke touch input 30 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the two-finger stroke touch input 30.
[0699] FIG. 13 shows a top view to a UAV 1 having a main view direction 28. Thereby, the view of the physical environment is displayed in a first view direction 29. Furthermore, FIG. 13 shows a direction 31 transverse to the first view direction.
[0700] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units, two of which are arranged on the left side and two on the right side of the body 15.
[0701] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input, a single tap touch input 32, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0702] The mobile control device 3 is configured to control the flight and operation of a UAV 1. For this purpose, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0703] The touch sensitive display shown in FIG. 5 or FIG. 14 is receiving a single tap touch input 32 with a tap point 33 at a location 35.
[0704] FIG. 15 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 14. The live-view in FIG. 15 results from digitally rotating the live-view of FIG. 14 around at least one of the principal axes in order to centre the location 35 of the tap point 33 in the view.
[0705] FIG. 16 shows, in a live-view, a view which is generated by the UAV after the UAV has moved based on a rotational motion pattern, by a rotational movement around the UAV's yaw axis, and based on digitally rotating the live-view of FIG. 14, in order to centre the location 35 of the tap point 33 in the view.
[0706] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0707] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0708] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
[0709] FIG. 20 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. Upon receiving a single tap touch input, the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
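The digital rotation around the yaw axis 34 that centres a tapped location in the view, as described above, could be computed as sketched below under a simple pinhole camera model. The function name and the pinhole assumption are illustrative; the invention is not limited to this computation.

```python
import math

def yaw_to_center(tap_x_px: float, image_width_px: float,
                  horizontal_fov_deg: float) -> float:
    """Yaw rotation (degrees) that brings the tapped image column to
    the horizontal centre of the view, assuming a pinhole camera."""
    # Horizontal offset of the tap from the image centre, in pixels.
    offset = tap_x_px - image_width_px / 2.0
    # Focal length in pixels for the given horizontal field of view.
    focal_px = (image_width_px / 2.0) / math.tan(
        math.radians(horizontal_fov_deg) / 2.0)
    # Angle subtended by the offset at the virtual camera position.
    return math.degrees(math.atan2(offset, focal_px))
```

A tap at the centre column yields a zero rotation; a tap at the image edge of a 90° field of view yields 45°.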
[0710] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units, two of which are arranged on the left side and two on the right side of the body 15.
[0711] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0712] The mobile control device 3 is configured to control the flight and operation of a UAV 1. For this purpose, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0713] The touch sensitive display shown in FIG. 5 or FIG. 21 is receiving a one-finger stroke touch input 39, with a stroke progression 9 and a stroke direction 14.
[0714] FIG. 22 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 21. The live-view in FIG. 22 results from digitally rotating the live-view of FIG. 21 around at least one of the principal axes and along a stroke direction 14 based on the stroke progression 9.
[0715] FIG. 23 shows, in a live-view, a view which is generated by the UAV after the UAV has moved around at least one of the principal axes, in particular around the UAV's yaw axis, and along the stroke direction 14, and based on digitally rotating the live-view of FIG. 21.
[0716] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0717] FIG. 24 relates to a one-finger stroke touch input 39. The touch sensitive display is touched by one finger which, while maintaining contact with the touch sensitive display, is stroked across it. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0718] The one-finger stroke touch input 39 of FIG. 24 includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the one-finger stroke touch input 39 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 while the one-finger stroke touch input 39 is being received.
[0719] FIG. 25 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. While receiving a one-finger stroke touch input the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
[0720] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units, two of which are arranged on the left side and two on the right side of the body 15.
[0721] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input 40 etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0722] The mobile control device 3 is configured to control the flight and operation of a UAV 1. For this purpose, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0723] The touch sensitive display shown in FIG. 5 or FIG. 26 is receiving a double tap touch input 40 with a tap point 33.
[0724] FIG. 27 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 26. The live-view in FIG. 27 results from digitally rotating the live-view of FIG. 26 around at least one of the principal axes, after the UAV has moved by a rotational movement around at least one of its principal axes 34, 34, 34 and by a predetermined amount along the tap direction 41, in order to centre the location 35 of the tap point 33 in the view.
[0725] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0726] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0727] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
[0728] FIG. 20 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. Upon receiving a double tap touch input, the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
[0729] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units, two of which are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE EIGHTH ASPECT OF THE INVENTION
[0731] FIG. 28 shows a flying UAV 1, which is powered by a single battery 43. While flying the UAV is continuously providing battery charge level information 42, related to the battery 43, either to a computing unit of the UAV or to a mobile control device 3 for determining a most recent battery charge level 44. The mobile control device 3 has a touch sensitive display and displays a charge level notification 45 to a user of the UAV 1 based on the battery charge level 44.
[0732] In case the battery charge level 44 is determined to be at a level which necessitates swapping the battery 43, the UAV 1 is instructed to autonomously move to a predetermined location. The location can be, for example, the location from where the UAV has taken off/been launched, or the location from where a user is controlling the UAV. The UAV is then further instructed to land at the location, such that the battery 43 can be replaced within a predetermined time window. The predetermined time window depends strongly on the amount of time during which the UAV can be powered by capacitor power. The predetermined time window also depends on which and how many of the power consuming units of the UAV have to be supplied by capacitor power while the batteries are being replaced/swapped.
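The decision logic described above, i.e. returning for a swap once the charge level crosses a threshold and deactivating units when the capacitor cannot bridge the full swap window, could be sketched as follows. All thresholds, energy figures and the function name are illustrative assumptions.

```python
def battery_action(charge_level: float,
                   swap_threshold: float = 0.15,
                   swap_window_s: float = 60.0,
                   active_loads_w: float = 5.0,
                   capacitor_energy_j: float = 400.0) -> str:
    """Decide whether the UAV should return to the predetermined
    location for a battery swap (charge_level is a 0..1 fraction)."""
    if charge_level > swap_threshold:
        return "continue mission"
    # Time the capacitor can bridge the swap given the still-active loads.
    bridge_time_s = capacitor_energy_j / active_loads_w
    if bridge_time_s < swap_window_s:
        # Capacitor cannot power all units for the whole swap window,
        # so non-essential units (cameras, sensor modules) must be
        # deactivated after landing.
        return "return and deactivate non-essential units before swap"
    return "return for battery swap"
```

With the illustrative defaults, a 10% charge level triggers a return, and heavier remaining loads additionally trigger deactivation of non-essential units.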
[0733] After the UAV has landed at the location, typically the propulsion units, the cameras 19, 20, 21, 22, or at least one of the cameras and at least the most power consuming sensor module, for example, the directional distance measuring module, are deactivated.
[0734] At the latest when the battery is being disconnected from the UAV, the UAV switches from the battery powered supply mode to the capacitor powered supply mode, such that an uninterrupted power supply is provided.
[0735] After the battery has been replaced with a fully charged battery, the UAV can provide an operability which relates to the UAV autonomously moving back to the location where it was before returning for replacing the battery. For example, if the UAV has been on a flight-mission for inspecting/surveying/measuring/digitizing the UAV's environment, the UAV can provide an operability which causes the UAV to autonomously continue the flight-mission after the battery has been replaced.
[0736] After the battery has been replaced with a fully charged battery, the UAV can provide an operability, which relates to the UAV automatically transmitting recorded flight data to a storage, for example, a cloud storage, or to the mobile control device. The recorded flight data can relate to, for example, sensor data relating to inspecting/surveying/measuring/digitizing the UAV's environment.
[0737] As indicated in FIG. 28, the UAV has integrated inside the body of the UAV a UAV powering system 47, a capacitor 48 and a battery charge level information generator 49.
[0738] FIG. 29 and FIG. 30 show a UAV 1 flying in a physical environment 2. The UAV has a body 15, a first plurality of propulsion units 50 arranged on the left side of the body 15 and a second plurality of propulsion units 51 arranged on the right side of the body 15. The UAV 1 further has a first protective frame 52 running curved 53 around a portion of an outer edge 54 of the first plurality of propulsion units and a second protective frame 55 running curved 56 around a portion of an outer edge 57 of the second plurality of propulsion units.
[0739] The first protective frame forms a front left corner section 58 and a rear left corner section 59. The second protective frame 55 forms a front right corner section 60 and a rear right corner section 61.
[0740] The UAV indicator light system includes a first linear indicator 62 for emitting light and running curved around a portion of an outer edge of the front left propulsion unit and along the first protective frame in the front left corner section.
[0741] The UAV indicator light system includes a second linear indicator 63 for emitting light and running curved around a portion of an outer edge of the rear left propulsion unit and along the first protective frame in the rear left corner section.
[0742] The UAV indicator light system includes a third linear indicator 64 for emitting light and running curved around a portion of an outer edge of the front right propulsion unit and along the second protective frame in the front right corner section.
[0743] The UAV indicator light system includes a fourth linear indicator 65 for emitting light and running curved around a portion of an outer edge of rear right propulsion unit and along the second protective frame in the rear right corner section.
[0744] Thereby, each linear indicator is arranged to emit light away from the UAV and towards ground 66 into a confined emission sector 67, 67, 67, 67, and enables a variable emission of light.
[0745] The UAV indicator light system shown in FIG. 29 and FIG. 30 enables an orientation-specific user perception of the UAV, and an indicating of a UAV-status to a user.
[0746] The UAV shown in FIG. 29 has a fifth linear indicator 68 for emitting light and running curved around an outer section of the body.
[0747] FIG. 31 shows a cross section of a band-shaped protective frame. The band-shaped protective frame is bulged in a first direction 69 transverse to its running direction. A linear indicator can be arranged at the protective frame at a location being offset, in a second direction 70 transverse to the protective frame's running direction and to the first direction 69, from a maximum bulging 71 of the protective frame.
[0748] FIG. 32 shows a linear indicator comprising at least one light source 72, 72, wherein the linear indicator is configured to linearly guide light, emitted by the at least one light source 72, 72, curved around a portion of an outer edge of one of a plurality of propulsion units and along a protective frame in a corner section.
[0749] FIG. 33 shows a plurality of light sources 72, 72, wherein the plurality of light sources is arranged to linearly run curved around a portion of an outer edge of one of a plurality of propulsion units and along a protective frame in a corner section.
[0750] FIG. 28 and FIG. 29 show a typical UAV 1 flying in a physical environment 2.
[0751] FIG. 34 shows a schematic of how an autonomous navigation control unit 73 of the UAV determines the UAV's position 78, flight-velocity 79 and orientation 80 using GNSS positioning signals 74 received from a GNSS receiver module 75, which is receiving GNSS positioning signals from GNSS satellites 81, environment data 76 and local navigation sensor signals 77.
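The combination of GNSS positioning signals 74 with local navigation sensor signals 77 for determining the position 78, as described above, could be illustrated by a minimal complementary blend. This is a deliberately simplified sketch, not the fusion actually performed by the autonomous navigation control unit 73; the blending weight is an assumption.

```python
def fuse_position(gnss_pos: tuple, predicted_pos: tuple,
                  gnss_weight: float = 0.2) -> tuple:
    """Blend a GNSS position fix with a position predicted from local
    navigation sensors (complementary-filter style). A small weight
    trusts the smooth local prediction and uses GNSS to bound drift."""
    return tuple(g * gnss_weight + p * (1.0 - gnss_weight)
                 for g, p in zip(gnss_pos, predicted_pos))
```

Flight-velocity 79 and orientation 80 could be estimated analogously from successive fused states and inertial data.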
[0752] FIG. 35 and FIG. 36 show a UAV 1 in a physical environment 2. FIG. 35 and FIG. 36 further show schematically an all-round view 83, which is generated based on available image data generated by a front camera 19, a top camera 20, a bottom camera 21, and side cameras 22. Each of the cameras has a field of view 18 with a fixed orientation in relation to the UAV and directed away from the UAV.
[0753] FIG. 35 and FIG. 36 further show different views 82, 82, 82, 82, 82, 82, which can be generated and continuously displayed in a live-view.
[0754] FIG. 35 shows a top view to the UAV 1. FIG. 36 shows a front view to the UAV. Thereby, same reference numbers in FIG. 35 and FIG. 36 do not necessarily relate to the same feature. For example, the view 82 in FIG. 35 does not relate to the same view 82 in FIG. 36.
[0755] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input 40 etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0756] The mobile control device 3 is configured to control the flight and operation of a UAV 1. For this purpose, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0757] For example, in case a touch input is received by the touch sensitive display and identified by the mobile control device, which touch input indicates a desired viewing direction 84, 84, 84, a view to the physical environment in this desired viewing direction is generated and displayed in a live-view. Thereby, based on the desired viewing direction, the image data, which is needed for generating the desired view in the desired viewing direction, is selected from the available image data. A touch input can be, for example, a single tap touch input.
[0758] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0759] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
[0760] FIG. 37 shows a schematic of a method according to an embodiment of the eleventh aspect of the invention. A user input 85 related to indicating a desired viewing direction is generated and received. A virtual camera position and the desired viewing direction are computed 86 with the aid of the position and orientation of the drone 91. The plurality of cameras 92 provides available image data, from which image data is selected 87. The selected image data is then stitched 88 using an image stitching algorithm. The stitching algorithm can make use of directional distance information 93, which is recorded by a directional distance measuring module of the UAV by measuring the physical environment. Thereby the directional distance information is correlated with the selected image data such that selected image data with depth information is generated. The image stitching algorithm then stitches the selected image data with the aid of the depth information. Then, the stitched image data is cropped and rendered 89 according to virtual camera settings 94, for example resolution, distortion etc. Finally, the cropped and rendered image data is streamed 90 to the mobile control device to be displayed in a live-view.
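The sequence of steps 86 to 90 described above could be sketched structurally as follows. Each step is a trivial stand-in for the named operation (selection, depth correlation, stitching, rendering); the data shapes are illustrative assumptions, not the actual implementation.

```python
def generate_live_view(desired_direction: str, available_images: dict,
                       depth_info: dict, camera_settings: dict) -> dict:
    """Structural sketch of the FIG. 37 pipeline: select (87),
    correlate depth (93), stitch (88), crop/render (89), stream (90)."""
    # Step 87: keep only camera images covering the desired direction.
    selected = {cam: img for cam, img in available_images.items()
                if desired_direction in cam}
    # Depth aid 93: pair each selected image with its distance data.
    with_depth = {cam: (img, depth_info.get(cam))
                  for cam, img in selected.items()}
    # Step 88: "stitch" by collecting the depth-annotated images
    # (a real stitcher would blend them using the depth information).
    stitched = list(with_depth.values())
    # Steps 89-90: package per the virtual camera settings and "stream".
    return {"frames": stitched,
            "resolution": camera_settings.get("resolution")}
```

The sketch makes the data flow of the schematic explicit: the desired viewing direction drives the selection, and the depth information travels with the image data into the stitching step.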
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE TWELFTH ASPECT OF THE INVENTION
[0761] FIG. 1, FIG. 4, FIG. 17 all show a UAV 1 flying in a physical environment 2.
[0762] FIG. 38 shows a schematic of the method according to an embodiment of the twelfth aspect of the invention. Sensor raw data are generated by the multipurpose sensor system 95. The raw data are in the form of image data 96, motion data 97, measurement data 98 and global position data 99. The sensor raw data are provided to/received by the sensor raw data conditioning unit 104. Then, depending on the purpose for which the sensor raw data is to be used, the sensor raw data conditioning unit 104 is conditioning the sensor raw data. The sensor raw data is conditioned to form support data 101 and/or to form sensor data 103. Then the support data 101 is used by the flight control support functionality 100 to autonomously support the flight control and the sensor data 103 is used by the sensor data recording functionality 102 to be recorded for enabling a generation of a representation of the physical environment of the UAV.
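The routing performed by the sensor raw data conditioning unit 104, i.e. conditioning the same raw data either into support data 101 or into sensor data 103 depending on purpose, could be sketched as follows. The dictionary keys and the specific selections are illustrative assumptions.

```python
def condition_raw_data(raw: dict, purpose: str) -> dict:
    """Condition multipurpose sensor raw data (image 96, motion 97,
    measurement 98, global position 99) into support data (101) for
    flight control or sensor data (103) for recording."""
    if purpose == "flight_control_support":
        # Low-latency subset sufficient to support the flight control.
        return {"motion": raw["motion"],
                "position": raw["global_position"]}
    if purpose == "recording":
        # Full-fidelity set for later generation of a representation
        # of the physical environment.
        return {"images": raw["images"],
                "measurements": raw["measurements"],
                "motion": raw["motion"],
                "position": raw["global_position"]}
    raise ValueError(f"unknown purpose: {purpose}")
```

The same raw data can thus serve both the flight control support functionality 100 and the sensor data recording functionality 102 without being generated twice.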
[0763] FIG. 41, FIG. 42 and FIG. 43 all show a UAV 1 flying in a physical environment 2.
[0764] FIG. 39 shows a schematic of the method according to an embodiment of the thirteenth aspect of the invention. The multipurpose sensor system 95 of the UAV 1 generates sensor data. The sensor data are image data 96, motion data 97, measurement data 98 and global position data 99. The referencing and tracking functionality 105 uses sensor data for referencing and tracking the UAV's position and orientation. The referencing and tracking functionality 105 has a first mode 107, which is activated while the UAV is flying and, for example, performing a measurement task. While the UAV is flying, typically, as many of the sensor data are used by the referencing and tracking functionality 105 as needed to track the position and orientation of the UAV with a predefined quality. The referencing and tracking functionality 105 further has a second mode 108. The second mode is activated while the UAV is moving propulsion-free, for example, when the UAV is being carried by a user from a first location to a second location. While the UAV is moving propulsion-free, typically, only the sensor data which are needed for keeping the position and orientation of the UAV referenced to the reference coordinate system 106 are used, for example, only motion data 97, or motion data 97 and image data 96, or motion data 97 and global position data 99 etc. Sensor data which cannot reliably be generated while the UAV is moving propulsion-free, for example, because the UAV is being carried in a bag, can be left unconsidered by the referencing and tracking functionality. Thereby, also the power consumption of the UAV can be optimised in such situations.
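The mode-dependent selection of sensor data streams described above could be sketched as follows. The stream names and the reliability flags are illustrative assumptions about how the first mode 107 and second mode 108 might be parameterized.

```python
def select_tracking_inputs(flying: bool, camera_usable: bool,
                           gnss_usable: bool) -> set:
    """Choose which sensor data streams the referencing and tracking
    functionality (105) consumes: all streams in the first (flying)
    mode, a minimal reliable subset in the second (propulsion-free)
    mode."""
    if flying:
        # First mode: use everything needed for full-quality tracking.
        return {"motion", "images", "measurements", "global_position"}
    # Second mode: motion data is always usable; add other streams only
    # while they remain reliable (cameras are useless in a bag, GNSS may
    # be unavailable indoors). Dropping streams also saves power.
    inputs = {"motion"}
    if camera_usable:
        inputs.add("images")
    if gnss_usable:
        inputs.add("global_position")
    return inputs
```

Keeping at least the motion data active preserves the referencing to the reference coordinate system 106 while the UAV is carried, so no fresh initial referencing is needed after relaunch.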
[0765] FIG. 40, FIG. 41 and FIG. 42 show a situation, where the method according to an embodiment of the thirteenth aspect of the invention enables an uninterrupted continuous tracking of the UAV's position and orientation in a reference coordinate system 106.
[0766] In FIG. 40, the UAV 1 includes a directional distance measuring module. The UAV 1 is measuring a first façade of the building. For this purpose, the UAV is referenced to the reference coordinate system 106.
[0767] After the measurement is completed, in FIG. 41, the UAV 1 is carried, for example by a user, around a corner of the building. While being carried, the UAV 1 is moving propulsion-free and is uninterruptedly being tracked by the referencing and tracking functionality 105.
[0768] Around the corner, in front of another façade of the building, in FIG. 42, the UAV 1 is launched and continues measuring the building. Thereby, all the measurement data which is recorded is referenced to the same reference coordinate system 106, without necessitating a fresh initial referencing procedure.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FOURTEENTH ASPECT OF THE INVENTION
[0769] FIG. 43 shows a system according to an embodiment of the fourteenth aspect of the invention for managing a set of UAV-batteries 109 used for powering a UAV 1. The system includes a mobile battery charger unit 110. The mobile battery charger unit 110 is designed for receiving at least one UAV-battery 43, 111. The mobile battery charger unit 110 is communicatively connected by its transceiver unit to the battery management terminal 112. The battery management terminal 112 is communicatively connected to the UAV 1. The battery management terminal 112 receives battery management data from the mobile battery charger unit 110 and the UAV 1, synchronizes the battery management data and determines a charge state of the UAV-batteries. Each determined charge state is displayed 113.
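The synchronization performed by the battery management terminal 112, i.e. merging the reports from the mobile battery charger unit 110 and the UAV 1 into one charge state per battery, could be sketched as follows. The timestamped report format and the last-report-wins rule are illustrative assumptions.

```python
def synchronize_charge_states(charger_report: dict, uav_report: dict) -> dict:
    """Merge battery management data from the charger unit and the UAV
    into one charge state per battery ID, keeping the most recent
    (timestamp, charge) report for each battery."""
    merged = {}
    for report in (charger_report, uav_report):
        for battery_id, (timestamp, charge) in report.items():
            current = merged.get(battery_id)
            if current is None or timestamp > current[0]:
                merged[battery_id] = (timestamp, charge)
    # Drop the timestamps; only the determined charge states are displayed.
    return {bid: charge for bid, (_, charge) in merged.items()}
```

A battery reported by both sources thus gets its newer charge state, and batteries known to only one source are still included in the display.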
[0770] FIG. 44 shows a UAV 1 according to an embodiment of the fifteenth aspect of the invention. The UAV 1 has a body 114, 15. The body extends along an axis from its front end 16, 115 to its back end 17, 116. The body further has a housing 117. The housing's material in the front end is different from the housing's material at the back end. The UAV has a first mounting structure 118 on the left side of the body and attached to the body. The UAV further has a second mounting structure 118 on the right side of the body and attached to the body. Furthermore, the UAV includes four propulsion units 119, 119, 119, 119 for flying. In the front end of the body, a directional distance measuring module 120 is integrated. For this reason, the housing's material in the front end of the body is different from the housing's other material. The housing's material at the front end is such that distance measurement radiation 121 can be emitted/received through it.
[0771] FIG. 45 shows a schematic of the directional distance measuring module 120 being integrated in the front end of the UAV 1. The directional distance measuring module has a deflector unit 122, which deflects distance measurement radiation 121 from the distance measurement radiation source 123 through the housing 117 into the field of view. The deflector unit 122 furthermore deflects distance measurement radiation 121, reflected from a surface through the housing, to the detector unit. The deflector unit 122 is rotatable around a first rotation axis 124 and a second rotation axis 125 being transverse to the first rotation axis.
[0772] FIG. 29 and FIG. 30 show a UAV 1 flying in a physical environment 2. The UAV has a body 15, a first plurality of propulsion units 50 arranged on the left side of the body 15 and a second plurality of propulsion units 51 arranged on the right side of the body 15. The UAV 1 further has a first protective frame 52 running curved 53 around a portion of an outer edge 54 of the first plurality of propulsion units and a second protective frame 55 running curved 56 around a portion of an outer edge 57 of the second plurality of propulsion units.
[0773] FIG. 46 shows a UAV 1 with mounting structures having mounting parts 126, 126, where the propulsion units are mounted. Each of the mounting parts has at least one strut element 127 having a hollow interior, wherein the hollow interior forms a hidden cable routing from a propulsion unit to the body.
[0774] FIG. 47 shows a UAV 1, where the mounting structures are attached to the body such that the mounting structures are rotatable around the axis along which the body extends, from a first snap-in position 128 to a second position 129, in particular to a second snap-in position. Landing support structures 130 are located at the protective frames. The landing support structures 130 are located such that with the mounting structures in the second position 129, the landing support structures intertwine 131.
[0775] FIG. 35 and FIG. 36 show a UAV 1 in a physical environment 2. FIG. 35 and FIG. 36 further show schematically an all-round view 83, which is generated based on available image data generated by a front camera 19, a top camera 20, a bottom camera 21, and side cameras 22. Each of the cameras has a field of view 18 with a fixed orientation in relation to the UAV and directed away from the UAV.
[0776] FIG. 35 and FIG. 36 further show different views 82, 82, 82, 82, 82, 82, which can be generated and continuously displayed in a live-view.
[0777] FIG. 35 shows a top view to the UAV 1. FIG. 36 shows a front view to the UAV. Thereby, same reference numbers in FIG. 35 and FIG. 36 do not necessarily relate to the same feature. For example, the view 82 in FIG. 35 does not relate to the same view 82 in FIG. 36.
[0778] FIG. 28 shows a flying UAV 1, which is powered by a single battery 43. While flying the UAV is continuously providing battery charge level information 42, related to the battery 43, either to a computing unit of the UAV or to a mobile control device 3 for determining a most recent battery charge level 44. The mobile control device 3 has a touch sensitive display and displays a charge level notification 45 to a user of the UAV 1 based on the battery charge level 44.
[0779] As indicated in FIG. 28, the UAV has integrated inside the body of the UAV a UAV powering system 47, a capacitor 48 and a battery charge level information generator 49.
[0780] FIG. 29 and FIG. 30 show a UAV 1 flying in a physical environment 2. The UAV has a body 15, a first plurality of propulsion units 50 arranged on the left side of the body 15 and a second plurality of propulsion units 51 arranged on the right side of the body 15. The UAV 1 further has a first protective frame 52 running curved 53 around a portion of an outer edge 54 of the first plurality of propulsion units and a second protective frame 55 running curved 56 around a portion of an outer edge 57 of the second plurality of propulsion units.
[0781] The first protective frame forms a front left corner section 58 and a rear left corner section 59. The second protective frame 55 forms a front right corner section 60 and a rear right corner section 61.
[0782] The UAV indicator light system includes a first linear indicator 62 for emitting light and running curved around a portion of an outer edge of the front left propulsion unit and along the first protective frame in the front left corner section.
[0783] The UAV indicator light system includes a second linear indicator 63 for emitting light and running curved around a portion of an outer edge of the rear left propulsion unit and along the first protective frame in the rear left corner section.
[0784] The UAV indicator light system includes a third linear indicator 64 for emitting light and running curved around a portion of an outer edge of the front right propulsion unit and along the second protective frame in the front right corner section.
[0785] The UAV indicator light system includes a fourth linear indicator 65 for emitting light and running curved around a portion of an outer edge of the rear right propulsion unit and along the second protective frame in the rear right corner section.
[0786] Each linear indicator is arranged to emit light away from the UAV and towards the ground 66 into a confined emission sector 67, 67, 67, 67, and enables a variable emission of light.
[0787] The UAV indicator light system shown in FIG. 29 and FIG. 30 enables an orientation-specific user perception of the UAV and an indication of a UAV status to a user.
[0788] The UAV shown in FIG. 29 has a fifth linear indicator 68 for emitting light and running curved around an outer section of the body.
[0789] FIG. 34 shows schematically how an autonomous navigation control unit 73 of the UAV determines the UAV's position 78, flight velocity 79, and orientation 80 using GNSS positioning signals 74 received from a GNSS receiver module 75, which receives the GNSS positioning signals from GNSS satellites 81, together with environment data 76 and local navigation sensor signals 77.
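The fusion step of FIG. 34 can be sketched as follows. The NavState structure, the fuse function, and the simple complementary-filter weighting are assumptions made for illustration only; the specification does not define the fusion algorithm.

```python
# Illustrative sketch: the navigation control unit 73 blends a GNSS position
# fix with dead-reckoned/local sensor signals into a position 78,
# flight-velocity 79, and orientation 80 estimate.
from dataclasses import dataclass

@dataclass
class NavState:
    position: tuple      # (x, y, z) in metres, local frame
    velocity: tuple      # (vx, vy, vz) in metres/second
    orientation: tuple   # (roll, pitch, yaw) in radians

def fuse(prev: NavState, gnss_pos: tuple, local_vel: tuple,
         local_orient: tuple, alpha: float = 0.8) -> NavState:
    """Complementary blend: weight the fresh GNSS fix against the previous
    dead-reckoned position; take velocity/orientation from local sensors."""
    pos = tuple(alpha * g + (1 - alpha) * p
                for g, p in zip(gnss_pos, prev.position))
    return NavState(position=pos, velocity=local_vel, orientation=local_orient)
```

A production navigation unit would typically use a Kalman or particle filter with proper covariance handling; the sketch only shows which inputs feed which outputs.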
[0790] FIG. 38 shows a schematic of the method according to an embodiment of the twelfth aspect of the invention. Sensor raw data are generated by the multipurpose sensor system 95. The raw data are in the form of image data 96, motion data 97, measurement data 98, and global position data 99. The sensor raw data are provided to/received by the sensor raw data conditioning unit 104. Then, depending on the purpose for which the sensor raw data is to be used, the sensor raw data conditioning unit 104 conditions the sensor raw data. The sensor raw data is conditioned to form support data 101 and/or to form sensor data 103. The support data 101 is then used by the flight control support functionality 100 to autonomously support the flight control, and the sensor data 103 is recorded by the sensor data recording functionality 102 to enable a generation of a representation of the physical environment of the UAV.
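The purpose-dependent routing performed by the conditioning unit 104 can be sketched as follows. The field names and the concrete conditioning choices are hypothetical; the specification leaves the conditioning itself open.

```python
# Minimal sketch of FIG. 38: one raw-data record from the multipurpose
# sensor system 95 is conditioned either into support data 101 (for the
# flight control support functionality 100) or into sensor data 103 (for
# the sensor data recording functionality 102).
def condition(raw: dict, purpose: str) -> dict:
    if purpose == "flight_control":
        # Support data: keep only the low-latency fields flight control needs.
        return {"kind": "support",
                "motion": raw["motion"],
                "position": raw["position"]}
    if purpose == "recording":
        # Sensor data: keep the full payload for later environment reconstruction.
        return {"kind": "sensor",
                "image": raw["image"],
                "measurement": raw["measurement"]}
    raise ValueError(f"unknown purpose: {purpose}")

raw = {"image": b"...", "motion": (0.1, 0.0, -0.2),
       "measurement": 4.2, "position": (47.4, 8.5, 500.0)}
support = condition(raw, "flight_control")
sensor = condition(raw, "recording")
```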
[0791] FIG. 43 shows a system according to an embodiment of the fourteenth aspect of the invention for managing a set of UAV-batteries 109 used for powering a UAV 1. The system includes a mobile battery charger unit 110. The mobile battery charger unit 110 is designed for receiving at least one UAV-battery 43, 111. The mobile battery charger unit 110 is communicatively connected by its transceiver unit to the battery management terminal 112. The battery management terminal 112 is communicatively connected to the UAV 1. The battery management terminal 112 receives battery management data from the mobile battery charger unit 110 and the UAV 1, synchronizes the battery management data, and determines a charge state of the UAV-batteries. Each determined charge state is displayed 113.
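The synchronization step performed by the battery management terminal 112 can be sketched as a merge of the two report streams, keeping the most recent report per battery. The data shapes and the last-report-wins rule are assumptions for illustration.

```python
# Sketch of FIG. 43: the battery management terminal merges charge reports
# from the mobile battery charger unit 110 and from the UAV 1, then exposes
# one determined charge state per battery for display 113.
def synchronize(charger_reports: list, uav_reports: list) -> dict:
    """Return the latest known charge state per battery id."""
    latest = {}
    # Process reports in timestamp order so the newest one wins per battery.
    for report in sorted(charger_reports + uav_reports,
                         key=lambda r: r["timestamp"]):
        latest[report["battery_id"]] = report
    return {bid: r["charge"] for bid, r in latest.items()}

states = synchronize(
    [{"battery_id": "B1", "timestamp": 10, "charge": 0.40}],
    [{"battery_id": "B1", "timestamp": 20, "charge": 0.35},
     {"battery_id": "B2", "timestamp": 5, "charge": 0.90}],
)
# states == {"B1": 0.35, "B2": 0.90}
```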