Abstract
The invention relates to an unmanned aerial vehicle (UAV), the operation of a UAV, and the control of a UAV. Aspects of the invention relate to a UAV including a directional distance measuring module for inspecting/surveying/measuring/digitizing the UAV's environment.
Claims
1-227. (canceled)
228. A computer implemented method for providing a live-view of a UAV's physical environment, the method including: continuously generating a view of the physical environment of the UAV based on image data from a camera system of the UAV, the camera system including a plurality of cameras arranged peripherally at the UAV, the cameras each having a field of view with a fixed orientation in relation to the UAV and directed away from the UAV, and the camera system providing available image data of the plurality of cameras for generating an all-round view to the physical environment, continuously displaying the view of the physical environment in a live-view by a touch sensitive display, receiving and identifying a touch input indicative of a desired viewing direction in which the view of the physical environment is to be generated, and based thereon generating and displaying in the live-view a view of the physical environment in the desired viewing direction, wherein the image data is selected from the available image data based on the desired viewing direction, and the view of the physical environment in the desired viewing direction is generated and displayed in the live-view based on the selected image data.
229. The method according to claim 228, including receiving the touch input by the touch sensitive display, the touch sensitive display comprising a plurality of touch zones spread over the live-view, wherein the desired viewing direction is determined based on identifying the touch zone in which the touch input is received, each touch zone having assigned thereto predetermined image data selection information based on which the image data is selected from the available image data.
230. The method according to claim 228, including stitching the selected image data using an image stitching algorithm and based thereon generating and displaying in the live-view the view of the physical environment in the desired viewing direction.
231. The method according to claim 230, including correlating directional distance information, recorded by a directional distance measuring module of the UAV by measuring the physical environment, with the selected image data such that selected image data with depth information is generated, wherein the image stitching algorithm stitches the selected image data based on the depth information.
232. The method according to claim 231, including correcting a parallax-offset between cameras of the camera system based on the depth information and, based thereon, generating and displaying in the live-view the view of the physical environment in the desired viewing direction.
233. The method according to claim 228, including, by selecting the image data from the available image data based on the desired viewing direction, and generating and displaying in the live-view the view of the physical environment in the desired viewing direction based on the selected image data, providing a virtual gimbal functionality which enables the view to be virtually gimbaled by the touch input, wherein virtually gimbaling the view is decoupled from the movement of the UAV.
234. A computer program product comprising machine readable program code stored in a non-transitory machine-readable medium, which when executed by processing units related to a mobile control device having a touch sensitive display and/or a UAV enables providing a live-view of a UAV's physical environment according to the method of claim 228.
235. A system for controlling the flight of a UAV in a physical environment, the system including: a UAV having a camera system including a plurality of cameras arranged peripherally at the UAV, the cameras each having a field of view with a fixed orientation in relation to the UAV and directed away from the UAV, and the camera system providing available image data of the plurality of cameras for generating an all-round view to the physical environment, and a computer program product according to claim 234.
236. The system according to claim 235, the UAV having a directional distance measuring module recording directional distance information by measuring the physical environment.
237. The system according to claim 235, further including a mobile control device having a touch sensitive display.
238. The system according to claim 235, the camera system including a plurality of cameras arranged peripherally at the UAV, with each camera having a field of view with a fixed orientation in relation to the UAV and directed away from the UAV, one front camera facing forward, one top camera facing up, one bottom camera facing down, and at least one side camera facing sideways, wherein the cameras are arranged such that each field of view overlaps to a predefined degree with at least one adjacent field of view, and the camera system provides an all-round view to the physical environment and provides available image data of the plurality of cameras for generating the all-round view.
239. A computer implemented method for conditioning sensor raw data generated by a multipurpose sensor system of a UAV flying in a physical environment, the method including: generating sensor raw data by the multipurpose sensor system in the form of image data from a camera system of the UAV, motion data from an inertial measurement unit (IMU) of the UAV, measurement data, in particular 3D point data, from a directional distance measuring module of the UAV, in particular wherein the directional distance measuring module measures distances and directions to object surfaces based on the light detection and ranging (lidar) principle, and global position data from a GNSS receiver module of the UAV, providing a flight control support functionality using support data for supporting the flight control, and a sensor data recording functionality for recording sensor data, which enable a generation of a representation of the physical environment of the UAV, autonomously supporting, by the flight control support functionality, the control of the flight of the UAV, and recording, by the sensor data recording functionality, sensor data, which enable the generation of a representation of the physical environment of the UAV, wherein the sensor raw data is received by a sensor raw data conditioning unit of the UAV and conditioned, by the sensor raw data conditioning unit, to generate the support data and the sensor data, wherein at least one of the image data, the motion data, the measurement data, and the global position data is used to generate both the support data and the sensor data.
240. The method according to claim 239, the flight control support functionality including a visual inertial system (VIS), the visual inertial system using support data in the form of conditioned image data to derive motion data related to the movement/motion of the UAV based on tracking predetermined features in the image data.
241. The method according to claim 240, conditioning including conditioning the image data based on a criterion relating to using the image data by the VIS.
242. The method according to claim 239, wherein the sensor data enable a generation and display of a view of the physical environment of the UAV to a user, and the generation and display is based on sensor data in the form of conditioned image data.
243. The method according to claim 240, conditioning including conditioning the image data based on a criterion relating to generating sensor data based on the image data to enable a generation and display of a view of the physical environment of the UAV to a user.
244. The method according to claim 239, the flight control support functionality including a collision avoidance functionality, the collision avoidance functionality using support data in the form of conditioned measurement data to detect obstacles in the physical environment and avoid the obstacles.
245. The method according to claim 239, wherein the sensor data enable the generation and display of a view of the physical environment of the UAV to a user, and the generation and display is based on sensor data in the form of conditioned measurement data, wherein the conditioned measurement data is used for supporting a stitching of conditioned image data.
246. The method according to claim 245, conditioning including conditioning the measurement data based on a criterion relating to generating sensor data based on the measurement data to enable the generation and display of a view of the physical environment of the UAV to a user.
247. The method according to claim 239, wherein the sensor data enable a display of the representation of the physical environment of the UAV to a user, and the display is based on sensor data in the form of conditioned measurement data including 3D point data.
248. The method according to claim 247, conditioning including conditioning the measurement data based on a criterion relating to generating sensor data based on the measurement data to enable a display of the representation of the physical environment of the UAV to a user.
249. The method according to claim 239, the flight control support functionality using support data in the form of conditioned global position data for controlling the flight of the UAV in the physical environment.
250. The method according to claim 239, wherein the sensor data enable a display of the representation of the physical environment of the UAV to a user, and the display is based on sensor data in the form of conditioned global position data by using the conditioned global position data to assign a global position to the representation.
251. The method according to claim 250, conditioning including conditioning the global position data based on a criterion relating to generating sensor data based on the global position data to enable a display of the representation of the physical environment of the UAV to a user with the representation having assigned thereto a global position.
252. The method according to claim 239, the support data and the sensor data each including a combination of at least two of image data, motion data, measurement data, and global position data.
253. The method according to claim 239, the sensor raw data being generated at a predefined maximum rate and at a predefined maximum resolution, wherein conditioning the sensor raw data includes providing the sensor raw data at a predefined resolution and/or at a predefined rate based on a criterion relating to generating support data and/or sensor data from the sensor raw data.
254. A computer program product comprising machine readable program code stored in a non-transitory computer-readable medium, which when executed by processing units related to a mobile control device having a touch sensitive display and/or a UAV enables conditioning of sensor raw data according to the method of claim 239.
Description
BRIEF DESCRIPTION OF THE FIGURES OF THE INVENTION
[0616] The first to fifteenth aspects of the invention are described below in more detail purely by way of example with the aid of concrete exemplary embodiments of the first to fifteenth aspects of the invention illustrated schematically in the figures, further advantages of the first to fifteenth aspects of the invention also being examined. In detail:
[0617] FIG. 1 shows a mobile control device and a stroke-based touch input, and in a top view a UAV, an environment, and an object surface;
[0618] FIG. 2 shows a top view of an object surface and a movement of the UAV;
[0619] FIG. 3 relates to a stroke-based touch input;
[0620] FIG. 4 shows a top view of a UAV;
[0621] FIG. 5 shows a touch sensitive display and a two-finger pinch touch input and in a top view a UAV, an environment, and an object;
[0622] FIG. 6 shows a digitally scaled view in a live-view;
[0623] FIG. 7 shows an un-scaled view;
[0624] FIG. 8 shows a touch sensitive display and a two-finger pinch touch input and in a top view a UAV, an environment, and an object;
[0625] FIG. 9 shows a digitally scaled view in a live-view;
[0626] FIG. 10 shows an un-scaled view;
[0627] FIG. 11 relates to a two-finger pinch touch input;
[0628] FIG. 12 relates to a two-finger stroke touch input;
[0629] FIG. 13 shows in a top view a UAV, an environment and an object in the environment;
[0630] FIG. 14 shows a touch sensitive display and a single tap touch input and in a top view a UAV, an environment, and an object;
[0631] FIG. 15 shows a digitally rotated view in a live-view;
[0632] FIG. 16 shows a view after a rotational movement of a UAV;
[0633] FIG. 17 shows a UAV with its principal axes;
[0634] FIG. 18 shows a touch sensitive display including a plurality of touch zones;
[0635] FIG. 19 shows a touch sensitive display including a plurality of touch zones;
[0636] FIG. 20 shows a virtual camera position with view directions;
[0637] FIG. 21 shows a touch sensitive display receiving a one-finger stroke touch input;
[0638] FIG. 22 shows an adapted live-view;
[0639] FIG. 23 shows in a live-view, a view to the physical environment;
[0640] FIG. 24 relates to a one-finger stroke touch input;
[0641] FIG. 25 shows a virtual camera position with view directions;
[0642] FIG. 26 shows a touch sensitive display receiving a double tap touch input;
[0643] FIG. 27 shows an adapted live-view;
[0644] FIG. 28 shows a flying UAV with a mobile control device;
[0645] FIG. 29 shows a UAV flying in a physical environment;
[0646] FIG. 30 shows a UAV with an indicator light system;
[0647] FIG. 31 shows a cross section of a band-shaped protective frame;
[0648] FIG. 32 shows a linear indicator comprising a light source;
[0649] FIG. 33 shows a linear indicator comprising a plurality of light sources;
[0650] FIG. 34 shows a schematic of a workflow related to an autonomous navigation control unit;
[0651] FIG. 35 shows a top view of a UAV in a physical environment and different views, which can be generated and displayed in a live-view;
[0652] FIG. 36 shows a front view of a UAV in a physical environment and different views, which can be generated and displayed in a live-view;
[0653] FIG. 37 shows a schematic of a workflow related to displaying a view in a live-view;
[0654] FIG. 38 shows a schematic of a workflow related to the raw data conditioning unit;
[0655] FIG. 39 shows a schematic of a workflow related to the tracking functionality;
[0656] FIG. 40 shows a situation with a UAV scanning a façade of a building and being referenced to a reference coordinate system;
[0657] FIG. 41 shows a situation with a UAV moving propulsion-free around a corner of a building and being referenced to a reference coordinate system;
[0658] FIG. 42 shows a situation with a UAV scanning a façade of a building and being referenced to a reference coordinate system;
[0659] FIG. 43 shows a system for managing a set of UAV batteries;
[0660] FIG. 44 shows a UAV having a body, propulsion units and mounting structures;
[0661] FIG. 45 shows a schematic of a directional distance measuring module;
[0662] FIG. 46 shows a UAV with mounting structures having mounting parts with strut elements; and
[0663] FIG. 47 shows a UAV with rotatable mounting structures.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FIRST ASPECT OF THE INVENTION
[0664] FIG. 1 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, stroke-based touch inputs 8, and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on environment data, which are received from at least one sensor module of the UAV 1. The sensor module typically relates to a camera of a camera system of the UAV 1.
[0665] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0666] The UAV 1 includes a directional distance measuring module as sensor module. The directional distance measuring module is configured to measure distances and directions to surfaces/points on surfaces of the physical environment 2. Directions can be measured, for example, by determining a horizontal and vertical angle under which a distance measurement radiation is emitted towards a surface/point on a surface of the physical environment 2. A distance can be measured, for example, based on the time-of-flight principle using the emitted distance measurement radiation.
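The following minimal sketch (in Python, with illustrative names; not the patented implementation) shows how such a directional measurement, i.e. a time-of-flight distance together with the horizontal and vertical emission angles, could be converted into a 3D point in the UAV's coordinate frame:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_time_of_flight(round_trip_time_s: float) -> float:
    # Time-of-flight principle: the radiation travels to the surface and back,
    # so the one-way distance is half the round trip.
    return 0.5 * C * round_trip_time_s

def directional_measurement_to_point(distance_m, horizontal_angle_rad, vertical_angle_rad):
    # Spherical (distance, azimuth, elevation) -> Cartesian (x, y, z),
    # with x pointing along the module's zero direction.
    x = distance_m * math.cos(vertical_angle_rad) * math.cos(horizontal_angle_rad)
    y = distance_m * math.cos(vertical_angle_rad) * math.sin(horizontal_angle_rad)
    z = distance_m * math.sin(vertical_angle_rad)
    return (x, y, z)

# Example: a point measured with a 33.4 ns round trip (about 5 m away),
# 10 degrees to the left and level with the module.
d = distance_from_time_of_flight(33.4e-9)
p = directional_measurement_to_point(d, math.radians(10.0), 0.0)
```

Accumulating such points over many emission directions yields the 3D point cloud data discussed in the following paragraphs.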
[0667] Measuring surfaces/points on surfaces of the physical environment 2 enables measuring/surveying physical environments and objects in the physical environments. Physical environments 2 can relate, for example, to landscapes, acres, fields, forests, hillsides etc. Objects can relate, for example, to buildings, roads, bridges, humans, airplanes, objects of a construction site, tunnels etc.
[0668] Measuring/surveying physical environments 2 and objects in the physical environments relates to generating and recording measurement data, which enable a digital reconstruction of the measured/surveyed physical environment and/or objects. Such measurement data typically relate to 3D point cloud data, which enable the reconstruction in the form of a 3D point cloud.
[0669] The distance measuring module has a field of view, wherein physical environments 2 and objects, which are within the field of view, are measurable. The field of view has a main view direction 4. For a specific, for example optimized, measuring/surveying of physical environments 2 and objects, the field of view with its main view direction is aligned in a predefined way to the physical environment 2 or to an object, which is to be measured/surveyed. Thereby the alignment of the field of view of the distance measuring module is controlled by controlling the movement of the UAV 1.
[0670] FIG. 1 shows on the right side the UAV 1, which is flying in the physical environment 2. The UAV 1 is in front of a building with its distance measuring module facing the building. Thereby, the building is the object to be measured/surveyed. The façade of the building is the object surface 6. At least a part of the façade 5 is within the field of view of the distance measuring module. This at least a part of the façade 5 can be determined by measuring/surveying.
[0671] As shown in FIG. 2, if the touch sensitive display receives a stroke-based touch input 8 while the constant distance mode is activated, the UAV 1 positions and orients itself such that the main view direction 4 of the distance measuring module is aligned in a specific/predetermined way 10 with respect to the at least a part of the façade. Furthermore, the UAV 1 moves along the façade and, while moving, maintains a constant distance 11 to the façade and the specific/predetermined alignment with respect to the façade.
[0672] It can be provided that the UAV 1 is moving only as long as the stroke-based touch input 8 is being received. In other words, as soon as the touching of the touch sensitive display related to the stroke-based touch input 8 is interrupted or terminated, the UAV 1 is stopped.
[0673] FIG. 3 shows a stroke-based touch input 8. The touch sensitive display is touched by a finger, wherein, while maintaining the touching of the touch sensitive display with the finger, the finger is stroked across the touch sensitive display. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0674] The stroke-based touch input 8 can also relate to touching the touch sensitive display with two fingers in analogy to touching the touch sensitive display with one finger.
[0675] The movement of the UAV 1 along the façade, while maintaining a constant distance 11 to the façade and the specific/predetermined alignment with respect to the façade, is further based on the stroke progression 9. Based on the stroke progression 9, control commands can be derived for controlling the movement of the UAV 1. A control command can relate to, for example, a stroke direction 14, in which the UAV 1 moves along the façade, a velocity with which the UAV 1 moves along the façade etc.
[0676] The stroke-based touch input 8, shown in FIG. 3, includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the stroke-based touch input 8 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the stroke-based touch input 8.
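As a hedged illustration of how such control commands could be derived, the following Python sketch maps a stroke progression start state and end state, assumed here to be (x, y, timestamp) tuples in display coordinates, to a stroke direction 14 and a velocity; the sensitivity gain is hypothetical:

```python
import math

PIXELS_PER_METER_PER_SECOND = 200.0  # hypothetical sensitivity gain

def command_from_stroke(start_state, end_state):
    # Map one stroke segment to a (direction_rad, speed_m_s) command.
    (x0, y0, t0), (x1, y1, t1) = start_state, end_state
    dx, dy = x1 - x0, y1 - y0
    dt = max(t1 - t0, 1e-3)            # avoid division by zero
    direction = math.atan2(dy, dx)     # stroke direction 14
    speed = math.hypot(dx, dy) / dt / PIXELS_PER_METER_PER_SECOND
    return direction, speed

# While the finger keeps touching, successive start/end states refine the
# command; on interruption or termination of the touch, the speed drops to
# zero and the UAV is stopped.
```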
[0677] It can be provided that a selectability of the constant distance mode is provided to a user. The selectability of the constant distance mode can be based on a selectability of the at least a part of the façade 5, which is determined by the UAV 1. For example, the at least a part of the façade is captured by the distance measuring module of the UAV 1. As soon as the at least a part of the façade is determined, its selectability by touching the determined at least a part of the façade can be provided. If the determined at least a part of the façade is selected/touched, the constant distance mode can automatically be activated or a selectability of the constant distance mode can be provided to a user.
[0678] A selectability of the constant distance mode can also be triggered by, for example, geometric properties of the at least a part of the façade 5. For example, if the façade is the façade of a lighthouse, the façade is curved. Without moving, only a small part of the façade will be capturable by the distance measuring module. Based on a determined curvature of the at least a part of the façade, the constant distance mode can be selectable or automatically activated.
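A sketch, under the assumption that the captured part of the façade is available as 2D points in a horizontal section, of how such a curvature could be estimated; the algebraic (Kåsa) circle fit and the threshold are illustrative choices, not taken from the patent:

```python
import numpy as np

def facade_curvature(points_xy: np.ndarray) -> float:
    # Fit the circle x^2 + y^2 + D*x + E*y + F = 0 to N x 2 points and
    # return the curvature 1/radius.
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(D**2 / 4.0 + E**2 / 4.0 - F)
    return 1.0 / radius

CURVATURE_THRESHOLD = 0.02  # hypothetical: offer the mode above 1/50 per metre

def constant_distance_mode_selectable(points_xy: np.ndarray) -> bool:
    return facade_curvature(points_xy) > CURVATURE_THRESHOLD
```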
[0679] A geometric property of the façade can also relate to a discontinuity along the façade. A discontinuity can relate, for example, to a corner, which includes an abrupt direction-change in the course of the façade.
[0680] A selectability of the constant distance mode can be provided, for example, by overlaying a symbol such as a ruler or distance indicator over the determined at least a part of the façade 5.
[0681] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The directional distance measuring module is integrated in the front end 16 of the UAV 1. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE SECOND ASPECT OF THE INVENTION
[0682] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input, a two-finger stroke touch input, a one-finger stroke touch input, a single tap touch input, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0683] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0684] The touch sensitive display shown in FIG. 5 is receiving a two-finger pinch touch input 23 with two pinch points 24. While receiving the two-finger pinch touch input 23, the two pinch points 24 are moving across the touch sensitive display away from each other, which relates to generating a pinch point progression 25.
[0685] FIG. 6 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 5. The live-view in FIG. 6 results from adapting the live-view of FIG. 5 by digitally scaling it based on the pinch point progression 25.
[0686] FIG. 7 shows in a live-view a view, which is generated by the UAV after the UAV has moved towards the object in the physical environment, wherein the movement is based on the pinch point progression 25 and the pinch direction.
[0687] The touch sensitive display shown in FIG. 8 is receiving a two-finger pinch touch input 23 with two pinch points 24. While receiving the two-finger pinch touch input 23, the two pinch points 24 are moving across the touch sensitive display towards each other, which relates to generating a pinch point progression 25.
[0688] FIG. 9 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 8. The live-view in FIG. 9 results from adapting the live-view of FIG. 8 by digitally scaling it based on the pinch point progression 25.
[0689] FIG. 10 shows in a live-view a view, which is generated by the UAV after the UAV has moved away from the object in the physical environment, wherein the movement is based on the pinch point progression 25 and the pinch direction.
[0690] FIG. 11 relates to a two-finger pinch touch input 23. The touch sensitive display is touched by two fingers, wherein, while maintaining the touching of the touch sensitive display with the two fingers, the two fingers are moved across the touch sensitive display away from each other.
[0691] The movement of the UAV 1 towards or away from an object is based on the pinch point progression 25. Based on the pinch point progression 25, control commands can be derived for controlling the movement of the UAV 1. A control command can relate to, for example, a pinch direction, in which the UAV 1 moves towards or away from an object, a velocity with which the UAV 1 moves towards or away from an object etc.
[0692] The two-finger pinch touch input 23, to which FIG. 11 relates, includes a pinch point progression start state 26 and a pinch point progression end state 27. It can be provided that between the pinch point progression start state 26 and the pinch point progression end state 27 a plurality of further pinch point progression start states 26 and pinch point progression end states 27 are included, which add up to the two-finger pinch touch input 23 and can be used to derive, for example, the pinch point progression 25 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the two-finger pinch touch input 23.
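A sketch with an assumed data layout (each pinch point as an (x, y) pixel tuple) and a hypothetical gain, illustrating how a digital scaling factor and a towards/away motion command could be derived from the pinch point progression:

```python
import math

def pinch_command(start_a, start_b, end_a, end_b):
    # Return (scale_factor, speed_m_s); positive speed moves towards the object.
    def gap(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    scale = gap(end_a, end_b) / max(gap(start_a, start_b), 1.0)
    speed = 0.5 * (scale - 1.0)  # hypothetical gain: 0.5 m/s per unit of scale change
    return scale, speed

# scale > 1 (fingers spread, FIG. 5): the live-view is first digitally scaled,
# then the UAV moves towards the object until the un-scaled view (FIG. 7)
# replaces the scaled one; scale < 1 (fingers pinched, FIG. 8) mirrors this
# with a movement away from the object (FIG. 10).
```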
[0693] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE THIRD ASPECT OF THE INVENTION
[0694] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input, a single tap touch input, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0695] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0696] The touch sensitive display shown in FIG. 5 is receiving a two-finger stroke touch input 30.
[0697] FIG. 12 relates to a two-finger stroke touch input 30. The touch sensitive display is touched by two fingers, wherein, while maintaining the touching of the touch sensitive display with the two fingers, the fingers are stroked across the touch sensitive display. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0698] The two-finger stroke touch input 30 of FIG. 12 includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the two-finger stroke touch input 30 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the two-finger stroke touch input 30.
[0699] FIG. 13 shows a top view of a UAV 1 having a main view direction 28. Thereby, the view of the physical environment is displayed in a first view direction 29. Furthermore, FIG. 13 shows a direction 31 transverse to the first view direction.
[0700] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FOURTH ASPECT OF THE INVENTION
[0701] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input, a single tap touch input 32, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0702] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0703] The touch sensitive display shown in FIG. 14 is receiving a single tap touch input 32 with a tap point 33 at a location 35.
[0704] FIG. 15 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 14. The live-view in FIG. 15 results from adapting the live-view of FIG. 14 by digitally rotating it around at least one of the principal axes in order to centre the location 35 of the tap point 33 in the view.
[0705] FIG. 16 shows in a live-view a view, which is generated by the UAV after the UAV has moved based on a rotational motion pattern, by a rotational movement around the UAV's yaw axis, and based on digitally rotating the live-view of FIG. 14, in order to centre the location 35 of the tap point 33 in the view.
[0706] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically, a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0707] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0708] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
[0709] FIG. 20 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. Upon receiving a single tap touch input, the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
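A minimal sketch, assuming a pinhole model for the rendered live-view, of how the rotation needed to centre a tapped location could be computed; the focal lengths and principal point are illustrative virtual camera settings, not values from the patent:

```python
import math

def rotation_to_center(tap_u, tap_v, fx, fy, cx, cy):
    # Return (yaw_rad, pitch_rad) that bring the tapped pixel to the view centre.
    yaw = math.atan2(tap_u - cx, fx)      # digital rotation around the yaw axis
    pitch = -math.atan2(tap_v - cy, fy)   # digital rotation around the pitch axis
    return yaw, pitch

# Example: a tap right of and above the centre of a 1280 x 720 live-view.
yaw, pitch = rotation_to_center(960.0, 300.0, fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```

The view can first be adapted digitally by this rotation, after which the UAV may follow with a physical rotation around its principal axes, as described for FIG. 16.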
[0710] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FIFTH ASPECT OF THE INVENTION
[0711] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0712] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0713] The touch sensitive display shown in FIG. 5 or FIG. 21 is receiving a one-finger stroke touch input 39, with a stroke progression 9 and a stroke direction 14.
[0714] FIG. 22 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 21. The live-view in FIG. 22 results from adapting the live-view of FIG. 21 by digitally rotating it around at least one of the principal axes and along a stroke direction 14 based on the stroke progression 9.
[0715] FIG. 23 shows in a live-view a view, which is generated by the UAV after the UAV has moved around at least one of the principal axes, in particular around the UAV's yaw axis, and along the stroke direction 14, and based on digitally rotating the live-view of FIG. 21.
[0716] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically, a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0717] FIG. 24 relates to a one-finger stroke touch input 39. The touch sensitive display is touched by one finger, wherein, while maintaining the touching of the touch sensitive display with the finger, the finger is stroked across the touch sensitive display. Stroking across the touch sensitive display relates to generating the stroke progression 9.
[0718] The one-finger stroke touch input 39 of FIG. 24 includes a stroke progression start state 12 and a stroke progression end state 13. It can be provided that between the stroke progression start state 12 and the stroke progression end state 13 a plurality of further stroke progression start states 12 and stroke progression end states 13 are included, which add up to the one-finger stroke touch input 39 and can be used to derive, for example, the stroke progression 9 and, based thereon, control commands for controlling the movement of the UAV 1 during receiving the one-finger stroke touch input 39.
[0719] FIG. 25 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. While receiving a one-finger stroke touch input the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
[0720] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE SIXTH ASPECT OF THE INVENTION
[0721] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input 40 etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0722] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0723] The touch sensitive display shown in FIG. 5 or FIG. 26 is receiving a double tap touch input 40 with a tap point 33.
[0724] FIG. 27 shows an adapted live-view in comparison to the live-view 7 shown in FIG. 26. The live-view in FIG. 27 results from adapting the live-view of FIG. 26 by digitally rotating it around at least one of the principal axes, after the UAV has moved by a rotational movement around at least one of its principal axes 34, 34, 34 and by a predetermined amount along the tap direction 41, in order to centre the location 35 of the tap point 33 in the view.
[0725] FIG. 17 shows a UAV 1 with its principal axes 34, 34, and 34. More specifically, a UAV 1 with its yaw axis 34, roll axis 34 and pitch axis 34.
[0726] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0727] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
[0728] FIG. 20 shows a virtual camera position 37 from which the view of the physical environment is continuously displayed in the view direction 38. Upon receiving a double tap touch input, the view is digitally rotated around the UAV's yaw axis 34. Then the view is continuously displayed from the virtual camera position 37 in the digitally rotated view direction 38.
[0729] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE SEVENTH ASPECT OF THE INVENTION
[0730] FIG. 4 shows a top view of a UAV 1 having a body 15 with a front end 16 and a back end 17. The UAV 1 includes a front camera 19 and side cameras 22, which are arranged at locations where the shrouding of the UAV 1 connects to strut elements of the mounting structures. On the upper side, with respect to a flying position, a top camera 20 is arranged on the body 15. On the bottom side, with respect to a flying position, a bottom camera 21 is arranged on the body 15. Each camera has a field of view 18 with a fixed orientation in relation to the UAV 1. The UAV 1 further has four propulsion units realized in the form of rotary wing/propeller units. Two propulsion units are arranged on the left side and two on the right side of the body 15.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE EIGHTH ASPECT OF THE INVENTION
[0731] FIG. 28 shows a flying UAV 1, which is powered by a single battery 43. While flying, the UAV continuously provides battery charge level information 42, related to the battery 43, either to a computing unit of the UAV or to a mobile control device 3 for determining a most recent battery charge level 44. The mobile control device 3 has a touch sensitive display and displays a charge level notification 45 to a user of the UAV 1 based on the battery charge level 44.
[0732] In case the battery charge level 44 is determined to be at a level which necessitates swapping the battery 43, the UAV 1 is instructed to autonomously move to a predetermined location. The location can be, for example, the location from where the UAV has taken off/been launched, or the location from where a user is controlling the UAV. The UAV is then further instructed to land at the location, such that the battery 43 can be replaced within a predetermined time window. The predetermined time window depends strongly on the amount of time during which the UAV can be powered by capacitor power. The predetermined time window also depends on which and how many of the power consuming units of the UAV have to be supplied by capacitor power while the battery is being replaced/swapped.
[0733] After the UAV has landed at the location, typically the propulsion units, the cameras 19, 20, 21, 22 (or at least one of the cameras) and at least the most power consuming sensor module, for example the directional distance measuring module, are deactivated.
[0734] At the latest when the battery is disconnected from the UAV, the UAV switches from the battery powered supply mode to the capacitor powered supply mode, such that an uninterrupted power supply is provided.
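The hand-over can be pictured with the following sketch; the unit names and the set of essential consumers are assumptions, guided by the deactivation described above:

```python
ESSENTIAL_UNITS = {"flight_controller", "memory"}  # assumed minimal consumers

class PoweringSystem:
    def __init__(self, unit_names):
        self.mode = "battery"
        self.powered = {name: True for name in unit_names}

    def on_battery_disconnected(self):
        # Switch to the capacitor powered supply mode and shed the most power
        # consuming units (propulsion, cameras, directional distance measuring
        # module) so the capacitor 48 can bridge the swap time window.
        self.mode = "capacitor"
        for name in self.powered:
            self.powered[name] = name in ESSENTIAL_UNITS

    def on_battery_connected(self):
        # Back on battery power: restore full operation.
        self.mode = "battery"
        for name in self.powered:
            self.powered[name] = True
```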
[0735] After the battery has been replaced with a fully charged battery, the UAV can provide an operability which relates to the UAV autonomously moving back to the location where it was before returning for the battery replacement. For example, if the UAV has been on a flight-mission for inspecting/surveying/measuring/digitizing the UAV's environment, the UAV can provide an operability which makes the UAV autonomously continue the flight-mission after the battery has been replaced.
[0736] After the battery has been replaced with a fully charged battery, the UAV can provide an operability which relates to the UAV automatically transmitting recorded flight data to a storage, for example a cloud storage, or to the mobile control device.
[0737] The recorded flight data can relate to, for example, sensor data relating to inspecting/surveying/measuring/digitizing the UAV's environment.
[0738] As indicated in FIG. 28, the UAV has integrated inside the body of the UAV a UAV powering system 47, a capacitor 48 and a battery charge level information generator 49.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE NINTH ASPECT OF THE INVENTION
[0739] FIG. 29 and FIG. 30 show a UAV 1 flying in a physical environment 2. The UAV has a body 15, a first plurality of propulsion units 50 arranged on the left side of the body 15 and a second plurality of propulsion units 51 arranged on the right side of the body 15. The UAV 1 further has a first protective frame 52 running curved 53 around a portion of an outer edge 54 of the first plurality of propulsion units and a second protective frame 55 running curved 56 around a portion of an outer edge 57 of the second plurality of propulsion units.
[0740] The first protective frame forms a front left corner section 58 and a rear left corner section 59. The second protective frame 55 forms a front right corner section 60 and a rear right corner section 61.
[0741] The UAV indicator light system includes a first linear indicator 62 for emitting light and running curved around a portion of an outer edge of the front left propulsion unit and along the first protective frame in the front left corner section.
[0742] The UAV indicator light system includes a second linear indicator 63 for emitting light and running curved around a portion of an outer edge of the rear left propulsion unit and along the first protective frame in the rear left corner section.
[0743] The UAV indicator light system includes a third linear indicator 64 for emitting light and running curved around a portion of an outer edge of the front right propulsion unit and along the second protective frame in the front right corner section.
[0744] The UAV indicator light system includes a fourth linear indicator 65 for emitting light and running curved around a portion of an outer edge of the rear right propulsion unit and along the second protective frame in the rear right corner section.
[0745] Thereby, each linear indicator is arranged to emit light away from the UAV and towards the ground 66 into a confined emission sector 67, 67, 67, 67, and enables a variable emission of light.
[0746] The UAV indicator light system shown in FIG. 29 and FIG. 30 enables an orientation-specific user perception of the UAV, and an indicating of a UAV-status to a user.
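Both roles can be sketched as follows; the statuses, colours and brightness values are purely illustrative and not taken from the patent:

```python
CORNERS = ("front_left", "front_right", "rear_left", "rear_right")

STATUS_PATTERNS = {  # hypothetical status-to-pattern mapping
    "ready": {"colour": "green", "blink_hz": 0.0},
    "low_battery": {"colour": "orange", "blink_hz": 1.0},
    "error": {"colour": "red", "blink_hz": 4.0},
}

def indicator_commands(status: str):
    # One emission command per corner indicator / confined emission sector 67.
    pattern = STATUS_PATTERNS[status]
    return {
        corner: {
            **pattern,
            # Emphasise the front corners so a user on the ground can
            # perceive the UAV's orientation at a glance.
            "brightness": 1.0 if corner.startswith("front") else 0.4,
        }
        for corner in CORNERS
    }
```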
[0747] The UAV shown in FIG. 29 has a fifth linear indicator 68 for emitting light and running curved around an outer section of the body.
[0748] FIG. 31 shows a cross section of a band-shaped protective frame. The band-shaped protective frame is bulged in a first direction 69 transverse to its running direction. A linear indicator can be arranged at the protective frame at a location being offset, in a second direction 70 transverse to the protective frame's running direction and to the first direction 69, from a maximum bulging 71 of the protective frame.
[0749] FIG. 32 shows a linear indicator comprising at least one light source 72, 72, wherein the linear indicator is configured to linearly guide light, emitted by the at least one light source 72, 72, curved around a portion of an outer edge of one of a plurality of propulsion units and along a protective frame in a corner section.
[0750] FIG. 33 shows a plurality of light sources 72, 72, wherein the plurality of light sources is arranged to linearly run curved around a portion of an outer edge of one of a plurality of propulsion units and along a protective frame in a corner section.
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE TENTH ASPECT OF THE INVENTION
[0751] FIG. 28 and FIG. 29 show a typical UAV 1 flying in a physical environment 2.
[0752] FIG. 34 shows a schematic of how an autonomous navigation control unit 73 of the UAV determines the UAV's position 78, flight-velocity 79 and orientation 80 using GNSS positioning signals 74 received from a GNSS receiver module 75 (which receives the GNSS positioning signals from GNSS satellites 81), environment data 76 and local navigation sensor signals 77.
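As a deliberately simplified sketch (not the patented estimator), the fusion can be pictured as a complementary filter that dead-reckons with local navigation sensor signals and blends in GNSS fixes; the gain is hypothetical, and effects such as gravity compensation and orientation estimation are omitted:

```python
import numpy as np

class NavigationFilter:
    def __init__(self, gnss_weight=0.02):    # hypothetical blending gain
        self.position = np.zeros(3)           # position 78
        self.velocity = np.zeros(3)           # flight-velocity 79
        self.gnss_weight = gnss_weight

    def predict(self, acceleration, dt):
        # Dead-reckon with inertial data between GNSS fixes.
        self.velocity += acceleration * dt
        self.position += self.velocity * dt

    def correct(self, gnss_position):
        # Blend the GNSS fix 74 into the dead-reckoned position.
        w = self.gnss_weight
        self.position = (1.0 - w) * self.position + w * gnss_position
```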
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE ELEVENTH ASPECT OF THE INVENTION
[0753] FIG. 35 and FIG. 36 show a UAV 1 in a physical environment 2. FIG. 35 and FIG. 36 further show schematically an all-round view 83, which is generated based on available image data generated by a front camera 19, a top camera 20, a bottom camera 21, and side cameras 22. Each of the cameras has a field of view 18 with a fixed orientation in relation to the UAV and directed away from the UAV.
[0754] FIG. 35 and FIG. 36 further show different views 82, 82, 82, 82, 82, 82, which can be generated and continuously displayed in a live-view.
[0755] FIG. 35 shows a top view of the UAV 1. FIG. 36 shows a front view of the UAV. Thereby, the same reference numbers in FIG. 35 and FIG. 36 do not necessarily relate to the same feature. For example, the view 82 in FIG. 35 does not relate to the same view 82 in FIG. 36.
[0756] FIG. 5 shows on the left side a typical mobile control device 3 having a touch sensitive display. The touch sensitive display is configured to receive touch inputs, for example, a two-finger pinch touch input 23, a stroke-based touch input 8, a two-finger stroke touch input 30, a one-finger stroke touch input 39, a single tap touch input 32, a double tap touch input 40 etc., and to display a view of the physical environment 2 in a live view 7. Thereby, the view of the physical environment 2 is generated based on image data, which are received from a camera system of the UAV 1.
[0757] The mobile control device 3 is configured to control the flight and operation of a UAV 1. To this end, a communicative connection between the mobile control device and the UAV is established. Touch inputs, which are received by the touch sensitive display, are typically indicative of moving the UAV 1 in the physical environment 2.
[0758] For example, in case a touch input which indicates a desired viewing direction 84, 84, 84 is received by the touch sensitive display and identified by the mobile control device, a view of the physical environment in this desired viewing direction is generated and displayed in a live-view. Thereby, based on the desired viewing direction, the image data which is needed for generating the desired view in the desired viewing direction is selected from the available image data. A touch input can be, for example, a single tap touch input.
[0759] FIG. 18 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are strip shaped.
[0760] FIG. 19 shows a touch sensitive display including a plurality of touch zones 36 spread over the live-view. The touch zones are square shaped.
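Following claim 229, each touch zone can carry predetermined image data selection information, so that identifying the touched zone directly yields the cameras whose image data is needed. The zone geometry (three left/centre/right strips) and the camera names in this sketch are assumptions:

```python
ZONE_SELECTION = {  # predetermined image data selection information per zone
    "left": {"cameras": ("side_left", "front"), "viewing_direction": "left"},
    "centre": {"cameras": ("front",), "viewing_direction": "forward"},
    "right": {"cameras": ("side_right", "front"), "viewing_direction": "right"},
}

def zone_for_touch(x_px: float, display_width_px: float) -> str:
    # Strip shaped zones as in FIG. 18: the display split into three strips.
    third = display_width_px / 3.0
    if x_px < third:
        return "left"
    if x_px < 2.0 * third:
        return "centre"
    return "right"

selection = ZONE_SELECTION[zone_for_touch(100.0, 1280.0)]  # a left-facing view
```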
[0761] FIG. 37 shows a schematic of a method according to an embodiment of the eleventh aspect of the invention. A user input 85 related to indicating a desired viewing direction is generated and received. A virtual camera position and the desired viewing direction are computed 86 using the position and orientation of the drone 91. The plurality of cameras 92 provides the available image data. Therefrom, image data is selected 87. The selected image data is then stitched 88 using an image stitching algorithm. The stitching algorithm can make use of directional distance information 93, which is recorded by a directional distance measuring module of the UAV by measuring the physical environment. Thereby, the directional distance information is correlated with the selected image data such that selected image data with depth information is generated. The image stitching algorithm then stitches the selected image data with the aid of the depth information. Then, the stitched image data is cropped and rendered 89 according to virtual camera settings 94, for example resolution, distortion etc. Finally, the cropped and rendered image data is streamed 90 to the mobile control device to be displayed in a live-view.
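The pipeline of FIG. 37 can be summarised in code as follows; all stage functions are stubs with assumed interfaces that stand in for the computation, selection, stitching, rendering and streaming stages, not real library calls:

```python
def compute_virtual_camera(user_input, drone_pose):
    # 86: virtual camera position and desired viewing direction, computed
    # using the position and orientation of the drone (91).
    return {"position": drone_pose["position"], "direction": user_input["direction"]}

def select_image_data(available_images, virtual_pose):
    # 87: keep only the images needed for the desired viewing direction.
    return [img for img in available_images if img["facing"] == virtual_pose["direction"]]

def stitch(images, depth):
    # 88: stitch with the aid of depth information derived by correlating the
    # directional distance information (93) with the selected image data.
    return {"panorama": images, "depth": depth}

def crop_and_render(stitched, settings):
    # 89: crop and render according to the virtual camera settings (94).
    return {"view": stitched, "resolution": settings["resolution"]}

def live_view_frame(user_input, drone_pose, available_images, depth, settings):
    virtual_pose = compute_virtual_camera(user_input, drone_pose)
    selected = select_image_data(available_images, virtual_pose)
    stitched = stitch(selected, depth)
    frame = crop_and_render(stitched, settings)
    return frame  # 90: this frame would then be streamed to the live-view
```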
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE TWELFTH ASPECT OF THE INVENTION
[0762] FIG. 1, FIG. 4, FIG. 17 all show a UAV 1 flying in a physical environment 2.
[0763] FIG. 38 shows a schematic of the method according to an embodiment of the twelfth aspect of the invention. Sensor raw data are generated by the multipurpose sensor system 95. The raw data are in the form of image data 96, motion data 97, measurement data 98 and global position data 99. The sensor raw data are provided to/received by the sensor raw data conditioning unit 104. Then, depending on the purpose for which the sensor raw data is to be used, the sensor raw data conditioning unit 104 conditions the sensor raw data. The sensor raw data is conditioned to form support data 101 and/or to form sensor data 103. The support data 101 is then used by the flight control support functionality 100 to autonomously support the flight control, and the sensor data 103 is recorded by the sensor data recording functionality 102 to enable a generation of a representation of the physical environment of the UAV.
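A sketch of this dual conditioning, with hypothetical rates and resolutions chosen only to illustrate the idea of claim 253 (providing the raw data at a predefined resolution and/or rate per purpose):

```python
def downsample(frame, factor):
    # Reduce the resolution of a frame (a nested list of pixel rows) by an
    # integer factor.
    return [row[::factor] for row in frame[::factor]]

def condition_image_frame(frame, frame_index):
    # Support data 101: every frame at quarter resolution, e.g. for feature
    # tracking by a visual inertial system.
    support_data = downsample(frame, 4)
    # Sensor data 103: full resolution, but only every 6th frame is recorded.
    sensor_data = frame if frame_index % 6 == 0 else None
    return support_data, sensor_data
```

The same raw frame thus serves both the flight control support functionality 100 and the sensor data recording functionality 102.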
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE THIRTEENTH ASPECT OF THE INVENTION
[0764] FIG. 40, FIG. 41 and FIG. 42 all show a UAV 1 in a physical environment 2.
[0765] FIG. 39 shows a schematic of the method according to an embodiment of the thirteenth aspect of the invention. The multipurpose sensor system 95 of the UAV 1 generates sensor data. The sensor data are image data 96, motion data 97, measurement data 98 and global position data 99. The referencing and tracking functionality 105 uses sensor data for referencing and tracking the UAV's position and orientation. The referencing and tracking functionality 105 has modes in which it operates. A first mode 107 is activated while the UAV is flying and, for example, performing a measurement task. While the UAV is flying, typically, as many of the sensor data are used by the referencing and tracking functionality 105 as needed to track the position and orientation of the UAV with a predefined quality. The referencing and tracking functionality 105 further has a second mode 108. The second mode is activated while the UAV is moving propulsion-free, for example when the UAV is being carried by a user from a first location to a second location. While the UAV is moving propulsion-free, typically, only the sensor data which are needed for keeping the position and orientation of the UAV referenced to the reference coordinate system 106 are used, for example, only motion data 97, or motion data 97 and image data 96, or motion data 97 and global position data 99 etc. Sensor data which cannot reliably be generated while the UAV is moving propulsion-free, for example because the UAV is being carried in a bag, can be left unconsidered by the referencing and tracking functionality. Thereby, also the power consumption of the UAV can be optimised in such situations.
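A sketch of this mode-dependent stream selection, with assumed flags for whether a stream can reliably be generated:

```python
def select_sensor_streams(flying: bool, camera_usable: bool, gnss_usable: bool):
    if flying:
        # First mode 107: use as many streams as needed for the predefined
        # tracking quality.
        return {"motion", "image", "measurement", "global_position"}
    # Second mode 108: keep the reference to the coordinate system 106 alive
    # with a minimal set, dropping streams that cannot reliably be generated
    # (e.g. image data while the UAV is carried in a bag), which also reduces
    # power consumption.
    streams = {"motion"}
    if camera_usable:
        streams.add("image")
    if gnss_usable:
        streams.add("global_position")
    return streams
```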
[0766] FIG. 40, FIG. 41 and FIG. 42 show a situation, where the method according to an embodiment of the thirteenth aspect of the invention enables an uninterrupted continuous tracking of the UAV's position and orientation in a reference coordinate system 106.
[0767] In FIG. 41, the UAV 1 includes a directional distance measuring module. The UAV 1 is measuring a first faade of the building. Therefore, the UAV is referenced to the reference coordinate system 106.
[0768] After the measurement is completed, in FIG. 41, the UAV 1 is carried, for example by a user, around a corner of the building. While being carried, the UAV 1 is moving propulsion-free and is uninterruptedly tracked by the referencing and tracking functionality 105.
[0769] Around the corner, in front of another façade of the building, in FIG. 42, the UAV 1 is launched and continues measuring the building. Thereby, all the measurement data which is recorded is referenced to the same reference coordinate system 106, without necessitating a fresh initial referencing procedure.
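Why no fresh referencing is needed can be illustrated with a minimal sketch: the tracker integrates the pose continuously, so measurements taken before and after the carry land in the same frame 106. The pose representation and update are illustrative assumptions; orientation handling is omitted for brevity.

```python
import numpy as np

class PoseTracker:
    """Keeps the UAV pose in the reference coordinate system 106 across
    flight and propulsion-free phases (position only, for brevity)."""
    def __init__(self):
        self.position = np.zeros(3)            # referenced once, at the start

    def integrate_motion(self, velocity, dt):
        # Dead reckoning from motion data 97 while flying or being carried.
        self.position = self.position + velocity * dt

    def to_reference_frame(self, point_in_body_frame):
        # Measurement data recorded at any time maps into frame 106.
        return self.position + point_in_body_frame

tracker = PoseTracker()
tracker.integrate_motion(np.array([0.5, 0.0, 0.0]), dt=10.0)   # carried 5 m
print(tracker.to_reference_frame(np.array([0.0, 2.0, 0.0])))   # -> [5. 2. 0.]
```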
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FOURTEENTH ASPECT OF THE INVENTION
[0770] FIG. 43 shows a system according to an embodiment of the fourteenth aspect of the invention for managing a set of UAV-batteries 109 used for powering a UAV 1. The system includes a mobile battery charger unit 110. The mobile battery charger unit 110 is designed for receiving at least one UAV-battery 43, 111. The mobile battery charger unit 110 is communicatively connected by its transceiver unit to the battery management terminal 112. The battery management terminal 112 is communicatively connected to the UAV 1. The battery management terminal 112 receives battery management data from the mobile battery charger unit 110 and the UAV 1, synchronizes the battery management data and determines a charge state of the UAV-batteries. Each determined charge state is displayed 113.
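A minimal sketch of the synchronization performed by the terminal 112, assuming each report carries a battery identifier, a charge state and a timestamp, and that the freshest report per battery wins; these field names and the merge rule are illustrative assumptions.

```python
def synchronize(reports):
    """Battery management terminal 112: merge battery management data
    from the charger unit 110 and the UAV 1, keeping the most recent
    charge state per battery for display 113."""
    latest = {}
    for battery_id, charge_state, timestamp in reports:
        if battery_id not in latest or timestamp > latest[battery_id][1]:
            latest[battery_id] = (charge_state, timestamp)
    return {bid: cs for bid, (cs, _) in latest.items()}

# Usage: the charger reports battery "A" at 80 %, the UAV later reports 72 %.
reports = [("A", 0.80, 100.0), ("B", 1.00, 101.0), ("A", 0.72, 205.0)]
print(synchronize(reports))   # {'A': 0.72, 'B': 1.0}
```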
DETAILED DESCRIPTION OF THE FIGURES ACCORDING TO THE FIFTEENTH ASPECT OF THE INVENTION
[0771] FIG. 44 shows a UAV 1 according to an embodiment of the fifteenth aspect of the invention. The UAV 1 has a body 114, 15. The body extends along an axis from its front end 16, 115 to its back end 17, 116. The body further has a housing 117, whose material in the front end is different from its material at the back end. The UAV has a first mounting structure 118 on the left side of the body and attached to the body, and a second mounting structure 118 on the right side of the body and attached to the body. Furthermore, the UAV includes four propulsion units 119, 119, 119, 119 for flying. In the front end of the body, a directional distance measuring module 120 is integrated. Therefore, the housing's material in the front end of the body differs from the housing's other material: the material at the front end is such that distance measurement radiation 121 can be emitted and received through it.
[0772] FIG. 45 shows a schematic of the directional distance measuring module 120 integrated in the front end of the UAV 1. The directional distance measuring module has a deflector unit 122, which deflects distance measurement radiation 121 from the distance measurement radiation source 123 through the housing 117 into the field of view. The deflector unit 122 furthermore deflects distance measurement radiation 121 reflected from a surface back through the housing to the detector unit. The deflector unit 122 is rotatable around a first rotation axis 124 and around a second rotation axis 125 being transverse to the first rotation axis.
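The emission geometry can be sketched as a composition of two rotations. Assuming, purely for illustration, that the first rotation axis 124 is the body z-axis, the second, transverse axis 125 the body y-axis, and that the undeflected beam leaves along the body x-axis out of the front end 16:

```python
import numpy as np

def beam_direction(angle_first_rad, angle_second_rad):
    """Direction of the deflected distance measurement radiation 121 for
    given rotations about axis 124 (assumed z) and axis 125 (assumed y)."""
    def rot_z(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    forward = np.array([1.0, 0.0, 0.0])     # out of the front end 16
    return rot_z(angle_first_rad) @ rot_y(angle_second_rad) @ forward

print(beam_direction(0.0, 0.0))           # straight ahead: [1. 0. 0.]
print(beam_direction(np.pi / 2, 0.0))     # 90 deg about axis 124: [0. 1. 0.]
```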
[0774] FIG. 46 shows a UAV 1 with mounting structures having mounting parts 126, 126, where the propulsion units are mounted. Each of the mounting parts has at least one strut element 127 having a hollow interior, wherein the hollow interior forms a hidden cable routing from a propulsion unit to the body.
[0775] FIG. 47 shows a UAV 1, where the mounting structures are attached to the body such that the mounting structures are rotatable around the axis along which the body extends, from a first snap-in position 128 to a second position 129, in particular to a second snap-in position. Landing support structures 130 are located at the protective frames. The landing support structures 130 are located such that with the mounting structures in the second position 129, the landing support structures intertwine 131.
[0776] FIG. 35 and FIG. 36 show a UAV 1 in a physical environment 2. FIG. 35 and FIG. 36 further show schematically an all-round view 83, which is generated based on available image data generated by a front camera 19, a top camera 20, a bottom camera 21, and side cameras 22. Each of the cameras has a field of view 18 with a fixed orientation in relation to the UAV and directed away from the UAV.
[0777] FIG. 35 and FIG. 36 further show different views 82, 82, 82, 82, 82, 82, which can be generated and continuously displayed in a live-view.
[0778] FIG. 35 shows a top view of the UAV 1. FIG. 36 shows a front view of the UAV. Thereby, the same reference numbers in FIG. 35 and FIG. 36 do not necessarily relate to the same feature. For example, the view 82 in FIG. 35 does not relate to the same view 82 in FIG. 36.
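The desired viewing direction for such a view 82 can be derived from a touch input on touch zones spread over the live-view. A minimal sketch follows, assuming the display is divided into a 3×3 grid of touch zones, each carrying pre-assigned image data selection information in the form of a camera subset; the zone layout and camera names are illustrative assumptions.

```python
# Assumed zone layout: each zone maps to the cameras whose image data
# is selected when that zone is touched.
ZONE_TO_CAMERAS = {
    (0, 0): ("top", "left"),    (0, 1): ("top",),    (0, 2): ("top", "right"),
    (1, 0): ("left",),          (1, 1): ("front",),  (1, 2): ("right",),
    (2, 0): ("bottom", "left"), (2, 1): ("bottom",), (2, 2): ("bottom", "right"),
}

def cameras_for_touch(x, y, width, height):
    """Identify the touch zone of a touch input at pixel (x, y) and
    return the pre-assigned camera subset for that zone."""
    row = min(2, int(3 * y / height))
    col = min(2, int(3 * x / width))
    return ZONE_TO_CAMERAS[(row, col)]

# Usage: a touch in the upper-right corner of a 1920x1080 display.
print(cameras_for_touch(1900, 50, 1920, 1080))   # -> ('top', 'right')
```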
[0779] FIG. 28 shows a flying UAV 1, which is powered by a single battery 43. While flying, the UAV continuously provides battery charge level information 42, related to the battery 43, either to a computing unit of the UAV or to a mobile control device 3 for determining a most recent battery charge level 44. The mobile control device 3 has a touch sensitive display and displays a charge level notification 45 to a user of the UAV 1 based on the battery charge level 44.
[0780] As indicated in FIG. 28, the UAV has, integrated inside its body, a UAV powering system 47, a capacitor 48 and a battery charge level information generator 49.
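A minimal sketch of mapping the most recent battery charge level 44 to the charge level notification 45; the thresholds and message texts are illustrative assumptions.

```python
def charge_level_notification(charge_level: float) -> str:
    """Map the battery charge level 44 to the charge level notification 45
    shown on the mobile control device 3."""
    if charge_level <= 0.10:
        return "Battery critical - land immediately"
    if charge_level <= 0.25:
        return "Battery low - return recommended"
    return f"Battery at {charge_level:.0%}"

print(charge_level_notification(0.22))   # Battery low - return recommended
```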
[0781] FIG. 29 and FIG. 30 show a UAV 1 flying in a physical environment 2. The UAV has a body 15, a first plurality of propulsion units 50 arranged on the left side of the body 15 and a second plurality of propulsion units 51 arranged on the right side of the body 15. The UAV 1 further has a first protective frame 52 running curved 53 around a portion of an outer edge 54 of the first plurality of propulsion units and a second protective frame 55 running curved 56 around a portion of an outer edge 57 of the second plurality of propulsion units.
[0782] The first protective frame forms a front left corner section 58 and a rear left corner section 59. The second protective frame 55 forms a front right corner section 60 and a rear right corner section 61.
[0783] The UAV indicator light system includes a first linear indicator 62 for emitting light and running curved around a portion of an outer edge of the front left propulsion unit and along the first protective frame in the front left corner section.
[0784] The UAV indicator light system includes a second linear indicator 63 for emitting light and running curved around a portion of an outer edge of the rear left propulsion unit and along the first protective frame in the rear left corner section.
[0785] The UAV indicator light system includes a third linear indicator 64 for emitting light and running curved around a portion of an outer edge of the front right propulsion unit and along the second protective frame in the front right corner section.
[0786] The UAV indicator light system includes a fourth linear indicator 65 for emitting light and running curved around a portion of an outer edge of the rear right propulsion unit and along the second protective frame in the rear right corner section.
[0787] Thereby, each linear indicator is arranged to emit light away from the UAV and towards the ground 66 into a confined emission sector 67, 67, 67, 67, and enables a variable emission of light.
[0788] The UAV indicator light system shown in FIG. 29 and FIG. 30 enables an orientation-specific user perception of the UAV, and an indicating of a UAV-status to a user.
[0789] The UAV shown in FIG. 29 has a fifth linear indicator 68 for emitting light and running curved around an outer section of the body.
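How the four corner indicators 62-65 can jointly convey both orientation and status is sketched below. The colour and pattern assignments are illustrative assumptions, not the signalling scheme of the invention.

```python
INDICATORS = ("front_left", "rear_left", "front_right", "rear_right")

def indicator_pattern(uav_status: str) -> dict:
    """Return a colour per linear indicator so that a user on the ground 66
    can read off both the UAV's orientation and a UAV-status."""
    if uav_status == "nominal":
        # Front indicators white, rear indicators red: an orientation cue.
        return {"front_left": "white", "front_right": "white",
                "rear_left": "red", "rear_right": "red"}
    if uav_status == "warning":
        return {name: "orange" for name in INDICATORS}
    return {name: "red_blinking" for name in INDICATORS}   # failure case

print(indicator_pattern("nominal")["front_left"])   # white
```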
[0790] FIG. 34 shows a schematic of how an autonomous navigation control unit 73 of the UAV determines the UAV's position 78, flight-velocity 79 and orientation 80 using GNSS positioning signals 74, received by a GNSS receiver module 75 from GNSS satellites 81, as well as environment data 76 and local navigation sensor signals 77.
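A minimal sketch of such a fusion as a complementary filter, in which GNSS positions 74 correct a position integrated from local navigation sensor signals 77. This stands in for the actual navigation algorithm, which the description does not specify; the weighting and the omission of orientation 80 are simplifications of the sketch.

```python
import numpy as np

class NavigationFilter:
    """Fuses dead reckoning with GNSS corrections (position and velocity
    only; orientation 80 is omitted for brevity)."""
    def __init__(self, gnss_weight=0.02):
        self.position = np.zeros(3)     # estimate of position 78
        self.velocity = np.zeros(3)     # estimate of flight-velocity 79
        self.gnss_weight = gnss_weight

    def predict(self, acceleration, dt):
        # Dead reckoning from local navigation sensor signals 77.
        self.velocity = self.velocity + acceleration * dt
        self.position = self.position + self.velocity * dt

    def correct(self, gnss_position):
        # Blend in the GNSS positioning signal 74.
        w = self.gnss_weight
        self.position = (1 - w) * self.position + w * np.asarray(gnss_position)

nav = NavigationFilter()
nav.predict(np.array([0.1, 0.0, 0.0]), dt=0.01)
nav.correct([0.5, 0.0, 0.0])
```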