AUTONOMOUS SYSTEM FOR SHOOTING MOVING IMAGES FROM A DRONE, WITH TARGET TRACKING AND HOLDING OF THE TARGET SHOOTING ANGLE
20180143636 · 2018-05-24
Inventors
CPC classification
B64U2201/10
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0033
PHYSICS
G05D1/0094
PHYSICS
G05D1/0088
PHYSICS
B64U10/14
PERFORMING OPERATIONS; TRANSPORTING
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
H04N23/633
ELECTRICITY
G05D1/0038
PHYSICS
H04N23/69
ELECTRICITY
International classification
G05D1/00
PHYSICS
H04N7/18
ELECTRICITY
Abstract
The invention relates to a system for shooting moving images, comprising a drone provided with a camera and a ground station communicating with the drone, the camera being directed along a sight axis, the displacements of the drone being defined by flight instructions applied to a set of propulsion units of the drone, the drone being adapted to fly autonomously to shoot moving images of a target moving with the ground station. The system includes control means configured to generate the flight instructions so as to hold substantially constant the angle between the sight axis of the camera and the direction of displacement of the target upon activation of the target tracking. The ground station includes means, controlled by at least one piloting means forming a button for activating the target tracking, to alternately switch the drone piloting mode between a mode of activation of the target tracking system and a deactivation mode.
Claims
1. A system for shooting moving images, comprising a drone provided with a camera and a ground station communicating with the drone through a wireless link, the camera being directed along a sight axis, the displacements of the drone being defined by flight instructions applied to a propulsion unit or a set of propulsion units of the drone, the drone being adapted to fly autonomously to shoot moving images of a target moving with the ground station, the direction of the sight axis being such that the target remains present in the successive images produced by said shooting, wherein the system comprises control means configured to generate said flight instructions so as to hold substantially constant the angle between the sight axis of the camera and the direction of displacement of the target upon activation of the tracking of the target, and wherein the ground station comprises means, controlled by at least one piloting means forming a button for activating the target tracking, to alternately switch the drone piloting mode between a mode of activation of the target tracking system adapted to activate means for activating the tracking of the target by the drone, and a deactivation mode adapted to deactivate said means for activating the tracking of the target by the drone.
2. The system according to claim 1, wherein the ground station further comprises means adapted to detect signals emitted by at least one piloting means having a user piloting function and means for transforming said detected signals into flight instructions, and for transmitting these flight instructions to the drone when the activation mode is activated.
3. The system according to claim 1, wherein the ground station further comprises: a screen, means for displaying on the screen an image taken by a camera on-board the drone, said image comprising the target, and means for displaying a dynamic icon on the screen when the activation mode is activated, the icon comprising at least a representation of the target and a representation of the sight angle of the on-board camera.
4. The system according to claim 3, wherein the dynamic icon comprises a first representation of the target in the mode of activation of the target tracking and a second representation of the target at the time of displacement of the target, showing the direction of displacement of the target.
5. The system according to claim 1, wherein the ground station further comprises means for locking the angle between the sight axis of the camera and the direction of displacement of the target, the locking means forming an activation/deactivation button, to alternately lock and unlock the value of said angle.
6. The system according to claim 1, wherein the system further comprises: means for determining the speed vector of the target and the position of the target in a given reference system, said control means being configured to generate said flight instructions based on the speed vector determined, the position determined, and a predetermined direction angle, so as to hold the angle between the sight axis of the camera and the direction of the speed vector substantially to the value of said predetermined direction angle.
7. The system according to claim 6, wherein the means for activating the tracking of the target by the drone are further adapted to calculate the value of said predetermined direction angle based on the displacement of the target during a predetermined time period consecutive to the activation of the tracking of said target.
8. The system according to claim 1, wherein the deactivation mode is a mode in which the piloting commands will generate flight instructions based on the determined position of the target.
9. The system according to claim 1, wherein the ground station comprises: a touch screen displaying a plurality of touch areas; means for detecting contact signals emitted by the touch areas and at least one touch area forms said at least one piloting means.
10. The system according to claim 9, wherein the ground station further comprises means for locking the angle between the sight axis of the camera and the direction of displacement of the target, the locking means forming an activation/deactivation button, to alternately lock and unlock the value of said angle, and wherein at least one touch area forms said means for locking said angle.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0043] An exemplary embodiment of the present invention will now be described, with reference to the appended drawings in which the same references denote identical or functionally similar elements throughout the figures.
DETAILED DESCRIPTION OF THE INVENTION
[0051] In reference to
[0052] Inertial sensors (accelerometers and gyrometers) allow measuring, with a certain accuracy, the angular speeds and the attitude angles of the drone, i.e. the Euler angles (pitch, roll and yaw) describing the inclination of the drone with respect to a horizontal plane of a fixed terrestrial reference system. An ultrasonic range finder placed under the drone D moreover provides a measurement of the altitude with respect to the ground. The drone D is also provided with location means allowing its absolute position DP1, DP2 in space to be determined, in particular based on data coming from a GPS receiver.
[0053] The drone D is piloted by the ground station S, typically in the form of a remote-control device, for example of the model aircraft remote-control type, a smartphone or a smart tablet, as shown for example in
[0054] According to a particular embodiment, the screen E is a touch screen. According to this embodiment, the touch screen, in superimposition with the captured image displayed, displays a certain number of touch areas provided with symbols forming piloting means allowing the activation of piloting commands by simple contact of a user's finger on the touch screen E.
[0055] The ground station S further includes means for detecting contact signals emitted by the piloting means, in particular by the touch areas. When the drone D is piloted by a station S of the remote-control type, the user may be provided with immersive piloting glasses, often called FPV (First Person View) glasses. The station S is also provided with means for radio link with the drone D, for example of the Wi-Fi (IEEE 802.11) local network type, for the bidirectional exchange of data from the drone D to the station S, in particular for the transmission of the image captured by the camera C and of flight data, and from the station S to the drone D for the sending of piloting commands.
[0056] The system formed by the drone D and the station S is configured so that the drone is provided with the ability to autonomously track and film a target. Typically, the target consists of the station S itself, carried by the user.
[0057] According to the invention, the tracking of the target by the drone is performed by keeping the same target shooting angle for the camera C of the drone D. The displacements of the drone D are defined by flight instructions generated by control means of the navigation system of the drone D, and applied to the propulsion unit or to the set of propulsion units of the drone D.
[0058] According to the invention illustrated in
[0059] In particular, according to an embodiment, the instructions of the control means 2 are generated so as to maintain a predetermined direction angle α.sub.p formed between the sight axis 3 of the camera C and the direction of the speed vector VT1, VT2 of the target T. This angle corresponds substantially to the target shooting angle of the camera C of the drone D.
[0060] According to the invention, the ground station S includes means, controlled by at least one piloting means forming a button for activating the target tracking, to alternately switch the drone piloting mode between a mode of activation of the target tracking system adapted to activate means 7 for activating the tracking of the target T by the drone D, and a deactivation mode adapted to deactivate the means 7 for activating the tracking of the target T by the drone D.
[0061] That way, the user can, by means of the ground station S, activate the tracking of the target T so as to hold substantially constant the angle between the sight axis of the camera and the direction of displacement of the target and to deactivate the tracking of the target T.
[0062] The deactivation of the tracking of the target T allows, for example, switching the piloting of the drone to a mode in which the piloting commands generate flight instructions based on the target position determined in particular by the determination means 6. Hence, the deactivation of the tracking of the target T allows, for example, piloting the drone according to the so-called tracking mode. Namely, the drone follows the target T by means of the coordinates of the latter, adjusting its position and/or the position of the camera unit so that the target is always filmed by the drone. In particular, when the target moves, the drone determines its trajectory as a function of the movements of the target and controls the camera so that the latter is always directed towards the target to be filmed. In other words, the drone positions itself and steers the camera in such a manner that the sight axis thereof points towards the target.
[0063] According to a particular embodiment, the ground station S further includes means adapted to detect signals emitted by at least one piloting means having a user piloting function and means for transforming the detected signals into flight instructions, and for transmitting these flight instructions to the drone when the activation mode is activated.
[0064] According to this embodiment, the user can, via the piloting means of the ground station S having a user piloting function, pilot the drone as he desires despite the activation of the target tracking system. That way, the flight instructions from the user have priority for the piloting of the drone over the flight instructions generated by the control means 2. The piloting means having a piloting function provide the elementary piloting functions. They include in particular the following flight instructions: move up, move down, turn to the right, turn to the left, move forward, move rearward, move to the right, move to the left, as illustrated in
[0065] According to a particular embodiment detailed hereinafter with reference to
[0066] According to another embodiment, the value of the predetermined direction angle α.sub.p may be chosen among a set of values pre-recorded in the system 1.
[0067] For that purpose, the control means 2 of the system are configured to generate the flight instructions based on:
[0068] the speed vector VT1, VT2 of the target T,
[0069] the position of the target TP1, TP2, and
[0070] a predetermined direction angle α.sub.p.
[0071] The direction of the sight axis 3 of the camera of the drone is such that the target T remains present on the successive images produced by the shooting.
[0072] In a first embodiment, the direction of the sight axis 3 of the camera C is fixed with respect to a main axis of the drone. The control means 2 are hence configured to generate flight instructions so as to position the main axis of the drone in such a manner that the sight axis 3 of the camera C is directed towards the target T during the tracking of the target T by the drone D.
[0073] In a second embodiment, the direction of the sight axis 3 of the camera C is modifiable with respect to a main axis of the drone thanks to modification means. The modification means are configured to direct at least in part the sight axis 3 of the camera towards the target T during the tracking of the target by the drone D. The camera C is for example a fixed camera of the hemispherical-field, fisheye type, as described for example in the EP 2 933 775 A1 (Parrot). With such a camera, the changes of the sight axis 3 of the camera C are not performed by physical displacement of the camera, but by reframing and reprocessing of the images taken by the camera as a function of a virtual sight angle, determined with respect to the main axis of the drone, given as a command. The camera C may also be a mobile camera mounted on the drone, for example under the drone body; in this case, the modification means include motors to rotate the camera about at least one of the three axes, or even all three axes, in order to direct the sight axis of the camera in such a way that the target remains present in the successive images produced by the shooting.
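By way of illustration, the reframing of a wide-field image as a function of a virtual sight angle can be sketched as follows in Python. This is a non-limiting sketch, not the implementation of the cited patent: the function name, the equidistant angle-to-pixel mapping and the horizontal-only crop are simplifying assumptions.

```python
def crop_for_virtual_angle(image_w, full_fov_deg, virt_angle_deg, out_fov_deg):
    """Pick the horizontal crop window of a wide-field image that simulates
    pointing the camera at virt_angle_deg from the main axis.
    Assumes an equidistant mapping between angle and pixel column."""
    px_per_deg = image_w / full_fov_deg
    cx = image_w / 2 + virt_angle_deg * px_per_deg   # column aimed at
    half = out_fov_deg * px_per_deg / 2              # half-width of the crop
    left = max(0, int(cx - half))
    right = min(image_w, int(cx + half))
    return left, right
```

For a 1800-pixel-wide, 180-degree image, a virtual sight angle of 0 with a 90-degree output field gives the centred window (450, 1350); steering the virtual angle shifts the window without moving the camera.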
[0074] The coordinates of the position TP1, TP2 of the target T allow determining the direction of the sight axis 3 of the camera C so that the target T remains present on the successive images produced during the shooting. The coordinates of the sight axis 3 of the camera C are determined by means of the sensors of the drone, which determine the position of the drone D.
[0075] The coordinates of the speed vector VT1, VT2 and the position TP1, TP2 of the target T with respect to the drone D allow determining the current direction angle α between the sight axis 3 of the camera C and the direction of the speed vector VT1, VT2.
[0076] The control means 2 are for example configured to generate the flight instructions based on a feedback loop on a command for holding the predetermined direction angle α.sub.p, for example by means of a computing unit provided with an execution program provided for that purpose. The principle of the feedback control is to continuously measure the difference between the current value of the quantity to be controlled and the predetermined value that is desired to be reached, so as to determine the suitable control instructions to reach the predetermined value. Hence, the control means 2 first determine the current direction angle α, then give flight instructions so that the drone D moves to a position DP2 in which the current direction angle α corresponds to the predetermined direction angle α.sub.p. The feedback loop is repeated continuously by the control means 2 to hold the value of the predetermined direction angle α.sub.p.
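One iteration of such a feedback loop on the direction angle can be sketched as follows in Python. This is a non-limiting illustration: the function names, the proportional gain and the sign conventions are assumptions, and a real controller would translate the angular correction into flight instructions for the propulsion units.

```python
import math

def signed_angle(v1, v2):
    """Signed angle (radians) from 2-D vector v1 to 2-D vector v2."""
    return math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                      v1[0] * v2[0] + v1[1] * v2[1])

def direction_angle_step(drone_pos, target_pos, target_vel, alpha_p, gain=0.5):
    """Measure the current direction angle between the target's speed
    vector and the sight axis (drone -> target), and return a corrective
    angular displacement of the drone around the target that reduces the
    error with respect to the predetermined angle alpha_p."""
    sight = (target_pos[0] - drone_pos[0], target_pos[1] - drone_pos[1])
    alpha = signed_angle(target_vel, sight)   # current direction angle
    error = alpha - alpha_p                   # quantity driven to zero
    return -gain * error                      # orbit correction about the target
```

Iterating this step until the returned correction vanishes is the continuous repetition of the loop described above.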
[0077] With reference to
[0078] As shown in
[0079] In a first embodiment, the means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by observation of the successive GPS geographical positions of the target T. The given reference system allowing the determination of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T is hence a terrestrial reference system. The determination means 6 receive the successive GPS positions of the target T over time. The determination means 6 can hence deduce therefrom the coordinates of the speed vector VT1, VT2 of the target T. The position of the target T is given by the GPS coordinates of the ground station.
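The deduction of the speed vector from successive GPS positions can be sketched as a finite difference in a local terrestrial frame. This is a non-limiting sketch under a flat-Earth approximation valid for short baselines; the function names and the mean Earth radius are assumptions.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def enu_offset(lat0, lon0, lat, lon):
    """Approximate east/north offset in metres of fix (lat, lon) from a
    reference fix (lat0, lon0), using a local flat-Earth approximation."""
    east = math.radians(lon - lon0) * EARTH_R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * EARTH_R
    return east, north

def speed_vector(fix_prev, fix_curr, dt):
    """Finite-difference estimate of the target's speed vector (m/s)
    from two successive GPS fixes (lat, lon) taken dt seconds apart."""
    east, north = enu_offset(fix_prev[0], fix_prev[1],
                             fix_curr[0], fix_curr[1])
    return east / dt, north / dt
```

With fixes delivered roughly once per second, two consecutive fixes 10 m apart in latitude yield a northward speed of about 10 m/s.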
[0080] According to a first variant of this first embodiment, the determination means 6 are arranged in the drone D. The GPS positions of the target T are transmitted by the target T to the determination means 6 of the drone D.
[0081] According to a second variant of this first embodiment, the determination means 6 are arranged in the ground station of the target T. Herein, the coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T are determined in the ground station, then transmitted to the drone D.
[0082] In a second embodiment, the means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by analysis of the images delivered by the camera C of the drone D. The given reference system is herein a reference system linked to the drone D. In this case, the analysis of the images delivered by the camera C is an analysis of the position TP1, TP2 of the target T in the images successively generated by the camera C of the drone D. The determination means 6 include means for locating and tracking the position TP1, TP2 of the target T in the successive images. In this particular embodiment, the determination means 6 are located in the drone D. For that purpose, an image analysis program provided in the determination means 6 on board the drone D, or in a dedicated circuit, is configured to track the displacement of the target T in the sequence of images generated by the camera C, and to deduce therefrom in which angular direction the target T lies with respect to the sight axis 3 of the camera C. More precisely, this program is configured to locate and track in the successive images a visual pattern or a colour spot representative of the visual aspect of the target with respect to a background (for example, a pattern elaborated by analysis of the grey levels of the image). For example, for the shooting of a target user practicing a snow sport, the background will generally be white and the colour of the spot in the images will be that of the user's clothes. This approach moreover allows having angular position data of the user to be tracked at a rate substantially faster than that at which the GPS coordinates are delivered (generally once per second), the rate of the images being typically 30 frames per second for this type of application.
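A crude colour-spot locator of the kind described can be sketched with NumPy as follows. This is a non-limiting stand-in for the pattern-tracking program of the patent: the function names, the RGB thresholding and the pinhole angular model are assumptions.

```python
import numpy as np

def locate_target(frame, lo, hi):
    """Locate the target as the centroid of the pixels whose RGB values
    fall inside [lo, hi] — a simple colour-spot detector.
    Returns (row, col) in pixels, or None if no pixel matches."""
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def angular_offset(center_px, image_shape, fov_deg):
    """Convert the target's pixel position into a horizontal angle (deg)
    with respect to the sight axis, assuming a pinhole camera model."""
    h, w = image_shape[:2]
    dx = (center_px[1] - w / 2) / (w / 2)   # normalised offset, -1..1
    return dx * fov_deg / 2.0
```

On a synthetic white frame with a red patch, the centroid of the red pixels gives the spot position, from which the angular direction of the target relative to the sight axis follows.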
[0083] In this second embodiment, the image analysis is associated with another measuring means that provides, at least in part, a geographical position TP1, TP2 of the target T. These data may in particular come from the GPS unit of the ground station, or from a pressure sensor of the barometric type arranged in the ground station of the target T. The means for determining the sight axis 3 of the camera C, which are capable of indicating an angular position of the target T with respect to the main axis of the drone, are hence completed by the taking into account of a geographical signal. The electronics on board the drone is capable of knowing the position of the target T by cross-checking between the geographical data and the angular detection data. Very accurate coordinates of the position TP1, TP2 and of the speed vector VT1, VT2 of the target T are hence obtained.
[0084] According to a particular embodiment, the control means 2 are further configured to generate the flight instructions to control the displacement of the drone D at a predetermined distance d.sub.p between the drone D and the target T. In other words, in the tracking mode, the distance d between the target T and the camera C is held, in addition to the holding of the shooting angle of the camera C. The predetermined distance d.sub.p has a fixed value during the tracking. Hence, the perception of the dimensions of the target T remains substantially the same during the shooting, with a constant focal length for the camera C. The current distance d between the target T and the camera C is calculated by the control means 2, based on the position of the target TP1, TP2 determined by the determination means 6 and on the position of the drone DP1, DP2 determined by its inertial sensors.
[0085] The control means 2 are for example configured to generate flight instructions based on a feedback loop on a command for holding the predetermined distance d.sub.p. The method is similar to that relating to the feedback loop on the command for holding the predetermined direction angle α.sub.p. The control means 2 calculate the current distance d and generate instructions to displace the drone to a position DP2 whose distance from the target T corresponds to the predetermined distance d.sub.p. Hence, the distance between the later positions TP2, DP2 of the drone D and of the target T of
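One iteration of the distance-holding loop can be sketched in the same spirit as the direction-angle loop. This is a non-limiting illustration; the function name and the proportional gain are assumptions.

```python
import math

def distance_step(drone_pos, target_pos, d_p, gain=0.5):
    """One iteration of the distance-holding loop: return a displacement
    of the drone along the sight axis proportional to the error between
    the current distance d and the predetermined distance d_p."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    d = math.hypot(dx, dy)                 # current distance
    error = d - d_p                        # positive when too far
    # advance toward the target if too far, retreat if too close
    return (gain * error * dx / d, gain * error * dy / d)
```

When the current distance already equals d_p, the returned displacement is zero and the drone holds its range.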
[0086] According to a particular embodiment, shown in
[0087] The control means 2 are for example configured to generate the flight instructions based on a feedback loop on a command for holding the predetermined elevation angle β.sub.p. The method is similar to that relating to the feedback loop on the command for holding the predetermined direction angle α.sub.p and to that for the predetermined distance d.sub.p between the camera C and the target T. Hence, as shown in
[0088] In a particular embodiment, the control means are configured to generate flight instructions allowing the position of the drone to be modified so as to change simultaneously the current direction angle α, the current distance d between the camera C and the target T and the current elevation angle β, to reach the three corresponding predetermined values.
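The three predetermined parameters together define a single set point for the drone's position relative to the target, which can be sketched as follows. This is a non-limiting illustration: the function name and the angle sign conventions (direction angle measured from the target's heading to the sight axis) are assumptions.

```python
import math

def desired_drone_position(target_pos, target_heading, alpha_p, d_p, beta_p):
    """Position the drone should occupy so that the sight axis
    (drone -> target) makes the direction angle alpha_p with the target's
    heading, at range d_p and elevation angle beta_p above the target.
    target_pos is (x, y, z); angles in radians; d_p in metres."""
    sight = target_heading + alpha_p        # direction of the sight axis
    horiz = d_p * math.cos(beta_p)          # horizontal component of range
    x = target_pos[0] - horiz * math.cos(sight)   # drone sits behind the
    y = target_pos[1] - horiz * math.sin(sight)   # sight axis at range d_p
    z = target_pos[2] + d_p * math.sin(beta_p)    # altitude above target
    return x, y, z
```

For a target heading east with alpha_p = 0, d_p = 10 m and beta_p = 0, the set point is 10 m directly behind the target at the same altitude; raising beta_p moves the set point onto the sphere above it.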
[0089] In the embodiment shown in
[0090] According to a particular embodiment, the activation means 7 are arranged, for example, in the ground station. According to another embodiment, the activation means 7 are arranged in the drone.
[0091] The ground station S includes means, controlled by at least one piloting means forming a button for activating the target tracking, to alternately switch the drone piloting mode between a mode of activation of the target tracking system adapted to activate the activation means 7 and a mode of deactivation adapted to deactivate the activation means 7. According to a particular embodiment in which the ground station S includes a touch screen E provided with touch areas, one of the touch areas forms the button for activating the target tracking.
[0092] Hence, when the user operates the button for activating the target tracking, the drone passes to the target tracking mode in which the control means 2 generate the flight instructions. According to a particular embodiment, the flight instructions are generated in particular based on the speed vector VT1, VT2 determined, the position TP1, TP2 determined, and the predetermined direction angle α.sub.p.
[0093] According to a particular embodiment of the invention, the means 7 for activating the tracking of the target by the drone are adapted to calculate the value of the predetermined direction angle following the activation. In other words, the activation means 7 define the value of the predetermined direction angle α.sub.p that will be held during the tracking of the target by the drone.
[0094] In particular, the value of the direction angle α.sub.p is calculated based on the displacement of the target T during a predetermined time period consecutive to the activation of the tracking of the target.
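Deriving the predetermined direction angle from the target's displacement during such a window can be sketched as follows. This is a non-limiting illustration: the function name and the planar angle conventions are assumptions.

```python
import math

def direction_angle_from_window(positions, drone_pos):
    """Estimate the direction angle to hold from the target positions
    recorded during the window following activation: the net displacement
    gives the direction of travel, and the sight axis is taken from the
    drone to the last recorded position. Result wrapped to (-pi, pi]."""
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    heading = math.atan2(dy, dx)                         # direction of travel
    sight = math.atan2(positions[-1][1] - drone_pos[1],
                       positions[-1][0] - drone_pos[0])  # drone -> target
    return (sight - heading + math.pi) % (2 * math.pi) - math.pi
```

For a target moving east while the drone sits to its right, the computed angle is +90 degrees, i.e. a constant side-on shot, which the control loops then hold.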
[0095] According to another particular embodiment, the activation means 7 are for example configured so that the value of the predetermined direction angle is the current direction angle at the time of the activation, in particular at the time when the button is operated.
[0096] The activation means 7 calculate the current direction angle as a function, for example, of the coordinates of the speed vector and of the position of the target, which are transmitted to them by the determination means 6. Hence, the user positions the drone and the camera according to a sight angle he desires to hold, and activates the tracking mode by means of the activation means 7 so that the drone tracks the target while keeping the chosen sight angle.
[0097] As illustrated in
[0098] Moreover, the ground station S includes means for displaying on the screen E a dynamic icon 10, in particular when the activation mode is activated. The dynamic icon 10 includes at least a representation of the target 12 and a representation of the sight angle 14 of the on-board camera C.
[0099] As illustrated in
[0100]
[0101] The ground station S further includes means 16, 16 for locking the angle between the sight axis 3 of the camera C and the direction of displacement of the target T, the locking means forming an activation/deactivation button, to alternately lock and unlock the value of the angle.
[0102]
[0103]
[0104] According to a particular embodiment in which the ground station S includes a touch screen E having a plurality of touch areas, at least one touch area of the touch screen forms the means for locking the angle.
[0105] According to a particular embodiment, the means 7 for activating the tracking of the target by the drone are adapted to also calculate the value of the predetermined distance d.sub.p between the target and the drone at the time of the activation. Similarly, the activation means 7 are for example adapted to calculate the value of the predetermined elevation angle β.sub.p at the time of the activation. The predetermined values are transmitted to the control means 2, which record them in a memory. Hence, when the activation means 7 are operated, the values of the three predetermined parameters are calculated by the activation means 7, then held during the tracking of the target by the control means 2 as long as the user has not deactivated the tracking of the target T.