APPARATUS AND METHOD FOR CAPTURING FLYING OBJECTS
20200125879 · 2020-04-23
Inventors
CPC classification
G06V20/41
PHYSICS
B64U2101/00
PERFORMING OPERATIONS; TRANSPORTING
G06V10/25
PHYSICS
G06V20/52
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
G06F18/285
PHYSICS
International classification
Abstract
An apparatus for capturing flying objects has a camera system with at least one camera for video monitoring a monitoring space, and a control unit for controlling the camera system and evaluating the video frames captured by the camera system. The camera system can selectively operate in a non-zoom mode or in a zoom mode. A flying object of interest in the monitoring space is recognized on the basis of a multi-stage classification of flying objects in a region of interest, initially based on video frames captured by the camera system in the non-zoom mode and then, if required, on video frames captured by the camera system in the zoom mode.
Claims
1. An apparatus for capturing flying objects, the apparatus comprising: a camera system having at least one camera for video monitoring a monitoring space, said camera system configured to selectively operate in a non-zoom mode or in a zoom mode; and a controller for controlling said camera system and evaluating video frames captured by said camera system, said controller configured to determine a region of interest with a flying object based on the video frames captured by said camera system in the non-zoom mode, to ascertain a first probability of a presence of a flying object of interest in the region of interest, to switch said camera system to the zoom mode in a direction of the region of interest determined if the first probability ascertained exceeds a first limit value, to ascertain a second probability of the presence of the flying object of interest in the region of interest on a basis of the video frames captured by said camera system in the zoom mode, and to recognize the flying object of interest in the region of interest if the second probability exceeds a second limit value.
2. The apparatus according to claim 1, wherein said camera is at least one pan-tilt-zoom camera, which can selectively operate in the non-zoom mode or in the zoom mode.
3. The apparatus according to claim 1, wherein said camera includes at least one static camera that operates only in the non-zoom mode and at least one pan-tilt-zoom camera that can operate in the zoom mode.
4. The apparatus according to claim 1, wherein said camera is at least one camera with a gated viewing functionality.
5. The apparatus according to claim 1, wherein said camera is at least one black-and-white camera.
6. The apparatus according to claim 1, wherein said controller has an interface for passing on evaluation results to an existing security system at a protected location and/or to a remote user.
7. A method for capturing flying objects, which comprises the steps of: capturing video frames of a monitoring space using a camera system having at least one camera in a non-zoom mode; determining a region of interest with a flying object based on the video frames captured by the camera system in the non-zoom mode; ascertaining a first probability of a presence of a flying object of interest in the region of interest determined; capturing the video frames using the camera system in a zoom mode in a direction of the region of interest if the first probability exceeds a first limit value; ascertaining a second probability of the presence of the flying object of interest in the region of interest on a basis of the video frames captured by the camera system in the zoom mode; and recognizing the flying object of interest in the region of interest if the second probability exceeds a second limit value.
8. The method according to claim 7, which further comprises capturing the video frames in the zoom mode of the camera system using at least one pan-tilt-zoom camera.
9. The method according to claim 7, which further comprises capturing the video frames in the non-zoom mode of the camera system using at least one pan-tilt-zoom camera and/or at least one static camera.
10. The method according to claim 7, which further comprises accomplishing the ascertaining of the first probability and/or the ascertaining of the second probability of the presence of the flying object of interest in the region of interest by evaluating the video frames captured by the camera system using neural networks.
11. The method according to claim 7, which further comprises accomplishing the ascertaining of the first probability and/or the ascertaining of the second probability of the presence of the flying object of interest in the region of interest by assigning flying object classes to each pixel in the region of interest.
12. The method according to claim 7, which further comprises tracking a recognized flying object of interest in the region of interest using the camera system in the zoom mode.
13. The method according to claim 7, which further comprises determining a distance to a recognized flying object of interest in the region of interest.
14. The method according to claim 7, wherein if the flying object of interest in the region of interest was recognized, the results of the flying object capturing are passed on to an existing security system at a protected location and/or to a remote user.
15. The method according to claim 7, which further comprises storing the video frames captured by the camera system.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
[0036]
[0037]
[0038]
DETAILED DESCRIPTION OF THE INVENTION
[0039] Referring now to the figures of the drawings in detail and first, particularly to
[0040]
[0041] The monitoring apparatus contains a PTZ camera 10 for video monitoring a monitoring space S, in which flying objects of interest O such as unmanned aerial vehicles (UAVs), or drones, and flying objects that are not of interest N, such as birds or aircraft, can appear. The PTZ camera 10 used can optionally also be a black-and-white camera, with which a higher resolution can be attained.
[0042] The PTZ camera 10 is optionally equipped with infrared illumination 28 so as to be able to record evaluable video frames even under poor visibility conditions, such as at night. The infrared illumination 28 is preferably mechanically connected to the PTZ camera 10 to light the monitoring space S in the viewing direction of the PTZ camera 10. The infrared illumination has, for example, a wavelength of 850 nm or 940 nm, which can be detected by the PTZ camera 10 used. The PTZ camera can optionally also be provided with a gated viewing functionality.
[0043] The PTZ camera 10 can operate in non-zoom mode, in which it scans the monitoring space S with a low zoom factor as the base setting. In so doing, the PTZ camera 10 dwells for a few seconds in each direction. The PTZ camera 10 can additionally operate in zoom mode, in which it zooms in on a region of interest R in the monitoring space S with a greater zoom factor.
[0044] The PTZ camera 10 is connected to a control unit 12, which controls the PTZ camera 10 and evaluates the video frames captured by the PTZ camera 10 preferably using neural networks. The control unit 12 also contains a memory 13 for storing the video frames captured by the PTZ camera 10. Optionally, the control unit 12 can also have a memory or be connected to a memory in which image data are stored for the purpose of comparing them to the video frames captured. The image data, which are used for training the neural networks using deep learning or for a comparative evaluation, contain image data of real flying objects, image data of the monitoring space with real flying objects, image data of the monitoring space that have been synthetically supplemented with flying objects or scenarios, and the like.
[0045] The control unit 12 is connected to a monitor 14 so as to display the video frames captured by the PTZ camera 10 and the evaluation results of the control unit 12 to a user of the monitoring apparatus. The control unit 12 is additionally connected to an input apparatus 16, via which a user of the monitoring apparatus can input control commands, for example.
[0046] In the exemplary embodiment of
[0047] In a modification of the first exemplary embodiment of
[0048]
[0049] The second exemplary embodiment differs from the first exemplary embodiment in particular in that the camera arrangement for video monitoring the monitoring space S not only has a PTZ camera 10 (or optionally a plurality of PTZ cameras), but additionally has a plurality of static cameras 30. The static cameras 30 can optionally be provided with fisheye lenses so as to be able to cover larger fields of view. The static cameras 30 can optionally also be provided with a gated viewing functionality. In non-zoom mode of the camera arrangement, the static cameras 30 capture video frames with a low zoom factor, wherein the video frames of all static cameras 30 cover the entire monitoring space S. The fields of view of the static cameras 30 are preferably aligned with respect to one another in a manner such that the video frames thereof can be combined on the monitor 14 to form a wide panorama image of the entire monitoring space S for the user. In zoom mode of the camera arrangement, the static cameras 30 can optionally continue to capture video frames of the entire monitoring space S with a low zoom factor so as to continue to display to the user a wide panorama image of the entire monitoring space S and additionally a marking of the zoomed region of interest R on the monitor 14.
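The composition of the static cameras' frames into a panorama with a marking of the zoomed region of interest can be sketched as follows, with frames simplified to lists of pixel rows; the marking value 255 and the (x, y, w, h) region format are illustrative assumptions, not from the source:

```python
def panorama_with_marking(frames, roi=None):
    """Place the static cameras' frames side by side into one panorama
    (each frame is a list of equal-length pixel rows) and optionally
    draw a one-pixel border marking the zoomed region of interest,
    given as (x, y, w, h) in panorama coordinates."""
    height = len(frames[0])
    pano = [sum((frame[r] for frame in frames), []) for r in range(height)]
    if roi is not None:
        x, y, w, h = roi
        for c in range(x, x + w):                 # top and bottom edges
            pano[y][c] = pano[y + h - 1][c] = 255
        for r in range(y, y + h):                 # left and right edges
            pano[r][x] = pano[r][x + w - 1] = 255
    return pano

# Three 4x4 black frames combined into a 4x12 panorama with a marked region:
frames = [[[0] * 4 for _ in range(4)] for _ in range(3)]
pano = panorama_with_marking(frames, roi=(1, 1, 2, 2))
```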
[0050] The PTZ camera 10 can be selectively used only in zoom mode of the camera arrangement or first in non-zoom mode and then in zoom mode of the camera arrangement. In a modification of the second exemplary embodiment of
[0051] The camera arrangement of
[0052] For the rest, the construction of the monitoring apparatus of
[0053] With reference to
[0054] In a first step, S10, the camera arrangement is operated in non-zoom mode to capture video frames of the entire monitoring space S with a low zoom factor. In the embodiment of
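The scan behaviour of step S10 can be sketched as a schedule that cycles through a fixed set of pan directions and dwells in each for a few seconds. This is a minimal sketch; the angle grid and the dwell time are illustrative assumptions, not values given in the source.

```python
import itertools

def scan_schedule(pan_angles_deg, dwell_s=3.0):
    """Yield (pan_angle, dwell_time) pairs that sweep the monitoring
    space S repeatedly in non-zoom mode (step S10); the angle set and
    the dwell time are illustrative, not specified in the source."""
    for angle in itertools.cycle(pan_angles_deg):
        yield angle, dwell_s

# First four scan positions for a camera sweeping 0..270 deg in 90 deg steps:
sched = scan_schedule([0, 90, 180, 270])
first_four = [next(sched) for _ in range(4)]
```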
[0055] In the next step, S12, the control unit 12 evaluates the video frames captured by the camera arrangement in non-zoom mode and determines one or more regions of interest R in which flying objects N, O are located. The determined regions of interest R can be defined, for example, as what are known as bounding boxes, which contain the space coordinates of their four corner points.
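A bounding box holding the coordinates of the four corner points of a region of interest, as described above, might be represented as follows; the helper name and pixel-coordinate convention are hypothetical:

```python
def to_bounding_box(x, y, w, h):
    """Return the four corner points of a region of interest R
    (step S12), given its top-left corner, width and height in pixels."""
    return [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]

corners = to_bounding_box(10, 20, 100, 50)
```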
[0056] This is followed, in step S14, by a first stage of classification for each of the regions of interest R determined in step S12 in order to pre-classify whether a flying object of interest O has possibly been recorded in the determined region of interest R. In a first embodiment variant, for each flying object class K, a probability value for the presence of a flying object of the respective flying object class K is ascertained for the entire bounding box, and the ascertained probability values for all flying object classes K of flying objects of interest O, and of flying objects that can be confused with them, are then added up to a first probability CL1. As an alternative to adding up all of these probability values, it is also possible to add up only the probability values for a UAV, the probability values of all UAV types or of a group of specific UAV types, or to consider the probability values of one or more specific UAV types or flying object classes K individually. The ascertainment of the probability values can also be performed pixel by pixel in the bounding box: rather than assigning a single flying object class K with a corresponding probability value to the entire bounding box, each pixel is assigned a flying object class K and a corresponding probability value, which refines the evaluation of the video frames. The ascertainment of the probability values is preferably effected in the form of confidence levels and as average values of the confidence levels over a plurality of successively recorded video frames.
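The first embodiment variant described above (summing the per-class probability values over all classes of interest and classes confusable with them, then averaging the confidence over several successive frames) can be sketched as follows; the class names and confidence values are hypothetical:

```python
def first_stage_probability(frame_scores, summed_classes):
    """Compute CL1 (step S14): per video frame, sum the confidence
    values of all flying object classes to be accumulated (classes of
    interest plus classes confusable with them), then average the sums
    over the frames. Each entry of `frame_scores` maps class name ->
    confidence for one bounding box in one frame."""
    per_frame = [sum(scores.get(k, 0.0) for k in summed_classes)
                 for scores in frame_scores]
    return sum(per_frame) / len(per_frame)

# Two successive frames of one bounding box (hypothetical confidences):
frames = [{"uav": 0.3, "model_aircraft": 0.1, "bird": 0.5},
          {"uav": 0.5, "model_aircraft": 0.1, "bird": 0.2}]
cl1 = first_stage_probability(frames, {"uav", "model_aircraft"})
```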
[0057] The classification step S14, like the second classification step S20 described below, is preferably performed using neural networks. To improve the quality of these classifications, the neural networks are preferably also trained by deep learning using synthetic images. In other words, in addition to image data of real flying objects and image data of the monitoring space with real flying objects, image data of the monitoring space that have been synthetically supplemented with flying objects or scenarios are also used.
[0058] Subsequently, the ascertained first probability CL1 is compared to a first limit value T1 of, for example, 0.4 (step S16). If the first probability CL1 falls below the first limit value T1, the assessment is that there is no flying object of interest O in the region of interest R, and the method returns to step S10 to continue monitoring the monitoring space S in non-zoom mode of the camera arrangement. If, by contrast, the first probability CL1 exceeds the first limit value T1, the assessment is that there probably is a flying object of interest O in the region of interest R, and the method proceeds to step S18.
[0059] In step S18, the control unit 12 switches the camera arrangement to zoom mode. In zoom mode, the PTZ camera 10 zooms in the direction of the region of interest R which was determined in step S12 and in which there is assumed to be located a flying object of interest O. To this end, the control unit 12 for example passes on target coordinates of the determined region of interest R to the PTZ camera 10.
[0060] In the next step, S20, the control unit 12 evaluates the video frames captured by the PTZ camera 10 in a second stage of classification by ascertaining a second probability CL2 of the presence of a flying object of interest O in this zoomed region of interest R. In this second classification stage, only probability values for the presence of a flying object of interest O are ascertained; that is to say, the additional ascertainment of probability values for the presence of similar flying objects and the adding up of the different probability values are dispensed with. This ascertainment of the second probability CL2 is performed similarly to the ascertainment of the first probability CL1, preferably likewise using neural networks and as an average value of the confidence levels over a plurality of successively recorded video frames and optionally likewise on a pixel basis.
[0061] Subsequently, the ascertained second probability CL2 is compared to a second limit value T2 of, for example, 0.8 (step S22). If the second probability CL2 falls below the second limit value T2, the assessment is that there is no flying object of interest O in the region of interest R after all, and the method returns to step S10 to continue monitoring the monitoring space S in non-zoom mode of the camera arrangement. If, by contrast, the second probability CL2 exceeds the second limit value T2, the assessment is that there is a flying object of interest O in the region of interest R, and the method proceeds to the next steps S24 to S32.
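The two-stage threshold logic of steps S16 and S22 can be condensed into a small decision function. The limit values 0.4 and 0.8 are the examples given in the text; the return labels are illustrative:

```python
def two_stage_decision(cl1, cl2=None, t1=0.4, t2=0.8):
    """Multi-stage classification logic of steps S16 and S22.
    cl2 is None while the second stage has not yet been run."""
    if cl1 <= t1:
        return "scan"         # step S16: stay in non-zoom monitoring
    if cl2 is None:
        return "zoom"         # step S18: switch the PTZ camera to zoom mode
    if cl2 <= t2:
        return "scan"         # step S22: no object of interest after all
    return "recognized"       # proceed to steps S24 to S32

decisions = [two_stage_decision(0.3),
             two_stage_decision(0.5),
             two_stage_decision(0.5, 0.6),
             two_stage_decision(0.5, 0.9)]
```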
[0062] The first classification stage preferably continues to run continuously in parallel with the described second classification stage in zoom mode of the camera arrangement. In other words, while at least one PTZ camera 10 zooms in on a region of interest R that was determined in the first classification stage and the corresponding second classification stage is performed, the static cameras 30 (or further PTZ cameras 10) continue to monitor the monitoring space S with a low zoom factor, and a first classification stage is performed to this effect. In this way it is ensured that the monitoring space S is continuously monitored, and flying objects of interest O can be continuously captured and recognized.
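The parallel operation described above, in which the first classification stage keeps running on the wide-angle frames while the second stage processes a zoomed region, can be sketched with a worker thread fed by a queue. The frame and region representations are placeholders, and the second stage is stubbed:

```python
import queue
import threading

roi_queue = queue.Queue()      # candidate regions found by stage one

def stage_one(frames):
    """Continuously evaluates non-zoom frames and hands candidate
    regions of interest to stage two without blocking the scan."""
    for frame in frames:
        for roi in frame["rois"]:
            roi_queue.put(roi)

def stage_two(results):
    """Consumes regions of interest and performs the (stubbed)
    zoomed second-stage classification."""
    while True:
        roi = roi_queue.get()
        if roi is None:        # sentinel: shut the worker down
            break
        results.append(("classified", roi))

results = []
worker = threading.Thread(target=stage_two, args=(results,))
worker.start()
stage_one([{"rois": ["R1"]}, {"rois": []}, {"rois": ["R2"]}])
roi_queue.put(None)
worker.join()
```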
[0063] After a flying object of interest O has been recognized, it is optionally also possible to perform a distance measurement of the recognized flying object of interest O (step S24). This distance determination can be performed, in the case of a camera arrangement having a plurality of cameras 10, 30, for example using a triangulation method or alternatively only with one PTZ camera 10 on the basis of the zoom factor and a known size of the identified flying object O.
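The single-camera variant of the distance determination of step S24, based on the zoom factor (expressed here as an effective focal length in pixels) and a known size of the identified flying object, follows the pinhole camera model; the numeric values are illustrative:

```python
def distance_from_size(focal_px, real_size_m, image_size_px):
    """Pinhole-model distance estimate for a single PTZ camera
    (step S24): distance = focal_length * real_size / size_in_image.
    The real size of the identified flying object must be known."""
    return focal_px * real_size_m / image_size_px

# A 0.4 m wide drone imaged 80 px wide at an effective focal length
# of 2000 px is about 10 m away:
d = distance_from_size(2000, 0.4, 80)
```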
[0064] The evaluation result is then communicated to the user on the monitor 14 and/or acoustically (step S26). Optionally, the evaluation result is also passed on to a remote user 24 and/or to an existing security system at a protected location 26 via the interface 18 of the control unit 12 through the network 20 (step S28). In particular, a warning signal that a flying object of interest O has been recognized in the monitoring space S can be communicated to a remote user 24 or to an existing security system 26. It is also possible to automatically couple the evaluation results for the drones O tracked with the monitoring apparatus to the fields of view of cameras of an existing security system 26. In this way, it is possible to show guards in a prison, for example, which camera of the security system will shortly show a drone, or a package that is transported and deposited by a drone.
[0065] In addition, the flying object of interest O identified in the region of interest R can optionally be tracked using the PTZ camera 10 (step S30). When tracking, the pan and tilt angles of the PTZ camera 10 are set such that the flying object of interest O is centered in the zoomed region of interest R. The speeds for the zoom movement and for the pan and tilt movements of the PTZ camera 10 are set separately: the zoom movement should be significantly slower so as not to miss the flying object of interest O, whereas the pan and tilt movements must be significantly quicker so as not to lose the flying object of interest O from the zoomed region of interest R.
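The tracking of step S30 can be sketched as a proportional controller that turns the object's pixel offset from the centre of the zoomed region into pan and tilt corrections, with a fixed and much slower zoom rate set separately; all gains and rates are illustrative assumptions, not values from the source:

```python
def tracking_update(err_x_px, err_y_px,
                    pan_gain=0.5, tilt_gain=0.5, zoom_rate=0.05):
    """One tracking step (step S30): pan/tilt corrections are
    proportional to the object's offset from the centre of the zoomed
    region of interest; the zoom rate is fixed and deliberately slow.
    Gains and units are illustrative, not taken from the source."""
    return pan_gain * err_x_px, tilt_gain * err_y_px, zoom_rate

update = tracking_update(10, -4)
```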
[0066] Finally, the video frames captured by the camera arrangement 10, 30 are stored in the memory 13 (step S32). The stored video frames can be used later for example for checking the evaluation of the video frames or to demonstrate the evaluation result.