Method and a system for estimating the geographic position of a target

11506498 · 2022-11-22

Abstract

The invention concerns a system (100) and a method for estimating the geographic position of a target (1). The method comprises the following steps: detecting a target (1); determining the characteristics of the target (1), which characteristics at least comprise a geographic position (3) and a category of the target; and tracking the detected target (1) until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated with the level of certainty for determining the geographic position (3) of the target (1). The method further comprises determining a first point in time t.sub.1 when the predetermined criteria was last fulfilled, wherein, for a second point in time t.sub.2, the following step is performed: creating a pattern (2) defining at least one possible geographic position (3) of the target (1), said pattern (2) extending at least partially around the geographic position (3) of the target (1) at t.sub.1, wherein the dimension of said pattern (2) is determined based on at least one predetermined parameter.

Claims

1. A method for estimating the geographic position of a target, the method comprises the following steps: detecting a target; determining at least a geographic position of the target and a category of the target, the category being a target type; tracking the detected target until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated with the level of certainty for determining the geographic position of the target; determining a first point in time t.sub.1 when the predetermined criteria was last fulfilled, wherein, for a second point in time t.sub.2, the following steps are performed: creating a pattern defining at least one possible geographic position of the target, said pattern extending at least partially around the geographic position where the target was determined at the first point in time t.sub.1, wherein the dimension of said pattern is determined based on at least one predetermined parameter, the at least one predetermined parameter including the target type; and calculating a probability of the presence of the target associated with a geographic position in the pattern; wherein: the method is performed in a system comprising at least one sensor arranged to detect a target, at least one sensor arranged to determine the category of the target, and at least one sensor arranged to track the detected target until at least one certain predetermined criteria is not fulfilled; the calculating of the probability of the presence of the target is based on characteristics of the at least one sensor; and said at least one predetermined parameter further comprises: the characteristics of the surrounding of the geographic position where the target was detected at said first point in time t.sub.1; and a time difference between the first point in time t.sub.1 and the second point in time t.sub.2 for which point in time the pattern is created.

2. The method according to claim 1, wherein said predetermined parameter further comprises: a level of surveillance of the surrounding of the geographic position of the target at said first point in time t.sub.1.

3. The method according to claim 1, wherein the method further comprises the following step: calculating a probability of the presence of the target associated with each geographic position in the pattern at said second point in time t.sub.2.

4. The method according to claim 1, wherein the at least one sensor of the system is controlled based on the pattern.

5. The method according to claim 3, wherein the at least one sensor of the system is configured to detect the target in a surrounding of a certain characteristic and is controlled to scan said surrounding, at least at said geographic positions where the pattern of the target is present.

6. The method according to claim 1, wherein a route for at least one object is planned based on the pattern that is created.

7. The method according to claim 6, wherein a route for an object is planned in order to minimize the probability of entering the pattern around the target, or in order to minimize the probability of being within a certain distance from the target, in order to minimize the risk of encountering the target.

8. The method according to claim 1, wherein the geographic positions of a pattern for the target are related to a grid.

9. A system for estimating the geographic position of a target wherein said system comprises: at least one sensor arranged to detect a target; at least one sensor arranged to track the detected target until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated with a level of certainty for determining a geographic position of the target; at least one sensor arranged to determine a category of the target, the category being a target type; fulfilment determination circuitry configured to determine a first point in time t.sub.1 when the predetermined criteria was last fulfilled; characteristic determination circuitry configured to determine at least the geographic position of the target and a category of the target; and pattern creation circuitry configured to, for a second point in time t.sub.2, perform the following steps: creating a pattern defining possible geographic positions of the target, wherein said pattern extends at least partially around the geographic position where the target was determined at the first point in time t.sub.1, wherein the dimension of said pattern is determined based on at least one predetermined parameter, the at least one predetermined parameter including the target type, the at least one predetermined parameter further comprising: the characteristics of the surrounding of the geographic position where the target was detected at said first point in time t.sub.1; and a time difference between the first point in time t.sub.1 and the second point in time t.sub.2 for which point in time the pattern is created; and calculating a probability of the presence of the target associated with a geographic position in the pattern based on characteristics of the utilized sensor.

10. The system according to claim 9, wherein said predetermined parameter further comprises: a level of surveillance of the surrounding of the geographic position of the target at said first point in time t.sub.1.

11. The system according to claim 10, wherein the system further comprises probability calculator circuitry arranged to: calculate a probability of the presence of the target associated with each geographic position in the pattern at said second point in time t.sub.2.

12. The system according to claim 9, further comprising sensor controlling circuitry arranged to control the at least one sensor of the system based on the pattern.

13. The system according to claim 12, wherein the at least one sensor configured to detect the target in a surrounding of a certain characteristic is controlled to scan said surrounding, at least at said geographic positions where the pattern of the target is present.

14. The system according to claim 9, wherein the system comprises means arranged to plan a route for at least one object based on the created pattern.

15. The system according to claim 14, wherein a route for an object is planned in order to minimize the probability of entering the pattern around the target, or in order to minimize the probability of being within a certain distance from the target, or in order to minimize the risk of encountering the target.

Description

BRIEF DESCRIPTION OF THE FIGURES

(1) For a further understanding of the present invention and its further objects and advantages, the detailed description set out below should be read in conjunction with the accompanying drawings, in which the same reference notations denote similar items in the various diagrams, and in which:

(2) FIG. 1 illustrates an overview of a system according to one example of the disclosure;

(3) FIG. 2 illustrates a system according to one example of the disclosure;

(4) FIG. 3a illustrates a view of an area of land according to one example of the disclosure;

(5) FIG. 3b illustrates a view of an area of land according to one example of the disclosure;

(6) FIG. 3c illustrates a view of an area of land according to one example of the disclosure;

(7) FIG. 3d illustrates a view of an area of land according to one example of the disclosure;

(8) FIG. 4a is a schematic flowchart of a method according to one example of the disclosure; and

(9) FIG. 4b is a schematic flowchart of a method according to one example of the disclosure.

(10) FIG. 5 schematically illustrates a computer according to an embodiment of the invention.

(11) The term “link” refers herein to a communication link which may be a physical connection such as an opto-electronic communication line, or a non-physical connection such as a wireless connection, e.g. a radio link or microwave link.

(12) FIG. 1 schematically illustrates a system 100 for estimating the geographic position of a target 1, and a target 1 according to one example. The system 100 comprises sensors 300a, 300b, 300c arranged to scan a certain surrounding. The sensors are arranged to detect and track a target 1. The system 100 comprises a central control unit 250. The sensors 300a-300c are, according to one example, of the same type. Alternatively, the sensors 300a-c are of different types. The sensors 300a-c can be of any type capable of detecting a target, such as for example radar sensors, optical sensors, audio sensors etc. According to the illustrated example, the sensors 300a-300c are attached to aircraft. However, the sensors 300a-300c of the system 100 may be arranged on other vehicles such as drones, land borne vehicles, waterborne vehicles etc., or alternatively be attached to a stationary structure such as a mast or building, stationary cameras etc. According to one embodiment not illustrated, a sensor is a human.

(13) In the illustrated example, a target 1 has been detected by one of the sensors 300a of the sensor surveillance system 100. The sensor 300a has determined the category of the target 1 to be a tank. The information about the target 1 determined by the sensor 300a is communicated to the central control unit 250. The sensor 300a tracks the detected target 1 until the geographic position of the target 1 cannot be determined to a certain level of certainty. The last point in time when the geographic position was determined to a certain level of certainty was at a point in time t.sub.1. At said point in time t.sub.1, the geographic position 3 of the target 1 was determined by the sensor 300a. The target 1 is hence at a first point in time t.sub.1 positioned at a geographic position 3.

(14) According to one example, the sensors 300a-c communicate with the central control unit 250. The central control unit 250 may be a separate unit situated in for example a building. According to one embodiment, not illustrated, the central control unit 250 is situated in a vehicle such as for example an aircraft. The central control unit 250 is arranged to control the sensors 300a-300c of the system.

(15) According to one example not illustrated, the sensor is a human seeing a target 1. The human determines the point in time t.sub.1 when the human can no longer see the target 1, and the human also determines the category of the target 1. The information regarding t.sub.1 and the category of the target may be entered into the central control unit 250.

(16) FIG. 2 is a schematic illustration of a system 100 for estimating the geographic position of a target 1 according to one example. The system 100 comprises sensors 300a-c arranged to detect and track a target 1 and a central control unit 250 connected to each sensor 300a-c via a sensor link L1, L2, L3. According to the illustrated example, a user interface 450 is connected to the central control unit 250 via a user interface link L4. The system 100 may have some of or all of the characteristics as presented in relation to FIG. 1.

(17) The sensors 300a-c are further described below. The sensors 300a-c may according to one example each comprise all the functionality described below. Alternatively, the sensors 300a-c together comprise the functionality as described below, and hence, all of the features described below may not be in each of the sensors 300a-c.

(18) According to one example the sensor 300a-c is arranged to detect a target 1 in a geographic volume of interest. The sensor 300a-c is according to one example arranged to track the target 1 and to determine the characteristics of the target 1. The characteristics of the target 1 comprise at least a geographic position 3 and a category of the target 1. The category of the target 1 is according to one example: type of vehicle such as car, tank, submarine, aircraft etc. The category of the target 1 comprises according to one example further details regarding the target 1, such as the colour of the target, number of people in the vehicle, or any other features that characterise the target, such as “broken window”, “license plate ABC123” etc. In addition, the category of the target 1 may include information regarding the direction and speed at which the target 1 is moving.

(19) The category of the target 1 is, according to one example, defined in order to be able to verify the target 1 if/when it is redetected. In other words, the category of the target 1 is used to verify that the same target 1 as previously detected has been redetected. The defined category of the target 1 is according to one example utilized by the system 100 in order to receive further information regarding the target 1, such as the maximum speed of the target 1, the maximum acceleration of the target 1, the ability of the target 1 to move in a certain terrain etc. The further information regarding the target 1 may be received from the control unit 250 and/or the sensors 300a-c of the system 100. Alternatively, or in addition, the further information regarding the target 1 may be received from an external unit, such as an external database (not illustrated). The level of detail of the category of the target 1 determined by the sensor 300a-c may vary depending on the type of sensor 300a-c, the type of target 1, the speed of the target 1 and other external factors such as weather, terrain etc. In order to be able to determine a category of the target 1, a certain level of detail of the category of the target 1 has to be reached. The system 100 controls the minimum level of detail of the category of the target 1. If a sensor 300a-c detects a target 1 but cannot categorize the target 1 above the set minimum level of detail of the category of the target 1, the target 1 is not considered to be detected or, alternatively, the target 1 is not categorized. The information regarding the category of the target 1 is according to one embodiment stored in the sensor 300a-c and/or in the central control unit 250. According to one embodiment, the information regarding the characteristics of the target 1 is communicated to a number of sensors 300a-c of the system 100 via links L1-L3.

(20) According to one example, the sensor 300a-c is a category determining sensor 300a-c arranged to determine the category of the target 1 at one point in time during the tracking process of the target 1. However, it is not necessary that the category determining sensor 300a-c detects the category of the target 1 during the whole tracking process. Once the category of the target 1 has been determined by the category determining sensor 300a-c, the sensor 300a-c only has to determine the geographic position of the target 1 during the tracking process.

(21) The category determining sensor 300a-c may further be arranged to track the detected target 1 until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated with the level of certainty for determining the geographic position 3 of the target 1. The level of certainty at which a sensor 300a-c is able to determine the geographic position of a target is dependent on, for example, the type of sensor 300a-c, the visibility in the surrounding, such as in the air or in the water where the target 1 is situated, the terrain of the area surrounding the target 1, such as for example forest or swamp, obstructing objects, the type of target 1 etc. The last point in time when the level of certainty is fulfilled, which point in time is referred to as the first point in time t.sub.1, is determined by the sensor 300a-c or by the central control unit 250 and registered in the system 100, in the central control unit 250 and/or in the sensor 300a-c. In addition, the geographic position 3 of the target 1 at said first point in time t.sub.1 is determined by the sensor 300a-c or by the central control unit 250 and registered in the system 100, in the central control unit 250 and/or in the sensor 300a-c.
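The tracking criterion described above can be sketched in simplified form: a track is followed while a position-certainty estimate stays at or above a threshold, and t.sub.1 is the last timestamp at which the criterion was still fulfilled. This is an illustrative sketch only; the function name, the certainty scale and the threshold value are assumptions, not taken from the patent.

```python
def last_fulfilled_time(track, threshold=0.9):
    """Return t1, the last timestamp at which the certainty criterion held.

    track: list of (timestamp, certainty) pairs in time order, where
    certainty is a hypothetical value in [0, 1]. Returns None if the
    criterion was never fulfilled.
    """
    t1 = None
    for timestamp, certainty in track:
        if certainty >= threshold:
            t1 = timestamp  # criterion still fulfilled at this time
        else:
            break  # certainty dropped below the threshold: tracking ends
    return t1
```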

(22) The category determining sensor 300a-c is controlled by the central control unit 250 via the sensor link L1-L3. According to one embodiment, the central control unit 250 receives data from the category determining sensor 300a-c and utilizes the data received in order to determine the characteristics of the target 1 or to derive other information from the data received from the category determining sensor 300a-c.

(23) In the illustrated example, the system 100 comprises three sensors 300a-c. However, according to one embodiment, any number of sensors 300a-n and/or any types of sensors able to detect a target 1 and to determine the category and the geographic position of a target 1 can be included in the system 100.

(24) The central control unit 250 and/or the sensors 300a-c comprises, separately or in combination: fulfilment determination circuitry, characteristic determination circuitry, pattern creation circuitry, probability calculator circuitry and sensor controlling circuitry.

(25) The central control unit 250 is arranged to, for a second point in time t.sub.2, perform the following step: create a pattern defining possible geographic positions of the target 1, wherein said pattern extends at least partially around the geographic position 3 where the target 1 was last detected, and wherein the dimension of said pattern is determined based on at least one predetermined parameter.

(26) According to one example the predetermined parameters comprise: the category of the target 1, the characteristics of the surrounding of the geographic position 3 where the target 1 was detected at said first point in time t.sub.1, and a time difference between the first point in time t.sub.1 and a second point in time t.sub.2 for which point in time the pattern is created. According to one example the predetermined parameters further comprise a level of surveillance of the surrounding of the geographic position 3 of the target 1 at said first point in time t.sub.1.
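A minimal sketch of how two of these parameters could bound the dimension of the pattern: the distance the target can have covered between t.sub.1 and t.sub.2 is at most its category's maximum speed multiplied by the elapsed time. The speed table and the function name are hypothetical illustrations, not part of the patented method.

```python
# Assumed maximum speeds per target category, in km/h (illustrative values).
MAX_SPEED_KMH = {"car": 90.0, "tank": 70.0}

def max_pattern_radius_km(category, t1_s, t2_s):
    """Upper bound (km) on how far the target may have moved between t1 and
    t2 (given in seconds); this bounds the pattern's outer dimension."""
    dt_hours = (t2_s - t1_s) / 3600.0
    return MAX_SPEED_KMH[category] * dt_hours
```

Terrain and surveillance level would then shrink this bound further, as described below for FIGS. 3a-3d.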

(27) Information regarding the level of surveillance of the surrounding of the geographic position of the target 1 at said first point in time t.sub.1 is derived from the central control unit 250, possibly in combination with data from an external unit (not illustrated). The level of surveillance depends on the number of sensors 300a-c in a certain geographic volume of interest, the type of sensors 300a-c, the terrain of the surrounding of the geographic area of interest, the ability of the sensors to detect the target category etc.

(28) According to one example, the time difference between the first point in time t.sub.1 and the second point in time t.sub.2 for which the pattern is created is determined by the central control unit 250.

(29) According to one example, the central control unit 250 is arranged to communicate with at least one external unit (not illustrated) in order to retrieve information regarding the parameters, which parameters are used in order to determine the dimensions of the pattern. Said external unit may comprise a database with geographic information, weather information, water current information or other information regarding the surrounding of a target 1. In addition, the database of the external unit may comprise information regarding a category of a target 1, such as the ability of a certain target 1 to move in a certain terrain, maximum speed/acceleration of a certain target etc. According to one example, the central control unit 250 comprises some or all of the information as mentioned above. Said external unit may according to one example also comprise a database describing the ability of different sensor types to detect different categories of targets in different types of terrain.

(30) The central control unit 250 is according to one example connected to a user interface 450. The user interface 450 is according to one example arranged to receive or present data or instructions from/to a user. According to one example, a user can control a sensor 300a-n of the system 100 via the user interface 450.

(31) According to one example, the central control unit 250 controls the sensors 300a-c. According to one example, the central control unit 250 controls the sensors 300a-c based on the created pattern. For example, if a pattern is covering a geographic area, and the geographic area is an area in which only a sensor 300a-c of a certain type can detect a target 1, such a sensor 300a-c able to scan the area of interest is controlled to scan said area.

(32) According to one example the central control unit 250 determines a time schedule for each sensor 300a-n including a number of geographical positions which the sensor 300a-n is controlled to scan at certain points in time.

(33) According to one example, the central control unit 250 is connected to another system (not illustrated) comprising sensors for estimating a geographic position of a target. The central control unit 250 is according to this example able to “borrow” a certain sensor from the other system for surveilling a certain geographic area.

(34) According to one example, the central control unit 250 determines the first point in time t.sub.1 based on data from the sensor 300a-c.

(35) According to one example, the central control unit 250 and/or the sensor 300a-c is arranged to calculate a probability of the presence of the target 1 associated with each geographic position 3 in the pattern at said second point in time t.sub.2.

(36) According to one example, the central control unit 250 is arranged to be able to plan a route for an object based on the generated pattern. According to one example, the route for an object is planned in order to minimize the probability of entering a pattern around a target 1, or in order to minimize the probability of being within a certain distance from a target 1, or in order to minimize the risk of encountering a target 1.
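The route-planning idea above can be sketched as follows: among candidate routes, choose the one whose accumulated probability of passing through pattern cells is lowest. Representing routes as lists of grid cells, the risk measure, and all names are simplifying assumptions for illustration, not the patented implementation.

```python
def route_risk(route, cell_prob):
    """Accumulated presence probability over the cells a route traverses.

    route: list of grid cells; cell_prob: dict mapping pattern cells to
    the probability that the target is present there (0 outside the
    pattern)."""
    return sum(cell_prob.get(cell, 0.0) for cell in route)

def safest_route(routes, cell_prob):
    """Pick the candidate route least likely to enter the pattern."""
    return min(routes, key=lambda r: route_risk(r, cell_prob))
```

A distance-based variant could instead penalize cells within a chosen radius of any pattern cell, matching the "within a certain distance" alternative.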

(37) According to one example, the central control unit 250 is arranged to relate the geographic positions of a pattern for a target 1 to a grid. A grid enables a user-friendly presentation of a pattern.

(38) FIG. 3a illustrates a view of an area of land. In the illustrated area, a target 1 is present. The target 1 in this illustrated example is a land borne vehicle. The area is illustrated in two dimensions. The target 1 in this illustrated example is only able to move along the ground. The target 1 is situated on a road 20. The illustrated area of land comprises subareas 10, 30, 40, wherein the subareas comprise different types of terrain.

(39) According to the illustrated example, the area 10 comprises forest, the area 30 is a swamp area and the area 40 is a meadow area.

(40) The illustrated view comprises the target 1, which target 1 has been detected by a sensor 300a-c of a system 100, and the surrounding area at a first point in time t.sub.1.

(41) In the illustrated example, the sensor 300a-c has categorized the target 1 as a car. According to one example, the central control unit 250 comprises a database comprising information about different categories of targets 1. According to one example, the information in the central control unit 250 for the target category “car” comprises information that the car is able to travel on a road 20 at a maximum speed of 90 km/h. In addition, according to the information in the central control unit 250, the car is not able to travel in a forest area 10, but is able to travel in a swamp area 30 and a meadow area 40 at a maximum speed of 10 km/h. The further information about the category of the target 1, such as the ability to move in certain terrains, maximum speed etc., is according to one example derived from an external database, not illustrated.
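The category database described for the “car” example could be sketched as a simple lookup of per-terrain maximum speeds, where 0 marks impassable terrain such as the forest area 10. The data structure and names are hypothetical; only the speed values come from the example above.

```python
# Per-terrain maximum speeds (km/h) for the "car" category, mirroring the
# example: 90 on a road, impassable forest, 10 in swamp or meadow.
CATEGORY_DB = {
    "car": {"road": 90.0, "forest": 0.0, "swamp": 10.0, "meadow": 10.0},
}

def speed_in_terrain(category, terrain):
    """Maximum speed (km/h) for a category in a terrain. 0.0 means the
    terrain is impassable; unknown terrain is treated as impassable."""
    return CATEGORY_DB[category].get(terrain, 0.0)
```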

(42) The sensor 300a-c determines according to one example the category of the target 1. According to one embodiment, the sensor 300a-c also determines the current speed of the target 1 and the direction of movement of the target 1. According to one example, the current speed is assumed to be the maximum speed of the target 1. According to one example, the terrain in which the target 1 is moving at the first point in time t.sub.1 is assumed to be the terrain in which the target 1 will continue its movement. According to one example, the direction in which the target 1 is moving at the first point in time t.sub.1 is assumed to be the direction in which the target 1 will continue its movement.

(43) FIG. 3b illustrates the same area of land as illustrated in FIG. 3a, for a second point in time t.sub.21. A pattern 2 is illustrated, defining at least one possible geographic position of the target 1 at the second point in time t.sub.21. The geographical position 3 at which the target 1 was situated at the first point in time t.sub.1 is illustrated. According to one example, the pattern 2 is created or generated by means of the central control unit 250. The pattern 2 is created around the geographic position 3 of the target at t.sub.1, wherein the dimensions of the pattern 2 are based on at least one of the following parameters: the category of the target 1; the characteristics of the surrounding of the geographic position 3 where the target 1 was detected at said first point in time t.sub.1; and a time difference between the first point in time t.sub.1 and the second point in time t.sub.21 for which the pattern 2 is created. According to one example, the dimensions of the pattern 2 are further based on a level of surveillance of the surrounding of the geographic position 3 of the target 1 at said first point in time t.sub.1.

(44) According to the illustrated example, as mentioned in relation to FIG. 3a, the target 1 is able to move in a meadow area 40 and a swamp area 30, but not in a forest area 10, and hence, the pattern 2 does not extend into the forest area 10. In addition, according to the illustrated example, the pattern 2 does not extend in all directions around the geographic position 3 even if the target 1 is able to move in all directions. This is according to one example due to a sensor 300a-c of a sensor surveillance system being capable of confirming that the target 1 is not situated in the area 5 above. Alternatively, when generating the pattern 2, it is assumed that the target 1 continues in the travelling direction in which it was travelling at the first point in time t.sub.1, and hence the pattern 2 is adjusted accordingly.

(45) According to the illustrated example, the target 1 is able to move in the swamp area 30 at a certain speed, in the meadow area 40 at a certain speed and on the road 20 at a certain speed, but not in a forest area 10. The shape and dimension of the generated pattern 2 at a second point in time t.sub.2 will be adjusted accordingly.
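The terrain-dependent shape of the pattern can be sketched as a reachability computation on a grid: each cell has a terrain type, and the pattern at the second point in time is the set of cells the target could reach from the geographic position 3 within the elapsed time, given its per-terrain speeds. The grid, the 4-neighbour movement model, the cell size and all names are illustrative assumptions, not the patented algorithm.

```python
import heapq

def reachable_pattern(grid, speeds, start, dt_h, cell_km=1.0):
    """Cells reachable from `start` within dt_h hours (Dijkstra on travel
    time). grid[r][c] is a terrain name; speeds maps terrain -> km/h,
    where 0 marks impassable terrain such as forest for a car."""
    rows, cols = len(grid), len(grid[0])
    best = {start: 0.0}            # minimal travel time found per cell
    heap = [(0.0, start)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > best.get((r, c), float("inf")):
            continue               # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                v = speeds.get(grid[nr][nc], 0.0)
                if v <= 0.0:
                    continue       # impassable terrain: the pattern stops
                nt = t + cell_km / v
                if nt <= dt_h and nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return set(best)
```

A row of forest cells then blocks the computed pattern from spreading past it, just as the pattern 2 does not extend into the forest area 10.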

(46) According to one example, the pattern 2 is generated continuously. According to one example, the pattern 2 is generated on command, for example by a manual command.

(47) According to one example the pattern 2 is generated in two dimensions.

(48) According to one example the pattern 2 is generated in three dimensions, for example for an airborne or a waterborne target.

(49) The pattern 2 is according to one example generated in real time.

(50) The pattern 2 is according to one example generated in a flexible manner in real time based on at least one of the following parameters: the category of the target 1; the characteristics of the surrounding of the geographic position 3 where the target was detected at said first point in time; the time difference between the first point in time t.sub.1 and the second point in time t.sub.2.

(51) The pattern 2 is according to one example generated automatically, for example at certain time intervals. The pattern 2 is according to one example generated automatically, for example continuously. Alternatively, the pattern 2 can be generated on request.

(52) The pattern 2 can be generated automatically on the basis of at least one of the following parameters: the category of the target 1; the characteristics of the surrounding of the geographic position 3 where the target 1 was detected at said first point in time t.sub.1; the time difference between the first point in time t.sub.1 and the second point in time t.sub.2.

(53) The pattern 2 can be generated automatically on the basis of a level of surveillance of the surrounding of the geographic position 3 of the target 1 at said first point in time t.sub.1.

(54) According to one example, the control unit 250 derives further information regarding the ability of a certain category of the target 1 to travel in certain terrain, the maximum speed of the category of the target in different terrain and/or the maximum acceleration of the category of the target 1 in different terrain. The information can be derived from the central control unit 250 itself and/or from an external control unit and/or from the sensors 300a-c. The control unit 250 uses the derived information when generating/creating the pattern 2. According to one example, the control unit 250 derives information regarding the characteristics of the surrounding of the geographic position 3 where the target 1 was detected at said first point in time t.sub.1. The control unit 250 uses this derived information when generating/creating the pattern 2. According to one example, the control unit 250 derives information regarding the time difference between the first point in time t.sub.1 and the second point in time t.sub.2. The control unit 250 uses this derived information when generating/creating the pattern 2.

(55) In addition, the control unit 250 derives information regarding the level of surveillance of the terrain surrounding the target 1. The control unit 250 uses this derived information when generating/creating the pattern 2.

(56) According to one example, each geographic position in the pattern 2 is associated with a probability of the presence of the target 1 in each geographic position in the pattern 2. The probability of the presence of the target 1 in each geographic position in the pattern 2 could for example be illustrated by different colours of the different positions in the pattern 2, not illustrated.

(57) For example, if the detected target 1 is a car driving on a road 20, the probability according to one example that the car stays on the road 20, and hence is situated along the road 20 at a second point in time t.sub.21, is higher than the probability that the car is on a meadow 40, even if the car is able to drive on a meadow 40. This could according to one example be illustrated by making the pattern 2 darker along the road 20 and lighter in the surrounding areas (not illustrated). The probability of the presence of the target in each geographic position in the pattern 2 may be illustrated by other methods than a differing darkness of the positions in the pattern 2.
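The terrain-weighted probabilities described here can be sketched by giving each pattern cell a relative weight that favours the terrain the category prefers (a car is more likely to stay on the road), then normalizing the weights so they sum to one. The weight values and names are arbitrary illustrations, not derived from the patent.

```python
# Illustrative relative weights: a car is assumed far more likely to stay
# on the road than to cross a meadow or a swamp.
TERRAIN_WEIGHT = {"road": 8.0, "meadow": 1.0, "swamp": 0.5}

def cell_probabilities(cells):
    """cells: dict mapping a pattern cell to its terrain name.
    Returns cell -> normalized probability of the target's presence."""
    raw = {c: TERRAIN_WEIGHT.get(t, 0.0) for c, t in cells.items()}
    total = sum(raw.values())
    return {c: w / total for c, w in raw.items()}
```

The resulting probabilities could then drive the darker/lighter rendering of the pattern 2.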

(58) FIG. 3c illustrates the same area of land as illustrated in FIGS. 3a and 3b, and a pattern 2 generated at a different second point in time t.sub.22. The second point in time t.sub.22 for which the illustrated pattern 2 is created occurs later in time than the second point in time t.sub.21 for which the pattern 2 in FIG. 3b is generated.

(59) Since more time has passed from the first point in time t.sub.1 to the second point in time t.sub.22 than to the second point in time t.sub.21 for which the pattern 2.sub.t21 in FIG. 3b was generated or created, a larger pattern 2.sub.t22 is generated or created for this second point in time t.sub.22. The target 1 has possibly moved further away from the geographic position 3 in the time between t.sub.21 and t.sub.22 and hence, the pattern 2 generated for t.sub.22 is larger than the pattern 2 generated for the earlier point in time t.sub.21.
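The growth of the pattern with elapsed time can be illustrated with the following sketch, in which the pattern is approximated by grid cells within a radius of the last known position; the grid spacing and the circular shape are simplifying assumptions.

```python
import math

def pattern_cells(centre, radius, cell_size=10.0):
    """Grid cells (column, row indices) whose centres lie within `radius`
    of `centre`; a crude circular approximation of the pattern 2."""
    base_i = math.floor(centre[0] / cell_size)
    base_j = math.floor(centre[1] / cell_size)
    n = int(radius // cell_size) + 1
    cells = set()
    for di in range(-n, n + 1):
        for dj in range(-n, n + 1):
            if math.hypot(di * cell_size, dj * cell_size) <= radius:
                cells.add((base_i + di, base_j + dj))
    return cells
```

With a radius proportional to the elapsed time, the pattern created for the later t.sub.22 strictly contains the pattern created for the earlier t.sub.21.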

(60) FIG. 3d illustrates the same area of land as illustrated in FIGS. 3a-3c, and a pattern 2 generated at the second point in time t.sub.22. The pattern 2 differs from the pattern 2 illustrated in FIG. 3c, even if the same amount of time has passed between the first point in time t.sub.1 and the second point in time t.sub.22 for which the pattern 2 is generated. The shape of the pattern 2 illustrated in FIG. 3d differs from that of the pattern 2 in FIG. 3c. The difference is due to the presence of at least one sensor 300a-c of a sensor surveillance system of a system for estimating the geographic position of a target 1. The sensor 300a-c (not illustrated) is capable of determining the characteristics of a target 1, which characteristics at least comprise a geographic position and a category of the target 1, in an area 6 at the time t.sub.22 for which the pattern 2 was generated. At the time t.sub.22, the sensor 300a-c could not detect a target 1 within the area 6. Hence, in this described example, the pattern 2 does not extend into the area 6 surveilled by the described sensor.
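The exclusion of the surveilled area 6 from the pattern 2 can be sketched as follows; representing both the pattern and the sensor coverage as sets of grid cells is a simplifying assumption for illustration only.

```python
def exclude_surveilled(pattern, sensor_area, target_detected):
    """Remove from the pattern those cells inside a sensor-surveilled area
    in which no target was detected at the time the pattern is created."""
    if target_detected:
        # a detection inside the area would instead restart the tracking,
        # so the pattern is left unchanged in this sketch
        return set(pattern)
    return {cell for cell in pattern if cell not in sensor_area}
```

Negative sensor information thus shrinks the pattern: the target cannot be where a capable sensor looked and saw nothing.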

(61) According to one example not illustrated, the target 1 is a flying target and the pattern is a three-dimensional pattern created in a volume of air. According to one example not illustrated, the target 1 is a waterborne target and the pattern is a three-dimensional pattern created in a volume of water.

(62) FIG. 4a is a flow chart illustrating a method for estimating the geographic position of a target, according to one embodiment of the present disclosure. In a first step S1, a method for estimating the geographic position of a target is performed. The method comprises the following steps: detecting a target; determining the characteristics of the target, which characteristics at least comprise a geographic position and a category of the target; tracking the detected target until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated to the level of certainty for determining the geographic position of the target; and determining a first point in time t.sub.1 when the predetermined criteria was last fulfilled. Further, for a second point in time t.sub.2 the following step is performed: creating a pattern defining at least one possible geographic position of the target, said pattern extends at least partially around the geographic position of the target at the first point in time, wherein the dimension of said pattern is determined based on at least one predetermined parameter.

(63) FIG. 4b is a flow chart illustrating a method for estimating the geographic position of a target 1, according to one example of the present disclosure.

(64) In a first step, M1 a target 1 is detected.

(65) In a second step M2, the characteristics of the target 1 are determined. The characteristics comprise at least the geographic position 3 of the target 1 and a category of the target 1. The category of the target may comprise information regarding the type of the target 1, such as for example a type of vehicle (car, boat, drone, truck etc.) or a number of people. The category of the target may further comprise information regarding the function or appearance of the target, such as “broken window”, “two passengers”, “child”, “110 cm long” etc. In addition, the category of the target 1 could comprise information about the direction and speed at which the target 1 is moving.

(66) In a third step M3, the target 1 is tracked until at least one certain predetermined criteria is not fulfilled, wherein said criteria is associated to the level of certainty for determining the geographic position of the target. In other words, the target 1 is tracked until it can no longer be seen by the sensors 300a-c of a surveillance system 100. In order to define a situation when the target 1 is no longer “seen”, certain criteria associated to the level of certainty for determining the geographic position of the target 1 are defined, and when those criteria are no longer fulfilled, the target 1 is considered to no longer be seen.
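The tracking criterion of step M3 can be sketched as a certainty threshold; the threshold value and the use of a single scalar certainty per time step are assumptions made for illustration.

```python
def last_tracked_time(certainties, threshold=0.5):
    """certainties: one certainty value per time step, as reported by the sensors.
    Returns the index of the last step at which the criterion held (the first
    point in time t1), or None if the target was never tracked with
    sufficient certainty."""
    t1 = None
    for t, certainty in enumerate(certainties):
        if certainty >= threshold:
            t1 = t
        else:
            break  # criterion no longer fulfilled: the target is no longer "seen"
    return t1
```

The target is thus considered "seen" exactly while the certainty stays at or above the threshold, and t.sub.1 is the last step before the criterion first fails.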

(67) In a fourth step M4, a first point in time t.sub.1 is determined. The first point in time t.sub.1 is defined as the last point in time when the criteria were fulfilled, that is, the last point in time when the target 1 was tracked.

(68) In a fifth step M5, a geographic position 3 of the target 1 at t.sub.1 is determined according to the method.

(69) In a sixth step M6, according to one example of the method, the probability of the presence of the target 1 at different geographic positions is calculated. The calculation is according to one example based on at least one of: the category of the target 1, the speed at which the target 1 was moving when tracked, the surrounding of the geographic position 3 where the target 1 was last seen, a second point in time t.sub.2 at which time the probability of the presence of the target 1 is to be calculated etc.

(70) In a seventh step M7, a pattern 2 is created/generated, which pattern 2 defines at least one possible geographic position of the target 1. Said pattern 2 extends at least partially around the geographic position 3 where the target 1 was detected at said first point in time t.sub.1, wherein the dimension of said pattern 2 is determined based on at least one predetermined parameter. The pattern 2 is according to one example based on at least one of: the category of the target, the speed at which the target was moving when tracked, the surrounding of the geographic position where the target 1 was last seen, and a second point in time t.sub.2 at which time the pattern 2 of the target 1 is of interest for the user to view.

(71) In an eighth step M8, at least one sensor 300a-c of a sensor surveillance system 100 is controlled based on the pattern 2. According to one example, a sensor 300a-c of the sensor surveillance system 100, which sensor 300a-c is capable of detecting a target 1 in a surrounding of certain characteristics, is controlled to scan a surrounding comprised in a pattern 2, at least at said geographic positions where a pattern 2 of a target is present.
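The sensor control of step M8 can be sketched as tasking each sensor only with those pattern cells inside its coverage; the sensor identifiers and coverage sets below are hypothetical.

```python
def sensor_tasks(pattern, sensor_coverage):
    """pattern: set of grid cells where the target may be.
    sensor_coverage: {sensor_id: set of grid cells the sensor can scan}.
    Returns {sensor_id: pattern cells that sensor should scan}, omitting
    sensors whose coverage does not overlap the pattern."""
    return {sid: pattern & cov
            for sid, cov in sensor_coverage.items()
            if pattern & cov}
```

Scanning effort is thereby concentrated on the geographic positions where a pattern of a target is present, as the step describes.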

(72) In a ninth step M9, a route for an object is planned in order to minimize the probability of entering a pattern 2 around a target 1, or in order to minimize the probability of being within a certain distance from a target 1, in order to minimize the risk of encountering a target.
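The route planning of step M9 can be sketched with Dijkstra's algorithm on a grid, where cells inside a target's pattern carry a large extra cost so the cheapest route avoids the pattern when possible; the grid model, unit step cost and penalty value are assumptions, and the sketch assumes the goal is reachable.

```python
import heapq

def plan_route(start, goal, width, height, pattern, penalty=100.0):
    """Cheapest route on a width x height grid from start to goal, where
    entering a cell of the pattern costs an extra `penalty`."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, cell = heapq.heappop(heap)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale heap entry
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue
            step = 1.0 + (penalty if nxt in pattern else 0.0)
            if d + step < dist.get(nxt, float("inf")):
                dist[nxt] = d + step
                prev[nxt] = cell
                heapq.heappush(heap, (d + step, nxt))
    # reconstruct the route from goal back to start
    route, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        route.append(cell)
    return route[::-1]
```

A lower penalty instead expresses the softer goal of merely minimising the probability of encountering the target rather than forbidding the pattern outright.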

(73) In a tenth step M10, the geographic positions of a pattern for a target are related to a grid.
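Relating geographic positions to a grid, as in step M10, can be sketched by snapping coordinates to a cell index; the cell size is an illustrative assumption.

```python
import math

def to_grid_cell(x, y, cell_size=10.0):
    """Map a geographic position (x, y) to the (column, row) index of the
    grid cell that contains it."""
    return (math.floor(x / cell_size), math.floor(y / cell_size))
```

All positions of a pattern mapped through this function yield the set of grid cells the pattern occupies.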

(74) FIG. 5 is a diagram of one version of a device 500. The control units 250 described with reference to FIG. 2 may in one version comprise the device 500. The device 500 comprises a non-volatile memory 520, a data processing unit 510 and a read/write memory 550. The non-volatile memory 520 has a first memory element 530 in which a computer program, e.g. an operating system, is stored for controlling the function of the device 500. The device 500 further comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date input and transfer unit, an event counter and an interruption controller (not depicted). The non-volatile memory 520 also has a second memory element 540.

(75) The computer program P comprises routines for estimating the geographic position of a target.

(76) The computer program P may comprise routines for performing any of the process steps detailed with reference to FIGS. 4a and 4b.

(77) The program P may be stored in an executable form or in compressed form in a memory 560 and/or in a read/write memory 550.

(78) Where it is stated that the data processing unit 510 performs a certain function, it means that it conducts a certain part of the program which is stored in the memory 560 or a certain part of the program which is stored in the read/write memory 550.

(79) The data processing unit 510 can communicate with a data port 599 via a data bus 515. The non-volatile memory 520 is intended for communication with the data processing unit 510 via a data bus 512. The separate memory 560 is intended to communicate with the data processing unit via a data bus 511. The read/write memory 550 is arranged to communicate with the data processing unit 510 via a data bus 514. The links L210, L230, L231, L233, L237, L243 and L253, for example, may be connected to the data port 599.

(80) When data are received on the data port 599, they are stored temporarily in the second memory element 540. When the received input data have been temporarily stored, the data processing unit 510 will be prepared to conduct code execution as described above.

(81) Parts of the methods herein described may be conducted by the device 500 by means of the data processing unit 510 which runs the program stored in the memory 560 or the read/write memory 550. When the device 500 runs the program, method steps and process steps herein described are executed.

(82) The foregoing description of the preferred embodiments of the present invention is provided for illustrative and descriptive purposes. It is not intended to be exhaustive, nor to limit the invention to the variants described. Many modifications and variations will obviously suggest themselves to one skilled in the art. The embodiments have been chosen and described in order to best explain the principles of the invention and their practical applications and thereby make it possible for one skilled in the art to understand the invention for different embodiments and with the various modifications appropriate to the intended use.

(83) The components and features specified above may within the framework of the invention be combined between different embodiments specified.