AIMING-ASSISTANCE METHOD AND DEVICE FOR LASER GUIDANCE OF A PROJECTILE
20170314891 · 2017-11-02
CPC classification: F41G3/145, F41G7/007, F41G7/226, F41G3/02, F41G7/2293 (MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING)
Abstract
A method and a device for assisting aiming at a target, in particular for the purpose of improving the accuracy with which a projectile is guided towards said target by means of a laser beam. The method makes use of a camera serving to capture either a complete image of the environment, or else a selective image of said target in said environment. Thereafter, the method makes it possible to verify that said laser beam is indeed pointing at said target by displaying the point of contact of said laser beam in said environment on the image captured by said camera, and then to determine the accuracy with which said laser beam is indeed pointing at said target. As a function of said accuracy, launching of said projectile may either be confirmed or cancelled. This method also makes it possible to identify the code of said laser beam illuminating said target.
Claims
1. A method of assisting aiming at a target, wherein the method comprises the following steps: a first step of completely scanning an environment with a camera; a second step of displaying a complete image of the environment; a third step of identifying and selecting a target in the complete image of the environment; a fourth step of an operator pointing at the target with a guide beam; a fifth step of displaying a complete image of the environment and of the point of contact of the guide beam in the environment; a sixth step of the operator pointing at the target with the guide beam; a seventh step of selectively scanning the target and the point of contact of the guide beam in the environment with the camera; and an eighth step of displaying a selective image of the target, of the current point of contact of the guide beam in the environment, and of at least one of the previously-displayed points of contact.
2. A method of assisting aiming at a target according to claim 1, wherein during the eighth step, the points of contact of the guide beam in the environment are captured over a predetermined duration, and then displayed together with the selective image of the target.
3. A method of assisting aiming at a target according to claim 1, wherein, after the eighth step of displaying a selective image, the method comprises: a ninth step of calculating a first number of the points of contact of the guide beam touching the target, and a second number of the points of contact of the guide beam not touching the target; and a tenth step of displaying information about the accuracy of the points of contact of the guide beam touching the target.
4. A method of assisting aiming at a target according to claim 3, wherein the information about the accuracy of the points of contact of the guide beam touching the target is formed by the first number and the second number, or else by a percentage of the points of contact of the guide beam actually touching the target from among all of the points of contact of the guide beam in the environment.
5. A method of assisting aiming at a target according to claim 3, wherein the ninth step takes place over a predetermined duration.
6. A method of assisting aiming at a target according to claim 1, wherein a guide beam is defined by a code constituted by time characteristics, and the method comprises, after the fourth step of pointing at the target and in parallel with the following steps, an eleventh step of analyzing and identifying the guide beam, the points of contact of the guide beam in the environment being analyzed in order to determine the time characteristics of the guide beam and thereby identify the code of the guide beam.
7. A method of assisting aiming at a target according to claim 6, wherein when the guide beam is constituted by a succession of pulses, the time characteristics of the guide beam are a frequency and a duration of the pulses.
8. A method of assisting aiming at a target according to claim 1, wherein the method comprises, between the fourth step of pointing at the target and the fifth step of displaying a complete image, an intermediate step of completely scanning the environment with the camera in order to update the display of the target and of the environment.
9. A method of assisting aiming at a target according to claim 1, wherein crosshairs are displayed on the target during the second, fifth, and eighth steps of displaying an image in order to facilitate identifying the target.
10. A method of assisting aiming at a target according to claim 1, wherein the seventh step of selectively scanning the target and the point of contact of the guide beam is performed by capturing each item in the environment for which a radiometric change is detected, and also capturing the point of contact.
11. A method of assisting aiming at a target according to claim 1, wherein the seventh step of selectively scanning the target and the point of contact of the guide beam is performed by capturing information about particular items in the environment that are moving, and also capturing the point of contact.
12. A method of guiding a projectile by means of a guide beam, the method comprising: a step of illuminating a target by a guide beam; a step of locking the projectile on the target; a step of launching the projectile; and a step of guiding the projectile towards the target; wherein the method of assisting aiming according to claim 1 is applied during the step of illuminating a target in order to improve the accuracy with which the target is illuminated.
13. A method of guiding a projectile by means of a guide beam according to claim 12, wherein the step of locking the projectile on the target and/or the step of launching the projectile is/are cancelled as a function of information about the accuracy of the points of contact of the guide beam touching the target and/or about the time characteristics of the guide beam.
14. A device for assisting aiming at a target comprising a camera, display means, a computer, and selector means, wherein the device performs the method according to claim 1 and the camera is a camera suitable for capturing specific information about particular items in the captured environment, the target being a particular item in the environment.
15. A device for assisting aiming at a target according to claim 14, wherein the computer is configured to analyze each selective image successively displayed on the display means and to determine a first number of points of contact of the guide beam touching the target, and a second number of points of contact of the guide beam not touching the target, and then to determine information about the accuracy of the points of contact of the guide beam touching the target.
16. A device for assisting aiming at a target according to claim 14, wherein the computer is configured to analyze the environment seen by the camera and in particular the points of contact of the guide beam in the environment, in order to determine time characteristics of the guide beam and to use the time characteristics to identify the guide beam.
17. A system for guiding a projectile by means of a guide beam, the system comprising a generator for generating a guide beam and a projectile provided with a receiver device, wherein the system for guiding a projectile includes a device for assisting aiming according to claim 14.
18. A system according to claim 17 for guiding a projectile by a guide beam, wherein the system for guiding a projectile by a guide beam performs the method of guiding a projectile by a guide beam according to claim 12.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0102] The invention and its advantages appear in greater detail in the context of the following description of embodiments given by way of illustration and with reference to the accompanying figures.
[0107] Elements that are present in more than one of the figures are given the same references in each of them.
DETAILED DESCRIPTION OF THE INVENTION
[0109] The target 5 is initially illuminated by the guide beam 9 emitted by the generator 6, and reflections of this guide beam 9 are then dispersed in a multitude of directions by reflection on the target 5. The guide beam 9 may be visible or invisible to the human eye, depending on the wavelengths making up the guide beam 9.
[0110] In parallel with this illumination of the target 5, or indeed subsequently, the projectile 10 is launched towards the target 5. The projectile 10 includes a receiver device 11 that, on approaching target 5, receives a portion of the guide beam 9 as reflected by the target 5, and then determines the source of this portion of the reflected guide beam 9. Finally, the projectile 10 is guided and directed towards the source, i.e. the target 5, so long as the guide beam 9 is pointed on the target 5 and illuminates it.
[0111] The device 1 for assisting aiming at a target 5 comprises a camera 2, display means 3, a computer 4, and selector means 7. The display means 3 constitute a screen. The device 1 for assisting aiming at a target 5 is configured to improve the accuracy of the aim on the target 5 by performing a method of providing assistance in aiming at a target as summarized by the diagram shown in
[0112] During a first scanning step 101, an environment is scanned completely by using the camera 2.
[0113] During a second display step 102, a complete image of this environment, corresponding to the complete scan of the environment, is displayed on the display means 3. This complete image is shown in
[0114] During a third step 103 of identifying and selecting a target, a target 5 is identified and then selected in the complete image of the environment. This identification is performed by an operator in charge of aiming at the target with the guide beam 9.
[0115] Thereafter, the operator selects the target 5 in the complete image of the environment. This selection is performed by selector means 7, such as a pushbutton, while the operator is aiming at the target 5 with the guide-beam generator 6, but without emitting the guide beam. By way of example, the operator makes use of the telescopic sights 61 of the guide-beam generator 6 in order to aim at the target 5, and then actuates the selector means 7 in order to select the target 5 being aimed at.
[0116] This selection of the target 5 may also be performed automatically, e.g. when the operator aims at a stationary target 5, or keeps aiming at the target 5 for a first predetermined duration.
[0117] During a fourth step 104, the operator points the guide beam 9 on the target 5 in order to guide the projectile 10 to the target 5. The operator generally makes use of the crosshairs present in the telescopic sights 61 of the guide-beam generator 6 in order to aim at the target 5.
[0118] The first step 101 and the second step 102 are repeated prior to performing the fourth step 104 in order to update the complete display of the environment.
[0119] During a fifth display step 105, the complete image of the environment is displayed on the display means 3 together with the point of contact 91 of the guide beam 9 in the environment. Specifically, the guide beam 9 and its reflection on the target 5 are advantageously still visible to the camera 2. In addition, crosshairs 8 may be displayed on the display means 3, thus informing the operator about the point of the environment that is being aimed at. This complete image, including the point of contact 91 of the guide beam 9 and the crosshairs 8, is shown in
[0120] This fifth display step 105 thus enables the operator to view and verify firstly that the crosshairs 8 are indeed pointed at the target 5, and secondly that the point of contact 91 of the guide beam 9 is also on the target 5. Specifically, the operator must keep the guide beam 9 pointed permanently at the target 5 until the projectile 10 strikes the target 5.
[0121] Also, an intermediate step 115 of completely scanning the environment may be performed between the fourth step 104 of pointing at the target 5 and the fifth display step 105. As a result, by making a new complete scan of the environment with the camera 2, a new complete image of the environment, together with the point of contact 91 of the guide beam 9, can be displayed during the fifth step 105 in order to update this display of the target 5 and of the environment.
[0122] Thereafter, during a sixth pointing step 106, the operator points again at the target 5 with the guide beam 9. This sixth pointing step 106 advantageously enables the operator to correct the aim if the point of contact 91 is not situated on the target 5 in the complete image displayed during the fifth display step 105.
[0123] During a seventh scanning step 107, the camera 2 performs a selective scan of the target 5 and of the point of contact 91 of the guide beam 9 in the environment. The camera 2 is specifically capable of capturing information about particular items in the environment as a result of radiometric changes concerning them, e.g. because they are moving. For example, the camera 2 may capture information only about those particular items in the environment that are moving, together with the points of contact 91 of the guide beam 9.
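Purely by way of illustration, such a selective scan can be sketched as a frame-differencing pass over two successive grey-level frames. The threshold values, and the assumption that a laser return appears as a near-saturated pixel, are hypothetical and are not taken from the description above.

```python
def selective_scan(prev_frame, frame, change_threshold=25, spot_level=250):
    """Illustrative selective scan: keep only pixels whose radiometry
    changed between two successive frames (e.g. moving items), plus any
    very bright pixels, taken here to be laser contact-point returns.
    Frames are 2-D lists of grey levels (0-255); both thresholds are
    hypothetical illustrative values."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            changed = abs(frame[y][x] - prev_frame[y][x]) > change_threshold
            laser = frame[y][x] >= spot_level
            if changed or laser:
                out[y][x] = frame[y][x]  # keep moving items and laser spots
    return out
```

Everything else in the frame is suppressed, which is what makes the selective image simpler to read than the complete image.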
[0124] Finally, during an eighth display step 108, this selective image of the target 5 is displayed together with at least the point of contact 91 of the guide beam 9 in the environment. Once more, the operator can view the point of contact 91 of the guide beam 9 in the selective image of the environment and can verify that the point of contact 91 is still indeed situated on the target 5. Advantageously, this selective image is simplified and displays mainly the target 5, which may be moving for example, together with the point of contact 91 of the guide beam 9 in the environment. This selective image thus speeds up analysis by the operator, who sees immediately the position of the point of contact 91 relative to the target 5. As in the fifth display step 105, the crosshairs 8 may be displayed on the display means 3 in order to inform the operator about the point of the environment that is being aimed at. This selective image, including the point of contact 91 of the guide beam 9 and the crosshairs 8, is shown in
[0125] The sixth, seventh, and eighth steps are then repeated continuously until the projectile 10 strikes the target 5.
[0126] Once the target 5 has been identified and selected, the aiming-assistance device 1 thus uses the images it displays on the display means 3 to provide the operator, in real time and while the aiming operation is being performed, with feedback about the positions of the target 5 and of the point of contact 91 of the guide beam 9 in the environment. The operator can then immediately correct any offset of the point of contact 91 relative to the target 5, and thus improve the accuracy of the aim.
[0127] Furthermore, the aiming-assistance device 1 also serves to quantify the accuracy of the aim. Specifically, the computer 4 is configured to analyze each image that is displayed in succession on the display means 3 and to act during a ninth step 109 to determine a first number of points of contact 91 of the guide beam 9 that touch the target 5, and a second number of points of contact 91 of the guide beam that do not touch the target 5. The computer 4 also serves to calculate the percentage of these points of contact 91 actually touching the target 5 relative to the total number of points of contact 91 of the guide beam 9 in the environment. Thereafter, during a tenth step 110, information 92 about the accuracy of the points of contact 91 touching the target 5 can be displayed on the display means 3 in the form of this percentage of the points of contact 91 that are actually touching the selected target.
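As a non-limiting sketch of the ninth step 109, the first number, the second number, and the percentage can be computed from a list of contact-point coordinates and a mask of the pixels occupied by the target 5. The data representation chosen here (pixel coordinates and a boolean mask) is an assumption.

```python
def aim_accuracy(contact_points, target_mask):
    """Count the contact points of the guide beam that touch the target
    (first number) and those that do not (second number), and return the
    percentage of contact points actually touching the target among all
    captured contact points.
    contact_points: list of (x, y) pixel coordinates of points of contact.
    target_mask: 2-D list of booleans, True where the target lies."""
    hits = sum(1 for (x, y) in contact_points if target_mask[y][x])
    misses = len(contact_points) - hits
    percentage = 100.0 * hits / len(contact_points) if contact_points else 0.0
    return hits, misses, percentage
```

The percentage is the form of the accuracy information 92 mentioned in claim 4; the pair (hits, misses) is the alternative form.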
[0128] The ninth step 109 and the tenth step 110 preferably take place simultaneously with the sixth, seventh, and eighth steps as shown in the summary diagram of
[0129] Also, during the eighth step 108, the points of contact 91 of the guide beam 9 in the environment may be captured during a second predetermined duration, and then displayed together with the selective image of the target 5.
[0130] Under such circumstances, during the eighth display step 108, the selective image of the target 5 and of the current point of contact 91 of the guide beam 9, as picked up during the seventh scanning step 107, can be displayed simultaneously with at least one of the points of contact 91 that were previously displayed during the fifth display step 105 and during any preceding eighth display steps 108.
[0131] Likewise, the ninth step 109 may also take place over the second predetermined duration. As a result, the accuracy information 92 displayed during the tenth step 110 is determined over this second predetermined duration.
[0132] This information 92 is displayed on the display means 3 together with the points of contact 91 captured during the second predetermined duration, as shown in
[0133] This information 92 can also be used by the system 24 for guiding a projectile 10 in order to confirm, or else cancel, the locking of the projectile 10 on the target 5 and the launching of the projectile 10 towards the target 5. Specifically, if the aiming accuracy is judged by the operator to be too low, the operator can cancel launching of the projectile 10 or can pause it momentarily until sufficient aiming accuracy is achieved. Such cancellation may also be performed automatically if the accuracy information 92 about the points of contact 91 of the guide beam 9 touching the target 5 is below a predetermined threshold.
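The automatic cancellation described above can be illustrated by a minimal gate on the accuracy information 92; the 80% threshold and the use of the most recent reading are arbitrary illustrative choices, not values taken from the description.

```python
def launch_gate(accuracy_history, threshold=80.0):
    """Gate the locking/launch steps on the accuracy information:
    confirm while the latest hit percentage meets a predetermined
    threshold, cancel automatically when it falls below, and hold
    while no accuracy reading is available yet.
    accuracy_history: hit percentages, most recent last."""
    if not accuracy_history:
        return "hold"          # no reading yet: pause the sequence
    if accuracy_history[-1] >= threshold:
        return "confirm"       # accuracy sufficient: launch may proceed
    return "cancel"            # below threshold: automatic cancellation
```
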
[0134] Finally, the aiming-assistance device 1 serves to identify the code of the guide beam 9 aiming at the target 5. Specifically, the computer 4 is configured to analyze the environment seen by the camera 2, and in particular the points of contact 91 of the guide beam 9 in the environment. During an eleventh step 111, the computer 4 can thus determine the time characteristics of the guide beam 9 and identify the code of the guide beam 9. This eleventh step 111 takes place after the fourth step 104 and in parallel with the following steps.
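As a sketch of the eleventh step 111, when the guide beam is constituted by a succession of pulses, its pulse repetition frequency can be estimated from the instants at which the point of contact 91 is detected, and matched against a table of known codes. Real designator codes are generally more elaborate than a single frequency, and the code names and tolerance used below are assumptions.

```python
def identify_beam_code(pulse_times, codes, tolerance=0.01):
    """Estimate the pulse repetition frequency of the guide beam from
    detection time stamps and match it against known codes.
    pulse_times: instants (in seconds) at which the contact point
                 was detected, in increasing order.
    codes: mapping of code name -> expected pulse frequency in Hz.
    tolerance: relative frequency tolerance (illustrative value).
    Returns the name of the matching code, or None."""
    if len(pulse_times) < 2:
        return None  # not enough pulses to estimate a frequency
    intervals = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    freq = 1.0 / mean_interval
    for name, expected in codes.items():
        if abs(freq - expected) <= tolerance * expected:
            return name
    return None
```

The pulse duration, the other time characteristic mentioned in claim 7, could be checked in the same way from the width of each detected pulse.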
[0135] Naturally, the present invention may be subjected to numerous variations as to its implementation. Although several embodiments are described, it should readily be understood that it is not conceivable to identify exhaustively all possible embodiments. It is naturally possible to envisage replacing any of the means described by equivalent means without going beyond the ambit of the present invention.