System and method of marksmanship training utilizing a drone and an optical system
11662178 · 2023-05-30
Inventors
Cpc classification
F41A33/06
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41A33/02
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
A63F9/0291
HUMAN NECESSITIES
F41J5/10
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41A33/04
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/2605
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
International classification
F41G3/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41A33/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A shooting simulation system and method. The system includes a plurality of firearms. Each firearm is associated with a separate soldier having a man-worn computer, a location device for determining a location of the soldier, an optical system for capturing an image where the captured image provides information on a trajectory of a virtual bullet fired from a shooting firearm, and an orientation device for obtaining the orientation of the firearm when shooting the firearm. Furthermore, the system includes an aerial drone having a camera to capture a second image. The system also includes a shooter/target location resolution module for identifying a valid target and a target image recognition module for determining an impact location where a virtual bullet from the shooting firearm would impact within the captured images and determining if an identified target from the captured images is a hit or a miss.
Claims
1. A method, comprising: actuating a trigger of a firearm to fire a simulated bullet at a target; determining, using one or more computers, that the target is a valid target, wherein the one or more computers identify an identifier associated with the target to determine that the target is a valid target; capturing an image when the trigger is actuated, wherein the image is captured using a camera mounted on the firearm and a drone camera mounted on a drone; determining, using the one or more computers, a trajectory of the simulated bullet; determining, using the one or more computers and based on the determined trajectory, an impact location where the simulated bullet would impact, wherein the one or more computers analyze the image and use the captured image to calculate the impact location; and determining, using the one or more computers and based on the determined impact location, a hit or miss of the simulated bullet on the target, wherein the one or more computers use the captured image from the camera mounted on the firearm and the drone camera to determine the hit or miss of the simulated bullet on the target.
2. The method of claim 1, wherein the one or more computers use the captured image to determine the trajectory of the simulated bullet.
3. The method of claim 1, further comprising: detecting a heading of the firearm, wherein the heading of the firearm is detected using an orientation sensor mounted on the firearm; and detecting an orientation of the drone camera, wherein the heading and pitch of the drone camera are detected using a drone orientation device mounted on the drone.
4. The method of claim 3, wherein the one or more computers use a triangulation analysis of the detected heading of the firearm and the detected orientation of the drone camera to determine the trajectory of the simulated bullet.
5. The method of claim 3, wherein the one or more computers use the detected heading of the firearm and the detected orientation of the drone camera to determine that the target is a valid target.
6. The method of claim 1, further comprising: detecting a location of the firearm, wherein the location of the firearm is detected using a first location sensor associated with the firearm.
7. The method of claim 6, wherein the one or more computers use the detected location of the firearm to determine that the target is a valid target.
8. The method of claim 6, further comprising: detecting a location of the target, wherein the location of the target is detected using a second location sensor associated with the drone.
9. The method of claim 8, wherein the one or more computers use the detected location of the drone to determine that the target is a valid target.
10. The method of claim 9, wherein the one or more computers use the detected location of the firearm and the detected location of the drone to determine that the target is a valid target.
11. The method of claim 1 wherein the image captured using the camera mounted on the firearm is obscured and the drone camera provides a different view of the target.
12. A system, comprising: a firearm, the firearm comprising a trigger adapted to be actuated to fire a simulated bullet at a target; a camera mounted on the firearm and adapted to capture a first image when the trigger is actuated; a drone camera mounted on an aerial drone and adapted to capture a second image of the target; and one or more computers configured to: determine that the target is a valid target, wherein the one or more computers are adapted to identify an identifier associated with the target to determine that the target is a valid target; determine a trajectory of the simulated bullet; determine, based on the determined trajectory, an impact location where the simulated bullet would impact, wherein the one or more computers are adapted to analyze the first and second images and use the first and second images to calculate the impact location; and determine, based on the determined impact location, a hit or miss of the simulated bullet on the target, wherein the one or more computers are adapted to use the captured first and second images to determine the hit or miss of the simulated bullet on the target.
13. The system of claim 12, wherein the one or more computers are adapted to use the captured first and second images to determine the trajectory of the simulated bullet.
14. The system of claim 12, further comprising: an orientation sensor mounted on the firearm and adapted to detect a heading of the firearm; and a drone orientation sensor mounted on the drone and adapted to detect an orientation of the drone camera.
15. The system of claim 14, wherein the one or more computers are adapted to use the detected heading of the firearm and the orientation of the drone camera to determine the trajectory of the simulated bullet.
16. The system of claim 15 wherein the one or more computers are configured to utilize triangulation of the detected heading of the firearm and the orientation of the drone camera to determine the trajectory of the simulated bullet.
17. The system of claim 14, wherein the one or more computers are adapted to use the detected heading of the firearm and the orientation of the drone camera to determine that the target is a valid target.
18. The system of claim 12, further comprising: a first location sensor associated with the firearm and adapted to detect a location of the firearm; and a second location sensor associated with the drone and adapted to detect a drone location of the drone.
19. The system of claim 18, wherein the one or more computers are adapted to use the detected location of the firearm and the drone to determine that the target is a valid target.
20. The system of claim 18, further comprising: a third location sensor associated with the target and adapted to detect a location of the target.
21. The system of claim 20, wherein the one or more computers are adapted to use the detected location of the target to determine that the target is a valid target.
22. The system of claim 21, wherein the one or more computers are adapted to use the detected location of the firearm and the drone to determine that the target is a valid target.
23. The system of claim 12 wherein the image captured using the camera mounted on the firearm is obscured and the drone camera provides a different view of the target.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE INVENTION
(5) The present invention is a shooting simulation system and method.
(6) Oftentimes, a soldier does not have a clear view of the target, such as when the target is partially obscured by a bush or tree. To enhance identification and location refinement of enemy and friendly forces in such a situation, the system 10 may include a drone 80 or unmanned aerial vehicle. The drone includes a drone camera 82, a drone GPS device 84, a drone orientation device 86, and a transmitter/receiver 88. The drone camera allows capture of images. The drone GPS device 84 provides the location of the drone 80, while the drone orientation device provides yaw, pitch, and roll information on the orientation of the line-of-sight of the drone camera 82 relative to an image on the ground. The transmitter/receiver 88 may be configured to communicate with the central computing system 26 or directly with the man-worn computer 16. The drone may be controlled by a soldier or any individual communicating with the central computing system. Alternatively, the drone could be autonomously programmed to follow a specific squad or soldier. The drone is utilized for observing forces and is preferably a defensive or reconnaissance platform, rather than an offensive platform.
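The drone telemetry described above (GPS position plus the camera line-of-sight orientation) can be sketched as a simple record. The following Python is an illustrative sketch only; the class name, field names, and units are assumptions and not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class DroneTelemetry:
    # Position from the drone GPS device 84 (decimal degrees, meters)
    lat: float
    lon: float
    alt_m: float
    # Camera line-of-sight orientation from the drone orientation device 86
    yaw_deg: float    # heading of the camera axis, 0 = north
    pitch_deg: float  # negative values look down toward the ground
    roll_deg: float

    def is_looking_down(self) -> bool:
        """True when the camera line of sight points below the horizon."""
        return self.pitch_deg < 0.0
```

A packet like this would accompany each drone image so that the receiving module can relate the image to a ground location.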
(8) The optical system 18 may include the optical image capturing device 52 mounted on the firearm. The optical image capturing device 52 is aligned relative to a known orientation or sight of the firearm and captures an image when the trigger 32 is actuated. The image is then recorded and stored in one or more modules, such as the target image recognition module 60, the man-worn computer 16, or the central computing system 26. Furthermore, the image recording device may be integrated into a scope used on the firearm. The optical system 18 may be located in the firearm, or portions of the optical system, with the exception of the optical image capturing device, may be separate from the firearm but still carried by the soldier (e.g., in the man-worn computer 16). In addition, the optical image capturing device may transmit the captured image without recording the image, as the image may be recorded in another node, such as the man-worn computer. In one embodiment, the firearm and associated components (i.e., the optical image capturing device) may communicate via a wireless or wired link with the man-worn computer. In one embodiment, the optical system, with the exception of the optical image capturing device, and/or the man-worn computer are incorporated in a smart mobile phone. In a similar fashion as the optical image capturing device 52, the drone camera captures images, which may be sent to the target image recognition module 60, the man-worn computer 16, or the central computing system 26.
(9) The system 10 may include the target image recognition module 60, which may be located anywhere in the system, such as the man-worn computer 16, the central computing system 26, or another node of the system 10. The target image recognition module 60 may store data on ballistics for bullets or other munitions which would be fired from the firearm. The target image recognition module 60 is utilized to determine where a firearm's virtual bullets/munitions impact, i.e., the impact location, relative to the intended target based on the captured image at the time of trigger actuation. Furthermore, the target image recognition module 60, utilizing the calculated impact location, provides the functionality of determining whether a hit or a miss is awarded based on where the virtual bullets/munitions of the firearm are calculated to hit relative to the target. Additionally, the system may include a shooter/target location resolution module 62 which may utilize coordinate system mathematics to determine if a valid target is within a predetermined resolution zone 70, as depicted in
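The coordinate-system mathematics that the shooter/target location resolution module 62 might apply can be illustrated with a simplified two-dimensional check of whether a target lies inside a cone-shaped resolution zone centered on the firearm's heading. This is a sketch under stated assumptions: the function name, the default half-angle, and the maximum range are illustrative values, not figures from the disclosure.

```python
import math

def in_resolution_zone(shooter_xy, heading_deg, target_xy,
                       half_angle_deg=5.0, max_range_m=500.0):
    """Return True if the target lies inside a cone-shaped resolution
    zone centered on the firearm's heading (simplified 2-D east/north
    check; heading 0 = north, measured clockwise)."""
    dx = target_xy[0] - shooter_xy[0]   # east offset
    dy = target_xy[1] - shooter_xy[1]   # north offset
    rng = math.hypot(dx, dy)
    if rng == 0.0 or rng > max_range_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    # Smallest signed angular difference between bearing and heading
    diff = abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= half_angle_deg
```

If this first check fails, no further hit/miss calculation is needed, mirroring the two-step adjudication described later in the specification.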
(11) The target image recognition module 60 may utilize silhouette extraction techniques of targets (e.g., soldiers, vehicles, human forms, etc.) to determine and recognize a target. For instance, silhouette extraction of targets may be obtained by utilizing computer vision techniques as well as ancillary identifiers, such as helmets, gun shape, vehicle features, etc. Furthermore, as targets are known to the system, the potential targets can be photographed and added to a database and artificial intelligence may learn to recognize specific targets.
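Silhouette extraction as described above can be approximated, for illustration, by thresholding a grayscale image and bounding the silhouette pixels; a production system would use full computer-vision techniques and the ancillary identifiers mentioned. The function name and threshold value here are assumptions.

```python
def silhouette_bbox(gray, threshold=128):
    """Return the (x_min, y_min, x_max, y_max) bounding box of the dark
    (sub-threshold) silhouette pixels in a grayscale image given as a
    list of rows of 0-255 values, or None if no silhouette is found.
    A minimal stand-in for silhouette extraction of a target."""
    coords = [(x, y)
              for y, row in enumerate(gray)
              for x, v in enumerate(row)
              if v < threshold]
    if not coords:
        return None  # nothing dark enough to be a silhouette
    xs = [c[0] for c in coords]
    ys = [c[1] for c in coords]
    return (min(xs), min(ys), max(xs), max(ys))
```

The extracted box could then be compared against stored target shapes (helmets, gun shapes, vehicle features) by a recognizer trained on the photographed target database.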
(12) The man-worn computer 16 may also include an aural system, which may be incorporated in the firearm itself or provided as a separate component worn by the soldier 12. The aural system may provide an indication of when a hit has been calculated against the targeted soldier (e.g., designating a kill to the targeted soldier) as well as near-miss cues (e.g., bullet flyby noise for close shots).
(13) The target image recognition module 60 may determine if the image is a recognizable target (e.g., a human form). The target image recognition module 60 may utilize several sources of information to verify the validity of the target. Furthermore, the target image recognition module 60 may include ballistic data of a projected firing of a bullet or other type of projectile utilized by the firearm to determine where the bullet would hit. Moreover, the shooter/target location resolution module 62 may receive the geographic location indicia of soldiers utilizing the system 10 and identify a target within the zone 70. In one embodiment, the shooter/target location resolution module 62, by obtaining the geographic location indicia of both the shooter and the target, may know the range between the firearm and the target. In addition, the target image recognition module 60 may optionally be used to determine an accurate projected trajectory of the bullet (i.e., the bullet ballistics) for the particular target at a determined range, thereby determining an impact location of the bullet. As discussed above, the determination of where a virtual bullet/munition would impact, and thus the determination of a hit or miss, may utilize various forms of data. Furthermore, the orientation device 24 may provide the orientation of the firearm relative to a known three-dimensional coordinate system through the measurement of roll, yaw, and pitch rotations of the firearm; this orientation, together with the distance to the target, weather conditions (wind, altitude, etc.), movement of the gun, and the like, may also be used to determine the trajectory of the bullet/munition and its impact location. The bullet trajectory calculated by the target image recognition module 60 is then used to determine where the bullet would have hit, and from the determination of the bullet's virtual position relative to the intended target, a determination of a hit or miss may be accomplished.
Thus, the present invention may be utilized to accurately determine the position where the virtual bullet would impact, i.e., the impact location, relative to the target, and thereby determine if it is a hit or miss. A hit may be defined by predetermined constraints, which may be stored in the man-worn computer, the central computing system, or another node in the system. The man-worn computer 16 may utilize various navigation and motion systems to collect data for accurate determination of the bullet's trajectory and/or location of the soldier, such as GPS, accelerometers, and magnetometers. The ultimate determination of a hit or miss is accomplished by the target image recognition module 60 if a valid target is determined to be within the resolution zone as determined by the shooter/target location resolution module 62. Furthermore, the drone 80 may communicate information (images, orientation of the drone camera, drone location, etc.) to the target image recognition module 60, thereby providing additional information for determining the impact location of any bullet/munition as well as the actual identification of friendly or enemy forces in the captured image. Additionally, by having images captured from a different location than the soldier's perspective, views often obscured from the soldier may be visible from the drone's position, providing more accurate information. Furthermore, by utilizing the drone camera, identification of the target can be enhanced. Utilizing two images, the target image recognition module may determine a more accurate probability of which target the bullet impacts.
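One way the firearm image and the drone image could be combined is bearing triangulation: intersecting the firearm's heading ray with the drone camera's line-of-sight ray in a local ground plane to locate the target. The sketch below is a simplified two-dimensional illustration; the function and its east/north coordinate conventions are assumptions, not the disclosed implementation.

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays (e.g., from the firearm and the drone
    camera) in a local east/north plane.  Bearings are degrees clockwise
    from north.  Returns the (east, north) fix, or None if the rays are
    parallel and never intersect."""
    d1 = (math.sin(math.radians(bearing1_deg)),
          math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)),
          math.cos(math.radians(bearing2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None  # parallel lines of sight
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # Distance along ray 1 to the intersection (Cramer's rule)
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

Two well-separated vantage points give a much better range estimate than either bearing alone, which is one reason the drone view improves the impact-location calculation.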
(14) In one embodiment, the captured image, a portion of the image (a relevant cropped image), or several images and any relevant data are sent to the target image recognition module 60. In one embodiment, the target image recognition module 60 resides in the man-worn computer 16. In another embodiment, the target image recognition module 60 resides with the central computing system 26. To reduce transmission data, the optical system of the firearm, in one embodiment, may send a cropped image of the relevant portion of where the virtual bullets or munitions would impact (the impact location) to any remotely located target image recognition module 60. The central computer may also provide the functionality to manage a wireless network encompassing the plurality of soldiers having firearms 14. The target image recognition module 60 determines a hit or miss from information gathered from the shooter/target location resolution module 62 (whether a valid target is within the resolution zone 70) and its own calculation of the impact location of the bullet. As discussed above, the target image recognition module 60 may reside anywhere within the system. The central computing system may provide overall control of a training session, such as tabulating and informing soldiers of a hit, a kill, or a miss, and controlling the timing of the training session. Furthermore, where a target is concealed behind objects such as bushes, trees, or buildings, the target image recognition module 60 or another node or module may determine the probability of a hit, kill, or miss. The shooter/target location resolution module 62 along with the target image recognition module 60 may resolve the majority of shooting scenarios realistically; however, there are situations where more analysis is needed for a realistic simulation. A disambiguation module 28 may be utilized in various scenarios.
The disambiguation module 28 may reside anywhere in the system, such as the man-worn computer or the central computing system. In one scenario, a common tactical technique used by soldiers is known as “recon by fire.” From a covered position, soldiers fire into a location where enemy soldiers may be concealed behind bullet-penetrable objects, such as bushes. In the real world, the shooting soldier would see or hear an active response (return fire, sounds, movement) or get no response. The shooter/target location resolution module 62 is aware of the enemy's location and, if the enemy is outside the resolution zone, issues a miss. However, if the shooter/target location resolution module 62 determines that the enemy is within the resolution zone, the target image recognition module sees only bushes and cannot determine a hit or miss. The real-world soldier also cannot know a hit or miss with certainty. In this case, the system would apply a hit probability based on the number of bullets fired into the resolution zone. Another possibility is that the enemy soldier is not only concealed by bushes but also covered by an impenetrable wall. To resolve this situation, the system may utilize a terrain database (most live training occurs at bases where the terrain is well known). In this scenario, the shooting soldier would get a miss just as he would in the real world. In another situation, where a soldier leads a moving target, further calculations must be made. To determine a hit or miss, the system, through the disambiguation module, must compute the paths of the target and the bullet to determine if they intersect at a point in time. Subsequent images taken before and immediately after the trigger pull may be used to verify computations, using the velocity of the target and the bullet ballistics. In one embodiment of the present invention, a terrain database and/or artificial intelligence (AI) may be utilized.
This image-based system is ideal for establishing and maintaining a high-fidelity representation of real-world terrain features. During a training exercise, each shot fired will yield at least one high-resolution, uncompressed image. The man-worn computer has the capacity to save complete images, including misses and the large portion of each image that is not needed by the target image recognition module to determine a hit or miss. Each image may be logged with geographic location and field-of-view orientation. Hundreds of images from exercises may be added to update the database with changes to structures and seasonal foliage. Saved images that contain a valid target, including misses, may also be used to train AI programs.
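The recon-by-fire adjudication above, where a hit probability is applied based on the number of bullets fired into the resolution zone, can be modeled by treating shots as independent trials. This is an assumed model, not the disclosed algorithm, and the per-round probability `p_single` is an illustrative, tunable parameter.

```python
def concealed_hit_probability(rounds_fired, p_single=0.05):
    """Probability that at least one of `rounds_fired` shots into the
    resolution zone hits a target concealed by penetrable cover,
    assuming independent shots with per-round hit probability p_single
    (an assumed, tunable value)."""
    return 1.0 - (1.0 - p_single) ** rounds_fired
```

Under this model a single burst into a bush yields a small but nonzero hit chance that grows with the number of rounds, approximating the uncertainty the real-world shooter would face.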
(15) It should be understood that the calculation of a hit or miss, as well as the identity of the target, is determined by information gathered by the target image recognition module 60 and the shooter/target location resolution module 62 and does not require the use of beacons or other identifying indicia worn by the targeted soldier or vehicle. Thus, the present invention utilizes sensors/data obtained from the captured image and the location indicia generated by the GPS device of each firearm; the targeted soldier is a passive target which emits no active electronic emissions for identifying the targeted soldier.
(16) In another embodiment, the determination of a hit or miss from virtual bullets/munitions can be calculated in a distributed network, where specific calculations or procedures are performed by specific components (nodes) in the network. For example, some of the calculations may be conducted by the man-worn computer while other calculations are completed by the central computing system. In one embodiment, as discussed below for system 110, the target image recognition module 60 (which may reside in the central computing system 26) adjudicates (determines) whether a virtual bullet/munition fired by the shooting firearm is a hit (including where on the target the hit occurs), a kill, or a miss, and on which target. To illustrate, the optical image capturing device captures the image. In a first calculation step, the shooter/target location resolution module 62 determines if a valid target lies within the resolution zone, using information such as the orientation of the firearm and the geographical locations of the shooter and the target. In this first calculation step, if it is determined that the target does not lie within the resolution zone 70, no further calculation is necessary as the shot would be considered a miss. However, if it is determined that a valid target lies in the resolution zone 70, a second calculation step may be performed by the target image recognition module 60, which utilizes stored ballistics for the firearm and munitions used, as well as the captured image, to determine a more exact and accurate impact location of the bullet or munition. This information is then utilized by the target image recognition module 60, which determines a hit or miss.
In another embodiment, for a moving target, the target image recognition module 60 or disambiguation module 28 calculates where the moving target would be by using the distance traveled by the target over a certain time and, from this information, determines if a bullet/munition would hit the target. In this way, a soldier may practice “leading” the moving target, providing realistic marksmanship training. Furthermore, the system may employ artificial intelligence (AI) to learn from each training session to improve the accuracy of the hit/miss adjudication. Also, in another embodiment of the present invention, each soldier may include ancillary identifiers which assist the optical system in determining if the target is a human.
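The moving-target lead calculation described above reduces, under a constant-velocity assumption, to solving for the earliest time at which the bullet's travel distance equals the distance to the target's future position. The following sketch is an illustrative simplification (straight-line bullet, no drag); the function name and conventions are assumptions.

```python
import math

def intercept_time(rel_pos, target_vel, bullet_speed):
    """Earliest time (seconds) at which a bullet of constant speed can
    meet a target at relative position rel_pos (meters) moving with
    constant velocity target_vel (m/s), or None if no intercept exists.
    Solves |rel_pos + t*target_vel| = bullet_speed * t (a quadratic)."""
    rx, ry = rel_pos
    vx, vy = target_vel
    a = vx * vx + vy * vy - bullet_speed ** 2
    b = 2.0 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry
    if abs(a) < 1e-12:                 # target and bullet speeds equal
        return -c / b if b < 0 else None
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                    # target outruns the bullet
    t1 = (-b - math.sqrt(disc)) / (2.0 * a)
    t2 = (-b + math.sqrt(disc)) / (2.0 * a)
    times = [t for t in (t1, t2) if t > 0]
    return min(times) if times else None
```

Images taken just before and after the trigger pull could supply the target velocity used here, as the specification suggests.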
(17) With reference to
(18) The target image recognition module 60 may store ballistic data for the firearm as well as the shooting conditions to assist in determining where the virtual or notional bullets/munitions would actually hit based on parameters at the time of firing. As discussed above, the determination of whether a valid target lies in the resolution zone 70, performed by the shooter/target location resolution module 62, may utilize various forms of data. The inclination and orientation of the barrel of the gun, the distance to the target, the locations of the target and shooter, the drone location, the drone camera orientation, etc. may be used to determine if any valid target is being targeted within the resolution zone 70. If there is no valid target within the resolution zone, no further calculations are necessary since there is no possibility of hitting a target if there is no target. However, if there is a valid target identified within the resolution zone 70, the target image recognition module 60 may, using various types of data, perform a second calculation to determine the impact location of the bullet/munition. Various types of information, including the movement of the gun, weather conditions (wind, altitude, etc.), the range between the shooting firearm and the target, and the ballistics of the firearm and munition, may be used to determine the trajectory of the bullet in combination with a trajectory extracted from the captured image. The target image recognition module 60 may utilize various navigation and motion systems to collect data for accurate determination of the bullet's trajectory and/or location of the soldier, such as GPS, magnetometers, and accelerometers. Thus, the shooter/target location resolution module 62 first identifies if a valid target is within the resolution zone, and the target image recognition module 60 determines the impact location of the bullet.
Furthermore, the target image recognition module 60 determines if the impact location of the bullet is a hit or miss.
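A first-order term in any such ballistic model is gravity drop over the flight time to the target. The sketch below ignores drag and wind, which the module would also account for; it is an illustrative approximation, not the disclosed ballistic calculation.

```python
def bullet_drop(range_m, muzzle_velocity_mps, g=9.81):
    """Vertical drop (meters) of a bullet over a flat-fire range,
    ignoring drag: flight time t = range / velocity, drop = g*t^2/2.
    A first-order sketch of one term in a ballistic model."""
    t = range_m / muzzle_velocity_mps
    return 0.5 * g * t * t
```

At rifle ranges this drop is tens of centimeters, which is why the range between shooter and target (known from the location indicia) materially changes the computed impact location.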
(19) The central computing system may receive the hit or miss data from the target image recognition module 60 and may independently determine/verify a hit or miss of the target. In addition, the central computing system manages the location of all the soldiers as well as compiling all the hits and misses of each soldier at a specific location and time during the simulation. This compilation may be used for debriefing of the soldiers and determination of the success of each soldier and each team. The central computing system may compile such data as time of firing, accuracy, number of bullets fired, times the soldier is targeted, etc. In one embodiment, the central computing system may provide a playback of each encounter, providing a graphical representation of each soldier, trajectory of the bullets, etc. In addition, the optical system may capture images which are enhanced by infrared detection or night vision systems, enabling optical image pickup in reduced visibility. These images may be downloaded to other computer devices or printed. Furthermore, the central computing system may send back information on a hit or miss to the intended target. For example, the target (targeted soldier or other object) may be informed that he is killed by receiving an aural warning. The target image recognition module 60 may also determine where a hit occurs on the target and if the target is killed or disabled. In addition, where a target is hidden behind cover (e.g., a building) or concealment (e.g., a bush), the man-worn computer or central computing system may determine if the target is hit. A Monte Carlo simulation, which provides probabilities of random events (e.g., whether a bullet would hit a concealed target), may be employed for determining a hit. This may include a probability chart based on variables such as range, shots fired, etc.
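The Monte Carlo approach mentioned above can be sketched as repeatedly sampling a shot-dispersion model and counting the fraction of simulated impacts that land on the concealed target's silhouette. All parameters below (dispersion, silhouette half-width, trial count) are illustrative assumptions, not values from the disclosure.

```python
import random

def monte_carlo_hit_probability(n_trials=10000, dispersion_m=0.5,
                                target_halfwidth_m=0.25, seed=42):
    """Monte Carlo estimate of hitting a concealed target: each trial
    draws a lateral impact offset from a Gaussian dispersion model and
    counts impacts falling within the target silhouette half-width.
    All parameters are illustrative assumptions; the fixed seed makes
    the estimate repeatable."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials)
               if abs(rng.gauss(0.0, dispersion_m)) <= target_halfwidth_m)
    return hits / n_trials
```

The same machinery could be extended with range-dependent dispersion to produce the probability chart (range, shots fired, etc.) the specification describes.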
(20) The present invention may also utilize an aural system to alert a soldier that the soldier has been hit or utilize blanks fired from the firearm to provide realistic sounds during the simulation (e.g., firing of the firearm, such as the firing of blanks or bullets passing in close proximity to the soldier).
(22) The target image recognition module 60 may optionally send the hit/miss information and any relevant data to the central computing system which then manages the location of all the soldiers as well as compiling all the hits and misses of each soldier at a specific location and time during the simulation. This compilation may be used for debriefing of the soldiers and determination of the success of each soldier and each team. The central computing system may compile such data as time of firing, accuracy, number of bullets fired, times the soldier is targeted, etc. In one embodiment, the central computing system may provide a playback of each encounter providing a graphical representation of each soldier, trajectory of the bullets, etc. In addition, the central computing system may independently determine/verify a hit or miss of the target. Since the central computing system includes the position of each soldier and the information on the triggered firearm (e.g., heading and inclination of barrel, distance to target, etc.), the central computing system may determine/verify a hit or miss. In step 214, this verification of a hit may be sent back to the intended target (i.e., the targeted soldier) to inform of a hit.
(23) The present invention may optionally utilize geographic location indicia generated by the GPS device 20 carried with each soldier. The GPS device may then transmit this location indicia to the shooter/target location resolution module 62, where the locations of each soldier, both target and shooter, are determined. This location indicia may be used to identify the appropriate target and shooter and to determine if the projected impact location of the bullet or munition is within the resolution zone 70. To minimize data transmission, location data could be sent only to soldiers within range of one another.
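The within-range filter for location broadcasts can be sketched with a great-circle distance check between two GPS fixes. The function name and the default range threshold are illustrative assumptions.

```python
import math

def within_range(lat1, lon1, lat2, lon2, max_range_m=1000.0):
    """Haversine great-circle distance check used to decide whether two
    soldiers are close enough to exchange location data.  Coordinates
    are decimal degrees; max_range_m is an assumed threshold."""
    r = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= max_range_m
```

Filtering broadcasts this way keeps each man-worn computer's traffic proportional to nearby soldiers rather than the whole exercise.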
(24) In another embodiment of the present invention, the system 10 may perform the various computing functions in a distributed network. In this network, the firearm (man-worn computer) communicates with one or more other firearms (man-worn computers) using the wireless transmitter/receivers 16. Any necessary information is passed from one node (i.e., firearm or man-worn computer) to another node. In one embodiment, the wireless transmitter/receiver enables the use of a wireless network for communicating between each firearm/man-worn computer. The functionality of the target image recognition module 60 and the shooter/target location resolution module 62 may reside in any node, such as a man-worn computer or the central computing system 26, depending on where efficiency gains and reduced latency occur.
(25) The various components (e.g., parts of the optical system, wireless transmitter/receiver, image recording device, etc.) associated with each firearm in system 10 may be arranged in various configurations. For example, the man-worn computer may be a separate component worn by the soldier and communicating with the firearm or may be integrated into the firearm. Furthermore, the firearm may be incorporated with a vehicle, either manned or unmanned.
(26) Although the present invention has illustrated the use of firearms, the present invention may also be incorporated in vehicles, such as tanks, aircraft, watercraft, and armored personnel carriers. The computing system may determine the legitimacy of such targets in its image recognition program. In addition, the present invention may be used for various scenarios such as within law enforcement field or recreational field.
(27) The present invention provides many advantages over existing shooting simulation systems. The present invention does not require the wearing of sensors by soldiers to detect a hit by a laser or other device. Furthermore, the targeted soldier does not need to emit an active electronic emission and may be a passive target. Additionally, in one embodiment, the shooting firearm does not need to emit any spectral emissions to determine if the image is a legitimate target. Thus, the cost of equipment is drastically reduced. Furthermore, the present invention enables the accurate calculation of a bullet's trajectory rather than the straight line-of-sight calculation used in laser simulation systems. In addition, the present invention provides for the carriage of lightweight and cost-effective equipment (i.e., an optical system) for use on the firearm. The present invention may be incorporated in existing operational firearms or built into realistic replicas. Additionally, the present invention may be utilized for bore sighting or zeroing a weapon.
(28) The present invention may be utilized between two soldiers, a single person against another target, a vehicle (including a tank, watercraft, aircraft, or surface vehicle) and another target, or in force on force exercises. Unlike other simulated shooting systems, the present invention goes beyond the mere scoring of a hit or miss. The present invention may be incorporated in real weapons and used for marksmanship training. Thus, the present invention may be used for training with real world firearms.
(29) While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those having ordinary skill in the art and access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.
(30) It is therefore intended by the appended claims to cover any and all such applications, modifications, and embodiments within the scope of the present invention.