SMALL ARMS SHOOTING SIMULATION SYSTEM
20170316711 · 2017-11-02
Inventors
- John Surdu (Severn, MD, US)
- Josh Crow (Orlando, FL, US)
- Chris Ferrer (Orlando, FL, US)
- Rick Noriega (Longwood, FL, US)
- Peggy Hughley (Winter Springs, FL, US)
- Padraic Baker (Orlando, FL, US)
CPC Classification
F41J5/10
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/2605
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A shooting simulation system and method for training personnel in engaging visible and non-line-of-sight targets. The firearm simulation system has a plurality of participants, each having a firearm and each being equipped to transmit location data to a remote computer server, where it is stored and used with other transmitted data to determine which participant was a Shooter and which participant was the Shooter's target, to determine a simulated hit or miss of the target, and to assess the simulated damage to the target.
Claims
1. A simulation system of direct and non-line of sight shooting comprising: a plurality of firearms, each said firearm having a trigger sensor and one said firearm being held by each of a plurality of participants in the simulation, each participant having a computer and a position location sensor for determining the participant's location, orientation and movement information, and each firearm having an orientation sensor for recording the orientation of the firearm with respect to a known three-dimensional coordinate system, and an optical system aligned to the sights of the firearm for capturing the sight picture at the time the trigger sensor is activated to provide image information about the aim point of the Shooter participant's firearm with respect to an intended target participant; and a remote computer server having an entity state database and a target resolution module, said remote computer server being wirelessly coupled to each said participant to periodically receive and store each participant's position location sensor's location, orientation and speed information in said entity state database, and said remote computer server receiving the captured image and the orientation of the Shooter participant's firearm at the time the trigger sensor is activated, for use by the target resolution module in identifying the target participant when the Shooter participant's firearm trigger sensor is activated by a Shooter participant; wherein the computer server stores the reported information on each of a plurality of participants' location, orientation and speed and remotely determines the identity of the target participant of the Shooter participant who activates his trigger sensor.
2. The simulation system in accordance with claim 1 in which said remote server has a hit resolution module receiving the target resolution information and identifying a simulated hit or miss of the target participant.
3. The simulation system in accordance with claim 2 in which said remote server has a damage effects module receiving information from said hit resolution module to determine the simulated damage to the target participant from a simulated hit.
4. The simulation system in accordance with claim 1 in which said remote server has a hit resolution module receiving the target resolution information, remotely identifying a simulated hit or miss of the target participant, and computing the trajectory of a simulated round fired from the Shooter participant's firearm.
5. The simulation system in accordance with claim 1 wherein a participant's computer may be a participant-worn computer worn separate from the shooting firearm, which gathers and transmits to the remote computer server the data from the orientation sensor, the sight picture, and the Shooter's location for determination of a hit or miss of the target participant.
6. The simulation system in accordance with claim 1 in which said optical system operates with visual and non-visual light spectra.
7. The simulation system in accordance with claim 6 in which said optical system operates with visual and infra-red light spectra.
8. A method of simulating firearm use between a plurality of participants comprising the steps of: equipping each of a plurality of participants with a firearm having a trigger sensor and an orientation sensor for recording the orientation of the firearm with respect to a known three-dimensional coordinate system, and an optical system aligned to the sights of the firearm for capturing the sight picture at the time the trigger sensor is activated to provide image information about the aim point of the Shooter participant's firearm with respect to an intended target participant; equipping each of said plurality of participants with a computer and a position location sensor for determining the location, orientation and movement information of the participant; selecting a remote server having an entity state database and a target resolution module; periodically communicating and storing each of said participants' position location sensor's location, orientation and movement information to said remote server's entity state database; receiving the captured image and the orientation of the Shooter participant's firearm at the remote computer server when the trigger sensor is activated; and determining which participant is a Shooter participant activating a firearm's trigger sensor and which participant is the target participant of said Shooter participant with said target resolution module using information stored in said entity state database and said received captured image and the orientation of the Shooter participant's firearm; wherein the remote computer server stores reported periodic information on each of a plurality of participants' location, orientation and movement for computing the remote identification of a target participant of a Shooter participant.
9. The method of simulating firearm use in accordance with claim 8 including the step of identifying a target hit or miss when the Shooter participant's firearm trigger sensor is activated by a Shooter participant with the remote computer server hit resolution module.
10. The method of simulating firearm use in accordance with claim 8 including the step of remotely identifying the target participant of a Shooter participant and computing the trajectory of a simulated round fired from the Shooter participant's firearm.
11. The method of simulating firearm use in accordance with claim 8 including the step of determining in the remote computer server, from the reported locations of all participants and the reported orientation of the shooting firearm, a list of identities of participants who are possible target participants for the shooting participant.
12. The method of simulating firearm use in accordance with claim 8 in which, when the list of possible targets includes more than one participant, the remote server disambiguates which participant is the intended target using the captured image or images from the optical system.
13. The method according to claim 8 including the step of computing in the remote computer server the range between the Shooter participant and the target participant and the time of flight of a simulated projectile to the target participant, to determine a hit or miss of the target participant by the simulated projectile.
14. The method according to claim 13 including the step of computing in the remote computer server the velocity of a moving target participant to determine the location of the target participant at the expiration of the time of flight of the simulated projectile.
15. The method according to claim 13 including the step of computing in the remote computer server the effect of weather, atmospheric conditions, and terrain on the simulated projectile.
16. The method according to claim 8 including the step of informing the Shooter and target participants of the simulated hit or miss of the target participant.
17. The method according to claim 8 including the step of determining in the remote computer server the location, type, and severity of simulated wounds inflicted by a hit on a target participant.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of the specification, illustrate an embodiment of the invention, and together with the description serve to explain the principles of the invention.
DESCRIPTION OF THE INVENTION
[0024] The present invention is a system for live, force-on-force simulated firearms engagements at realistic ranges. The Shooter can be a person with a direct-fire small arm, such as a rifle or submachine gun, or with an indirect-fire or high-trajectory firearm, such as a grenade launcher, or an unmanned ground vehicle or unmanned aerial vehicle. The invention simulates a plurality of firearms. The system is symmetrical and homogeneous in that a Shooter can also be a target, and vice versa.
[0026] The Shooter 10 aims his firearm at his Target 11 and pulls the trigger, which activates a trigger sensor. The Shooter's location, firearm orientation, and sight image are transmitted to the wireless relay. The sight image is a digital representation of the Shooter's view through his firearm's sight when he pulls the trigger. The location and orientation of the Shooter 10 and his sight image are transmitted to the Remote Server 14 and to the Interaction Manager 16. The Interaction Manager queries the Target Resolution Module 17, which produces a list of possible targets from the Entity State Database based on the firearm location, orientation, known position sensor error, and known orientation sensor error. This list of possible targets is provided to the Hit Resolution Module 18.
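By way of illustration only, the candidate-filtering step performed by the Target Resolution Module might be sketched as follows (Python is used for all sketches herein; the entity fields, the local east/north coordinates, and the error-widened bearing cone are illustrative assumptions, not the claimed implementation):

    import math

    def possible_targets(shooter, entities, firearm_yaw_deg,
                         pos_err_m, yaw_err_deg, max_range_m=800.0):
        """Return entities that could plausibly lie along the firearm's bearing,
        with the acceptance cone widened by the known sensor errors."""
        candidates = []
        for e in entities:
            if e.id == shooter.id:
                continue
            dx, dy = e.x - shooter.x, e.y - shooter.y   # local east/north metres
            rng = math.hypot(dx, dy)
            if rng > max_range_m:
                continue
            bearing = math.degrees(math.atan2(dx, dy)) % 360.0
            # Widen the cone by the orientation error plus the angle subtended
            # by the position error at this range.
            cone = yaw_err_deg + math.degrees(math.atan2(pos_err_m, max(rng, 1.0)))
            diff = abs((bearing - firearm_yaw_deg + 180.0) % 360.0 - 180.0)
            if diff <= cone:
                candidates.append((e, rng))
        return [e for e, _ in sorted(candidates, key=lambda c: c[1])]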
[0027] The Hit Resolution Module 18 runs multiple, multi-spectral algorithms to find targets in the sight image. Multiple algorithms may be used based on environmental conditions and other factors that influence which algorithms will be the most successful. This step includes processing the sight image to locate targets and determining the relationship between the aim point and the target based on the sight image: for instance, whether the Shooter aimed high, low, left, or right of the target's center of mass.
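For example, once a detector has placed a bounding box around the target in the sight image, the aim-point relationship can be expressed as an angular offset from the reticle. A minimal sketch, assuming the reticle sits at the image center and a known horizontal field of view (both assumptions, not claimed details):

    def aim_offset_mils(image_w, image_h, bbox, hfov_mils):
        """bbox = (x0, y0, x1, y1) of the detected target in pixels.
        Returns (right, up) offsets of the target from the reticle, in mils."""
        reticle_x, reticle_y = image_w / 2.0, image_h / 2.0
        target_x = (bbox[0] + bbox[2]) / 2.0
        target_y = (bbox[1] + bbox[3]) / 2.0
        mils_per_px = hfov_mils / float(image_w)
        # Positive `right` means the target sits right of the aim point,
        # i.e., the Shooter aimed left of the target's center of mass.
        right = (target_x - reticle_x) * mils_per_px
        up = (reticle_y - target_y) * mils_per_px
        return right, up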
[0028] The Hit Resolution Module 18 calls the Target Reconciliation Module 20, which reconciles results from the computer vision computation with information from the Entity State Database. This step identifies which targets from the Target Resolution Module 17 correspond to targets identified by the computer vision algorithms. This step is based purely on the results of employing a plurality of computer vision (CV) algorithms and does not rely on any artificial indicia in the scene. The CV algorithms attempt to construct a silhouette around the target; if they cannot construct a full silhouette, they instead construct a bounding box around the targets in the scene.
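A simplified sketch of the reconciliation step, assuming a hypothetical `project` helper that maps an entity's database position into expected pixel coordinates in the sight image, and greedy nearest-neighbor matching:

    def reconcile(detections, candidates, project):
        """Match CV detections (objects with .cx/.cy pixel centers) to database
        candidates by proximity to each candidate's expected image position."""
        matches, unused = [], list(detections)
        for entity in candidates:
            expected = project(entity)      # (x, y) pixels, or None if off-image
            if expected is None or not unused:
                continue
            best = min(unused,
                       key=lambda d: (d.cx - expected[0])**2 + (d.cy - expected[1])**2)
            matches.append((entity, best))
            unused.remove(best)             # one detection per entity
        return matches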
[0029] The Hit Resolution Module 18 queries the Munitions Fly-out Module 21 for the flight time of the projectile and adjustments to the trajectory of the round. These adjustments can be based on range (e.g., drop of the round over distance), atmospheric effects, weather, wind, interactions with the terrain, and other factors as required to accurately predict the trajectory of the round. The system uses a representation of the terrain in the area of interest to compute whether the simulated projectile struck the target.
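By way of example and not limitation, a toy point-mass fly-out such as the following captures the kinds of adjustments described; the drag constant, crosswind coupling, and step size are illustrative assumptions, and a production module would use proper ballistic tables or a fuller model:

    def fly_out(muzzle_mps, range_m, crosswind_mps=0.0, k_drag=4e-5, dt=0.001):
        """Integrate a simulated round out to `range_m`.
        Returns (time_of_flight_s, drop_m, wind_drift_m)."""
        g = 9.81
        x = t = drop = drop_rate = drift = 0.0
        v = muzzle_mps
        while x < range_m:
            v = max(v - k_drag * v * v * dt, 1.0)   # crude quadratic drag
            x += v * dt
            drop_rate += g * dt
            drop += drop_rate * dt                  # gravity drop over distance
            drift += crosswind_mps * 0.1 * dt       # toy crosswind coupling
            t += dt
        return t, drop, drift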
[0030] The Hit Resolution Module 18 computes whether the trajectory of the round intersects the target determined by the Target Reconciliation Module 20, based on the adjusted trajectory, time of flight, and relative velocity of the target. Relative velocity accounts for movement of the target, the Shooter, and the Shooter's firearm. If the round strikes the projected target location at the time of impact, the Hit Resolution Module 18 calls the Damage Effects Module 22. This module computes the damage to the target based on the firearm's characteristics, the munition's characteristics, and the location of the calculated impact point in the target's calculated silhouette. Damage effects indicate the extent of damage to the target, such as whether the target was killed, sustained a minor wound or major wound, the location of the wound, and the like.
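A minimal sketch of the intersection test, assuming flat-earth coordinates and a single hit radius standing in for the computed silhouette (both simplifying assumptions):

    import math

    def resolve_hit(target_xy, target_vel_xy, tof_s, round_arrival_xy,
                    hit_radius_m=0.3):
        """Advance the target by its velocity over the round's time of flight
        and test whether the round arrives within the simplified silhouette."""
        predicted = (target_xy[0] + target_vel_xy[0] * tof_s,
                     target_xy[1] + target_vel_xy[1] * tof_s)
        miss_m = math.hypot(round_arrival_xy[0] - predicted[0],
                            round_arrival_xy[1] - predicted[1])
        return miss_m <= hit_radius_m, miss_m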
[0031] A near miss is reported through the wireless relay 12 and retransmitted to the Target 11 and the Shooter 10, who are informed of the near-miss results via audio and visual effects similar to those of the existing MILES system. A hit result is reported through the wireless relay 12 and retransmitted to the Target 11 and the Shooter 10. The Shooter is notified of a hit, and the Target is notified that he was hit, with what firearm or round he was hit, and the severity of the damage.
[0033] When the participant pulls the trigger on his training rifle, the Trigger Pull Sensor 25 sends a message to the Participant-Worn Computing Device 24. The Participant-Worn Computing Device 24 captures the trigger-pull events. The Firearm Orientation Sensor 26 returns the firearm orientation to the Participant-Worn Computing Device 24. Similarly, the Image Capture Device 27 provides the sight image as seen by the Shooter 10 to the Participant-Worn Computing Device. The Image Capture Device 27 may provide:
[0034] 1. A mix of visible spectrum, non-visible spectrum, and multi-spectral images.
[0035] 2. A video image or a series of still images.
[0036] 3. Images from a single viewpoint or multiple viewpoints.
[0037] 4. Images from narrow and wide-angle viewpoints.
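By way of illustration, the trigger-pull event assembled by the Participant-Worn Computing Device 24 might resemble the following sketch; the field names and device interfaces are hypothetical:

    import time
    from dataclasses import dataclass, field

    @dataclass
    class TriggerEvent:
        participant_id: str
        lat: float
        lon: float
        firearm_quat: tuple          # from the Firearm Orientation Sensor 26
        sight_image: bytes           # captured by the Image Capture Device 27
        timestamp: float = field(default_factory=time.time)

    def on_trigger_pull(device, sensors):
        # Bundle location, firearm orientation, and the sight picture, then
        # forward the event to the Remote Server via the Wireless Relay.
        lat, lon = sensors.location()
        event = TriggerEvent(device.participant_id, lat, lon,
                             sensors.firearm_orientation(),
                             sensors.capture_sight_image())
        device.send_to_relay(event)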
[0038] The Participant-Worn Computing Device 24 sends the location and orientation of the firearm as well as the sight images via the Wireless Relay 12 to the Remote Server 14.
[0039] The target is not augmented with indicia or beacons. Other than the participant-worn subsystem, the target carries only his operational equipment.
[0041] The Orientation Sensor 26 provides three-dimensional orientation with respect to the geomagnetic frame of reference. This three-dimensional representation can be in the form of a quaternion; yaw, pitch, and roll; or another frame of reference, as appropriate. The Orientation Sensor 26 is calibrated to the fixed coordinate system when the system is turned on, and it can be periodically recalibrated during a simulation event as necessary. The orientation sensor may employ a plurality of methods to determine three-dimensional orientation. There is no minimum accuracy requirement for the Orientation Sensor 26, although a more accurate orientation sensor reduces the burden on the Target Reconciliation Module 20.
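For instance, a quaternion report can be converted to yaw, pitch, and roll with the standard aerospace (Z-Y-X) relations; a minimal sketch:

    import math

    def quat_to_ypr(w, x, y, z):
        """Convert a unit quaternion to (yaw, pitch, roll) in degrees,
        using the common aerospace Z-Y-X convention."""
        yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
        pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
        roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
        return math.degrees(yaw), math.degrees(pitch), math.degrees(roll)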
[0042] The Location Sensor 23 provides the Shooter's location with respect to a fixed reference frame. In the current embodiment, this is provided as latitude and longitude, but other coordinate representations may be employed. The participant's speed may be measured directly by the position sensor or may be inferred from the collection of several position reports over time.
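As an example of inferring speed from successive position reports, assuming timestamped latitude/longitude fixes (a sketch, not the claimed method):

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon fixes."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2)**2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2)**2
        return 2.0 * R * math.asin(math.sqrt(a))

    def inferred_speed_mps(fix_a, fix_b):
        """Each fix is (lat, lon, unix_time_s); speed between two reports."""
        dist = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
        dt = fix_b[2] - fix_a[2]
        return dist / dt if dt > 0 else 0.0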
[0043] The location, orientation, and velocity updates are transmitted 13 to a Remote Server 14, where they are stored in the Entity State Database 15 for later use.
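A sketch of one plausible Entity State Database record; the field names are assumptions for illustration:

    import time
    from dataclasses import dataclass, field

    @dataclass
    class EntityState:
        participant_id: str
        lat: float
        lon: float
        orientation_quat: tuple      # participant orientation report
        speed_mps: float
        heading_deg: float
        reported_at: float = field(default_factory=time.time)

    entity_state_db = {}             # keyed by participant_id; latest report wins

    def store_update(state: EntityState):
        entity_state_db[state.participant_id] = state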
[0046] The sight image is a digital representation of the Shooter's view through his firearm's sight when he pulls the trigger. The image capture device 27 is aligned with the barrel and sights of the simulated firearm so that the image captured from the device is an accurate representation of the Shooter's sight picture when the trigger was pulled. In the first embodiment of the invention, the image capture device 27 is the same scope through which the Shooter is aiming the firearm, but the image capture device may be separate from the weapon sights. The image capture device 27 may provide:
[0047] A mix of visible spectrum, non-visible spectrum, and multi-spectral images;
[0048] A video image or a series of still images;
[0049] Images from a single viewpoint or multiple viewpoints; and
[0050] Images from narrow and wide-angle viewpoints.
[0051] The Position Location Sensor 23 provides periodic updates of the participant's location, orientation, and speed to the Participant-Worn Computing Device 24. The Participant-Worn Computing Device transmits these updates to the wireless relay 12.
[0053] The location and orientation of the Shooter 10 and his sight image are transmitted from the Wireless Relay 12 to the Remote Server 14 and the Interaction Manager 16. Any communication means with sufficient bandwidth may be used in this step of the process. The Participant-Worn Computing Device 24 may perform pre-processing of the captured sight picture to reduce bandwidth requirements. Pre-processing includes, but is not limited to, cropping the image, reducing the resolution of the image, compressing the image, and/or adjusting the tint, hue, saturation, or other attributes of the image.
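For instance, using the Pillow imaging library (an assumed choice for illustration), the sight picture could be downscaled and recompressed before transmission:

    from io import BytesIO
    from PIL import Image

    def preprocess_sight_image(raw_jpeg: bytes, max_dim=640, quality=70) -> bytes:
        """Reduce resolution and recompress a captured sight picture to cut
        the bandwidth needed on the wireless link."""
        img = Image.open(BytesIO(raw_jpeg))
        img.thumbnail((max_dim, max_dim))               # downscale in place
        out = BytesIO()
        img.save(out, format="JPEG", quality=quality)   # recompress
        return out.getvalue()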
[0055] The Target Resolution Module 17 queries the Entity State Database 15, based on the reported firearm location and orientation and the known position and orientation sensor errors, to produce a list of possible targets, and provides this list to the Hit Resolution Module 18.
[0058] Having determined the intended target, the Hit Resolution Module 18 queries the Munitions Fly-Out Module 21 for the flight time of the simulated projectile and for adjustments to its trajectory.
[0061] The Munitions Fly-Out Module 21 accounts for weapon systems that detonate based on range to the target, distance from the firearm, or other factors by determining when the detonation occurs. As an example, but not a limitation of the invention, if a Shooter fires simulated munitions from his firearm that explode at a pre-set distance, the Munitions Fly-Out Module 21 computes the trajectory of the munitions to their points of detonation. The locations where the munitions detonated are then passed to the Damage Effects Module 22 to compute damage to any nearby participants.
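As a simplified illustration of range-fuzed munitions, the detonation point can be computed along the trajectory at the pre-set distance; the straight-line approximation below is an assumption, and the real module would reuse the full fly-out computation:

    import math

    def detonation_point(origin_xy, bearing_deg, preset_distance_m):
        """Point of detonation for a round fuzed to burst at a pre-set distance
        (straight-line approximation in local east/north metres)."""
        b = math.radians(bearing_deg)
        return (origin_xy[0] + preset_distance_m * math.sin(b),
                origin_xy[1] + preset_distance_m * math.cos(b))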
[0065] The system records information from the Remote Server 14 to assist in reviewing the training event. Information such as, but not limited to, participants' locations over time, sight pictures when triggers were pulled, sight pictures after the CV algorithms have processed them, results from the Target Reconciliation Module 20, and the status of participant-worn devices may be displayed to an event controller during and after the training event.
[0066] This invention is equally applicable to high-trajectory or non-line-of-sight shooting. In the case of high-trajectory fire, the image from the Image Capture Device 27 is not necessary. The modified process for non-line-of-sight and high-trajectory shooting proceeds as follows.
[0067] In Step 206, the Target Resolution Module 17 queries the Entity State Database 15 to determine whether any participants, friendly or enemy, are within the burst radius of the simulated munitions. In Step 207, the Munitions Fly-Out Module 21 predicts the locations of those participants at the time of impact or detonation of the simulated munitions. In Step 208, for each participant within the burst radius of the munitions, the Damage Effects Module 22 determines whether the participant is hit, where the target was hit, and the severity of the damage, just as described in Step 115.
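A minimal sketch of the burst-radius query with a distance-banded severity assignment; the bands and entity fields are illustrative assumptions, not the claimed damage model:

    import math

    def participants_in_burst(burst_xy, entities, burst_radius_m):
        """Return (participant_id, distance_m, severity) for everyone inside
        the simulated burst radius, friendly or enemy."""
        results = []
        for e in entities:
            d = math.hypot(e.x - burst_xy[0], e.y - burst_xy[1])
            if d > burst_radius_m:
                continue
            if d < 0.25 * burst_radius_m:
                severity = "killed"
            elif d < 0.6 * burst_radius_m:
                severity = "major wound"
            else:
                severity = "minor wound"
            results.append((e.participant_id, d, severity))
        return results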
[0068] In Step 209, if a participant received a hit from a high-trajectory shot, then in Step 212 the target is notified of the results, including the location(s) and severity of wounds. The Shooter 10 may be notified that he has hit his target as well. In an augmented-reality situation, this notification might come in the form of a depiction of an explosion near the target(s). If the high-trajectory shot is a miss or near miss, this is reported to the target in Step 210. The Shooter 10 may also be notified in Step 211. The reporting of hits and misses can be configured for different training situations. For instance, in one training mode, the system sends feedback to the Shooter 10 after each shot so that the Shooter may learn from each shot and improve his marksmanship. In another training mode, such as simulating a firefight, this constant feedback from the system to the Shooter 10 may be both distracting and inappropriate. In such a situation, the messages to the Shooter 10 may be suppressed during the event and reported afterward.
[0069] It should be clear at this time that a shooting simulation system for personnel, unmanned systems, and vehicles has been provided that enables non-line-of-sight engagements and permits firing through obscurants and terrain features such as bushes and tall grass. However, the present invention is not to be considered limited to the forms shown, which are to be considered illustrative rather than restrictive.