"System and Method for Shooting Simulation"
20190249955 · 2019-08-15
Inventors
- John Surdu (Severn, MD, US)
- Josh Crow (Orlando, FL, US)
- Chris Ferrer (Orlando, FL, US)
- Rick Noriega (Longwood, FL, US)
- Peggy Hughley (Winter Springs, FL, US)
- Padraic Baker (Orlando, FL, US)
CPC classification
F41J5/10
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/2605
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
International classification
Abstract
A shooting simulation system and method for training personnel to engage both visible and non-line-of-sight targets. The firearm simulation system has a plurality of participants, each having a firearm and each being equipped to transmit their location to a remote computer server for storage and use with other transmitted data to determine which participant was a Shooter and which participant was the Shooter's target, for determining a simulated hit or miss of the target, and for assessing the simulated damage to the target.
Claims
1-17. (canceled)
18. A shooting simulation method comprising a shooter having a weapon and a position location sensor, a target having a position location sensor, the position location sensors reporting the shooter's location and target's location to a remote computer system, the remote computer system: receiving (a) a captured image from an optical system on the weapon once the shooter activates a trigger of the weapon, the captured image indicating where the weapon is being aimed when fired, and (b) an orientation of the shooter's weapon when fired; identifying the target by determining whether the target's location sensor reports that the target is in a line of fire of the weapon and determining whether the target is in the captured image; calculating a trajectory of a simulated round fired from the weapon based on (a) the captured image and the orientation of the shooter's weapon when the weapon is fired and (b) characteristics of the simulated round; and determining whether the simulated round hit the target.
19. The method of claim 18, wherein the computer system employs a computer vision algorithm to identify a particular target in the captured image when multiple targets are in the line of fire of the weapon.
20. The method of claim 18, wherein determining whether the simulated round hit the target includes calculating whether the simulated round hit the target based on the target's speed and direction of movement.
21. The method of claim 18, wherein determining whether the simulated round hit the target includes determining whether the trajectory of the simulated round is blocked by an obstacle.
22. The method of claim 21, wherein, if the trajectory of the simulated round is blocked by an obstacle, determining whether the simulated round can pass through the obstacle based on characteristics of the obstacle.
23. The method of claim 18, wherein when the target is occluded by an obstacle in the captured image, the remote computer system fills in the occluded portion of the target.
24. The method of claim 18, wherein determining whether the simulated round hit the target includes determining a minimum arming distance of the simulated round.
25. The method of claim 18, wherein the simulated round is at least one round selected from the group consisting of a bullet, a grenade, a mortar, and a rocket.
26. A shooting simulation system comprising a shooter having a weapon and a position location sensor, a target having a position location sensor, the position location sensors reporting the shooter's location and target's location to a remote computer system, the remote computer system being configured to: receive (a) a captured image from an optical system on the weapon once the shooter activates a trigger of the weapon, the captured image indicating where the weapon is being aimed when fired, and (b) an orientation of the shooter's weapon when fired; identify the target by determining whether the target's location sensor reports that the target is in a line of fire of the weapon and determine whether the target is in the captured image; calculate a trajectory of a simulated round fired from the weapon based on (a) the captured image and the orientation of the shooter's weapon when the weapon is fired and (b) characteristics of the simulated round; and determine whether the simulated round hit the target.
27. The system of claim 26, wherein the computer system employs a computer vision algorithm to identify a particular target in the captured image when multiple targets are in the line of fire of the weapon.
28. The system of claim 26, wherein the computer system determines whether the simulated round hit the target by calculating whether the simulated round hit the target based on the target's speed and direction of movement.
29. The system of claim 26, wherein the computer system determines whether the simulated round hit the target by determining whether the trajectory of the simulated round is blocked by an obstacle.
30. The system of claim 29, wherein, if the trajectory of the simulated round is blocked by an obstacle, the computer system is further configured to determine whether the simulated round can pass through the obstacle based on characteristics of the obstacle.
31. The system of claim 26, wherein, when the target is occluded by an obstacle in the captured image, the computer system can fill in the occluded portion of the target.
32. The system of claim 26, wherein the computer system determines whether the simulated round hit the target by determining a minimum arming distance of the simulated round.
33. The system of claim 26, wherein the simulated round is at least one round selected from the group consisting of a bullet, a grenade, a mortar, and a rocket.
34. A shooting simulation system comprising: a plurality of participants respectively carrying a weapon with a trigger sensor, the participants having a computer and a position location sensor that reports the participant's location, orientation and movement information wirelessly to a remote computer system; an orientation sensor on the participants' firearms that reports an orientation of the respective firearm to the remote computer system; an optical system aligned with an aim point of the participants' firearms that captures an image of the aim point of a shooter participant's firearm at the time the trigger sensor is activated and provides image information to the remote computer system; the remote computer system storing each participant's location; the remote computer system being configured to receive the image and the orientation of the shooter participant's firearm at the time the trigger sensor is activated; and the remote computer system being operable to identify a target participant in the image and to determine the relationship between the point of aim and the target participant's location within the image based on the target participant's location; wherein when the target participant is occluded by an obstacle in the image, the remote computer system fills in the occluded portion of the target participant.
35. The system of claim 34, wherein the remote computer system determines whether a simulated round hit the target participant by determining a minimum arming distance of the simulated round.
36. The system of claim 34, wherein the remote computer system determines whether a simulated round hit the target participant by calculating whether the simulated round hit the target participant based on the target participant's speed and direction of movement and a trajectory of the simulated round.
37. The system of claim 34, wherein the remote computer system determines whether the simulated round hit the target participant by determining whether a trajectory of the simulated round is blocked by an obstacle, the computer system being configured to determine whether the simulated round can pass through the obstacle based on characteristics of the obstacle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The accompanying drawings, which are included to provide further understanding of the invention, are incorporated in and constitute a part of the specification, illustrate an embodiment of the invention, and together with the description serve to explain the principles of the invention.
[0020]
[0021]
[0022]
[0023]
DESCRIPTION OF THE INVENTION
[0024] The present invention is a system for simulating live, force-on-force firearms engagements at realistic ranges. The Shooter can be a person with a direct-fire small arm, such as a rifle or submachine gun, with an indirect-fire or high-trajectory firearm, such as a grenade launcher, or an unmanned ground vehicle or unmanned aerial vehicle. The invention simulates a plurality of firearms. The system is symmetrical and homogeneous in that a Shooter can also be a target, and vice versa.
[0025] In
[0026] The Shooter 10 aims his firearm at his Target 11 and pulls the trigger, which activates a trigger sensor. The Shooter's location, firearm orientation, and sight image are transmitted to the wireless relay. The sight image is a digital representation of the Shooter's view through his firearm's sight when he pulls the trigger. The location and orientation of the Shooter 10 and his sight image are transmitted to the Remote Server 14 and to the Interaction Manager 16. The Interaction Manager queries the Target Resolution Module 17, which produces a list of possible targets from the Entity State Database based on the firearm location, orientation, known position sensor error, and known orientation sensor error. This list of possible targets is provided to the Hit Resolution Module 18.
[0027] The Hit Resolution Module 18 runs the multiple, multi-spectral algorithms to find targets in the sight image. Multiple algorithms may be used based on environmental conditions and other factors that influence which algorithms will be the most successful. This step includes processing the sight image to locate targets and determining the relationship between the aim point and the target based on the sight image. For instance, did the Shooter aim high, low, left, or right of center of mass of the target.
[0028] The Hit Resolution Module 18 calls the Target Reconciliation Module 20, which reconciles results from the computer vision computation with information from the Entity State Database. This step identifies which targets from the Target Resolution Module 17 correspond to targets identified by the computer vision algorithms. This step is based purely on the results of employing a plurality of computer vision (CV) algorithms and does not rely on any artificial indicia in the scene. The CV algorithms construct a silhouette around the target; if they cannot construct a full silhouette, they construct a bounding box around the targets in the scene.
[0029] The Hit Resolution Module 18 queries the Munitions Fly-out Module 21 for the flight time of the projectile and adjustments to the trajectory of the round. These adjustments can be based on range (e.g., drop of the round over distance), atmospheric effects, weather, wind, interactions with the terrain, and other factors as required to accurately predict the trajectory of the round. The system uses a representation of the terrain in the area of interest to compute whether the simulated projectile struck the target.
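The range-based drop and wind adjustments described in paragraph [0029] can be sketched as follows. This is an illustrative simplification only, assuming a flat-fire vacuum trajectory with a constant crosswind and ignoring drag and atmospheric effects; the function and parameter names are invented here and do not describe the Munitions Fly-out Module's actual implementation:

```python
import math

GRAVITY = 9.81  # m/s^2

def predict_impact(muzzle_velocity_mps, elevation_rad, target_range_m, wind_mps=0.0):
    """Estimate flight time, vertical drop, and lateral wind drift for a
    simulated round (drag and air density ignored for clarity)."""
    horizontal_v = muzzle_velocity_mps * math.cos(elevation_rad)
    flight_time = target_range_m / horizontal_v
    # Drop of the round relative to the line of departure over the flight time.
    drop_m = 0.5 * GRAVITY * flight_time ** 2
    # A constant crosswind displaces the round laterally for the whole flight.
    drift_m = wind_mps * flight_time
    return flight_time, drop_m, drift_m

t, drop, drift = predict_impact(900.0, 0.0, 300.0, wind_mps=3.0)
```

A production fly-out model would add drag, air density, and terrain interaction terms, but the structure of returning a flight time plus trajectory corrections is the same.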
[0030] The Hit Resolution Module 18 computes whether the trajectory of the round intersects the target determined by the Target Reconciliation Module 20 based on the adjusted trajectory, time of flight, and relative velocity of the target. Relative velocity accounts for movement of the target, the Shooter, and the Shooter's firearm. If the round strikes the projected target location at the time of impact, the Hit Resolution Module 18 calls the Damage Effects Module 22. This module computes the damage to the target based on the firearm's characteristics, the munition's characteristics, and the location of the calculated impact point in the target's calculated silhouette. Damage effects indicate the extent of damage to the target, such as whether the target was killed, sustained a minor wound or major wound, the location of the wound, and the like.
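The intersection test above (project the moving target forward by the round's flight time, then compare the impact point against the target's silhouette) can be illustrated with a minimal two-dimensional sketch. The circular-silhouette approximation and all names here are assumptions for illustration, not the module's actual hit geometry:

```python
def is_hit(impact_point, target_pos, target_vel, flight_time, silhouette_radius_m=0.3):
    """Project the target's position forward by the round's flight time and
    test whether the computed impact point falls inside a circular silhouette."""
    projected = (target_pos[0] + target_vel[0] * flight_time,
                 target_pos[1] + target_vel[1] * flight_time)
    dx = impact_point[0] - projected[0]
    dy = impact_point[1] - projected[1]
    return (dx * dx + dy * dy) ** 0.5 <= silhouette_radius_m
```

For example, a target at (10.0, 0.0) moving at 0.5 m/s along x is projected to (10.2, 0.0) after a 0.4 s flight, so a round arriving at (10.2, 0.0) registers as a hit, while one arriving at (12.0, 0.0) does not.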
[0031] A near miss is reported through the wireless relay 12 and retransmitted to the Target 11 and the Shooter 10, respectively, who are informed of the near-miss results via audio and visual effects similar to the existing MILES system. A hit result is reported through the wireless relay 12 and re-transmitted to the Target 11 and the Shooter 10, respectively. The Shooter is notified of a hit, and the Target is notified that he was hit, with what firearm or round he was hit, and the severity of the damage.
[0032]
[0033] When the participant pulls the trigger on his training rifle, the Trigger Pull Sensor 25 sends a message to the Participant-Worn Computing Device 24. The Participant-Worn Computing Device 24 captures the trigger-pull events. The Firearm Orientation Sensor 26 returns the firearm orientation to the Participant-Worn Computing Device 24. Similarly, the Image Capture Device 27 provides the sight image as seen by the Shooter 10 to the Participant-Worn Computing Device. The Image Capture Device 27 may provide:
[0034] 1. A mix of visible spectrum, non-visible spectrum, and multi-spectral images.
[0035] 2. A video image or a series of still images.
[0036] 3. Images from a single viewpoint or multiple viewpoints.
[0037] 4. Images from narrow and wide-angle viewpoints.
[0038] The Participant-Worn Computing Device 24 sends the location and orientation of the firearm as well as the sight images via the Wireless Relay 12 to the Remote Server 14.
[0039] The target is not augmented with indicia or beacons. Other than the participant-worn subsystem, the target includes only his operational equipment.
[0040] In
[0041] The Orientation Sensor 26 provides three-dimensional orientation with respect to the geomagnetic frame of reference. This three-dimensional representation can be in the form of a quaternion; yaw, pitch, and roll; or another frame of reference, as appropriate. The Orientation Sensor 26 is calibrated to the fixed coordinate system when the system is turned on, and it can be periodically recalibrated during a simulation event as necessary. The orientation sensor may employ a plurality of methods to determine three-dimensional orientation. There is no minimum accuracy requirement for the Orientation Sensor 26, although a more accurate orientation sensor reduces the burden on the Target Reconciliation Module 20.
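The quaternion and yaw/pitch/roll representations mentioned above are interchangeable; the standard conversion for the aerospace Z-Y-X (Tait-Bryan) convention can be sketched as follows. The function name is invented here for illustration:

```python
import math

def quaternion_to_ypr(w, x, y, z):
    """Convert a unit orientation quaternion to (yaw, pitch, roll) in radians,
    using the aerospace Z-Y-X (Tait-Bryan) rotation convention."""
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    # Clamp guards against floating-point drift pushing asin out of [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    return yaw, pitch, roll
```

The identity quaternion (1, 0, 0, 0) maps to (0, 0, 0), and a rotation of 90° about the vertical axis maps to a yaw of π/2.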
[0042] The Location Sensor 23 provides the Shooter's location with respect to a fixed reference frame. In the current embodiment, this is provided as latitude and longitude, but other coordinate representation methods may be employed. The participant's speed may be measured directly by the position sensor or may be inferred through the collection of several position reports over time.
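Inferring speed from successive latitude/longitude reports, as described above, amounts to dividing accumulated great-circle distance by elapsed time. A minimal sketch, assuming timestamped reports and a spherical-earth haversine distance (names invented for illustration):

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inferred_speed(reports):
    """Average speed (m/s) over consecutive (timestamp_s, lat, lon) reports."""
    if len(reports) < 2:
        return 0.0
    dist = sum(haversine_m(a[1], a[2], b[1], b[2])
               for a, b in zip(reports, reports[1:]))
    elapsed = reports[-1][0] - reports[0][0]
    return dist / elapsed if elapsed > 0 else 0.0
```

Two reports 0.0001° of longitude apart at the equator, ten seconds apart, yield roughly 11 metres travelled, i.e. a little over 1 m/s.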
[0043] The location, orientation, and velocity updates are transmitted 13 to a Remote Server 14, where they are stored in the Entity State Database 15 for later use, as shown in
[0044] As depicted in
[0045] As shown in
[0046] The sight image is a digital representation of the Shooter's view through his firearm's sight when he pulls the trigger. The image capture device 27 is aligned with the barrel and sights of the simulated firearm so that the image captured from the device is an accurate representation of the Shooter's sight picture when the trigger was pulled. In the first embodiment of the invention, the image capture device 27 is the same scope through which the Shooter is aiming the firearm, but the image capture device may be separate from the weapon sights. The image capture device 27 may provide:
[0047] A mix of visible spectrum, non-visible spectrum, and multi-spectral images;
[0048] A video image or a series of still images;
Images from a single viewpoint or multiple viewpoints; and
[0049] Images from narrow and wide-angle viewpoints.
[0050] The Position Location Sensor 23 provides periodic updates of the participant's location, orientation, and speed to the Participant-Worn Computing Device 24. The Participant-Worn Computing Device transmits these updates to the wireless relay 12.
[0051]
[0052] The location and orientation of the Shooter 10 and his sight image are transmitted from the Wireless relay 12 to the Remote Server 14 and the Interaction Manager 16. Any communication means with sufficient bandwidth may be used in this step of the process. The Participant-Worn Computing Device 24 may perform preprocessing of the captured sight picture to reduce bandwidth requirements. Pre-processing includes, but is not limited to, cropping the image, reducing the resolution of the image, compressing the image, and/or adjusting the tint, hue, saturation, or other attributes of the image.
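The cropping and resolution-reduction pre-processing steps listed above can be illustrated on a row-major pixel grid. This sketch uses plain nested lists and nearest-neighbour decimation for clarity; the names are invented here and a real device would operate on an encoded image buffer:

```python
def crop(image, top, left, height, width):
    """Crop a row-major image (list of pixel rows) to a region of interest,
    discarding pixels outside the area around the aim point."""
    return [row[left:left + width] for row in image[top:top + height]]

def downsample(image, factor):
    """Reduce resolution by keeping every `factor`-th pixel in each
    dimension (nearest-neighbour decimation)."""
    return [row[::factor] for row in image[::factor]]
```

Cropping a 4×4 frame to its central 2×2 region, or downsampling it by a factor of two, quarters the pixel count and hence the bandwidth needed to transmit the sight picture to the Remote Server 14.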
[0053] In
[0054] The Target Resolution Module 17 provides this list of possible targets to the Hit Resolution Module 18. In
[0055] In
[0056] In
[0057] Having determined the intended target, in
[0058] In
[0059] In
[0060] The Munitions Fly-Out Module 21 accounts for weapon systems that detonate based on range to the target, distance from the firearm, or other factors, by determining when the detonation occurs. As an example, but not a limitation of the invention, if a Shooter fires simulated munitions from his firearm that explode at a pre-set distance, the Munitions Fly-Out Module 21 computes the trajectory of the munitions to their points of detonation. The locations where the munitions detonated are then passed to the Damage Effects Module 22 to compute damage to any nearby participants.
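Locating a pre-set detonation point along the line of fire, combined with the minimum arming distance check from the claims, can be sketched as follows. The straight-line trajectory and all names are simplifying assumptions for illustration:

```python
def detonation_point(origin, unit_direction, arming_distance_m, detonation_distance_m):
    """Return the point where a range-fuzed simulated round detonates along a
    straight-line trajectory, or None if the fuze distance is inside the
    round's minimum arming distance (the round never arms)."""
    if detonation_distance_m < arming_distance_m:
        return None  # treat as a dud; no damage assessment is triggered
    return tuple(o + d * detonation_distance_m
                 for o, d in zip(origin, unit_direction))
```

A round fuzed for 100 m with a 30 m arming distance detonates 100 m down-range; the same round fuzed for 20 m never arms.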
[0061] In
[0062] In
[0063] In
[0064] The system records information from the Remote Server 14 to assist in reviewing the training event. Information such as, but not limited to, participants' locations over time, sight pictures when triggers were pulled, sight pictures after the CV algorithms have processed them, results from the Target Reconciliation Module 20, and status of participant-worn devices may be displayed to an event controller during and after the training event.
[0065] This invention is equally applicable to high-trajectory or non-line of sight shooting. In the case of high-trajectory fire, the image from the Image Capture Device 27 is not necessary. The modified process for non-line of sight and high-trajectory shooting is depicted in
[0066] In Step 206, the Target Resolution Module 17 queries the Entity State Database 15 to determine whether any participants, friendly or enemy, are within the burst radius of the simulated munitions. In Step 207, the Munitions Fly-Out Module 21 predicts the locations of those participants at the time of impact or detonation of the simulated munitions. In Step 208, for each participant within the burst radius of the munitions, the Damage Effects Module 22 determines if the participant is hit, where the target was hit, and the severity of the damage, just as described in Step 115,
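The burst-radius query of Steps 206-208 reduces to filtering the participants' predicted positions by distance from the detonation point. A minimal two-dimensional sketch, with names invented here for illustration:

```python
def within_burst_radius(detonation_xy, participants, burst_radius_m):
    """Filter participant records {participant_id: (x, y)} down to those whose
    predicted position lies inside the munition's burst radius."""
    hits = []
    for pid, (x, y) in participants.items():
        dist = ((x - detonation_xy[0]) ** 2 + (y - detonation_xy[1]) ** 2) ** 0.5
        if dist <= burst_radius_m:
            hits.append(pid)
    return hits
```

Each participant identified this way would then be handed to the damage-assessment step, which grades severity, for example, by distance from the detonation point.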
[0067] In Step 209, if a participant received a hit from a high-trajectory shot, in Step 212, the target is notified of the results, including location(s) and severity of wounds. The Shooter 10 may be notified that he has hit his target as well. In an augmented reality situation, this notification might come in the form of a depiction of an explosion near the target(s). If the high-trajectory shot is a miss or near miss, in Step 210, this is reported to the target. The Shooter 10 may also be notified in Step 211. The reporting of hits and misses can be configured based on different training situations. For instance, in one training mode, the system sends feedback to the Shooter 10 after each shot so that the Shooter may learn from each shot and improve his marksmanship. In another training mode, such as simulating a firefight, this constant feedback from the system to the Shooter 10 may be both distracting and inappropriate. In such a situation, the messages to the Shooter 10 may be suppressed during the event and reported afterward.
[0068] It should be clear at this time that a shooting simulation system for personnel, unmanned systems, and vehicles has been provided that enables non-line-of-sight engagements and permits firing through obscurants and terrain features such as bushes and tall grass. However, the present invention is not to be considered limited to the forms shown, which are to be considered illustrative rather than restrictive.