COMBAT TRAINING SYSTEM

20230366649 · 2023-11-16

    Abstract

    A combat training system for carrying out and monitoring at least one shooting practice and combat scenarios, comprising: at least two components, with a first component being a firearm and a second component being a target object; a device for sensing position information of each of the at least two components; device for sensing spatial orientation information at least for the at least one firearm of the at least two components; and an information processor, in which position information of the at least two components and the orientation information of the at least one firearm is processed and transformed into at least result information. The at least two components each have a clock, which are time-synchronized with each other and each generate coded time signals with a component identifier which individualizes the respective component, and an event sensor is provided on each of the at least two components.

    Claims

    1-17. (canceled)

    18. A combat training system which monitors at least one of shooting practice and combat scenarios, comprising: at least two components, with the first component being a firearm and the second component being a target object; a sensor for sensing position information of each of the at least two components; a sensor for sensing spatial orientation information for the at least one firearm; and a processor in which the position information of the at least two components and the spatial orientation information of the at least one firearm are processed and transformed into at least result information, wherein the at least two components each have a clock, with the clocks being time-synchronized with each other and each generating coded time signals with an identifier identifying the component, and an event sensor is provided on each of the at least two components.

    19. A combat training system according to claim 18, wherein: the sensor for sensing the position information is part of each of at least one firearm of the at least two components.

    20. A combat training system according to claim 19, wherein: the sensor for sensing the position information receives and evaluates terrestrial or satellite-supported position signals.

    21. A combat training system according to claim 18, wherein: the sensor for sensing the spatial orientation information is part of each of the at least one firearm of the at least two components.

    22. A combat training system according to claim 19, wherein: the sensor for sensing the spatial orientation information is part of each of the at least one firearm of the at least two components.

    23. A combat training system according to claim 20, wherein: the sensor for sensing the spatial orientation information is part of each of the at least one firearm of the at least two components.

    24. A combat training system according to claim 23, wherein: the sensor for sensing the spatial orientation information includes at least one of: an inertial measurement unit, a magnetometer, a camera, an optical sensor, an ultrasonic transducer, a motion sensor, and a UWB location system.

    25. A combat training system according to claim 18, wherein: the sensor for sensing position information is at least one of a terrestrial, a waterborne and an airborne remote control system.

    26. A combat training system according to claim 18, wherein: the sensor for sensing spatial orientation is at least one of a terrestrial, a waterborne and an airborne remote control system.

    27. A combat training system according to claim 25, wherein: the sensor for sensing the position information and the sensor for sensing spatial orientation are included in at least one of a radar system, an optical image and pattern detection system, an ultrasound-wave-based range-resolving detection system (SODAR), a lightwave-based range-resolving detection system (LIDAR), and a motion capture system.

    28. A combat training system according to claim 18, wherein: the at least two components each comprise at least one of an information transmitter and a receiver.

    29. A combat training system according to claim 18, wherein: the at least one firearm detects armed states and generates information regarding an armed state of the firearm.

    30. A combat training system according to claim 18, wherein: the at least one firearm includes a firearm disabling system which is remotely activatable.

    31. A combat training system according to claim 18, wherein: the event sensor is at least one of an accelerometer, an acoustic sensor, a sound sensor, a deformation sensor, and a contact sensor.

    32. A combat training system according to claim 18, wherein: the event sensor generates time-coded signals with the clock.

    33. A combat training system according to claim 18, comprising: an optical projection system for generating a visually virtual target object.

    34. A combat training system according to claim 33, wherein: the optical projection system comprises at least one of virtual reality glasses and a spatially positionable projection surface.

    35. A combat training system according to claim 18, wherein: the information processor comprises at least one of: a receiver for receiving position and orientation information; a calculator for processing and evaluating the at least received position and orientation information; a clock; a display for displaying result information; and a transmitting unit.

    36. A combat training system according to claim 35, wherein: the time for the processor is provided by the clock, which is time-synchronized with the clocks of the at least two components.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0036] In the following text, the invention will be described for exemplary purposes without limitation of the general inventive thought, based on embodiments thereof and with reference to the drawing. In the drawing:

    [0037] FIG. 1 shows the basic structure of a combat training system designed according to the invention;

    [0038] FIG. 2 shows a component arrangement for sensing and displaying the overall situation of a combat scenario;

    [0039] FIG. 3 shows the component arrangement for sensing and displaying according to FIG. 2 and with flexible danger zones;

    [0040] FIG. 4 is a block diagram to illustrate flexible danger zones;

    [0041] FIG. 5 shows components for sensing and displaying according to FIG. 2 with the option of active weapon deactivation;

    [0042] FIG. 6 is a block diagram to illustrate integrated weapon deactivation;

    [0043] FIG. 7 is a component representation similar to FIG. 2 with sensing and displaying of simulated hits;

    [0044] FIG. 8 is a block diagram for sensing simulated hits;

    [0045] FIG. 9 shows a component arrangement for sensing and analysing according to FIG. 1 with weapon-specific hit assignment;

    [0046] FIG. 10 is a block diagram to illustrate weapon- and shooter-specific hit assignments;

    [0047] FIG. 11 is a component representation according to FIG. 6 with generation of virtual target objects; and

    [0048] FIG. 12 is a block diagram for combating virtual targets.

    DETAILED DESCRIPTION OF THE INVENTION

    [0049] The heart of the combat training system according to the invention is a sensor arrangement, with which it is possible to detect events such as the firing of a shot, and the position as well as the spatial alignment and orientation of at least one firearm participating in the combat scenario, and to transmit the firearm-specific event, position and orientation information to a central information processing unit for further evaluation, preparation and display of the information.

    [0050] The sensors and other components needed for this will be explained in the following text with reference to various use cases. These relate specifically to: [0051] a. detecting and displaying the overall situation, i.e. the position and orientation of all firearms and optionally of target objects or target-representing devices present in the combat scenario; [0052] detecting and displaying the danger zones due to fire-ready firearms; [0053] b. calculating flexible danger zones; [0054] c. blocking firearms in certain situations; [0055] d. detecting simulated hits on target objects; [0056] e. firearm-specific assignment of hits; [0057] f. combating virtual target objects.

    [0058] One option for position and attitude sensing of a firearm is the integration of a sensor arrangement that is preferably modular in design and can be integrated in the firearm. If the firearm is a handheld weapon, e.g. a pistol or rifle, the capability exists to integrate the sensor arrangement inside the replaceable grip of the firearm. This enables flexible and mobile use, and the shooters can continue using their own weapon for the training without limitations.

    [0059] The sensor arrangement preferably comprises a number of components of the following kind: a unit for receiving and evaluating satellite-supported position signals, such as for example GPS, GLONASS, GALILEO or BEIDOU signals; an inertial measurement unit (IMU); a magnetometer; an integrated camera, for example with automatic image analysis for target detection; optical sensors, e.g. a brightness sensor or a polarization sensor for direction detection; ultrasonic range sensors; an ultra-wideband (UWB) position sensor; and a motion-capture, i.e. motion-analysis, sensor.

    [0060] Besides these sensors, the sensor arrangement also comprises the components necessary for operating the sensors, such as a processor-based signal capture and signal processing logic, an energy supply unit, and a communications unit for transmitting the information captured by the sensors, that is at least the position and orientation information, to a central information processing unit.

    [0061] Additionally, a part of each sensor arrangement is a clock, which sets a system time at the location of the sensor arrangement, wherein each clock, and therewith the system times of all sensor arrangements in all of the components involved in the firing exercise, that is all firearms and target objects, are time-synchronized. Moreover, the clocks are capable of generating time-synchronized, coded time signals, by which each individual component is identifiable in component-specific manner. For this purpose, the coding provides a component identifier that individualises the respective component.
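    The coded time signal described in [0061] can be pictured as a synchronized timestamp bundled with a component identifier. The following Python sketch is purely illustrative; the class name, field names, and the textual frame format are assumptions, not part of the specification, which would in practice use a compact binary coding.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CodedTimeSignal:
        """Illustrative coded time signal: a timestamp on the shared,
        time-synchronized system time base plus a component identifier
        that individualizes the emitting component."""
        component_id: str   # hypothetical IDs, e.g. "FIREARM-01", "TARGET-03"
        timestamp_ms: int   # milliseconds on the synchronized time base

        def encode(self) -> str:
            # Simple textual coding for illustration only.
            return f"{self.component_id}@{self.timestamp_ms}"

        @staticmethod
        def decode(frame: str) -> "CodedTimeSignal":
            # Split on the last separator so IDs may contain other characters.
            component_id, ts = frame.rsplit("@", 1)
            return CodedTimeSignal(component_id, int(ts))
    ```

    Because all clocks are synchronized, two such signals from different components can be compared directly on one time axis, which is what the later event correlation relies on.
    
    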

    [0062] Besides the sensor arrangement which captures position and attitude information for the respective component, as explained previously the at least one firearm and the at least one target object are each equipped with an event sensor, which in the case of the firearm is able to sense a firing event, and in the case of the target objects is able to sense an impact event. At least one of the following sensors is preferably suitable for this purpose: accelerometer, acoustic sensor, structure-borne sound sensor, deformation sensor, contact sensor.

    [0063] Thus in each case the event sensor is able to trigger the clock arranged in the corresponding component to generate a coded time signal, thereby adding a temporally precise timestamp to the detected event (firing event or impact event) and supplying it to the information processing unit, together with the information about the location and alignment of the firearm or the target object, for analysis.

    [0064] Through the temporal correlation between firing and impact event, an exact shooter-target assignment can be made as part of the information analysis procedure. Moreover, the ability to capture events and forward the information associated therewith to the information processing unit in realtime offers the capability of dynamic situation and threat assessment in situ, i.e. while the exercises are taking place on the combat field.
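    The temporal correlation of [0064] can be sketched as follows: each impact is matched to the firing event whose expected arrival time (firing timestamp plus projectile flight time) lies closest within a tolerance window. This is an assumed minimal model; the function name, tuple layouts, and the 150 ms default tolerance are illustrative choices, not values from the specification.

    ```python
    def assign_shooters(firing_events, impact_events, tolerance_ms=150):
        """Match each impact event to the most plausible firing event.

        firing_events: list of (firearm_id, fire_time_ms, flight_time_ms)
        impact_events: list of (target_id, impact_time_ms)
        Returns a dict mapping each (target_id, impact_time_ms) to the
        best-matching firearm id, or None if no firing event lies within
        the tolerance window.
        """
        assignments = {}
        for target_id, t_impact in impact_events:
            best, best_err = None, tolerance_ms + 1
            for firearm_id, t_fire, t_flight in firing_events:
                # Error between observed impact time and expected arrival time.
                err = abs(t_impact - (t_fire + t_flight))
                if err < best_err:
                    best, best_err = firearm_id, err
            assignments[(target_id, t_impact)] = best
        return assignments
    ```

    With synchronized clocks, even two shots fired a few milliseconds apart remain distinguishable, since each firearm's expected arrival time differs.
    
    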

    [0065] As explained previously, all of the components described above may be integrated in or adapted to a standardized module in or on the weapon. Equally, it is possible to arrange some of the abovementioned components on portable items of equipment that the shooter carries, for example a uniform, rucksack, helmet, etc.

    [0066] To illustrate an extremely simple embodiment of a combat training system, reference is herewith made to FIG. 1, in which the essential components are illustrated in a block diagram. Sensors 1, preferably integrated in the firearm, serve to sense events, position and alignment and to sense the orientation of the firearm, in order to be able to detect the firing and determine the direction in which the round was shot. The sensor signals generated with the aid of the sensors undergo a subsequent data preparation 2, in which the sensor signals are prepared for transmission to an external information processing unit 3 and are provided with a coded time signal that individualises the respective firearm. The sensors 1 together with the data preparation 2 form the sensor arrangement 4, which is either integrated in or on a firearm modularly as a structural unit, or may be mounted directly on the weapon and attached to items of the shooter's equipment in the form of a distributed system with intercommunicating components, for example using realtime-capable short-range wireless communication.

    [0067] Alternatively to or in combination with a sensor arrangement 4 mounted directly on the firearm, it is a reasonable development to install the sensor arrangement in the surrounding area, that is as at least one of a terrestrial-, water- and airborne remote control system. At least one of the following systems is well suited for this purpose: radar system, optical image and pattern detection system, ultrasound wave based, range-resolving detection system (Sodar), and lightwave-based range-resolving detection system (Lidar).

    [0068] Markers that can be mounted on the firearm are useful for improving the detectability of the at least one firearm for one of the remote control systems listed above, so that its position as well as its spatial orientation can be detected reliably. In this case too, each event at a firearm and a target object is captured with resolution as to time, on the basis of a common system time.

    [0069] FIG. 2 illustrates a scenario which represents n firearms or shooters with firearms. Each individual firearm is equipped with a sensor arrangement 4.sub.1, 4.sub.2, . . . 4.sub.n. The items of event, position and orientation information captured by each of the n sensor arrangements 4 are transmitted, resolved by time, to the central information processing unit 3, where an overall situation picture of the current combat situation is captured, prepared in a data preparation process, and presented in a display. In this context, all firearm-specific event, location and orientation information is merged, wherein the absolute positions are combined under realtime conditions into a unified coordinate system for evaluation and display purposes. The capability of sensing and representing the alignment of all firearms in realtime as well as their positions, for example on a monitor or with the aid of a light projection system, introduces considerable added value for firing safety in the conduct of the exercise.

    [0070] The information processing unit is also equipped with a communications unit, via which detailed feedback based on the data preparation and evaluation performed can be transmitted to all participating shooters, which has the effect of significantly improving the quality of the training.

    [0071] In addition to the weapon-mounted sensor arrangement for sensing position and spatial orientation, the at least one firearm is provided with a unit for detecting the armed state and generating information about the armed state of the firearm. The information about the armed state is transmitted to the central information processing unit 3 via a communications unit on the weapon, and is evaluated and displayed together with the respective positions and orientations of the firearms.

    [0072] Besides the above, the knowledge of the positions, orientations and armed states of all the firearms involved in a firing exercise makes it possible to connect danger zones within a combat scenario with the positions of the training participants who are bearing firearms, and to coordinate and show them in realtime. In this way, it is possible to check at any time during an exercise which exercise participants are located in a danger zone (internal firing safety) or are moving beyond zone and area boundaries (external firing safety). This in turn makes it possible to respond quickly and flexibly to firing safety events, e.g., by interrupting the exercise, by informing the firing safety officers, or by a direct message, e.g., via at least one of a participant-specific signalling device and a generalized broadcast announcement to the entire firing range. In this case, evaluation and decision making may be carried out manually, e.g., by a firing safety officer, or semi-automatically, based for example on a software-supported suggestion for action, or fully automatically. Additionally, sensing of the weapon orientation enables all danger zones to be managed dynamically, for example by adjusting the length of the danger zone according to the elevation of a firearm.
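    A flexible danger zone of the kind described in [0072] can be modelled, in the simplest case, as a horizontal sector in front of the weapon whose length grows with elevation. The sketch below is an assumed minimal model; the sector half-angle, base length, and the linear length-per-degree growth are hypothetical parameters, not values from the specification.

    ```python
    import math

    def in_danger_zone(weapon_pos, weapon_azimuth_deg, elevation_deg,
                       person_pos, half_angle_deg=15.0, base_length_m=300.0):
        """Check whether a person stands inside a flexible danger sector.

        Positions are (east_m, north_m) tuples; azimuth is a compass bearing
        in degrees (0 = north). The sector length grows with weapon elevation
        (illustrative model: +10 m per degree of elevation).
        """
        length = base_length_m + 10.0 * max(elevation_deg, 0.0)
        dx = person_pos[0] - weapon_pos[0]
        dy = person_pos[1] - weapon_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0 or dist > length:
            return False
        # Compass bearing from weapon to person, 0 deg = north.
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        # Smallest angular difference to the weapon's aiming azimuth.
        diff = abs((bearing - weapon_azimuth_deg + 180.0) % 360.0 - 180.0)
        return diff <= half_angle_deg
    ```

    Evaluating this check continuously for every firearm-participant pair yields exactly the internal firing safety monitoring discussed above; raising the elevation lengthens the zone, so a person who was safe at low elevation may fall inside it when the weapon is overelevated.
    
    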

    [0073] FIG. 3 shows such an operational scenario, in which n firearms are each equipped with a sensor arrangement 4.sub.1, 4.sub.2, . . . 4.sub.n. Similarly to the case shown in FIG. 2, the position and orientation of the firearm in space as well as the armed state thereof are used to capture the overall situation of a firing exercise in progress. In conjunction with the danger zones defined in ZDV A2 2090, threats may be converted into signals by data analysis, which signals may be transmitted in a manner perceptible by all participants in the exercise via suitable display and notification means.

    [0074] In this context, FIG. 4 shows a block diagram illustrating flexible danger zones, for each of which different decision situations obtain.

    [0075] In a further preferred embodiment, in addition to the sensor arrangements 4.sub.1, 4.sub.2, . . . 4.sub.n described previously, the firearms are also provided with a remote-controlled weapon-blocking capability, which can prevent a weapon from firing; see FIGS. 5 and 6. Such a blocking capability may be activated either if fixed boundaries, defined in the sensor evaluation electronics unit before the firearm was used, are exceeded (particularly advantageous when used on fixed-position shooting facilities and temporary firing ranges) or as a result of evaluation of danger zones and position data.

    [0076] The following two configuration principles are conceivable for the technical arrangement of an activatable firearm disabling device: [0077] 1. firing is blocked in the normal state and, when the device becomes active, is enabled, for example by a command. [0078] 2. firing is possible in the normal state and, when the device becomes active, is disabled, for example by a command.
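    The two configuration principles above amount to opposite fail-safe defaults for the same remote command. The following sketch makes that symmetry explicit; the class and mode names are illustrative assumptions, not terminology from the specification.

    ```python
    class FiringControl:
        """Illustrative model of the two activatable-disabling configurations:

        'normally_blocked' - firing blocked by default, enabled by a command;
        'normally_enabled' - firing possible by default, disabled by a command.
        """

        def __init__(self, mode: str):
            if mode not in ("normally_blocked", "normally_enabled"):
                raise ValueError(f"unknown mode: {mode}")
            self.mode = mode
            self.command_active = False  # has a remote command been received?

        def apply_command(self, active: bool) -> None:
            # Remote command from the central unit or a peer firearm.
            self.command_active = active

        def may_fire(self) -> bool:
            if self.mode == "normally_blocked":
                return self.command_active      # enable command required
            return not self.command_active      # disable command blocks firing
    ```

    The 'normally_blocked' variant fails safe if the communication link is lost (the weapon stays disabled), whereas the 'normally_enabled' variant keeps the weapon usable but depends on the link to impose a block; which trade-off is acceptable depends on the exercise.
    
    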

    [0079] The establishment of shooting and prohibition zones is of particular advantage when the combat training system according to the invention is used on facilities with fixed firing positions. Restricting shooting areas has the effect of significantly reducing not only danger to individuals, e.g. from ricochets, but also damage to local infrastructure.

    [0080] The use of dynamic shooting and prohibition zones is advisable for combat exercises as well as for other reasons. In this context, if a danger is detected for persons who are positioned within the danger zone of a weapon for example, the weapon is blocked and its firing capability is disabled. The assessment as to whether safety zones were violated at a given time is carried out for all exercise participants centrally in the higher-level information processing unit. In this case, any firearm blocking is also controlled and initiated centrally by the information processing unit. Alternatively a firearm block may also be effected in a decentralized way by a local comparison of positions and danger zones. In this case, the firearms exchange information among themselves in realtime via a firearms-specific communication unit. A combination of the two preceding control options is also conceivable depending on the nature of the danger.

    [0081] FIG. 5a illustrates the preceding scenario for fixed firing and prohibition zones. In this case, a sensor arrangement 4 senses the position, alignment and armed state of the firearm. All this information is forwarded to the central information processing unit 3, where the information is evaluated and a firearm block is initiated as the case requires.

    [0082] FIG. 5b illustrates a case scenario for dynamic firing and prohibition zones. In this case, n firearms are present in a training area. The sensor arrangements 4.sub.1, 4.sub.2, . . . , 4.sub.n of the individual firearms determine the position, the attitude of the firearm and also the armed state of the respective firearm. All information captured is transmitted to the central information processing unit 3, where optionally the overall situation may be displayed, but all information is evaluated taking into account the corresponding danger zones, on the basis of which a block is initiated as the case requires. Blocking is carried out firearm-specifically, depending on which of the firearms defines a prohibited danger zone.

    [0083] In this context, FIG. 6 shows a block diagram illustrating the interaction between firearm blocking in different case constellations.

    [0084] As for all the embodiments described previously for conducting combat training, the target objects at which the shooters aim their firearms are provided with a unit for capturing impact events and the position, or advantageously also for capturing the spatial alignment, particularly in cases where the target objects are mobile. In this way, the status of the target object, a permitted impact area and the spatial orientation of the target object may be captured and transmitted to the central information processing unit as an element of target object information.

    [0085] In this regard, FIGS. 7 and 8 show a use case in which simulated hits on the target object can be sensed, i.e. the firing exercise is conducted with maneuver ammunition, i.e. blanks. All weapons involved in the training are equipped with a sensor arrangement 4.sub.1, 4.sub.2, . . . , 4.sub.n. As in all the use cases described previously, position and alignment sensing of each individual firearm takes place time-synchronously.

    [0086] In the case of mobile target objects, the target objects are also equipped with a sensor arrangement 5.sub.1, 5.sub.2, . . . , 5.sub.n for sensing at least the position, preferably also the orientation of each, and these are also time-synchronised with all sensor arrangements involved in the exercise.

    [0087] Each firearm and each target object involved in the exercise is equipped with a sensor arrangement 4, 5, all of which have a common, precisely synchronised time basis and are also capable of generating a unique identification feature for identifying the respective component, which feature is transmitted to the central information processing unit 3 via a transmitting unit attached to each firearm/target object. At the moment of firing, the position and alignment of a firearm and of a target object to be hit are captured by the respective sensor arrangement.

    [0088] In the same way as the sensor arrangement 4 in the n firearms for sensing the position, alignment and also the firing event, the target objects are also fitted with a sensor arrangement 5, which is able to sense the position and optionally also the alignment of the target object, and a unit for sensing an event at the target object, for example an impact event, and to transmit it to the central information processing unit. All sensor arrangements 4, 5 are time-synchronized with each other, and are also able to transmit an identification feature to the information processing unit for purposes of unique component identification.

    [0089] At the time a shot is fired, the point in time, the position and orientation of the respective firearm, and also the target object at which the firearm was aimed, are captured. From the captured information and its temporal correlation, the central information processing unit calculates whether the target object was hit. Optionally, a ballistics model of the shot motion may be taken into account, as well as a temporal correlation criterion, for the evaluation of the information and the associated determination of the shooter-target assignment. The corresponding target object may optionally also respond accordingly upon determination of a hit, controlled by the central information processing unit, for example by tipping of a falling plate target on the target object.
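    The geometric part of the hit calculation in [0089] can be reduced, in its simplest form, to asking whether the firearm's aiming ray passes close enough to the target's sensed position. The sketch below shows that reduction in the horizontal plane only; the function name, the flat-fire simplification, and the target radius are assumptions, and a real evaluation would add elevation, an external-ballistics model and the temporal correlation criterion.

    ```python
    import math

    def is_hit(muzzle_pos, aim_azimuth_deg, target_pos, target_radius_m=0.5):
        """Horizontal-plane hit test: does the aiming ray from the muzzle
        pass within target_radius_m of the target centre?

        Positions are (east_m, north_m); azimuth is a compass bearing
        in degrees (0 = north).
        """
        az = math.radians(aim_azimuth_deg)
        ray = (math.sin(az), math.cos(az))       # unit aiming vector
        dx = target_pos[0] - muzzle_pos[0]
        dy = target_pos[1] - muzzle_pos[1]
        along = dx * ray[0] + dy * ray[1]        # projection onto the ray
        if along <= 0.0:
            return False                          # target behind the shooter
        # Perpendicular miss distance from the ray to the target centre.
        miss = abs(dx * ray[1] - dy * ray[0])
        return miss <= target_radius_m
    ```

    Combined with the time-correlated firing and impact events, this yields the shooter-target assignment even when several firearms and targets are active simultaneously.
    
    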

    [0090] Various interference parameters, such as virtual weather conditions or a modified external ballistics behaviour of virtual projectiles may also be taken into account when assessing hits. It is also possible to incorporate 3D terrain data in the hit assessment, for example concealment of a target object by buildings or land topologies.

    [0091] In a further application case, illustrated in FIGS. 9 and 10, the firearms are each equipped with the sensor arrangement 4.sub.1, 4.sub.2, . . . , 4.sub.n, which, besides determining the position and alignment of the firearm, is also able to capture the time of a discharge with sufficient accuracy, preferably in the millisecond range. The target objects to be combated are also equipped with a sensor arrangement 5.sub.1, 5.sub.2, . . . , 5.sub.n, which enables sensing of the position and optionally of the alignment of the respective target object, is able to detect a precisely temporally resolved impact event, and transmits this, together with a corresponding timestamp, to the central information processing unit as hit information. Each component in the exercise equipped with a sensor arrangement 4, 5 also works with a time basis which is sufficiently exactly time-synchronised with the other components, and a unique identification feature, so that temporal and positional events, i.e. firings and hits on the target object, can be captured in correlated manner. This correlation may incorporate models, e.g. an external ballistic description of the projectile trajectory, for example so that flight times and shot distances derived therefrom can be considered.

    [0092] The sufficiently precise time of a discharge is sensed on the firearm, in the same way the sufficiently precise time of a hit is sensed on the target object. Based on these values, taking into account the alignment of the firearm and target object relative to each other, the distance between the weapon and the target and the external ballistic properties of the ammunition used, it can be determined which firearm with the greatest probability hit the corresponding target object. The unique identification feature enables a weapon-specific hit assignment, which may be evaluated while the exercise is ongoing or subsequently.

    [0093] In a further use case, illustrated in FIGS. 11 and 12, the positions, orientations and the times of discharge of the respectively participating firearms are captured. If virtual targets are introduced with georeferencing into the training scenario, for example through the use of an augmented reality system, for example using VR glasses worn by the individual shooters, these can be combated with the real firearms equipped with the sensor arrangement. In such cases, it is of no importance whether maneuver or live ammunition is being used. Hits on virtual target objects of such kind can be calculated on the basis of the position and orientation of the firearms and the virtual position of the target. At the same time, however, it is important to ensure, for example by using a dynamic, situation-dependent weapons blocking system, that no unacceptable dangers are caused, for example by overelevation of weapons.

    LIST OF REFERENCE NUMERALS

    [0094] 1 Sensors [0095] 2 Data preparation [0096] 3 Information processing unit [0097] 4.sub.1, 4.sub.2, . . . , 4.sub.n Sensor arrangement on firearm [0098] 5.sub.1, 5.sub.2, . . . , 5.sub.n Sensor arrangement on target object