Patent classifications
G09B9/006
GEOMETRICALLY PAIRED LIVE INSTRUMENTATION TRAINING HAND GRENADE
A method and apparatus for simulating a hand grenade in a training environment. The hand grenade includes dead reckoning to determine its location after leaving a thrower. From the known location of the thrower and the grenade's subsequent path after release, the explosion location and simulated damage to targets can be determined. The simulation can determine the effect of obstructions between the explosion location and nearby targets.
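The dead-reckoning step described above can be sketched as a double integration of inertial samples from the moment of release. This is an illustrative sketch only, not the patent's implementation; the function name, fixed timestep, and sample format are assumptions.

```python
# Hypothetical dead reckoning for a training grenade: integrate
# accelerometer samples (m/s^2) taken after release to estimate the
# grenade's path and, at fuze time, its explosion location.

def dead_reckon(start_pos, start_vel, accel_samples, dt):
    """Estimate positions by integrating acceleration over fixed steps dt."""
    pos = list(start_pos)
    vel = list(start_vel)
    path = [tuple(pos)]
    for a in accel_samples:
        for i in range(3):
            vel[i] += a[i] * dt   # integrate acceleration -> velocity
            pos[i] += vel[i] * dt  # integrate velocity -> position
        path.append(tuple(pos))
    return path

# Example: thrown at 8 m/s forward, 4 m/s up, under gravity only.
path = dead_reckon((0.0, 0.0, 1.5), (8.0, 0.0, 4.0),
                   [(0.0, 0.0, -9.8)] * 10, 0.1)
explosion_location = path[-1]  # last estimate when the fuze fires
```

In practice the samples would come from an onboard IMU, and the integration would restart when sensors detect the grenade leaving the thrower's hand.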
WEAPON TARGETING TRAINING SYSTEM AND METHOD THEREFOR
The present invention relates to a training system for training a forward controller. The training system utilises a pod housing that can be attached to a hardpoint under an aircraft wing. The pod is configured to receive communications from the forward controller on the ground (or in another aircraft) and to communicate wirelessly with an HMD and/or electronic device in the cockpit of the aircraft. This allows cheaper aircraft to be used for training purposes.
Systems and methods for training firefighters
Systems and methods for training firefighters and other first responders are provided. The systems include a plurality of sensors configured to detect at least one ambient condition, such as temperature, and to output corresponding electrical signals; a receiver configured to receive the electrical signals from the plurality of sensors; and a digital storage device operatively connected to the receiver and adapted to store the electrical signals corresponding to the detected ambient conditions. The data detected by the sensors can be retrieved for later review, analysis, and training. The systems may also include at least one video recorder and/or at least one infrared image recorder, each configured to capture images that can be stored for later retrieval, and a synchronizing device adapted to synchronize the sensor data with the recorded images for later review. Methods of implementing the systems, and portable cases enclosing the system, are also provided.
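The synchronizing device pairs sensor data with recorded images for joint review. A minimal sketch of that pairing, assuming timestamped readings and frame times and a nearest-timestamp rule (the record layout and lookup strategy are assumptions, not the patent's design):

```python
# Pair each (timestamp, value) sensor reading with the video frame
# whose timestamp is closest, using a binary search over frame times.
import bisect

def synchronize(sensor_readings, frame_times):
    """Map each (timestamp, value) reading to the nearest frame index."""
    paired = []
    for t, value in sensor_readings:
        i = bisect.bisect_left(frame_times, t)
        # choose the closer of the two neighbouring frames
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
        best = min(candidates, key=lambda j: abs(frame_times[j] - t))
        paired.append((best, t, value))
    return paired

frames = [0.0, 0.033, 0.066, 0.100]          # frame timestamps (s)
readings = [(0.030, 210.5), (0.090, 215.0)]  # (time s, temperature °C)
print(synchronize(readings, frames))
```

During playback, each reading can then be overlaid on the frame it was paired with.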
Tactical target mobile device
A device, sometimes mobile, that simulates human motion, such as an omnidirectional sensing target. The target can detect and differentiate between projectile (bullet) hits to the head, body, and one or more peripheral regions (e.g., arms, pelvis, legs) of a humanoid-form target, elicit a physical response from the mobile device based upon the sensing data, and provide real-time feedback to a user (shooter) via a user-friendly interface. The target includes multiple layers of conductive material separated by one or more insulating layers. The target can be formed by wrapping a planar sensing panel into a three-dimensional configuration such that the panel occludes a three-dimensional volume and presents surfaces that sense projectiles from all directions. Associated circuitry is configured to have a bullet pass through multiple zones and to electronically detect the bullet path based on a combination and/or sequence of zones detecting bullet passage.
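The zone-sequence detection described above can be sketched as a lookup from the ordered pair of triggered conductive zones to a hit region. The zone names and pairings below are illustrative assumptions; the patent's circuitry would work electrically rather than in software tables.

```python
# Infer where a bullet struck from the sequence of conductive zones it
# passed through: a bullet through the head pierces the front and rear
# head panels in order, and similarly for body and peripheral zones.
ZONE_PAIRS = {
    ("head_front", "head_rear"): "head",
    ("body_front", "body_rear"): "body",
    ("arm_front", "arm_rear"): "peripheral",
}

def classify_hit(triggered_zones):
    """Classify a hit from the ordered sequence of zone triggers."""
    return ZONE_PAIRS.get(tuple(triggered_zones), "unknown")

print(classify_hit(["head_front", "head_rear"]))  # head
```

Because the table is keyed on order, a trigger sequence that only makes sense for a rear entry (or a grazing shot touching one panel) falls through to "unknown" and can be handled separately.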
MOVABLE SIGHT FRAME ASSEMBLY FOR A WEAPON SIMULATOR
A sight frame assembly for a weapon simulator includes one or more connection arms. The connection arms are for slidably coupling the sight frame assembly to the weapon simulator. The connection arms include a passageway for slidably receiving a guide rod that is attached to a sight frame assembly mounting point on the weapon simulator. A spring is positioned in proximity to the guide rod. The sight frame assembly is disposed in a first position along a longitudinal axis of the weapon simulator when the spring is in an uncompressed state, and the sight frame assembly is disposed in a second position along the longitudinal axis of the weapon simulator when the spring is in a compressed state.
Augmented Reality Training Systems and Methods
A system is configured to provide Joint Terminal Attack Controller (JTAC) training using one or more of augmented reality (AR) devices, virtual reality (VR) devices, or other devices. The system may generate one or more AR elements (such as one or more of an enemy combatant, an aerial device, an ordnance, a ground vehicle, or a structure). The system may produce AR data that can be provided to one or more AR headsets and may produce VR scene data corresponding to the AR data that can be provided to one or more VR headsets. Additionally, the system may detect a cap or cover on a device, determine data related to the device from the cap, and present visual data superimposed over the cap based on the determined data.
Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
This disclosure provides a method and an apparatus for adjusting a viewing angle in a virtual environment. The method includes: displaying a first viewing angle picture corresponding to a first viewing angle direction, the first viewing angle picture including a virtual object having a first orientation; receiving a drag instruction for a viewing angle adjustment control; adjusting the first viewing angle direction according to the drag instruction to obtain a second viewing angle direction; and displaying a second viewing angle picture corresponding to the second viewing angle direction, the second viewing angle picture including the virtual object having the first orientation.
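A minimal sketch of the adjustment step, assuming the drag distance maps linearly to a yaw/pitch offset while the virtual object's own orientation is left unchanged. The sensitivity constant and clamping limits are assumptions for illustration.

```python
SENSITIVITY = 0.25  # degrees of camera rotation per pixel of drag (assumed)

def adjust_view(view_yaw, view_pitch, drag_dx, drag_dy):
    """Return the second viewing-angle direction after a drag gesture."""
    # horizontal drag rotates the camera around the vertical axis
    new_yaw = (view_yaw + drag_dx * SENSITIVITY) % 360.0
    # vertical drag tilts the camera, clamped so it cannot flip over
    new_pitch = max(-89.0, min(89.0, view_pitch - drag_dy * SENSITIVITY))
    return new_yaw, new_pitch

print(adjust_view(90.0, 0.0, 300, -100))  # (165.0, 25.0)
```

Only the viewing direction changes; the virtual object keeps its first orientation in the second picture, as the abstract specifies.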
Method for simulating live aircraft infrared seeker obscuration during live, virtual, constructive (LVC) exercises
The illustrative embodiments provide for a method and a training system. The training system includes a physical sensor system connected to a physical vehicle. The physical sensor system is configured to obtain real atmospheric obscuration data of a real atmospheric obscuration. The training system also includes a data processing system comprising a processor and a tangible memory. The data processing system is configured to receive the real atmospheric obscuration data and to determine, based on the real atmospheric obscuration data, whether a target is visible to the physical vehicle in a simulation training environment generated by the data processing system. The simulation training environment includes at least a virtual representation of the physical vehicle and a virtual representation of the real atmospheric obscuration.
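One way the visibility determination could work is to attenuate an infrared signal along the sight line through the measured obscuration. The Beer-Lambert transmittance model and threshold below are illustrative assumptions, not the patent's method.

```python
# Decide whether a target remains visible to a simulated IR seeker by
# attenuating the signal along the path through real atmospheric
# obscuration (e.g., measured haze or cloud extinction).
import math

def target_visible(extinction_coeff, path_length_km, threshold=0.05):
    """Visible if transmittance exp(-k * L) along the path exceeds threshold."""
    transmittance = math.exp(-extinction_coeff * path_length_km)
    return transmittance >= threshold

print(target_visible(0.2, 5.0))  # light haze over 5 km -> True
print(target_visible(2.0, 5.0))  # dense obscuration     -> False
```

The extinction coefficient would be derived from the real atmospheric obscuration data reported by the physical sensor system, so live conditions gate what the virtual seeker can see.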
UNIVERSAL LASERLESS TRAINING ARCHITECTURE
A laserless combat simulation device includes at least one processing device and a memory device having instructions stored thereon that, when executed by the at least one processing device, cause the at least one processing device to receive a trigger event from a weapon device. The instructions further cause the at least one processing device to receive image data of a field of view of the weapon device from an optical sensor array of the weapon device, receive position information and orientation information of the weapon device at the time of the trigger event, analyze the image data to determine an identity and location of a target, and calculate a ballistic outcome based at least in part on the position information and orientation information of the weapon device, the identity and location of the target, and ballistic characteristics of a simulated round fired from the weapon device.
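The ballistic-outcome calculation could be sketched as projecting a simulated round from the weapon's reported position and orientation and testing whether it lands within the target's extent. Everything below (flat-fire gravity drop, target radius, parameter names) is an assumption for demonstration, not the patent's ballistic model.

```python
# Simplified ballistic outcome: combine horizontal aim error with
# gravity drop over the time of flight, then compare the total miss
# distance against the target's radius.
import math

def ballistic_outcome(weapon_pos, azimuth_deg, muzzle_velocity,
                      target_pos, target_radius, g=9.81):
    """Return True if the simulated round impacts within target_radius."""
    dx = target_pos[0] - weapon_pos[0]
    dy = target_pos[1] - weapon_pos[1]
    rng = math.hypot(dx, dy)
    # horizontal miss: angular aim error times range (small-angle)
    bearing = math.degrees(math.atan2(dy, dx))
    cross = rng * math.radians(abs(azimuth_deg - bearing))
    # vertical drop over flight time at constant horizontal speed
    drop = 0.5 * g * (rng / muzzle_velocity) ** 2
    return math.hypot(cross, drop) <= target_radius

print(ballistic_outcome((0, 0), 0.0, 900.0, (300, 0), 1.0))  # True
```

A full implementation would also use the target identity (from the image analysis) to pick hit geometry and the simulated round's drag characteristics.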
SIMULATED LIDAR DEVICES AND SYSTEMS
Systems and methods for generating simulated LiDAR data using RADAR and image data are provided. An algorithm is trained using deep-learning techniques such as loss functions to generate simulated LiDAR data using RADAR and image data. Once trained, the algorithm can be implemented in a system, such as a vehicle, equipped with RADAR and image sensors in order to generate simulated LiDAR data describing the system's environment. The simulated LiDAR data may be used by a vehicle control system to determine, generate, and implement modified driving operations.
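A hedged sketch of the training signal only: an L1 loss comparing the model's simulated-LiDAR depths against a reference LiDAR scan. The RADAR-plus-image model itself is omitted; this merely illustrates the kind of loss function the deep-learning step would minimise, and the sample depths are invented.

```python
def l1_depth_loss(predicted, reference):
    """Mean absolute error between predicted and reference depths (m)."""
    assert len(predicted) == len(reference)
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(predicted)

pred = [10.2, 24.9, 7.6]  # depths from a RADAR + image model (assumed)
ref = [10.0, 25.0, 8.0]   # ground-truth LiDAR depths (assumed)
print(round(l1_depth_loss(pred, ref), 3))  # 0.233
```

Once the loss converges during training, the reference LiDAR is no longer needed: the trained model generates simulated LiDAR data from RADAR and camera input alone, which the vehicle control system can then consume.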