Patent classifications
G06G7/80
Fire control system
A fire control system comprises a fixed base and a sight assembly rotatably attached to the fixed base. The sight assembly includes an optical range finder for calculating a distance to a selected target and a camera having a zoom lens assembly and an optical sensor for generating an image signal representative of a target scene including a selected target. The zoom lens assembly includes a zoom controller and zoom lens optical elements, wherein the zoom controller is configured to change a magnification of the zoom lens optical elements responsive to a calculated distance to the selected target. In a further aspect, a method for imaging a target is provided.
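The zoom-control behavior described above (magnification responsive to a calculated target distance) can be sketched as a simple mapping. This is a hypothetical illustration, not the patented controller: the function name, the reference distance, and the magnification limits are all assumptions.

```python
def zoom_for_range(distance_m, ref_distance_m=100.0, min_mag=1.0, max_mag=12.0):
    """Map a range-finder distance to a lens magnification.

    Illustrative control rule: scale magnification linearly with
    distance so the target subtends a roughly constant angle in the
    image, clamped to the lens's physical magnification limits.
    """
    mag = distance_m / ref_distance_m
    return max(min_mag, min(max_mag, mag))
```

For example, a target ranged at 600 m with a 100 m reference distance would drive the lens to 6x, while very near or very far targets are clamped to the lens limits.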
Method and system of determining miss-distance
The presently disclosed subject matter includes a computerized method and system for determining miss-distance between platforms. The proposed method and system make use of an electro-optic sensor (e.g., a camera) mounted on one of the platforms to obtain additional data that is used to improve the accuracy of positioning data obtained from conventional positioning devices. A navigation error is calculated after the relative position of the two platforms is converted to the camera reference frame. Once the navigation error is available, it can be used to correct a measured miss-distance.
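The correction chain described above (convert the nav-derived relative position to the camera frame, difference it against the camera observation, then subtract the error from the measured miss vector) can be sketched as follows. This is a minimal illustration under assumed conventions: the rotation matrix, frame names, and function signatures are not from the patent.

```python
def rotate_to_camera(vec, R):
    """Apply a 3x3 rotation matrix (navigation frame -> camera frame)."""
    return tuple(sum(R[i][j] * vec[j] for j in range(3)) for i in range(3))

def navigation_error(rel_pos_nav, rel_pos_camera, R_cam):
    """Error between the nav-derived relative position (rotated into
    the camera frame) and the relative position the camera observed."""
    nav_in_cam = rotate_to_camera(rel_pos_nav, R_cam)
    return tuple(c - n for c, n in zip(rel_pos_camera, nav_in_cam))

def corrected_miss_distance(measured, error):
    """Subtract the navigation error from the measured miss vector."""
    return tuple(m - e for m, e in zip(measured, error))
```

With an identity rotation, a 0.5 m discrepancy between the camera and navigation solutions simply shifts the measured miss vector by 0.5 m.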
System and method utilizing a smart camera to locate enemy and friendly forces
A tactical awareness system. The system includes a plurality of soldiers affiliated with a friendly force, where each soldier has a man-worn computer, a Global Positioning System (GPS) device for determining a location of the soldier, and a smart camera for capturing an image. The plurality of soldiers communicates wirelessly via a network, and the location of each soldier is distributed to all soldiers affiliated with the friendly force over that network. The smart camera captures an image when activated by one of the plurality of soldiers, and a target image recognition module determines whether an enemy or friendly force is identified in an image captured by the smart camera. Furthermore, each smart camera includes a Camera Orientation Module (COM) for obtaining the smart camera's orientation relative to a known three-dimensional coordinate system.
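Once each soldier's GPS fix is distributed over the network, a man-worn computer can reason about the roster geometrically. A minimal sketch, assuming a roster of `{soldier_id: (lat, lon)}` entries (the data layout and function names are illustrative, not from the patent):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearest_friendly(own, roster):
    """Find the closest friendly soldier in the distributed roster."""
    return min(roster, key=lambda sid: haversine_m(*own, *roster[sid]))
```

This kind of lookup is what the networked location distribution enables: any node can compute distances to every friendly position it has received.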
Device controlling shooting based on firearm movement
New systems, devices and methods for extremely precise aiming and shooting of firearms by relatively unskilled users are provided. In some embodiments, shots may be planned in advance. In some embodiments, a device including specialized computer hardware and software aids a user in planning a shot(s), evaluating the accuracy of the planned shot(s), adjusting the location of the planned shot(s), and executing the planned shots. In some embodiments, the system may prevent shots where motion, acceleration and positioning of the device and/or firearm relative to the planned shot location(s) and/or the surrounding environment may cause an inaccurate shot, but execute shots where those factors facilitate a highly accurate shot. In some embodiments, the control system may similarly account for, counteract and/or otherwise adjust for any other relevant ballistic or accuracy-impacting factors, using a position-actuable firing mechanism to maintain the projected flight path to the planned point of impact.
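The gating behavior described above (permit the shot only when motion, acceleration, and aim error are all within tolerance) reduces to a threshold check. A minimal sketch; the specific thresholds and sensor units here are illustrative assumptions, not values from the disclosure:

```python
def shot_permitted(angular_rate_dps, accel_g, aim_offset_mrad,
                   max_rate=0.5, max_accel=0.05, max_offset=0.3):
    """Gate the trigger: permit the shot only when device angular rate
    (deg/s), acceleration (g), and aim offset from the planned shot
    location (mrad) are all inside tolerance. Thresholds illustrative."""
    return (abs(angular_rate_dps) <= max_rate
            and abs(accel_g) <= max_accel
            and abs(aim_offset_mrad) <= max_offset)
```

A real system would evaluate this continuously at sensor rate and release the firing mechanism only on a passing sample.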
Automated weapons system with selecting of target, identification of target, and firing
An automated weapon system is comprised of a human transported weapon comprising a barrel and munitions; sensing means; targeting means; computational logic for determining where to aim the human transported weapon; aim computational logic; firing activation means; and firing means. The munitions can be aimed towards a targeting area and propelled through the barrel. The sensing means senses which of up to a plurality of targets are within firing range of the automated weapon system. The targeting means selects a selected target from the targets in the targeting area that are within the firing range, responsive to the sensing. The computational logic determines where to aim the human transported weapon so that the munitions will hit the selected target if fired at a firing time. The aim computational logic adjusts the aim of the munitions through the human transported weapon, to compensate as needed for where the selected target is at the firing time, responsive to the determining where to aim. The firing activation means initiates firing of the munitions at the firing time. The firing means fires the munitions responsive to the adjusting of the aim and the initiating of firing.
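Determining "where to aim ... so that the munitions will hit the selected target if fired at a firing time" is, for a moving target, a lead-prediction problem: aim where the target will be when the round arrives. A minimal sketch under strong simplifying assumptions (2-D, constant target velocity, constant projectile speed, no drop or drag):

```python
import math

def lead_aim_point(target_pos, target_vel, muzzle_speed, shooter_pos=(0.0, 0.0)):
    """Fixed-point iteration: time of flight depends on the aim point,
    which depends on the time of flight; iterate until they agree."""
    tof = 0.0
    for _ in range(20):  # converges quickly for subsonic target speeds
        aim = tuple(p + v * tof for p, v in zip(target_pos, target_vel))
        dist = math.hypot(aim[0] - shooter_pos[0], aim[1] - shooter_pos[1])
        tof = dist / muzzle_speed
    return aim
```

For a target 800 m downrange crossing at 10 m/s against an 800 m/s round, the solution leads the target by roughly 10 m.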
Mission planning for weapons systems
A mission planning method for use with a weapon is disclosed. The method comprises: obtaining a first training data set describing the performance of the weapon; using the first training data set and a Gaussian Process (GP) or Neural Network to obtain a first surrogate model giving a functional approximation of the performance of the weapon; and providing the first surrogate model to a weapons system for use in calculating a performance characteristic of the weapon during combat operations.
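The GP branch of the method above can be illustrated with a tiny, dependency-free Gaussian-process regression: fit an RBF-kernel GP to a training set of weapon performance measurements and query the posterior mean as the surrogate. This is a generic GP sketch, not the patented pipeline; kernel choice, length scale, and the example data are assumptions.

```python
import math

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on scalar inputs."""
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, y):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_mean(x_train, y_train, x_star, ls=1.0, noise=1e-6):
    """Posterior-mean prediction of a GP surrogate fitted to
    (x_train, y_train), e.g. engagement range vs. measured performance."""
    n = len(x_train)
    K = [[rbf(x_train[i], x_train[j], ls) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, y_train)
    return sum(rbf(x_star, xi, ls) * ai for xi, ai in zip(x_train, alpha))
```

The appeal of the surrogate is exactly what the abstract states: the expensive-to-measure performance function is replaced by a cheap functional approximation that a weapons system can evaluate during combat operations.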
Dynamic weapon to target assignment using a control based methodology
A system and method of dynamic weapon to target assignment (DWTA) uses a control based methodology to dynamically assign each projectile to a target in a multiple target engagement situation. In some cases, closest proximity is used in real time to accomplish the DWTA functional requirement and performance criteria. In some cases, g-pulling acceleration and projectile fin deflection motion are also used to assess the best matched pair for each projectile and each target, with an end goal of intercepting the target or guiding the projectile to an acceptable error basket for target destruction via detonation. For the closest distance criterion for projectile/target pairing, a cutoff time is used to ensure the pairing is conducted within an acceptable duration while still being able to intercept the target or meet a required miss-distance basket (e.g., <3 m).
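The closest-proximity pairing criterion can be sketched as a greedy assignment: repeatedly match the globally closest free projectile/target pair, rejecting pairs that fall outside an acceptable range (standing in here for the cutoff-time / miss-basket constraint). This is an illustrative greedy sketch, not the control-based methodology of the disclosure.

```python
import math

def assign_projectiles(projectiles, targets, max_range):
    """Greedy closest-proximity DWTA sketch.
    projectiles, targets: {id: (x, y)} positions; max_range stands in
    for the cutoff / acceptable-basket constraint."""
    def d(p, t):
        return math.hypot(p[0] - t[0], p[1] - t[1])
    pairs, free_p, free_t = {}, set(projectiles), set(targets)
    while free_p and free_t:
        p, t = min(((p, t) for p in free_p for t in free_t),
                   key=lambda pt: d(projectiles[pt[0]], targets[pt[1]]))
        if d(projectiles[p], targets[t]) > max_range:
            break  # no remaining pair can meet the constraint
        pairs[p] = t
        free_p.remove(p)
        free_t.remove(t)
    return pairs
```

A real DWTA would re-run this assignment as the engagement evolves and fold in the acceleration and fin-deflection measures the abstract mentions.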
Systems and methods for detecting a distance between a conducted electrical weapon and a target
A shield cooperates with a detector to determine the distance between a conducted electrical weapon (CEW) and a target. The CEW includes a laser for aiming the CEW. The shield cooperates with the detector to block or pass laser light that reflects from a target. The position of the shield with respect to the detector may be used to determine the distance between the CEW and the target. The detected distance between a CEW and a target may be used to select a deployment unit with a suitable range for launching electrodes toward a target. The detected distance between a CEW and a target may be used to determine a spread between launched electrodes at the target. The spread may be used to determine a likely effectiveness of a stimulus signal on the target. The detected distance, and any information associated with the detected distance may be stored in a memory.
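Two of the downstream uses of the detected distance, computing the electrode spread at the target and selecting a deployment unit with a suitable range, reduce to simple geometry and a range lookup. A hedged sketch: the divergence angle and the unit range table below are illustrative placeholders, not real CEW specifications.

```python
import math

def electrode_spread_m(distance_m, launch_angle_deg=8.0):
    """Spread between two electrodes launched at a fixed divergence
    angle (angle value is illustrative, not a real CEW spec)."""
    return distance_m * math.tan(math.radians(launch_angle_deg))

def select_deployment_unit(distance_m, units):
    """units: list of (name, min_range_m, max_range_m); pick the first
    unit whose rated range covers the detected distance."""
    for name, lo, hi in units:
        if lo <= distance_m <= hi:
            return name
    return None
```

The spread estimate is what feeds the effectiveness judgment in the abstract: a wider spread between the landed electrodes generally changes how the stimulus signal couples through the target.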
Display apparatus of interception area, display method and interception system
In a display apparatus of an interception area, a first detection device scans a first area containing at least a part of the firing range of a first weapon to detect an obstacle. A first terminal calculates a first interception area in which the first weapon is able to intercept, based on data of the detected obstacle and data of the first weapon; calculates a first display area showing the first interception area on a screen, based on the first interception area; receives second area data generated by a second terminal; and calculates a second display area based on the second area data. The second display area shows on the screen an area in which a second weapon is able to intercept. The first terminal displays both the first display area and the second display area.
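The core computation, deriving an interception area from detected obstacles and the weapon's firing range, can be sketched as a bearing sweep: a bearing is interceptable when no detected obstacle lies on it inside the firing range. The angular binning and data layout here are assumptions for illustration, not the patented calculation.

```python
def interception_sectors(obstacles, max_range, step_deg=10):
    """Sweep bearings around the weapon; return the clear bearings.
    obstacles: list of (bearing_deg, range_m) obstacle detections."""
    clear = []
    for b in range(0, 360, step_deg):
        blocked = any(abs(ob - b) < step_deg / 2 and orng <= max_range
                      for ob, orng in obstacles)
        if not blocked:
            clear.append(b)
    return clear
```

The resulting sector list is what each terminal would render as its display area, and what it would exchange with the other terminal so both interception areas appear on one screen.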