F41G9/002

VISUAL GUIDANCE SYSTEM FOR BARREL-FIRED PROJECTILES
20220170725 · 2022-06-02 ·

A winged external guidance frame placed on the muzzle that couples with a projectile as it exits the barrel, using the projectile's kinetic energy to travel to the target, while accuracy is provided by on-board electronics and corrected using the wings. Alternatively, a reusable unmanned aerial system that travels at the speed and in the direction of the projectile and couples with the projectile as it exits the barrel.

PROJECTILE RANGING WITH DIGITAL MAP

A terrain-referenced navigation system for an aircraft comprises: a stored digital terrain map; a position calculation unit arranged to calculate aircraft position relative to the stored digital terrain map to determine a terrain-referenced aircraft position; a fall line calculation unit arranged to calculate a fall line for a projectile starting from the terrain-referenced aircraft position as a launch point; and an impact point calculation unit arranged to directly compare the fall line with the digital terrain map, by incrementally comparing a height of the projectile along the fall line with a height of the terrain according to the stored digital terrain map in order to find an expected impact point on the terrain.
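The incremental comparison described above can be sketched as a simple march along the fall line. This is a minimal illustrative sketch, not the patented method: it assumes a drag-free, gravity-only ballistic model, a flat coordinate frame, and a `terrain_height(x, y)` callback standing in for the stored digital terrain map; all names and parameters are assumptions.

```python
# Hypothetical sketch: step a ballistic fall line from the launch point and
# find where the projectile height first drops to the terrain height.

def expected_impact_point(launch_xy, launch_alt, velocity, terrain_height,
                          g=9.81, dt=0.05):
    """Walk the fall line in time steps; return (x, y) where projectile
    height first meets the digital-terrain-map height, or None."""
    x, y = launch_xy
    vx, vy, vz = velocity
    z = launch_alt
    for _ in range(100_000):                # safety bound on the march
        x += vx * dt
        y += vy * dt
        vz -= g * dt                        # gravity-only fall line
        z += vz * dt
        if z <= terrain_height(x, y):       # incremental height comparison
            return (x, y)
    return None

# Usage: flat terrain at 0 m, projectile launched horizontally at 100 m/s
# from 500 m altitude; impact lands roughly 1 km downrange.
impact = expected_impact_point((0.0, 0.0), 500.0, (100.0, 0.0, 0.0),
                               lambda x, y: 0.0)
```

A real implementation would sample the terrain map grid (with interpolation) instead of an analytic callback, and use the platform's actual ballistic model for the fall line.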

Unmanned aerial vehicle
11040772 · 2021-06-22 · ·

An unmanned aerial vehicle (UAV) adapted for transit in and deployment from a projectile casing is provided. The UAV includes a wing assembly coupled to the projectile casing, the wing assembly being moveable between a closed position and a deployed position. The UAV further includes a propulsion system including at least one rotor disposed on the wing assembly to generate lift, wherein in the closed position, the wing assembly is substantially integral with the projectile casing and in the deployed position, the wing assembly is extended outwards from the projectile casing.

Mission planning for weapons systems

A mission planning method for use with a weapon is disclosed. The method comprises: obtaining a first training data set describing the performance of the weapon; using the first training data set and a Gaussian Process (GP) or Neural Network to obtain a first surrogate model giving a functional approximation of the performance of the weapon; and providing the first surrogate model to a weapons system for use in calculating a performance characteristic of the weapon during combat operations.
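The surrogate-model step can be sketched with a standard Gaussian Process regression: fit a GP to a small training set of weapon-performance samples, then query it as a cheap functional approximation. The RBF kernel, noise level, and the toy performance curve below are all illustrative assumptions, not the disclosed method.

```python
# Minimal GP-regression sketch of a performance surrogate model.
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D sample vectors."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_fit_predict(x_train, y_train, x_query, length=1.0, noise=1e-6):
    """Standard GP posterior mean at x_query given the training set."""
    K = rbf(x_train, x_train, length) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf(x_query, x_train, length) @ alpha

# Toy "first training data set": performance (e.g. hit probability) vs range.
x = np.linspace(0.0, 5.0, 20)
y = np.exp(-0.2 * x)                  # stand-in performance curve
pred = gp_fit_predict(x, y, np.array([2.5]))
```

In use, the fitted surrogate replaces expensive performance simulations during combat operations; a neural network could be substituted for the GP, as the abstract notes.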

Drone interceptor system, and methods and computer program products useful in conjunction therewith
11022408 · 2021-06-01 · ·

A system operative to down a target drone having propellers deployed along a perimeter p, comprising a processor-controlled interceptor drone bearing a processor-controlled flexible elongate intercepting agent cannon and an onboard camera; and an onboard processor to receive sensed wind conditions and to compute a firing distance d between interceptor and target drones, given a firing angle A, wherein the processor is configured to track the target drone using imagery generated by the onboard camera, including, at least once when said wind conditions exist, guiding the interceptor drone to a firing position whose distance from the target drone is d, and commanding the cannon to fire at firing angle A once said firing position is achieved, thereby using the flexible elongate intercepting agent to down target drones.

APPARATUS AND METHOD FOR DISPLAYING AN OPERATIONAL AREA

An apparatus and method for displaying an operational area to an operative of a host platform, said operational area being defined within an external real-world environment relative to said host platform, the apparatus comprising a viewing device (12) configured to provide to said operative, in use, a three-dimensional view of said external real-world environment, a display generating device for creating images at the viewing device, and a processor (32) comprising an input (34) for receiving real-time first data representative of a specified target and its location within said external real-world environment and configured to receive or obtain second data representative of at least one characteristic of said specified target, the processor (32) being further configured to: use said first and second data to calculate a geometric volume representative of a region of influence of said specified target relative to said external real-world environment and/or said host platform, generate three-dimensional image data representative of said geometric volume, and display a three-dimensional model, depicting said geometric volume and created using said three-dimensional image data, on said display generating device, the apparatus being configured to project or blend said three-dimensional model within said view of said external real-world environment at the relative location therein of said specified target.

LAR display system and method
11054221 · 2021-07-06 · ·

A Launch Acceptability Region [LAR] display system and method for a payload-releasing platform, the system being configured to be communicably coupled to a LAR computing module (104) configured to compute LAR data representative of a Launch Acceptability Region in respect of said platform based on a set of input parameters and predefined payload performance parameters, the system comprising: an input module (100) configured to obtain or receive a first input parameter value in respect of a first of said input parameters, generate a set of second input parameter values in respect of said first of said input parameters, said second input parameter values being different to, and at respective intervals from, said first input parameter value, and input said first input parameter value and said second input parameter values to said LAR computing module so as to cause said LAR computing module to compute, based on each of said first and second input parameter values, a respective LAR and output a set of LAR data, each data item of said set of LAR data being representative of a respective LAR and the input parameter value on which it is based; an image data generation module (106) configured to receive said set of LAR data and generate therefrom a set of LAR image data, each data item of said set of LAR image data being representative of a respective data item of said set of LAR data; and a display module (108) configured to receive said set of LAR image data and display, simultaneously, a visual representation of each LAR, wherein the relative positions in said display of said visual representations are based on their respective associated input parameter values.
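The input-sweep behaviour above can be sketched as follows: from one obtained input parameter value (here, release altitude), generate offset values at fixed intervals, compute a LAR for each, and tag every LAR with the value it was based on. The toy LAR model (acceptable ground-range interval growing with altitude) and all names are illustrative assumptions, not the patented computation.

```python
# Hypothetical sketch of the first/second input parameter sweep.

def compute_lar(altitude_m, glide_ratio=4.0, min_range_m=500.0):
    """Toy LAR: acceptable ground-range interval for release at altitude."""
    return (min_range_m, glide_ratio * altitude_m)

def lar_sweep(first_value, interval=500.0, n_offsets=2):
    """First value plus symmetric offsets, each paired with its LAR."""
    values = [first_value + k * interval
              for k in range(-n_offsets, n_offsets + 1)]
    return [(v, compute_lar(v)) for v in values]

# Usage: obtained first value of 3000 m yields five LARs, one per offset,
# ready to be displayed side by side ordered by input parameter value.
lars = lar_sweep(3000.0)
```

The display module would then render all five LARs simultaneously, positioned by their associated parameter value, letting the operator see at a glance how the acceptability region shifts with the swept input.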

AUTONOMOUS WEAPON SYSTEM FOR GUIDANCE AND COMBAT ASSESSMENT
20200393225 · 2020-12-17 · ·

An autonomous weapon system for improved guidance of a projectile homing on a target includes a guided projectile including at least one sensor, a carrier projectile, and at least one guidance and reconnaissance unit including a transmitter for communication via light. The system uses emitted light for both positioning and communication of target coordinates, which provides an accurate and cost-effective system for combatting point and surface targets by indirect fire.

MULTIMODE UNMANNED AERIAL VEHICLE
20200256644 · 2020-08-13 ·

A system comprising an unmanned aerial vehicle (UAV) configured to transition from a terminal homing mode to a target search mode, responsive to an uplink signal and/or an autonomous determination of scene change.
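The transition logic can be sketched as a two-state machine: the UAV drops from terminal homing back to target search when either an uplink command arrives or an autonomous scene-change detection fires. The mode names, score, and threshold below are illustrative assumptions.

```python
# Minimal sketch of the homing-to-search mode transition.
from enum import Enum

class Mode(Enum):
    TARGET_SEARCH = "search"
    TERMINAL_HOMING = "homing"

def next_mode(mode, uplink_abort, scene_change_score, threshold=0.8):
    """Leave terminal homing on an uplink signal or a detected scene change."""
    if mode is Mode.TERMINAL_HOMING and (
            uplink_abort or scene_change_score > threshold):
        return Mode.TARGET_SEARCH
    return mode

# Usage: a strong scene-change score forces a return to target search;
# a weak one leaves the UAV in terminal homing.
m1 = next_mode(Mode.TERMINAL_HOMING, False, 0.9)
m2 = next_mode(Mode.TERMINAL_HOMING, False, 0.1)
```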

Multiple kill vehicle (MKV) interceptor with improved pre-ejection acquisition and discrimination

An MKV interceptor includes a carrier vehicle (CV) that supports the deployment of M kill vehicles (KVs) and provides centralized acquisition and discrimination pre-ejection. Pre-ejection, each KV acquires and transmits IR imagery, and possibly visible imagery, via an internal communication bus to a central processor on the CV. The central processor spatially registers the IR images from the different KVs, either directly from the IR images themselves or using the visible imagery, and sums the IR (and visible) images to form a registered spatially averaged IR image. This image has the same resolution but higher SNR than any one of the KV IR images. The central processor uses this registered spatially averaged image during the pre-ejection acquisition and discrimination modes. The key benefit is the elimination of an independent CV sensing capability, which is large, heavy, and expensive, and was required by either the command-guided or sharing concepts.
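The SNR claim above follows from averaging: summing M registered images of the same scene keeps the signal coherent while uncorrelated noise averages down, improving SNR by roughly sqrt(M). A minimal sketch, assuming a synthetic scene and Gaussian noise (real KV imagery would first need the spatial registration step):

```python
# Sketch: averaging 9 registered noisy frames improves SNR ~3x.
import numpy as np

rng = np.random.default_rng(0)
scene = np.zeros((32, 32))
scene[12:20, 12:20] = 10.0                 # bright "target" on dark background

def noisy_frame():
    """One KV frame: the common scene plus uncorrelated sensor noise."""
    return scene + rng.normal(0.0, 2.0, scene.shape)

def snr(img):
    """Ratio of mean target intensity to background noise level."""
    signal = img[12:20, 12:20].mean()
    noise = img[:8, :8].std()              # noise estimated on a dark patch
    return signal / noise

single = snr(noisy_frame())
averaged = snr(np.mean([noisy_frame() for _ in range(9)], axis=0))
```

Here `averaged` comes out roughly three times `single`, matching the sqrt(M) expectation for M = 9 frames; resolution is unchanged because the frames are registered before summing.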