Patent classifications
F41G3/225
WEAPON SIGHTED CAMERA SYSTEM
Systems, apparatuses, and methods are described which provide weapon sighted cameras. A camera can be mounted on a weapon and, after a set-up procedure, can acquire a target without using a weapon sight of the weapon.
Firearm electronic system
Man-portable weapons include integrated electronics that calculate orientation and movement and provide that data to a user's heads-up display (HUD) as well as to group and area networks. By passing data to a HUD, the user is able to see, virtually, the flight path, point of impact and other ballistic data, as well as data representing the condition and performance of the weapon for any rounds fired. The HUD also displays the relative position of other members of the team, the last known enemy area of operation and other useful parameters received over the network from the man-portable weapons of the other team members. The electronics may be integrated within the main components of any suitable man-portable weapon in a non-intrusive way, so as to have no effect on the firing mechanism of the small arm when it is fully assembled.
VIEWING DEVICE FOR AIRCRAFT PILOT
A viewing device for an aircraft pilot, the device comprising: a support for positioning on the head of the aircraft pilot; a display surface; display means carried by the support and arranged to display augmented reality objects on the display surface; acquisition means arranged to act in real time to acquire first data representative of the position and of the orientation of the support, second data representative of the position and of the orientation of a cockpit of the aircraft, and third data defining congested zones occupied by equipment of the cockpit, and to acquire the augmented reality objects; and processor means arranged to act in real time to define positions for the augmented reality objects, so that all of the augmented reality objects are positioned outside the congested zones when they are displayed on the display surface.
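The core placement step this abstract describes, positioning augmented reality objects so they fall outside the congested zones, can be illustrated with a minimal 2D sketch. The rectangular zone representation, the rightward nudging strategy, and all names here are assumptions for illustration, not the patent's actual method.

```python
# Sketch: keep augmented-reality symbols out of rectangular "congested"
# display zones by nudging them to a nearby free position.
# Zone format (x0, y0, x1, y1) and the search strategy are assumptions.

def outside(pos, zone):
    """True if point pos=(x, y) lies outside rectangle zone=(x0, y0, x1, y1)."""
    x, y = pos
    x0, y0, x1, y1 = zone
    return not (x0 <= x <= x1 and y0 <= y <= y1)

def place_object(pos, zones, step=5, max_iter=1000):
    """Shift pos rightward in small steps until it clears every zone."""
    x, y = pos
    for _ in range(max_iter):
        if all(outside((x, y), z) for z in zones):
            return (x, y)
        x += step
    return None  # no free position found along this axis
```

In the patent, the zone data would come from the cockpit model (the "third data") and the test would run in real time for every displayed object; this sketch only shows the exclusion check itself.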
APPARATUS AND METHOD FOR DEFINING AND INTERACTING WITH REGIONS OF AN OPERATIONAL AREA
A display apparatus and method for displaying an operational area to an operative of a host platform, said operational area being defined within an external real-world environment relative to said host platform, the apparatus comprising a viewing device (12) configured to provide to said operative, in use, a three-dimensional view of said external real-world environment; a display generating device for creating images at the viewing device, a user input (33) configured to receive user input data (35) representative of a specified target or region in respect of which an operation is to be performed, and thereby defining an initial geometric volume for the operational area, said user input data including data representative of the location within said external real-world environment of said specified target or region and data representative of said operation to be performed in respect thereof; and a processor (32) configured to: use said user input data to generate or obtain three-dimensional image data representative of an adjusted geometric volume based, at least, on said initial geometric volume and on said operation to be performed, and display one or more images depicting said adjusted geometric volume and created using said three-dimensional image data, on said display generating device, the apparatus being configured to project or blend said one or more images displayed on said display generating device into said view of said external real-world environment at the relative location therein of the specified target or region.
Waveguide-based fused vision system for a helmet-mounted or head-worn application
A waveguide-based fused vision system includes a helmet-mounted or head-worn sensor module and a display assembly. The display assembly includes a frame with a pair of waveguide combiners and a pair of projectors associated with the waveguide combiners. The display assembly also includes a sensor module. The pair of projectors is disposed on opposite sides of the frame. The sensor module is disposed substantially above the frame and includes a first sensor, a second sensor, a video processing circuit, and a symbol generator. The video processing circuit is configured to merge first sensor information from the first sensor with first symbols from the symbol generator, and to merge the second symbols with the second sensor information, for conformal display over the real-world scene.
Controllable firing pattern firearm system
A controllable firing pattern firearm system is described herein. The controllable firing pattern firearm system includes a firearm, one or more actuators for adjusting at least one of a position and orientation of the firearm, and a controller controlling the actuators to produce a designated firing pattern on a target as the firearm is fired. The controller receives several user inputs to generate the commands for the actuators to produce the designated firing pattern, where the designated firing pattern may be a spiral firing pattern. The user provides input through a control panel having several control input mechanisms. The user inputs include a firing pattern size or target diameter, a projectile firing density, and the distance of the target from the firearm, among other inputs. A method is also described herein for producing a spiral firing pattern on a target with the controllable firing pattern firearm system.
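The spiral pattern the abstract mentions can be sketched as pure geometry: given a pattern diameter and a point count (standing in for the firing density), emit (x, y) offsets on the target plane along a spiral. The Archimedean spiral, the parameter names, and the fixed number of turns are assumptions; the patent's actual command generation is not specified in the abstract.

```python
import math

# Sketch: generate a sequence of aim-point offsets on the target plane
# following an Archimedean spiral. Radius grows linearly from the centre
# to diameter/2 while the angle sweeps `turns` full revolutions.
# All parameter names and the spiral choice are illustrative assumptions.

def spiral_pattern(diameter, n_points, turns=3):
    r_max = diameter / 2.0
    points = []
    for i in range(n_points):
        t = i / (n_points - 1) if n_points > 1 else 0.0
        r = r_max * t                      # radius grows linearly outward
        theta = 2.0 * math.pi * turns * t  # angle sweeps `turns` revolutions
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

In the described system, each offset would then be converted, using the target distance, into angular actuator commands for the mount; that conversion is omitted here.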
Target designator
A target designator for a guided weapon is disclosed. The designator has a sight arranged to display, in operation, a reticule superimposed upon a field of view. The reticule is moveable within the field of view. The designator further comprises an eye tracker operable to track the gaze of the operator whilst the operator uses the sight. The eye tracker communicates with the sight such that the reticule moves so as to be aligned with the direction of the gaze of the operator.
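The sight-and-eye-tracker coupling described above can be sketched as a mapping from a reported gaze direction to a reticule position on the sight's display, clamped to the field of view. The linear mapping, the angle units, and every parameter name are assumptions for illustration only.

```python
# Sketch: map a gaze direction (azimuth/elevation, degrees) from the eye
# tracker to reticule pixel coordinates on the sight, clamped to the FOV.
# The linear angle-to-pixel mapping is an illustrative assumption.

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def reticule_position(gaze_az, gaze_el, fov_deg=40.0, display_px=1000):
    """Linearly map gaze angles (degrees) to centred pixel coordinates."""
    half = fov_deg / 2.0
    scale = display_px / fov_deg
    x = display_px / 2 + clamp(gaze_az, -half, half) * scale
    y = display_px / 2 + clamp(gaze_el, -half, half) * scale
    return (x, y)
```

A real designator would also filter the gaze signal (e.g. to suppress saccades) before moving the reticule; that smoothing is omitted from this sketch.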