Universal Virtual Simulator
20200143699 ยท 2020-05-07
Inventors
- Arif M. J. PASHAYEV (Baku, AZ)
- Jahangir J. ASKEROV (Baku, AZ)
- Roald Z. SAGDEEV (Baku, AZ)
- Daniel A. USIKOV (Newark, CA, US)
- Adalat S. SAMADOV (Baku, AZ)
- Anvar T. HAZARKHANOV (Sumqayit, AZ)
- Toghrul I. KARIMLI (Baku, AZ)
- Seymur M. M. KARIMOV (Baku, AZ)
- Aliazhdar A. SEYIDZADEH (Baku, AZ)
- Hikmat M. SEYIDOV (Baku, AZ)
- Ilkin A. MIRZOYEV (Baku, AZ)
- Ruslan T. BARZIGYAR (Baku, AZ)
- Samir S. RUSTAMOV (Baku, AZ)
CPC classification
- G09B 9/302 (PHYSICS)
- G06F 3/0346 (PHYSICS)
- G06F 3/011 (PHYSICS)
International classification
Abstract
Virtual aircraft simulators are used to educate and train aircraft pilots flying solo or with another pilot, instructor, and air-traffic controller. The device contains a capsule installed on a computerized mechanical platform providing up to six degrees of freedom of real-time movement, and a pilot seat. To simulate the pilot's real sensations more closely, the capsule may also be equipped with a control stick, one or more thrust levers, and pedals. Stereo glasses are used to create the virtual reality environment. The invention improves the functionality of the simulator by introducing a virtual avatar with artificial intelligence, which, when flying with a trainee, can replicate the actions of a captain, co-pilot, air-traffic controller, or instructor. The avatar can also maintain a verbal dialogue with the trainee within the scope of a standard pilot communication protocol thesaurus. The device is suitable for any aircraft type without changing the hardware.
Claims
1. A universal virtual simulator comprising: one or two pilot seats installed on a movable platform; a set of seat belts, a headphone and microphone headset, 3D virtual reality glasses, a control stick, one or more thrust levers, pedals, electronic means for registering a position of pilot's hands, head and legs, pilot's bio-field sensors, and a personal computer for creating a virtual flight simulation, wherein said simulator is provided with static and dynamic 3D cameras and scanners, a flexible head-cap with built-in sensors, and intelligent avatars capable of performing functions of an instructor, captain, co-pilot, or air-traffic controller based on corresponding commands and programs by utilizing artificial intelligence methods for voice and visual control.
2. A training method using an intelligent avatar-instructor, the method comprising providing a virtual simulator comprising one or two pilot seats installed on a movable platform, a set of seat belts, a headphone and microphone headset, 3D virtual reality glasses, a control stick, one or more thrust levers, pedals, electronic means for registering a position of pilot's hands, head and legs, pilot's bio-field sensors, and a personal computer for creating a virtual flight simulation, wherein said simulator is provided with static and dynamic 3D cameras and scanners, a flexible head-cap with built-in sensors, and intelligent avatars capable of performing functions of an instructor, captain, co-pilot, or air-traffic controller based on corresponding commands and programs by utilizing artificial intelligence methods for voice and visual control; and utilizing the computer-based virtual simulator with audio, video and digital sources of information connected thereto from simulator systems and manual, cardiologic, head, and eye-based biomedical sensors of the captain, co-pilot, air-traffic controller, or examinees to simulate failures of systems, units, and devices, meteorological problems, spatial disorientation conditions; monitor and assess as an independent expert consistency of following operating procedures and a psycho-physiological condition of the captain, co-pilot, air-traffic controller, or examinees; demonstrate performance of the functions for optimal prevention of an emergency situation and achieving specified aircraft control modes; and perform flight training based on visual rules and instruments during malfunctions and problems and during visual illusion of the pilots, and to perform system training and updating with new flight data for continuous improvement of the method.
3. A method of operating an automated unmanned or single-pilot commercial or military aircraft using an intelligent avatar comprising providing a virtual simulator comprising one or two pilot seats installed on a movable platform, a set of seat belts, a headphone and microphone headset, 3D virtual reality glasses, a control stick, one or more thrust levers, pedals, electronic means for registering a position of pilot's hands, head and legs, pilot's bio-field sensors, and a personal computer for creating a virtual flight simulation, wherein said simulator is provided with static and dynamic 3D cameras and scanners, a flexible head-cap with built-in sensors, and intelligent avatars capable of performing functions of an instructor, captain, co-pilot, or air-traffic controller based on corresponding commands and programs by utilizing artificial intelligence methods for voice and visual control, wherein performing an optimal aircraft operation occurs by: a virtual aircraft cockpit provided with a virtual avatar; a computer having audio, video and digital information sources connected thereto from all cockpit and aircraft systems comprising a black box, ground weight and center-of-gravity measurement systems, systems of communication with an air-traffic controller, crew and passengers, a simulator-based experienced instructor flight database with the simulation of failures of aircraft systems and units, an aircraft system and unit diagnostics and forecasting unit based on the measurement of the dynamic characteristics of the aircraft systems comprising operating parameters, vibrations, temperatures, pressures, revolutions, sounds; and intelligent and automated control units.
Description
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0014] A universal virtual simulator comprises one or two pilot seats installed on a computerized, software-controlled movable mechanical platform, which provides up to six degrees of freedom in real time. Depending on the aircraft type and pilot training level, the platform can also be provided with removable control sticks, joysticks, pedals, and one or more thrust levers. The simulator further comprises stereo glasses that immerse the pilot in virtual reality; a microphone and headphones linked to voice-command recognition software; 3D virtual pilot avatars with dynamic imitation of the face, head, arms, and legs; electronic and software means for creating avatars resembling real pilots; and a helmet with built-in sensors for recording encephalograms to monitor the neurological and physiological condition of the pilot (for example, to assess the degree of alertness and the ability to adequately control the legs and hands).
[0015] The virtual reality stereo glasses are equipped with additional sensors, such as an eye tracker for both eyes, an RGB front-view video camera (or stereo camera), a depth camera for determining distances to objects in the scene, and head position and orientation sensors. Haptic gloves and other tools for the arms and legs allow simulating virtual touch sensations, and a computer with dedicated software performs the virtual reality simulation.
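As an illustration of how an eye-tracker reading could be turned into a cockpit interaction, the sketch below intersects the tracked gaze ray with a flat instrument-panel plane to find which point the pilot is looking at. The geometry, coordinate frame, and function name are illustrative assumptions, not specifics from the patent.

```python
import numpy as np

def gaze_panel_hit(eye_pos, gaze_dir, panel_point, panel_normal):
    """Intersect the tracked gaze ray with the (flat) instrument-panel plane.

    All vectors are in cockpit coordinates; returns the 3-D hit point,
    or None when the pilot is looking away from the panel.
    """
    eye_pos = np.asarray(eye_pos, dtype=float)
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)                 # normalize the gaze direction
    n = np.asarray(panel_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None                           # gaze parallel to the panel
    t = ((np.asarray(panel_point) - eye_pos) @ n) / denom
    if t <= 0:
        return None                           # panel is behind the pilot
    return eye_pos + t * d
```

The hit point can then be tested against the bounding boxes of virtual switches and gauges to decide which instrument the pilot is fixating on.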
[0016] The stereo image of the cockpit instruments is generated in accordance with the type of the aircraft and then transmitted to the stereo glasses (head-mounted device, HMD). The image in the stereo glasses depends on the position and orientation of the pilot's head. Depending on the type of HMD used, its spatial position is determined by sensors installed directly on the HMD, often with the support of external HMD tracking devices. More advanced HMDs include front-view cameras, which allow real 3D images of the pilot's arms and legs (as well as the rudders, joysticks, and pedals) to be incorporated into the virtual cockpit (augmented reality). An eye-tracking device is also used for optimal 3D visualization.
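The dependence of the stereo image on the head pose can be sketched as follows: from the tracked head position and orientation, build one view matrix per eye, offset laterally by the interpupillary distance. This is a minimal sketch with an assumed yaw/pitch head model and a hypothetical default IPD; real HMD runtimes provide these matrices directly.

```python
import numpy as np

def head_view_matrices(head_pos, yaw, pitch, ipd=0.064):
    """Build left/right eye view matrices from a tracked head pose.

    head_pos: (x, y, z) of the head in cockpit coordinates
    yaw, pitch: head orientation in radians
    ipd: interpupillary distance in metres (illustrative default)
    """
    # Rotation: yaw about the vertical axis, then pitch about the lateral axis
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    r_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    r_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rot = r_yaw @ r_pitch

    views = []
    for side in (-1.0, 1.0):            # left eye, then right eye
        eye = np.asarray(head_pos, dtype=float) + rot @ np.array([side * ipd / 2, 0.0, 0.0])
        view = np.eye(4)
        view[:3, :3] = rot.T            # world -> eye rotation
        view[:3, 3] = -rot.T @ eye      # world -> eye translation
        views.append(view)
    return views
```

Each frame, the renderer draws the cockpit scene twice with these two matrices and sends the pair of images to the HMD.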
[0017] In the absence of front-view cameras, the pilot's hands are synthesized and placed into the virtual cockpit in accordance with hand and finger position sensors [10].
[0018] Augmented reality methods are used to include a real 3D image of the pilot's legs in the virtual cockpit image. If the stereo glasses are not equipped with augmented reality sensors, the 3D image of the legs is synthesized based on the pedal sensors.
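Synthesizing the legs from the pedal sensors can be reduced, in the simplest case, to mapping the normalized pedal deflection onto a foot position along the pedal travel axis. The rest position and travel length below are illustrative placeholders, not values from the patent.

```python
import numpy as np

def foot_pose_from_pedal(deflection, rest=(0.0, -0.9, -0.5), travel=0.12):
    """Place a synthesized foot in the virtual cockpit from a pedal sensor.

    deflection: normalized pedal reading in [0, 1]
    rest: foot position at zero deflection (cockpit coords, illustrative)
    travel: pedal travel in metres along the forward axis (illustrative)
    """
    d = float(np.clip(deflection, 0.0, 1.0))
    pos = np.asarray(rest, dtype=float)
    pos[2] -= d * travel          # the foot moves forward as the pedal is pressed
    return pos
```

A full implementation would drive an articulated leg model (hip, knee, ankle) from the same reading; the point here is only that one scalar sensor suffices to animate the synthesized limb.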
[0019] The calculated virtual 3D image of the pilot's palms, fingers, and feet is used to simulate the pilot's manipulation of the virtual buttons, handles, and other simulated manipulators in the cockpit. If haptic gloves are used, the virtual touch is transmitted to the gloves to generate a response to the touch by hands and feet. When two pilots conduct a joint flight, a 3D avatar of the other pilot is created in the corresponding seat of the virtual cockpit (i.e., in the glasses of the second pilot, the avatar of the first pilot is placed in the seat of the first pilot, and vice versa, in the glasses of the first pilot, the avatar of the second pilot is placed in the seat of the second pilot). The 3D images of the actual positions of the hands of both pilots are combined and converted into the final image in the stereo glasses of both pilots. Flight synchronization between the workplaces of the pilots (and the instructor, if present) is performed locally (using USB or Ethernet communication channels) or via the Internet.
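The synchronization between pilot workstations can be sketched as exchanging small per-frame state messages: each side serializes its tracked hand positions and control inputs, sends them over the local or Internet link, and renders the peer's avatar from the received state. The length-prefixed JSON frame below is a hypothetical wire format chosen for clarity, not one specified by the patent.

```python
import json
import struct

def pack_state(seat, hands, controls, t):
    """Serialize one pilot's tracked state for the peer workstation.

    seat: 0 or 1, hands: list of (x, y, z) joint positions,
    controls: dict of stick/pedal deflections, t: simulation time.
    """
    payload = json.dumps(
        {"seat": seat, "hands": hands, "controls": controls, "t": t}
    ).encode()
    # 4-byte big-endian length prefix so frames can be split on a stream socket
    return struct.pack(">I", len(payload)) + payload

def unpack_state(frame):
    """Inverse of pack_state: strip the length prefix and decode the JSON body."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode())
```

In practice each workstation would send such a frame every simulation tick over TCP (local Ethernet) or a relay server (Internet), and interpolate between received states to hide latency.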
[0020] A flexible helmet with built-in sensors ensures the positioning of the EEG sensors on the pilot's head (for example, a brain helmet [6-8]). It is used to record a real-time multichannel oscillogram of brain activity, which allows recognizing the trainee's degree of focus while performing the aircraft operation tasks. The recorded electrical brain activity is also used to monitor the pilot's health (for example, whether the pilot has fallen asleep or lost consciousness) and other physiological characteristics of nervous activity. The oscillogram, in combination with the trainee's eye-tracking system and speech commands, can potentially be used to perform aircraft operation tasks, such as activating switches on the cockpit panel directly using the brain action currents, i.e., without hands.
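One common (and here deliberately simplified) way to estimate alertness from such an EEG recording is to compare spectral power in the alpha band (8-13 Hz, associated with relaxed or drowsy states) against the beta band (13-30 Hz, associated with active attention). The band limits and the ratio itself are illustrative signal-processing choices, not methods stated in the patent.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of an EEG channel in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def drowsiness_index(signal, fs):
    """Alpha/beta power ratio: a simplified drowsiness proxy.

    Higher values suggest reduced alertness.
    """
    alpha = band_power(signal, fs, 8.0, 13.0)
    beta = band_power(signal, fs, 13.0, 30.0)
    return alpha / (beta + 1e-12)   # epsilon avoids division by zero
```

A monitoring loop would compute this index over sliding windows of each channel and raise an alert (or hand control to an avatar) when it stays above a calibrated threshold.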
[0021] The simulator's technical support also includes haptic devices for hands, which make it possible to create a physical sensation of touching the virtual control devices in the aircraft cockpit with hands and fingers [9].
[0022] The software of the invention includes simulations of aircraft motion control, flight direction, engine, and landing gear control. Using a computer model of a specific aircraft, the software calculates the aircraft's response to the pilot's control actions and generates sounds from wind, engines, and other sources. The software also includes a simulation of the aircraft's onboard software and avatar voice recognition (with artificial intelligence elements) to control the aircraft. The software utilizes a database of ground images and generates a stereo image depending on the altitude and position of the aircraft, as well as the position of the pilot relative to the cockpit window. The generated stereo image is then transmitted to the stereo glasses.
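The core of such a flight model is a loop that maps control inputs to forces and integrates the resulting motion. The toy point-mass sketch below shows the shape of one integration step; the mass, thrust, drag, and lift constants are illustrative placeholders, not aircraft data from the patent.

```python
import numpy as np

def step_aircraft(state, controls, dt, mass=60000.0, max_thrust=240000.0):
    """One Euler integration step of a toy point-mass aircraft model.

    state: dict with 'pos' and 'vel' (3-vectors; metres, m/s; z is up)
    controls: dict with 'throttle' in [0, 1] and 'pitch_cmd' in [-1, 1]
    All constants (mass, thrust, drag, lift gain) are illustrative.
    """
    vel = np.asarray(state["vel"], dtype=float)
    speed = np.linalg.norm(vel) + 1e-9
    fwd = vel / speed                     # direction of motion

    rho, cd, area = 1.225, 0.03, 120.0    # air density, drag coeff., ref. area
    thrust = controls["throttle"] * max_thrust * fwd
    drag = -0.5 * rho * cd * area * speed * vel
    lift = np.array([0.0, 0.0, 1.0]) * controls["pitch_cmd"] * 2.0 * mass
    gravity = np.array([0.0, 0.0, -9.81 * mass])

    accel = (thrust + drag + lift + gravity) / mass
    new_vel = vel + accel * dt
    new_pos = np.asarray(state["pos"], dtype=float) + new_vel * dt
    return {"pos": new_pos, "vel": new_vel}
```

A production simulator replaces this with a full six-degree-of-freedom rigid-body model and aircraft-specific aerodynamic tables, but the control-input-to-state-update structure is the same.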
[0023] The artificial intelligence for voice and visual control included in the simulator complex allows individual as well as group training of the aircraft captain, co-pilot, instructor, or air-traffic controller, and eliminates the mandatory presence of these individuals during simulator training. Thus, one pilot of a multi-crew aircraft can perform individual training, where the functions and tasks of the other pilot, instructor, and air-traffic controller are performed by the corresponding avatars utilizing artificial intelligence for voice and visual interaction with the pilot in training. This makes it possible to achieve more unified training and reduce the total training cost.
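Because the avatar's dialogue is constrained to a standard pilot communication protocol thesaurus, its voice interaction can be sketched as matching a recognized utterance against a table of protocol phrases and returning a scripted reply. The intents, patterns, and replies below are a tiny hypothetical thesaurus for illustration only.

```python
import re

# A tiny illustrative phraseology thesaurus: each intent maps to
# regex patterns drawn from standard pilot/ATC communication.
THESAURUS = {
    "request_takeoff": [r"\brequest takeoff\b", r"\bready for departure\b"],
    "gear_up":         [r"\bgear up\b", r"\braise (the )?landing gear\b"],
    "go_around":       [r"\bgo around\b", r"\bgoing around\b"],
}

RESPONSES = {
    "request_takeoff": "Cleared for takeoff.",
    "gear_up":         "Gear up, positive rate confirmed.",
    "go_around":       "Roger, go around.",
}

def avatar_reply(utterance):
    """Match a recognized trainee utterance to a protocol intent.

    Returns (intent, scripted reply), or (None, None) when the
    utterance falls outside the thesaurus.
    """
    text = utterance.lower()
    for intent, patterns in THESAURUS.items():
        if any(re.search(p, text) for p in patterns):
            return intent, RESPONSES[intent]
    return None, None
```

In the full system, the speech recognizer feeds this matcher, and the selected intent also triggers the avatar's animated response (head turn, hand motion) in the virtual cockpit.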
REFERENCES
[0024] 1. Model based control of a flight simulator motion system. www.dcsc.tudelft.nl/Research/PublicationFiles/publication-5738.pdf
[0025] 2. Patent RU2361281, Int. Cl. G09B 9/32, Personal virtual pilot training simulator, V. P. Merkulov, V. K. Zakharov, V. Ya. Maklashevskiy, K. S. Vislyaev, and A. S. Yuritsyn, Bulletin No. 19, Jul. 10, 2009.
[0026] 3. Patent RU2280287, Int. Cl. G09B 9/02, Complex aircraft simulator, V. A. Godunov, A. S. Pochivalov, A. V. Shapalov, and A. V. Bondurin, Bulletin No. 20, Jul. 20, 2006.
[0027] 4. U.S. Pat. No. 8,624,924 B2, Int. Cl. G09G 5/00, Portable immersive environment using motion capture and head mounted display, M. K. Dobbins, P. Rondot, E. Shone, M. Yokel, K. J. Abshire, A. R. Harbor Sr., S. Lovell, and M. K. Barron, Lockheed Martin Corporation, Jan. 18, 2008; Appl. No. 61/022,185.
[0028] 5. U.S. Pat. No. 5,490,784, Int. Cl. G09B 9/00, Virtual reality system with enhanced sensory apparatus, D. E. Carmein, Feb. 23, 1996; Appl. No. 145,413.
[0029] 6. Patent US2005/0107716 A1, Methods and apparatus for positioning and retrieving information from a plurality of brain activity sensors.
[0030] 7. U.S. Pat. No. 7,547,284 B2, Bilateral differential pulse method for measuring brain activity.
[0031] 8. B. W. Johnson, S. Crain, R. Thornton, G. Tesan, and M. Reid, Measurement of brain function in pre-school children using a custom sized whole-head MEG sensor array.
[0032] 9. https://www.roadtovr.com/exos-haptic-vr-exoskeleton-glove-aims-deliver-practical-touch-feedback/; https://techcrunch.com/2017/02/09/oculus-gloves/
10. https://www.oculus.com/