G09B9/302

Augmented reality for vehicle operations

A method includes saving in-flight data from an aircraft during a simulated training exercise, wherein the in-flight data includes geospatial locations of the aircraft, positional attitudes of the aircraft, and head positions of a pilot operating the aircraft; saving simulation data relating to a simulated virtual object presented to the pilot as augmented reality content in-flight, wherein the virtual object was programmed to interact with the aircraft during the simulated training exercise; and representing the in-flight data from the aircraft and the simulation data relating to the simulated virtual object as a replay of the simulated training exercise.
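The replay described above amounts to merging two timestamped streams — aircraft state records and virtual-object records — into one time-ordered sequence. A minimal sketch, with illustrative record shapes not taken from the patent:

```python
import heapq

def build_replay(in_flight_records, simulation_records):
    """Merge timestamped aircraft records (location, attitude, head pose)
    with virtual-object simulation records into a single time-ordered
    replay stream. Each record is a (timestamp, payload) tuple."""
    return list(heapq.merge(in_flight_records, simulation_records,
                            key=lambda rec: rec[0]))
```

`heapq.merge` assumes each input stream is already sorted by timestamp, which holds naturally for data logged during a flight.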

METHODS, SYSTEMS, APPARATUSES, AND DEVICES FOR FACILITATING PROVISIONING OF A VIRTUAL EXPERIENCE

A system includes a memory in communication with a processor, the memory storing instructions that, when executed by the processor, cause the processor to: receive a first location of a real vehicle; receive an updated location of the real vehicle; compute, utilizing at least the first location and the updated location, a future location of the real vehicle at a predetermined time in the future; and output data to a display device adapted to display to a user of the display device, at the predetermined time, a mixed reality representation of an environment surrounding the real vehicle as viewed from the future location.
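The simplest way to compute a future location from two observed locations is linear (constant-velocity) extrapolation; the patent does not specify its method, so the sketch below is only illustrative, with hypothetical names and a flat 2-D coordinate frame:

```python
from dataclasses import dataclass

@dataclass
class Fix:
    t: float  # timestamp, seconds
    x: float  # east position, metres
    y: float  # north position, metres

def extrapolate(first: Fix, updated: Fix, t_future: float) -> tuple:
    """Estimate the vehicle's position at a future time from two fixes,
    assuming constant velocity between and beyond them."""
    dt = updated.t - first.t
    vx = (updated.x - first.x) / dt
    vy = (updated.y - first.y) / dt
    lead = t_future - updated.t
    return (updated.x + vx * lead, updated.y + vy * lead)
```

A real system would likely filter noisy fixes (e.g. a Kalman filter) rather than use two raw samples.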

PROGRAMMABLE INTERFACE FOR FLIGHT CONTROL DEVICES

A rotorcraft flight simulator system includes physical flight control devices for a rotorcraft vehicle. Each of the flight control devices is configured to generate, when actuated, one or more control signals via an output connector of that flight control device. The system includes a programmable interface with multiple input pins, and each of the output connectors is coupled to a respective input pin. The programmable interface includes a controller configured to attribute particular control signals received at a particular input pin to a specific flight control device. The system includes a flight simulator computer configured to receive output signals from the controller via an Ethernet port. The output signals include, for the specific flight control device, header information indicating the specific flight control device and flight control data corresponding to a particular control signal received at the particular input pin that is associated with the specific flight control device.
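The output signals pair a header identifying the flight control device with the corresponding flight control data. A minimal framing sketch, with an assumed layout (1-byte device ID, 16-bit control value) and illustrative device names that are not from the patent:

```python
import struct

# Hypothetical frame layout: 1-byte device-ID header followed by a
# big-endian 16-bit control value.
FRAME = struct.Struct(">BH")

DEVICE_IDS = {"cyclic": 1, "collective": 2, "pedals": 3}  # illustrative

def encode(device: str, value: int) -> bytes:
    """Pack a control signal with a header identifying its source device."""
    return FRAME.pack(DEVICE_IDS[device], value)

def decode(frame: bytes):
    """Recover the device name and control value from a received frame."""
    dev_id, value = FRAME.unpack(frame)
    name = {v: k for k, v in DEVICE_IDS.items()}[dev_id]
    return name, value
```

Framing of this kind lets the flight simulator computer attribute each sample to the correct physical control regardless of which input pin produced it.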

Systems and methods for simulating an electrical vertical takeoff and landing (eVTOL) aircraft
11694569 · 2023-07-04

In an aspect of the present disclosure, a system for simulating an electrical vertical takeoff and landing (eVTOL) aircraft includes a fuselage comprising one or more pilot inputs, each of the pilot inputs configured to detect a pilot datum; a concave screen facing the fuselage; a plurality of projectors directed at the concave screen; and a computing device communicatively connected to the plurality of projectors, the computing device configured to: receive the pilot datum detected by the pilot inputs; generate a simulated eVTOL flight maneuver as a function of the pilot datum; and command the plurality of projectors to display one or more images based on the simulated flight maneuver.

Deep-learned generation of accurate typical simulator content via multiple geo-specific data channels
11544832 · 2023-01-03

A simulator environment is disclosed. In embodiments, the simulator environment includes graphics generation (GG) processors in communication with one or more display devices. Deep learning (DL) neural networks running on the GG processors are configured for run-time generation of photorealistic, geotypical content for display. The DL networks are trained on, and use as input, a combination of image-based input (e.g., imagery relevant to a particular geographical area) and a selection of geo-specific data sources that illustrate specific characteristics of the geographical area. Output images generated by the DL networks include additional data channels corresponding to these geo-specific data characteristics, so the generated images include geotypical representations of land use, elevation, vegetation, and other such characteristics.

AIRCRAFT COCKPIT TRAINING SIMULATOR AND ASSOCIATED METHOD

An aircraft cockpit training simulator includes a plurality of aircraft cockpit simulation panels and power over Ethernet (POE) cabling extending therebetween. Each panel includes a simulator user interface device, an input circuit or an output circuit, a POE interface circuit, and a distributed controller coupled to the input circuit or output circuit and the POE interface circuit and asynchronously communicating with other controllers using a publish/subscribe protocol. A host controller is coupled to the distributed controllers via the POE cabling and operates the aircraft cockpit simulation panels using a host computer model. The distributed controllers may operate independently of the host computer model.
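In a publish/subscribe protocol, each distributed controller publishes panel events to named topics without knowing which other controllers consume them, which is what allows the panels to run independently of the host computer model. A minimal in-process sketch of the pattern (the actual wire protocol is not specified by the patent):

```python
from collections import defaultdict

class Bus:
    """Minimal publish/subscribe bus: publishers and subscribers are
    decoupled and only share topic names."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(message)

received = []
bus = Bus()
bus.subscribe("panel/gear", received.append)   # host-side consumer
bus.publish("panel/gear", {"lever": "down"})   # panel-side producer
```

A networked implementation would replace the in-process callback list with message delivery over the POE cabling.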

AIRCRAFT VR TRAINING SYSTEM, AIRCRAFT VR TRAINING METHOD, AND AIRCRAFT VR TRAINING PROGRAM

An aircraft VR training system includes: training terminals that generate simulation images for performing simulation training in a common VR space and provide the simulation images to trainees individually associated with the training terminals; and a setting terminal holding setting information necessary for generating the simulation images. The setting terminal transmits the setting information to the training terminals. The training terminals apply the setting information received from the setting terminal and transmit a setting completion notification to the setting terminal. After the setting terminal receives the completion notification from all the training terminals, the setting terminal causes the training terminals to start the simulation training.
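The start-gating logic is a barrier: training begins only once every terminal has acknowledged its settings. A minimal sketch with illustrative names (not from the patent):

```python
class SettingTerminal:
    """Tracks setting-completion notifications and gates the start of
    training on having received one from every registered terminal."""
    def __init__(self, terminal_ids):
        self._pending = set(terminal_ids)
        self.started = False

    def on_completion(self, terminal_id):
        """Record one terminal's acknowledgement; start when none remain."""
        self._pending.discard(terminal_id)
        if not self._pending:
            self.started = True  # all terminals acknowledged
```

A production system would also handle timeouts and terminals that drop out before acknowledging.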

SYSTEM AND METHOD OF ADJUSTING FOCAL DISTANCES OF IMAGES DISPLAYED TO A USER OF A SIMULATOR
20230154351 · 2023-05-18

Systems and methods for adjusting focal distances of images displayed to a user at a designated eye point of a simulator are provided. An image may be generated for display by a screen, wherein the image is reflected by a mirror to the designated eye point. A simulated distance from the designated eye point to an object in the image may be determined. A focal distance for the image may be determined based on the simulated distance. A simulated size of the object may be determined based on the simulated distance. An adjustor may alter a distance between the screen and the mirror to achieve the focal distance. A size of the object may be adjusted in the image based on the simulated size.
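The size adjustment follows from keeping the object's visual angle constant: an object of height h at simulated distance d_sim must be rendered with height h · d_focal / d_sim when its optical image sits at the focal distance d_focal. A simplified geometric sketch (small-angle approximation; names are illustrative):

```python
def rendered_height(true_height_m, simulated_distance_m, focal_distance_m):
    """Scale an object so it subtends the same visual angle at the
    focal (optical image) distance as it would at its simulated
    distance: h' = h * d_focal / d_sim."""
    return true_height_m * focal_distance_m / simulated_distance_m
```

For example, a 4 m object simulated at 200 m, imaged at a 50 m focal distance, is rendered 1 m tall.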

METHODS, SYSTEMS, APPARATUSES, AND DEVICES FOR FACILITATING PROVISIONING OF A VIRTUAL EXPERIENCE

A system including a memory in communication with a processor, the memory storing instructions that, when executed by the processor, cause the processor to: create and store a geospatial virtual environment comprising a plurality of entities, each having one or more location attributes and corresponding time attributes, wherein at least one of the plurality of entities is a virtual asset and wherein at least one of the plurality of entities represents a real vehicle having a defined location within a physical space whose spatial coordinates are mapped to the virtual environment; receive an updated location of the real vehicle; map the received location of the real vehicle to the geospatial virtual environment; update the entity of the geospatial virtual environment corresponding to the real vehicle with the mapped received location; and output data comprising a portion of the geospatial virtual environment to a display device adapted to display to an operator of the real vehicle a mixed reality representation of at least one virtual entity.
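Mapping the vehicle's physical-space coordinates into the virtual environment is, at minimum, a calibrated coordinate transform. The patent does not specify the transform, so the sketch below uses a simple offset-and-scale mapping with illustrative names:

```python
def map_to_virtual(physical_xy, origin_xy, scale):
    """Map a real vehicle's physical-space coordinates into the
    geospatial virtual environment: translate by the calibrated
    origin, then scale into virtual-environment units."""
    px, py = physical_xy
    ox, oy = origin_xy
    return ((px - ox) * scale, (py - oy) * scale)
```

Each location update would pass through this mapping before the vehicle's entity in the virtual environment is updated.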

METHOD FOR REPRESENTING AN ENVIRONMENT BY MEANS OF A DISPLAY UNIT ARRANGED ON A PERSON AND VISIBLE FOR THE PERSON
20230195209 · 2023-06-22

A method represents an environment, within the scope of a simulation, as a display image via a display unit arranged on a person and visible to the person. The simulation is carried out in an interaction environment in which a number of actuatable interaction elements are arranged. An interaction environment image capture depicting the interaction environment is created using a first image capturing unit arranged on the person or relative to the person. A position of the person in the interaction environment is determined, and based on that position an environment image is provided. An image mask is provided which depicts the individual interaction elements contained in the interaction environment image capture and represented in the display image. The interaction environment image capture and the environment image are superimposed using the image mask and then displayed on the display unit.
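The mask-based superimposition selects, per pixel, between the real capture and the simulated environment image: where the mask marks an interaction element, the capture shows through; elsewhere the virtual environment is shown. A minimal per-pixel sketch operating on one image row (real systems would do this on full image buffers):

```python
def composite(capture_row, environment_row, mask_row):
    """Superimpose the interaction-environment capture over the virtual
    environment image using the mask: mask value 1 keeps the real
    capture pixel (an interaction element), 0 keeps the simulated one."""
    return [cap if m else env
            for cap, env, m in zip(capture_row, environment_row, mask_row)]
```

This keeps the physical controls visible and operable to the person while everything else is replaced by the simulated environment.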