Patent classifications
G09B9/20
Perspective selection for a debriefing scene
Debriefing a session from a user in a system. During the session, while the user performs actions on one or more tangible instruments of the system, dynamic data is logged in relation to the system along a session timeline. The dynamic data covers the actions of the user on tangible instrument(s). A graphical user interface depicting a debriefing scene, related to the session, is displayed from a first point of view starting at a first time within the session timeline. The debriefing scene is generated starting at the first time from at least a first image feed. Upon detection of a predetermined event in the dynamic data at a second time along the session timeline, a second point of view different from the first point of view is defined and the debriefing scene is generated therefrom after the second time using at least a second image feed.
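The abstract above describes replaying logged dynamic data along a session timeline and switching from a first image feed to a second one when a predetermined event is detected. A minimal sketch of that selection logic (the `LogEntry` shape, action names, and feed names are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class LogEntry:
    time: float   # position along the session timeline
    action: str   # action performed on a tangible instrument

def select_feed(log, event_action, first_feed, second_feed):
    """Return (time, feed) pairs for generating the debriefing scene.

    The scene starts from the first image feed (first point of view);
    once the predetermined event is detected in the dynamic data at a
    second time, the scene is generated from the second feed.
    """
    schedule = [(log[0].time, first_feed)]
    for entry in log:
        if entry.action == event_action:  # predetermined event
            schedule.append((entry.time, second_feed))
            break
    return schedule
```

For example, a log containing a "stall" event at t=5.0 would yield a schedule that switches from the cockpit feed to an exterior feed at that time.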
Contextual monitoring perspective selection during training session
Monitoring a training session from a trainee in an interactive computer simulation system. During the training session, while the trainee performs actions in an interactive computer simulation station on one or more tangible instruments thereof for controlling a virtual simulated element, dynamic data related to the actions of the trainee is logged. At a monitoring station of the interactive computer simulation system and during the training session, a graphical user interface is displayed depicting a contextual scene related to the interactive computer simulation from a first point of view, and a predetermined event is detected in the dynamic data during the training session. At the monitoring station, a second point of view different from the first point of view is defined, and the contextual scene is generated in the graphical user interface from the second point of view after the predetermined event is detected.
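Unlike the debriefing case, this monitoring happens live: the station consumes dynamic data as it is logged and switches point of view the moment the predetermined event fires. A small stateful sketch (the class name, the dict-shaped log entries, and the point-of-view labels are assumptions for illustration):

```python
class MonitoringStation:
    """Tracks the point of view of the contextual scene during a live
    training session; switches to a second point of view once a
    predetermined event is detected in the logged dynamic data."""

    def __init__(self, first_pov, second_pov, is_event):
        self.pov = first_pov
        self.second_pov = second_pov
        self.is_event = is_event  # predicate over a dynamic-data entry

    def on_dynamic_data(self, entry):
        """Feed one logged entry; return the point of view to render from."""
        if self.is_event(entry):
            self.pov = self.second_pov
        return self.pov
```

Passing the event predicate in as a callable keeps the station agnostic about which condition in the dynamic data counts as the predetermined event.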
SYSTEMS AND METHODS FOR SIMULATING AN ELECTRICAL VERTICAL TAKEOFF AND LANDING (EVTOL) AIRCRAFT
An aspect of the present disclosure is a system for simulating an electrical vertical takeoff and landing (eVTOL) aircraft, including a fuselage comprising one or more pilot inputs, each of the pilot inputs configured to detect pilot datum; a concave screen facing the fuselage; a plurality of projectors directed at the concave screen; and a computing device communicatively connected to the plurality of projectors, the computing device configured to: receive the pilot datum detected by the pilot inputs; generate a simulated eVTOL flight maneuver as a function of the pilot datum; and command the plurality of projectors to display one or more images based on the simulated flight maneuver.
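The computing device's pipeline — receive pilot datum, derive a maneuver, fan image commands out to the projectors — can be sketched as below. The input field name, the thresholds, and the maneuver labels are illustrative assumptions; the patent does not specify them:

```python
def simulate_step(pilot_datum):
    """Generate a simulated eVTOL flight maneuver as a function of the
    pilot datum (here a single assumed 'collective' axis in [-1, 1])."""
    collective = pilot_datum.get("collective", 0.0)
    if collective > 0.5:
        return "vertical_takeoff"
    if collective < -0.5:
        return "vertical_landing"
    return "hover"

def projector_commands(maneuver, num_projectors):
    """Command each projector directed at the concave screen to display
    its portion of the imagery for the current maneuver."""
    return [{"projector": i, "image": f"{maneuver}_view_{i}"}
            for i in range(num_projectors)]
```

In a real simulator the second function would address wedge-shaped segments of the concave screen; here each projector simply receives a per-index image identifier.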
METHODS AND SYSTEMS FOR SIMULATED OPERATION OF AN ELECTRIC VERTICAL TAKE-OFF AND LANDING (EVTOL) AIRCRAFT
Aspects relate to augmented reality (AR) methods and systems for simulated operation of an electric vertical take-off and landing (eVTOL) aircraft. An exemplary AR system includes at least an aircraft component of an eVTOL aircraft, a computing device configured to operate a flight simulator to simulate flight in an environment and simulate at least a virtual representation interactive with the flight simulator, where the at least a virtual representation includes an aircraft digital twin of the at least an aircraft component, and a mesh network configured to communicatively connect the at least an aircraft component and the computing device and communicate encrypted data.
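The abstract pairs a digital twin of an aircraft component with a mesh network that communicates encrypted data. A toy sketch of the twin-update flow, using an HMAC tag as a stand-in for the encrypted channel (a real system would encrypt the payload, not merely authenticate it; all names here are assumptions):

```python
import hashlib
import hmac
import json

def encode_update(state, key):
    """Serialize an aircraft-component state update and attach an HMAC
    tag, standing in for the mesh network's encrypted transport."""
    payload = json.dumps(state, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def apply_update(twin, message, key):
    """Verify the tag, then mirror the component state onto the
    aircraft digital twin interactive with the flight simulator."""
    payload, tag = message["payload"], message["tag"]
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("rejected unauthenticated update")
    twin.update(json.loads(payload))
    return twin
```

The point of the sketch is the shape of the loop: the physical component emits protected state messages over the mesh, and the simulator-side twin only applies messages that verify.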
AUGMENTED REALITY FOR VEHICLE OPERATIONS
Systems, methods, and computer products according to the principles of the present inventions may involve a training system for a pilot of an aircraft. The training system may include an aircraft sensor system affixed to the aircraft adapted to provide a location of the aircraft, including an altitude of the aircraft, speed of the aircraft, and directional attitude of the aircraft. It may further include a helmet position sensor system adapted to determine a location of a helmet within a cockpit of the aircraft and a viewing direction of a pilot wearing the helmet. The helmet may include a see-through computer display through which the pilot sees an environment outside of the aircraft with computer content overlaying the environment to create an augmented reality view of the environment for the pilot. A computer content presentation system may be adapted to present computer content to the see-through computer display at a virtual marker, generated by the computer content presentation system, representing a geospatial position of a training asset moving within a visual range of the pilot, such that the pilot sees the computer content from a perspective consistent with the aircraft's position, altitude, attitude, and the pilot's helmet position when the pilot's viewing direction is aligned with the virtual marker.
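The content is presented "when the pilot's viewing direction is aligned with the virtual marker." That alignment test reduces to the angle between the helmet's view vector and the vector from the pilot's eye to the marker's geospatial position. A minimal sketch (the 10-degree tolerance is an assumption; the patent does not give one):

```python
import math

def is_aligned(view_dir, eye_pos, marker_pos, max_angle_deg=10.0):
    """True when the pilot's viewing direction points at the virtual
    marker to within max_angle_deg (tolerance is illustrative)."""
    to_marker = [m - e for m, e in zip(marker_pos, eye_pos)]
    dot = sum(v * t for v, t in zip(view_dir, to_marker))
    nv = math.hypot(*view_dir)
    nt = math.hypot(*to_marker)
    if nv == 0.0 or nt == 0.0:
        return False
    cos_angle = max(-1.0, min(1.0, dot / (nv * nt)))
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg
```

In the full system both vectors would first be transformed into a common frame using the aircraft's position, altitude, and attitude plus the helmet position sensor output; the sketch assumes that transformation has already been applied.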
Augmented reality for vehicle operations
An augmented reality system includes a head-mounted see-through optic adapted to present digital content viewable by a user and having a transparency that allows the user to see through to the surrounding environment, a non-visual tracking system adapted to identify and track objects in a surrounding environment that cannot be seen visually, a training simulation system adapted to present a virtual training object on a display on the non-visual tracking system, and a virtual content presentation system adapted to present digital content in the optic when the distance between the optic and the virtual training object indicates the object is in visual range.
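The presentation gate in this abstract is a distance test: content appears in the optic only once the tracked virtual training object comes within visual range. A minimal sketch (positions as 3-D coordinates and the range value are assumptions):

```python
import math

def should_present(optic_pos, object_pos, visual_range):
    """True when the distance between the optic and the virtual training
    object indicates the object is in visual range, i.e. when digital
    content should be presented in the see-through optic."""
    return math.dist(optic_pos, object_pos) <= visual_range
```

Before this check fires, the object exists only to the non-visual tracking system; the predicate marks the handoff to the visual, in-optic presentation.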
Visualizing sub-systems of a virtual simulated element in an interactive computer simulation system
Method and system for visualizing dynamic virtual sub-systems of a virtual simulated element in an interactive computer simulation system comprising a computer generated environment. One or more tangible instruments control the virtual simulated element in the computer generated environment. A graphical user interface comprises an interactive display portion depicting a rendered view of the virtual simulated element. While an interactive computer simulation of the virtual simulated element is performed in the interactive computer simulation system, a storage system logs dynamic data in relation to the dynamic virtual sub-systems. At least one of the dynamic virtual sub-systems of the virtual simulated element is selected, and a subset of dynamic data related to the selected virtual sub-system is loaded from the storage system. The selected virtual sub-system is displayed together with the related dynamic data on the graphical user interface.
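The core data operation here is loading from the storage system only the subset of logged dynamic data that relates to the selected sub-system. A minimal sketch, assuming dict-shaped log entries tagged with a `subsystem` field (the field names are illustrative, not from the patent):

```python
def load_subsystem_data(log, subsystem):
    """Load from the logged dynamic data the subset related to the
    selected dynamic virtual sub-system, ready to be displayed on the
    graphical user interface alongside that sub-system."""
    return [entry for entry in log if entry["subsystem"] == subsystem]
```

Filtering at load time, rather than streaming the whole log to the interface, is what lets the display portion stay scoped to the one sub-system the user selected.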