G09B9/301

VISUALIZING SUB-SYSTEMS OF A VIRTUAL SIMULATED ELEMENT IN AN INTERACTIVE COMPUTER SIMULATION SYSTEM
20180232132 · 2018-08-16

Method and system for visualizing dynamic virtual sub-systems of a virtual simulated element in an interactive computer simulation system comprising a computer generated environment. One or more tangible instruments control the virtual simulated element in the computer generated environment. A graphical user interface comprises an interactive display portion depicting a rendered view of the virtual simulated element. While an interactive computer simulation of the virtual simulated element is performed in the interactive computer simulation system, a storage system logs dynamic data in relation to the dynamic virtual sub-systems. At least one of the dynamic virtual sub-systems of the virtual simulated element is selected, and a subset of dynamic data related to the selected virtual sub-system is loaded from the storage system. The selected virtual sub-system is displayed together with the related dynamic data on the graphical user interface.
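As an illustration of the per-sub-system logging and retrieval flow described above, a minimal Python sketch (the class, sub-system names, and data fields are hypothetical, not from the patent):

```python
from collections import defaultdict


class DynamicDataLog:
    """Illustrative store that logs dynamic data per virtual sub-system."""

    def __init__(self):
        # sub-system name -> list of (timestamp, data) records
        self._records = defaultdict(list)

    def log(self, subsystem, timestamp, data):
        """Record a data point for one sub-system while the simulation runs."""
        self._records[subsystem].append((timestamp, data))

    def load_subset(self, subsystem):
        """Return only the records related to the selected sub-system."""
        return list(self._records.get(subsystem, []))


log = DynamicDataLog()
log.log("hydraulics", 0.0, {"pressure": 3000})
log.log("electrical", 0.1, {"bus_voltage": 28})
log.log("hydraulics", 0.2, {"pressure": 2950})

# Selecting one sub-system loads only its related dynamic data.
selected = log.load_subset("hydraulics")
print(len(selected))  # 2
```

The display step would then render `selected` alongside the sub-system's graphical representation; only the retrieval is sketched here.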

PERSPECTIVE SELECTION FOR A DEBRIEFING SCENE
20180232133 · 2018-08-16

Method for debriefing a session performed by a user of a system. During the session, while the user performs actions on one or more tangible instruments of the system, dynamic data is logged in relation to the system along a session timeline. The dynamic data covers the actions of the user on the tangible instrument(s). A graphical user interface depicting a debriefing scene, related to the session, is displayed from a first point of view starting at a first time within the session timeline. The debriefing scene is generated starting at the first time from at least a first image feed. Upon detection of a predetermined event in the dynamic data at a second time along the session timeline, a second point of view different from the first point of view is defined, and the debriefing scene is generated therefrom after the second time using at least a second image feed.
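The event-driven point-of-view switch described above can be sketched as follows; the event predicate, camera names, and data fields are illustrative assumptions, not from the patent:

```python
def find_event_time(dynamic_data, predicate):
    """Scan logged (time, sample) pairs along the session timeline for the
    first sample matching the predetermined event; return its time or None."""
    for t, sample in dynamic_data:
        if predicate(sample):
            return t
    return None


def point_of_view(t, event_time, first_pov, second_pov):
    """Before the event time, the debriefing scene uses the first point of
    view (first image feed); at and after it, the second."""
    if event_time is not None and t >= event_time:
        return second_pov
    return first_pov


# Hypothetical logged dynamic data: a stall condition appears at t = 4.2 s.
data = [(0.0, {"stall": False}), (4.2, {"stall": True}), (5.0, {"stall": False})]
t_event = find_event_time(data, lambda s: s["stall"])

print(point_of_view(3.0, t_event, "chase_cam", "cockpit_cam"))  # chase_cam
print(point_of_view(4.5, t_event, "chase_cam", "cockpit_cam"))  # cockpit_cam
```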

SYSTEMS FOR SIMULATING JOINING OPERATIONS USING MOBILE DEVICES

Systems are disclosed relating to a mobile device mounted to a welding helmet such that a wearer of the welding helmet can see a display of the mobile device when wearing the welding helmet. In some examples, the mobile device is mounted such that a camera of the mobile device is unobscured and positioned at approximately eye level, facing the same way the wearer's eyes are facing. In some examples, a simulated training environment may be presented to the wearer via the display screen of the mobile device, using images captured by the camera of the mobile device, when the mobile device is so mounted to the welding helmet.

SYSTEMS AND METHODS FOR AUTONOMOUS PERPENDICULAR IMAGING OF TEST SQUARES

An unmanned aerial vehicle (UAV) assessment and reporting system may utilize one or more scanning techniques to provide useful assessments and/or reports for structures and other objects. The scanning techniques may be performed in sequence, with each optionally used to further fine-tune the subsequent scan. The system may include shadow elimination, annotation, and/or reduction for the UAV itself and/or other objects. A UAV may receive or determine a pitch of a roof of a structure. The pitch of the roof may be used to capture perpendicular images of sample regions that have a defined square area.

Systems and methods for autonomous perpendicular imaging of test squares

An unmanned aerial vehicle (UAV) assessment and reporting system may utilize one or more scanning techniques to provide useful assessments and/or reports for structures and other objects. The scanning techniques may be performed in sequence, with each optionally used to further fine-tune the subsequent scan. The system may include shadow elimination, annotation, and/or reduction for the UAV itself and/or other objects. A UAV may be used to determine a pitch of a roof of a structure. The pitch of the roof may be used to fine-tune subsequent scanning and data capture so that perpendicular images are captured at target fields of view and/or target distances, yielding images of sample regions having a defined square footage.
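The geometry that lets a known roof pitch drive perpendicular imaging can be illustrated with two small formulas; the function names, the simple pinhole field-of-view model, and the example values are assumptions for illustration, not from the patent:

```python
import math


def camera_pitch_deg(roof_pitch_deg):
    """Gimbal pitch below the horizon (degrees) that makes the optical axis
    perpendicular to a roof surface inclined at roof_pitch_deg."""
    return 90.0 - roof_pitch_deg


def standoff_distance(square_side_m, fov_deg):
    """Distance at which a test square of the given side length exactly
    fills a camera field of view of fov_deg (pinhole model)."""
    return (square_side_m / 2.0) / math.tan(math.radians(fov_deg) / 2.0)


# For a 30-degree roof pitch, aim the camera 60 degrees below the horizon;
# a 3 m test square fills a 60-degree field of view from about 2.6 m away.
print(camera_pitch_deg(30.0))                   # 60.0
print(round(standoff_distance(3.0, 60.0), 2))   # 2.6
```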

Adaptive feedback timing system
12197307 · 2025-01-14

An adaptive feedback timing system and method includes receiving, by a performance observation system, monitoring data associated with electronically monitoring a lesson by a variable feedback teaching device. Adaptive feedback timing also includes receiving, by the performance observation system, error detection data associated with the variable feedback teaching device automatically detecting an error made by a student during the lesson. After receiving the error detection data, a feedback pattern is automatically selected based on a performance history criterion. Feedback data is then communicated to the variable feedback teaching device for presentation to the student according to the automatically selected feedback pattern.
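A minimal sketch of selecting a feedback pattern from a performance history criterion, as described above; the repeat-count threshold rule, pattern names, and error kinds are invented for illustration:

```python
def select_feedback_pattern(error_kind, performance_history, threshold=3):
    """Illustrative selection: interrupt the lesson with immediate feedback
    when the student has already made this kind of error `threshold` or
    more times in the logged history; otherwise defer feedback to the end
    of the lesson."""
    repeats = performance_history.count(error_kind)
    return "immediate" if repeats >= threshold else "deferred"


# Hypothetical error history logged by the variable feedback teaching device.
history = ["note_timing", "posture", "note_timing", "note_timing"]

print(select_feedback_pattern("note_timing", history))  # immediate
print(select_feedback_pattern("posture", history))      # deferred
```

The selected pattern would then be communicated back to the teaching device for presentation; only the selection step is sketched here.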

Perspective tracking system
09671876 · 2017-06-06

Resolution of perspective in three dimensions is necessary for intermeshing real players into simulated environments during virtual training exercises. With the advent of high-resolution image sensors, sensing position and orientation with image capture devices has become possible. The combination of small sensors and image-recognition tracking algorithms allows the tracking element to be placed directly on the device whose perspective is desired, solving the perspective problem with a direct measurement from the center axis of the observer. This invention employs a perspective tracking device to determine a point-of-gaze or a point-of-aim in three-dimensional space to a high degree of accuracy. Point-of-gaze may be used to determine views for head-mounted displays and aim-points for weapons. The invention may operate in an unconstrained space, allowing simulation participants to operate in a larger, open environment. Areas of interest in the environment are bounded by area-of-interest markers which identify each region and its physical constraints.
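Once the tracked center axis is known, computing a point-of-gaze or point-of-aim reduces to intersecting that axis with the plane of a marker-bounded area of interest. A simplified ray-plane sketch (the patent's actual marker-based tracking is not shown; coordinates and names are illustrative):

```python
def point_of_gaze(origin, direction, plane_point, plane_normal):
    """Intersect the observer's center-axis ray with the plane of an
    area of interest; return the 3-D gaze/aim point, or None if the ray
    is parallel to the plane or the plane lies behind the observer."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the area-of-interest plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None  # area of interest is behind the observer
    return tuple(o + t * d for o, d in zip(origin, direction))


# Observer's eye at 1.6 m height, looking straight ahead (+z) at a
# vertical display plane 5 m away.
gaze = point_of_gaze((0.0, 1.6, 0.0), (0.0, 0.0, 1.0),
                     (0.0, 0.0, 5.0), (0.0, 0.0, -1.0))
print(gaze)  # (0.0, 1.6, 5.0)
```

A full system would then test the intersection point against the physical bounds identified by the area-of-interest markers.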

THREE-DIMENSIONAL SIMULATION SYSTEM FOR GENERATING A VIRTUAL ENVIRONMENT INVOLVING A PLURALITY OF USERS AND ASSOCIATED METHOD
20170092223 · 2017-03-30

A three-dimensional simulation system for generating a virtual environment involving a plurality of users, and an associated method, are provided. The system includes a first sensor detecting a viewing direction of a first user, a computing unit configured to create a three-dimensional simulation of the virtual environment based on data received from the first sensor, and, for at least one second user, an immersive retrieval assembly for the virtual three-dimensional simulation created by the computing unit. The system further includes, for the first user, a second sensor detecting the position of part of an actual limb of the first user. The computing unit is configured to create, in the virtual three-dimensional simulation, an avatar of the first user comprising a virtual head and a virtual limb, reconstituted and oriented relative to one another based on data from the first sensor and the second sensor.

Simulator system for simulating weather

One embodiment of the present disclosure relates to a method of simulating a weather pattern for use in a simulator environment. The method includes receiving an input corresponding to a desired weather event and determining a set of weather parameters pertaining to the desired weather event. The method further includes searching a weather event database for a matching weather event. The weather event database includes weather data for a plurality of weather events. The method includes identifying the matching weather event. The matching weather event includes at least a portion of the set of weather parameters pertaining to the desired weather event. The method includes receiving weather data corresponding to the matching weather event. The method further includes creating a model of the matching weather event.
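The database search for a matching weather event might be sketched as a parameter-overlap score, consistent with the abstract's note that a match need only include a portion of the desired parameter set; the field names and scoring rule are illustrative assumptions:

```python
def find_matching_event(desired, database):
    """Score each stored weather event by how many of the desired weather
    parameters it shares, and return the best-scoring event (which may
    match only a portion of the desired set), or None if nothing matches."""
    def score(event):
        return sum(1 for k, v in desired.items() if event.get(k) == v)

    best = max(database, key=score)
    return best if score(best) > 0 else None


# Hypothetical weather event database.
database = [
    {"type": "thunderstorm", "wind_kts": 35, "visibility_sm": 2},
    {"type": "snow", "wind_kts": 10, "visibility_sm": 1},
]

# Desired event: matches the first entry on 2 of 3 parameters.
desired = {"type": "thunderstorm", "wind_kts": 35, "visibility_sm": 3}
match = find_matching_event(desired, database)
print(match["type"])  # thunderstorm
```

The returned event's weather data would then seed the model-creation step; only the search is sketched here.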

Reconfigurable image generator

A reconfigurable image generator (RiG) may simulate visual conditions of a real-world environment and generate the necessary number of pixels in a visual simulation at rates up to 120 frames per second. The RiG may also include a database generation system capable of producing visual databases suitable for driving the visual fidelity required by the RiG.