DEVICE FOR THE IMPLEMENTATION OF SERIOUS GAMES FOR THE PREVENTION AND/OR TREATMENT OF MENTAL DISORDERS
20220028291 · 2022-01-27
Inventors
CPC classification
G16H20/70 (PHYSICS)
A61M21/00 (HUMAN NECESSITIES)
A61B5/1107 (HUMAN NECESSITIES)
A61B5/165 (HUMAN NECESSITIES)
A63F13/424 (HUMAN NECESSITIES)
A63F13/213 (HUMAN NECESSITIES)
A61B5/0816 (HUMAN NECESSITIES)
A61M2205/505 (HUMAN NECESSITIES)
A61B5/6803 (HUMAN NECESSITIES)
A61M2230/04 (HUMAN NECESSITIES)
A63F13/217 (HUMAN NECESSITIES)
A61B5/4803 (HUMAN NECESSITIES)
A63F13/21 (HUMAN NECESSITIES)
A63F13/00 (HUMAN NECESSITIES)
A61M2230/005 (HUMAN NECESSITIES)
A63F13/211 (HUMAN NECESSITIES)
A61M2205/52 (HUMAN NECESSITIES)
A63F13/67 (HUMAN NECESSITIES)
G09B5/06 (PHYSICS)
G16H50/20 (PHYSICS)
A61M2230/04 (HUMAN NECESSITIES)
A61M2230/005 (HUMAN NECESSITIES)
A61B5/0205 (HUMAN NECESSITIES)
A63F13/212 (HUMAN NECESSITIES)
A61B5/318 (HUMAN NECESSITIES)
A61B5/398 (HUMAN NECESSITIES)
A61B5/395 (HUMAN NECESSITIES)
A63F13/69 (HUMAN NECESSITIES)
A61B2562/0219 (HUMAN NECESSITIES)
International classification
G09B5/06 (PHYSICS)
A61B5/00 (HUMAN NECESSITIES)
A61B5/0205 (HUMAN NECESSITIES)
A61B5/08 (HUMAN NECESSITIES)
A61B5/11 (HUMAN NECESSITIES)
A61B5/1455 (HUMAN NECESSITIES)
A61B5/16 (HUMAN NECESSITIES)
A61B5/318 (HUMAN NECESSITIES)
A61B5/395 (HUMAN NECESSITIES)
A61B5/398 (HUMAN NECESSITIES)
G16H20/70 (PHYSICS)
Abstract
Provided is a device for the implementation of serious games, i.e. for the presentation of digital games whose primary purpose is not entertainment but the delivery of therapeutic content in the form of images, films, colors, sounds, etc., for the treatment of mental disorders, although such games may well contain entertaining elements. The focus of interest is an authentic and credible, but also entertaining, learning experience in order to achieve a therapeutic result.
Claims
1. A device for the implementation of serious games for the prevention and/or treatment of mental disorders, comprising a. at least one facility for the representation of experiential spaces, b. sensors for the physiological measurement of body functions, c. facilities for collecting and storing environmental data, d. a module for recording and storing subjective raw data, e. a module for recording and storing therapeutic guideline values, f. facilities for evaluating the sensor and environmental data and the subjective raw data to calculate feedback coefficients, g. facilities for the automated interpretation of the feedback coefficients, taking into account the therapeutic guideline values, and for deriving a feedback decision, h. a module to control the facilities for displaying experiential spaces in response to the feedback decision, i. facilities for external monitoring of sensor data and feedback decisions, and j. facilities for external manipulation of feedback decisions.
2. The device according to claim 1, characterized in that the devices for calculating feedback coefficients are configured in such a way that the feedback coefficients are calculated repetitively at defined time intervals.
3. The device according to claim 1, characterized in that the sensors are those for measuring psychophysiological activities, selected from the group: a. cardiovascular activities, such as heart rate, heart rate variability, and respiration/blood oxygenation, via electrocardiogram (ECG) or optical sensor; b. brain activities, such as electrical events of cortical regions of the brain, via electroencephalography (EEG); c. muscle activities, such as muscle tension and respiratory activity, via electromyography (EMG); d. electrodermal activity, such as sweating and skin conductance; e. eye activities, such as eye movements and gaze direction, via optical eye tracking and electrooculography (EOG); f. motion activities, such as head movements and upper-body movements, via motion sensors; and g. measurement of voice patterns and breath sounds.
4. The device according to claim 1, characterized in that the means for displaying experience spaces comprise elements for providing audiovisual content and/or sensory stimuli.
5. The device according to claim 1, characterized in that the experience space display means comprises at least one audiovisual content generator.
6. The device according to claim 1, characterized in that a media library for storing audiovisual as well as sensory media resources is provided for playback via the devices for displaying experience spaces, wherein certain media contents are assigned to certain feedback coefficients.
7. The device according to claim 1, characterized in that the devices for evaluating the sensor and environmental data and the subjective raw data for calculating feedback coefficients, and the device for automated interpretation of the feedback coefficients and derivation of a feedback decision are each configured as a central device to which a plurality of devices have access for use in the therapy of anxiety disorders, wherein feedback coefficients as well as feedback decision are calculated on the basis of aggregated data of all connected devices.
8. The device according to claim 1, characterized in that the means for evaluating the sensor and environmental data and the raw data and for calculating feedback coefficients, and the means for automatically interpreting the feedback coefficients and deriving a feedback decision, are configured such that an approximation to predefined target parameters is achieved by a systematic, iterative variation of the feedback decision in response to successive feedback coefficients.
9. The device according to claim 7, characterized in that the identification of the respective sensor data as well as the subjective raw data for the calculation of the feedback coefficients is performed by a self-learning pattern recognition system accessing the data of the central device.
10. The device of claim 9, characterized in that the pattern recognition is performed by AI methods, clustering analysis, machine learning, and/or the use of artificial neural networks.
11. The device according to claim 8, characterized in that the identification of the respective sensor data as well as the subjective raw data for the calculation of the feedback coefficients is performed by a self-learning pattern recognition system accessing the data of the central device.
12. The device of claim 11, characterized in that the pattern recognition is performed by AI methods, clustering analysis, machine learning, and/or the use of artificial neural networks.
13. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as VR goggles, which are provided with sensors for physiological measurement of body functions.
14. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as simulators in which sensors for physiological measurement of body functions are provided.
15. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as augmented reality or mixed reality glasses, which are provided with sensors for physiological measurement of body functions.
16. The device according to claim 1, characterized in that the module for inputting subjective raw data is selected from the group consisting of recording voice commands and/or ambient sounds, recognizing directions of gaze, or recognizing gestures, querying multiple choice questions, querying binary indications, or indicating a range.
17. The device according to claim 1, characterized in that the means for analyzing the sensor and environmental data and the subjective raw data comprises means for analyzing speech and breathing sounds.
18. The device according to claim 1, characterized in that the environmental data acquisition devices comprise light sensors, cameras, motion sensors, accelerometers, and gyroscopes.
19. The device according to claim 1, characterized in that the device is locally or remotely connected to a plurality of devices of the same type via respective interfaces, for presenting an experience space common to all devices.
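Purely by way of illustration, and not as part of the claimed subject matter, the closed feedback loop of claim 1 could be sketched in code as follows. All class names, method names, weighting schemes, and threshold values here are assumptions made for the sketch; the patent does not disclose a concrete formula.

```python
# Hypothetical sketch of the feedback loop of claim 1 (illustrative only).
class FeedbackDevice:
    def __init__(self, guideline_values):
        self.guideline_values = guideline_values  # therapeutic guideline values (claim 1e)
        self.override = None                      # external manipulation hook (claim 1j)

    def compute_coefficient(self, sensor_data, environment_data, subjective_raw):
        # Evaluate sensor data, environmental data, and subjective raw data
        # to calculate a feedback coefficient (claim 1f). A plain average of
        # pre-normalized 0..1 values stands in for the undisclosed evaluation.
        values = (list(sensor_data.values())
                  + list(environment_data.values())
                  + list(subjective_raw.values()))
        return sum(values) / len(values)

    def derive_decision(self, coefficient):
        # Automated interpretation against therapeutic guideline values
        # (claim 1g), unless an external override has been set (claim 1j).
        if self.override is not None:
            return self.override
        if coefficient > self.guideline_values["max"]:
            return "calm_scene"   # hypothetical decision label
        return "continue"

device = FeedbackDevice({"max": 0.7})
coefficient = device.compute_coefficient(
    {"heart_rate_norm": 0.9},    # sensor data (claim 1b)
    {"noise_norm": 0.8},         # environmental data (claim 1c)
    {"self_report_norm": 0.7})   # subjective raw data (claim 1d)
decision = device.derive_decision(coefficient)
```

The external-manipulation hook (claim 1j) is modeled as a simple attribute: setting `device.override` bypasses the automated interpretation, mirroring the practitioner intervention described later in the specification.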
Description
BRIEF DESCRIPTION OF THE FIGURES
[0048] Further features and advantages of the invention will be apparent from the following, purely descriptive and in no way limiting, description of a preferred embodiment of the invention with reference to the accompanying drawings; therein showing:
[0049]
[0050]
[0051]
DETAILED DESCRIPTION OF THE INVENTION
[0052]
[0053] On the face pad 14, in the illustrated embodiment, an optical heart rate sensor 16 is arranged for measuring the heart rate on the forehead of the person. From the heart rate sensor 16, a cable 18 extends along the upper tether 8 to a signal processing unit 20 comprising a signal processing chip 22 in a box 24 attached to the tether 8. Further, a USB cable 26 is provided for connection to a smartphone 28, which supplies power to the signal processing chip 22 and the heart rate sensor 16, with all data passing from the heart rate sensor 16 into the signal processing chip 22 and then into the smartphone 28 via the USB cable 26.
[0054] The smartphone 28 is provided with devices (not shown) for the acquisition and storage of environmental data, in this case microphones 30 and light sensors 32; with a module for the acquisition and storage of subjective raw data; a module for the acquisition and storage of therapeutic guideline values; devices for evaluating the data of the heart rate sensor 16 and of the signal processing chip 22, the environmental data of the sensors 30 and 32, and the subjective raw data in order to calculate feedback coefficients; means for the automated interpretation of the feedback coefficients, taking into account the therapeutic guideline values, and for deriving a feedback decision; and a module for controlling the VR goggles 2 in response to the feedback decision.
[0055] The
[0056] As can be further seen from
[0057] Finally,
[0058] The first step 34 is data collection: entering subjective raw data in the self-report (questionnaires, condition report) and collecting body data with sensors in the VR goggles 2 (ECG—heart rate, HRV; EEG—e.g. PFC activity; EMG, SCL, eye tracking, head movements, gaze direction).
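A record produced by data-collection step 34 could, purely as an illustrative sketch, be modeled as follows; the field selection is an assumption drawn from the sensor list above, not a data format defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class RawSample:
    """One hypothetical data-collection record for step 34."""
    self_report: dict        # subjective raw data (questionnaires, condition report)
    heart_rate: float        # ECG-derived heart rate in bpm
    hrv: float               # heart rate variability
    scl: float               # skin conductance level
    gaze_direction: tuple    # from eye tracking in the VR goggles 2

# Example record with made-up values:
sample = RawSample({"anxiety": 3}, 82.0, 45.0, 2.1, (0.1, -0.2))
```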
[0059] This raw data is passed on in a next step 36 for processing by an evaluation algorithm 38, which evaluates the data received in step 36 online at intervals of x seconds and calculates a coefficient (e.g., a tension coefficient, anxiety coefficient, pain coefficient . . . ).
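The windowed evaluation of step 36 could be sketched as follows; the patent does not disclose the actual formula of evaluation algorithm 38, so the mapping and its resting/peak calibration bounds are assumptions made purely for illustration.

```python
# Illustrative sketch: every x seconds, the heart-rate samples collected in
# the last window are reduced to a single tension coefficient in 0..1.
def tension_coefficient(hr_window):
    """Map a window of heart-rate samples (bpm) to a 0..1 tension coefficient."""
    rest, peak = 60.0, 120.0                    # assumed calibration bounds (bpm)
    mean_hr = sum(hr_window) / len(hr_window)   # average over the x-second window
    # Linear normalization, clipped to the 0..1 range:
    return min(1.0, max(0.0, (mean_hr - rest) / (peak - rest)))
```

A resting window maps to 0.0, a peak window to 1.0, and intermediate windows scale linearly between the two assumed bounds.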
[0060] The calculated coefficient or coefficients are then passed on in a step 40 for processing by a decision algorithm 42, which interprets the coefficients and derives a feedback decision (AI-based; it learns from the data of all sensors across interconnected devices according to the invention and of all users).
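The interpretation performed by decision algorithm 42 could, as a minimal non-learning sketch, be expressed as threshold rules against therapeutic guideline values; the coefficient names, thresholds, and decision labels are all assumptions, and the AI-based learning described in the text is not modeled here.

```python
# Hypothetical rule-based stand-in for decision algorithm 42.
def decide(coefficients, guidelines):
    """Interpret feedback coefficients against guideline values
    and derive a feedback decision (labels are illustrative)."""
    if coefficients["anxiety"] > guidelines["anxiety_max"]:
        return "reduce_intensity"      # patient overstrained: de-escalate
    if coefficients["tension"] < guidelines["tension_min"]:
        return "increase_difficulty"   # patient under-challenged: escalate
    return "hold"                      # within the therapeutic corridor

guidelines = {"anxiety_max": 0.8, "tension_min": 0.2}
```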
[0061] The result of the interpretation of the coefficients by the decision algorithm 42 is passed, as the derived feedback decision, in step 44 to a feedback system 46 for outputting sensory feedback via the VR goggles 2 based on the decision of the decision algorithm 42 (auditory, visual, haptic, or electrical feedback; a change of software conditions, e.g. an increase in difficulty; or feedback to external software, e.g. an e-mail to the practitioner or the triggering of a notification).
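The dispatch performed by feedback system 46 could be sketched as a lookup from decision labels to output channels; the mapping below is an illustrative assumption, not a disclosed configuration.

```python
# Hypothetical dispatch table for feedback system 46, mapping a feedback
# decision to the output channels named in the text (auditory, visual,
# software-state change, external notification).
def dispatch_feedback(decision):
    actions = {
        "reduce_intensity": ["play_calming_audio", "dim_scene"],
        "increase_difficulty": ["raise_game_difficulty"],
        "notify_practitioner": ["send_email_to_practitioner"],
    }
    return actions.get(decision, [])   # unknown decisions trigger no output
```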
[0062] Data can also be permanently analyzed and shared with a practitioner in a medical setting, see reference numeral 48; the practitioner can influence the decision algorithm 42 if needed or initiate feedback manually via the feedback system 46.