DEVICE FOR THE IMPLEMENTATION OF SERIOUS GAMES FOR THE PREVENTION AND/OR TREATMENT OF MENTAL DISORDERS

20220028291 · 2022-01-27

    Abstract

    Provided is a device for the implementation of serious games, i.e., for the presentation of digital games whose purpose is not entertainment but the mediation of therapeutic content in the form of images, films, colors, sounds, and the like, for the prevention and/or treatment of mental disorders; such games may nevertheless contain entertaining elements. The focus of interest is an authentic, credible, but also entertaining learning experience in order to achieve a therapeutic result.

    Claims

    1. A device for the implementation of serious games for the prevention and/or treatment of mental disorders, comprising:
    a. at least one facility for the representation of experiential spaces;
    b. sensors for physiological measurement of body functions;
    c. facilities for collecting and storing environmental data;
    d. a module for recording and storing subjective raw data;
    e. a module for recording and storing therapeutic guideline values;
    f. facilities for evaluating the sensor data, the environmental data, and the subjective raw data to calculate feedback coefficients;
    g. facilities for automated interpretation of the feedback coefficients, taking into account the therapeutic guideline values, and for deriving a feedback decision;
    h. a module to control the facilities for displaying experiential spaces in response to the feedback decision;
    i. facilities for external monitoring of sensor data and feedback decisions; and
    j. facilities for external manipulation of feedback decisions.

    2. The device according to claim 1, characterized in that the devices for calculating feedback coefficients are configured in such a way that the feedback coefficients are calculated repetitively at defined time intervals.

    3. The device according to claim 1, characterized in that the sensors are those for measuring psychophysiological activities, selected from the group:
    a. cardiovascular activities, such as heart rate, heart rate variability, and respiration/oxygenation of blood, via electrocardiogram (ECG) or optical sensor;
    b. brain activities, such as electrical events of cortical regions of the brain, via electroencephalography (EEG);
    c. muscle activities, such as muscle tension and respiratory activity, via electromyography (EMG);
    d. electrodermal activities, such as sweating and skin conductance;
    e. eye activities, such as eye movements and gaze direction, via optical eye tracking and electrooculography (EOG);
    f. motion activities, such as head movements and upper-body movements, via motion sensors; and
    g. measurement of voice patterns and breath sounds.

    4. The device according to claim 1, characterized in that the means for displaying experience spaces comprise elements for providing audiovisual content and/or sensory stimuli.

    5. The device according to claim 1, characterized in that the experience space display means comprises at least one audiovisual content generator.

    6. The device according to claim 1, characterized in that a media library for storing audiovisual as well as sensory media resources is provided for playback via the devices for displaying experience spaces, wherein certain media contents are assigned to certain feedback coefficients.

    7. The device according to claim 1, characterized in that the devices for evaluating the sensor and environmental data and the subjective raw data for calculating feedback coefficients, and the device for automated interpretation of the feedback coefficients and derivation of a feedback decision, are each configured as a central device to which a plurality of devices have access for use in the therapy of anxiety disorders, wherein the feedback coefficients as well as the feedback decision are calculated on the basis of aggregated data of all connected devices.

    8. The device according to claim 1, characterized in that the means for evaluating the sensor and environmental data and the subjective raw data and for calculating feedback coefficients, and the means for automatically interpreting the feedback coefficients and deriving a feedback decision, are configured such that an approximation to predefined target parameters is made by a systematic, iterative variation of the feedback decision in response to successive feedback coefficients.

    9. The device according to claim 7, characterized in that the identification of the respective sensor data as well as the subjective raw data for the calculation of the feedback coefficients is performed by a self-learning pattern recognition system accessing the data of the central device.

    10. The device of claim 9, characterized in that the pattern recognition is performed by AI methods, clustering analysis, machine learning, and/or the use of artificial neural networks.

    11. The device according to claim 8, characterized in that the identification of the respective sensor data as well as the subjective raw data for the calculation of the feedback coefficients is performed by a self-learning pattern recognition system accessing the data of the central device.

    12. The device of claim 11, characterized in that the pattern recognition is performed by AI methods, clustering analysis, machine learning, and/or the use of artificial neural networks.

    13. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as VR goggles, which are provided with sensors for physiological measurement of body functions.

    14. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as simulators in which sensors for physiological measurement of body functions are provided.

    15. The device according to claim 1, characterized in that the devices for displaying experience spaces are designed as augmented reality or mixed reality glasses, which are provided with sensors for physiological measurement of body functions.

    16. The device according to claim 1, characterized in that the module for inputting subjective raw data is selected from the group consisting of: recording voice commands and/or ambient sounds, recognizing gaze directions, recognizing gestures, querying multiple-choice questions, querying binary indications, and indicating a value on a range.

    17. The device according to claim 1, characterized in that the means for analyzing the sensor and environmental data and the subjective raw data comprises means for analyzing speech and breathing sounds.

    18. The device according to claim 1, characterized in that the environmental data acquisition devices comprise light sensors, cameras, motion sensors, accelerometers, and gyroscopes.

    19. The device according to claim 1, characterized in that the device is locally or remotely connected to a plurality of devices of the same type via respective interfaces, for presenting an experience space common to all devices.

    Description

    BRIEF DESCRIPTION OF THE FIGURES

    [0048] Further features and advantages of the invention will become apparent from the following description, which is purely illustrative and in no way limiting, of a preferred embodiment of the invention with reference to the accompanying drawings, which show:

    [0049] FIG. 1 the schematic front view of an embodiment of the device for displaying experience spaces of a device for conducting serious games;

    [0050] FIG. 2 the schematic rear view of the device according to FIG. 1; and

    [0051] FIG. 3 the schematic sequence of operation of the device for performing serious games.

    DETAILED DESCRIPTION OF THE INVENTION

    [0052] FIG. 1 shows a schematic front view of an embodiment of the device for displaying experience spaces of a device for performing serious games. As can be seen, the embodiment shown is a pair of VR goggles 2, in this case comprising a screen or goggle portion 4 with a face pad 14, provided with three retaining straps 6, 8 and 10 for placement on a person's head so as to arrange the screen portion directly in front of the person's eyes. A three-way buckle 12 adjustably connects the three straps 6, 8 and 10.

    [0053] On the face pad 14, in the illustrated embodiment, an optical heart rate sensor 16 is arranged for measuring the heart rate on the forehead of the person. From the heart rate sensor 16, a cable 18 extends along the upper strap 8 to a signal processing unit 20, comprising a signal processing chip 22 in a box 24 attached to the strap 8. Further, a USB cable 26 is provided for connection to a smartphone 28, which supplies power to the signal processing chip 22 and the heart rate sensor 16, with all data passing from the heart rate sensor 16 into the signal processing chip 22 and then into the smartphone 28 via the USB interface 26.

    [0054] The smartphone 28 is provided with devices (not shown) for the acquisition and storage of environmental data, in this case microphones 30 and light sensors 32; with a module for the acquisition and storage of subjective raw data; with a module for the acquisition and storage of therapeutic guideline values; with devices for evaluating the data of the heart rate sensor 16 and of the signal processing chip 22, the environmental data of the sensors 30 and 32, and the subjective raw data in order to calculate feedback coefficients; with means for automated interpretation of the feedback coefficients, taking into account the therapeutic guideline values, and for deriving a feedback decision; and with a module for controlling the VR goggles 2 in response to the feedback decision.

    [0055] FIG. 2 shows the rear view of the device according to FIG. 1, in which the same reference signs denote the same elements. In particular, FIG. 2 shows the arrangement of the heart rate sensor 16 on the face pad 14, in such a way that the heart rate sensor 16 comes to rest on the forehead of the person wearing the VR goggles. In the embodiment shown, the heart rate sensor 16 is equipped with a light emitter and a receiver.

    [0056] As can be further seen from FIG. 2, the VR goggles 2 are here provided with two screens 2a and 2b, for generating a stereo image for displaying experiential spaces, where these experiential spaces are generated by assigning certain media content to certain feedback coefficients, i.e., a certain image or movie to a certain stress level.

    [0057] Finally, FIG. 3 shows in a schematic and illustrative manner the sequence of operation of the device according to FIGS. 1 and 2 for carrying out serious games according to the invention.

    [0058] The first step 34 is data collection: entering subjective raw data in the self-report (questionnaires, condition report) and collecting body data with sensors in the VR goggles 2 (ECG—heart rate, HRV; EEG—e.g. PFC activity; EMG, SCL, eye tracking, head movements, gaze direction).
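    The acquisition cycle of step 34 combines objective sensor readings with subjective self-report answers. A minimal sketch of such a record, assuming invented field names (the patent does not specify a data format), could look as follows:

```python
from dataclasses import dataclass, field

@dataclass
class RawSample:
    """Illustrative container for one acquisition cycle (step 34):
    sensor readings from the VR goggles plus subjective self-report
    answers. All field names are hypothetical."""
    timestamp_s: float
    rr_intervals_ms: list                             # ECG: heart rate, HRV
    eeg_pfc_uV: list = field(default_factory=list)    # EEG, e.g. PFC activity
    skin_conductance_uS: float = 0.0                  # electrodermal (SCL)
    gaze_direction: tuple = (0.0, 0.0)                # eye tracking
    self_report: dict = field(default_factory=dict)   # questionnaire answers
```

Each `RawSample` would then be handed to the evaluation stage of step 36.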

    [0059] This raw data is passed on in a next step 36 for processing by an evaluation algorithm 38, which evaluates the data received in step 36 online at x second intervals and calculates a coefficient (e.g., tension coefficient, anxiety coefficient, pain coefficient . . . ).
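    As one hedged illustration of such an evaluation algorithm, a tension coefficient could be derived from ECG RR intervals, reading an elevated heart rate and reduced heart-rate variability relative to a resting baseline as increased tension. The baselines and the equal weighting below are placeholder assumptions, not values from the patent:

```python
from statistics import mean, pstdev

def tension_coefficient(rr_intervals_ms, baseline_hr=70.0, baseline_sdnn=50.0):
    """Illustrative tension coefficient in [0, 1] from ECG RR intervals.
    Baselines and the 0.5/0.5 weighting are hypothetical placeholders."""
    hr = 60000.0 / mean(rr_intervals_ms)   # mean heart rate in beats per minute
    sdnn = pstdev(rr_intervals_ms)         # SDNN as a simple HRV proxy (ms)
    hr_term = max(0.0, min(1.0, (hr - baseline_hr) / baseline_hr))
    hrv_term = max(0.0, min(1.0, (baseline_sdnn - sdnn) / baseline_sdnn))
    return 0.5 * hr_term + 0.5 * hrv_term
```

In operation, the evaluation algorithm 38 would recompute such a coefficient on the most recent window of samples at each x-second interval.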

    [0060] The calculated coefficient or coefficients are then passed on in a step 40 for processing by a decision algorithm 42 to interpret the coefficients and derive a feedback decision (AI, learns from data from all sensors across interconnected devices according to the invention and all users).
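    A minimal sketch of such a decision rule, assuming a simple threshold comparison against a therapeutic guideline value (the patent leaves the interpretation logic to an AI, so this rule is purely illustrative):

```python
def feedback_decision(coefficient, target, tolerance=0.05):
    """Map a feedback coefficient to a decision relative to a therapeutic
    guideline value. Hypothetical rule: ease the experience when the user
    is above target, intensify it when well below, otherwise hold."""
    if coefficient > target + tolerance:
        return "ease"       # e.g. switch to a calmer scene, lower difficulty
    if coefficient < target - tolerance:
        return "intensify"  # e.g. advance to the next exposure level
    return "hold"
```

The iterative variation of the feedback decision in response to successive coefficients (claim 8) would emerge from re-running this rule at each evaluation interval.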

    [0061] The result of the interpretation of the coefficients by the decision algorithm 42, i.e., the derived feedback decision, is passed in step 44 to a feedback system 46 for outputting sensory feedback via the VR goggles 2 on the basis of the decision of the decision algorithm 42 (auditory, visual, haptic, or electrical feedback; changing the conditions of a piece of software, e.g., increasing the difficulty; or feedback to external software, e.g., an e-mail to the practitioner or the triggering of a notification).
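    The feedback system 46 can be thought of as a lookup from decisions into the media library of claim 6, plus optional notification of external software. The asset names and the `notify` callback below are invented for the sketch:

```python
# Hypothetical media library: feedback decisions mapped to VR scene assets
# (claim 6 assigns certain media contents to certain feedback coefficients;
# the concrete file names here are illustrative only).
MEDIA_LIBRARY = {
    "ease":      {"scene": "beach_sunset.mp4", "audio": "slow_waves.ogg"},
    "intensify": {"scene": "crowded_hall.mp4", "audio": "ambient_crowd.ogg"},
    "hold":      {"scene": None, "audio": None},  # keep the current content
}

def apply_feedback(decision, notify=None):
    """Select media for the VR goggles and optionally notify external
    software (e.g. a message to the practitioner), as in steps 44 and 46."""
    media = MEDIA_LIBRARY[decision]
    if notify is not None:
        notify(f"feedback decision: {decision}")
    return media
```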

    [0062] Data can also be permanently analyzed and shared with a practitioner in a medical setting (see reference numeral 48); the practitioner can influence the decision algorithm 42 if needed or initiate feedback manually via the feedback system 46.
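    The external monitoring and manipulation facilities (claim 1, items i. and j.) can be sketched as a thin controller in which a practitioner-set override, a hypothetical mechanism for this sketch, takes precedence over the automated decision while every applied decision is logged for review:

```python
class FeedbackController:
    """Minimal sketch of external monitoring and override: the automated
    decision applies unless a practitioner has pinned a manual one."""

    def __init__(self):
        self.override = None  # practitioner-set decision, if any
        self.log = []         # monitored history of applied decisions

    def set_override(self, decision):
        """Practitioner intervention (claim 1 j.)."""
        self.override = decision

    def clear_override(self):
        self.override = None

    def decide(self, automated_decision):
        """Return the decision to apply and record it (claim 1 i.)."""
        chosen = self.override if self.override is not None else automated_decision
        self.log.append(chosen)
        return chosen
```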