REALTIME EVALUATION METHOD AND SYSTEM FOR VIRTUAL REALITY IMMERSION EFFECT
20220039715 · 2022-02-10
Inventors
CPC classification
A63F13/212
HUMAN NECESSITIES
G06F2203/011
PHYSICS
G16H20/70
PHYSICS
A61B5/165
HUMAN NECESSITIES
G06F3/011
PHYSICS
A63F13/5255
HUMAN NECESSITIES
G16H50/30
PHYSICS
A61B5/374
HUMAN NECESSITIES
A63F13/428
HUMAN NECESSITIES
A63F13/211
HUMAN NECESSITIES
International classification
A61B5/16
HUMAN NECESSITIES
A61B5/374
HUMAN NECESSITIES
G06F17/14
PHYSICS
Abstract
A realtime evaluation method and system for a virtual reality (VR) immersion effect are provided. An electroencephalogram signal is collected while a VR video is played; a degree of emotional arousal and a degree of cognitive absorption are calculated in real time based on the energy of the frequency bands α, β, and θ obtained after wavelet transform; and finally, an immersion effect index is dynamically monitored for objective evaluation of the immersion effect. The method and system realize realtime measurement and analysis, dynamically monitor a VR video immersion effect, adopt a multi-dimensional comprehensive calculation strategy that takes individual differences into account when calculating the index, effectively resolve problems of questionnaire-based and other measurement means such as after-the-fact reporting, social desirability bias, and strong subjectivity, and have broad market application prospects.
Claims
1. A realtime evaluation method for a virtual reality (VR) immersion effect, comprising the following steps: S1: equipping a participant with a VR imaging apparatus and a multi-channel electroencephalogram measurement device, and collecting an electroencephalogram signal of the participant by the multi-channel electroencephalogram measurement device, wherein when collecting the electroencephalogram signal, based on the international 10-20 system and related brain region distribution, electrodes F7 and F8 are selected for calculating a degree of emotional arousal, and electrodes FP1, FPz, and FP2 are selected for calculating a degree of cognitive absorption; S2: collecting an electroencephalogram signal of the participant in a resting state, and performing preprocessing and wavelet transform in sequence on the collected electroencephalogram signal to obtain a mean degree of emotional arousal and a mean degree of cognitive absorption in the resting state, wherein the formula for calculating the mean degree of emotional arousal and the formula for the mean degree of cognitive absorption are as follows:
E.sub.i=λ*A.sub.i+(1−λ)F.sub.i, wherein λ∈[0, 1] and its specific value is determined based on the characteristics of the video content; and S7: evaluating and grading the immersion effect based on a time-varying curve of the immersion index, wherein the immersion effect is evaluated as sufficient when the immersion coefficient E.sub.i≥0.8, as good when 0.4≤E.sub.i<0.8, and as insufficient when E.sub.i<0.4.
2. A realtime evaluation system based on the realtime evaluation method for the VR immersion effect according to claim 1, comprising: the VR imaging apparatus, configured to display the 3D video as the visual stimulus source; the multi-channel electroencephalogram measurement device, configured to collect the electroencephalogram signal of the participant, and transmit the electroencephalogram signal to a VR immersion effect calculation unit; and the VR immersion effect calculation unit comprising an electroencephalogram signal preprocessing module, an immersion effect calculation module, and a data visualization module, wherein the electroencephalogram signal preprocessing module is configured to perform preprocessing on collected electroencephalogram information, and the preprocessing comprises signal amplification and filtering; the immersion effect calculation module is configured to perform wavelet transform on preprocessed electroencephalogram data to obtain rhythm waves and energy of different frequency bands α, β, and θ, calculate the degree of emotional arousal and the degree of cognitive absorption in real time, and generate the immersion index; and the data visualization module is configured to display the time-varying curve of the immersion index.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0033] The present disclosure is described in detail with reference to accompanying drawings and preferred embodiments, to make the objective and effect of the present disclosure clearer. It should be understood that the specific embodiments described herein are merely intended to illustrate the present disclosure and are not intended to limit the present disclosure.
[0034] As shown in the accompanying drawings, the realtime evaluation system includes a VR imaging apparatus, a multi-channel EEG measurement device, and a VR immersion effect calculation unit.
[0035] The VR imaging apparatus is configured to build a VR video scene platform and display a 3D video as the visual stimulus source. A professional product such as the HTC Vive Pro is usually selected to reduce the impact of the screen-door effect.
[0036] The multi-channel EEG measurement device is configured to collect an EEG signal of a participant and transmit it to the VR immersion effect calculation unit. The device includes a 16-channel electrode cap, a junction box, an amplifier, a computer host, and the like. In the measurement process, the voltage error is not greater than ±10%, the input noise is ≤0.5 μV RMS, and the common-mode rejection ratio is greater than 110 dB; high sensitivity and a strong anti-interference capability are required, and there is no time delay, so that the EEG signal of the participant is collected accurately in real time. Sampled data is stored and backed up after analog-to-digital conversion by the amplifier. The VR device transmits a time stamp to the EEG data by using the host over TCP/IP, so that the video is aligned with the EEG data for subsequent offline export and analysis. The host used by the system is configured as follows: CPU: Intel Core i7-9700 or higher; GPU: NVIDIA GeForce RTX 2080 Super or higher; memory: 32 GB RAM or above. In addition, EEG data of a certain duration is packaged and uploaded to the VR immersion effect calculation unit in real time.
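The time-stamp handoff from the VR host to the EEG data over TCP/IP described above can be sketched as follows. The 8-byte network-order message format, the helper names, and the loopback demonstration are illustrative assumptions; the patent does not specify the wire protocol.

```python
import socket
import struct
import threading

def recv_exact(conn, n):
    """Read exactly n bytes from the connection, or None on EOF."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf

def eeg_recorder(server_sock, received):
    """EEG-host side: accept one connection and collect time stamps."""
    conn, _ = server_sock.accept()
    with conn:
        while True:
            data = recv_exact(conn, 8)
            if data is None:
                break
            (ts,) = struct.unpack("!d", data)  # network-order double: video time in s
            received.append(ts)

def send_timestamps(port, stamps):
    """VR-host side: push each video playback time stamp to the recorder."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        for ts in stamps:
            sock.sendall(struct.pack("!d", ts))

# Loopback demonstration of the exchange.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # ephemeral port; the real port is not specified
server.listen(1)
received = []
worker = threading.Thread(target=eeg_recorder, args=(server, received))
worker.start()
send_timestamps(server.getsockname()[1], [0.0, 1.0, 2.0])
worker.join()
server.close()
```

Each received time stamp can then be attached to the EEG samples of the corresponding second for offline alignment.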
[0037] The VR immersion effect calculation unit includes an EEG signal preprocessing module, an immersion effect calculation module, and a data visualization module. Using the amplifier, a notch filter, and a low-pass filter, the EEG signal preprocessing module performs data filtering, reorganization, interference removal, and artifact removal on the collected EEG data, including removal of noise interference caused by eye movement, head movement, and swallowing. The immersion effect calculation module performs wavelet transform on the preprocessed EEG data to obtain rhythm waves and the energy of the frequency bands α, β, and θ, calculates the degree of emotional arousal and the degree of cognitive absorption in real time, generates the immersion index, and sends the related index values to the data visualization module in real time. The data visualization module displays the time-varying curve of the immersion index, helping a user monitor the VR video immersion effect and make scientific decisions.
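A minimal sketch of the filtering stage of the preprocessing module, assuming a 50 Hz mains notch, a 45 Hz low-pass cut-off, and a 250 Hz sampling rate (none of these values is fixed by the patent):

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 250.0  # assumed EEG sampling rate in Hz

def preprocess(eeg, fs=FS, mains=50.0, lowpass=45.0):
    """Notch out mains interference, then low-pass one EEG channel."""
    b_n, a_n = iirnotch(mains, Q=30.0, fs=fs)          # 50 Hz mains notch
    b_l, a_l = butter(4, lowpass, btype="low", fs=fs)  # 4th-order low-pass
    x = filtfilt(b_n, a_n, eeg)   # zero-phase filtering introduces no time delay
    return filtfilt(b_l, a_l, x)

# Example: a 10 Hz alpha-band tone contaminated with 50 Hz mains noise.
t = np.arange(0, 2, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)
clean = preprocess(raw)
```

After filtering, the spectrum of `clean` retains the 10 Hz component while the 50 Hz component is strongly attenuated.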
[0038] As shown in the accompanying drawings, the realtime evaluation method includes the following steps.
[0039] At step S1, a participant is equipped with a VR imaging apparatus and a multi-channel EEG measurement device, and an EEG signal of the participant is collected by the multi-channel EEG measurement device.
[0040] The impedance of each electrode is kept below 5 kΩ by injecting conductive gel into the 16-channel electrode cap. Electrode positions are determined based on the international 10-20 system.
[0041] At step S2, an EEG signal of the participant in a resting state is collected, and preprocessing and wavelet transform are performed on the collected EEG signal in sequence to obtain a mean degree of emotional arousal and a mean degree of cognitive absorption in the resting state.
[0042] First, 60 seconds of EEG signals are recorded while the participant views a blank screen, and the sampling frequency is 1 Hz. Then, the resting-state EEG signals are preprocessed in real time by the amplifier, a notch filter, and a low-pass filter to remove noise interference caused by eye movement, head movement, and swallowing, yielding preprocessed EEG data. Wavelet transform is performed on the preprocessed resting-state EEG data to obtain rhythm waves and the energy of the frequency bands α, β, and θ. The mean degree L.sub.A of emotional arousal and the mean degree L.sub.F of cognitive absorption in the resting state are calculated according to the following formulas:
[0043] Electrodes F7 and F8 are selected for calculating the degree of emotional arousal, and electrodes FP1, FPz, and FP2 are selected for calculating the degree of cognitive absorption. In the foregoing formulas, i represents time, namely, the i.sup.th second, and j represents the EEG acquisition position, namely, the j.sup.th electrode.
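The per-second rhythm-wave energies underlying these baselines can be estimated as below. The patent specifies wavelet transform; for brevity, this sketch substitutes an FFT-based band-energy estimate, and the conventional band limits θ: 4–8 Hz, α: 8–13 Hz, β: 13–30 Hz are taken as assumptions.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_energies(window, fs=FS):
    """Return {band: energy} for one EEG window (FFT-based estimate)."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    return {name: float(power[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

def resting_baseline(windows, fs=FS):
    """Mean band energies over the resting-state windows (e.g. 60 one-second windows)."""
    per_window = [band_energies(w, fs) for w in windows]
    return {b: float(np.mean([e[b] for e in per_window])) for b in BANDS}

# Example: a pure 10 Hz tone should place nearly all energy in the alpha band.
t = np.arange(0, 1, 1 / FS)
alpha_tone = np.sin(2 * np.pi * 10 * t)
energies = band_energies(alpha_tone)
baseline = resting_baseline([alpha_tone] * 3)
```

The same per-window energies feed both the resting-state baseline (S2) and the realtime calculation (S4/S5).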
[0044] At step S3, a 3D video is played to the participant as the visual stimulus source by the VR imaging apparatus, placing the participant in a virtual environment with relatively closed vision and hearing, while the original EEG signal is acquired synchronously.
[0045] At step S4, realtime preprocessing and wavelet transform are performed on original EEG signals of electrodes located in prefrontal and anterior temporal regions, to obtain the rhythm waves and the energy of the frequency bands α, β, and θ.
[0046] At step S5, the realtime degree A.sub.i of emotional arousal and the realtime degree F.sub.i of cognitive absorption are calculated based on the results of S2 and S4 according to the following formulas:
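The patent's exact formulas for A.sub.i and F.sub.i are not reproduced in this text. As a purely hypothetical stand-in for illustration, the sketch below uses the well-known β/(α+θ) engagement ratio, normalized by the resting-state baseline means; this is an assumption, not the patented formula.

```python
def engagement_ratio(beta, alpha, theta):
    """HYPOTHETICAL stand-in: beta/(alpha+theta) engagement ratio.
    The patent's actual formulas are omitted from this text."""
    return beta / (alpha + theta)

def realtime_degree(ratio_i, resting_mean):
    """Express the i-th second's ratio relative to the resting-state mean,
    so that individual baseline differences are factored out."""
    return ratio_i / resting_mean

# Example: equal band energies give a ratio of 1.0; a value twice the
# resting mean yields a realtime degree of 2.0.
r = engagement_ratio(2.0, 1.0, 1.0)
d = realtime_degree(3.0, 1.5)
```

Normalizing by the resting baseline is one way to account for the individual differences mentioned in the abstract.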
[0047] At step S6, the immersion index E.sub.i is calculated in real time based on the degree of emotional arousal and the degree of cognitive absorption from S5 according to the following formula:
E.sub.i=λ*A.sub.i+(1−λ)F.sub.i.
[0048] In the foregoing formula, λ∈[0, 1], and its specific value is determined based on the characteristics of the VR video. For emotionally soothing video material, for example natural scenery or a garden landscape, λ<0.4 is recommended. For video material with rich emotions, for example a life clip or a daily conversation, 0.4≤λ<0.6 is recommended. For video material with strong emotional stimulation, for example a movie climax, a game scene, or a simulation animation, λ≥0.6 is recommended.
[0049] At step S7, the immersion effect is evaluated and graded based on the time-varying curve of the immersion index, where the immersion effect is evaluated as sufficient when the immersion coefficient E.sub.i≥0.8, as good when 0.4≤E.sub.i<0.8, and as insufficient when E.sub.i<0.4.
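Steps S6 and S7 can be written out directly. The representative λ values per content type below are illustrative picks inside the recommended ranges of paragraph [0048].

```python
# Representative λ per content type, chosen within the recommended ranges.
LAMBDA_BY_CONTENT = {
    "soothing": 0.3,   # natural scenery, garden landscape (λ < 0.4)
    "everyday": 0.5,   # life clips, daily conversation (0.4 ≤ λ < 0.6)
    "intense": 0.7,    # movie climax, game scene, simulation (λ ≥ 0.6)
}

def immersion_index(a_i, f_i, lam):
    """Step S6: E_i = λ·A_i + (1−λ)·F_i, with λ in [0, 1]."""
    assert 0.0 <= lam <= 1.0
    return lam * a_i + (1.0 - lam) * f_i

def grade(e_i):
    """Step S7: grade the immersion coefficient E_i."""
    if e_i >= 0.8:
        return "sufficient"
    if e_i >= 0.4:
        return "good"
    return "insufficient"

# Example: an intense game scene with A_i = 1.0 and F_i = 0.5.
e = immersion_index(1.0, 0.5, LAMBDA_BY_CONTENT["intense"])
label = grade(e)
```

Plotting `immersion_index` per second yields the time-varying curve displayed by the data visualization module.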
[0050] A person of ordinary skill in the art can understand that the above descriptions are only preferred examples of the present disclosure and are not intended to limit the present disclosure. Although the present disclosure is described in detail with reference to the foregoing examples, a person skilled in the art can still make modifications to the technical solutions described in the foregoing examples, or make equivalent replacement to some technical characteristics. Any modifications and equivalent substitutions made within the spirit and scope of the present disclosure should be included within the protection scope of the present disclosure.