Patent classifications
A61B5/378
MODULATION OF THE THETA-GAMMA NEURAL CODE WITH CONTROLLED LIGHT THERAPEUTICS
Gamma brain stimulation (around 40 Hz) is performed using light pulses. To perform theta brain stimulation (around 7 Hz) without perceptible flicker, the light source is also strobed at 47 Hz (also within the gamma range). The brain perceives the 40 Hz stimulation together with a 7 Hz difference ("subtraction") frequency in the theta range. The combined gamma and theta stimulation of the brain may be used for preventing or treating brain disease or sleep disorders. The particular stimulation frequencies and their phases create neuronal theta-gamma coupling in the brain, which has been shown to have positive effects on memory, Alzheimer's disease, motor skills, and other functions. Other gamma and theta frequencies that create theta-gamma coupling in the brain are also beneficial. The phase of the light pulses is also dynamically controlled using feedback to maximize theta-gamma coupling in the brain.
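The 47 − 40 = 7 Hz "subtraction" (beat) frequency can be checked numerically. A minimal sketch, assuming sinusoidal stand-ins for the two strobed light components and a rectifying nonlinearity as a crude model of neural demodulation (neither modeling detail comes from the abstract):

```python
import numpy as np

fs = 1000                        # sample rate, Hz
t = np.arange(0, 4, 1 / fs)      # 4 s of signal

# Sinusoidal stand-ins for the 40 Hz and 47 Hz strobed light components
light = np.cos(2 * np.pi * 40 * t) + np.cos(2 * np.pi * 47 * t)

# Rectification (a crude model of nonlinear neural demodulation) exposes
# the 47 - 40 = 7 Hz difference frequency in the spectrum.
envelope = np.abs(light)
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)

low = freqs < 20                 # inspect only the slow (theta-range) components
peak = freqs[low][np.argmax(spectrum[low])]
print(f"dominant slow envelope component: {peak:.2f} Hz")
```

A linear sum of the two frequencies has no 7 Hz spectral component on its own; the nonlinearity is what demodulates the beat envelope, consistent with the abstract's claim that the brain, not the light, produces the theta-range percept.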
Hearing and monitoring system
Systems and methods for assisting a user with hearing by amplifying sound using an amplifier with gain and amplitude controls for a plurality of frequencies, and applying a learning machine to identify an aural environment and adjust the amplifier settings for optimum hearing.
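The per-frequency gain control can be sketched as a simple FFT-band equalizer whose gain profile an environment classifier would select. The profile names, band edges, and gain values below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# Gain profiles an environment classifier might select (illustrative values)
PROFILES = {
    "quiet": [1.0, 1.0, 1.0],
    "speech_in_noise": [0.6, 1.8, 1.2],   # suppress lows, boost the speech band
}

def apply_band_gains(audio, fs, gains, edges=(300.0, 3000.0)):
    """Split audio into low/mid/high FFT bands and scale each by its gain."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), 1 / fs)
    band = np.digitize(freqs, edges)       # 0 = low, 1 = mid, 2 = high
    spectrum *= np.take(gains, band)
    return np.fft.irfft(spectrum, n=len(audio))

fs = 16000
t = np.arange(0, 0.5, 1 / fs)
tone = np.sin(2 * np.pi * 1000 * t)        # a 1 kHz tone sits in the mid band
out = apply_band_gains(tone, fs, PROFILES["speech_in_noise"])
print(round(out.max() / tone.max(), 2))    # mid-band gain applied
```

A real hearing aid would use overlapping filter banks and dynamic-range compression rather than a one-shot FFT; this only shows the shape of the per-band control surface the learning machine would adjust.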
EMBEDDED DEVICE FOR SYNCHRONIZED COLLECTION OF BRAINWAVES AND ENVIRONMENTAL DATA
A system having one or more devices (1) for automatic brainwave analysis, configured to be worn by a user, and a remote processing unit configured to store all of the signals from said one or more devices. Each device has at least one brainwave sensor (11, 12) providing a stream (21, 22) of brain data associated with said user; at least one environmental sensor (13, 14) providing an environmental data stream (23, 24); and means (15) for selecting a signal extracted from said stream of brain data, associating it with at least one corresponding signal extracted from said stream of environmental data, storing said signals, and transmitting them to the remote processing unit, the latter comprising means for extracting classification information from said data and for constituting a set associating said data with the extracted classification information.
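The "means for selecting ... and associating" can be sketched as nearest-timestamp pairing of the two streams. The data layout and the skew tolerance are illustrative assumptions; the abstract does not specify how association is performed:

```python
from dataclasses import dataclass
import bisect

@dataclass
class Sample:
    t: float          # timestamp, seconds
    value: float

def pair_streams(brain, env, max_skew=0.05):
    """Associate each brain sample with the nearest-in-time environmental
    sample (within max_skew seconds). Both streams must be time-sorted."""
    env_times = [s.t for s in env]
    pairs = []
    for b in brain:
        i = bisect.bisect_left(env_times, b.t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(env)]
        j = min(candidates, key=lambda k: abs(env[k].t - b.t))
        if abs(env[j].t - b.t) <= max_skew:
            pairs.append((b, env[j]))
    return pairs

brain = [Sample(0.00, 1.0), Sample(0.10, 2.0), Sample(0.20, 3.0)]
env = [Sample(0.01, 10.0), Sample(0.21, 30.0)]
print(len(pair_streams(brain, env)))   # only two brain samples have a close match
```

The paired records are what the device would store and transmit to the remote processing unit for classification.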
VENTRAL STRIATUM ACTIVITY
A neurofeedback method, including: recording electrical signals from at least one brain region of a subject, wherein changes in the recorded electrical signals over time indicate changes in an activity level of the at least one brain region; providing an audio signal having a perceived quality based on the recorded electrical signals and according to an activity level of the at least one brain region; delivering the audio signal to the subject during said recording.
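The mapping from activity level to a "perceived quality" of the audio can be sketched as a pitch mapping. The linear rule, base frequency, and tone parameters below are illustrative assumptions; the abstract only requires that some perceived quality track the activity level:

```python
import numpy as np

def feedback_tone(activity, base_freq=220.0, fs=8000, dur=0.25):
    """Render a short tone whose pitch tracks an activity level in [0, 1]."""
    freq = base_freq * (1.0 + activity)    # higher activity -> higher pitch
    t = np.arange(0, dur, 1 / fs)
    return freq, np.sin(2 * np.pi * freq * t)

for level in (0.0, 0.5, 1.0):
    freq, tone = feedback_tone(level)
    print(f"activity {level:.1f} -> {freq:.0f} Hz tone, {len(tone)} samples")
```

In the closed loop described by the claim, each rendered tone would be delivered while recording continues, so the subject hears the consequence of their own ongoing brain activity.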
ASYNCHRONOUS BRAIN COMPUTER INTERFACE IN AR USING STEADY-STATE MOTION VISUAL EVOKED POTENTIAL
A method and system are disclosed using steady-state motion visual evoked potential stimuli in an augmented reality environment. Requested stimuli data are received from a user application on a smart device. Sensor data and other context data are also received, where other context data includes data that is not sensed. The requested stimuli data are transformed into modified stimuli based on the sensor data and the other context data. Modified stimuli and environmental stimuli are presented to the user with a rendering device configured to mix the modified stimuli and the environmental stimuli, resulting in rendered stimuli. Biosignals generated in response to the rendered stimuli are received from the user via a wearable biosignal sensing device. Received biosignals are classified based on the modified stimuli, resulting in a classified selection, which is returned to the user application.
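The classification step can be sketched as frequency tagging: each stimulus flickers (or moves) at a distinct frequency, and the decoder picks the tag with the most spectral power in the biosignal. This single-channel power-at-bin rule is an illustrative simplification; practical SSVEP/SSMVEP decoders typically use multichannel methods such as canonical correlation analysis:

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs):
    """Return the stimulus frequency with the most spectral power in eeg."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    power = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return stim_freqs[int(np.argmax(power))]

# Synthetic "biosignal": a 12 Hz evoked response buried in noise
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(len(t))

print(classify_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))
```

The returned frequency identifies which rendered stimulus the user attended to, which is the "classified selection" handed back to the user application.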
System based on multi-sensory learning and EEG biofeedback for improving reading ability
A system and method for improving reading ability that simultaneously utilize a distinctive protocol of multi-sensory learning and EEG biofeedback. The present invention more particularly relates to an EEG biofeedback system comprising a biofeedback apparatus in the form of a head-mountable device, including an electrode array for measuring bioelectrical signals generated by the cerebral cortex of a user's brain, and a computer device that receives and analyzes data collected by said biofeedback apparatus and provides audiovisual feedback to the user.
Systems, devices, and methods for generating and manipulating objects in a virtual reality or multi-sensory environment to maintain a positive state of a user
Systems, devices, and methods described herein relate to multi-sensory presentation devices, including virtual reality (VR) devices, visual display devices, sound devices, haptic devices, and other forms of presentation devices, that are configured to present sensory elements, including visual and/or audio scenes, to a user. In some embodiments, one or more sensors, including electroencephalography (EEG) sensors and photoplethysmography (PPG) sensors, e.g., included in a brain-computer interface, can measure physiological data of a user to monitor a state of the user during the presentation of the visual and/or audio scenes. Such systems, devices, and methods can adapt one or more visual and/or audio scenes based on user physiological data, e.g., to control or manage the state of the user.
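The adaptation can be sketched as one step of a closed-loop control rule: while a physiological stress estimate exceeds a target, soften the scene. The specific control law, the brightness parameter, and the step size are illustrative assumptions; the abstract covers scene adaptation generally:

```python
def adapt_scene(brightness, stress, target=0.3, step=0.1):
    """One closed-loop step: soften the scene while measured stress is
    above target, brighten it back otherwise (illustrative control rule)."""
    if stress > target:
        brightness = max(0.0, brightness - step)
    else:
        brightness = min(1.0, brightness + step)
    return brightness

# Simulated EEG/PPG-derived stress readings falling over five update cycles
b = 0.8
for stress in (0.9, 0.7, 0.5, 0.2, 0.1):
    b = adapt_scene(b, stress)
print(round(b, 1))
```

A deployed system would derive the stress estimate from the EEG/PPG features and might adapt many scene parameters at once, but each follows the same measure-compare-adjust loop.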