METHOD OF DETECTING AND TRACKING BLINK AND BLINK PATTERNS USING BIOPOTENTIAL SENSORS
20230309860 · 2023-10-05
Inventors
CPC classification
A61B5/297
HUMAN NECESSITIES
G06F3/015
PHYSICS
A61F4/00
HUMAN NECESSITIES
A61B5/0205
HUMAN NECESSITIES
A61B5/7278
HUMAN NECESSITIES
A61B5/1103
HUMAN NECESSITIES
International classification
A61B5/11
HUMAN NECESSITIES
A61B5/0205
HUMAN NECESSITIES
A61B5/00
HUMAN NECESSITIES
Abstract
A method for detecting and tracking blink and blink patterns of a user from a headworn device, the method including placing an electronic device with a housing on a head of the user, placing one or more biopotential sensors of the housing in contact with skin of the user, detecting, using the biopotential sensors, signals indicative of blink or blink patterns of the user, processing the signals by a processing unit configured to identify blink of the user, and inputting the blink signals into a model capable of decoding gaze and eyelid motion in real-time to understand the attention, intention, and states of the user.
Claims
1.-26. (canceled)
27. A method of detecting and tracking blink and blink patterns of a user, the method comprising: placing an electronic device with a housing on a head of the user; placing one or more biopotential sensors of the housing in contact with skin of the user; detecting, using the biopotential sensors, signals related to eye blink of the user; and inputting the signals into a model capable of decoding gaze and eyelid motion in real-time.
28. The method of claim 27, wherein the biopotential sensors are capable of detecting signals of electroencephalography (EEG), electro-oculography (EOG), and electromyography (EMG).
29. The method of claim 27, wherein the signals collected from around the ears of the user are one or more of electroencephalography (EEG), electro-oculography (EOG), and electromyography (EMG) signals, are processed to extract signals indicative of blink, and are used in isolation or in combination with other signals, including head direction, auditory attention, electrocardiography, temperature, blood oxygen level, activity level, and other bio-signals, in order to further indicate blink.
30. The method of claim 27, wherein the housing containing the sensors is located at one or more locations on the head of the user, including on the ear, in the ear, around the ear, horizontal to one or more of the left or right eye, vertical to one or more of the left or right eye, on the forehead, on one or more of the left and right temple, on the face, on the cheek, on the neck, or on the frontal or occipital regions of the head.
31. The method of claim 27, wherein the model is capable of detecting and discriminating between one or more different types of blink, including spontaneous blink, reflexive blink, voluntary blink, unilateral left blink, unilateral right blink, or bilateral blink.
32. The method of claim 27, wherein the model is capable of measuring characteristics of a blink, including blink duration, blink intensity, or blink velocity; and wherein the model is capable of recognizing patterns and sequences of the signals.
33. The method of claim 27, wherein the inputted signals to the model include any one or combination of electroencephalography (EEG), electro-oculography (EOG), and electromyography (EMG) signals, head direction, auditory attention, electrocardiography, temperature, blood oxygen level, activity level, and other bio-signals derived from electrodes and sensors on the headworn device.
34. The method of claim 33, wherein the inputs comprise one or more of the following forms: raw data, filtered data, low-pass filtered data, bandpass filtered data, averaged data, subtraction of left from right data, subtraction of right from left data, subtraction of left average from right average, and subtraction of right average from left average.
35. The method of claim 33, wherein one or more of the signals are a function (f) of one or more or an average of multiple signals from the left amplified voltages Vleft, signal=f(Vleft), or right amplified voltages Vright, signal=f(Vright), or from the difference between one or more or an average of multiple signals of the left and right amplified voltages Vleft and Vright, signal=f(Vleft−Vright).
36. The method of claim 27, wherein the model comprises signal processing methods or machine learning approaches that are based on linear or non-linear models such as artificial neural network models, and where the models can be deep and/or shallow digital and/or analog artificial neural network models.
37. The method of claim 27, wherein the model is a stand-alone blink model or a model in combination with other brain decoding models.
38. The method of claim 27, wherein the signals are used in a sensor fusion approach, in conjunction with other signals including electroencephalography (EEG), electro-oculography (EOG), and electromyography (EMG) signals, head direction, auditory attention, electrocardiography, temperature, blood oxygen level, activity level, and other bio-signals either in the same or in separate parallel algorithms, and from which the sensor fusion approach can determine additional insights or commands from the user.
39. The method of claim 27, further comprising assigning the blink or blink patterns to a user interface method including operating a click of a mouse, typing, toggling a switch, or selecting items.
40. The method of claim 27, further comprising assigning the blink or blink patterns to a predesignated command, including operating as a direction function to navigate menu items, selecting different desired sounds, confirming an action, dismissing a notification, changing audio settings, answering a phone, repeating a song, pausing audio, choosing between connected devices, or switching between pre-programmed settings.
41. The method of claim 27, further comprising providing control to connected electronic devices including mobile phones, smart watches, electric wheelchairs, audio devices, earphones, headphones, hearing aids, cochlear implants, computers, appliances, gaming devices, augmented reality devices, virtual reality devices, extended reality devices, machinery, vehicles, or electronics connected by wireless connectivity.
42. The method of claim 27, wherein the signals provide feedback from the user to a device, the feedback including approval of an action or disapproval of an action.
43. The method of claim 27, wherein characteristics of a blink are used to provide a further dimension of control.
44. The method of claim 40, wherein gaze direction indicates navigation in a virtual or physical environment, and blink detection is used as selection, operation, or execution.
45. The method of claim 38, wherein a blink, with or without one or more of the signals, provides information regarding conditions of the user's physical health, including fatigue, injury, illness, or disease.
46. The method of claim 27, wherein blink data provides additional information about the state of an opening of the eye.
Description
BRIEF DESCRIPTION OF THE FIGURES
[0046] Embodiments will now be described by way of example only with reference to the appended drawings wherein:
DETAILED DESCRIPTION
[0050] The terms “comprise”, “comprises”, “comprised” or “comprising” may be used in the present description. As used herein (including the specification and/or the claims), these terms are to be interpreted as specifying the presence of the stated features, integers, steps, or components, but not as precluding the presence of one or more other feature, integer, step, component, or a group thereof as would be apparent to persons having ordinary skill in the relevant art. Thus, the term “comprising” as used in this specification means “consisting at least in part of”. When interpreting statements in this specification that include that term, the features prefaced by that term in each statement all need to be present, but other features can also be present. Related terms such as “comprise” and “comprised” are to be interpreted in the same manner.
[0051] Unless stated otherwise herein, the article “a” when used to identify any element is not intended to constitute a limitation of just one and will, instead, be understood to mean “at least one” or “one or more”.
[0052] The following relates to a method of detecting and tracking blink and blink patterns that uses signal processing methods and machine learning approaches, such as analog and digital artificial neural networks, to detect the blink and blink patterns of a user and to use them in combination with user attention detection. This is accomplished by employing the user's brain signals and other biosignals in conjunction with information gathered from the environment, including auditory data and data from other sensors providing relevant information on the environment of the user.
[0054] An embodiment wherein a multimodal blink detection headworn device 201 is configured to collect the input signals will be discussed below.
Earworn Device
[0055] Personal listening devices and earworn devices are limited in their ability to understand the attention or intention of a user. This limitation prevents them from responding dynamically to a user's preferences and consequently provides only a static listening experience, requiring the user to physically interact with the device whenever different settings or commands are needed, such as pausing a song or turning up the volume. The image 300, shown in
[0056] The in-ear sensors and around-ear sensors are preferably made of conductive material, including, but not limited to, metals or polymers with the ability to measure bioelectric signals of the user with whom they are in contact. These sensors could be capable of measuring at least one of a variety of signals, including electroencephalogram (EEG), electromyogram (EMG), electrooculogram (EOG), and blink signals. In the example embodiment shown, three in-ear sensors 301 are located at the end of an extension support 305 that extends inwardly from the body 306 of the hearing device 300. When in use, the in-ear sensors 301, which are preferably electrodes, engage the ear canal of the user's ear. In a preferred embodiment, there could be one or multiple in-ear sensors, as would be appreciated by a person skilled in the art. Said in-ear and around-ear sensors may also take the form of other brain imaging modalities, such as those used for functional near-infrared spectroscopy (fNIRS), magnetoencephalography (MEG), optically pumped magnetoencephalography (OP-MEG), giant magnetoimpedance (GMI), and functional ultrasound (fUS), which can detect the brain's response to different stimuli, as well as the inclusion of blink signals in the brain signal, as can be appreciated by a person skilled in the art.
[0057] The body 306 of the device 300 further includes a plurality of around-ear sensors 302. These sensors are preferably mounted on a back surface of the body 306 such that they contact the user's head. In the preferred embodiment shown, there are seven around-ear sensors; a person skilled in the art would understand that the number of around-ear sensors could vary.
[0058] Real-time Visual Attention Decoding and Eye Blink Detection via Eye Gaze Tracking Using In- and Around-Ear Electrodes
[0059] Current eye gaze and blink tracking methods based on an electrooculogram either use signals recorded from around the eyes or a limited number of signals inside the ear. The system described herein detects the gaze direction and blink of a user based on electrooculogram and electromyogram signals recorded from the in-ear sensors 301 and around-ear sensors 302. The additional signals from the around-ear sensors 302 increase signal quality and thus enhance the accuracy and resolution of eye gaze and blink detection. The horizontal and vertical gaze direction (right, left, up, down, and center), as well as the angle of the gaze relative to the head, is computed based on approximations of voltage ratios, subtractions, or other interactions between and within the right and left in-ear sensors 301 and around-ear sensors 302. By using sensors located on both the left ear and right ear of the user, signal quality can be increased by subtracting the signals from one another in order to identify distortions that appear as common artifacts between the signals, which represent additional signal noise. The horizontal and vertical directions and gaze angles are decoded using thresholding methods as well as linear and non-linear models, including but not limited to linear and logistic regressions, Naive Bayes, support vector machines, k-nearest neighbors, decision trees, random forests, and neural network models such as convolutional neural networks. From these signals, additional information, such as electromyography, can be gathered and used to determine head rotation, trunk orientation, and blink, providing an understanding of the absolute gaze direction in the user's field of view as well as unilateral and bilateral blink patterns. The sensors behind the ear thus provide additional information about the state of the user.
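By way of illustration only, the differential-thresholding approach described above may be sketched as follows. The specific thresholds and the common-mode blink heuristic are illustrative assumptions, not values prescribed herein:

```python
import numpy as np

def classify_gaze_and_blink(v_left, v_right, gaze_thresh=50e-6, blink_thresh=150e-6):
    """Classify horizontal gaze direction and detect a blink from windowed
    EOG voltages (in volts) recorded near the left and right ears.

    A horizontal eye movement has opposite polarity at the two sides, so
    subtracting the channels reinforces the gaze signal while cancelling
    artifacts common to both; a bilateral blink instead appears as a large
    transient common to both channels."""
    # Differential signal: proportional to horizontal gaze deflection.
    diff = np.mean(v_left) - np.mean(v_right)
    # Common-mode amplitude: large when both channels deflect together.
    common = (np.max(np.abs(v_left)) + np.max(np.abs(v_right))) / 2

    blink = bool(common > blink_thresh)
    if diff > gaze_thresh:
        gaze = "left"
    elif diff < -gaze_thresh:
        gaze = "right"
    else:
        gaze = "center"
    return gaze, blink
```

In practice the thresholds would be calibrated per user, and the same differential/common-mode decomposition could feed the linear or non-linear classifiers mentioned above instead of fixed thresholds.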
Attention Tracking and Intention Detection Based on Sensor Fusion Models
[0060] Sensor fusion models are algorithms that combine series of noisy data in order to produce estimates of unknown variables. By estimating a probability distribution based on these combined data, such estimates tend to be more accurate than those based on a single measurement alone. One example is the Kalman filter, suited for temporal applications, where the probability distributions based on the data are segmented by time for real-life applications. By implementing the sensor fusion model in the headworn device, in addition to integrating the data from all the sensors of the device, the system can be modified to include information from sensors external to those provided on the hearing device itself, including information from sensors that provide additional knowledge of the environment of the user, such as visual or other sensory information. These can be combined with the information on the user and the user's attention and intention provided by the headworn device in a signal fusion method for filtering a combination of one or more of said blink data, auditory attention data, gaze direction data, gaze-head-trunk orientation data, location data, sound data, separated sounds, raw EEG, EOG, and EMG signal(s), and/or said combined location data, in conjunction with one or more of these external signals from external electronic devices that provide additional information concerning the environment of the user. This fusion of multiple on-device and off-device sensors can be used to provide a holistic understanding of the environment and state of the user, identify the user's attention and intention, and perform appropriate functions. Furthermore, these data may be improved by the use of sensor fusion methods, e.g., to reduce drift (e.g., Manabe & Fukamoto, 2010), increase robustness, and denoise speech signals or other signals.
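By way of illustration only, the Kalman filter mentioned above may be sketched in its simplest scalar form, here fusing a stream of noisy gaze-angle measurements into a smoother estimate. The random-walk state model and the noise variances are illustrative assumptions:

```python
class Kalman1D:
    """Minimal scalar Kalman filter: fuses successive noisy measurements
    of a slowly varying quantity (e.g., a gaze angle in degrees) into a
    lower-variance estimate, assuming a random-walk state model."""

    def __init__(self, q=0.01, r=1.0, x0=0.0, p0=1.0):
        self.q, self.r = q, r    # process and measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance

    def update(self, z):
        # Predict step: under a random walk, only the uncertainty grows.
        self.p += self.q
        # Update step: blend prediction and measurement by their uncertainties.
        k = self.p / (self.p + self.r)   # Kalman gain in [0, 1]
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

A full sensor fusion implementation would use a multivariate state (gaze, head, and trunk orientation, etc.) and incorporate the external sensor channels as additional measurement dimensions, but the predict/update structure is the same.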
Alternative Embodiments
[0061] The above-described principles can be modified according to the following: the device can be a network of a device with one or more individual electrodes placed on the head or face of the user; the device can be miniaturized into a smaller package such as an in-ear hearing device; or the device can be enlarged to be suitable for a headphone, glasses frame, virtual or augmented reality headset, or helmet unit.
[0062] The network of a device with one or more individual electrodes includes: a device with a housing unit and a processor, and one or more nodes that may or may not be spatially separate from the device and that contain electrodes for the collection of EEG, EOG, EMG, and blink data from the head or face of the user. Additional nodes may be added to the device to increase the signal-to-noise ratio of the collected signals.
[0063] The smaller package resides in an embodiment of an in-ear hearing device that includes: one or more in-ear dry electrodes for the collection of EEG, EOG, EMG, and blink data from the ear canal of the user, microphones placed on an outward face of the body of the hearable device, as well as accelerometer, gyroscope, magnetometer, and other bio-signal sensors embedded in the device (Pontoppidan, et al., 2020). Additional miniaturized in-ear dry electrode layers can be added into the device along additional planes of skin contact in the ear to increase the signal-to-noise ratio of the collected signals while maintaining the same effective areas as the inserted earphones.
[0064] The larger package resides in an embodiment of a stand-alone headphone unit, glasses frame, virtual or augmented reality headset, or a headphone unit that is incorporated into a helmet including the following elements: around-ear electrodes to be placed in or around the ear of the user that collect EEG, EOG, EMG, and blink data, multiple dry electrodes on the inside of the unit against the skin of the user to collect signals from the scalp, microphones placed both on the outer surface of the unit and/or mounted on the body of a consumer electronic device such as smartphones, smart glasses, virtual or augmented reality headsets, smart watches, or other consumer devices, as well as accelerometer, gyroscope, magnetometer, and other bio-signal sensors embedded in the device.
[0065] The principles, devices, and systems described herein have applications in many areas that involve detecting the visual and auditory attention of the user, the direction of gaze, the head and trunk orientation of the user, and the intention and blink patterns of the user, as well as additional features. An advantage this device brings over alternatives is that it can detect the behavior, attention, and intention of the user and perform related functions all in a single package by employing several dry electrodes for EEG, EOG, EMG, and blink signals, accelerometer, gyroscope, and magnetometer sensors, microphones, and additional external sensors in wirelessly connected devices including but not limited to smartphones, tablets, smart glasses frames, virtual or augmented reality headsets, smartwatches, and helmets.
[0066] Additional applications include but are not limited to Communication for People with Mobility Disorders, Signaling an Audio Device, Human-Computer Interaction for Electronic Devices, Automotive and Heavy Machinery, and Augmented Reality (AR) or Virtual Reality (VR), each of which is discussed below.
Communication Medium for People with Mobility Disorders
[0067] Using the principles described above, information on the user's eye gaze direction and blink patterns can be interpreted to provide a means of operating a computer or electronic device, such as an electric wheelchair, to people who have limited mobility, such as full-body paralysis. The direction of motion on a screen or in a room can be interpreted from the direction of the user's gaze, and intention, such as selecting, operating, or removing, can be determined through the blink or blink pattern of the user.
Non-verbal, Non-Tactile Method for Signaling an Audio Device
[0068] Using the principles described above, the detected blink and blink patterns, as well as attentional signals from the user, can be used to provide commands to personal audio devices, such as but not limited to mobile phones, earphones, headphones, hearing aids, cochlear implants, regarding the intention of the user. In order to facilitate a better speech enhancement paradigm, speech enhancement audio processing can be directed by gaze direction, and a user can confirm or correct the selection of enhanced sounds using blinks. Furthermore, the blink interface can be used to replace other shortcut commands, such as controlling volume, changing preset settings, etc.
Human-Computer Interaction to Facilitate Commands to Electronic Devices and Wirelessly Connected Devices
[0069] Using the principles described above, the detected blink and blink patterns, as well as attentional signals from the user, can be used to provide insights and commands to electronic devices, such as but not limited to mobile phones, smart watches, electric wheelchairs, audio devices, earphones, headphones, hearing aids, cochlear implants, computers, appliances, gaming devices, augmented reality devices, virtual reality devices, extended reality devices, machinery, vehicles, electronics connected by wireless connectivity, regarding the intention of the user. The blink signals can be assigned to predesignated commands as an alternative to conventional user interaction modalities, such as mouse clicks, keyboards, touchscreens, touchpads, etc. by using the gaze of the user to translate to a new location on a display, and then using preset commands associated with blinks to perform different operations, including but not limited to selecting different desired sounds, confirming an action, dismissing a notification, changing audio settings, answering a phone, choosing between connected devices, switching between pre-programmed settings, etc.
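By way of illustration only, the assignment of blink signals to predesignated commands described above may be sketched as a simple lookup from decoded blink sequences to device actions. The particular pattern names and command strings below are hypothetical examples, not mappings prescribed herein:

```python
from typing import Optional, Tuple

# Hypothetical mapping from decoded blink-event sequences to preset
# commands; an actual device would expose these as user-configurable
# shortcuts.
BLINK_COMMANDS = {
    ("bilateral", "bilateral"): "answer_phone",          # double bilateral blink
    ("unilateral_left",): "previous_track",
    ("unilateral_right",): "next_track",
    ("bilateral", "bilateral", "bilateral"): "dismiss_notification",
}

def decode_command(blink_sequence: Tuple[str, ...]) -> Optional[str]:
    """Return the preset command assigned to a detected blink sequence,
    or None for unmapped sequences (e.g., spontaneous blinks)."""
    return BLINK_COMMANDS.get(blink_sequence)
```

Discriminating voluntary from spontaneous blinks before the lookup (using the blink-type model of claim 31) prevents ordinary blinking from triggering commands.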
Automotive and Heavy Machinery
[0070] Using the principles described above, information on the state of a driver or operator can be interpreted, including, but not limited to, the driver's or operator's attention, distraction, intention, fatigue, and mental and physical safety level.
[0071] Using the attention tracking and intention detecting models, hands-free control can be provided to the driver or operator to reduce distraction.
[0072] Using blink analysis and gaze tracking, a driver's or operator's eye gaze can be tracked both during the day and at night, independently of lighting conditions and without relying on information provided by an eye-tracking camera.
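By way of illustration only, the use of blink characteristics as a fatigue indicator may be sketched as follows. Elevated blink rate and prolonged blink duration are widely used drowsiness cues; the baseline values and weighting below are illustrative assumptions, not parameters prescribed herein:

```python
def fatigue_score(blink_durations_s, window_s=60.0):
    """Toy fatigue indicator in [0, 1] from blinks detected within a time
    window: combines blink rate and mean blink duration, each normalized
    against an assumed nominal baseline (~15 blinks/min, ~0.15 s/blink)."""
    if not blink_durations_s:
        return 0.0
    rate = len(blink_durations_s) / (window_s / 60.0)            # blinks/min
    mean_dur = sum(blink_durations_s) / len(blink_durations_s)   # seconds
    # Saturate each cue at an assumed "high fatigue" level and average.
    score = 0.5 * min(rate / 30.0, 1.0) + 0.5 * min(mean_dur / 0.4, 1.0)
    return score
```

A deployed system would fuse such a score with the other driver-state signals (EEG, head motion, etc.) rather than act on blink data alone.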
[0073] Additional information on the state of the vehicle or environment collected by the sensors of the vehicle or system can be fused with the information on the state of the driver or operator to provide a more holistic understanding of the driving conditions or environment for further safety or attention applications.
[0074] Using the EEG, EOG, EMG, and blink signals recorded from the in-ear electrodes 301 and around-ear electrodes 302, shown in
[0075] All the points described above contribute to the understanding of the level of driver's or operator's attention to the road conditions.
Augmented Reality (AR) or Virtual Reality (VR)
[0076] Using the principles described above, information about the user of VR/AR is interpreted, including, but not limited to, the user's attention to visual and auditory stimuli in their virtual environment and the user's blink as a command to interact with the virtual environment.
[0077] Although the above description includes reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art. Any examples provided herein are included solely for the purpose of illustration and are not intended to be limiting in any way. Any drawings provided herein are solely for the purpose of illustrating various aspects of the description and are not intended to be drawn to scale or to be limiting in any way. The scope of the claims appended hereto should not be limited by the preferred embodiments set forth in the above description but should be given the broadest interpretation consistent with the present specification as a whole. The disclosures of all prior art recited herein are incorporated herein by reference in their entirety.
REFERENCES
[0078] Hori J., Sakano K., et al. Development of communication supporting device controlled by eye movements and voluntary eye blink. Conf IEEE Eng Med Biol Soc. 2004; 6:4302-5.
[0079] Tong J, Lopez M J, Patel B C. Anatomy, Head and Neck, Eye Orbicularis Oculi Muscle. [Updated 2020 Jul. 27]. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2021 January. Available from: https://www.ncbi.nlm.nih.gov/books/NBK441907/
[0080] Abdelhady A, Patel B C. Anatomy, Head and Neck, Eye Superior Tarsal Muscle (Mullers Muscle) [Updated 2020 Sep. 16]. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2021 January. Available from: https://www.ncbi.nlm.nih.gov/books/NBK540964/
[0081] Portello, J., Rosenfield, M., Chu, C. (2013). Blink Rate, Incomplete Blinks and Computer Vision Syndrome. Optometry and vision science: official publication of the American Academy of Optometry. 90. 10.1097/OPX.0b013e31828f09a7.
[0082] Bologna, M., Agostino, R., Gregori, B., Belvisi, D., Ottaviani, D., Colosimo, C., Fabbrini, G., Berardelli, A., Voluntary, spontaneous and reflex blinking in patients with clinically probable progressive supranuclear palsy, Brain, Volume 132, Issue 2, February 2009, Pages 502-510, https://doi.org/10.1093/brain/awn317
[0083] Chang, W. D. Electrooculograms for human-computer interaction: A review. Sensors (Basel, Switzerland). 2019. https://pubmed.ncbi.nlm.nih.gov/31207949/.
[0084] Virtanen, J., Rantala, B., Virtanen, S. I. J. (2006). Detection of artifacts in bioelectric signals.
[0085] Chang, W. D., Cha, H. S., Kim, K., & Im, C. H. (2016). Detection of eye blink artifacts from single prefrontal channel electroencephalogram. Computer methods and programs in biomedicine, 124, 19-30. https://doi.org/10.1016/j.cmpb.2015.10.011
[0086] Li, Y., Ma, Z., Lu, W., & Li, Y. (2006). Automatic removal of the eye blink artifact from EEG using an ICA-based template matching approach. Physiological measurement, 27(4), 425-436. https://doi.org/10.1088/0967-3334/27/4/008
[0087] Komeilipoor, N. (2021). MULTIMODAL HEARING ASSISTANCE DEVICES AND SYSTEMS. International Application No. CA2021050730. Gatineau, QC: Canadian Intellectual Property Office.
[0088] Smit, A. E. (2009). Blinking and the brain: Pathways and pathology (dissertation). Haveka, Rotterdam, The Netherlands.
[0089] Lisy, F. J., Opperman, A., Dashevsky, D. D. (2014). Head-mounted physiological signal monitoring system, devices and methods.
[0090] Pontoppidan, N. H., Lunner, T., Pedersen, M. S., Hauschultz, L. I., Koch, P., Naylor, G., Petersen, E. B. (2020). Hearing assistance device with brain computer interface.
[0091] Aimone, C. A., Garten, A. S., Coleman, T., Pino, L. G., Vidyarthi, K. J. M., Baranowski, P. H., Chabior, M. A., Chong, T., Rupsingh, R. R., Ashby, M., Tadich, P. V. (2019). Wearable computing apparatus and method.
[0092] Klappert, W. R., Nichols, M. R., Shimy, C., Wagner, W., Chen, Y., Stathacopoulos, P. T. (2013). Methods and systems for monitoring attentiveness of a user based on brain activity.
[0093] Sato, D., Sugio, T. (2010). Eye-gaze tracking device, eye-gaze tracking method, electro-oculography measuring device, wearable camera, head-mounted display, electronic eyeglasses, and ophthalmological diagnosis device.