APPARATUS AND METHOD FOR RECORDING AND ANALYSING LAPSES IN MEMORY AND FUNCTION
20170319063 · 2017-11-09
CPC classification
G16H50/20
PHYSICS
A61B5/7264
HUMAN NECESSITIES
A61B5/02055
HUMAN NECESSITIES
A61B5/4088
HUMAN NECESSITIES
A61B5/02438
HUMAN NECESSITIES
A61B5/374
HUMAN NECESSITIES
A61B5/4809
HUMAN NECESSITIES
A61B5/002
HUMAN NECESSITIES
International classification
A61B5/00
HUMAN NECESSITIES
A61B5/11
HUMAN NECESSITIES
A61B5/02
HUMAN NECESSITIES
Abstract
An apparatus and method for sensing, recording and analyzing data representative of events of memory lapses and function uses a wearable device (e.g., wrist, armband, pendant) having sensors to detect user gestures and vital signs for transmission and analysis by a computation unit to predict the onset of cognitive impairment related diseases.
Claims
1. A wearable sensor device for sensing and recording data representative of events of memory lapses and function, comprising: at least one gesture sensor in the wearable sensor device capable of sensing a gesture by the wearer, the gesture being representative of events of memory lapses and function; at least one vital sign sensor for sensing at least one vital sign condition being experienced by a wearer of the device; a memory for storing gesture data representing the sensed data from the gesture sensor, and for storing vital sign data sensed by the vital sign sensor; wherein the gesture data and vital sign data are adapted for transmission to a computation unit for analyzing the gesture data and vital sign data, comparing them to a reference database of normative data of age-matched subjects, and for producing diagnosis data which predicts onset of cognitive impairment related diseases.
2. The device according to claim 1, wherein the gesture sensor detects at least one of a tap, a tap sequence, an audio signal, a video signal, a hand gesture, a head movement gesture, an audible trigger, and an EEG trigger.
3. The device according to claim 1, wherein the vital sign sensor detects at least one of heart rate, blood pressure, perspiration, EEG, temperature and blood oxygen level.
4. The device according to claim 1, further including at least one activity sensor for detecting at least one of sleep, exercise, motion and mobility.
5. The device according to claim 1, wherein the device communicates the gesture and vital sign data to a cloud server.
6. The device according to claim 5, wherein the device communicates the gesture and vital sign data through a router to a cloud server.
7. The device according to claim 5, wherein the device communicates the gesture and vital sign data through a Bluetooth low energy (BLE) device to a cloud server.
8. The device according to claim 5, wherein the device communicates the gesture and vital sign data through a charging base to a cloud server.
9. The device according to claim 8, wherein the device communicates the gesture and vital sign data through a charging base and router to a cloud server.
10. The device according to claim 5, wherein the device communicates the gesture and vital sign data continuously in real time.
11. The device according to claim 5, wherein the device communicates the gesture and vital sign data in batches.
12. The device according to claim 1, wherein the computation unit calculates a risk factor score based on at least one of frequency of memory lapses, time and quality of sleep, amount and duration of exercise, mobility, heart rate, blood pressure, perspiration and diet.
13. The device according to claim 1, wherein the computation unit predicts onset of cognitive impairment related diseases by analyzing the circumstances under which the memory lapses and function occurred, and determining the type of memory lapse, including one or more components of cognition or function.
14. The device according to claim 13, wherein the computation unit predicts onset by analyzing gesture and vital sign data for a time period offset in time from a gesture representative of a memory lapse event.
15. The device according to claim 14, wherein the time period offset includes a time period which precedes a gesture representative of a memory lapse event.
16. The device according to claim 14, wherein the time period offset includes a time period which is subsequent to a gesture representative of a memory lapse event.
17. The device according to claim 1, wherein the sensor is an audio sensor and wherein the gesture data is audio data.
18. The device according to claim 1, wherein the sensor is a video sensor and wherein the gesture data is video data of the subject wearing the wearable device.
19. The device according to claim 17, wherein the computation unit further includes a speech recognition unit.
20. The device according to claim 1, wherein the computation unit receives gesture data and vital sign data from a plurality of users wearing a wearable device, and uses the combined data to generate population risk factors.
21. The device according to claim 20, wherein the combined data is used to generate population risk factors for advancing disease.
22. The device according to claim 1, wherein the computation unit compares the gesture and vital sign data to previously obtained baseline data.
23. A method for sensing and recording data representative of events of memory lapses and function, comprising: providing a wearable sensor device worn by a subject; sensing a gesture by the wearer using a gesture sensor in the wearable sensor device, the gesture being representative of events of memory lapses and function; sensing at least one vital sign condition being experienced by a wearer of the device using a vital sign sensor in the wearable sensor device; storing gesture data representing the sensed data from the gesture sensor, and storing vital sign data sensed by the vital sign sensor; transmitting the gesture data and vital sign data to a computation unit; comparing the gesture data and vital sign data to a reference database of normative data of age-matched subjects; and producing diagnosis data which predicts onset of cognitive impairment related diseases of the subject.
24. The method according to claim 23, wherein the sensing step detects at least one of a tap, a tap sequence, an audio signal, a video signal, a hand gesture, a head movement gesture, an audible trigger, and an EEG trigger.
25. The method according to claim 23, wherein the vital sign sensor detects at least one of heart rate, blood pressure, perspiration, EEG, temperature and blood oxygen level.
26. The method according to claim 23, including detecting at least one of sleep, exercise, motion and mobility of the subject, and providing activity data.
27. The method according to claim 23, including communicating the gesture and vital sign data to a cloud server.
28. The method according to claim 27, wherein the device communicates the gesture and vital sign data through a router to a cloud server.
29. The method according to claim 27, wherein the device communicates the gesture and vital sign data through a Bluetooth low energy (BLE) device to a cloud server.
30. The method according to claim 27, wherein the device communicates the gesture and vital sign data through a charging base to a cloud server.
31. The method according to claim 30, wherein the device communicates the gesture and vital sign data through a charging base and router to a cloud server.
32. The method according to claim 27, wherein the device communicates the gesture and vital sign data continuously in real time.
33. The method according to claim 27, wherein the device communicates the gesture and vital sign data in batches.
34. The method according to claim 23, wherein the computation unit calculates a risk factor score based on at least one of frequency of memory lapses, time and quality of sleep, amount and duration of exercise, mobility, heart rate, blood pressure, perspiration and diet.
35. The method according to claim 23, wherein the computation unit predicts onset of cognitive impairment related diseases by analyzing the circumstances under which the memory lapses and function occurred, and determining the type of memory lapse, including one or more components of cognition or function.
36. The method according to claim 35, including predicting onset by analyzing gesture and vital sign data for a time period offset in time from a gesture representative of a memory lapse event.
37. The method according to claim 36, including analyzing gesture and vital sign data in a time period which precedes a gesture representative of a memory lapse event.
38. The method according to claim 36, including analyzing gesture and vital sign data in a time period which is subsequent to a gesture representative of a memory lapse event.
39. The method according to claim 36, wherein the gesture data is at least one of audio data and video data of the subject wearing the wearable device.
40. The method according to claim 23, wherein the computation unit further includes a speech recognition unit for recognizing speech.
41. The method according to claim 23, including receiving gesture data and vital sign data from a plurality of users wearing a wearable device, and using the combined data to generate population risk factors.
42. The method according to claim 41, including generating population risk factors for advancing disease.
43. The method according to claim 23, including comparing the gesture and vital sign data to previously obtained baseline data.
44. A non-transitory storage medium for storing instructions for sensing and recording data representative of events of memory lapses and function of a subject wearing a wearable sensing device, wherein the instructions perform the steps of: sensing a gesture by the wearer using a gesture sensor in the wearable sensor device, the gesture being representative of events of memory lapses and function; sensing at least one vital sign condition being experienced by a wearer of the device using a vital sign sensor in the wearable sensor device; and storing gesture data representing the sensed data from the gesture sensor, and storing vital sign data sensed by the vital sign sensor.
45. The storage medium of claim 44, which further includes instructions for: transmitting the gesture data and vital sign data to a computation unit; comparing the gesture data and vital sign data to a reference database of normative data of age-matched subjects; and producing diagnosis data which predicts onset of cognitive impairment related diseases of the subject.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
[0041] One or more embodiments of the invention will be described as exemplary, but the invention is not limited to these embodiments.
[0042] The invention provides a wearable sensor device for sensing and recording data representative of events of memory lapses and function, comprising: at least one gesture sensor in the wearable sensor device capable of sensing a gesture by the wearer, the gesture being representative of events of memory lapses and function; at least one vital sign sensor for sensing at least one vital sign condition being experienced by a wearer of the device; a memory for storing gesture data representing the sensed data from the gesture sensor, and for storing vital sign data sensed by the vital sign sensor; wherein the gesture data and vital sign data are adapted for transmission to a computation unit for analyzing the gesture data and vital sign data, comparing them to a reference database of normative data of age-matched subjects, and for producing diagnosis data which predicts onset of cognitive impairment related diseases.
[0043] The gesture sensor may detect at least one of a tap, a tap sequence, an audio signal, a video signal, a hand gesture, a head movement gesture, an audible trigger, and an EEG trigger. The vital sign sensor may detect at least one of heart rate, blood pressure, perspiration, EEG, temperature and blood oxygen level. The device may include at least one activity sensor for detecting at least one of sleep, exercise, motion and mobility. The device may communicate the gesture and vital sign data to a cloud server. The device may communicate the gesture and vital sign data through a router to a cloud server. The device may communicate the gesture and vital sign data through a Bluetooth low energy (BLE) device to a cloud server. The device may communicate the gesture and vital sign data through a charging base to a cloud server. The device may communicate the gesture and vital sign data through a charging base and router to a cloud server. The device may communicate the gesture and vital sign data continuously in real time.
[0044] The device may communicate the gesture and vital sign data in batches. The computation unit may calculate a risk factor score based on at least one of frequency of memory lapses, time and quality of sleep, amount and duration of exercise, mobility, heart rate, blood pressure, perspiration and diet. The computation unit may predict onset of cognitive impairment related diseases by analyzing the circumstances under which the memory lapses and function occurred, and determining the type of memory lapse, including one or more components of cognition or function. The computation unit may predict onset by analyzing gesture and vital sign data for a time period offset in time from a gesture representative of a memory lapse event. The time period offset may include a time period which precedes a gesture representative of a memory lapse event. The time period offset may include a time period which is subsequent to a gesture representative of a memory lapse event. The sensor may be an audio sensor and the gesture data may be audio data. The sensor may be a video sensor and the gesture data may be video data of the subject wearing the wearable device. The computation unit may further include a speech recognition unit. The computation unit may receive gesture data and vital sign data from a plurality of users wearing a wearable device, and use the combined data to generate population risk factors. The combined data may be used to generate population risk factors for advancing disease. The computation unit may compare the gesture and vital sign data to previously obtained baseline data.
[0045] The invention provides a method for sensing and recording data representative of events of memory lapses and function, comprising: providing a wearable sensor device worn by a subject; sensing a gesture by the wearer using a gesture sensor in the wearable sensor device, the gesture being representative of events of memory lapses and function; sensing at least one vital sign condition being experienced by a wearer of the device using a vital sign sensor in the wearable sensor device; storing gesture data representing the sensed data from the gesture sensor, and storing vital sign data sensed by the vital sign sensor; transmitting the gesture data and vital sign data to a computation unit; comparing the gesture data and vital sign data to a reference database of normative data of age-matched subjects; and producing diagnosis data which predicts onset of cognitive impairment related diseases of the subject.
[0046] The sensing step may detect at least one of a tap, a tap sequence, an audio signal, a video signal, a hand gesture, a head movement gesture, an audible trigger, and an EEG trigger. The vital sign sensor may detect at least one of heart rate, blood pressure, perspiration, EEG, temperature and blood oxygen level. The method may include detecting at least one of sleep, exercise, motion and mobility of the subject, and providing activity data. The method may include communicating the gesture and vital sign data to a cloud server. The device may communicate the gesture and vital sign data through a router to a cloud server. The device may communicate the gesture and vital sign data through a Bluetooth low energy (BLE) device to a cloud server. The device may communicate the gesture and vital sign data through a charging base to a cloud server. The device may communicate the gesture and vital sign data through a charging base and router to a cloud server. The device may communicate the gesture and vital sign data continuously in real time. The device may communicate the gesture and vital sign data in batches. The computation unit may calculate a risk factor score based on at least one of frequency of memory lapses, time and quality of sleep, amount and duration of exercise, mobility, heart rate, blood pressure, perspiration and diet. The computation unit may predict onset of cognitive impairment related diseases by analyzing the circumstances under which the memory lapses and function occurred, and determining the type of memory lapse, including one or more components of cognition or function. The method may include predicting onset by analyzing gesture and vital sign data for a time period offset in time from a gesture representative of a memory lapse event. The method may include analyzing gesture and vital sign data in a time period which precedes a gesture representative of a memory lapse event.
The method may include analyzing gesture and vital sign data in a time period which is subsequent to a gesture representative of a memory lapse event. The gesture data may be at least one of audio data and video data of the subject wearing the wearable device. The computation unit may further include a speech recognition unit for recognizing speech. The method may include receiving gesture data and vital sign data from a plurality of users wearing a wearable device, and using the combined data to generate population risk factors. The method may include generating population risk factors for advancing disease. The method may include comparing the gesture and vital sign data to previously obtained baseline data.
[0047] The invention provides an apparatus and method of use of a wearable device to record the time, date, and frequency of lapses in memory and/or function. This can be accomplished in a number of different ways, the following of which are non-limiting examples.
[0049] The wearable device can be responsive to a tap, multiple taps, a tap pattern, a distinct tap pattern for each type of impairment, an audio trigger with word recognition built into the wearable, an audio recording for speech recognition of key words and phrases (no tap), gaze initiation (looking at a wearable with a built-in camera that watches for visual cues or gestures), a gesture-based trigger with hand or head motion gestures, an audible trigger (such as a finger snap), EEG triggers via EEG devices (either traditional or earbud-borne EEG sensors), or a unique combination of sensors that is illustrative of a lapse event, based on population training data, individual training data, or a combination thereof. This might also include vital sign data from advanced wearables that include heart rate, blood pressure, perspiration, EEG, temperature, and other sensors, including environmental sensors not borne on the wearable. Essentially, a data signature from the unique combination of sensors triggers the recording of an event.
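As an illustration only (the patent does not specify an implementation), a multi-tap trigger of the kind described above could be sketched in Python; the sample format, threshold, and timing values here are hypothetical placeholders, not disclosed parameters:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t_ms: int       # timestamp in milliseconds (hypothetical format)
    accel_g: float  # accelerometer magnitude in g

def detect_tap_pattern(samples, n_taps=2, threshold_g=2.5, max_gap_ms=500):
    """Return True if the stream contains `n_taps` acceleration spikes above
    `threshold_g`, each within `max_gap_ms` of the previous spike."""
    taps = []
    prev_above = False
    for s in samples:
        above = s.accel_g >= threshold_g
        if above and not prev_above:              # rising edge = one tap
            if taps and s.t_ms - taps[-1] > max_gap_ms:
                taps = []                         # gap too long: restart pattern
            taps.append(s.t_ms)
            if len(taps) >= n_taps:
                return True                       # pattern complete: record event
        prev_above = above
    return False
```

A real device would run this over a rolling buffer and could map different `n_taps` values to different impairment types, as the paragraph above suggests.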
[0050] Uses of this technology include patient selection for clinical trials; monitoring of healthy aging; monitoring of subjective memory complainers, MCI, or AD; and measuring response to a lifestyle intervention program, supplement, therapy, or other intervention that could influence the measurement either positively or negatively. The data could be combined with other biomarker and imaging data to better predict candidates for trials, onset of cognitive decline (MCI) or AD, or response to therapy or other intervention.
[0051] The invention provides a method of recording lapses in memory and/or function using varying ways of triggering a wearable to record and analyze such events. The frequency of these events could be analyzed and reported to the person or the doctor to indicate current status in a given time period, and also to allow comparison over time to evaluate severity of the situation, healthy aging progression, disease progression, or response to therapeutic treatment and/or lifestyle modification or intervention. If an audio recording is utilized, it could be combined with speech recognition to identify patterns and differentiate different types of events and/or impairments. It may be important to differentiate memory impairment from functional impairment; this may be accomplished utilizing different types of tap codes, audio cues, gestures, combinations of sensors, etc.
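The frequency analysis and over-time comparison described above can be sketched minimally: bucket recorded events by week and compare a recent period against an earlier baseline. The function names and the choice of ISO weeks are illustrative assumptions:

```python
from collections import Counter
from datetime import date

def weekly_lapse_counts(event_dates):
    """Count recorded lapse events per ISO (year, week) for trend reporting."""
    return Counter(d.isocalendar()[:2] for d in event_dates)

def trend(counts, baseline_weeks, recent_weeks):
    """Ratio of mean weekly event frequency in a recent period to an earlier
    baseline period; a ratio above 1 suggests increasing lapse frequency."""
    base = sum(counts.get(w, 0) for w in baseline_weeks) / len(baseline_weeks)
    recent = sum(counts.get(w, 0) for w in recent_weeks) / len(recent_weeks)
    return recent / base if base else float("inf")
```

Such a ratio (or the raw per-week counts) is the kind of summary that could be reported to the user or doctor for a given time period.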
[0052] This data could be combined with other wearable-obtained data (depending upon the wearable) such as exercise, motion, mobility, heart rate, perspiration, blood pressure, EEG, and sleep data that is also generated by the wearable or a combination of wearables and other sensors. A user could match their data against age/gender-matched controls to further assess risk factors and generate a risk score. This could also be combined with other sensor data, including but not limited to sleep, motion, mobility and other information, to predict future onset of Mild Cognitive Impairment (MCI), Alzheimer's disease (AD), or other types of cognitive impairment. This apparatus and method could be utilized to measure response to and efficacy of a therapeutic that is intended to slow or reverse cognitive decline. This method could be utilized to measure overall cognitive health and also response to a lifestyle intervention program including diet, exercise, and dietary supplements.
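One simple way to realize the "match against age/gender-matched controls" step is a signed z-score average against a normative table. Everything below (the metric names, the normative values, and the equal weighting) is a hypothetical sketch, not data from the disclosure:

```python
# Hypothetical normative table for one age/gender cohort: metric -> (mean, std).
NORMS = {
    "lapses_per_week": (2.0, 1.0),
    "sleep_hours":     (7.0, 0.8),
    "resting_hr":      (68.0, 8.0),
}

# Direction of risk: +1 means above-norm is riskier, -1 means below-norm is.
RISK_SIGN = {"lapses_per_week": +1, "sleep_hours": -1, "resting_hr": +1}

def risk_score(user_metrics, norms=NORMS, sign=RISK_SIGN):
    """Average of signed z-scores against the matched cohort; higher = riskier."""
    zs = []
    for name, value in user_metrics.items():
        mean, std = norms[name]
        zs.append(sign[name] * (value - mean) / std)
    return sum(zs) / len(zs)
```

A production scorer would weight metrics by their predictive value and draw the normative table from the reference database of age-matched subjects described in the claims.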
[0053] In another embodiment, all the data is aggregated from multiple users to generate population-based risk factors for advancing disease or to generate risk scores to report back to users and doctors.
[0054] In another embodiment, the lapses or other cognitive events are automatically recorded according to an algorithm that observes changes in mobility, heart rate, and perspiration (as compared to normal) as detected and automatically recorded by the wearable. This combination could be indicative of a stress event followed by patterns of sensor readings that indicate a lapse event. The time period of these sensor changes is important to differentiate lapse events from other events that could trigger the same sensor or sensor combination.
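The deviation-from-normal rule described in this paragraph could be sketched as a per-user baseline comparison over a windowed mean of each signal; the metric names, the k = 2 standard-deviation threshold, and the particular sign pattern (heart rate and perspiration up, mobility down) are illustrative assumptions:

```python
def stress_signature(window, baseline, k=2.0):
    """Flag a window of sensor means as a candidate stress/lapse event when
    heart rate and perspiration rise, and mobility drops, each by more than
    k baseline standard deviations. Thresholds are illustrative only."""
    def dev(name):
        mean, std = baseline[name]
        return (window[name] - mean) / std
    return (dev("heart_rate") > k
            and dev("perspiration") > k
            and dev("mobility") < -k)
```

As the paragraph notes, the duration of the deviation matters too; a fuller detector would require the signature to persist for a plausible lapse-event time window before recording.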
[0055] In another embodiment, all data from the wearable is recorded and uploaded to the cloud for post-processing, where it is compared with a deep-learning big data set and analyzed for patterns consistent with memory and function lapses. In another embodiment, a training set for wearable-obtained data has previously been established using a tapping mechanism, consisting of all the wearable parameters previously described. The training set could be population-based, individual, or a combination thereof. This would provide the ability to assess triggers in the context of other wearable data. One could expect changes in a number of factors recorded by the wearable to be predictive of lapses and to be differentiated from other events. As an example, one might detect a change in heart rate and perspiration indicating a high level of stress for a specific period of time, combined with a sudden change in mobility while the user attempts to recall said memory. This pattern could potentially be identified based on analysis of multiple users, trained with multiple users, trained by an individual user during a training period, or a combination thereof.
[0056] In one embodiment, a user could use a tap to indicate an event. One could then analyze multiple events from that user over a period of time (perhaps a one-month training period) and generate a unique signal for that individual (as an example, increased heart rate and perspiration for some duration, followed by a change in mobility, followed by a return to normal over a certain time period). One could utilize training data generated from numerous users to be predictive for an individual, then eliminate the need to tap for future events. One could utilize audio recording in the training set to better differentiate real events and types of events. Generally, one could utilize the tap method alongside multiple wearable sensors or a wearable EEG sensor to create a “training” set for a given patient, then utilize that data to automatically trigger recording (without a tap) based on one or more of the wearable sensors (possibly including EEG data), and/or patterns or combinations of the wearable data that are indicative of these events as learned from the training set.
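The train-with-taps, then trigger-without-taps idea above amounts to supervised learning on tap-labeled sensor windows. A deliberately minimal stand-in for the unspecified model is a nearest-centroid classifier over feature vectors (e.g., windowed means of heart rate, perspiration, mobility); the class names and feature layout are assumptions:

```python
import math

def centroid(vectors):
    """Element-wise mean of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

class LapseDetector:
    """Training phase: windows labeled True were tap-marked as lapse events.
    After fitting, new windows are classified by nearest class centroid,
    removing the need for the user to tap."""
    def fit(self, windows, labels):
        lapse = [w for w, y in zip(windows, labels) if y]
        other = [w for w, y in zip(windows, labels) if not y]
        self.c_lapse, self.c_other = centroid(lapse), centroid(other)
        return self

    def predict(self, window):
        return math.dist(window, self.c_lapse) < math.dist(window, self.c_other)
```

The same interface accommodates the paragraph's other variants: centroids could be fit from population data, from an individual's training month, or from a blend of both.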
[0057] While one or more embodiments of the invention have been described, the invention is not limited to these embodiments and the scope of the invention is defined by reference to the following claims.