Apparatus and methods for monitoring a subject
11547336 · 2023-01-10
Assignee
Inventors
- Yaniv Katz (Ra'anana, IL)
- Roman Karasik (Lod, IL)
- Zvika Shinar (Binyamina, IL)
- Avner Halperin (Ramat Gan, IL)
- Guy Meger (Tel Aviv, IL)
- Liat Tsoref (Tel Aviv, IL)
- Maayan Lia Yizraeli Davidovich (Haifa, IL)
CPC classification
B60N2/90
PERFORMING OPERATIONS; TRANSPORTING
B60N2/002
PERFORMING OPERATIONS; TRANSPORTING
A61B5/02
HUMAN NECESSITIES
A61B5/11
HUMAN NECESSITIES
International classification
A61B5/00
HUMAN NECESSITIES
B60N2/90
PERFORMING OPERATIONS; TRANSPORTING
B60N2/00
PERFORMING OPERATIONS; TRANSPORTING
A61B5/02
HUMAN NECESSITIES
Abstract
Apparatus and methods are described for monitoring a subject. A sensor monitors the subject and generates a sensor signal in response thereto, and a plurality of filters are used to filter the sensor signal using respective filter parameters. A computer processor receives the sensor signal, filters the signal with each of two or more of the filters, and in response to a quality of each of the filtered signals, selects one of the plurality of filters to filter the sensor signal. Subsequently, the computer processor detects that the subject has undergone motion, by analyzing the sensor signal, and in response thereto, filters the signal with each of two or more of the filters, and in response to a quality of each of the filtered signals, selects one of the plurality of filters to filter the sensor signal. Other applications are also described.
Claims
1. An apparatus for monitoring a subject, the apparatus comprising: a sensor, configured to monitor the subject and to generate a sensor signal in response thereto; and a computer processor, configured to: receive the sensor signal, filter the signal with each of a plurality of filters configured to filter the sensor signal using respective filter parameters to create respective filtered signals, in response to a first respective quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal, and subsequently: detect that the subject has undergone motion, by analyzing the sensor signal, in response thereto, re-filter the signal with each of two or more of the plurality of filters, and in response to a second respective quality of each of the re-filtered signals, select one of the plurality of filters to filter the sensor signal.
2. The apparatus according to claim 1, wherein the sensor is configured to monitor a cardiac-related signal of the subject.
3. The apparatus according to claim 1, wherein the plurality of filters are configured to filter the sensor signal using respective bandwidth properties.
4. The apparatus according to claim 1, wherein the plurality of filters are zero-mean filters.
5. The apparatus according to claim 4, wherein the filters are configured to remove any trends, movements of the subject, or respirations of the subject.
6. The apparatus according to claim 1, wherein the sensor is a contact-free sensor disposed on or within the subject's bed.
7. The apparatus according to claim 6, wherein the sensor is configured to be disposed underneath the subject's mattress.
8. The apparatus according to claim 7, wherein the plurality of filters are configured to have a main lobe with a full-width-half-maximum value corresponding to a human biological heartbeat as recorded with the contact free sensor under the subject's mattress.
9. The apparatus according to claim 1, wherein the computer processor is configured to select the one of the plurality of filters to filter the sensor signal by selecting the filter having the greatest signal-to-noise ratio.
10. The apparatus according to claim 9, wherein the computer processor is configured to select the filter having the greatest signal-to-noise ratio by selecting the filter that generates the highest ratio of the main lobe to the side lobes in the filtered signal.
11. The apparatus according to claim 1, wherein the computer processor is further configured to: detect if the signal quality falls below a threshold, in response thereto, filter the signal with each of two or more of the filters, and in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
12. The apparatus according to claim 1, wherein the computer processor is further configured to, at fixed time intervals, filter the signal with each of two or more of the filters and in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
13. The apparatus according to claim 12, wherein the computer processor is configured to, every 5 minutes, filter the signal with each of two or more of the filters and in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
14. The apparatus according to claim 12, wherein the computer processor is configured to, every 10 minutes, filter the signal with each of two or more of the filters and in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
15. The apparatus according to claim 12, wherein the computer processor is configured to, every 15 minutes, filter the signal with each of two or more of the filters and in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
16. The apparatus according to claim 1, wherein the apparatus is configured to be used in combination with an electrocardiography (ECG) signal, and wherein the sensor is configured to monitor a cardiac-related signal of the subject, and the computer processor is further configured to analyze the sensor signal and in response thereto extract one or more cardiac events which correlate with the ECG signal, the one or more cardiac events selected from the group consisting of: mitral valve closure, aortic valve opening, systolic ejection, aortic valve closure, and mitral valve opening.
17. The apparatus according to claim 16, wherein the computer processor is configured to use the identified one or more events to monitor mechanical functioning of the subject's heart.
18. The apparatus according to claim 17, wherein the computer processor is configured to use the identified one or more events to measure the subject's left ventricular ejection time.
19. The apparatus according to claim 16, wherein the computer processor is configured to, in combination with the ECG signal, use the identified one or more events to analyze the subject's cardiac cycle.
20. The apparatus according to claim 1, wherein the processor selects a subset of filters based upon the first respective quality of each of the filtered signals, and wherein when the subject has undergone motion, the processor re-filters the signal with each of two or more of the plurality of filters selected from the subset of filters.
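The filter-selection step recited in claims 9–10 — choosing the filter whose output has the highest ratio of main lobe to side lobes — can be sketched in Python. The quality score below (a spectral peak-to-sidelobe ratio over an assumed plausible heart-rate band of 0.7–3.0 Hz) and all function names are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def quality(filtered, fs, band=(0.7, 3.0)):
    # Peak-to-sidelobe ratio: spectral peak inside the assumed
    # heart-rate band divided by the largest peak outside it
    # (DC and near-DC excluded).
    spec = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    main_lobe = spec[in_band].max()
    side_lobes = spec[~in_band & (freqs > 0.1)].max()
    return main_lobe / side_lobes

def select_filter(signal, filters, fs):
    # Filter the raw sensor signal with every candidate FIR kernel and
    # return the index of the kernel whose output scores highest.
    scores = [quality(np.convolve(signal, h, mode="same"), fs)
              for h in filters]
    return int(np.argmax(scores))
```

On detecting motion (claim 1), the same `select_filter` call would simply be re-run over the candidate set, or, per claim 20, over a previously chosen subset of the filters.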
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF EMBODIMENTS
(11) Reference is made to
(12) Subject-monitoring apparatus 20 comprises a sensor 22 (e.g., a motion sensor) that is configured to monitor subject 24. Sensor 22 may be a motion sensor that is similar to sensors described in U.S. Pat. No. 8,882,684 to Halperin, which is incorporated herein by reference. The term “motion sensor” refers to a sensor that senses the subject's motion (e.g., motion due to the subject's cardiac cycle, respiratory cycle, or large-body motion of the subject), while the term “sensor” refers more generally to any type of sensor, e.g., a sensor that includes an electromyographic sensor and/or an imaging sensor.
(13) Typically, sensor 22 includes a sensor that performs monitoring of the subject without contacting the subject or clothes the subject is wearing, and/or without viewing the subject or clothes the subject is wearing. For example, the sensor may perform the monitoring without having a direct line of sight of the subject's body, or the clothes that the subject is wearing, and/or without any visual observation of the subject's body, or the clothes that the subject is wearing. Further typically, the sensor performs monitoring of the subject without requiring subject compliance (i.e., without the subject needing to perform an action to facilitate the monitoring that would not have otherwise been performed). It is noted that, prior to the monitoring, certain actions (such as purchasing the sensor, placing the sensor under the subject's mattress, downloading software for use with the subject-monitoring apparatus, and/or configuring software for use with the subject-monitoring apparatus) may need to be performed. The term “without requiring subject compliance” should not be interpreted as excluding such actions. Rather the term “without requiring subject compliance” should be interpreted as meaning that, once the sensor has been purchased, placed in a suitable position and activated, the sensor can be used to monitor the subject (e.g., to monitor the subject during repeated monitoring sessions), without the subject needing to perform any actions to facilitate the monitoring that would not have otherwise been performed.
(14) For some applications, sensor 22 is disposed on or within the subject's bed, and configured to monitor the subject automatically, while the subject is in their bed. For example, sensor 22 may be disposed underneath the subject's mattress 26, such that the subject is monitored while she is lying upon the mattress, and while carrying out her normal sleeping routine, without the subject needing to perform an action to facilitate the monitoring that would not have otherwise been performed.
(15) A computer processor 28, which acts as a control unit that performs the algorithms described herein, analyzes the signal from sensor 22. Typically, computer processor 28 communicates with a memory 29. For some applications, computer processor 28 is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein (e.g., by downloading a dedicated application or program to the device), such that the computer processor acts as a special-purpose computer processor. For some applications, as shown in
(16) For some applications, the subject (or another person, such as a care-giver) communicates with (e.g., sends data to and/or receives data from) computer processor 28 via a user interface device 35. As described, for some applications, the computer processor is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein. For such applications, components of the device (e.g., the touchscreen, the mouse, the keyboard, the speakers, the screen) typically act as user interface device 35. Alternatively, as shown in
(17) For some applications, the user interface includes an input device such as a keyboard 38, a mouse 40, a joystick (not shown), a touchscreen device (such as smartphone 36 or tablet device 34), a touchpad (not shown), a trackball (not shown), a voice-command interface (not shown), and/or other types of user interfaces that are known in the art. For some applications, the user interface includes an output device such as a display (e.g., a monitor 42, a head-up display (not shown) and/or a head-mounted display (not shown)), and/or a different type of visual, text, graphics, tactile, audio, and/or video output device, e.g., speakers, headphones, smartphone 36, or tablet device 34. For some applications, the user interface acts as both an input device and an output device. For some applications, the processor generates an output on a computer-readable medium (e.g., a non-transitory computer-readable medium), such as a disk, or a portable USB drive.
(18) Reference is now made to
(19) It is noted that blanket 50 can be an over-blanket that is placed over the subject's body, or an under-blanket that is placed above the subject's mattress and beneath the subject (as shown). Furthermore, the scope of the present invention includes any temperature control device that includes first and second sections corresponding to respective portions of a body of a single subject, for use with a temperature-regulation unit that regulates the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature. For example, the temperature control device may include a mattress (e.g., an electric mattress), which includes built-in heating pads.
(20) As described hereinabove, thermoregulation during sleep affects sleep quality. Moreover, as described in the Kräuchi article, for example, selective vasodilation of distal skin regions (and hence heat loss) may promote the onset of sleep. For some applications, the computer processor drives the temperature-regulation unit to regulate the temperatures of respective portions of the subject's body to be at respective temperatures, in the manner described herein, such as to improve sleep quality, shorten sleep latency, and/or better maintain sleep continuity. For example, the computer processor may drive the temperature-regulation unit to heat the subject's legs and/or arms to a greater temperature than the subject's trunk. For some applications, the computer processor may drive the temperature-regulation unit to cool one or more portions of the subject's body. For some applications, the computer processor drives the temperature-regulation unit to heat and/or cool respective portions of the subject's body to respective temperatures, in response to the subject's sleep stage, which is detected automatically by analyzing the sensor signal from sensor 22.
(21) Reference is now made to
(22) For some applications, in response to the computer processor determining that the subject is at the start of a sleeping session (e.g., in a falling-asleep stage or a beginning-sleep stage), the computer processor drives the temperature-regulation unit to heat distal parts of the subject's body (e.g., the subject's arms and/or legs) to a higher temperature than the subject's trunk. Typically, the computer processor will use different temperature profiles for different sleep states. For example, when the subject is in slow wave sleep, the computer processor may drive the temperature-regulation unit to keep temperatures lower than during other phases of the subject's sleep. Alternatively or additionally, when the subject wakes up during the night the computer processor may use a similar profile to that used when the subject is initially trying to fall asleep.
(23) For some applications, the computer processor drives the temperature-regulation unit to warm the subject's trunk in order to gently wake up the subject. For example, the computer processor may use trunk warming to wake up the subject, based on having received an input of a desired time for the subject to wake up (e.g., via the user interface), or based on detecting that the current sleep phase of the subject is such that it would be a good time to wake up the subject.
(24) For some applications, a user designates temperature profiles corresponding to respective sleep stages, via a user input into the computer processor. Typically, the temperature profile of any sleep stage will include respective temperatures for respective portions of the subject's body, and/or differences between the temperatures to which respective portions are heated or cooled. Alternatively or additionally, the computer processor utilizes a machine learning algorithm, based upon which the computer processor analyzes the subject's response to different temperature profiles at different sleep stages and learns which temperature profiles at which sleep phases result in the best quality sleep for the subject. Typically, for such applications, based upon the aforementioned analysis, the computer processor automatically designates temperature profiles to respective sleep stages.
(25) As described hereinabove, for some applications, the computer processor additionally adjusts the temperature of an additional room-climate regulation device, such as an air-conditioning unit, an electric heater, and/or a radiator. For example, an air-conditioning unit may be used to provide additional control of the temperature of the subject's trunk by controlling the temperature of the air that the subject inhales. For some applications, the temperature profiles of the respective sleep stages include a setting for the additional room-climate regulation device.
(26) Referring again to
(27) For some applications, the temperature control device is a portion of a blanket or a mattress that is suitable for being used by two subjects (e.g., partners in a double bed). Even for such applications, a portion of the blanket or mattress that is configured to be placed underneath or over a single subject (e.g., a left half of the blanket, or a left half of the mattress) includes at least first and second sections (e.g., a trunk section corresponding to the subject's trunk, leg sections corresponding to the subject's legs, and/or arm sections corresponding to the subject's arms), and the temperature-regulation unit regulates the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature, and, optionally, setting the temperature of additional sections of the temperature control device to further respective temperatures.
(28) Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For example, the computer processor may drive the user interface to prompt the subject to input changes to the temperature profiles corresponding to respective sleep stages, in response to a change in a relevant parameter. For example, in response to a change in season, an ambient temperature, an ambient humidity, and/or a going-to-sleep time (e.g., the subject is going to bed at an unusual time), the computer processor may drive the user interface to prompt the subject to re-enter his/her temperature profiles. (The computer processor may identify the change of the relevant parameter in a variety of ways, such as, for example, by receiving input from a sensor, or by checking the internet.)
(29) For some applications, in response to analyzing the sensor signal, the computer processor calculates a sleep score of the subject. For example, the computer processor may calculate a score from one or more parameters such as a time to fall asleep, duration of sleep, or "sleep efficiency," which is the percentage of in-bed time during which the subject is sleeping. For some applications, the score is calculated using one or more of the aforementioned parameters, such that a higher sleep score is indicative of a more restful sleeping session relative to a lower sleep score. The computer processor may then compare the sleep score to a baseline value, e.g., an average sleep score over a previous period of time. In response to the calculated sleep score being lower than the baseline value, the computer processor may drive the user interface to prompt the subject to re-enter new temperature profiles for respective sleep stages, since it is possible that the temperature profiles were a contributing factor in the subject's low sleep score. Alternatively or additionally, the computer processor may drive the user interface to prompt the subject to input at least one factor that may have caused the low sleep score. The computer processor then controls the heating device in response to the input.
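The sleep-score computation of paragraph (29) can be sketched as follows. The weights, the 8-hour duration target, and the baseline margin are illustrative assumptions; the patent names the parameters but not a formula:

```python
def sleep_score(minutes_to_fall_asleep, minutes_asleep, minutes_in_bed):
    # Sleep efficiency: fraction of in-bed time spent sleeping.
    efficiency = minutes_asleep / minutes_in_bed
    # Shorter sleep latency scores higher (zero credit beyond an hour).
    latency = max(0.0, 1.0 - minutes_to_fall_asleep / 60.0)
    # Duration relative to an assumed 8-hour target, capped at 1.
    duration = min(1.0, minutes_asleep / 480.0)
    return 100.0 * (0.5 * efficiency + 0.25 * latency + 0.25 * duration)

def should_prompt_for_new_profiles(score, previous_scores, margin=5.0):
    # Compare against the baseline (here, the mean of prior sessions);
    # prompt the subject to re-enter temperature profiles only when the
    # score falls clearly below it.
    baseline = sum(previous_scores) / len(previous_scores)
    return score < baseline - margin
```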
(30) In some applications, the computer processor computes a measure of relaxation, i.e., a relaxation score, for the subject, one or more times during a sleeping session. For example, a high relaxation score may be computed if the subject shows little movement, and little variation in both respiration rate and respiration amplitude. The relaxation score may be used to compute the sleep score. Alternatively or additionally, in response to a low relaxation score, the computer processor may immediately adjust the temperature of sections of the temperature control device.
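A minimal relaxation score along the lines of paragraph (30) — high when movement is rare and respiration rate and amplitude are steady — might look like the following; the functional form is an assumption:

```python
import numpy as np

def relaxation_score(movement_events, resp_rates, resp_amplitudes):
    # Each factor is 1.0 in the ideal case (no movement, perfectly
    # steady respiration) and decays toward 0 as movement or
    # variability grows.
    movement = 1.0 / (1.0 + np.mean(movement_events))
    rate_stability = 1.0 / (1.0 + np.std(resp_rates))
    amp_stability = 1.0 / (1.0 + np.std(resp_amplitudes))
    return movement * rate_stability * amp_stability
```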
(31) In some applications, in response to a low sleep score, the computer processor adjusts the temperature profiles even without any input from the user, or the computer processor generates an output (e.g., via user interface device 35) that includes suggested temperature profiles, which the subject may edit and/or confirm via the user interface.
(32) For some applications, when the temperature control device is initially used by the subject, the computer processor is configured to perform a “sweep” (or “optimization routine”) over a plurality of different temperature profiles at respective sleep stages, in order to ascertain which profiles at which sleep stages are conducive to a higher sleep score, relative to other settings, e.g., which setting maximizes the sleep score. For example, over the course of several sleeping sessions, the computer processor may change the temperature profiles that are used at respective sleep stages in different ways, and in response thereto, determine the optimal temperature profiles.
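The "sweep" of paragraph (32) amounts to trying each candidate temperature profile at each sleep stage and keeping the best scorer per stage. A sketch under assumed names (in practice, `session_score` would be accumulated over several sleeping sessions rather than evaluated directly):

```python
def sweep_temperature_profiles(profiles, stages, session_score):
    # session_score(stage, profile) is assumed to return the sleep
    # score observed when `profile` was used during `stage`.
    best = {}
    for stage in stages:
        best[stage] = max(profiles,
                          key=lambda profile: session_score(stage, profile))
    return best
```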
(33) Additional techniques as described in WO 16/035073 to Shinar, which is incorporated herein by reference, may be practiced in combination with the apparatus and methods described herein.
(34) Reference is now made to
(35) For some applications, sensor 22 is configured to monitor the subject during a sleeping session of the subject. The computer processor receives and analyzes the sensor signal (step 70). Based on the analysis of the signal, the computer processor identifies the positions of the subject's body at respective times during the sleeping session (step 72). For example, the system may identify when during the sleeping session the subject was lying on his/her side, when during the sleeping session the subject was lying on his/her back (i.e., supine), and when during the sleeping session the subject was lying on his/her stomach. For some applications, the computer processor determines the positions of the subject's body by analyzing the sensor signal using analysis techniques as described in U.S. Pat. No. 8,821,418 to Meger, which is incorporated herein by reference. For some applications, when the computer processor is first used for monitoring sleep apnea events, in accordance with the procedure shown in
(36) In addition, based upon the analysis of the sensor signal, the computer processor identifies apnea events that occur during the sleeping session (step 74). For example, the computer processor may identify apnea events by analyzing the sensor signal using analysis techniques as described in US 2007/0118054 to Pinhas (now abandoned), which is incorporated herein by reference. In step 76, the computer processor identifies a correspondence between positions of the subject and occurrences of apnea events of the subject during the sleeping session. The computer processor typically generates an output on an output device (e.g., any one of the output devices described with reference to
(37) For example, the computer processor may generate an indication of:
(38) (a) which positions cause the subject to undergo apnea events (e.g., “Sleeping on your back causes apnea events to occur”),
(39) (b) a recommended position for the subject to assume while sleeping (e.g. “Try sleeping on your side”), and/or
(40) (c) recommended steps to take in order to reduce the likelihood of apnea events occurring (e.g., “Try sleeping with a ball strapped to your back”).
(41) For some applications, the analysis of the sensor signal (step 70), the identification of subject positions (step 72), the identification of apnea events (step 74), and/or the identification of correspondence between the apnea events and the subject positions (step 76) are performed in real time, as the sensor signal is received by the processor. Alternatively, one or more of the aforementioned steps are performed subsequent to the sleeping session.
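Steps 70–76 reduce to tallying apnea events against the position the subject held when each event occurred. A sketch, normalising by time spent in each position so that positions occupied for different durations are comparable (the interval representation and names are assumptions):

```python
def apnea_rate_by_position(position_intervals, apnea_times):
    # position_intervals: (start_s, end_s, position) tuples covering
    # the sleeping session; apnea_times: event timestamps in seconds.
    # Returns apnea events per hour spent in each position.
    seconds_in = {}
    events_in = {}
    for start, end, position in position_intervals:
        seconds_in[position] = seconds_in.get(position, 0.0) + (end - start)
        events_in[position] = events_in.get(position, 0) + sum(
            start <= t < end for t in apnea_times)
    return {position: events_in[position] / (seconds_in[position] / 3600.0)
            for position in seconds_in}
```

The position with the highest rate is the natural candidate for the "Sleeping on your back causes apnea events to occur" style of output.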
(42) For some applications, in response to detecting that the subject is lying in a given position that the processor has determined to cause the subject to undergo apnea events, the computer processor generates an alert and/or nudges the subject to change positions. For example, in response to detecting that the subject is in a supine position (and having determined that lying in this position causes the subject to undergo apnea events), the computer processor may cause the subject's bed to vibrate, or may adjust the tilt angle of the bed or a portion thereof.
(43) For some applications, techniques described herein are practiced in combination with techniques described in US 2007/0118054 to Pinhas, which is incorporated into the present application by reference. For example, the apparatus described herein may be used with a bed or mattress with an adjustable tilt angle, and/or an inflatable pillow which, when activated, inflates or deflates to vary the elevation of the head of the subject as desired. For some applications, in response to detecting that the subject is lying in a given position that the processor has determined to cause the subject to undergo apnea events, the pillow's air pressure level is changed, and/or the tilt angle of the bed or the mattress is changed, in order to change the patient's posture and prevent an upcoming apnea event, or stop a currently-occurring apnea event.
(44) Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For some applications, a processor as described with reference to
(45) Typically, the subject is more likely to snore, cough, or have an apnea episode when the subject is in a supine position. The computer processor reduces the frequency of snoring, coughing, and/or apnea of subject 24 by encouraging (e.g., by “nudging”) the subject to move from a supine position to a different position.
(46) As described hereinabove the computer processor identifies the subject's sleeping position by analyzing the sensor signal from sensor 22. In response to the identified sleeping position, e.g., in response to the identified posture being a supine position, the computer processor drives the vibrating mechanism to vibrate, and/or adjusts a parameter (e.g., an angle) of the surface upon which the subject is lying. The vibration typically nudges the subject to change his posture, while the adjustment of the parameter may nudge the subject to change his posture or actually move the subject into the new posture.
(47) In some applications, an inflatable pillow is used and the computer processor adjusts a level of inflation of the inflatable pillow. For example, to inhibit coughing and/or snoring, the computer processor may drive an inflating mechanism to inflate the inflatable pillow, by communicating a signal to the inflating mechanism.
(48) As described hereinabove, for some applications, the computer processor is configured to identify a sleep stage of the subject. For some such applications, the computer processor drives the vibrating mechanism to vibrate, and/or adjusts the parameter of the resting surface, further in response to the identified sleep stage. For example, the computer processor may drive the vibrating mechanism to vibrate, and/or adjust the parameter of the resting surface, in response to the identified sleep stage being within 5 minutes of an onset or an end of an REM sleep stage, since at these points in time, the “nudging” or moving is less likely to disturb the subject's sleep.
(49) Reference is now made to
(50) Typically, the computer processor derives vital signs of the subject (such as heart rate, respiratory rate, and/or heart-rate variability) from the sensor signal. For some applications, the computer processor compares the subject's vital signs to a baseline of the subject that was derived during previous occasions when the subject operated the vehicle. In response thereto, the computer processor may determine that the subject's vital signs have changed substantially from the baseline, and/or that the subject is unwell, drowsy, asleep, or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
(51) For some applications, the computer processor integrates the analysis of the sensor signal from sensor unit 80 with the analysis of a sensor signal from an additional sensor, which may be disposed in the subject's bed, for example. For example, the computer processor may determine that the subject has not had enough sleep based upon the analysis of the signals from both sensors. Or, the computer processor may derive, from the combination of the sensor signals, that the subject has had enough sleep, but appears to be unwell, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
(52) For some applications, sensor units 80 are disposed underneath more than one seat in the vehicle. For example, sensor units may be disposed underneath the seats of a pilot and a co-pilot in an airplane (e.g., as described in WO 16/035073 to Shinar, which is incorporated herein by reference). Or, sensor units may be disposed underneath each of the seats in an airplane or a car. Based upon the sensor signals from the sensor units, the computer processor may determine that a child has been left alone in a car, and may generate an alert in response thereto. For example, the alert may be generated on the driver's and/or parents' cellular phone(s). Alternatively or additionally, the computer processor may determine the number of people in the car. (It is noted that the sensor is typically configured to distinguish between a person who is disposed upon the seat and an inanimate object (such as a suitcase, or backpack) that is disposed upon the seat.) In response thereto, the computer processor may generate seatbelt alerts, for example. Alternatively or additionally, the computer processor may automatically communicate with the billing system of a toll road for which prices are determined based upon the number of passengers in the car.
(53) Typically, in order to facilitate the above-described applications, sensor unit 80 is configured to generate a sensor signal that is such that the computer processor is able to distinguish between artifacts from motion of the vehicle, and motion that is indicative of physiological parameters of the subject. Typically, the sensor unit includes (a) a housing, (b) at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle, and (c) at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle. The computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
(54) For some applications, the first motion sensor is disposed within the housing, such that the first motion sensor is isolated from the motion of the subject, and/or such that the first motion sensor only detects motion that is due to motion of the vehicle. The computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
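The two-sensor subtraction described above can be illustrated with a short sketch. Python is used here for illustration only, and the least-squares scaling of the vehicle signal is an assumed coupling model, not necessarily the implementation used in practice:

```python
# Illustrative sketch: removing vehicular motion from a seat-sensor signal.
# `vehicle` is the first sensor signal (vehicle motion only); `combined` is
# the second sensor signal (vehicle motion plus subject motion). The
# least-squares scale factor is an assumption for this sketch.

def subtract_vehicle_motion(vehicle, combined):
    """Estimate the vehicular component of `combined` by least-squares
    scaling of `vehicle`, and subtract it, leaving subject motion."""
    num = sum(v * c for v, c in zip(vehicle, combined))
    den = sum(v * v for v in vehicle)
    scale = num / den if den else 0.0
    return [c - scale * v for v, c in zip(vehicle, combined)]
```

For example, if the second sensor picks up the vehicle signal at twice the amplitude of the first sensor, plus subject motion that is uncorrelated with the vehicle signal, the residual recovers the subject motion.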
(55) Reference is now made to
(56) As shown in
(57) Typically, fluid compartment 94 isolates first motion sensor 96 from motion of the subject who is sitting on the seat, such that motion sensor 96 only detects motion that is due to motion of the vehicle. Second motion sensor(s) detects both motion of the vehicle, and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing. Thus, the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal, and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
(58) As shown in
(59) Typically, the rigidity of the rigid portion of the housing isolates first motion sensor(s) 106 from motion of the subject who is sitting on the seat, such that first motion sensor(s) 106 only detects motion that is due to motion of the vehicle. Second motion sensor(s) detects both motion of the vehicle, and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing. Thus, the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
(60) Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For some applications, a sensor unit as described with reference to
(61) (a) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies an elevated stress level of a subject, e.g., by identifying an elevated heart rate, and/or a decreased stroke volume, e.g., as described in WO 2015/008285 to Shinar, which is incorporated herein by reference. For example, in response to the pilot experiencing an elevated stress level, the computer processor may generate an alert to another member of the flight crew, and/or individuals on the ground. The computer processor may also analyze the signal of the co-pilot, and generate an alert in response to both the pilot and co-pilot experiencing an elevated stress level, since the presence of an elevated stress level in both individuals at the same time is likely to be indicative of an emergency situation. Similarly, an alert may be generated if two or more passengers experience an elevated stress level at the same time.
(62) (b) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is likely that the subject is experiencing, or will soon experience, a clinical event, such as a heart attack. For example, if the pilot or one of the passengers is experiencing a heart attack, members of the flight crew, and/or a physician who is travelling on the airplane, may be alerted to the situation.
(63) (c) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is at least somewhat likely that the subject is a carrier of a disease, such as severe acute respiratory syndrome (SARS). For example, if the computer processor identifies a change in the baseline heart rate of the subject without any correlation to motion of the subject, the computer processor may ascertain that the subject has likely experienced a rapid change in body temperature, which may indicate that the subject is sick. (The baseline heart rate is typically an average heart rate over a period of time, e.g., 1-2 hours.) In response, the computer processor may alert the flight crew to isolate the subject.
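The baseline-shift check in paragraph (63) can be sketched as a simple rule. The function names, the heart-rate-rise threshold, and the motion threshold below are illustrative assumptions:

```python
def baseline_heart_rate(recent_hrs):
    """Baseline heart rate as an average of readings collected over a
    period of time (e.g., 1-2 hours)."""
    return sum(recent_hrs) / len(recent_hrs)

def flag_possible_illness(baseline_hr, current_hr, motion_level,
                          hr_rise_bpm=8.0, motion_threshold=0.2):
    """Flag a heart-rate rise above baseline that has no correlated rise
    in body motion (both thresholds are hypothetical placeholders)."""
    return (current_hr - baseline_hr >= hr_rise_bpm
            and motion_level < motion_threshold)
```

A rise in heart rate accompanied by vigorous motion would not be flagged, since exertion explains it; the same rise while the subject sits still would be.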
(64) (d) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that the subject (in particular, the pilot or co-pilot) is drowsy or sleeping.
(65) (e) A sleep study may be performed. For example, the computer processor may analyze the sensor signals from various passengers, and identify which passengers were sleeping at which times. In response, the computer processor may generate an output to help the airline improve the sleeping conditions on their aircraft (e.g., by reducing lighting, or increasing leg room).
(66) The computer processor may also control the lighting, temperature, or other cabin-environment parameters, in order to facilitate a more pleasant travelling experience. For example, upon detecting that a significant number of passengers are sleeping or are trying to fall asleep, the lights in the cabin may be dimmed, and/or the movie that is playing may be stopped. Alternatively or additionally, meals may be served to the passengers only if a given number of passengers are awake. To help prevent deep vein thrombosis (DVT), passengers may be prompted to stand up and take a walk, if the computer processor detects that they have been sitting in place for too long.
(67) Reference is now made to
(68) Subject-monitoring apparatus 20 comprises a sensor 22, which is generally as described hereinabove, and is configured to monitor subject 24. Subject-monitoring apparatus 20 includes a control unit, which is typically a computer processor, such as computer processor 28 described hereinabove. As described hereinabove, the computer processor typically communicates with a memory 29. The computer processor is typically a control unit that performs the algorithms described herein, including analyzing the signal from sensor 22. It is noted that, in general, in the specification and claims of the present application, the terms “computer processor” and “control unit” are used interchangeably, since steps of the techniques described herein are typically performed by a computer processor that functions as a control unit. Therefore, the present application refers to component 28 both as a “computer processor” and a “control unit.”
(69) In response to analyzing the signal from sensor 22, computer processor 28 controls a property (e.g., the content, genre, volume, frequency, and/or phase-shift) of a sound signal, and drives a speaker 110 to play the sound signal. Typically, as described hereinbelow, the property of the sound signal is controlled such as to help the subject fall asleep or remain asleep.
(70) For example, if the subject is trying to fall asleep, the computer processor may select a sound signal of the “relaxing nature sounds” genre, and may further select the content of the signal to be the sound of waves hitting the seashore. The computer processor may further set the frequency of the sound signal (e.g., the frequency of the waves) to an offset less than the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate. In some applications, the computer processor controls the offset, in response to analyzing the sensor signal; for example, as the heart rate of the subject approaches a target “relaxed” heart rate, the computer processor may reduce the offset, such that the frequency of the sound signal is very close to or identical with the subject's heart rate. As the subject begins to fall asleep, the computer processor may reduce the volume of the sound signal.
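The frequency-offset behavior described above might be sketched as follows. The maximum offset value and the policy of shrinking the offset in proportion to the remaining gap are assumptions for illustration:

```python
def sound_frequency(heart_rate_bpm, target_bpm, max_offset=6.0):
    """Drive the sound signal (e.g., the wave frequency, in events per
    minute) slightly below the current heart rate; shrink the offset as
    the heart rate approaches the target 'relaxed' rate, so that the
    frequency ends up very close to the heart rate itself."""
    gap = max(heart_rate_bpm - target_bpm, 0.0)
    offset = min(max_offset, gap)  # the offset never exceeds the remaining gap
    return heart_rate_bpm - offset
```

With a target of 60 bpm, a subject at 80 bpm hears a signal at 74 events per minute; at 62 bpm the offset has shrunk so that the signal runs at the target rate.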
(71) In some applications, the computer processor controls a phase-shift of the sound signal with respect to a cardiac signal and/or a respiratory signal of the subject. For example, the computer processor may cause the sound of a wave hitting the seashore to occur a given amount of time (e.g., 300 milliseconds) before or after each heartbeat of the subject, or a given amount of time (e.g., 1 second) after each expiration of the subject.
(72) In some applications, the computer processor ascertains that the subject is trying to fall asleep, at least in response to analyzing the sensor signal. For example, by analyzing the sensor signal, the computer processor may ascertain that the subject is awake and is exhibiting a large amount of movement indicative of restlessness in bed. Alternatively or additionally, the ascertaining is in response to one or more other factors, such as a signal from a light sensor that indicates a low level of ambient light in the room, and/or the time of day. In response to ascertaining that the subject is trying to fall asleep, the computer processor controls the property of the sound signal, as described hereinabove.
(73) In some applications, by analyzing the sensor signal, the computer processor ascertains a sleep stage of the subject, and controls the property of the sound signal in response to the ascertained sleep stage. For example, in response to ascertaining that the subject has entered a slow-wave (i.e., deep) sleep stage, the volume of the sound signal may be reduced to a relatively low level (e.g., zero). (In identifying a sleep stage of a subject, as described throughout the present application, the computer processor may use one or more of the techniques described in (a) US 2007/0118054 to Pinhas (now abandoned), (b) Shinar et al., Computers in Cardiology 2001; Vol. 28: 593-596, and/or (c) Shinar Z et al., “Identification of arousals using heart rate beat-to-beat variability,” Sleep 21(3 Suppl):294 (1998), each of which is incorporated herein by reference.)
(74) Typically, the computer processor controls the property of the sound signal further in response to a historical physiological parameter of the subject that was exhibited in response to a historical sound signal. For example, the computer processor may “learn” the subject's typical responses to particular sound-signal properties, and control the sound signal in response thereto. Thus, for example, if the subject has historically responded well to a “relaxing nature sounds” genre, but less so to a “classical music” genre, the computer processor may select the former genre for the subject. To determine whether the subject has historically responded well to particular properties of the sound signal, the computer processor looks at some or all of the subject's historical physiological parameters, such as a quality of sleep, a time-to-fall-asleep, a heart-rate-variability, a change in heart rate, a change in respiratory rate, a change in heart-rate-variability, a change in blood pressure, a rate of change in heart rate, a rate of change in respiratory rate, a rate of change in heart-rate-variability, and a rate of change in blood pressure.
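The "learning" step can be sketched as scoring each sound-signal property against the historical parameters and choosing the best performer. The data layout and the simple-average scoring policy are assumptions of this sketch:

```python
def select_genre(history):
    """`history` maps each genre to a list of historical response scores
    (e.g., scores derived from quality of sleep or time-to-fall-asleep
    observed when that genre was played). Pick the genre whose average
    historical score is best."""
    def average(scores):
        return sum(scores) / len(scores)
    return max(history, key=lambda genre: average(history[genre]))
```

A subject who historically responded well to nature sounds but less so to classical music would thus be served the former.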
(75) In some applications, the computer processor controls the frequency of the sound signal by synthesizing the sound signal, or by selecting a pre-recorded sound signal that has the desired frequency; in other words, the computer processor selects the content of the signal, without the user's input. In other applications, the computer processor selects content of the sound signal in response to a manual input, e.g., an input entered via user interface device 35 (
(76) For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of light (such as intensity, flicker frequency, or color) emitted by a light 112 in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the computer processor may select a light signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep. Alternatively or additionally, the computer processor may modulate the property of the light at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal. Further alternatively or additionally, the computer processor may ascertain a sleep stage of the subject, and modulate the property of the light in response to the ascertained sleep stage. For some applications, the computer processor controls the property of the light further in response to a historical physiological parameter of the subject that was exhibited in response to a historical light signal. For example, the computer processor may “learn” the subject's typical responses to particular light-signal properties, and control the light in response thereto. The computer processor may control parameters of light 112, as an alternative to, or in addition to, controlling properties of the sound that is generated by speaker 110.
(77) For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of light (such as intensity, flicker frequency, or color) that is emitted by a screen 122 of a device that the subject is using in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the device may be a laptop computer 32 (
(78) For some applications, a vibrating element 126 is disposed underneath a surface of chair 111 upon which the subject sits. Alternatively (not shown), a vibrating element may be disposed underneath the surface of the bed upon which the subject lies. For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of the vibration (such as vibrating frequency, or a strength of vibration) that is applied to the subject by the vibrating element, in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the computer processor may select a vibration signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep. Alternatively or additionally, the computer processor may modulate the property of the vibration at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal. Further alternatively or additionally, the computer processor may ascertain a sleep stage of the subject, and modulate the property of the vibration in response to the ascertained sleep stage. For some applications, the computer processor controls the property of the vibration further in response to a historical physiological parameter of the subject that was exhibited in response to a historical vibration signal. For example, the computer processor may “learn” the subject's typical responses to particular vibration-signal properties, and control the vibrating element in response thereto. 
The computer processor may control parameters of the vibration of the vibrating element, as an alternative to, or in addition to, controlling parameters of the sound that is generated by speaker 110, and/or light that is generated by light 112 or by screen 122.
(79) It is noted that, typically, for any of the embodiments described with reference to
(80)
(81) Reference is now made to
(82) For some applications, the computer processor selects a subset of heartbeats, based upon the qualities of the heartbeats, and some steps of the subsequent analysis (as described herein) are performed only with respect to the subset of heartbeats. For some applications, an interbeat interval is calculated and/or is selected for use in subsequent analysis only in cases in which both of the two consecutive heartbeats that define it have a quality indicator that exceeds a threshold. The computer processor analyzes the selected interbeat intervals over a period of time, and in response thereto, determines whether the subject is healthy or is suffering from arrhythmia, determines which type of arrhythmia the subject is suffering from, and/or identifies or predicts arrhythmia episodes. For example, the computer processor may build a histogram of the selected interbeat intervals and may perform the above-described steps by analyzing the histogram.
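The quality-gated interbeat-interval histogram can be sketched as follows. The use of beat times in milliseconds, the quality threshold, and the bin width are illustrative assumptions:

```python
def build_ibi_histogram(beat_times_ms, beat_qualities, q_threshold=0.8,
                        bin_width_ms=50):
    """Keep an interbeat interval only when both of its bounding heartbeats
    exceed the quality threshold, then bin the kept intervals into a
    histogram (mapping bin index -> count)."""
    hist = {}
    for i in range(1, len(beat_times_ms)):
        if (beat_qualities[i - 1] > q_threshold
                and beat_qualities[i] > q_threshold):
            ibi = beat_times_ms[i] - beat_times_ms[i - 1]
            bin_index = ibi // bin_width_ms
            hist[bin_index] = hist.get(bin_index, 0) + 1
    return hist
```

Intervals adjacent to a low-quality beat are simply dropped, so noisy beats do not distort the histogram that the arrhythmia analysis later inspects.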
(83)
(84) In accordance with the above, for some applications, in response to the computer processor identifying two distinct peaks in a histogram that is plotted using the techniques described herein (or performing an equivalent algorithmic operation), an alert is generated that an arrhythmia event may be taking place. Alternatively or additionally, the computer processor may generate an alert in response to identifying that the width of a peak of a histogram exceeds a threshold (or performing an equivalent algorithmic operation). For example, the width of the peak may be compared to a threshold that is determined based upon population averages according to the age and/or other indications of the subject (such as a level of fitness of the subject).
(85) For some applications, in response to the computer processor identifying two distinct peaks in a histogram that is plotted using the techniques described herein (or performing an equivalent algorithmic operation), the computer processor performs the following steps. The computer processor identifies heartbeats belonging to respective interbeat interval groups (i.e., which heartbeats had an interbeat interval that corresponds to a first one of the peaks, and which heartbeats had an interbeat interval corresponding to the second one of the peaks.) The average amplitude of the signal of each of these groups is then calculated. For some applications, the computer processor generates an output that is indicative of the average amplitude of each of the peaks, and/or the interbeat interval of each of the peaks. Alternatively or additionally, based upon these data, the computer processor automatically determines a condition of the subject. For example, the computer processor may determine which category of arrhythmia the subject is suffering from, e.g., atrial fibrillation or ventricular fibrillation.
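The grouping step described above might look like the following sketch; the tolerance around each peak is an assumed parameter:

```python
def peak_group_amplitudes(ibis_ms, amplitudes, peak1_ms, peak2_ms,
                          tolerance_ms=60):
    """Assign each heartbeat to the histogram peak that its interbeat
    interval falls near, and return the average signal amplitude of each
    of the two groups (None for an empty group)."""
    group1, group2 = [], []
    for ibi, amp in zip(ibis_ms, amplitudes):
        if abs(ibi - peak1_ms) <= tolerance_ms:
            group1.append(amp)
        elif abs(ibi - peak2_ms) <= tolerance_ms:
            group2.append(amp)
    def average(values):
        return sum(values) / len(values) if values else None
    return average(group1), average(group2)
```

The two per-group average amplitudes, together with the interbeat interval of each peak, are the data from which a condition such as the arrhythmia category might then be determined.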
(86) It is noted that, although the analysis of the interbeat intervals is described as being performed using histogram analysis, the techniques described herein may be combined with other types of analysis that would yield similar results, mutatis mutandis. For example, the computer processor may perform algorithmic steps that do not include a step of generating a histogram, but which analyze the subject's interbeat interval over time, in a similar manner to that described hereinabove.
(87) Reference is now made to
(88) Typically, the selection of which filter to use is repeated in response to certain events. For some applications, the selection of a filter is repeated if the signal quality falls below a threshold. Alternatively or additionally, the filter selection is repeated at fixed time intervals (e.g., once every 5, 10, or 15 minutes). Further alternatively or additionally, the filter selection is repeated in response to detecting motion of the subject, e.g., large body motion of the subject. For example, in response to the sensor signal indicating that the subject has undergone motion (e.g., large body motion), the computer processor may perform the filter selection.
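The selection-and-reselection policy described above can be sketched as below; the quality metric, quality threshold, and reselection interval are placeholders, not values taken from the specification:

```python
def select_filter(sensor_signal, filters, quality_of):
    """Filter the signal with each candidate filter and select the filter
    whose filtered output scores highest on the quality metric."""
    return max(filters, key=lambda f: quality_of(f(sensor_signal)))

def should_reselect(quality, last_selection_s, now_s, motion_detected,
                    q_threshold=0.5, interval_s=300):
    """Repeat the filter selection when the signal quality falls below a
    threshold, a fixed interval (e.g., 5 minutes) has elapsed, or motion
    of the subject has been detected."""
    return (quality < q_threshold
            or now_s - last_selection_s >= interval_s
            or motion_detected)
```

Running the whole filter bank on every sample would be wasteful, so reselection is triggered only by these events, and the winning filter is used alone in between.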
(89) Referring to
(90) In general, computer processor 28 may be embodied as a single computer processor 28, or a cooperatively networked or clustered set of computer processors. Computer processor 28 is typically a programmed digital computing device comprising a central processing unit (CPU), random access memory (RAM), non-volatile secondary storage, such as a hard drive or CD-ROM drive, network interfaces, and/or peripheral devices. Program code, including software programs, and data are loaded into the RAM for execution and processing by the CPU and results are generated for display, output, transmittal, or storage, as is known in the art. Typically, computer processor 28 is connected to one or more sensors via one or more wired or wireless connections. Computer processor 28 is typically configured to receive signals (e.g., motion signals) from the one or more sensors, and to process these signals as described herein. In the context of the claims and specification of the present application, the term “motion signal” is used to denote any signal that is generated by a sensor, upon the sensor sensing motion. Such motion may include, for example, respiratory motion, cardiac motion, or other body motion, e.g., large body-movement. Similarly, the term “motion sensor” is used to denote any sensor that senses motion, including the types of motion delineated above.
(91) Applications of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer-readable medium is a non-transitory computer-usable or computer-readable medium.
(92) Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
(93) A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements (e.g., memory 29) through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.
(94) Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
(95) Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like, and conventional procedural programming languages, such as the C programming language or similar programming languages.
(96) It will be understood that each block of the flowcharts shown in
(97) Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to
(98) Techniques described herein may be practiced in combination with techniques described in one or more of the following patents and patent applications, which are incorporated herein by reference. In some applications, techniques and apparatus described in one or more of the following patents and patent applications, which are incorporated herein by reference, are combined with techniques and apparatus described herein: U.S. patent application Ser. No. 11/048,100, filed Jan. 31, 2005, which issued as U.S. Pat. No. 7,077,810; U.S. patent application Ser. No. 11/197,786, filed Aug. 3, 2005, which issued as U.S. Pat. No. 7,314,451; U.S. patent application Ser. No. 11/446,281, filed Jun. 2, 2006, which issued as U.S. Pat. No. 8,376,954; U.S. patent application Ser. No. 11/552,872, filed Oct. 25, 2006, now abandoned, which published as US 2007/0118054; U.S. patent application Ser. No. 11/755,066, filed May 30, 2007, now abandoned, which published as US 2008/0114260; U.S. patent application Ser. No. 11/782,750, filed Jul. 25, 2007, which issued as U.S. Pat. No. 8,403,865; U.S. patent application Ser. No. 12/113,680, filed May 1, 2008, now abandoned, which published as US 2008/0275349; U.S. patent application Ser. No. 12/842,634, filed Jul. 23, 2010, which issued as U.S. Pat. No. 8,517,953; U.S. patent application Ser. No. 12/938,421, filed Nov. 3, 2010, which issued as U.S. Pat. No. 8,585,607; U.S. patent application Ser. No. 12/991,749, filed Nov. 9, 2010, which issued as U.S. Pat. No. 8,821,418; U.S. patent application Ser. No. 13/107,772, filed May 13, 2011, which issued as U.S. Pat. No. 8,491,492; U.S. patent application Ser. No. 13/305,618, filed Nov. 28, 2011, now abandoned, which published as US 2012/0132211; U.S. patent application Ser. No. 13/389,200, filed Jun. 13, 2012, now abandoned, which published as US 2012/0253142; U.S. patent application Ser. No. 13/750,957, filed Jan. 25, 2013, which issued as U.S. Pat. No. 8,603,010; U.S. patent application Ser. No. 
13/750,962, filed Jan. 25, 2013, which issued as U.S. Pat. No. 8,679,034; U.S. patent application Ser. No. 13/863,293, filed Mar. 15, 2013, now abandoned, which published as US 2013/0245502; U.S. patent application Ser. No. 13/906,325, filed May 30, 2013, which issued as U.S. Pat. No. 8,882,684; U.S. patent application Ser. No. 13/921,915, filed Jun. 19, 2013, which issued as U.S. Pat. No. 8,679,030; U.S. patent application Ser. No. 14/019,371, filed Sep. 5, 2013, which published as US 2014/0005502; U.S. patent application Ser. No. 14/020,574, filed Sep. 6, 2013, which issued as U.S. Pat. No. 8,731,646; U.S. patent application Ser. No. 14/054,280, filed Oct. 15, 2013, which issued as U.S. Pat. No. 8,734,360; U.S. patent application Ser. No. 14/150,115, filed Jan. 8, 2014, which issued as U.S. Pat. No. 8,840,564; U.S. patent application Ser. No. 14/231,855, filed Apr. 1, 2014, which issued as U.S. Pat. No. 8,992,434; U.S. patent application Ser. No. 14/454,300, filed Aug. 7, 2014, which issued as U.S. Pat. No. 8,942,779; U.S. patent application Ser. No. 14/458,399, filed Aug. 13, 2014, which issued as U.S. Pat. No. 8,998,830; U.S. patent application Ser. No. 14/474,357, filed Sep. 2, 2014, which published as US 2014/0371635; U.S. patent application Ser. No. 14/557,654, filed Dec. 2, 2014, issued as U.S. Pat. No. 9,026,199; U.S. patent application Ser. No. 14/631,978, filed Feb. 26, 2015, published as US 2015/0164438; U.S. patent application Ser. No. 14/624,904, filed Feb. 18, 2015, published as US 2015/0164433; U.S. patent application Ser. No. 14/663,835, filed Mar. 20, 2015, published as US 2015/0190087; U.S. patent application Ser. No. 14/810,814, filed Jul. 
28, 2015, published as US 2015/0327792; International Patent Application PCT/IL2005/000113, which published as WO 2005/074361; International Patent Application PCT/IL2006/000727, which published as WO 2006/137067; International Patent Application PCT/IB2006/002998, which published as WO 2007/052108; International Patent Application PCT/IL2008/000601, which published as WO 2008/135985; International Patent Application PCT/IL2009/000473, which published as WO 2009/138976; International Patent Application PCT/IL2011/050045, which published as WO 2012/077113; International Patent Application PCT/IL2013/050283, which published as WO 2013/150523; International Patent Application PCT/IL2014/050644, which published as WO 2015/008285; and International Patent Application No. PCT/IL2015/050880 to Shinar, which published as WO 16/035073.
(99) It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.