System and Method for Patient Monitoring

20220280085 · 2022-09-08

    Abstract

    The present disclosure provides a system and method for monitoring the cognitive state of a patient based on eye image data. The patient monitoring system comprises a camera unit configured for recording images of an eye of the patient, and a data processing sub-system in data communication with the camera and operable to (i) receive and process eye image data from said camera, (ii) classify said eye image data into gestures and identify such gestures as indicative of the cognitive state of the patient, and (iii) transmit a signal communicating said cognitive state to a remote unit. The system may further comprise an actuator module and an output unit, wherein said output may be an automated medical questionnaire.

    Claims

    1. A patient monitoring system for identifying a cognitive state of a patient, the system comprising a camera unit configured for recording images of an eye of the patient; a data processing sub-system in data communication with the camera and being operable to i. receive and process eye image data from said camera; ii. classify said eye image data into gestures and identify such gestures being indicative of the cognitive state of the patient; and iii. transmit a signal communicating said cognitive state to a remote unit; an actuator module; and an output unit; wherein said actuator module is configured to drive said output unit to present an output to the patient; and wherein said data processing sub-system records reactive image data representative of a reactive eye movement that follows said output, classifies said reactive image data into reactive gestures, and identifies such gestures as being indicative of the cognitive state of the patient.

    2. The system of claim 1, wherein the camera is carried on a head unit configured for fitting onto a patient's head.

    3. The patient monitoring system of claim 1, wherein the cognitive state is selected from wakefulness, delirium, cognitive decline, confusion, disorientation, abnormal attention, consciousness, pain, and depression.

    4. The patient monitoring system of claim 1, wherein the remote unit is an alert unit of an intensive care unit, a nurse unit, or a device carried by a caregiver.

    5. The patient monitoring system of claim 1, wherein the gesture is selected from opening of at least one eye, closing at least one eye, pupil position, sequence of pupil positions, and sequences of eyelid blinks.

    6. The patient monitoring system of claim 1, wherein the opening of at least one eye gesture is indicative of a wakeful state, and wherein an alert signal is transmitted to a nurse unit.

    7. The patient monitoring system of claim 1, wherein the signal to a remote unit is transmitted via wireless communication.

    8. (canceled)

    9. The patient monitoring system of claim 1, wherein said reactive gestures are indicative of the cognitive state of the patient.

    10. The patient monitoring system of claim 9 wherein said output is an automatic medical questionnaire.

    11. The patient monitoring system of claim 10, wherein said medical questionnaire is the Confusion Assessment Method for the ICU (CAM-ICU).

    12. The patient monitoring system of claim 10, wherein said medical questionnaire is a pain scale.

    13. The patient monitoring system of claim 10, wherein said medical questionnaire is an air hunger or breathing discomfort questionnaire.

    14. The patient monitoring system of claim 10, wherein the opening of at least one eye initiates the CAM-ICU.

    15. The patient monitoring system of claim 1, wherein the data processing sub-system is further operable to receive and classify one or more physiological parameters, and to identify such of said physiological parameters, or any combination of said gestures and physiological parameters, as being indicative of the cognitive state of the patient.

    16. A patient monitoring system comprising a plurality of patient monitoring systems of claim 1.

    17. The system of claim 16, further comprising a centralized processor being operable for receiving signals representative of said cognitive states from each of said patient monitoring systems and classifying such signals according to one or more defined criteria.

    18. A patient monitoring method for identifying a cognitive state of a patient, the method comprising a. recording image data of at least one of the patient's eyes; b. classifying said image data into gestures; c. identifying such gestures that are indicative of the cognitive state of the patient; d. transmitting a signal communicating said cognitive state to a remote unit; e. providing an output to the patient; f. recording reactive image data representative of a reactive eye movement that follows said output; g. classifying said reactive image data into reactive gestures; and h. identifying such gestures indicative of the cognitive state of the patient.

    19. (canceled)

    20. (canceled)

    21. The method of claim 18, further comprising a. receiving and classifying one or more physiological parameters; and b. identifying such of said gestures and physiological parameters, or any combination thereof, indicative of the cognitive state of the patient.

    22. (canceled)

    23. A patient monitoring system for identifying a cognitive state of a patient, the system comprising a camera unit configured for recording images of an eye of the patient; a data processing sub-system in data communication with the camera and being operable to i. receive and process eye image data from said camera; ii. classify said eye image data into gestures and identify such gestures being indicative of the delirium state of the patient based on defined criteria; and iii. transmit a signal communicating said cognitive state to a remote unit; an actuator module; and an output unit; wherein said actuator module is configured to drive said output unit to present an output to the patient; wherein said output is triggered by eye gestures classified by the system and comprises generic or personally generated auditory content; and wherein said data processing sub-system records reactive image data representative of a reactive eye movement that follows said output, classifies said reactive image data into reactive gestures, and identifies such gestures as being indicative of the delirium state of the patient based on defined criteria.

    24. The patient monitoring system of claim 23, wherein said output is selected from at least one of: questionnaires, sets of audible questions and answers, orientation messages, music, recordings of family members, and instructions on how to respond using eye gestures.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0135] In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

    [0136] FIG. 1 is a schematic block diagram of a system in accordance with a non-limiting embodiment of this disclosure.

    [0137] FIG. 2 is a schematic block diagram of a system in accordance with another non-limiting embodiment of this disclosure.

    [0138] FIG. 3 displays an exemplary embodiment of the system.

    [0139] FIG. 4 displays an exemplary embodiment of the system as may be worn by a patient.

    [0140] FIG. 5 shows an exemplary embodiment of a high-level system architecture.

    [0141] FIG. 6 shows an exemplary embodiment of a display screen, or dashboard, of a remote unit being a medical staff station.

    DETAILED DESCRIPTION OF EMBODIMENTS

    [0142] Reference is first made to FIG. 1, illustrating a schematic block diagram of a patient monitoring system in accordance with a non-limiting embodiment of this disclosure. The patient monitoring system 100 comprises a camera 104 mounted on a head unit 102 configured for fitting onto a patient's head. The camera may also be mounted on any fixed frame in the vicinity of the patient. The camera 104 is operable for continuously capturing images of one or both of the patient's eye and eyelid and generating image data representative thereof. The system 100 includes a parallel or distributed data processing sub-system 106 that is in data communication with the camera 104. The data processing sub-system 106 receives and processes eye image data from said camera,

    [0143] classifies said eye image data into gestures, and identifies such gestures as indicative of the cognitive state of the patient. The data processing sub-system 106 then transmits a signal communicating said cognitive state to a remote unit.

    [0144] As a non-limiting example, an ICU patient, hospitalized unconscious for several days, is being monitored by the patient monitoring system 100. A caregiver has placed the wearable head unit 102 onto the patient's head. Once the patient opens his eyes for the first time, his eye movement is captured by the camera 104. The image data from said camera 104 is received by the data processing sub-system 106. The image data is then classified into gestures; if an eye-opening gesture is classified, a wakeful state is indicated and a “call for help” signal is transmitted wirelessly to the nearest nurse unit.
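    The flow described in paragraph [0144] can be sketched as follows. This is a minimal illustration only; the names (`classify_gesture`, `PatientMonitor`, the `eyelid_aperture` feature, and the aperture threshold) are assumptions and do not appear in the disclosure.

```python
# Toy sketch of paragraph [0144]: frames are classified into gestures, and
# an eye-opening gesture while the patient is flagged unconscious triggers
# a "call for help" signal to the nurse unit (the remote unit).

def classify_gesture(frame):
    """Illustrative classifier: maps a frame's eyelid aperture to a gesture."""
    aperture = frame.get("eyelid_aperture", 0.0)
    return "eye_open" if aperture > 0.5 else "eye_closed"

class PatientMonitor:
    def __init__(self, nurse_unit):
        self.nurse_unit = nurse_unit   # remote unit receiving alert signals
        self.state = "unconscious"     # cognitive state as last identified

    def process_frame(self, frame):
        gesture = classify_gesture(frame)
        if gesture == "eye_open" and self.state == "unconscious":
            self.state = "wakeful"                    # wakeful state indicated
            self.nurse_unit.append("call for help")   # signal to remote unit
        return gesture
```

    In this sketch the "nurse unit" is simply a list collecting alert strings; a real system would transmit the signal wirelessly, as the example in the text describes.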

    [0145] FIG. 2 shows a block diagram of the system of the present disclosure, wherein the system further includes an actuator module 108 that drives the first output unit 110. Output unit 110 may be a visual display, e.g. digital screen, or an audible device, e.g. speaker, headphones, etc.

    [0146] As a non-limiting example, a patient who is hospitalized and suspected to suffer from delirium is wearing the head unit 102 and is being monitored by the patient monitoring system 100. Once the patient blinks twice, his eye movement is captured by camera 104 and classified as a two-blink-sequence gesture by the data processing sub-system 106. Said gesture initiates an output of a digital CAM-ICU medical questionnaire via the output unit 110, which is driven by the actuator module 108. The patient responds to the CAM-ICU evaluation by performing a reactive eye movement, which is captured by the camera 104 and classified into reactive eye gestures by the data processing sub-system 106. These reactive eye gestures are taken to indicate whether the patient is indeed in a delirium state. If a delirium state is identified by the data processing sub-system 106, a delirium signal is transmitted to the patient's physician by said data processing sub-system 106.
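    The reactive-gesture questionnaire loop of paragraph [0146] can be sketched as below. The question texts and the blink-to-answer convention (one blink for "yes", two for "no") are purely illustrative assumptions; the disclosure does not specify how reactive gestures are mapped to answers.

```python
# Illustrative sketch of paragraph [0146]: a gesture starts a CAM-ICU style
# questionnaire, and the patient's reactive blink gestures are mapped to
# yes/no answers. Questions and the blink convention are assumptions.

CAM_ICU_QUESTIONS = [
    "Are there fish in the sea?",
    "Can a stone float on water?",
]

def blinks_to_answer(blink_count):
    """Assumed convention: one blink = yes, two blinks = no."""
    return "yes" if blink_count == 1 else "no"

def run_questionnaire(reactive_blinks):
    """Pair each question with the answer derived from the reactive gesture."""
    return [
        (question, blinks_to_answer(blinks))
        for question, blinks in zip(CAM_ICU_QUESTIONS, reactive_blinks)
    ]
```

    The resulting question/answer pairs would then be evaluated by the data processing sub-system against defined criteria to decide whether a delirium signal should be transmitted.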

    [0147] FIG. 3 and FIG. 4 show non-limiting exemplary elements of the system of this disclosure. The elements in FIG. 3 and FIG. 4 are assigned numerical indices that are shifted by 100 from the indices of the elements shown in FIG. 1. For example, the head unit indicated 102 in FIG. 1 is indicated by 202 in FIG. 3. Accordingly, the reader may refer to the above text for details regarding the functionality of these elements.

    [0148] FIG. 3 shows a non-limiting example of the system, including a head unit 202, a camera 204 and a non-distributed data processing sub-system 206, which is in wireless communication (e.g. Wi-Fi, Bluetooth) with a remote unit.

    [0149] FIG. 4 shows a non-limiting illustration of the system as may be worn by a potential patient.

    [0150] FIG. 5 shows a high-level system architecture according to a non-limiting exemplary embodiment. According to this exemplary embodiment, the system comprises a data processing sub-system 306 and a headset 302. The system is in remote bi-directional communication with (i) a medical staff server 308 (for example via an IoT protocol), which communicates with a remote medical staff station 310, (ii) a device settings cloud server 312, via Wi-Fi communication, and (iii) web-based additional applications 314, via Bluetooth communication. The medical staff server comprises a system database 316, an event scheduler 318 (for example, for allowing the medical staff to schedule a calendar event for a specific patient, such as a “good morning” greeting every day at 08:00 or an initiation of the CAM-ICU or another medical questionnaire every 12 hours), a server 320 for storing and retrieving data (such as the media files for the vocal menu), a World Wide Web (WWW) page 322, and a text-to-speech application programming interface (API) 324. The staff server receives data, for example voice messages, via a remote family portal 326. The family portal 326 may transmit recommendations to the device settings cloud server, which comprises a web portal 328. The device settings cloud server comprises voice banking 330 (generating original content using a synthetic voice based on a previously recorded voice), text-to-speech 332, and translation 334 APIs, as well as a user database 336.
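    The event scheduler 318 described for FIG. 5 can be sketched with a simple data model. The `ScheduledEvent` structure and its recurrence logic below are assumptions for illustration; the disclosure only gives the examples of a daily 08:00 greeting and a CAM-ICU questionnaire every 12 hours.

```python
# Minimal sketch of the event scheduler 318: medical staff register
# recurring calendar events per patient, and the scheduler reports which
# events are due at a given hour. Data model is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class ScheduledEvent:
    patient_id: str
    content: str        # e.g. a "good morning" greeting or "CAM-ICU"
    hour: int           # hour of day (0-23) at which the event first fires
    every_hours: int    # recurrence interval in hours

    def fires_at(self, hour_of_day):
        """True if the event should fire at the given hour of day."""
        return (hour_of_day - self.hour) % self.every_hours == 0

schedule = [
    ScheduledEvent("patient-7", "good morning", hour=8, every_hours=24),
    ScheduledEvent("patient-7", "CAM-ICU", hour=8, every_hours=12),
]

def due_events(hour_of_day):
    """Return the content of every scheduled event due at this hour."""
    return [e.content for e in schedule if e.fires_at(hour_of_day)]
```

    With the two example events from the text, both fire at 08:00, while only the 12-hourly CAM-ICU questionnaire fires again at 20:00.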

    [0151] The system comprises an output unit and an actuator module for driving an output selected from questionnaires, sets of audible questions and answers, orientation messages (location, date, time), music, recordings of family members, etc. The outputs may be triggered by eye gestures classified by the system. In addition, a response to said outputs may be provided by the patient via a reactive gesture classified by the data processing sub-system. A response may include answering questions. Overall, the system allows a more natural, relaxed, and controlled environment for the patient, thereby improving the quality of the hospitalization and reducing negative emotions during hospitalization, such as feeling a lack of control, anxiety, fear of not being able to communicate, and more. The system is also linked to a secured remote support cloud 338, enabling a reverse tunnel to a remote technician service.

    [0152] FIG. 6 shows an example of a possible screen display (a dashboard) in a medical staff station, which shows:

    [0153] A communication log, including answers of the patient to questions sent from the station in the form of audio (via speech).

    [0154] A communication module presenting communication options to the medical staff.

    [0155] The sleeping/awake pattern of the patient.

    [0156] A log of total alerts and of alerts which require physical intervention by the staff in the user room (user call-for-help alert, camera is dislocated, device is disconnected from the network, etc.).

    [0157] A reminder of the device location within the medical department.

    [0158] An activity log of the user and device.

    [0159] Medical assessment questionnaire results (such as CAM-ICU, pain scale, and more).

    [0160] Music, orientation, and family voice recording buttons.
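    The distinction drawn in paragraph [0156], between the total alert log and the subset of alerts requiring physical intervention in the user room, can be sketched as a simple filter. The alert category names below are taken from that paragraph; the filtering logic itself is an illustrative assumption.

```python
# Sketch of the FIG. 6 alert view: the dashboard shows the full alert log
# alongside the subset that requires physical staff intervention.

PHYSICAL_INTERVENTION_ALERTS = {
    "user call for help",
    "camera dislocated",
    "device disconnected from network",
}

def split_alerts(alert_log):
    """Return (all alerts, alerts needing physical intervention)."""
    needs_intervention = [
        alert for alert in alert_log
        if alert in PHYSICAL_INTERVENTION_ALERTS
    ]
    return alert_log, needs_intervention
```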