METHOD AND SYSTEM FOR MONITORING A PATIENT'S MEDICAL CONDITION
20220039761 · 2022-02-10
CPC classification
A61B5/7425
HUMAN NECESSITIES
A61B5/744
HUMAN NECESSITIES
G16H10/60
PHYSICS
A61B5/0205
HUMAN NECESSITIES
Abstract
A method for monitoring and visualizing a patient's medical condition, wherein a graphical representation of the patient comprising a body having at least a torso and a head, as well as particularly two legs and two arms, is displayed using a display device, wherein said displayed graphical representation comprises at least one region which is allocated to at least one or several provided (e.g. measured and/or determined) patient monitoring quantities, and wherein the appearance of the at least one region is altered in real-time when the at least one patient monitoring quantity to which said at least one region is allocated changes.
Claims
1. (canceled)
2. A method for monitoring a patient through a dynamic visualization of a patient model, the method comprising: receiving a first position input indicative of a position of a viewer relative to the patient; receiving signals associated with at least one medical parameter of the patient; generating the dynamic visualization of the patient model based on the received first position input and the received at least one medical parameter, wherein the generating comprises: rendering a first stage of the dynamic visualization, wherein the first stage of the dynamic visualization includes the patient model positioned on a display to create an impression of the patient model being looked at from a head of the patient using a first point of view angle based on the first position input and a first tilt angle based on the first position input, wherein the first point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model; and rendering a second stage of the dynamic visualization, wherein a change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on at least one algorithm, wherein the at least one algorithm is based on the received at least one medical parameter, the at least one algorithm comprising at least two of: a first patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a respiration rate threshold; a second patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a tidal volume threshold; a third patient monitoring algorithm, wherein the
change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a capnographic threshold; a fourth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a heart rate threshold; or a fifth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a blood oxygenation threshold.
3. The method of claim 2, wherein the dynamically rendered visualization of the patient model comprises at least one region of a patient.
4. The method of claim 2, wherein the first point of view angle is 45 degrees or less.
5. The method of claim 2, wherein the first tilt angle is 30 degrees or less.
6. The method of claim 2, wherein the first tilt angle is an angle enclosed by the horizontal or coronal plane of the patient model and a viewing direction.
7. The method of claim 2, wherein the method further comprises receiving a second position input different from the first position input, and wherein the second stage of the dynamic visualization includes rotating the patient model positioned on the display using the second position input to create an impression of the patient model being looked at from the head of the patient using a second point of view angle based on the second position input and a second tilt angle based on the second position input, wherein the second point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model.
8. The method of claim 2, wherein the first stage of the dynamic visualization and the second stage of the dynamic visualization are both additionally rendered using patient input including at least one of: patient gender, patient age, patient weight, and patient height.
9. A device comprising: a processor configured to generate a dynamic visualization of a patient model; and a display configured to display the dynamic visualization of the patient model, wherein the processor is configured to: receive a first position input indicative of a position of a viewer relative to a patient; receive signals associated with at least one medical parameter of the patient; generate the dynamic visualization of the patient model based on the received first position input and the received at least one medical parameter, wherein the generating comprises: render a first stage of the dynamic visualization, wherein the first stage of the dynamic visualization includes the patient model positioned on a display to create an impression of the patient model being looked at from a head of the patient using a first point of view angle based on the first position input and a first tilt angle based on the first position input, wherein the first point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model; and render a second stage of the dynamic visualization, wherein a change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on at least one algorithm, wherein the at least one algorithm is based on the received at least one medical parameter, the at least one algorithm comprising at least two of: a first patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a respiration rate threshold; a second patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical
parameter associated with a tidal volume threshold; a third patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a capnographic threshold; a fourth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a heart rate threshold; or a fifth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a blood oxygenation threshold.
10. The device of claim 9, wherein the dynamically rendered visualization of the patient model comprises at least one region of a patient.
11. The device of claim 9, wherein the first point of view angle is 45 degrees or less.
12. The device of claim 9, wherein the first tilt angle is 30 degrees or less.
13. The device of claim 9, wherein the first tilt angle is an angle enclosed by the horizontal or coronal plane of the patient model and a viewing direction.
14. The device of claim 9, wherein the processor further receives a second position input different from the first position input, and wherein the second stage of the dynamic visualization includes rotating the patient model positioned on the display using the second position input to create an impression of the patient model being looked at from the head of the patient using a second point of view angle based on the second position input and a second tilt angle based on the second position input, wherein the second point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model.
15. The device of claim 9, wherein the first stage of the dynamic visualization and the second stage of the dynamic visualization are both additionally rendered using patient input including at least one of: patient gender, patient age, patient weight, and patient height.
16. A non-transitory computer-readable medium comprising computer code that, when executed, causes a processor to: receive a first position input indicative of a position of a viewer relative to a patient; receive signals associated with at least one medical parameter of the patient; generate a dynamic visualization of a patient model based on the received first position input and the received at least one medical parameter, wherein generating the dynamic visualization comprises: rendering a first stage of the dynamic visualization, wherein the first stage of the dynamic visualization includes the patient model positioned on a display to create an impression of the patient model being looked at from a head of the patient using a first point of view angle based on the first position input and a first tilt angle based on the first position input, wherein the first point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model; and rendering a second stage of the dynamic visualization, wherein a change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on at least one algorithm, wherein the at least one algorithm is based on the received at least one medical parameter, the at least one algorithm comprising at least two of: a first patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a respiration rate threshold; a second patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a tidal volume threshold; a third patient monitoring
algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a capnographic threshold; a fourth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a heart rate threshold; or a fifth patient monitoring algorithm, wherein the change between the first stage of the dynamic visualization and the second stage of the dynamic visualization is based on a comparison of the received at least one medical parameter associated with a blood oxygenation threshold.
17. The non-transitory computer-readable medium of claim 16, wherein the dynamically rendered visualization of the patient model comprises at least one region of a patient.
18. The non-transitory computer-readable medium of claim 16, wherein the first point of view angle is 45 degrees or less.
19. The non-transitory computer-readable medium of claim 16, wherein the first tilt angle is 30 degrees or less.
20. The non-transitory computer-readable medium of claim 16, wherein the first tilt angle is an angle enclosed by the horizontal or coronal plane of the patient model and a viewing direction.
21. The non-transitory computer-readable medium of claim 16, wherein the processor further receives a second position input different from the first position input, and wherein the second stage of the dynamic visualization includes rotating the patient model positioned on the display using the second position input to create an impression of the patient model being looked at from the head of the patient using a second point of view angle based on the second position input and a second tilt angle based on the second position input, wherein the second point of view angle is an angle enclosed by a projection of a viewing direction on a horizontal or coronal plane of the patient model and a longitudinal axis of the patient model.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0153] A system and a method are disclosed herein. The system and method synthesize multiple streams of raw patient monitoring data into a single display device 20 (or instrument), showing a synthetic model of the monitored patient P, which is generated according to algorithms and rendered dynamically, particularly in real-time, by a graphics processor 21 (cf.
[0155] The dynamic alterations of the states of the parts or regions 1 to 9 of the homunculus 10 include their presence or absence, as well as changes in volume, area, length and color of parts/regions 1 to 9. The homunculus 10 may be looked at from all angles according to user input. An embodiment places the point of view at an angle A of 45 degrees from the head 1b of the homunculus 10 with an angle A′ of 45 degrees of tilt. Various angles A, A′ of view of the homunculus 10 are shown in
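The geometric definitions used for the point of view angle A and the tilt angle A′ (as also recited in the claims) can be sketched numerically. The following is an illustrative sketch only, assuming a Y-up longitudinal axis and a Z-axis normal to the coronal plane; none of these coordinate conventions or names come from the disclosure.

```python
import math

def view_angles(view_dir, longitudinal=(0.0, 1.0, 0.0), coronal_normal=(0.0, 0.0, 1.0)):
    """Return (point_of_view_angle, tilt_angle) in degrees.

    The point of view angle is enclosed by the projection of the viewing
    direction onto the coronal plane and the longitudinal axis of the model;
    the tilt angle is enclosed by the coronal plane and the viewing direction.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    # Project the viewing direction onto the coronal plane by removing the
    # component along the plane normal.
    n_len = norm(coronal_normal)
    n = tuple(c / n_len for c in coronal_normal)
    along_normal = dot(view_dir, n)
    proj = tuple(v - along_normal * c for v, c in zip(view_dir, n))

    pov = math.degrees(math.acos(
        max(-1.0, min(1.0, dot(proj, longitudinal) / (norm(proj) * norm(longitudinal))))))
    tilt = math.degrees(math.asin(
        max(-1.0, min(1.0, abs(along_normal) / norm(view_dir)))))
    return pov, tilt
```

For example, a viewing direction of (1, 1, 0) lies in the coronal plane at 45 degrees to the longitudinal axis, giving a 45-degree point of view angle and zero tilt.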
[0156] An embodiment of the disclosed method creates a dynamic rendering of a synthetic patient model, representing the condition of the actual monitored patient P according to the raw input data. The method uses alterations in the states (or attributes) of specified parts or regions 1 to 9 of a displayed homunculus 10, i.e., the representation of a human being (i.e.,
[0157] The present disclosure addresses the problem of presenting an appropriate synthesis of patient monitoring quantities to a health care provider or another user of a monitoring device (e.g., personal consumer using an e-health app). The disclosed system and method present all monitoring information in a single, easy to understand instrument, which is dynamically rendered and shown on a display device 20 (i.e.,
[0158] According to
[0159] The disclosed system and method are particularly suited to be used integrated with a conventional patient monitoring device (a single screen showing both the visual patient instrument and conventional monitoring data) and displayed together with the raw monitoring data, e.g. the image B shown in
[0160] The present disclosure provides for the creation of a two- or three-dimensional instrument from the synthesis of raw monitoring data. According to the raw monitoring data, the disclosed system and method create a homunculus 10, which is a synthetic representation of the condition of the actual monitored patient. A graphics processor 21 dynamically renders the image.
[0161] An overview of the system for carrying out the disclosed method is shown schematically in
[0162] The process of rendering an instrument from the raw patient monitoring data takes place in two broad steps, which are detailed hereafter and outlined in
[0163] In step 1 the raw input data (i.e., “DATA STREAM” in
[0164] In step 2, the patient monitoring quantities (data points) are transformed by the e.g. general purpose computer 22 according to the algorithms 25 of the present disclosure (e.g.
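The two broad steps above can be sketched as follows. This is a minimal illustration assuming a simple "name=value" record format for the raw stream and a hypothetical handler table; neither detail is specified in the disclosure.

```python
def parse_stream(raw_records):
    """Step 1: extract (parameter, value) data points from a raw input stream."""
    points = {}
    for record in raw_records:
        name, _, value = record.partition("=")
        points[name.strip()] = float(value)
    return points

def apply_algorithms(points, handlers):
    """Step 2: transform each data point into a region state via its algorithm."""
    states = {}
    for name, value in points.items():
        if name in handlers:
            region, transform = handlers[name]
            states[region] = transform(value)
    return states

# Hypothetical handler table: parameter -> (homunculus region, transform).
handlers = {
    "HR": ("heart", lambda bpm: {"pulse_hz": bpm / 60.0}),
    "SpO2": ("body", lambda sat: {"saturation": sat / 100.0}),
}

states = apply_algorithms(parse_stream(["HR=72", "SpO2=98"]), handlers)
```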
[0165] While one process of selecting one or more specific parts or regions 1 to 9 of the homunculus 10 for each data point has been disclosed, others are possible and other methods of assigning data points to parts or regions of the homunculus 10 will occur to those skilled in the art and may be used in applications without varying from the spirit of this disclosure.
[0166] According to an embodiment of the disclosure, the rendering of the e.g. real-time instrument, showing the synthetic model of the patient P takes place following the subroutines outlined hereafter (i.e., subroutines A and B).
[0167] While subroutine A (cf.
[0168] Subroutine A starts by computing an individual homunculus model 10 based on the monitored patient's medical profile (i.e., age, weight, gender, height, pediatric, medical conditions, e.g. obesity) by altering a default homunculus model, creating a “customized patient avatar”. For example, if a patient is a woman, a homunculus having a body 1 representing a woman is displayed. The outcome (depicted as “INDIVIDUAL HOMUNCULUS MODEL” in
[0169] After the individual homunculus model 10 has been established, the next step is to handle the incoming patient monitoring data. In case patient data arrives in the form of a data stream (“RAW PATIENT MONITORING DATA STREAM” in
[0170] Next, the computer 22 takes the available individual data points (i.e., the patient's vital parameters) and supplies them to the specific algorithms. Subsequently, these algorithms compute the state of each part or regions 1 to 9 of the homunculus model 10 (“COMPUTE INDIVIDUAL HOMUNCULUS MODEL ANIMATIONS” in
[0171] Furthermore, when new patient monitoring data becomes available, the process starting from “EXTRACT DATA POINTS MATCHING MONITORED PATIENT PARAMETERS” in
[0173] Once the two- or three-dimensional representation of the homunculus 10 has been established, it is stored in memory 23 (“STORE HOMUNCULUS GRAPHICAL MODEL REPRESENTATION IN MEMORY”) for later rendering by the graphics processor 21. In order to accommodate existing hardware limitations as imposed by the graphics processor 21, data stream, general purpose computer 22, or other involved components, the next step is to wait for a specific interval to elapse (“WAIT FOR INTERVAL TO ELAPSE” in
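The store-render-wait cycle described above can be sketched as a throttled loop. This is a minimal illustration in which the interval length, the callback names, and the fixed frame count are assumptions, not values from the disclosure.

```python
import time

def render_loop(compute_model, render, interval_s=1.0 / 30.0, frames=3):
    """Throttled render cycle: compute and store the homunculus model, render
    it, then wait for the remainder of the interval before recomputing."""
    stored = None
    for _ in range(frames):
        start = time.monotonic()
        stored = compute_model()   # "STORE HOMUNCULUS GRAPHICAL MODEL ... IN MEMORY"
        render(stored)             # rendering by the graphics processor
        elapsed = time.monotonic() - start
        if elapsed < interval_s:
            time.sleep(interval_s - elapsed)  # "WAIT FOR INTERVAL TO ELAPSE"
    return stored
```

Capping the recompute rate in this way accommodates hardware limits of the graphics processor or data stream without blocking incoming data.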
[0174] The instructions for carrying out the present disclosure may be stored in any recordable medium such as a hard drive, magnetically recordable tape, or as a compact disk. They may be stored in the memory 23 (i.e., MEMORY in
[0175] Thus, the memory 23 may comprise, for example, random access memory (RAM), read-only memory (ROM), hard disk drives, solid state drives and/or other memory components, or a combination of any two or more of these memory components. The RAM may comprise, for example, static random access memory (SRAM), dynamic random access memory (DRAM), or magnetic random access memory (MRAM), non-volatile random-access memory (NVRAM), and other forms of memory. The ROM may comprise, for example, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), compact flash memory, or other like memory device.
[0176] In the following, examples of the individual algorithms are described.
[0177] Table 1a outlines which regions of the homunculus are affected by which data points and which attributes are altered according to algorithms #1-12.
TABLE-US-00001
Region or part of homunculus | Affected by data point | Changed attributes | Name of algorithm
1 (body) | Blood pressure (BP) | Volume or area of parts/regions (3D) or area (2D) | Visual patient monitoring algorithm #1
1 (body) | Pulse rate from SpO2 sensor (PR) | Frequency of volume or area of parts/regions (3D) or area (2D) change | Visual patient monitoring algorithm #1
1 (body) | Oxygen saturation (SpO2) | Color change | Visual patient monitoring algorithm #2
1 (body) | Body temperature (Temp) | Amount of temperature indicators | Visual patient monitoring algorithm #3
2 (visual heart) | Blood pressure (BP) | Volume/area | Visual patient monitoring algorithm #4
2 (visual heart) | ECG QRS heart rate (HR) | Frequency of volume or area of parts/regions (3D) or area (2D) change | Visual patient monitoring algorithm #4
2 (visual heart) | ECG ST-segment deflection (ST) | Color change of visual heart and reduction of pulsation dynamics in affected area | Visual patient monitoring algorithm #5
2 (visual heart) | Right ventricular pressure (RVP) | Volume/area | Visual patient monitoring algorithm #6
2 (visual heart) | ECG rhythm detection system (ECG) | Electrical conduction path form displayed (visual electrical heart activity) | Visual patient monitoring algorithm #7
2 (visual heart) | Pulmonary capillary wedge pressure or wedge pressure (PCWP) | Volume/area | Visual patient monitoring algorithm #8
2 (visual heart) | Mixed venous oxygen saturation (MVOS) | Color change | Visual patient monitoring algorithm #9
2 (visual heart) | Cardiac output (CO)/cardiac index (CI) | Amount of cardiac output/cardiac index indicators displayed | Visual patient monitoring algorithm #10
3 (visual arterial system) | Blood pressure (BP) | Volume/area | Visual patient monitoring algorithm #11
4 (visual vena cava) | Central venous pressure (CVP) | Volume/area | Visual patient monitoring algorithm #12
[0178] Table 1b outlines which parts of the homunculus are affected by which data points and which attributes are altered according to algorithms #13-19.
TABLE-US-00002
Part or region within homunculus | Affected by data point | Changed attributes | Name of algorithm
5 (visual respiration, visual lungs) | Respiratory rate (RR) | Frequency of volume or area of parts/regions (3D) or area (2D) change | Visual patient monitoring algorithm #13
5 (visual respiration, visual lungs) | Tidal volume (TV) | Volume/area | Visual patient monitoring algorithm #13
6 (visual CO2 & oxygen cloud) | Respiratory rate (RR) | Frequency of volume or area of parts/regions (3D) or area (2D) change | Visual patient monitoring algorithm #14
6 (visual CO2 & oxygen cloud) | Expiratory carbon dioxide measurement (eCO2) | Volume/area | Visual patient monitoring algorithm #14
6 (visual CO2 & oxygen cloud) | Expiratory oxygen measurement (eO2) | Volume/area | Visual patient monitoring algorithm #15
7 (eye(s) or visual brain activity) | Brain activity from e.g. Bispectral Index System (BIS), electroencephalogram (EEG) | Degree of openness of eyes; amount of brain activity indicators; facial expression of the homunculus | Visual patient monitoring algorithm #16
8 (visual brain) | Intracranial pressure (ICP) | Appearance; amount of brain gyri and sulci, thickness of brain wall | Visual patient monitoring algorithm #17
8 (visual brain) | Brain tissue oxygen tension (BO) | Color change | Visual patient monitoring algorithm #18
9 (hand(s) or visual neuromuscular transmission) | Neuromuscular transmission measurement system (NMT) | Degree of relaxation of hand muscles; relative position of selected parts/regions of body 1 of the homunculus 10 | Visual patient monitoring algorithm #19
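Tables 1a and 1b are, in effect, a lookup from data point to affected region, changed attribute, and algorithm number. A condensed, illustrative encoding covering only a subset of rows:

```python
# Condensed sketch of Tables 1a/1b: parameter -> (region, changed attribute,
# algorithm number). Only a subset of the rows is encoded here.
MONITORING_TABLE = {
    "BP":   ("1 (body)",         "volume/area",                  1),
    "SpO2": ("1 (body)",         "color change",                 2),
    "Temp": ("1 (body)",         "temperature indicators",       3),
    "HR":   ("2 (visual heart)", "volume/area change frequency", 4),
    "ST":   ("2 (visual heart)", "color change",                 5),
    "RR":   ("5 (visual lungs)", "volume/area change frequency", 13),
    "ICP":  ("8 (visual brain)", "appearance",                   17),
}

def lookup(parameter):
    """Return (region, changed attribute, algorithm number), or None."""
    return MONITORING_TABLE.get(parameter)
```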
Example 1: Visual Patient Monitoring Algorithm #1 (Pulse Rate and Blood Pressure)
[0179] This algorithm is used to make region 1 (body) as shown in
[0180] Changes in input blood pressure BP lead to a change in volume or area of parts (3D) or area (2D) of region (body) 1 of the homunculus 10 particularly following an ease in ease out function (these functions are herein also denoted as smoothing functions, wherein
[0181] The arterial pressure curve form according to which the volume or area of parts (3D) or area (2D) change of region (body) 1 is achieved, is stored in memory 23. Several curve forms may be stored in memory 23 and displayed for different blood pressure amplitudes.
[0182] The change in volume or area of parts (3D) or area (2D) of region (body) 1 behaves according to the actual arterial pressure curve input from the patient P, when this input is available. A schematic overview of this algorithm is given in
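The ease in/out (smoothing) behaviour of algorithm #1, in which extreme pressures produce less visual change per unit than mid-range pressures, can be sketched with a smoothstep curve. The blood pressure range and scale limits below are assumptions, not values from the disclosure.

```python
def ease_in_out(x):
    """Smoothstep: flat near 0 and 1, steepest in the middle, so extreme
    inputs change the output less than mid-range inputs."""
    x = max(0.0, min(1.0, x))
    return x * x * (3.0 - 2.0 * x)

def body_scale(mean_bp, bp_min=40.0, bp_max=160.0, scale_min=0.9, scale_max=1.1):
    """Map mean blood pressure to a displayed volume/area scale factor for
    region 1 (body). The BP range and scale limits are illustrative."""
    t = (mean_bp - bp_min) / (bp_max - bp_min)
    return scale_min + (scale_max - scale_min) * ease_in_out(t)
```

Because the curve is steepest around the midpoint, a 5 mmHg change near a mean pressure of 100 moves the displayed scale more than the same change near the extremes, which is the stated intent of the smoothing function.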
Example 2: Visual Patient Monitoring Algorithm #2 (Visual Oxygen Saturation)
[0183] This algorithm allows region 1, i.e. the body 1 of the homunculus 10, to change its skin color in an intuitive way according to the oxygen saturation of the patient P. Here, the patient monitoring quantity (input) is the oxygen saturation e.g. from a SpO2 Sensor (SpO2).
[0184] Changes in the oxygen saturation lead to a change in color of body 1 of the homunculus 10. At 100% oxygen saturation, body 1 of the homunculus 10 has a normal skin color tone (e.g., HEX color #F8EFDA), representing the look of healthy skin at normal oxygen levels. As oxygen saturation decreases, body 1 gradually becomes light blue (e.g., HEX color #84B0E8) to dark blue (e.g., HEX color #0E3996) and finally purple (e.g., HEX color #723C7F) and grey (e.g., HEX color #DEDEDE) representing various degrees of hypoxia. Assigning other, different colors to the saturation data points will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
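The color progression of algorithm #2 can be sketched as piecewise-linear interpolation through the hex colors named above. The saturation breakpoints assigned to each color stop are assumptions; only the colors themselves come from the text.

```python
def _hex_to_rgb(h):
    return tuple(int(h[i:i + 2], 16) for i in (1, 3, 5))

def _lerp(a, b, t):
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

# Color stops from the text; the saturation breakpoints are assumed values.
_STOPS = [
    (100.0, _hex_to_rgb("#F8EFDA")),  # normal skin tone
    (90.0,  _hex_to_rgb("#84B0E8")),  # light blue
    (80.0,  _hex_to_rgb("#0E3996")),  # dark blue
    (70.0,  _hex_to_rgb("#723C7F")),  # purple
    (60.0,  _hex_to_rgb("#DEDEDE")),  # grey
]

def skin_color(spo2):
    """Interpolate the body color (RGB tuple) for an oxygen saturation in %."""
    if spo2 >= _STOPS[0][0]:
        return _STOPS[0][1]
    for (hi_s, hi_c), (lo_s, lo_c) in zip(_STOPS, _STOPS[1:]):
        if spo2 >= lo_s:
            t = (hi_s - spo2) / (hi_s - lo_s)
            return _lerp(hi_c, lo_c, t)
    return _STOPS[-1][1]
```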
[0185] A schematic overview of this algorithm is given in
Example 3: Visual Patient Monitoring Algorithm #3 (Visual Patient Temperature)
[0186] This algorithm allows the patient's temperature to be indicated in an intuitive way in body 1 (
[0187] According to the temperature data input, temperature indicators 11 appear on body 1 of the homunculus 10. The amount of temperature indicators 11 presented allows the user to understand the temperature of the patient P intuitively. Temperature indicators 11 for temperatures lower than normal may be icicles or snowflakes; temperature indicators 11 for temperatures higher than normal may be sweat pearls or heat waves rising from body 1 of the homunculus 10. The amount of temperature indicators 11 shown preferably follows an ease in/out function. This enables users to better detect low and high temperature extremes. Assigning different designs of temperature indicators 11 than the ones described here as an embodiment of the disclosure, e.g. the appearance of sweat pearls or flames to indicate high temperature or a layer of snowflakes to indicate low temperature, will occur to those skilled in the art and may be used without varying from the spirit of this disclosure. A schematic overview of this algorithm is given in
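A minimal sketch of this indicator logic, assuming a normal-temperature band, a simple one-indicator-per-degree rule (a linear stand-in for the ease in/out function named above), and a cap on the indicator count. All numeric limits are assumptions, not values from the disclosure.

```python
def temperature_indicators(temp_c, normal=37.0, band=0.5, max_indicators=5):
    """Return (kind, count) of temperature indicators for region 1 (body).

    Below the normal band, icicle/snowflake indicators appear; above it,
    sweat-pearl/heat-wave indicators. All numeric limits are assumptions.
    """
    if temp_c < normal - band:
        deviation = (normal - band) - temp_c
        kind = "icicles"
    elif temp_c > normal + band:
        deviation = temp_c - (normal + band)
        kind = "sweat pearls"
    else:
        return ("none", 0)
    count = min(max_indicators, 1 + int(deviation))  # one extra indicator per full degree C
    return (kind, count)
```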
Example 4: Visual Patient Monitoring Algorithm #4 (Visual Heart)
[0188] This algorithm allows region 2, i.e. the heart of the homunculus 10 (
[0189] Changes in input blood pressure lead to a change in volume or area of parts (3D) or area (2D) of the heart 2 particularly following an ease in/out function. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of heart 2 when compared to changes in medium pressure ranges. This function will enable users to better detect low and high-pressure extremes. Region (heart) 2 of the homunculus 10 alternates between the volume or area of parts (3D) or area (2D) value of diastolic blood pressure (minimum) and systolic blood pressure (maximum). The changes in volume or area of parts (3D) or area (2D) of region (heart) 2 occur with the frequency of the ECG QRS heart rate (HR), and follow an arterial pressure curve form.
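The pulsation of algorithm #4 can be sketched as an oscillation between the diastolic (minimum) and systolic (maximum) scale at the heart-rate frequency. Here a raised sinusoid stands in for the stored arterial pressure curve form, and the scale limits are assumptions.

```python
import math

def heart_scale(t_s, hr_bpm, diastolic_scale=0.95, systolic_scale=1.05):
    """Scale factor of the visual heart at time t_s (seconds).

    Oscillates between the diastolic and systolic value at the ECG QRS
    heart-rate frequency; a raised sinusoid stands in for the stored
    arterial pressure curve form, and the scale limits are assumptions.
    """
    phase = (t_s * hr_bpm / 60.0) % 1.0                   # position within the current beat
    curve = 0.5 - 0.5 * math.cos(2.0 * math.pi * phase)   # 0 -> 1 -> 0 over one beat
    return diastolic_scale + (systolic_scale - diastolic_scale) * curve
```

When the actual arterial pressure curve of the patient is available, that sampled waveform would replace the sinusoid stand-in.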
[0190] The arterial pressure curve form according to which the volume or area of parts (3D) or area (2D) change of region (heart) 2 is achieved, is stored in memory 23. A schematic overview of this algorithm is given in
Example 5: Visual Patient Monitoring Algorithm #5 (Visual ST-Segment Deflection)
[0191] Inputs: ECG QRS heart rate HR and Blood pressure BP, ST-Segments of leads I, II, III, aVF, aVR, aVL, V1, V2, V3, V4, V5, V6.
[0192] This algorithm allows for specific sections of the heart 2 (
[0193] ST-segment deflections lead to a change in color of specific sections of heart 2 of the homunculus 10. At zero ST-segment deflection the sections of heart 2 of the homunculus 10 have a normal, default red color tone (e.g., HEX color #FF5555), representing the look of a healthy heart muscle at normal oxygen levels. As ST-segment deflection increases in one or more ECG leads, the sections of heart 2 of the homunculus 10 allocated to the ECG lead gradually become darker, e.g., from HEX color #C90000 to HEX color #8B0000 and finally purple (e.g., HEX color #723C7F) and grey (e.g., HEX color #DEDEDE). Assigning other, different colors to the ST-segment deflection data points will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
[0194] For example, an ST-segment deflection in leads representing the septal part of the heart 2 (i.e., leads V1, V2) will cause color changes and reduction of area/volume change (i.e., less dynamic movement) of the septal part of heart 2 of the homunculus 10. Likewise, an ST-segment deflection in leads representing the inferior part of the heart 2 (i.e., leads II, III, aVF) will cause color changes and less dynamic movement of the inferior part of heart 2 of the homunculus 10.
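The lead-to-section allocation can be sketched as a lookup table. The septal (V1, V2) and inferior (II, III, aVF) groups are stated above; the remaining groups follow the conventional contiguous-lead territories and are assumptions here, not allocations taken from the disclosure.

```python
# Allocation of ECG leads to sections of the visual heart 2. Septal and
# inferior follow the text; anterior and lateral are assumed groupings.
LEAD_SECTIONS = {
    "septal":   ("V1", "V2"),
    "inferior": ("II", "III", "aVF"),
    "anterior": ("V3", "V4"),
    "lateral":  ("V5", "V6", "I", "aVL"),
}

def affected_sections(deflected_leads, threshold_leads=1):
    """Return heart sections whose allocated leads show ST deflection."""
    deflected = set(deflected_leads)
    return sorted(
        section for section, leads in LEAD_SECTIONS.items()
        if len(deflected.intersection(leads)) >= threshold_leads
    )
```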
[0195] Changes in ST-segment deflection of specific ECG leads (e.g., V1, V2, V3) lead to a change in volume or area of parts (3D) or area (2D) of the sections of heart 2 of the homunculus 10 allocated to these ECG leads, particularly following an ease in/out function. The ease in/out function causes very small and very large ST-segment deflections to cause less extensive changes in volume or area of parts (3D) or area (2D) of heart 2 when compared to changes in medium ST-segment deflection ranges. This function will enable users to better detect relevant ST-segment deflections.
[0197] Examples of allocations of leads to parts of the visual heart 2 according to an embodiment are given in
Example 6: Visual Patient Monitoring Algorithm #6 (Visual Right Ventricular Pressure)
[0199] This algorithm allows for a specific section of heart 2 (e.g.
[0200] Changes in input right ventricular pressure lead to a change in volume or area of parts (3D) or area (2D) of a specific section of heart 2 of the homunculus 10, representing the right ventricle of the heart 2, particularly following an ease in/out function. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of the specified section of heart 2 when compared to changes in medium pressure ranges. This function will enable users to better detect low and high extremes of right ventricular pressure. The volume or area of parts (3D) or area (2D) of the specific part of the homunculus 10 fluctuates according to the right ventricular pressure wave, when this input is available.
Example 7: Visual Patient Monitoring Algorithm #7 (Visual Electrical Heart Activity)
[0201] This algorithm is used to enable heart 2 of the homunculus 10 (e.g.
[0202] The detection of specific heart rhythms (i.e., electrical heart activity) causes the display of electrical conduction path forms 14 associated with the detected rhythm on part 2 of the homunculus 10.
[0203] The electrical conduction path forms 14, which are displayed according to the detected heart rhythm, are stored in memory 23. Several conduction path forms 14 may be stored in memory 23 and displayed for different ECG heart rhythms.
Example 8: Visual Patient Monitoring Algorithm #8 (Visual Pulmonary Capillary Wedge Pressure or Wedge Pressure)
[0204] This algorithm allows for a specific section of heart 2 (e.g.
[0205] Changes in input pulmonary capillary wedge pressure lead to a change in volume or area of parts (3D) or area (2D) of a specific section of heart 2 of the homunculus 10, representing a section 15 of the pulmonary artery, particularly following an ease in/out function. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of the specified section of heart 2 when compared to changes in medium pressure ranges. This function will enable users to better detect low and high extremes of pulmonary capillary wedge pressure. The volume or area of parts (3D) or area (2D) of the specific part 15 of the homunculus 10 fluctuates according to the pulmonary artery pressure wave, when this input is available. A schematic overview of this algorithm is given in
Example 9: Visual Patient Monitoring Algorithm #9 (Visual Mixed Venous Oxygen Saturation)
[0206] This algorithm allows a specific section of part 2 of the homunculus (
[0207] Changes in the mixed venous oxygen saturation lead to a change in color of a specific section of heart 2 of the homunculus 10, representing the blood inside the right ventricle of the heart 2. At 80% mixed venous oxygen saturation, the specific section of heart 2 of the homunculus 10 has an intuitive light blue color (e.g., HEX color #84B0E8) representing high mixed venous oxygen content. As oxygen content decreases, the color gradually becomes darker (e.g., HEX color #0E3996) and eventually turns purple (e.g., HEX color #723C7F) and grey (e.g., HEX color #DEDEDE), representing various degrees of mixed venous hypoxia. Assigning other, different colors to the mixed venous saturation data points will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
[0208] The changes in color from light blue to dark blue and eventually grey particularly follow an ease in/out function. This function will enable users better to detect low and high mixed venous oxygen saturation extremes. A schematic overview of this algorithm is given in
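The progression from light blue through dark blue, purple and grey could be implemented as piecewise-linear interpolation between the stated HEX stops. The saturation breakpoints below 80% are assumptions; the disclosure fixes only the colors and the 80% anchor.

```python
def hex_to_rgb(h: str) -> tuple:
    return tuple(int(h[i:i + 2], 16) for i in (1, 3, 5))

def rgb_to_hex(rgb: tuple) -> str:
    return "#%02X%02X%02X" % rgb

# Color stops: (saturation %, color). Breakpoints < 80% are assumptions.
STOPS = [(80, "#84B0E8"), (60, "#0E3996"), (40, "#723C7F"), (20, "#DEDEDE")]

def mvos_color(sat: float) -> str:
    """Map mixed venous oxygen saturation (%) to a display color by
    interpolating linearly between adjacent stops."""
    if sat >= STOPS[0][0]:
        return STOPS[0][1]
    if sat <= STOPS[-1][0]:
        return STOPS[-1][1]
    for (hi_s, hi_c), (lo_s, lo_c) in zip(STOPS, STOPS[1:]):
        if lo_s <= sat <= hi_s:
            t = (sat - lo_s) / (hi_s - lo_s)
            a, b = hex_to_rgb(lo_c), hex_to_rgb(hi_c)
            return rgb_to_hex(tuple(round(a[i] + t * (b[i] - a[i]))
                                    for i in range(3)))
```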
Example 10: Visual Patient Monitoring Algorithm #10 (Visual Cardiac Output, Visual Cardiac Index)
[0209] This algorithm allows the patient's cardiac output, or cardiac index, which is the cardiac output related to body surface area, to be indicated in an intuitive way in heart 2 (e.g.
[0210] According to the cardiac output or cardiac index input, cardiac output indicators 18 appear on a specific section 17 of heart 2 of the homunculus 10. The amount of cardiac output indicators 18 presented allows the user to understand the cardiac output of the patient P intuitively. Cardiac output or cardiac index indicators 18 depict erythrocytes (red blood cells) being ejected from the left ventricle of the heart 2 into the aorta. The amount of cardiac output indicators 18 shown particularly follows an ease in/out function. This enables users to detect low and high cardiac output or cardiac index extremes better. Assigning different designs of cardiac output or cardiac index indicators 18, other than the red blood cells design described here as an embodiment of the disclosure, will occur to those skilled in the art and may be used without varying from the spirit of this disclosure. A schematic overview of this algorithm is given in
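A sketch of mapping the cardiac output input to a number of red-blood-cell indicators 18. The 1-8 L/min range and the cap of ten indicators are illustrative assumptions; only the eased mapping itself is taken from the description.

```python
def indicator_count(cardiac_output_l_min: float,
                    co_min: float = 1.0, co_max: float = 8.0,
                    max_indicators: int = 10) -> int:
    """Number of erythrocyte indicators to display for a given cardiac
    output, eased so mid-range changes are the most visible."""
    t = min(max((cardiac_output_l_min - co_min) / (co_max - co_min), 0.0), 1.0)
    eased = t * t * (3.0 - 2.0 * t)  # ease in/out
    return round(eased * max_indicators)
```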
Example 11: Visual Patient Monitoring Algorithm #11 (Visual Arterial System)
[0211] This algorithm is used to enable region 3 in form of visual arteries (or artery system) of the homunculus 10 (e.g.
[0212] Changes in input blood pressure lead to a change in volume or area of parts (3D) or area (2D) of region 3 of the homunculus 10, particularly following an ease in/out function. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of region 3 when compared to changes in medium pressure ranges. This function will enable users to better detect low and high pressure extremes. Region 3 of the homunculus 10 alternates between the volume or area of parts (3D) or area (2D) value of diastolic blood pressure (minimum) and systolic blood pressure (maximum). The changes in volume or area of parts (3D) or area (2D) of region 3 occur with the frequency of the pulse rate (PR), derived from e.g. the SpO2 sensor or e.g. an invasive blood pressure wave, and follow an arterial pressure curve form.
[0213] The arterial pressure curve form according to which the volume or area of parts (3D) or area (2D) change of part 3 is achieved, is stored in memory 23. Several curve forms may be stored in memory 23 and displayed for different blood pressure amplitudes.
[0214] The change in volume or area of parts (3D) or area (2D) of region 3 of the homunculus 10 behaves according to the actual arterial pressure curve input from the patient P, when this input is available.
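When no measured arterial curve is available, the alternation of region 3 between diastolic and systolic size at the pulse-rate frequency can be sketched with a generic waveform. The raised cosine below stands in for the stored arterial pressure curve form (the disclosure keeps the actual curve forms in memory 23).

```python
import math

def artery_area(t_seconds: float, pulse_rate_bpm: float,
                diastolic_area: float, systolic_area: float) -> float:
    """Displayed area of artery region 3 at time t, oscillating between
    the diastolic (minimum) and systolic (maximum) values at the pulse
    rate. A raised cosine approximates the stored curve form."""
    phase = (t_seconds * pulse_rate_bpm / 60.0) % 1.0
    w = 0.5 - 0.5 * math.cos(2.0 * math.pi * phase)  # 0 = diastole, 1 = systole
    return diastolic_area + (systolic_area - diastolic_area) * w
```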
Example 12: Visual Patient Monitoring Algorithm #12 (Visual Vena Cava)
[0215] This algorithm allows region 4 (
[0216] Changes in input central venous pressure lead to a change in volume or area of parts (3D) or area (2D) of the vena cava 4, particularly following an ease in/out function. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of region 4 when compared to changes in medium pressure ranges. This function will enable users better to detect low and high extremes of central venous pressure. The volume or area of parts (3D) or area (2D) of region (vena cava) 4 of the homunculus 10 fluctuates according to the CVP pressure wave, when this input is available. A schematic overview of this algorithm is given in
Example 13: Visual Patient Monitoring Algorithm #13 (Visual Respiration)
[0217] This algorithm allows regions 5 (e.g.
[0218] Changes in input tidal volume TV lead to a change in volume or area of parts (3D) or area (2D) of both regions 5, particularly following an ease in/out function. The ease in/out function causes very low and very high tidal volumes to cause less extensive changes in volume or area of parts (3D) or area (2D) of regions (lungs) 5 when compared to changes in medium tidal volume ranges. This function will enable users better to detect low and high tidal volume extremes. Regions (lungs) 5 of the homunculus 10 alternate between a default volume or area of parts (3D) or area (2D) value (minimum) and the value according to the tidal volume of the patient P (maximum). The changes in volume or area of parts (3D) or area (2D) of both lungs in part 5 occur with the frequency of the respiratory rate RR, and preferably follow a volume-time curve. The volume-time curve according to which the volume or area of parts (3D) or area (2D) change of both lungs 5 is achieved, is stored in memory 23. Several curve forms may be stored in memory 23 and displayed for different respiratory rates. The change in volume or area of parts (3D) or area (2D) of both lungs 5 behaves according to the actual volume or area of parts (3D) or area (2D) curve input from the patient P, when this input is available.
Example 14: Visual Patient Monitoring Algorithm #14 (Visual CO2 Cloud)
[0219] This algorithm allows a region 6 in the form of a CO2 cloud of the homunculus 10 (e.g.
[0220] Changes in the input expiratory carbon dioxide (end-tidal CO2) measurement (eCO2) lead to a change in volume or area of parts (3D) or area (2D) of region 6, particularly following an ease in/out function. An intuitive color to represent carbon dioxide for region 6 of the homunculus 10 representing the CO2 cloud can be for example HEX color #F9FB04. The ease in/out function causes very low and very high end-expiratory CO2 values to cause less extensive changes in volume or area of parts (3D) or area (2D) of region 6 when compared to changes in medium end-tidal CO2 measurements. This function will enable users to better detect low and high end-tidal CO2 extremes. Region 6 of the homunculus 10 alternates between nonexistent (minimum) and the value according to the end-expiratory CO2 measurement of the patient P (maximum). The changes in volume or area of parts (3D) or area (2D) of region 6 occur with the frequency of the respiratory rate RR, and particularly follow an expiratory CO2 curve. The end-expiratory CO2 curve according to which the volume or area of parts (3D) or area (2D) change of region 6 is achieved, is stored in a memory. The change in volume or area of parts (3D) or area (2D) of region 6 behaves according to the actual end-expiratory CO2 curve input (capnography) from the patient P, when this input is available. Assigning other indicators different in design to the ones described here will occur to those skilled in the art and may be used without varying from the spirit of this disclosure, e.g., using the display of fewer or more expired gas bubbles to indicate the end-expiratory CO2 value.
Example 15: Visual Patient Monitoring Algorithm #15 (Visual Oxygen Cloud)
[0221] This algorithm allows the fraction of expired oxygen (FeO2) to be indicated in an intuitive way in region 6 of the homunculus 10 (e.g.
[0222] Changes in the input expiratory oxygen (FeO2) lead to a change in area/volume of a colored section 19 in region 6 of the homunculus 10 (visual CO2 & oxygen cloud), particularly following an ease in/out function. The area [2D]/volume [3D] change will be relative to the dynamic area [2D]/volume [3D] changes of region 6 of the homunculus 10 occurring according to the visual patient monitoring algorithm #13 (visual respiration). An intuitive color to represent oxygen for the section 19 of region 6 representing expired oxygen can be for example HEX color #84B0E8. The ease in/out function causes very low and very high expiratory oxygen values to cause less extensive changes in volume or area of parts (3D) or area (2D) of the colored section 19 in region (e.g. cloud) 6 of the homunculus 10 when compared to changes in medium expiratory oxygen measurements. This function will enable users to better detect low and high FeO2 extremes. The colored section 19 in region 6 of the homunculus 10 alternates between nonexistent (minimum, FeO2 0%) and the value according to the expiratory oxygen measurement of the patient (maximum, FeO2 100%). Assigning other indicators different in design to the ones described here as an embodiment of the disclosure will occur to those skilled in the art and may be used without varying from the spirit of this disclosure, e.g., using the display of more or fewer expired oxygen gas bubbles to indicate the value of the FeO2. A schematic overview of this algorithm is given in
Example 16: Visual Patient Monitoring Algorithm #16 (Visual Brain Activity)
[0223] This algorithm allows both eyes of part 7 (e.g.
[0224] According to the value of the Bispectral index (BIS) or electroencephalography (EEG) data input, the state of both eyes 7 changes from closed 7b to open 7a. The degree of openness will allow the user to intuitively understand the depth of anesthesia of the patient P. Brain activity input indicative of deep anesthesia leads to eyes with eyelids closed (7b); values representing high brain activity, indicating shallow anesthesia depth, cause partially open (7c) or fully open (7a) eyes 7.
[0225] Also, according to the BIS or EEG data input, brain activity indicators appear around the section of region 1 of the homunculus 10 representing the head 1b of the visual patient monitoring. The amount of brain activity indicators presented allows the user to intuitively understand the brain activity of the patient (which is an indicator of anesthesia depth). Brain activity indicators for brain activity levels may be represented by dynamic waveform patterns, circles and stars rotating around, and moving outwards and inwards in random movement from, region 8 of the homunculus 10, representing the brain 8 located in the head 1b of the homunculus 10.
[0226] They may be complemented by periodic nodding of the region 1b of body 1 representing the head, as well as yawning represented as opening of the mouth or grimacing of the homunculus appearing in prespecified ranges of brain activity. Also, the homunculus 10 may gradually be lowered into a pool of liquid or clouds to represent anesthesia, a graphical example of which is depicted in
[0227] The degree of openness of the eyes 7 and the amount of brain activity indicators displayed, particularly follow an ease in/out function. This enables users better to detect low and high brain activity value extremes.
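The eased mapping from brain activity input to eyelid openness can be sketched as follows. The BIS thresholds (40 for deep anesthesia, 90 for awake) are illustrative assumptions; the disclosure only specifies the closed/partially open/open states and the ease in/out behavior.

```python
def eye_openness(bis: float) -> float:
    """Map a BIS value (0-100) to eyelid openness: 0.0 = closed (7b),
    1.0 = fully open (7a), intermediate = partially open (7c).
    The thresholds below are assumptions."""
    deep, awake = 40.0, 90.0
    t = min(max((bis - deep) / (awake - deep), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)  # ease in/out
```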
[0228] Assigning other designs to indicate anesthesia depth than those described here as an embodiment of the disclosure will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
[0229]
Example 17: Visual Patient Monitoring Algorithm #17 (Visual Intracranial Pressure)
[0230] This algorithm allows region 8 (
[0231] Changes in input intracranial pressure lead to a change in volume or area of parts (3D) or area (2D) of brain 8, particularly following an ease in/out function. Also, the appearance of region 8 will gradually change according to an ease in/out function from an appearance where the brain 8 appears relaxed to an appearance where the brain 8 appears compressed and tense, representing high intracranial pressure. The ease in/out function causes very low and very high pressures to cause less extensive changes in volume or area of parts (3D) or area (2D) of region 8 when compared to changes in medium pressure ranges. This function will enable users better to detect low and high extremes of intracranial pressure. The volume or area of parts (3D) or area (2D) of region (i.e. brain) 8 of the homunculus 10 fluctuates according to the ICP pressure wave, when this input is available. Assigning other designs to indicate intracranial pressure than those described here as an embodiment of the disclosure will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
Example 18: Visual Patient Monitoring Algorithm #18 (Visual Brain Tissue Oxygen Tension)
[0232] This algorithm allows region (i.e. brain) 8 of the homunculus (e.g.
[0233] Changes in brain tissue oxygen tension lead to a change in color of brain 8 of the homunculus 10. At 100% oxygen saturation, brain 8 of the homunculus 10 has a normal white color, representing well-oxygenated brain tissue. As oxygen content decreases, the color gradually turns from a light blue color (e.g., HEX color #84B0E8) to a darker blue (e.g., HEX color #0E3996) and eventually turns purple (e.g., HEX color #723C7F) and dark grey (e.g., HEX color #DEDEDE), representing various degrees of brain tissue hypoxia. Assigning other, different colors to the brain tissue oxygen tension data points will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
[0234] The changes in color from normal white to dark blue and eventually grey follow an ease in/out function. This function will enable users better to detect low and high brain tissue oxygen tension extremes.
[0235]
Example 19: Visual Patient Monitoring Algorithm #19 (Visual Neuromuscular Transmission)
[0236] This algorithm allows the body 1 and both hands 9 (particularly also thumbs 9a) of the homunculus 10 (e.g.
[0237] According to the value of the neuromuscular transmission measurement (NMT) data input, the body 1, hands 9 and thumbs 9a of the homunculus 10 change from relaxed (flaccid) to tense, and the hands 9 show an extended thumb (thumbs up) 9a. The degree of relaxation of the hands and extension of the thumb will allow the user to intuitively understand the degree of neuromuscular transmission of the patient. NMT values indicative of good muscle relaxation cause the fingers of the hands and all joints to hang down, giving a relaxed indication. High NMT values, indicating good neuromuscular transmission, cause the hands to tighten and the thumbs on the hands to extend.
[0238] Also, according to the NMT data input, part 1 of the homunculus may change its shape to appear bent at the knees and the elbows to represent muscle weakness or muscle strength intuitively. NMT values indicative of good muscle relaxation lead to changes in the relative positions of the sections of part 1 of the homunculus representing the legs, arms and head, giving a relaxed indication: they appear to hang down following the gravitational force. High NMT values, indicating good neuromuscular transmission, cause the legs and arms to appear tense.
[0239] The degree of relaxation of the hands and extension of the thumb follows an ease in/out function. This enables users better to detect low and high NMT value extremes. Assigning other designs than those described here as an embodiment will occur to those skilled in the art and may be used without varying from the spirit of this disclosure.
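The posture change could be driven by a single eased relaxation value, e.g. from a train-of-four (TOF) ratio. The TOF-ratio input and the 60-degree maximum droop angle are assumptions; the disclosure specifies only that limbs hang down under relaxation following an ease in/out function.

```python
def limb_droop_angle(nmt_ratio: float, max_droop_deg: float = 60.0) -> float:
    """Droop angle for arms/legs of the homunculus from an NMT measurement
    (e.g. a TOF ratio, 0.0-1.0): full relaxation (0.0) gives maximum droop,
    full transmission (1.0) gives 0 degrees, eased in between."""
    t = min(max(nmt_ratio, 0.0), 1.0)
    eased = t * t * (3.0 - 2.0 * t)  # ease in/out
    return max_droop_deg * (1.0 - eased)
```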
[0240] The general visual patient monitoring algorithm (
[0241] The algorithms and subroutines illustrate the architecture, functionality, and operation of an implementation of the disclosure. If embodied in software, each block may represent a module, segment or portion of code that comprises program instructions to implement the specified logical function(s). The program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a computer system or other system. The machine code may be converted from the source code, etc. If embodied in hardware, each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).
[0242] Although the algorithms and subroutines show a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in the algorithms and subroutines may be executed concurrently or with partial concurrence. In addition, any number of numerals, waveforms, state variables, data buffers, warning semaphores, or messages might be added to the logical flow described herein, for purposes of enhanced utility, accounting, performance measurement, or providing troubleshooting aids, etc. It is understood that all such variations are within the scope of the present disclosure. Also, the block diagrams, flow charts and graphics are relatively self-explanatory and are understood by those with ordinary skill in the art to the extent that software and/or hardware can be created by one with ordinary skill in the art to carry out the various logical functions as described herein.
[0243] Where the algorithms and subroutines comprise software or code, it can be embodied in any computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a computer system or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the algorithms and subroutines for use by or in connection with the instruction execution system. The computer readable medium can comprise any one of many physical media such as, for example, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor media. More specific examples of a suitable computer-readable medium would include but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, flash memory, or compact discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), magnetic random access memory (MRAM), or non-volatile random-access memory (NVRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
[0244] Although the disclosure has been shown and described with respect to certain preferred embodiments, equivalent alterations and modifications will occur to those skilled in the art upon reading and understanding this specification and the annexed drawings. In particular regard to the various functions performed by the above described integers (components, assemblies, devices, compositions, etc.), the terms (including a reference to a “means”) used to describe such integers are intended to correspond, unless otherwise indicated, to any integer which performs the specified function of the described integer (i.e., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary embodiment or embodiments of the disclosure. In addition, while a particular feature of the disclosure may have been described above with respect to only one of several illustrated embodiments, such feature may be combined with one or more other features of the other embodiments, as may be desired or advantageous for any given or particular application.
[0245] An example of how warning semaphores are implemented as part of an embodiment of the disclosure is through the use of an alarm system representing an annunciator panel. Accompanying the graphical homunculus display (or instrument) could be a panel displaying alerts, showing fields with the designations of the monitored parameters in a specific order and place, as in an annunciator panel. The order of the fields and the normal range of the parameters may be configurable by the user or may be dynamically adapted by the software. When the measured value of a parameter is within its normal range, its corresponding field is invisible, i.e., not displayed.
[0246] According to predefined abnormal ranges, an abnormal measurement for a parameter may cause the field displaying the name of the abnormal parameter to become visible and turn yellow, i.e., to signify a state that requires CAUTION, or red, i.e., to indicate a WARNING of a dangerous state. In an embodiment, these alerts may be accompanied by a bar in the corresponding color (yellow or red), with the length of the colored bar indicating the magnitude of deviation from normal, and the position of the bar on the upper or lower border of the field indicating a deviation above (upper border) or below (lower border) the normal range. When the measured value of a parameter reaches the beginning of the defined CAUTION range, the colored yellow bar spans over half of the field's upper or lower border (50%). As the measured value moves from the value defined as the beginning of the CAUTION range toward the value defined as the beginning of the WARNING range, the length of the yellow bar gradually increases with the value of the measured parameter from 50% to 100%.
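One reading of this bar logic, for a parameter deviating above its normal range, can be sketched as follows (the thresholds are caller-supplied; the linear 50%-to-100% growth between the CAUTION and WARNING limits is an interpretation of the description):

```python
def alarm_bar(value: float, caution_start: float, warning_start: float):
    """Return (color, bar_fraction) for a parameter deviating above its
    normal range, or None while the value is still normal."""
    if value < caution_start:
        return None                   # normal: field stays invisible
    if value >= warning_start:
        return ("red", 1.0)           # WARNING: dangerous state
    t = (value - caution_start) / (warning_start - caution_start)
    return ("yellow", 0.5 + 0.5 * t)  # CAUTION: bar grows from 50% to 100%
```

A mirrored version with the bar on the lower border would handle deviations below the normal range.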
[0247] Additionally, the volume or area of parts (3D) or area (2D) or length representing the abnormal parameter in the visual patient display (or instrument) may alternate its color between the color displayed by the visual patient graphical display and the abnormal yellow or red color, e.g. in 1-second intervals. Combinations of two or more specified parameters in the CAUTION range will produce a WARNING and cause the display of an additional critical situations alert, indicating to the user that a dangerous situation may be occurring. A graphical example of this is shown in
[0248] In detail,
[0249] Further, in detail,
[0250] Critical situation alarm system
[0251] Table 2 outlines examples of combinations of parameters which, when in an abnormal range, will cause the display of an additional critical situation alert (“WARNING”) according to an embodiment of the disclosure.
TABLE 2
Parameter | above (↑) normal range | below (↓) normal range
Blood pressure (BP) | BIS ↑ | Pulse rate ↑; ECG QRS heart rate ↑; intracranial pressure ↑
Pulse rate from SpO2 sensor (PR) | Blood pressure (BP) ↓ | SaO2 ↓
Oxygen saturation (SpO2) | (none) | Tidal volume ↓; Respiratory rate ↓; Expiratory oxygen measurement ↓; Brain tissue oxygen tension ↓; Blood pressure (BP) ↓
Body temperature (Temp) | eCO2 ↑ (→ MH!) | ECG QRS heart rate ↓
ECG QRS heart rate (HR) | Blood pressure (BP) ↓ | Blood pressure (BP) ↓
ECG ST-segment deflection (ST) | (none) | (none)
ECG rhythm detection (ECG) | (none) | (none)
Right ventricular pressure (RVP) | eCO2 ↑; SaO2 ↓ | Blood pressure (BP) ↓
Pulmonary capillary wedge pressure or wedge pressure (PCWP) | Blood pressure (BP) ↓; CO/CI ↓ | Blood pressure (BP) ↓
Mixed venous oxygen saturation (MVOS) | (none) | CO/CI ↓
Central venous pressure (CVP) | Blood pressure (BP) ↓; CO/CI ↓ | ECG QRS heart rate ↑
Respiratory rate (RR) | (none) | Oxygen saturation ↓; Tidal volume ↓; Expiratory oxygen measurement ↓; Brain tissue oxygen tension ↓
Tidal volume (TV) | (none) | Oxygen saturation ↓; Respiratory rate ↓; Expiratory oxygen measurement ↓; Brain tissue oxygen tension ↓
Expiratory carbon dioxide (eCO2) | Tidal volume ↓; Respiratory rate ↓ | (none)
Expiratory oxygen measurement (eO2) | (none) | Oxygen saturation ↓; Tidal volume ↓; Respiratory rate ↓; Brain tissue oxygen tension ↓
Brain activity from e.g. Bispectral Index System (BIS), electroencephalogram (EEG) | Neuromuscular transmission measurement system ↓ | (none)
Intracranial pressure | Blood pressure ↓ | (none)
Brain tissue oxygen tension | (none) | Oxygen saturation ↓; Expiratory oxygen measurement ↓; Tidal volume ↓; Respiratory rate ↓
Neuromuscular transmission measurement system (NMT) | (none) | Brain activity ↑
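The WARNING escalation for combined abnormalities could be evaluated with a simple lookup over (parameter, direction) pairs. The three entries below are only an illustrative subset of the Table 2 combinations; the full table would be encoded the same way.

```python
# Subset of Table 2, keyed by (parameter, direction); each entry lists
# the other-parameter conditions that escalate to a WARNING.
CRITICAL_COMBOS = {
    ("BP", "low"): [("PR", "high"), ("HR", "high"), ("ICP", "high")],
    ("ICP", "high"): [("BP", "low")],
    ("SpO2", "low"): [("TV", "low"), ("RR", "low")],
}

def critical_warning(abnormal: set) -> bool:
    """True when any encoded combination of abnormal parameters is
    present; `abnormal` is a set of (parameter, direction) pairs."""
    return any(key in abnormal and any(c in abnormal for c in combos)
               for key, combos in CRITICAL_COMBOS.items())
```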