A61B7/00

System and method for infrasonic cardiac monitoring

Cardiac output (CO) has traditionally been difficult, dangerous, and expensive to measure. Surrogate measures such as pulse rate and blood pressure have therefore been used to estimate CO. MEMS technology, evolutionary computation, and time-frequency signal analysis techniques provide a means to estimate CO non-invasively from precordial (chest wall) motion. The technology detects the ventricular contraction time point and the stroke volume from chest wall motion measurements. Because CO is the product of heart rate and stroke volume, these algorithms permit continuous, beat-to-beat CO assessment. Nontraditional wavelet analysis can be used to extract features from chest acceleration, and a learning tool is preferably used to select the wavelet packets that best correlate with contraction time and stroke volume.
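As a sketch of the two ideas in this abstract, the following Python snippet shows a single Haar wavelet packet split, the kind of sub-band decomposition that feature extraction from chest acceleration could build on, together with the CO = heart rate × stroke volume identity. Function names and units are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def haar_packet_split(x):
    """One level of a Haar wavelet packet split: returns the
    (approximation, detail) sub-bands of an even-length signal."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-pass sub-band
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-pass sub-band
    return a, d

def cardiac_output_lpm(heart_rate_bpm, stroke_volume_ml):
    """CO (L/min) = heart rate (beats/min) x stroke volume (L/beat)."""
    return heart_rate_bpm * stroke_volume_ml / 1000.0
```

The orthonormal Haar split preserves signal energy across the sub-bands, so packet energies can serve directly as features; recursing on either sub-band yields the full wavelet packet tree from which a learning tool would select the best-correlating packets.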

Augmenting real-time views of a patient with three-dimensional data
11481987 · 2022-10-25

Augmenting real-time views of a patient with three-dimensional (3D) data. In one embodiment, a method may include identifying 3D data for a patient with the 3D data including an outer layer and multiple inner layers, determining virtual morphometric measurements of the outer layer from the 3D data, registering a real-time position of the outer layer of the patient in a 3D space, determining real-time morphometric measurements of the outer layer of the patient, automatically registering the position of the outer layer from the 3D data to align with the registered real-time position of the outer layer of the patient in the 3D space using the virtual morphometric measurements and using the real-time morphometric measurements, and displaying, in an augmented reality (AR) headset, one of the inner layers from the 3D data projected onto real-time views of the outer layer of the patient.
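The automatic registration step described above amounts to aligning two 3D point sets. A minimal sketch, assuming corresponding landmarks are available from the virtual and real-time morphometric measurements (the disclosure does not specify this algorithm), is the classic Kabsch least-squares rigid alignment:

```python
import numpy as np

def register_rigid(source_pts, target_pts):
    """Least-squares rigid (rotation + translation) alignment of two
    corresponding 3D point sets via the Kabsch algorithm. Returns (R, t)
    such that R @ p + t maps source points onto target points."""
    src = np.asarray(source_pts, float)
    tgt = np.asarray(target_pts, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)  # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps R a proper rotation (det = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Applying the recovered (R, t) to the outer layer from the 3D data would bring the inner layers along into the same 3D space for projection in the AR headset.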

INFANT FEEDING REINFORCEMENT SYSTEM

Disclosed herein are systems and methods for providing feeding reinforcement in real-time or near real-time based on physiological sensor data acquired during feeding. In some examples, music reinforcement is rendered when one or more feeding features indicative of one or more feeding behaviors are detected using the physiological sensor data. When at least one feature is not detected, the music reinforcement is stopped. In this way, contingent reinforcement is provided in real-time or near real-time based on detection of the one or more feeding behaviors to encourage and improve independent feeding behavior.
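The contingent-reinforcement rule described here can be sketched as a small controller: music plays while every monitored feeding feature is detected and stops as soon as at least one drops out. The feature names below are illustrative assumptions, not taken from the disclosure.

```python
class FeedingReinforcer:
    """Toggles music reinforcement from per-window feature detections
    derived from physiological sensor data."""

    def __init__(self, feature_names):
        self.feature_names = list(feature_names)
        self.music_on = False

    def update(self, detections):
        """`detections` maps feature name -> bool for the current window.
        Returns the new music state: on only if all features are present."""
        self.music_on = all(detections.get(f, False) for f in self.feature_names)
        return self.music_on
```

Calling `update` once per sensing window keeps the reinforcement contingent in real time or near real time on the detected feeding behaviors.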

DEVICES AND METHODS FOR PREDICTING, IDENTIFYING AND/OR MANAGING PNEUMONIA OR OTHER HEALTH STATUS

Devices and methods for predicting, identifying and/or managing pneumonia are disclosed and may generally comprise a housing, a mouthpiece in communication with the housing and configured for insertion within a mouth, a temperature sensor positioned within or along the housing, a pulse oximeter sensor positioned along the housing for sensing a heart rate or blood oxygen level, a microphone positioned within or along the housing for obtaining sounds associated with respiration, and a controller configured to receive physiologic parameters including the temperature, heart rate, blood oxygen level, and sounds associated with respiration. The controller can determine a likelihood of contracting pneumonia, a presence of pneumonia, or a status of pneumonia within the user based on deviations from a threshold value which are present in at least two of the physiologic parameters.
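The controller's "deviations in at least two of the physiologic parameters" rule can be sketched as follows; the parameter names and threshold bounds in the test are illustrative placeholders, not clinical values from the disclosure.

```python
def pneumonia_flag(readings, thresholds, min_deviations=2):
    """Flag a possible pneumonia likelihood/status when at least
    `min_deviations` physiologic parameters fall outside their
    (low, high) bounds. Returns (flag, list of deviating parameters)."""
    deviating = [name for name, value in readings.items()
                 if not (thresholds[name][0] <= value <= thresholds[name][1])]
    return len(deviating) >= min_deviations, deviating
```

A real controller would feed this from the temperature sensor, pulse oximeter, and respiration-sound analysis; the sketch only captures the threshold-counting logic.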

MATERNAL AND FETAL MONITORING DEVICE USING MULTIPLE SENSING UNITS

A maternal and fetal monitoring device using multiple sensing units is disclosed. The maternal and fetal monitoring device comprises a processor module and a plurality of sensor modules, wherein each sensor module comprises an inertial sensor, a temperature sensor, and a first acoustic sensor. After being attached to a maternal body, each sensor module collects a body temperature signal, an inertial signal, and a first sound signal. Subsequently, the processor module determines a maternal body posture by analyzing the inertial signals, determines a maternal physical condition and a fetal physical condition by analyzing the inertial signals, the plurality of first sound signals, and a second sound signal, and estimates physiological parameters of the maternal body and of the fetus by analyzing the body temperature signals, the plurality of first sound signals, and the second sound signal.
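As one illustration of the posture-determination step, a single accelerometer's measured gravity vector gives a coarse tilt estimate. The axis convention, labels, and threshold below are assumptions for the sketch, not details from the disclosure.

```python
import numpy as np

def estimate_posture(accel_xyz, upright_thresh_deg=30.0):
    """Coarse posture estimate from one inertial sensor: the tilt of the
    gravity vector relative to the sensor's z-axis (assumed normal to the
    chest) separates 'lying' from 'upright'. Returns (label, tilt_deg)."""
    a = np.asarray(accel_xyz, float)
    a = a / np.linalg.norm(a)  # keep only the direction of gravity
    tilt_deg = np.degrees(np.arccos(np.clip(a[2], -1.0, 1.0)))
    return ("lying" if tilt_deg < upright_thresh_deg else "upright"), tilt_deg
```

With several sensor modules attached at known sites, the processor module could fuse such per-module tilt estimates into a more reliable body-posture determination.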

Smart mask for COVID-19 screening, tracking and monitoring
11478191 · 2022-10-25

A smart face mask comprising a mask body; a temperature sensor; a respiration sensor; and a transmitter for transmitting information from the temperature sensor, the respiration sensor and a geotracker to a smart phone or smart watch.

DERIVING INSIGHTS INTO HEALTH THROUGH ANALYSIS OF AUDIO DATA GENERATED BY DIGITAL STETHOSCOPES
20230076296 · 2023-03-09

Introduced here are computer programs and associated computer-implemented techniques for deriving insights into the health of patients through analysis of audio data generated by electronic stethoscope systems. A diagnostic platform may be responsible for examining the audio data generated by an electronic stethoscope system so as to gain insights into the health of a patient. The diagnostic platform may employ heuristics, algorithms, or models that rely on machine learning or artificial intelligence to perform auscultation in a manner that significantly outperforms traditional approaches that rely on visual analysis by a healthcare professional.
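As a toy stand-in for the feature-extraction front end such a diagnostic platform might use (the disclosure does not specify one), the snippet below computes band-limited spectral energy fractions from auscultation audio; the band edges are assumptions, chosen only because heart sounds concentrate at low frequencies and lung sounds extend higher.

```python
import numpy as np

def band_energy_features(audio, sample_rate, bands=((20, 200), (200, 1000))):
    """Fraction of spectral energy in each frequency band of an audio
    clip. Such scalar features could feed a downstream heuristic,
    algorithm, or learned model for automated auscultation."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    total = spectrum.sum() or 1.0                        # guard empty input
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
            for lo, hi in bands]
```

In practice the platform would compute many such features over sliding windows and pass them to a trained classifier rather than inspect them directly.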