Intelligent apparatus for guidance and data capture during physical repositioning of a patient on a sleep platform
11033238 · 2021-06-15
Assignee
Inventors
Cpc classification
G16H20/30
PHYSICS
A61G2203/10
HUMAN NECESSITIES
G16H50/20
PHYSICS
G16H20/40
PHYSICS
A61B5/7264
HUMAN NECESSITIES
A61B5/11
HUMAN NECESSITIES
A61B5/1121
HUMAN NECESSITIES
A61B5/6898
HUMAN NECESSITIES
A61G5/1067
HUMAN NECESSITIES
A61B2562/0219
HUMAN NECESSITIES
A61G7/057
HUMAN NECESSITIES
International classification
A61B5/00
HUMAN NECESSITIES
G16H20/40
PHYSICS
A61B5/11
HUMAN NECESSITIES
G16H50/20
PHYSICS
Abstract
A system for guiding and evaluating physical positioning, orientation and motion of the human body, comprising: a cloud computing-based subsystem including an artificial neural network and spatial position analyzer, said cloud computing-based subsystem adapted for data storage, management and analysis; at least one motion sensing device wearable on the human body, said at least one motion sensing device adapted to detect changes in at least one of spatial position, orientation, and rate of motion; a mobile subsystem running an application program (app) that controls said at least one motion sensing device, said mobile subsystem adapted to capture activity data quantifying said changes in at least one of spatial position, orientation, and rate of motion, said mobile subsystem further adapted to transfer said activity data to said cloud computing-based subsystem, wherein said cloud computing-based subsystem processes, stores, and analyzes said activity data.
Claims
1. A system for guiding and evaluating physical positioning, orientation and movement of a human patient during repositioning on a sleep platform, comprising: a cloud computing-based subsystem, said subsystem including an artificial neural network and spatial position analyzer, said artificial neural network and said spatial position analyzer determining personalized boundary values for at least one of said physical positioning, orientation, and movement of said patient, achieved independent of personal mobility devices aiding said positioning, orientation, and movement, and outputting said values; at least one of a wearable garment or body attachment instrumented with at least one gyroscopic motion sensing device and at least one processing module, said at least one motion sensing device detecting changes and rate of changes in motion, including at least one of said physical positioning, orientation, and movement of said patient, and said at least one processing module receiving from said motion sensing device unprocessed motion change indicators and angle changes between at least two vectors relative to any reference physical orientation of said patient, and outputting said indicators and changes; a mobile subsystem receiving personalized boundary values from said cloud computing-based subsystem and receiving said angle changes between at least two vectors relative to any reference physical orientation from said wearable garment or attachment, said mobile subsystem comparing said angle changes with said boundary values to quantify degree of deviation, and, thereafter, creating a patient record comprising said changes and said rate of changes in at least one of said physical positioning, orientation, and movement, and transmitting said patient record to said cloud computing-based subsystem, said mobile subsystem further providing actionable aural guidance in substantially real-time to at least one of limiting or actuating changes and said rate of changes in at 
least one of physical positioning, orientation, and movement relative to said boundary values, and said actionable aural guidance including corrective indicators directed to patient position adjustments to achieve therapeutic movement to relieve points of pressure and preclude pressure injury, said indicators responsive to said degree of deviation from said personalized boundary values and to gauged response to said guidance during prescribed therapeutic repositioning of said patient on said sleep platform.
2. The system of claim 1, wherein said mobile subsystem further includes at least an artificial neural network and a spatial position analyzer operating independently, without interacting with said cloud computing-based subsystem.
3. The system of claim 1, wherein said at least one motion sensing device is adapted to mount in a position on said at least one of said garment or body attachment so that arm movements can be measured during said patient repositioning.
4. The system of claim 1, wherein said at least one motion sensing device is mounted on said at least one of said garment or body attachment so that an angle of body rotation movements can be measured during said patient repositioning.
5. The system of claim 1, wherein said cloud computing-based subsystem includes a specific purpose graphical user interface that displays said patient record and enables monitoring and analysis as to whether patient repositioning follows prescribed therapeutic guidelines.
6. The system of claim 1, wherein said boundary values are personalized for a specific individual person.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
(38) In brief:
(75) In detail: Referring now to
(76) Referring now to
(77) Referring now to
(78) The preferable output includes (1) a range of tilt and recline angles that are favorable for pressure reduction for the user 31; (2) the optimal tilt and recline angles that are most effective in reducing the risk of pressure ulcers 32; and (3) the optimal frequency and duration to perform wheelchair tilt and recline functions 33.
(79) Referring now to
(80) Referring now to
(81) Referring now to
(82) The network structure and weights of the ANN in the application are determined offline by using clinical research data on clinically recommended tilt and recline angles. Specifically, wheelchair users with spinal cord injury were recruited to participate in the research. A testing condition includes a five-minute sitting-induced ischemic period, i.e., the research participant sits in the upright position with no tilt or recline for 5 minutes, and a five-minute pressure relief period, i.e., the research participant sits in a clinically recommended tilt and recline setting for 5 minutes. The skin blood flow was measured throughout the test to determine whether a tilt and recline setting is favorable for increasing skin blood flow, a measure that has been widely used to determine the efficacy of wheelchair seating conditions. Then, the skin blood flow data was used to train the ANN to predict tilt and recline settings for individual wheelchair users. Other position parameters may be incorporated as well, such as the elevating leg-rest function of a power wheelchair. The ANN in the invention is fully configurable through adjusting the network structure 400 and weights. The ANN can be replaced by other artificial intelligence techniques, namely, any classification, clustering, or regression technique, such as a support vector machine (SVM), the C4.5 decision tree, a random forest, etc. The present invention supports this transparency in changing the AI module.
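Because the text above states that the AI module can be swapped transparently (ANN, SVM, decision tree, random forest, etc.), that replaceability can be sketched as a common interface. The interface name, method signature, and the toy threshold rule below are illustrative assumptions, not the patent's actual code.

```java
// Illustrative sketch: a common interface lets the AI module be replaced
// transparently, as the text describes. All names and the threshold rule
// here are hypothetical, not taken from the patent's source code.
interface SeatingClassifier {
    /** Returns true if the tilt/recline setting is predicted to be favorable. */
    boolean isFavorable(double tiltDeg, double reclineDeg);
}

// Stand-in implementation; a WEKA MLP, SVM, or decision tree could be
// substituted behind the same interface without touching any caller.
class ThresholdClassifier implements SeatingClassifier {
    public boolean isFavorable(double tiltDeg, double reclineDeg) {
        // Toy rule: larger tilt and recline angles assumed favorable
        return tiltDeg >= 25 && reclineDeg >= 100;
    }
}

public class AiModuleDemo {
    public static void main(String[] args) {
        SeatingClassifier model = new ThresholdClassifier(); // swap point
        System.out.println(model.isFavorable(30, 110));
    }
}
```

The design choice is the usual one: callers depend only on the interface, so retraining or replacing the model never forces changes elsewhere in the app.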
(83) Referring now to
(84) InitActivity.java: This class 51 shows the welcome screen when the application is loading. It calls ClsTrainner 52 to train the classifiers in the backend. Once it finishes initializing classifiers, this activity class will transfer to the Main 50A activity class.
(85) ClsTrainner.java: This class 52 is used to initialize a classifier and regression learner coded in the present invention. The classifier can classify whether a given tilt and recline setting is favorable for an individual with spinal cord injury (SCI) to reduce the risk of pressure ulcer. The regression learner can predict the extent of risk reduction for a given tilt and recline setting. This class runs in the backend as a thread when the application starts.
(86) Main.java: The Main class 50A is the container for all the fragment classes in this application. It provides the overall layout of the application.
(87) FragmentForm.java: This class 55 is used to provide the user interface to input data 53. Users can update their profiles (
(88) FragmentFrequency.java: This class 56 shows to the users the optimal duration and frequency to perform the wheelchair tilt and recline functions. It invokes the daemon thread that is running in the backend to return the optimal duration and frequency to the user interface (UI) thread.
(89) FragmentList.java: This class 58 provides a list of functions that are offered by the smartphone app. It redirects a user to the appropriate functions based on the user's choice.
(90) FragmentResult.java: This class 59 includes the template of My Range, My Optimal, and My Test screens (shown on
(91) InputData.java: This is a singleton class 53, meaning it has only a single instance in memory. It contains all the data in this application and acts as its data store. The trained functions (classifier and regression) as well as user inputs are all stored in this class.
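The single-instance data store described above can be sketched with the standard singleton pattern. The field names below are hypothetical; only the one-shared-instance idea is from the description.

```java
// Minimal sketch of the singleton pattern InputData uses. Field names are
// illustrative assumptions; only the single-instance data-store idea is
// taken from the description above.
public class InputData {
    private static final InputData INSTANCE = new InputData();

    private Object trainedClassifier; // trained functions stored here
    private Object userInputs;        // user inputs stored here

    private InputData() {}            // private constructor blocks outside instantiation

    public static InputData getInstance() {
        return INSTANCE;              // every caller shares the one instance
    }

    public void setUserInputs(Object inputs) { userInputs = inputs; }
    public Object getUserInputs() { return userInputs; }
}
```

Because every fragment reads and writes the same instance, the trained functions and the user's inputs stay consistent across the whole application.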
(92) ResultTask.java: The ResultTask class 54 is running in the backend as a daemon thread. Its functionality is to make predictions based on a user's profile (
(93) FragmentAngleMeter.java: This class 57 provides the goniometer function. It reads the accelerometer sensor in the smartphone and calculates the current angle of the phone orientation for the user. This class provides a novel algorithm to measure wheelchair tilt and recline (TR) angles by using the accelerometer in a smartphone. Specifically, the position of a smartphone is modeled with a vector ν = ⟨α_x, α_y, α_z⟩, which represents accelerations in three axes measured by the accelerometer. When the tilt or recline stabilizes to a new angle, accelerations in the three axes will change due to the decomposition of gravity along the new angle of the phone. Then, we utilize the dot product property to calculate angle changes between two vectors (positions):
ν_1 · ν_2 = |ν_1| × |ν_2| × cos θ  (1)
Or equivalently,
θ = arccos(ν_1 · ν_2 / (|ν_1| × |ν_2|))  (2)
Hence, no matter how the smartphone is positioned, the TR angle θ between two vectors can be measured. In addition, this class employs the novel text-to-speech technique (see class IntentService.java), which enables the system to use voice alerts to guide wheelchair users for proper TR usage.
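Equations (1) and (2) translate directly into code. The class and method names below are hypothetical; the computation follows the dot-product formula above.

```java
// Sketch of the dot-product angle measurement from equations (1) and (2).
// Class and method names are illustrative; the math follows the text.
public class Goniometer {
    /** Angle in degrees between two 3-axis acceleration vectors. */
    public static double angleBetween(double[] v1, double[] v2) {
        double dot = 0, n1 = 0, n2 = 0;
        for (int i = 0; i < 3; i++) {
            dot += v1[i] * v2[i];   // v1 . v2
            n1  += v1[i] * v1[i];
            n2  += v2[i] * v2[i];
        }
        double cos = dot / (Math.sqrt(n1) * Math.sqrt(n2)); // equation (2)
        cos = Math.max(-1.0, Math.min(1.0, cos)); // clamp floating-point drift
        return Math.toDegrees(Math.acos(cos));
    }
}
```

Because only the angle between the two stored vectors matters, the result is independent of how the phone happens to be positioned, which is exactly the property the text relies on.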
(94) IntentService.java: This class 571 implements the Android text-to-speech listener and initializes the text-to-speech function for the subsequent usage.
(95) Referring now to
(96) Index Page 61 (index.html): Index page 61 is the first web page that a user can access. It provides options for registered users to sign in and for unregistered users to register.
(97) Register 611 (SignInServlet): It is a Java Servlet that is invoked by index.html and allows unregistered users to register and create their own user names and passwords. A Java servlet is a class that is used to extend the functionality of the cloud.
(98) Sign in 612 (SignInServlet): It is a Java servlet used by index.html to sign in and register users given a username and password.
(99) User Welcome Page 62 (welcome.jsp): It is the welcome page after a user successfully signs in the system.
(100) Profile Page 621 (profile.jsp): This page allows users to create their own profiles including their demographic attributes, neurological information, and pressure ulcer history, etc.
(101) Update Profile 6211 (UpdateUserServlet): It is a servlet class that is invoked by profile.jsp to update the user's profile.
(102) Check Angle Page 622 (check.jsp): This page gives a user the option to check whether a particular wheelchair tilt and recline setting will be favorable for the individual user to reduce pressure ulcer risk.
(103) Check Angles 6221 (CheckAnglesServlet): It is a servlet class that is invoked by check.jsp to check whether a particular wheelchair tilt and recline setting will be favorable for the individual user to reduce pressure ulcer risk.
(104) Range of Angles Page 623 (result.jsp): This page shows the range of tilt and recline angles that are favorable for reducing the risk of pressure ulcers.
(105) Optimal Angle Page 624 (optimal.jsp): This page shows the optimal wheelchair tilt and recline settings that may most effectively reduce risk of pressure ulcers.
(106) Duration and Frequency Page 625 (duration.jsp): This page illustrates the optimal duration and frequency to perform wheelchair tilt and recline functions. For example, the user should perform wheelchair tilt and recline functions every 15 minutes (i.e., frequency) and each time the user should maintain that setting for 3 minutes (i.e., duration).
(107) Admin User List Page 63 (admin.jsp): This is a page designed for administrators, who maintain users through “add”, “edit”, and “delete” operations.
(108) Delete User 631 (DeleteUserServlet): It is a Java servlet used by admin.jsp when an administrator attempts to delete an application user.
(109) Edit User Page 632 (edituser.jsp): This is a web page that invokes Servlets to add a new user or update an existing user.
(110) Edit User 64 (UpdateUserServlet): It is a Java servlet used by admin.jsp when an administrator attempts to edit a user's information.
(111) Create New User 65 (UpdateUserServlet): The same UpdateUserServlet can also be used to create a new user.
(112) Referring now to
(113) Login Screen 71 (LoginActivity): It is the starting Android activity that calls register and signin methods and redirects the user to the MenuActivity 701 if the user name and password are verified successfully. Activity is an Android term that represents a function that a user can perform.
(114) Register 72: It invokes the Datastore class (Datastore.register function) that interacts with the Google App Engine datastore to store new user's information (see
(115) Datastore 721: This class interacts with the Google App Engine datastore service and is used by both the mobile endpoints and java servlets.
(116) Sign In 73: It invokes the Datastore class (Datastore.signin function) that interacts with the Google App Engine datastore to validate the user's information (see
(117) User Menu Screen 701 (MenuActivity): It is the main activity that shows the main menu of the system. It consists of the currently selected fragment and a navigation list for changing fragments. A fragment is an Android term that represents a portion of the user interface.
(118) Profile Screen 74 (FragmentForm): It is a fragment that consists of the input fields for user information. Once the button at the bottom of the fragment is pressed, the given information is then updated 741 to the datastore in the cloud (see
(119) Check Angle Page 75 (FragmentCheck): It is a fragment that determines if the given tilt and recline angles 751 are in the ranges provided by the artificial neural network (see
(120) Range of Angles Page 76 (FragmentResult): It is a fragment that displays a list of ranges provided by the artificial neural network (see
(121) Optimal Angles Page 77 (FragmentOptimal): It is a fragment that displays the optimal angles of wheelchair tilt and recline provided by the artificial neural network (see
(122) Duration and Frequency Page 78 (FragmentFrequency): It is a fragment used to check the duration and frequency that the user should perform wheelchair tilt and recline functions. For example, the user should perform wheelchair tilt and recline functions every 15 minutes (i.e., frequency) and each time the user should maintain that position for 3 minutes (i.e., duration).
(123) Goniometer 79 (FragmentAngleAdjustment): It is a fragment used to display the current angle of the phone. It reads the accelerometer sensor in the smartphone and calculates the current angle of the phone orientation for the user. A desired angle can be set by using the device's menu button. The background of this fragment will turn greener the closer the current angle is to the desired angle.
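The "turns greener" feedback above can be modeled as a linear mapping from angle error to a green channel value. The 0–255 channel range, the linear falloff, and the maxErrorDeg cutoff are assumptions for illustration, not details from the patent.

```java
// Illustrative mapping from angle error to a green background intensity,
// matching the "turns greener as the angle approaches the target" behavior.
// The 0-255 channel range and maxErrorDeg cutoff are assumptions.
public class AngleFeedback {
    /** 255 at the desired angle, falling linearly to 0 at maxErrorDeg away. */
    public static int greenLevel(double currentDeg, double desiredDeg, double maxErrorDeg) {
        double error = Math.min(Math.abs(currentDeg - desiredDeg), maxErrorDeg);
        return (int) Math.round(255 * (1.0 - error / maxErrorDeg));
    }
}
```

A fragment could feed the returned value into its background color on each accelerometer update, giving continuous visual guidance alongside the voice alerts.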
(124) Referring now to
(125) ApplicationUser 81: consists of all user fields and represents the entity structure stored in the Google App Engine (GAE) datastore.
(126) BloodFlowCore 82: contains methods for interacting with the WEKA API, an open-source data mining platform, and for returning the BloodFlowResult object. This is where the artificial neural network is built and angles are returned.
(127) BloodFlowResult 83: contains all output results needed and eventually displayed to the user, including a list of tilt and recline ranges, the optimal angles, and duration and frequency.
(128) Range 84: is a class used to hold one set of tilt and recline ranges.
(129) UserEndpoint 85: this Endpoint class manipulates ApplicationUser entities in the datastore by calling the Datastore class methods. Endpoint classes are located in the GAE source code and are annotated to be generated into an API to be used with Android.
(130) CheckAnglesServlet 86: is a servlet class that checks whether a particular wheelchair tilt and recline setting will be favorable for the individual user to reduce pressure ulcer risk.
(131) SignInServlet 87: is a Java servlet used to sign in and register users given a username and password.
(132) ResultEndpoint 88: this endpoint creates a BloodFlowResult object to store results from the runBloodFlowCore method. Endpoint classes are located in the GAE source code and are annotated to be generated into an API to be used with Android.
(133) UpdateUserServlet 89: is a Java servlet used when an administrator attempts to edit a user's information.
(134) DeleteUserServlet 810: is a Java servlet used when an administrator attempts to delete an application user.
(135) SignOutServlet 811: This class provides the sign out function in the web application.
(136) MLP.java 812: The MLP class is customized by adding getNumWeights( ), importWeights( ), and exportWeights( ) methods. These methods allow the ANN to be reconstructed if the network structure and weights are provided.
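The round trip those three methods enable can be sketched with a stand-alone class. This is not WEKA's MLP; it only illustrates exporting the weights to a flat array and re-importing them to rebuild the same state.

```java
// Stand-alone sketch of the export/import round trip that the customized
// getNumWeights()/importWeights()/exportWeights() methods enable. This is
// not WEKA's MLP class, only an illustration of the idea.
public class WeightStore {
    private double[] weights;

    public WeightStore(int numWeights) { weights = new double[numWeights]; }

    public int getNumWeights() { return weights.length; }

    /** Copy the weights out, e.g. for persistence in the cloud datastore. */
    public double[] exportWeights() { return weights.clone(); }

    /** Rebuild the state from previously saved weights. */
    public void importWeights(double[] saved) {
        if (saved.length != weights.length)
            throw new IllegalArgumentException("network structure mismatch");
        weights = saved.clone();
    }
}
```

Persisting only the structure plus a weight array is what lets the mobile client reconstruct the trained ANN without retraining on the device.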
(137) MLP 812, LinearUnit 816, NeuralEnd 817, and NeuralConnection 818 are obtained from WEKA, which is an open source platform for data mining. These classes are used to model the artificial neural network. LinearUnit 816, NeuralEnd 817, and NeuralConnection 818 are used without any customizations.
(138) Referring now to
(139) AngleData 813: is the data type class that models tilt and recline angle data, which is sent from the mobile client.
(140) DataManager 814: is the class that handles the communication between the client and Google datastore.
(141) EMF 815: EntityManagerFactory helps communication between the Google datastore and the application.
(142) Referring now to
(143) LoginActivity 91: it is the starting Android activity that calls register and signin methods and redirects the user to the MenuActivity 92 if the user name and password are verified successfully. Activity is an Android term that represents a function that a user can perform.
(144) MenuActivity 92: it is the main activity that shows the main menu of the system. It consists of the currently selected fragment and a navigation list for changing fragments. A fragment is an Android term that represents a portion of the user interface.
(145) FragmentForm 921: It is a fragment that consists of the input fields for user information. Once the button at the bottom of the fragment is pressed, the given information is then updated to the datastore in the cloud. A fragment is an Android term that represents a portion of the user interface.
(146) FragmentCheck 922: It is a fragment that determines if the given tilt and recline angles are in the ranges provided by the artificial neural network.
(147) FragmentResult 923: It is a fragment that displays a list of ranges provided by the artificial neural network. These ranges are favorable tilt and recline combinations that can help reduce the risk of pressure ulcers.
(148) FragmentOptimal 924: It is a fragment that displays the optimal angles of wheelchair tilt and recline provided by the artificial neural network.
(149) FragmentFrequency 925: It is a fragment used to check the duration and frequency that the user should perform wheelchair tilt and recline functions. For example, the user should perform wheelchair tilt and recline functions every 15 minutes (i.e., frequency) and each time the user should maintain that setting for 3 minutes (i.e., duration).
(150) FragmentAngleAdjustment 926: It is a fragment used to display the current angle of the wheelchair (tilt or recline). It reads the accelerometer sensor in the smartphone and calculates the current angle of the phone orientation for the user. A desired angle can be set by using the device's menu button. The background of this fragment will turn greener the closer the current angle is to the desired angle.
(151) FragmentList 927: is a fragment that provides a list of functions that are offered by the smartphone app. It redirects a user to the appropriate functions based on the user's choice.
(152) Datastore 93: this class is used by the mobile endpoints to interact with the Google App Engine datastore to manipulate data.
(153) UserEndpoint 94: this Endpoint class manipulates ApplicationUser entities in the datastore by calling the Datastore class methods. Endpoint classes are located in the GAE source code and are annotated to be generated into an API to be used with Android.
(154) ResultEndpoint 95: this endpoint creates a BloodFlowResult object to store results from the runBloodFlowCore method. Endpoint classes are located in the GAE source code and are annotated to be generated into an API to be used with Android.
(155) BloodFlowCore 96: contains methods for interacting with the WEKA API, an open-source data mining platform, and for returning the BloodFlowResult object. This is where the artificial neural network is built and angles are returned.
(165) The present invention 10 can benefit all wheelchair users, who use a wheelchair with either a tilt or both tilt and recline functions. Both power and manual wheelchair users can benefit from this and other functions of the present invention 10. Healthcare providers and researchers will benefit from the present invention 10, as well. If they use the tilt and recline guidance provided by the present invention 10, the guidance will be automatically provided as inputs to the measurement and notification process 130 implemented in source code and operable on a mobile device. If the healthcare providers and researchers do not use the personalized guidance, the present invention 10 will allow them to input alternative tilt and recline (TR) guidelines (see
(166) As shown in
(167) In step 2, the goniometer asks the wheelchair user to adjust the wheelchair to the upright position (i.e., no tilt or recline). As shown in
(168) In step 3, the wheelchair user adjusts the wheelchair to the upright position following the voice guidance.
(169) In step 4, the wheelchair user touches the screen of the smartphone after the wheelchair has been adjusted to the upright position.
(170) In step 5, the goniometer asks the user to sit still so that the goniometer can record the initial position of the smartphone. This step is needed to ensure the precision of angle calculation. Voice alert is used to guide the user. For example, the voice alert may recite the non-limiting script “Please do not move your phone for five seconds.” As shown in
(171) In step 6, the goniometer may be configured to ask the user to adjust the tilt angle by using a voice alert. For example, the voice alert may recite the non-limiting script “You may now adjust your position. Please adjust your tilt to 15 degrees.”
(172) In step 7, the wheelchair user starts to adjust the tilt angle as instructed by the voice alert. In the meantime, the goniometer will measure and display the current tilt angle on the screen of the smartphone as shown in
(173) In step 8, if the target tilt angle has been reached, the goniometer may be configured to ask the wheelchair user to stop with the voice alert. For example, the voice alert may recite the non-limiting script “Please stop!”
(174) In step 9, the goniometer may be configured to ask the wheelchair user to adjust the recline angle by using the voice alert. For example, the voice alert may recite the non-limiting script “Please adjust your Recline to 110 degrees.”
(175) In step 10, the wheelchair user starts to adjust the recline angle. In the meantime, the goniometer will measure and display the current recline angle on the screen of the smartphone as shown in
(176) In step 11, if the target recline angle has been reached, the goniometer of the present invention may be configured to use an aural instruction where the user may be asked with the voice alert to stop. For example, the voice alert may recite the non-limiting script “Please stop! You are now in your target position.” In the meantime, the goniometer will also show the final angle and the stop message on the screen of the smartphone as shown in
(182) The position of the smartphone is modeled with a vector ν = ⟨α_x, α_y, α_z⟩, which represents accelerations in three axes measured by the accelerometer. When the tilt or recline stabilizes to a new angle, accelerations in the three axes will change due to the decomposition of gravity along the new angle of the phone. Then, the present invention utilizes the dot product property to calculate angle changes between two vectors (positions):
ν_1 · ν_2 = |ν_1| × |ν_2| × cos θ  (1)
Or equivalently,
θ = arccos(ν_1 · ν_2 / (|ν_1| × |ν_2|))  (2)
Hence, no matter how the smartphone is positioned, the TR angle θ between two vectors can be measured. In addition, the mobile subsystem 191 employs the novel text-to-speech technique, which enables the system to use voice alerts to guide wheelchair users for proper TR usage.
(183) The present invention 10 provides a cloud computing-based subsystem 192 that can provide personalized guidance on wheelchair tilt and recline usage using the artificial neural network, and process, store, and analyze wheelchair 193 TR usage data. This subsystem employs the cloud computing paradigm, which can provide virtually unlimited resources for computation and data storage. Based on the longitudinal TR usage data, the present invention 10 may be used to provide operational applications for mobile devices to evaluate whether wheelchair users adjust enough TR angles to relieve seating pressure and whether they frequently reposition themselves by performing TR functions. The present invention 10 may be used to provide a novel machine-learning approach to analyze historical data of an individual wheelchair user, and assess his or her pressure ulcer (PU) risks correspondingly.
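One of the evaluations described above, whether users reposition themselves frequently enough, can be sketched as a scan over timestamped TR events. The class and method names and the maximum-gap threshold are illustrative assumptions, not the patent's implementation.

```java
import java.util.List;

// Sketch of evaluating repositioning frequency from longitudinal TR usage
// data against a prescribed maximum gap (e.g. 15 minutes). Names and the
// threshold are illustrative assumptions.
public class ComplianceCheck {
    /** eventMinutes: ascending timestamps (in minutes) of completed TR adjustments. */
    public static boolean repositionsOftenEnough(List<Integer> eventMinutes, int maxGapMin) {
        for (int i = 1; i < eventMinutes.size(); i++) {
            if (eventMinutes.get(i) - eventMinutes.get(i - 1) > maxGapMin)
                return false; // a gap exceeded the prescribed frequency
        }
        return true;
    }
}
```

Run against the cloud-stored usage history, a check like this flags users whose actual repositioning falls short of the prescribed schedule.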
(184) The present invention 10 may use the Google App Engine (GAE) as the cloud computing platform. GAE is managed by Google and provides a platform for developing and hosting web applications. Note that other techniques may be used to replace GAE. Essentially, there are currently three options: (1) continue to use commercial cloud computing platforms, such as Google App Engine, Microsoft Azure, Amazon EC2, etc.; (2) set up a dedicated private cloud computing platform; or (3) use a traditional web server as the data management and computation platform. Other options may emerge in the future and are anticipated as possible web development and hosting solutions to support implementation of various features of the present invention.
(185) The combination of mobile and cloud computing can yield a balanced and integrated system, in which the mobile subsystem 191 will collect the user's information, display personalized guidance on TR usage, control the sensor, measure wheelchair TR angles, and transmit TR usage data to the cloud, while the cloud subsystem 192 will handle the subsequent data management and analysis. Therefore, the present invention 10 provides a practical way to improve wheelchair 193 TR usage and capture longitudinal TR usage data among wheelchair users.
(186) The mobile application of the present invention 10 may be implemented for any mobile operating system, including the mainstream mobile operating systems, such as Google Android, Apple iOS, and Microsoft Windows. To use the mobile application provided by the present invention 10, the user needs to download it from an accessible public source where it may be made available, such as Google Play, Apple Store, or Windows App Store depending on the mobile operating systems they use.
(187) Referring now to
(188) Goniometric measurements (e.g., position, motion, orientation) provided using the present invention 20 may be used as outcome measures (e.g., after a course of treatment), as an exam finding to aid in the diagnosis of a condition, to monitor physical impact and activity, and to determine level of fitness for a specific purpose. System responses are anticipated to at least touch and voice commands received 209 from a user 2021 of the mobile devices 201 and 202. Audio recitation and response are anticipated. User 2021 touch, voice activation, and audio recitation functions are generally programmable and operable on industry-standard smart devices 201 such as various device models of iPhone, iPad, Samsung Galaxy, HP tablets, and wearable devices 202 such as AppleWatch, FitBit, Google Glass 203, etc., running on operating systems such as Android, iOS, and Windows. Any such mobile device 201, 202, and 203 having the minimum function set as described herein is anticipated as a usable component in the present invention 20.
(189) Wearable devices, including but not limited to trousers 205, shirts 206, gloves 207, footwear (e.g., socks, shoes) 208, and headgear (e.g., caps, helmets) 209, instrumented with detection devices 50 capable of providing at least goniometer functions (e.g., motion, position, orientation), may be used in preferred embodiments of the present invention 20. In some embodiments the detection devices 50, such as the Intel Curie™ module, may store and process physical parameters. Wearable devices 205, 206, 207, 208, and 209 comprising smartgarments and smartheadgear instrumented with detection devices 50 may be adapted to measure, among other parameters, flexion and extension of the joints in a skeletal system, physical impact and activity, as well as tilt and recline angles for wheelchair users. Measured parameters may be processed locally in the detection device 50 on a wearable device 202, 205, 206, 207, 208, and 209 or transmitted 209, 259, 269, 279, 289, and 299 using, for example, Bluetooth™ to a computation-capable smart device 201 for processing. In some preferred embodiments, parameters measured by detection devices 50 on the wearable devices 202, 205, 206, 207, 208, and 209 may be sent using, for example, WiFi to the cloud computing-based subsystem 192 for processing and storage, as well as access by clinicians.
(190) Referring now to
(191)
(192)
(193) Referring now to
(194) In step 2, the smartphone application audibly and/or visually reminds the wheelchair user to perform wheelchair TR and checks whether the wheelchair user is ready.
(195) In step 3, the wheelchair user confirms his/her readiness.
(196) In step 4, the smartphone application audibly and/or visually directs the wheelchair user to adjust the wheelchair to the upright position (i.e., no tilt or recline). The novel voice alert technique may be used to guide the user, e.g., “Please make sure that your wheelchair is in the upright position. Say ready when you are ready!”
(197) After adjusting to the upright position (step 5), the wheelchair user confirms his/her readiness by saying “ready” (step 6) or signaling in an alternative fashion (e.g., touching the screen).
(198) In step 7, the smartphone application audibly and/or visually directs the wheelchair user to place the arm that wears the Microsoft Band 250 in
(199) After 5 seconds (step 8), the smartphone application audibly and/or visually directs the wheelchair user to adjust the tilt angle. While the wheelchair user is adjusting the tilt angle (step 9), the smartphone application will read sensor data from the Microsoft Band 250 in
(200) Then, the smartphone application will audibly and/or visually direct the wheelchair user to adjust the recline angle. Similarly, while the wheelchair user is adjusting the wheelchair 193 in
(201) Once the prescribed angle is reached, the smartphone application will audibly and/or visually direct the wheelchair user to stop (steps 17 and 18). The present invention 20 considers the lag that occurs when the user hears the voice alert and then stops adjusting the wheelchair position. The invention calculates the anticipated time to reach the target angle based on the angular speed of wheelchair 193 in
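The lag compensation described above can be sketched as a small timing computation: given the measured angular speed of the adjustment and an assumed user reaction lag, the alert is scheduled early enough that the chair settles at the target angle. The class name, method signature, and the 0.8-second reaction lag below are assumptions for illustration, not values from the disclosure.

```java
// Illustrative sketch of the stop-alert timing: estimate when to issue
// "Please stop!" so the wheelchair settles at the prescribed angle.
public class StopAlertTiming {

    /**
     * Seconds to wait before issuing the stop alert.
     * currentDeg  - current tilt/recline angle
     * targetDeg   - prescribed target angle
     * degPerSec   - measured angular speed of the ongoing adjustment
     * reactionSec - assumed lag between hearing the alert and stopping
     */
    static double secondsUntilAlert(double currentDeg, double targetDeg,
                                    double degPerSec, double reactionSec) {
        double timeToTarget = (targetDeg - currentDeg) / degPerSec;
        return Math.max(0.0, timeToTarget - reactionSec); // never negative
    }

    public static void main(String[] args) {
        // 10 degrees from target at 2 deg/s with a 0.8 s reaction lag:
        // the alert fires shortly before the target would be reached.
        System.out.printf("%.1f s%n", secondsUntilAlert(5.0, 15.0, 2.0, 0.8));
    }
}
```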
(202) Next, in step 19, the smartphone application of the present invention 20 audibly and/or visually directs the wheelchair user to maintain the current position for a preset duration (e.g., 1 minute). After the preset duration is over, the smartphone application will notify the wheelchair user that he/she has completed the protocol (step 20), and can resume his/her normal activities. The wheelchair TR usage data will be sent to the cloud computing-based subsystem (192 in
(203) Referring now to
(204) Main 251 is the application's primary Activity. Its responsibilities include holding all of the fragments used throughout the lifecycle of the application; launching the appropriate interface for the user to see; and caching the in-memory representation of the statistical data gathered during the adjustments for quick loading in the statistics fragment (i.e., FragmentStats 255).
(205) FragmentSignIn 252 provides a “Sign-in” screen for a new user. The Sign-in screen is only shown by the Main 251 Activity if no username/password combination is currently stored for the user. This occurs only on the first run of the application; the screen stops appearing after a username/password combination has been set.
(206) FragmentSleepSettings 253 shows/provides access to user created sleep timers. This fragment presents sleep timers in a list view which shows the times and active days and gives buttons to toggle the active state of each timer. This fragment is responsible for launching FragmentSleepItemEdit 254 on clicking (i.e. activating) one of the timer list items or pressing the add timer button.
(207) FragmentSleepItemEdit 254 allows a user to edit sleep timer settings, and presents an interface to be used to edit an existing sleep timer or create a new one.
(208) FragmentStats 255 provides access to statistics for a given day. This fragment is responsible for launching FragmentStatDetail 256 for a clicked (activated) list item, and for parsing the angle data CSV file and constructing a data structure to hold the statistics using the AngleStatisticsManager 2520 class.
(209) FragmentStatDetail 256 shows statistical details for a selected day. This fragment is responsible for showing statistical data for each adjustment made on the selected day, and for displaying a graph showing angular displacement over time for selected adjustments.
(210) FragmentAngleMeter 257 provides the Main 251 tilt and recline meter interface. This fragment is responsible for communicating with the AngleMeterBackgroundService 258 to show interface components of a user's adjustment. The FragmentAngleMeter 257 displays angle changing in real time; shows any text based instructions to the user; verifies angle settings before sending them to AngleMeterBackgroundService 258 to start an adjustment; and initializes Microsoft Band 250 tile (if connected).
(211) AngleMeterBackgroundService 258 is responsible for initiating adjustments; providing Text to Speech and Voice Recognition features; notifying a user that it is time to make an adjustment; detecting Microsoft band 250 (if connected); telling AngleDataManager 2512 to send adjustment data to the cloud computing-based subsystem (192 in
(212) AngleMeterAdjustmentLogic 2514; and setting reminders for future adjustments at appropriate times.
(213) TimerManager 259 holds the list of user created sleep timers and is responsible for checking to see if a sleep timer is currently active; and for writing timers to/recalling timers from internal storage.
(214) SleepPeriod 2511 is a data model class to represent a sleep timer. This class holds information relevant to sleep timers, and contains helper methods to determine if a sleep timer is currently active.
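A minimal sketch of such a sleep-timer data model is shown below, assuming a timer is defined by a start time, an end time, and the days on which it is active; the field names and the midnight-crossing handling are illustrative assumptions, since the disclosure does not specify them.

```java
import java.time.DayOfWeek;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.EnumSet;
import java.util.Set;

// Hypothetical SleepPeriod-style data model with an "is it active now" helper.
public class SleepPeriodSketch {
    final LocalTime start, end;
    final Set<DayOfWeek> activeDays;

    SleepPeriodSketch(LocalTime start, LocalTime end, Set<DayOfWeek> days) {
        this.start = start; this.end = end; this.activeDays = days;
    }

    // True if 'now' falls inside the timer window, handling windows
    // that cross midnight (e.g., 22:00-06:00).
    boolean isActive(LocalDateTime now) {
        if (!activeDays.contains(now.getDayOfWeek())) return false;
        LocalTime t = now.toLocalTime();
        if (start.isBefore(end)) {
            return !t.isBefore(start) && t.isBefore(end);
        }
        return !t.isBefore(start) || t.isBefore(end); // window crosses midnight
    }

    public static void main(String[] args) {
        SleepPeriodSketch night = new SleepPeriodSketch(
            LocalTime.of(22, 0), LocalTime.of(6, 0),
            EnumSet.allOf(DayOfWeek.class));
        System.out.println(night.isActive(LocalDateTime.of(2021, 6, 15, 23, 30)));
        System.out.println(night.isActive(LocalDateTime.of(2021, 6, 15, 12, 0)));
    }
}
```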
(215) AngleDataManager 2512 interacts with Google app engine cloud storage. This class is responsible for uploading angle data to the cloud computing-based subsystem (192 in
(216) LocalDataOpenHandler 2513 is an Android helper class for creating and maintaining the SQLite database. This class is responsible for holding angle data until it can be uploaded.
(217) AngleMeterAdjustmentLogic 2514 performs logic needed to carry out an adjustment. This class is responsible for maintaining current adjustment state; proceeding to next step of adjustment as designed; setting reminders for future adjustments by interacting with the Android system through the built-in Android AlarmManager class operating in the mobile device (e.g., 201 in
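The adjustment state such a class maintains can be sketched as a simple state machine that walks the protocol from the upright position through the tilt and recline adjustments to the hold period. The state names and transition order below are assumptions drawn from the protocol steps described earlier, not from a disclosed implementation.

```java
// Hypothetical sketch of the adjustment state machine a class like
// AngleMeterAdjustmentLogic 2514 might maintain:
// upright -> tilt -> recline -> hold -> done.
public class AdjustmentLogicSketch {
    enum State { UPRIGHT, ADJUST_TILT, ADJUST_RECLINE, HOLD, DONE }

    State state = State.UPRIGHT;

    // Advance to the next protocol step; returns the new state.
    State nextStep() {
        switch (state) {
            case UPRIGHT:        state = State.ADJUST_TILT;    break;
            case ADJUST_TILT:    state = State.ADJUST_RECLINE; break;
            case ADJUST_RECLINE: state = State.HOLD;           break;
            case HOLD:           state = State.DONE;           break;
            case DONE:           break; // terminal state
        }
        return state;
    }

    public static void main(String[] args) {
        AdjustmentLogicSketch logic = new AdjustmentLogicSketch();
        while (logic.state != State.DONE) {
            System.out.println(logic.nextStep());
        }
    }
}
```

Keeping the protocol position in a single state field is what allows the background service to resume the correct step after a reminder fires.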
(218) PhoneAccelerometerListener 2515 extends AngleMeterAdjustmentLogic 2514 when a smartphone (e.g., 201 in
(219) BandAccelerometerListener 2516 extends AngleMeterAdjustmentLogic 2514 when Microsoft Band 250 is used as the motion sensor. This class is responsible for getting accelerometer data and setting an AngleCalculationStrategy (e.g., 294
(220) AngleMeterSensorManager 2517 provides an interface that declares the actions that need to be carried out when registering and unregistering sensors for use with an AngleMeter application.
(221) BandSensorManager 2518 provides implementation of AngleMeterSensorManager 2517 actions when a Microsoft Band 250 is being used.
(222) PhoneSensorManager 2519 provides implementation of AngleMeterSensorManager 2517 actions when a smartphone (201 in
(223) AngleStatisticsManager 2520 parses angle statistic data from the local CSV file and stores it in memory for use by the statistics fragments. This class also provides methods to get statistics for a given day and adjustment.
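The CSV parsing and per-adjustment statistics such a manager performs might look like the sketch below. The column layout (timestamp, tilt, recline) and all class and method names are assumptions for illustration; the disclosure does not specify the file format.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical AngleStatisticsManager-style parsing of locally stored
// angle data, plus a simple per-session statistic.
public class AngleStatsSketch {
    static class Sample {
        final long timestampMs;
        final double tiltDeg, reclineDeg;
        Sample(long t, double tilt, double recline) {
            timestampMs = t; tiltDeg = tilt; reclineDeg = recline;
        }
    }

    // Parse "timestamp,tilt,recline" lines, skipping the header row.
    static List<Sample> parse(String csv) {
        List<Sample> out = new ArrayList<>();
        for (String line : csv.split("\n")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("timestamp")) continue;
            String[] f = line.split(",");
            out.add(new Sample(Long.parseLong(f[0]),
                               Double.parseDouble(f[1]),
                               Double.parseDouble(f[2])));
        }
        return out;
    }

    // Largest tilt angle reached during the recorded adjustment.
    static double maxTilt(List<Sample> samples) {
        double max = Double.NEGATIVE_INFINITY;
        for (Sample s : samples) max = Math.max(max, s.tiltDeg);
        return max;
    }

    public static void main(String[] args) {
        String csv = "timestamp,tilt,recline\n1000,5.0,95.0\n2000,15.0,110.0\n";
        List<Sample> samples = parse(csv);
        System.out.println(samples.size());
        System.out.println(maxTilt(samples));
    }
}
```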
(224) AngleStatData 2521 is a class that represents discrete adjustment angle measurements.
(225) ContinuousRecognitionListener 2522 is responsible for configuring voice recognition and defining voice recognition error handling.
(226) AngleReminderReciever 2523 is a broadcast receiver to handle adjustment reminder intents from the Android system. This class is responsible for notifying the AngleMeterBackgroundService 258 to tell the user that it is time for an adjustment.
(227) AngleCalculationStrategy 2524 is an interface that defines the method for calculating angles.
(228) Referring now to
(229) As shown in step 1, the application of the present invention 20 running on a Google Glass (203 in
(230) In step 2, the wheelchair user confirms audibly and/or by other means (e.g., touch) the user's readiness.
(231) In step 3, the app of the present invention 20 audibly and/or visually directs the wheelchair user to adjust the wheelchair to the upright position (i.e., no tilt or recline). The novel voice alert technique may be used in the app of the present invention 20 to guide the user, i.e., “Please make sure that your wheelchair is in the upright position. Say ready when you are ready!” Other means (e.g., visual) may also be used as guidance.
(232) In step 4, the wheelchair user adjusts the wheelchair to the upright position following the guidance provided by the app of the present invention 20.
(233) In step 5, the wheelchair user confirms his/her readiness after the wheelchair has been adjusted to the upright position.
(234) In step 6, the app of the present invention 20 audibly and/or visually directs the user to sit still so that the Google Glass (203 in
(235) In step 7, the app of the present invention 20 audibly and/or visually directs the user to adjust the tilt angle by using the voice alert—“You may now adjust your position. Please adjust your tilt to 15 degrees.”
(236) In step 8, the wheelchair user starts to adjust the tilt angle as instructed by the voice and/or visual alert. In the meantime, the app of the present invention 20 will measure and display the current tilt angle on the display of the Google Glass (203 in
(237) In step 9, if the target tilt angle has been reached, the app of the present invention 20 audibly and/or visually directs the wheelchair user to stop with the voice and/or visual alert—“Please stop!”
(238) In step 10, the app of the present invention 20 audibly and/or visually directs the wheelchair user to adjust the recline angle by using the voice and/or visual alert—“Please adjust your Recline to 110 degrees.”
(239) In step 11, the wheelchair user starts to adjust the recline angle. In the meantime, the app of the present invention 20 will measure and display the current recline angle on the Google Glass (203 in
(240) In step 12, if the target recline angle has been reached, the app of the present invention 20 will audibly and/or visually direct the wheelchair user to stop with the voice and/or visual alert—“Please stop! You are now in your target position.” In the meantime, the app of the present invention 20 will also show the final angle and the stop message on the display of the Google Glass (203 in
(241) In steps 13 and 14, the app of the present invention 20 will audibly and/or visually direct the wheelchair user to maintain the current position for the prescribed duration by using voice and/or visual alerts, e.g., “Please maintain the current position for 1 minute”.
(242) In step 15, the app of the present invention 20 will notify the wheelchair user that he/she has finished the protocol after the prescribed duration is over.
(243) In step 16, the wheelchair TR usage data will be sent to the cloud for storage and analysis.
(244) Referring now to
(245) AngleDataManager 271 is a service for managing upload of angle data to the cloud. This service 271 maintains a local database of angle data and keeps track of which entries have been uploaded. Once started, the service 271 uploads all entries that have not been uploaded. It then adds any new angle data to the database.
(246) AngleMeterListener 272 is a class for receiving sensor events. This class 272 monitors sensor events. When the sensor, i.e., the accelerometer, detects a movement, it will generate a sensor event including sensor readings in each spatial dimension. This class 272 also maintains the status of the wheelchair tilt and recline adjustment and guides the user through finishing the protocol.
(247) AngleMeterService 273 is a service class for managing the input and output of the application.
(248) This class 273 accepts voice commands from the user and displays angle information on the head-up display of the Google Glass (203 in
(249) ContinuousRecognitionListener 274 is a class for providing a mechanism for recognizing voice commands and defining voice recognition error handling. This class 274 implements the built-in RecognitionListener in Google Glass (203 in
(250) LiveCardMenuActivity 275 is a class providing the user interface. This class 275 allows users to provide voice commands (i.e., inputs) and then invokes AngleMeterService 273 to handle the command.
(251) LocalDataOpenHelper 276 is a class handling the local database for temporary data storage. Before the tilt and recline usage information is sent to the cloud, the data is temporarily stored in the local SQLite database managed by this class 276.
(252) Referring now to
(253) In step 1, the app of the present invention 20 running on the mobile/wearable device (e.g. Smartphone 201, FitBit 202, smartgarment 208 in
(254) In step 2, the person confirms his/her readiness.
(255) In step 3, the mobile device (e.g. FitBit 202, smartphone 201 in
(256) In steps 4 and 5, while the person performs knee (or other orthopedic) exercises, the mobile device (e.g. FitBit 202, smartgarment 205, smartphone 201 in
(257) In step 6, once the user finishes the protocol, the mobile device (e.g. FitBit 202, smartgarment 205, smartphone 201 in
(258) Referring now to
(259) GyroscopeStrategy 291 is a class for measuring angles when the movements are parallel with the ground, i.e., the decomposition of gravity along three dimensions remains the same during the movements. The gyroscope sensor in a mobile device (e.g. FitBit 202, smartgarment 208, smartphone 201 in
(260) RotationVectorStrategy 292 is a class for measuring angles when the movements are parallel with the ground, i.e., the decomposition of gravity along three dimensions remains the same during the movements. This class can be used together with GyroscopeStrategy 291 to cross-validate the results to ensure correct measurement.
(261) AccelerometerStrategy 293 is a class for measuring angles when the movements are not parallel with the ground, i.e., the decomposition of gravity along three dimensions keeps changing. The accelerometer sensor is used to calculate the angle changes.
(262) AngleCalculationStrategy 2524 is a class (defined in
(263) AngleMeterAdjustmentLogic 2514 is a class (defined in
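The strategy pattern described above can be sketched as one interface with two interchangeable implementations: an accelerometer-based strategy that reads tilt from the gravity decomposition, and a gyroscope-based strategy that accumulates angle by integrating angular velocity for movements parallel with the ground. The interface method name, signatures, and sample readings below are assumptions for illustration.

```java
// Consolidated sketch of the AngleCalculationStrategy pattern with the
// two sensor-specific implementations described in the text.
public class AngleStrategySketch {

    interface AngleCalculationStrategy {
        double currentAngleDeg();
    }

    // Tilt from gravity: angle between the sensor's z axis and the
    // gravity vector, derived from raw accelerometer readings (m/s^2).
    static class AccelerometerStrategy implements AngleCalculationStrategy {
        double ax, ay, az;
        void onReading(double ax, double ay, double az) {
            this.ax = ax; this.ay = ay; this.az = az;
        }
        public double currentAngleDeg() {
            return Math.toDegrees(Math.atan2(Math.hypot(ax, ay), az));
        }
    }

    // Angle by integrating gyroscope angular velocity (deg/s) over time;
    // applicable when the gravity decomposition stays constant.
    static class GyroscopeStrategy implements AngleCalculationStrategy {
        double angleDeg = 0;
        void onReading(double degPerSec, double dtSec) {
            angleDeg += degPerSec * dtSec;
        }
        public double currentAngleDeg() { return angleDeg; }
    }

    public static void main(String[] args) {
        AccelerometerStrategy accel = new AccelerometerStrategy();
        accel.onReading(0,
                        9.81 * Math.sin(Math.toRadians(15)),
                        9.81 * Math.cos(Math.toRadians(15)));
        System.out.printf("accelerometer tilt: %.1f deg%n", accel.currentAngleDeg());

        GyroscopeStrategy gyro = new GyroscopeStrategy();
        for (int i = 0; i < 50; i++) gyro.onReading(2.0, 0.1); // 2 deg/s for 5 s
        System.out.printf("gyroscope angle: %.1f deg%n", gyro.currentAngleDeg());
    }
}
```

Because both implementations expose the same interface, the adjustment logic can select a strategy at runtime based on which sensor is available, which is the cross-validation arrangement the text describes for GyroscopeStrategy 291 and RotationVectorStrategy 292.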
ADDITIONAL EMBODIMENTS OF THE PRESENT INVENTION
(264) Preferred embodiments of the present invention may comprise generating personalized adjustment parameters directed to positioning and control of seating configurations in both commercial and private automotive vehicles, including trucks and passenger cars. Outcome objectives may reflect both safety and comfort. A smartphone implementation providing a user interface to display at least current position and shape parameters and send related control parameters to powered, adjustable seats is anticipated. System responses are anticipated to at least user touch and voice commands. Audio recitation and response is anticipated. User touch, voice activation and audio recitation functions are generally programmable and operable on industry standard smart devices, such as various device models of iPhone, iPad, Samsung Galaxy, HP tablets, Google Glass, Apple Watch, Intel Curie™ Module etc., running on operating systems such as Android, iOS, and Windows, where such devices include an accelerometer or other types of motion sensors. Any such mobile device having the minimum function set as described herein is anticipated. Implementation using on-board devices installed as vehicle equipment is also anticipated.
(265) Preferred embodiments of the present invention may comprise generating personalized adjustment parameters directed to positioning and control of seating configurations in aircraft including both crew and passenger seating. Outcome objectives may reflect both safety and comfort. A smartphone implementation providing a user interface to display at least current position and shape parameters and send related control parameters to powered, adjustable seats is anticipated. System responses are anticipated to at least user touch and voice commands. Audio recitation and response is anticipated. User touch, voice activation and audio recitation functions are generally programmable and operable on industry standard smart devices, such as various device models of iPhone, iPad, Samsung Galaxy, HP tablets, Google Glass, Apple Watch, Intel Curie™ Module, etc., running on operating systems such as Android, iOS, and Windows, where such devices include an accelerometer or other types of motion sensor. Any such mobile device having the minimum function set as described herein is anticipated. Implementation using on-board devices installed as vehicle equipment is also anticipated.
(266) Preferred embodiments of the present invention may comprise generating personalized adjustment parameters directed to positioning and control of seating configurations in furniture, including tilt and recline angle, seat and back shape, firmness and support. Outcome objectives may reflect both safety and comfort. A smartphone implementation providing a user interface to display at least current position and shape parameters and send related control parameters to powered, adjustable seats is anticipated. System responses are anticipated to at least user touch and voice commands. Audio recitation and response is anticipated. User touch, voice activation and audio recitation functions are generally programmable and operable on industry standard smart devices, such as various device models of iPhone, iPad, Samsung Galaxy, HP tablets, Google Glass, Apple Watch, Intel Curie™ Module, etc., running on operating systems such as Android, iOS, and Windows, where such devices include an accelerometer or other types of motion sensor. Any such mobile device having the minimum function set as described herein is anticipated. Implementation using on-board devices installed as furniture components is also anticipated.
(267) Preferred embodiments of the present invention may comprise generating personalized adjustment parameters directed to positioning and control of support and comfort configurations in both commercial and private sleep platforms for healthcare, hospitality and in-home applications. A smartphone implementation providing a user interface to display at least current position and shape parameters, and send related control parameters to powered, adjustable seats is anticipated. System responses are anticipated to at least user touch and voice commands. Audio recitation and response is anticipated. User touch, voice activation and audio recitation functions are generally programmable and operable on industry standard smart devices, such as various device models of iPhone, iPad, Samsung Galaxy, HP tablets, Google Glass, Apple Watch, Intel Curie™ Module, etc., running on operating systems such as Android, iOS, and Windows. Any such mobile device having the minimum function set as described herein is anticipated. Implementation using on-board devices installed as sleep-platform equipment components is also anticipated.
(268) Those skilled in the art will appreciate that in some embodiments of the invention, the functional modules of the Web implementation, as well as the personal and the integrated communication devices, may be implemented as pre-programmed hardware or firmware elements (e.g., application specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.), or other related components. Mobile communication devices that can use the present invention may include but are not limited to any of the “smart” phones or tablet computers equipped with digital displays and wireless communication connection capabilities, such as iPhones and iPads available from Apple, Inc., as well as communication devices configured with the Android operating system available from Google, Inc., and with the Windows operating system available from Microsoft. In addition, it is anticipated that new types of communication devices and operating systems will become available as more capable replacements of the foregoing listed communication devices, and these may use the present invention as well. New types of motion sensors and motion detection methods may also become available and these devices may be used as components in the present invention, and may be mounted directly or indirectly on the human body.
(269) In other embodiments, the functional modules of the mobile-to-cloud implementation may be implemented by an arithmetic and logic unit (ALU) having access to a code memory which holds program instructions for the operation of the ALU. The program instructions could be stored on a medium which is fixed, tangible and readable directly by the processor, (e.g., removable diskette, CD-ROM, ROM, or fixed disk), or the program instructions could be stored remotely but transmittable to the processor via a modem or other interface device (e.g., a communication adapter) connected to a network over a transmission medium. The transmission medium may be either a tangible medium (e.g., optical or analog communications lines) or a medium implemented using wireless techniques (e.g., microwave, infrared or other transmission schemes).
(270) The program instructions stored in the code memory can be compiled from a high level program written in a number of programming languages for use with many computer architectures or operating systems. For example, the program may be written in assembly language suitable for use with a pixel shader, while other versions may be written in a procedural programming language (e.g., “C”) or an object oriented programming language (e.g., “C++” or “JAVA”).
(271) In other embodiments, cloud computing may be implemented on a web hosted machine or a virtual machine. A web host can have anywhere from one to several thousand computers (machines) that run web hosting software, such as Apache, OS X Server, or Windows Server. A virtual machine (VM) is an environment, usually a program or operating system, which does not physically exist but is created within another environment (e.g., Java runtime). In this context, a VM is called a “guest” while the environment it runs within is called a “host.” Virtual machines are often created to execute an instruction set different than that of the host environment. One host environment can often run multiple VMs at once.
(272) While specific embodiments of the present invention have been described and illustrated, it will be apparent to those skilled in the art that numerous modifications and variations can be made without departing from the scope of the invention as defined in the appended claims. It is understood that the words that have been used are words of description and illustration, rather than words of limitation. Although the invention has been described with reference to particular means, materials and embodiments, the invention is not intended to be limited to the particulars disclosed; rather, the invention extends to all functionally equivalent structures, methods and uses, such as are within the scope of the appended claims.