ELECTROMYOGRAPHIC CONTROL SYSTEMS AND METHODS FOR THE COACHING OF EXOPROSTHETIC USERS
20200265948 · 2020-08-20
Inventors
- Blair Andrew Lock (Chicago, IL, US)
- Frank Daniel Cummins, II (Chicago, IL, US)
- Levi John Hargrove (Chicago, IL, US)
- John Arthur Thompson, IV (Madison, WI, US)
CPC classification
G06F3/017
PHYSICS
G16H40/40
PHYSICS
A61F2/76
HUMAN NECESSITIES
G06F3/015
PHYSICS
G06N3/061
PHYSICS
G06F3/016
PHYSICS
International classification
G06N3/06
PHYSICS
G16H40/40
PHYSICS
Abstract
Systems and methods are described for the coaching of users through successful calibration of a myoelectric prosthetic controller. The systems and methods comprise, and/or utilize, hardware and software components to input and analyze electromyography (EMG) based signals in association with movements, and to calibrate and output feedback about the signals. The hardware further comprises an apparatus for the detection of EMG signals, a prosthesis, an indicator, and a user interface. The software further comprises a user interface, a pattern recognition component, a calibration procedure, and a feedback mechanism. The systems and methods facilitate calibration of a myoelectric controller and provide the user with feedback about the calibration, including information about the signal inputs and outputs, and messages about connected hardware and how to optimize signal data.
Claims
1. An electromyographic control system configured to coach prosthetic users to calibrate prosthetic devices, the electromyographic control system comprising: a myoelectric prosthetic controller configured to control a prosthetic device; an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, wherein the electromyographic software component is configured to perform an analysis of electromyographic (EMG) signal data of the user, the EMG signal data received from the plurality of electrodes; and a user interface configured to provide, based on the analysis of the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data, wherein the user interface is configured to initiate a calibration procedure to calibrate the myoelectric prosthetic controller, and wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
2. The electromyographic control system of claim 1, wherein the message comprises at least one of (a) an indication of a cause for a non-optimal signal data input of the EMG signal data, or (b) a recommended procedure for optimizing signal data input.
3. The electromyographic control system of claim 1, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
4. The electromyographic control system of claim 3, wherein the further calibration session is configured to facilitate at least one of (a) deleting EMG signal data corresponding to one or more data sets or movements, (b) adding EMG signal data corresponding to one or more data sets or movements, or (c) replacing EMG signal data corresponding to one or more data sets or movements with new EMG signal data.
5. The electromyographic control system of claim 1, wherein the myoelectric prosthetic controller is calibrated to control the prosthetic device based on the EMG signal data.
6. The electromyographic control system of claim 1, wherein the calibration button is configured to provide the feedback indication by at least one of an auditory stimulus, a tactile stimulus, or a visual stimulus.
7. The electromyographic control system of claim 1, wherein the virtual user interface displays a visualization of the EMG signal data in real time.
8. The electromyographic control system of claim 1, wherein the calibration procedure comprises the virtual user interface instructing the user to perform one or more indicated motions in relation to the prosthetic device, and wherein the one or more indicated motions produce the EMG signal data as received from the plurality of electrodes.
9. The electromyographic control system of claim 8, wherein the virtual user interface is configured to receive one or more selections indicating at least one of the one or more indicated motions for the user to perform.
10. The electromyographic control system of claim 1, wherein the electromyographic software component further comprises a pattern recognition component configured to analyze the EMG signal data of the user, the pattern recognition component further configured to identify or categorize the EMG signal data of the user based on a particular motion performed by the user.
11. The electromyographic control system of claim 10, wherein the pattern recognition component comprises an adaptive machine learning component configured to determine the particular motion performed by the user based on the EMG signal data of the user.
12. The electromyographic control system of claim 11, wherein the adaptive machine learning component is further configured to determine an appropriate feedback indication based on the EMG signal data of the user.
13. The electromyographic control system of claim 1, wherein the user interface is configured to reset calibration data of the user to calibrate the myoelectric prosthetic controller.
14. An electromyographic control method for coaching prosthetic users to calibrate prosthetic devices, the electromyographic control method comprising: receiving, by an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, electromyographic (EMG) signal data from the plurality of electrodes; analyzing, by the electromyographic software component, the EMG signal data of the user; providing to a user interface, based on analyzing the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data; and initiating, based on the calibration quality of the EMG signal data, a calibration procedure to calibrate a myoelectric prosthetic controller, the myoelectric prosthetic controller configured to control a prosthetic device, wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
15. The electromyographic control method of claim 14, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
16. The electromyographic control method of claim 14, wherein the calibration procedure comprises the virtual user interface instructing the user to perform one or more indicated motions in relation to the prosthetic device, and wherein the one or more indicated motions produce the EMG signal data as received from the plurality of electrodes.
17. The electromyographic control method of claim 14, wherein the electromyographic software component further comprises a pattern recognition component configured to analyze the EMG signal data of the user, the pattern recognition component further configured to identify or categorize the EMG signal data of the user based on a particular motion performed by the user.
18. The electromyographic control method of claim 17, wherein the pattern recognition component comprises an adaptive machine learning component configured to determine the particular motion performed by the user based on the EMG signal data of the user.
19. A tangible, non-transitory computer-readable medium storing instructions for coaching prosthetic users to calibrate prosthetic devices, that when executed by one or more processors cause the one or more processors to: receive, by an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, electromyographic (EMG) signal data from the plurality of electrodes; analyze, by the electromyographic software component, the EMG signal data of the user; provide to a user interface, based on analyzing the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data; and initiate, based on the calibration quality of the EMG signal data, a calibration procedure to calibrate a myoelectric prosthetic controller, the myoelectric prosthetic controller configured to control a prosthetic device, wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
20. The tangible, non-transitory computer-readable medium of claim 19, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The objects, features and advantages of the present invention and additional embodiments will be more readily appreciated upon reference to the following disclosure when considered in conjunction with the accompanying drawings, wherein like reference numerals are used to identify identical components in the various views.
[0018] The Figures described below depict various aspects of the system and methods disclosed therein. It should be understood that each Figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the Figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following Figures, in which features depicted in multiple Figures are designated with consistent reference numerals.
[0019] There are shown in the drawings arrangements which are presently discussed, it being understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown.
[0029] The Figures depict preferred embodiments for purposes of illustration only. Alternative embodiments of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION
[0030] While the present invention is susceptible of embodiment in many different forms, there are shown in the drawings and will be described herein in detail specific exemplary embodiments thereof, with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the invention to the specific embodiments illustrated. In this respect, before explaining at least one embodiment consistent with the present invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of components set forth above and below, illustrated in the drawings, or as described in the examples. Methods and apparatuses consistent with the present invention are capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein, as well as the abstract included below, are for the purposes of description and should not be regarded as limiting.
[0031] As disclosed in various embodiments herein, a system 100 is described for the coaching of exoprosthetic users. System 100 may also be referred to herein as the system and/or the electromyographic control system. System 100 generally comprises hardware and software components to input and analyze EMG based signals in association with movements of a user (e.g., user 123), and to calibrate and output feedback about the signals. System 100, and portions thereof, including its various hardware and software components, are illustrated herein by the provided Figures.
[0032] For example,
[0033] In addition,
[0034] As illustrated by at least
[0038] Software component 136 may comprise a user interface (e.g., computer-based user-interface 125 as illustrated by
[0039] Once the method of interface has been selected, user 123 must decide whether to reset their calibration data (104) of prosthetic device 124, fully calibrate (105) the device (e.g., prosthetic device 124), or calibrate a single portion (106) of the device (e.g., prosthetic device 124). As shown by
[0040] In addition,
[0041] Once the user 123 has selected the method for guidance 107, the electromyographic control system (system 100) will begin or initiate a calibration protocol 109. User 123 may then go through a select series of motions, i.e., calibration classes, such as, e.g., an indicated motion 134, including any one or more of an elbow motion (flex and/or extend), wrist motion (pronate and/or supinate), and/or a hand motion (open, close, tool, key, and/or pinch) as illustrated by
[0042] Referring to
[0043] If, however, signal data 127 is determined (e.g., by software component 136 and/or analyzing pattern recognition algorithm) to be inadequate (116) (e.g., a poor calibration determined as illustrated by any of
[0044] In various embodiments, signal data 127 may be determined by system 100 (e.g., by software component 136 and/or by pattern recognition component) as inadequate because it is defined or classified as noisy, quiet, or inconsistent, etc., e.g., over a given time period, as illustrated for
[0045] In either case (i.e., for an adequate calibration or an inadequate calibration), as illustrated by any of
[0046] In various embodiments, pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) is configured to determine statistics from EMG signal data during calibration for providing recommendations as described herein. For example, based on statistical analysis of the EMG signal data (e.g., signal data 127), common user errors (e.g., liftoff) can be identified, and solutions suggested. Statistics determined and analyzed by pattern recognition component may include covariance of time domain features, signal magnitude, variation of signal magnitude over time, separation index between motion classes, frequency of electrode liftoff, and a variety of others. Additionally, or alternatively, fuzzy logic may be applied to these statistics to indicate when a specific but widely observed error is likely to be occurring, e.g., as caused by user error and/or user contact with electrodes 122. In such embodiments, fuzzy logic may be applied to each statistic. Additionally, or alternatively, statistics generated from EMG signal data may be converted into a value compatible with fuzzy logic by assigning a stochastic value indicating a confidence that the statistic is outside an expected range. These values (e.g., as described for
[0047] Analysis of EMG signal data 127 of a user by pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may further be described with respect to the following Tables 1-5, which describe the signal data collection, derived features (statistics), fuzzy logic conversion, fuzzy logic implementation, and rating and issue selection.
[0048] Data Collection
[0049] The pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein), e.g., implementing the calibration quality algorithm, needs input EMG signal data 127 from the calibration session task to analyze. Some of this data may be used to build a classifier for the pattern recognition component, while other data may be collected for implementation of the calibration quality algorithm. Additionally, or alternatively, for single class calibrations (e.g., where only one class is calibrated), some data will only be needed from the relevant class, while other data will be needed from all classes regardless. Table 1, below, provides examples of data or data types (e.g., EMG signal data 127) that may be collected, determined, or otherwise received as described herein. It is to be understood that the data or data types in the below table are provided by way of example only, and are not limiting. Additional or different data, sizes, etc. may be utilized consistent with the disclosure herein.
TABLE 1 (Data Collection)

Item 1: All-Calibration Covariances for each Class. One covariance matrix for each class in the classifier, as used to build the classifier. Size: K x (f x c) x (f x c) x 4B = K x 12544B. Single class requirement: may be required for all classes.

Item 2: All-Calibration Centroids for each Class. One centroid vector for each class in the classifier, as used to build the classifier. Size: K x (f x c) x 4B = K x 224B. Single class requirement: may be required for all classes.

Item 3: All-Calibration Number of Frames for each Class. One value for each class in the classifier indicating the number of frames used to calculate the above data. Size: K x 4B. Single class requirement: may be required for all classes.

Item 4: This-Calibration Centroids for each Class. One centroid vector for each class in the most recent calibration only, representing the data collected during this calibration. Size: k x (f x c) x 4B = k x 224B. Single class requirement: this class only.

Item 5: This-Calibration No-Motion Mean Relative Value (MRV) Variance. One vector representing the variance of the MRV data collected during the no-motion recording of this calibration; typically for full calibration only. Size: c x 4B = 32B. Single class requirement: typically required for full calibration only.

Item 6: Reallocation Data for each recording. A set of values for each recording in the most recent calibration indicating the proportion of recorded frames reallocated during each part of a partition of the recording. Size: k x r_p x 4B = k x 24B. Single class requirement: this class only.
[0050] In the above Table 1, K is the total number of classes recognized by the classifier (e.g., typically K=2 to 20), k is the number of classes used for calibration (e.g., typically 1 or K), f is the number of measured data features per channel (e.g., 7 standard), c is the number of channels (e.g., 8 channels), which may be associated with electrodes 122 and/or EMG signal data 127 received via electrodes 122, and B is the number of bytes (in a memory, as described herein), where each value may be either a float or uint32 (e.g., each 4 bytes). As shown in the size column above, the pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein), e.g., implementing the calibration quality algorithm, requires minimal amounts of memory (e.g., a few bytes of information in some cases), allowing the pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) to be implemented on devices having minimal memory and/or processing resources. Each of these data items may be collected during a calibration session as described herein.
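The per-class size expressions in Table 1 can be sanity-checked with a short calculation. The sketch below uses the example values given in the text (f = 7 features per channel, c = 8 channels, 4-byte values); the variable names are illustrative only:

```python
# Sanity check of the per-class sizes in Table 1, using the example
# values from the text: f = 7 features per channel, c = 8 channels,
# and 4 bytes per stored value (float or uint32).
f, c, B = 7, 8, 4

fc = f * c                      # 56 values in one class centroid
centroid_bytes = fc * B         # Item 2: one centroid vector per class
covariance_bytes = fc * fc * B  # Item 1: one (f x c) by (f x c) matrix per class

print(centroid_bytes)    # 224 bytes per class
print(covariance_bytes)  # 12544 bytes per class
```

Multiplying each per-class figure by K (or k) reproduces the totals in the size column of Table 1.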
[0051] Derived Features (Statistics)
[0052] With respect to an additional portion of calibration quality algorithm, pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may process the EMG signal data 127 to generate derived data features, e.g., including statistics as used by fuzzy logic implementations as described herein.
TABLE 2 (Derived Features (Statistics))

Item 7: Separation Index. For each pair of one class in the most recent calibration and one class in the classifier (excluding pairs of identical classes), the Mahalanobis distance between their centroids (Items 4 and 2 of Table 1, respectively) using the average of their covariances (Item 1). Size: k x (K - 1) x 4B. Utilized data: Items 1, 2, 4.

Item 8: Separation Index Scaling Factor. For each Separation Index, a scaling factor that scales inversely to the number of frames of data contributing to the averaged covariances (Item 3). Size: k x (K - 1) x 4B. Utilized data: Item 3.

Item 9: MRV Ratio. Ratio of the average MRV of each motion class to the average MRV of the no-motion class. Size: k_m x 4B. Utilized data: Item 2.

Item 10: No-Motion Variability. Average across channels of the ratio of the square root of the MRV variance to the MRV mean in the no-motion class. Size: 4B. Utilized data: Items 4, 5.
[0053] In the above Table 2, and similarly with respect to Table 1, K is the total number of classes recognized by the classifier (e.g., typically K=2 to 20), k is the number of classes used for calibration (e.g., typically 1 or K), k_m is the number of motion classes (e.g., classes for detecting motions such as open, close, tool, key, and/or pinch) as illustrated by
[0054] As described for Table 2 (Item 7), a separation index may be computed by pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) with the following formula:
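The formula itself is not reproduced in this text. As a hedged sketch, Item 7 of Table 2 describes a Mahalanobis distance between two class centroids using the average of their covariances, which could be implemented along the following lines; the function name and NumPy usage are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def separation_index(centroid_a, centroid_b, cov_a, cov_b):
    # Mahalanobis distance between two class centroids, using the
    # average of the two class covariances (per Item 7 of Table 2).
    # A sketch under stated assumptions, not the patent's exact formula.
    avg_cov = (cov_a + cov_b) / 2.0
    diff = centroid_a - centroid_b
    return float(np.sqrt(diff @ np.linalg.solve(avg_cov, diff)))
```

Larger values indicate better-separated classes; the scaling factor of Item 8 would then adjust the bounds used in the subsequent fuzzy conversion.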
[0055] Fuzzy Logic Conversion
[0056] With respect to an additional portion of calibration quality algorithm, pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may convert EMG signal data 127, including data of Table 1 and/or feature data of Table 2. The conversion prepares the data for fuzzy logic implementation(s) as further described herein. Generally, fuzzy logic may rely on all variables being between 0 and 1, inclusive, heuristically representing the likelihood, confidence, or severity with which a given feature applies. In some cases, two polar opposite features may be represented with one variable between -1 and 1, where 0 represents that neither feature applies. The conversion may be implemented by a piecewise linear function, including, for example, the following function:
[0057] In the above function, a is a lower bound and b is an upper bound. Table 3 below represents fuzzy features that may be used by the calibration quality algorithm of pattern recognition component:
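The function is not reproduced in this text; a common piecewise linear form consistent with the description (0 at or below the lower bound a, 1 at or above the upper bound b, linear in between) is sketched here as an assumption:

```python
def fuzzy_convert(x, a, b):
    # Piecewise linear conversion to [0, 1]: 0 at or below the lower
    # bound a, 1 at or above the upper bound b, linear in between.
    # A hedged sketch; the patent's exact function is not reproduced here.
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)
```

Features where a smaller statistic should yield a larger fuzzy value (such as the Fuzzy Separation Index of Table 3) can reuse the same form with the result inverted via the NOT operation of the fuzzy logic implementation.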
TABLE 3 (Fuzzy Logic Conversion)

Item 11: Fuzzy Separation Index. Fuzzy conversion of the Separation Index, with bounds modified by the Separation Index Scaling Factor. Bounds: 1 for values less than s x 2.125; 0 for values greater than s x 3.4. Size: k x (K - 1) x 4B. Utilized data: Items 7, 8.

Item 12: Fuzzy MRV Ratio. Fuzzy conversion of the MRV Ratio. Bounds: 1 for values greater than 8; -1 for values less than 1.2; 0 for values between 2 and 4.5. Size: k_m x 4B. Utilized data: Item 9.

Item 13: Fuzzy No-Motion Variability. Fuzzy conversion of the No-Motion Variability. Bounds: 1 for values greater than 0.3; -1 for values less than 0.09; 0 for values between 0.14 and 0.2. Size: 4B. Utilized data: Item 10.

Item 14: Fuzzy Early Reallocation. The average of the difference between the percentage of frames reallocated early and the period where fewest frames were reallocated. Bounds: N/A. Size: k_m x 4B. Utilized data: Item 6.

Item 15: Fuzzy Late Reallocation. The average of the difference between the percentage of frames reallocated late and the period where fewest frames were reallocated. Bounds: N/A. Size: k_m x 4B. Utilized data: Item 6.

Item 16: Fuzzy Total Reallocation. The percentage of frames reallocated. Bounds: N/A. Size: k_m x 4B. Utilized data: Item 6.
[0058] In the above Table 3, and similarly with respect to Tables 1 and 2, K is the total number of classes recognized by the classifier (e.g., typically K=2 to 20), k is the number of classes used for calibration (e.g., typically 1 or K), k_m is the number of motion classes (e.g., classes for detecting motions such as open, close, tool, key, and/or pinch) as illustrated by
[0059] Fuzzy Logic Implementation
[0060] With respect to an additional portion of calibration quality algorithm, pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may determine common issues (e.g., too weak) that are prevalent in the signal data by implementing fuzzy logic operations on the fuzzy features of Table 3. The pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may implement the following fuzzy logic operations across the fuzzy features of Table 3:
NOT(X) = 1 - X    (1)
AND(X, Y) = X * Y    (2)
OR(X, Y) = X + Y - X * Y    (3)
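Operations (1) through (3) translate directly into code. The sketch below also composes them into the Too Weak and Inconsistent features of Table 4 (AND of Items 16 and 19, and AND of Item 16 with NOT of Item 19, respectively); the input values are illustrative only and are not taken from the patent:

```python
def fuzzy_not(x):
    return 1.0 - x          # operation (1): NOT(X) = 1 - X

def fuzzy_and(x, y):
    return x * y            # operation (2): AND(X, Y) = X * Y

def fuzzy_or(x, y):
    return x + y - x * y    # operation (3): OR(X, Y) = X + Y - X * Y

# Illustrative inputs (not from the patent): Item 16 (Fuzzy Total
# Reallocation) and Item 19 (Low Signal), combined as in Table 4.
total_reallocation = 0.8
low_signal = 0.25
too_weak = fuzzy_and(total_reallocation, low_signal)                 # Item 20
inconsistent = fuzzy_and(total_reallocation, fuzzy_not(low_signal))  # Item 21
```

Because the operators are continuous rather than Boolean, the outputs remain graded confidences that a given issue applies, which the rating and issue selection step can then rank.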
[0061] Implementation of these fuzzy logic operations yield the following fuzzy feature results and related conclusions (for the given indicated operation). These fuzzy feature results and related conclusions generally correspond to classifications of
TABLE 4 (Fuzzy Logic Implementation)

Item 17: Quiet No-Motion. Conclusion: No-Motion recording appears too relaxed. Operations: 13 (negative only). Class type: No-Motion.
Item 18: Noisy No-Motion. Conclusion: No-Motion recording appears to include motion. Operations: AND(13 (positive only), 16). Class type: No-Motion.
Item 19: Low Signal. Conclusion: The signal is too similar to no signal. Operations: OR(11 (this/no-motion), 12 (negative only)); may not be a final operation. Class type: Motion.
Item 20: Too Weak. Conclusion: The signal is too weak. Operations: AND(16, 19). Class type: Motion.
Item 21: Inconsistent. Conclusion: The signal cuts out. Operations: AND(16, NOT(19)). Class type: Motion.
Item 22: Indistinct. Conclusion: Many motion classes are similar. Operations: Mean(11 (this/motions)). Class type: Motion.
Item 23: Too Strong. Conclusion: Contractions are too intense. Operations: AND(12 (positive only), 22). Class type: Motion.
Item 24: Nearest Neighbor. Conclusion: A motion is too similar. Operations: Max(11 (this/motions)). Class type: Motion.
Item 25: Started Late. Conclusion: Beginning of recording is reallocated. Operations: 14. Class type: Motion.
Item 26: Ended Early. Conclusion: End of recording is reallocated. Operations: 15. Class type: Motion.
[0062] Rating and Issue Selection
[0063] With respect to an additional portion of calibration quality algorithm, pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) may select what gets displayed. For each class, a signal will be sent from pattern recognition component (or, alternatively, and more generally, the electromyographic software component as described herein) to software component 136, which may render or display the result, as described herein, on a user interface (e.g., computer-based user-interface 125 as illustrated by
[0064] While each issue may be identified via a fuzzy logic implementation, as described above, in some embodiments, some issues or classes may be determined or set as more severe than others. In such embodiments, pattern recognition component may multiply an issue or class by a scaling factor to determine its final priority and maximum rating penalty. Examples of scaling factors and penalty scalings are illustrated in Table 5 below. The fuzzy features and items of Table 5 correspond to those of Table 4, where the scaling factors and/or penalty scalings are applied to the output values for the fuzzy features of Table 4. It is to be understood that additional and/or different such scaling factors and penalty scalings may be used.
TABLE 5 (Rating and Issue Selection)

Item 17: Quiet No-Motion. Priority scaling: 0.5. Penalty scaling: 2. Class type: No-Motion.
Item 18: Noisy No-Motion. Priority scaling: 1. Penalty scaling: 4. Class type: No-Motion.
Item 19: Too Weak. Priority scaling: 1.2. Penalty scaling: 4. Class type: Motion.
Item 20: Inconsistent. Priority scaling: 1.2. Penalty scaling: 4. Class type: Motion.
Item 21: Indistinct. Priority scaling: 1. Penalty scaling: 3. Class type: Motion.
Item 22: Too Strong. Priority scaling: 1. Penalty scaling: 2. Class type: Motion.
Item 23: Nearest Neighbor. Priority scaling: 1. Penalty scaling: 3. Class type: Motion.
Item 24: Started Late. Priority scaling: 1. Penalty scaling: 2. Class type: Motion.
Item 25: Ended Early. Priority scaling: 1. Penalty scaling: 2. Class type: Motion.
Item 26: Quiet No-Motion. Priority scaling: 0.5. Penalty scaling: 2. Class type: No-Motion.
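The priority and penalty scalings of Table 5 can be applied as a final selection step. In the sketch below the (priority scaling, penalty scaling) pairs follow Table 5, while the 10-point base rating and the subtraction scheme are illustrative assumptions, not taken from the text:

```python
# Hedged sketch of rating and issue selection. The (priority scaling,
# penalty scaling) pairs follow Table 5; the 10-point base rating and
# the subtraction scheme are assumptions made for illustration.
SCALINGS = {
    "Quiet No-Motion":  (0.5, 2),
    "Noisy No-Motion":  (1.0, 4),
    "Too Weak":         (1.2, 4),
    "Inconsistent":     (1.2, 4),
    "Indistinct":       (1.0, 3),
    "Too Strong":       (1.0, 2),
    "Nearest Neighbor": (1.0, 3),
    "Started Late":     (1.0, 2),
    "Ended Early":      (1.0, 2),
}

def select_issue(fuzzy_values):
    # Given {issue name: fuzzy value in [0, 1]}, pick the issue with
    # the highest scaled priority and derive a rating penalty from it.
    best_issue, best_priority, penalty = None, 0.0, 0.0
    for issue, value in fuzzy_values.items():
        priority_scale, penalty_scale = SCALINGS[issue]
        priority = value * priority_scale
        if priority > best_priority:
            best_issue, best_priority = issue, priority
            penalty = value * penalty_scale
    rating = max(0.0, 10.0 - penalty)  # assumed 10-point scale
    return best_issue, rating
```

With this scheme, an issue with a modest fuzzy value but a high priority scaling (e.g., Too Weak at 1.2) can outrank a slightly stronger but lower-priority issue, which matches the severity weighting described above.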
[0065] Generally, each message (e.g., as described for Table 6 herein and
[0066] Each of the fuzzy features of Tables 4 and 5 generally corresponds to classifications of
[0067] In various embodiments, if user 123 receives an indication of poor calibration (116), a message 118 will be generated to the user 123 based on the analyzed and categorized signal data 127. For example, system 100, based on signal data 127, may provide user 123 with one or more messages (117) related to the signal data 127. Message 118 may be a list of one or more possible message classes that would be provided to user 123 after system 100 had determined that the signal data 127 collected was inadequate (116). As illustrated by the embodiment of
[0068] The virtual user interface 102 may represent the signal data 127 as a data quality metric 114, where the message 118 may be provided to the user 123 along with a recommended procedure for optimization 133. For example, as shown for
TABLE 6 (Calibration Quality Metrics and Messages)

NO_MESSAGE: Implemented when the user is provided with no message (115) as shown for FIG. 1. Message(s) to user: N/A. Tip(s) to user: N/A.

NOISY_NM: Implemented when there is enough variation in the EMG signal to indicate user muscle contraction or some other constantly changing noise. Such noise causes problems with a no-motion reallocation algorithm (too aggressive, motions hard to access) and often indicates issues with EMG signal quality or the user's understanding of the prompts/messages. Message(s): "There was quite a bit of noisy signal during recording of Relax/No Motion." Tip(s): "Make sure all electrodes are making contact. Be sure not to make muscle contractions when Relax/No Motion is being recorded."

QUIET_NM: Implemented when no variation in the EMG signal, as typically expected from electrodes being over muscle and supporting the user's body weight, is detected. Such lack of EMG signals can cause the no-motion reallocation algorithm to be too permissive (missing late/early issues amplified) and may make motions (when the prosthetic device is used) more likely when the user does not intend such motions. Message(s): "Your Relax/No Motion signals were super quiet."; or "Absent signal(s) during recording of Relax/No Motion." Tip(s): "To make everything a little more robust, try wiggling your fingers and/or moving your arm around when No Motion is being recorded."

INDISTINCT: Implemented when two or more calibration classes are detected/confused. Message(s): "Your [X, Y, and Z] are very similar."; or "Your [X, Y, and Z] are somewhat similar."; or "Your [X, Y, and Z] are a little bit similar." Tip(s): "For better performance, try calibrating [X, Y, and Z] with a more distinct feel."

TOO_SIMILAR: Implemented when a specific pair of calibration classes are likely to be confused. In such cases, a message for either INDISTINCT or TOO_SIMILAR will be indicated based on severity of the EMG signal quality. Message(s): "Your [X] is very similar to [Y]."; or "Your [X] is somewhat similar to [Y]."; or "Your [X] is a bit similar to [Y]." Tip(s): "For better performance, try calibrating [X and Y] with a more distinct feel."

MISSING_EARLY: Implemented for a late user reaction to a prompted action, such that the beginning EMG signal data is reallocated. Message(s): "You started [X] quite late during calibration." Tip(s): "For the system to work at its best, be sure to hold [X] throughout the whole orange circle during calibration."

MISSING_LATE: Implemented when the user relaxes too early from the indicated prompted action. Message(s): "You stopped holding [X] quite early during calibration."; or "You stopped holding [X] a little bit early during calibration." Tip(s): "For the system to work at its best, be sure to hold [X] throughout the whole orange circle during calibration."

INCONSISTENT: Tracks reallocation, but triggers for large amounts of reallocation, regardless of whether the calibration is at the beginning, middle, or end of a prompted action. Differs from TOO_WEAK in that data is inconsistently missing (sparse EMG signal), not just quiet (low EMG signal). Message(s): "Your [X] was cutting in and out a lot during calibration."; or "Your [X] was cutting in and out during calibration."; or "Your [X] was cutting in and out somewhat during calibration."; or "Your [X] was cutting in and out a little bit during calibration." Tip(s): "For the system to work at its best, be sure to hold [X] throughout the whole orange circle during calibration."

TOO_WEAK: Detects when the EMG signal for a contraction is not much higher than no motion (no or low EMG signal). TOO_WEAK also tracks reallocation, such that TOO_WEAK competes directly with INCONSISTENT but differs in that EMG signal data is detected (i.e., a low EMG signal is detected). Message(s): "Your [X] was too soft during calibration."; or "Your [X] was a little too soft during calibration."; or "Your [X] was somewhat soft during calibration."; or "Your [X] was soft during calibration." Tip(s): "Make sure the system gets your data; try calibrating [X] a little bit stronger."

TOO_STRONG: Implemented when the EMG signal for a contraction is from a user who is likely straining/exerting themselves (causing a strong EMG signal). Message(s): "Your [X] was too hard during calibration."; or "Your [X] was a little too hard during calibration." Tip(s): "To help the system perform better and to keep you from getting muscle fatigue, try calibrating [X] a little bit softer."

LIFTOFF: Provided as a global warning message in the event that significant liftoff of electrode(s) from the user's skin was detected during prosthetic device calibration. Message(s): "Some electrodes are not making good skin contact during calibration for [X]." Tip(s): "You are not likely to get good performance until electrode-skin contact issues are addressed."
[0069] In the above Table 6, different indicated motions/prompted actions (e.g., elbow motion (flex and/or extend), wrist motion (pronate and/or supinate), and/or hand motion (open, close, tool, key, and/or pinch) as illustrated by
[0070] Message 118 may further include a recommended procedure for optimization 133 of signal data 127. A given recommended procedure for optimization 133 corresponds to each prosthetic device (121) connected to the system and its related indicated motion 134. For example, as shown for
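The severity-graded mapping from a detected calibration issue to a Table 6 message may be sketched as follows. This is an illustrative outline only; the function names, threshold values, and severity scale are assumptions for the sketch and are not specified by the disclosure:

```python
# Illustrative sketch only: names and thresholds are hypothetical,
# not specified by the disclosure.

def classify_no_motion(emg_variance: float,
                       quiet_threshold: float = 1e-4,
                       noisy_threshold: float = 5e-2) -> str:
    """Grade the Relax/No Motion recording per Table 6: too little
    signal variation yields QUIET_NM, too much yields NOISY_NM."""
    if emg_variance < quiet_threshold:
        return "QUIET_NM"
    if emg_variance > noisy_threshold:
        return "NOISY_NM"
    return "NO_MESSAGE"


def too_similar_message(class_x: str, class_y: str, severity: int) -> tuple:
    """Build a severity-graded TOO_SIMILAR message and tip following the
    phrasing of Table 6, where [X] and [Y] are calibration classes."""
    qualifier = ("a bit", "somewhat", "very")[severity]
    message = f"Your {class_x} is {qualifier} similar to {class_y}."
    tip = (f"For better performance, try calibrating {class_x} and "
           f"{class_y} with a more distinct feel.")
    return message, tip
```

In this sketch the same issue class produces different wording at different severities, mirroring the graded message variants ("a bit", "somewhat", "very") listed in Table 6.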
[0071] As shown for
[0072] ASPECTS OF THE DISCLOSURE
[0073] 1. A system for the coaching of exoprosthetic users comprising: an apparatus for the collection of signal data; a button; and a software component.
[0074] 2. A system for the input of signal data comprising: a plurality of electrodes.
[0075] 3. A button comprising: a housing; an indicator; and a tactile interface.
[0076] 4. A software component comprising: a user interface; and a pattern recognition component.
[0077] 5. The system for the input of signal data of Aspect 2, wherein the plurality of electrodes comprise a composition of an electrically conductive material.
[0078] 6. The system for the input of signal data of Aspect 2, further comprising: a method for the communicating of the signal data to the software component of Aspect 4.
[0079] 7. The button of Aspect 3, wherein the indicator is comprised of: an apparatus for visual stimulus; an apparatus for auditory stimulus; and/or an apparatus for tactile stimulus.
[0080] 8. The software component of Aspect 4, wherein the user interface further comprises: a system for the feedback of information to the user.
[0081] 9. The user interface of Aspect 8, wherein the system for the feedback of information to the user is further comprised of: a quality metric, identifying an objective level of calibration quality; and/or a message, identifying probable causes of poor signal data.
[0082] 10. The system for the feedback of information to the user of Aspect 9, wherein the message is further comprised of: an indication of the cause for non-optimal signal data input; and a recommended procedure for optimizing the signal data input.
[0083] 11. The software component of Aspect 4, wherein the user interface is further comprised of: a virtual application that can be interacted with by the user.
[0084] 12. The software component of Aspect 4, wherein the user interface is further comprised of: a calibration procedure; and a set of instructions to guide the user through the calibration procedure.
[0085] 13. The software component of Aspect 4, wherein the user interface is further comprised of: a selection of a prostheses connection; a selection of one or more movements; an indication of signal data input; or, an indication of signal data output.
[0086] 14. The user interface of Aspect 13, wherein the indication of signal data input is further comprised of: an identification of connected hardware; and an identification of the status of connected hardware.
[0087] 15. The software component of Aspect 4, wherein the user interface is further comprised of: a system for the user to directly monitor the signal data, in real time.
[0088] 16. The software component of Aspect 4, wherein the pattern recognition component further comprises: a system for the receiving of signal data; a system for the analysis of signal data; and a system for the output of signal data.
[0089] 17. The software component of Aspect 4, wherein the pattern recognition component further comprises: an adaptive machine learning system to recognize the user's unique signal data.
[0090] 18. The pattern recognition component of Aspect 17, wherein the adaptive machine learning system further comprises: a system for the recognition of a user's unique signal data in reference to a particular motion by the user.
[0091] 19. The software component of Aspect 4, wherein the pattern recognition component further comprises: a system for identifying and categorizing signal data from a user.
[0092] 20. The software component of Aspect 4, wherein the pattern recognition component further comprises: a system for the communication of the categorized signal data to the user.
[0093] 21. The user interface of Aspect 11, wherein the virtual application further comprises: an installation method involving downloaded content from the internet; an installation method involving uploadable content from a physical disk or drive; and the installation methods having compatibility with digital operating systems.
[0094] 22. The button of Aspect 3, wherein the tactile interface further comprises: a system to initiate the calibration procedure of Aspect 12 from the prostheses.
[0095] 23. The user interface of Aspect 11, wherein the virtual application further comprises: a system to initiate the calibration procedure of Aspect 12 from the virtual application.
[0096] The foregoing aspects of the disclosure are exemplary only and not intended to limit the scope of the disclosure.
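A minimal sketch of the pattern recognition component of Aspects 16 through 19 — receiving calibration signal data per motion, learning each motion's signature, and categorizing new signal data — might look like the following. The nearest-class-mean classifier and all names here are assumptions for illustration; the disclosure does not mandate any particular algorithm:

```python
import math


class PatternRecognizer:
    """Illustrative sketch (not the disclosed implementation): learns a
    mean EMG feature vector per motion class during calibration, then
    categorizes new feature vectors by nearest class mean."""

    def __init__(self):
        self.class_means = {}  # motion label -> mean feature vector

    def calibrate(self, motion: str, samples: list) -> None:
        """Store the mean of the calibration feature vectors for a motion."""
        n = len(samples)
        dims = len(samples[0])
        self.class_means[motion] = [
            sum(s[i] for s in samples) / n for i in range(dims)
        ]

    def classify(self, features: list) -> str:
        """Return the motion whose learned mean is closest (Euclidean)."""
        def dist_to(label):
            mean = self.class_means[label]
            return math.sqrt(sum((f - m) ** 2 for f, m in zip(features, mean)))
        return min(self.class_means, key=dist_to)
```

A practical system would typically use richer EMG features and a stronger classifier, but this captures the calibrate-then-categorize structure the aspects describe.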
[0097] ADDITIONAL ASPECTS OF THE DISCLOSURE
[0098] 1. An electromyographic control system configured to coach prosthetic users to calibrate prosthetic devices, the electromyographic control system comprising: a myoelectric prosthetic controller configured to control a prosthetic device; an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, wherein the electromyographic software component is configured to perform an analysis of electromyographic (EMG) signal data of the user, the EMG signal data received from the plurality of electrodes; and a user interface configured to provide, based on the analysis of the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data, wherein the user interface is configured to initiate a calibration procedure to calibrate the myoelectric prosthetic controller, and wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
[0099] 2. The electromyographic control system of additional aspect 1, wherein the message comprises at least one of (a) an indication of a cause for a non-optimal signal data input of the EMG signal data, or (b) a recommended procedure for optimizing signal data input.
[0100] 3. The electromyographic control system of any of additional aspects 1 or 2, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
[0101] 4. The electromyographic control system of additional aspect 3, wherein the further calibration session is configured to facilitate at least one of (a) deleting EMG signal data corresponding to one or more data sets or movements, (b) adding EMG signal data corresponding to one or more data sets or movements, or (c) replacing EMG signal data corresponding to one or more data sets or movements with new EMG signal data.
[0102] 5. The electromyographic control system of any of the previous additional aspects, wherein the myoelectric prosthetic controller is calibrated to control the prosthetic device based on the EMG signal data.
[0103] 6. The electromyographic control system of any of the previous additional aspects, wherein the calibration button is configured to provide the feedback indication by at least one of an auditory stimulus, a tactile stimulus, or a visual stimulus.
[0104] 7. The electromyographic control system of any of the previous additional aspects, wherein the virtual user interface displays a visualization of the EMG signal data in real time.
[0105] 8. The electromyographic control system of any of the previous additional aspects, wherein the calibration procedure comprises the virtual user interface instructing the user to perform one or more indicated motions in relation to the prosthetic device, and wherein the one or more indicated motions produce the EMG signal data as received from the plurality of electrodes.
[0106] 9. The electromyographic control system of additional aspect 8, wherein the virtual user interface is configured to receive one or more selections indicating at least one of the one or more indicated motions for the user to perform.
[0107] 10. The electromyographic control system of any of the previous additional aspects, wherein the electromyographic software component further comprises a pattern recognition component configured to analyze the EMG signal data of the user, the pattern recognition component further configured to identify or categorize the EMG signal data of the user based on a particular motion performed by the user.
[0108] 11. The electromyographic control system of additional aspect 10, wherein the pattern recognition component comprises an adaptive machine learning component configured to determine the particular motion performed by the user based on the EMG signal data of the user.
[0109] 12. The electromyographic control system of additional aspect 11, wherein the adaptive machine learning component is further configured to determine an appropriate feedback indication based on the EMG signal data of the user.
[0110] 13. The electromyographic control system of any of the previous additional aspects, wherein the user interface is configured to reset calibration data of the user to calibrate the myoelectric prosthetic controller.
[0111] 14. An electromyographic control method for coaching prosthetic users to calibrate prosthetic devices, the electromyographic control method comprising: receiving, by an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, electromyographic (EMG) signal data from the plurality of electrodes; analyzing, by the electromyographic software component, the EMG signal data of the user; providing to a user interface, based on analyzing the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data; and initiating, based on the calibration quality of the EMG signal data, a calibration procedure to calibrate a myoelectric prosthetic controller, the myoelectric prosthetic controller configured to control a prosthetic device, wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
[0112] 15. The electromyographic control method of additional aspect 14, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
[0113] 16. The electromyographic control method of additional aspects 14 or 15, wherein the calibration procedure comprises the virtual user interface instructing the user to perform one or more indicated motions in relation to the prosthetic device, and wherein the one or more indicated motions produce the EMG signal data as received from the plurality of electrodes.
[0114] 17. The electromyographic control method of any of additional aspects 14 to 16, wherein the electromyographic software component further comprises a pattern recognition component configured to analyze the EMG signal data of the user, the pattern recognition component further configured to identify or categorize the EMG signal data of the user based on a particular motion performed by the user.
[0115] 18. The electromyographic control method of additional aspect 17, wherein the pattern recognition component comprises an adaptive machine learning component configured to determine the particular motion performed by the user based on the EMG signal data of the user.
[0116] 19. A tangible, non-transitory computer-readable medium storing instructions for coaching prosthetic users to calibrate prosthetic devices, that when executed by one or more processors cause the one or more processors to: receive, by an electromyographic software component communicatively coupled to a plurality of electrodes in myoelectric contact with a user, electromyographic (EMG) signal data from the plurality of electrodes; analyze, by the electromyographic software component, the EMG signal data of the user; provide to a user interface, based on analyzing the EMG signal data, a feedback indication to the user as to a calibration quality of the EMG signal data; and initiate, based on the calibration quality of the EMG signal data, a calibration procedure to calibrate a myoelectric prosthetic controller, the myoelectric prosthetic controller configured to control a prosthetic device, wherein the user interface comprises at least one of: (i) a button user interface including a calibration button, or (ii) a virtual user interface configured to display the feedback indication as at least one of: (a) a quality metric corresponding to the calibration quality of the EMG signal data, or (b) a message corresponding to the calibration quality of the EMG signal data.
[0117] 20. The tangible, non-transitory computer-readable medium of additional aspect 19, wherein the calibration procedure is initiated during a calibration session, wherein the virtual user interface provides a recommended procedure for optimizing signal data input, and wherein a further calibration procedure is initiated from the user interface during a further calibration session to recalibrate the myoelectric prosthetic controller based on the recommended procedure.
[0118] The foregoing additional aspects of the disclosure are exemplary only and not intended to limit the scope of the disclosure.
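The coaching loop of additional aspects 1 through 3 — calibrate, surface a quality metric and message to the user, and initiate a further calibration session based on the feedback — can be outlined as below. The callables and the acceptance threshold are hypothetical stand-ins for the hardware and user-interface layers described above, not a prescribed implementation:

```python
def run_calibration_session(get_emg, analyze, show_feedback,
                            max_attempts: int = 3) -> bool:
    """Illustrative coaching loop: record EMG signal data, analyze its
    calibration quality, show a feedback indication, and repeat with a
    further calibration session until quality is acceptable or the
    attempt budget is exhausted. All callables are hypothetical:
    get_emg() -> raw signal data; analyze(emg) -> (quality, message);
    show_feedback(quality, message) -> None."""
    for _attempt in range(max_attempts):
        emg = get_emg()
        quality, message = analyze(emg)
        show_feedback(quality, message)  # quality metric and/or message
        if quality >= 0.8:  # assumed acceptance threshold
            return True  # myoelectric controller calibrated
    return False  # quality never reached the threshold
```

Each failed pass corresponds to a further calibration session in which the user acts on the recommended procedure before recalibrating.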
[0119] ADDITIONAL CONSIDERATIONS
[0120] Although the disclosure herein sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and equivalents. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. Numerous alternative embodiments may be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
[0121] The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[0122] Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[0123] In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
[0124] A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0125] Accordingly, the term hardware module or hardware component should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
[0126] Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).
[0127] The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
[0128] Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location, while in other embodiments the processors may be distributed across a number of locations.
[0129] In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0130] This detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. A person of ordinary skill in the art may implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.
[0131] Those of ordinary skill in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
[0132] The patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. 112(f) unless traditional means-plus-function language is expressly recited, such as means for or step for language being explicitly recited in the claim(s). The systems and methods described herein are directed to an improvement to computer functionality, and improve the functioning of conventional computers.