COMPUTER-IMPLEMENTED METHODS AND COMPUTER PROGRAMS FOR PROVIDING ASSISTANCE REGARDING WEARABLE MEDICAL EQUIPMENT
20210298401 · 2021-09-30
Assignee
Inventors
CPC classification
A41H1/02
HUMAN NECESSITIES
G16H15/00
PHYSICS
A41H3/007
HUMAN NECESSITIES
International classification
A41H3/00
HUMAN NECESSITIES
A41H1/02
HUMAN NECESSITIES
A61B5/107
HUMAN NECESSITIES
Abstract
Computer-implemented method for providing assistance regarding at least one wearable medical equipment (6), in particular a custom-tailored compression garment (8), wherein product specification values (35, 36, 38, 39, 40) for the wearable medical equipment (6), which are used for manufacturing the wearable medical equipment (6) and/or selecting the pre-manufactured wearable medical equipment (6), are determined, characterised in that, for a measurement group (37) of product specification values (38) to be determined by manual measurement performed by a user, a communication process (47) using an, in particular handheld, mobile device (18) is implemented as a measurement assistance function (2), the communication process (47) comprising, for each product specification value (38) in the measurement group (37) and at least one user, at least outputting an instruction information guiding the user to at least one measurement position (29) on a body part (13) of a future wearer of the medical equipment (6), the measurement position (29) being associated with the current product specification value (38), and receiving an input information comprising at least the measured product specification value (38) from the user.
Claims
1. Computer-implemented method for providing assistance regarding at least one wearable medical equipment (6), in particular a custom-tailored compression garment (8), wherein product specification values (35, 36, 38, 39, 40) for the wearable medical equipment (6), which are used for manufacturing the wearable medical equipment (6) and/or selecting the pre-manufactured wearable medical equipment (6), are determined, characterised in that, for a measurement group (37) of product specification values (38) to be determined by manual measurement performed by a user, a communication process using an, in particular handheld, mobile device (18) is implemented as a measurement assistance function (2), the communication process comprising, for each product specification value (38) in the measurement group (37) and at least one user, a communication subprocess (47) with at least outputting an instruction information guiding the user to at least one measurement position (29) on a body part (13) of a future wearer of the medical equipment (6), the measurement position (29) being associated with the current product specification value (38), and receiving an input information comprising at least the measured product specification value (38) from the user.
2. Method according to claim 1, characterised in that the product specification values (38) of the measurement group (37) comprise at least one length value and/or at least one circumference value of the body part (13) and/or that at least two product specification values (38) sharing all of their at least one associated measurement positions (29) share a communication subprocess (47), in particular a pair of a skin value and a tension value.
3. Method according to claim 1, characterised in that the instruction information is at least partly output acoustically, in particular using a speaker (25) of the mobile device (18), and/or at least partly visually, in particular using a display (21) of the mobile device (18), and/or that the input information is at least partly received acoustically, in particular using a microphone (24) and/or a speech analysis unit of the mobile device (18), and/or at least partly using an electronic pen (26), in particular interacting with a touch display (21) of the mobile device (18).
4. Method according to claim 1, characterised in that the communication process is interactively dynamically adaptable according to received input information.
5. Method according to claim 4, characterised in that the instruction information is derived from a rule set describing the relative location of the measurement position (29) to at least one anatomical feature (30), wherein at least a part of the rules of the rule set comprise multiple guidance steps, wherein each communication subprocess (47) comprises communication steps (48), in particular outputting guidance information, for each guidance step, wherein, in particular, each communication step (48) is ended when input information describing the successful performance of the guidance step by the user is received.
6. Method according to claim 4, characterised in that, if an input information comprising a help request is received, a guidance information regarding the help request is retrieved from a help database and/or generated by an artificial intelligence interaction algorithm and output to the user.
7. Method according to claim 1, characterised in that the measurement assistance function (2) comprises multiple operating modes (43, 44, 45) with at least partly different communication processes and/or rules associated with each operating mode (43, 44, 45), wherein an operating mode (43, 44, 45) is chosen depending on an experience information (46) describing the experience of the user regarding the measurement of the product specification values (38).
8. Method according to claim 1, characterised in that, for at least a part of the received product specification values (38) of the measurement group (37), a plausibility check using a body part information describing typical ranges of such product specification values (38), in particular relative to at least one other product specification value (38), is performed.
9. Method according to claim 1, characterised in that the instruction information at least partly comprises at least one image (27) and/or image sequence and/or interactive visualization of the body part (13).
10. Method according to claim 1, characterised in that a mobile device (18) having a camera (22) is used, wherein at least one picture of the body part (13) is taken by the camera (22), wherein the at least one picture is evaluated to derive at least one wearer-specific instruction information and/or an action information rating a measurement action of the user, when the measurement action and/or a result of the measurement action is visible in the at least one picture, and/or to determine or modify at least one body part information used in a plausibility check, the body part information describing typical ranges of such product specification values (38), in particular relative to at least one other product specification value (38).
11. Method according to claim 1, characterised in that, in addition to the measurement assistance function (2), a product properties assistance function (3) to determine further product specification values (35, 36, 39, 40) of a non-measurement group is used, and/or that the measurement assistance function (2) and/or the product properties assistance function (3) comprises at least one additional communication process to acquire at least one wearer information (33, 34) describing at least one characteristic of the wearer.
12. Method according to claim 11, characterised in that the measurement group (37) is compiled or chosen depending on at least one product specification value (35, 36, 39, 40) of the non-measurement group and/or at least one wearer information (33, 34).
13. Method according to claim 11, characterised in that at least one product specification value (35, 36, 39, 40) of the non-measurement group is determined or adapted depending on at least one measured product specification value (38) of the measurement group (37) and/or at least one wearer information (33, 34), and/or at least one proposal value for at least one product specification value (35, 36, 39, 40) of the non-measurement group is determined depending on at least one measured product specification value (38) of the measurement group (37) and/or at least one wearer information (33, 34), wherein the at least one proposal is output for at least one user to select.
14. Method for producing or selecting a wearable medical equipment (6) for a body part (13) of a wearer, comprising automatically performing the steps of a method according to claim 1, whereafter the medical equipment (6) is automatically produced by a garment production apparatus, in particular a knitting machine, or selected using the at least one product specification value (35, 36, 38, 39, 40).
15. Computer program (1), which performs the steps of a method according to claim 1, when the computer program (1) is executed on a mobile device (18).
16. Computer program (1) according to claim 15, characterised in that it comprises, in addition to the measurement assistance function (2), the product properties assistance function (3) and at least one use assistance function (4).
Description
[0076] Further details and advantages of the current invention will become apparent from the following description of exemplary embodiments taken in conjunction with the drawing.
[0085] The computer program 1 is, in such preferred embodiments, an application (app) for a handheld mobile device, in particular a mobile phone (smartphone) or a tablet. This application serves as a companion over the lifetime of the wearable medical equipment 6, including the design/selection phase and the subsequent use of the equipment.
[0086] The measurement assistance function 2 and the product properties assistance function 3 of the group 5 are both used at a time-point 9 corresponding to the action of compiling a complete set of product specification values, which allow, at a time-point 10, the production/manufacturing of a custom-tailored medical equipment 6 or a selection of a pre-manufactured medical equipment 6 from among standard sizes. The medical equipment 6 is then provided to a future wearer, who starts using the medical equipment 6 at a time-point 11. The use assistance function 4 provides assistance to the wearer during the time period 12 in which the medical equipment 6 is used.
[0087] The measurement assistance function 2 is used for determining a measurement group of product specification values, which require manual measurement performed by a user, in particular the future wearer himself. To this end, a communication process is implemented.
[0088] A corresponding scenario is shown in
[0089] As can be seen from
[0090] For each product specification value of the measurement group, which may, for example, comprise circumference values and/or length values of the body part, associated measurement positions are provided. For manual measurement at such a measurement position, a communication subprocess of the communication process is performed, wherein at least one instruction information guiding the user to the measurement position is output, wherein, in this case, both the touch display 21 and the speaker 25 are used as output means for the instruction information.
[0091] An exemplary visual instruction information on the touch display 21 is shown in
[0092] The measurement positions 29 may be defined by a standard, for example the German RAL standard, which contains measurement positions 29 for different body parts 13, for example hands, arms, feet and legs 14.
[0093] In not yet laid-open European Patent Application EP 19167008.2, such rule sets have already been proposed to automatically locate measurement positions in three-dimensional data sets of a limb. In the method disclosed there, anatomical features of the limb are located and rules are applied to automatically find respective measurement positions. Such rules may also, in the context of the current invention, be used to provide instructions to a user, since anatomical features are easier for the user to locate than measurement positions and, once the relative location of the measurement position to the anatomical feature is known, corresponding guidance can be provided, in particular, at least in some cases, in multiple guidance steps.
[0094] Generally speaking, a rule of the rule set may describe how a measurement position may be derived from the position of an anatomical feature. While the position of an anatomical feature may, in some cases, already define at least one measurement position, regarding other measurement positions, physical and/or anatomical considerations may lead to more complex relationships, for example a certain offset from at least one position of an anatomical feature, relative positions regarding multiple anatomical features and the like. For example, in the case of a leg 14, a first measurement position may be determined as three centimetres above the malleolus, a second measurement position may be determined as two centimetres below the lower edge of the patella and/or five centimetres below the middle of the patella, and the like. Those rules may be determined in empirical studies of a plurality of persons or otherwise.
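As an illustration, such a rule set may be sketched in code. The following Python sketch is not taken from the application: the position labels ("cB", "cC"), field names and offsets are invented assumptions, chosen to mirror the malleolus/patella examples above.

```python
from dataclasses import dataclass

# Hypothetical rule-set entry: each rule relates a measurement position to an
# anatomical feature by an offset along the limb (in centimetres) and carries
# step-by-step guidance for the user. All names are illustrative assumptions.
@dataclass
class MeasurementRule:
    position_id: str        # e.g. a RAL-style position label (assumed)
    feature: str            # anatomical feature the user locates first
    offset_cm: float        # offset from the feature along the limb
    guidance_steps: tuple   # guidance steps output one by one

LEG_RULES = [
    MeasurementRule(
        position_id="cB",
        feature="malleolus",
        offset_cm=3.0,  # three centimetres above the malleolus
        guidance_steps=(
            "Locate the ankle bone (malleolus).",
            "Move three centimetres upwards along the leg.",
            "Measure the circumference at this height.",
        ),
    ),
    MeasurementRule(
        position_id="cC",
        feature="patella",
        offset_cm=-2.0,  # two centimetres below the lower edge of the patella
        guidance_steps=(
            "Locate the lower edge of the kneecap (patella).",
            "Move two centimetres downwards.",
            "Measure the circumference at this height.",
        ),
    ),
]

def instructions_for(position_id, rules):
    """Return the guidance steps for one measurement position."""
    for rule in rules:
        if rule.position_id == position_id:
            return list(rule.guidance_steps)
    raise KeyError(position_id)
```

Encoding the rules as data rather than code keeps a single rule set usable both for automatic localisation in three-dimensional data sets and for generating user guidance, as the paragraph above suggests.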
[0095] As can be seen in the example of
[0096] It is noted that, additionally or alternatively to the visual instruction information described with respect to
[0097] Once the user has followed the instructions, found the measurement position 29 and measured the product specification value, the product specification value is input and thus received in the mobile device 18. This is performed contactlessly, either, preferably, by using the microphone 24 or by using the electronic pen 26. Speech reception and analysis is especially preferred, since the user can fully concentrate on the measurement task. In particular a combination of speech output of the instruction information, which may be supported by visual output as described above if needed, with speech input is advantageous, such that the user can be in speech dialogue with the measurement assistance function 2.
[0098] It should further be noted that some product specification values of the measurement group, for example a skin value and a tension value, may share at least one measurement position, such that the communication subprocess is preferably performed only once for both of these product specification values and the user has to be guided to each measurement position only once.
[0099] Generally, the communication process is interactively dynamically adaptable according to received input information, in particular may comprise at least one question-answer process. A communication subprocess of a concrete embodiment is illustrated in
[0100] Thus, instruction information for a first of the at least one guidance step (and later on further guidance steps) is output in the first occurrence of step S1.
[0101] In a step S2, input information is received, as described, and evaluated in steps S3, S4 and S5.
[0102] If, in step S3, it is determined that the input information describes an unsuccessful performance of the guidance step, a help request, or some predefined interaction with the image 27 or sequence of images displayed on the touch display 21, correspondingly updated output information is output in a step S6, for example a refined, more accurate instruction information in the case of a help request or an unsuccessful performance of the guidance step, or a detailed view or additional information regarding the feature that was interacted with. For example, when interacting with the anatomical feature 30, that is the ankle, in
[0103] After step S6, input information is again received in step S2.
[0104] In step S4, it is checked whether the input information comprises a chat request, such that, in a step S7, for example, a chat window may be opened in which interactive communication with a chat bot or support staff may be initiated.
[0105] In step S5, it is checked whether the input information contains the product specification value to be measured. In other words, it can be evaluated if further guidance steps follow. If this is the case, the instruction information is updated accordingly in step S8 and the previous steps are repeated for the next guidance step, such that an additional communication step results.
[0106] If, in step S5, it has been determined that the measured product specification value has been received, in a step S9, a plausibility check for the measured product specification value may be performed. It is noted that step S9 may also be performed at a later stage, for example, when multiple product specification values have been measured.
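The dialogue loop of steps S1 to S8 described above can be sketched as follows. This is a minimal illustration, not the application's implementation: the `io` interface (an `output` method and a `receive` method returning a dictionary) and the reply keys are assumptions.

```python
def communication_subprocess(guidance_steps, io):
    """Sketch of one communication subprocess (steps S1-S8).

    `io` is a hypothetical dialogue interface: io.output(text) presents
    instruction information (S1/S8), io.receive() returns the next input
    information (S2) as a dict. Keys 'help', 'chat' and 'value' stand in
    for the checks of steps S3, S4 and S5."""
    step = 0
    while step < len(guidance_steps):
        io.output(guidance_steps[step])              # S1/S8: output instruction
        reply = io.receive()                          # S2: receive input
        if reply.get("help"):                         # S3 -> S6: refined instruction
            io.output("More detail: " + guidance_steps[step])
            continue
        if reply.get("chat"):                         # S4 -> S7: open chat dialogue
            io.output("Opening chat with support...")
            continue
        if "value" in reply:                          # S5: measured value received
            return reply["value"]                     # plausibility check (S9) follows
        step += 1                                     # guidance step done, go to next
    return None

class ScriptedIO:
    """Tiny test harness replaying scripted user replies."""
    def __init__(self, replies):
        self.replies = iter(replies)
        self.log = []
    def output(self, text):
        self.log.append(text)
    def receive(self):
        return next(self.replies)
```

For example, a user who completes the first guidance step, asks for help on the second, and then speaks the measured value would drive the loop through S6 exactly once before the value is returned for step S9.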
[0107] The plausibility check uses body part information and may comprise comparisons with at least one threshold and/or already measured and input product specification values. For example, as can be seen from
[0108] It is noted that, if plausibility rules are used in the plausibility check, these rules may also be included in the rule set described above, such that only one rule set regarding the body part is required and may, for example, be provided as a database containing all-purpose body part information.
[0109] In some embodiments, in step S9, an artificial intelligence plausibility algorithm may be employed, which may also have been trained regarding feedback data describing shortcomings of already produced and/or selected medical equipment 6, such that the artificial intelligence plausibility algorithm may detect patterns of product specification values often leading to complaints and provide corresponding output and/or correction.
[0110] If the plausibility check in step S9 fails, the user is notified accordingly using the speaker 25 and/or the touch display 21, such that the input information may be corrected, the measurement re-performed and/or, in particular in the case of preventing complaints, the value adapted, in particular according to a certain proposal.
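A plausibility check of the kind described, combining absolute thresholds with constraints relative to other measured values, may be sketched as below. The concrete position names, ranges and the relative rule are invented for demonstration only; real body part information would come from the rule set or database mentioned above.

```python
# Illustrative body part information: absolute typical range per position
# (in centimetres) plus relative constraints between positions. All numbers
# here are invented assumptions, not values from the application.
ABSOLUTE_RANGES = {"ankle": (15.0, 35.0), "calf": (25.0, 55.0)}
RELATIVE_RULES = [("calf", "ankle", 1.05)]  # calf should exceed 1.05 x ankle

def plausibility_check(values):
    """Return a list of human-readable problems; an empty list means the
    measured product specification values are plausible."""
    problems = []
    # Threshold comparison against typical ranges of such values
    for pos, (lo, hi) in ABSOLUTE_RANGES.items():
        v = values.get(pos)
        if v is not None and not (lo <= v <= hi):
            problems.append(f"{pos} value {v} outside typical range {lo}-{hi} cm")
    # Comparison relative to other, already measured values
    for bigger, smaller, factor in RELATIVE_RULES:
        if bigger in values and smaller in values:
            if values[bigger] < factor * values[smaller]:
                problems.append(f"{bigger} should exceed {factor} x {smaller}")
    return problems
```

On failure, the returned problem descriptions could be output via the speaker 25 or the touch display 21 so that the user can correct the input or re-measure.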
[0111] It is noted at this point that, in the measurement assistance function 2 and/or in the product properties assistance function 3, wearer information may be additionally received from the user or from another source, wherein the wearer information may determine some product specification values and/or which product specification values are to be used; however, it is also possible that product specification values of the non-measurement group influence product specification values of the measurement group and vice versa, as will be further explained below with regard to
[0112] Additionally, in the stage 3a, some product specification values 35 of the non-measurement group may be received as input information from the user. It is noted that further product specification values or proposals therefor may, in stage 3a, be automatically determined from the wearer information 33, 34, sometimes in conjunction with input product specification values 35. That is, the output of stage 3a regarding the non-measurement group may not only comprise the user-received product specification values 35, but also additional product specification values 36 of the non-measurement group derived from the received product specification values 35 and/or the wearer information 33, 34.
[0113] However, in this embodiment, the measurement group 37, that is, the measurement specification values to be requested, may also be compiled or chosen depending on at least one product specification value 35, 36 of the non-measurement group and/or at least one wearer information 33, 34. In an example, if a custom-tailored compression garment for a leg 14 is to be specified, the product specification values of the measurement group 37 and the respective measurement positions may be defined according to a standard, for example RAL. It is noted at this point that, if a rule set is used, the rule set may also comprise an order in which the product specification values are to be measured, which may also be provided as input to the measurement assistance function 2.
[0114] It is noted that, in some embodiments, wearer information 33, 34 may be used to refine and personalise instruction information regarding the wearer, as indicated by the dashed lines. In any case, measured product specification values 38 for the measurement group 37 are received from the user, as described above. It should be noted, however, that, for example depending on wearer information 33, 34, some product specification values of the measurement group 37 may, in rare cases, also be automatically determined.
[0115] Some product specification values 39 of the non-measurement group may also be automatically determined based on received product specification values 38 of the measurement group 37 and/or wearer information 33, 34, such that they do not need to be entered in the second stage 3b of the product properties assistance function 3. It is, in some embodiments, also possible that proposals for the user to select may be output for at least some product specification values of the non-measurement group. In a concrete example, it may be derived from the user-received product specification values 38 of the measurement group 37 whether flat knitting or circular knitting is a preferred production method.
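The flat versus circular knitting example could look as follows. The application does not state the actual decision criterion; the heuristic below (a large spread between the smallest and largest circumference favours flat knitting) and its threshold are purely invented assumptions for illustration.

```python
def propose_production_method(measured):
    """Hypothetical proposal for a non-measurement-group value derived from
    measured circumference values (a dict of position -> centimetres).

    Assumption (not from the application): strongly differing girths along
    the limb favour flat knitting, otherwise circular knitting is proposed.
    The threshold 1.8 is an invented placeholder."""
    values = list(measured.values())
    spread = max(values) / min(values)
    return "flat knitting" if spread > 1.8 else "circular knitting"
```

Such a derived value would be output merely as a proposal in stage 3b, leaving the final selection to the user.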
[0116] In some embodiments, it is also possible to derive a body part health information from measured and user-received product specification values 38 of the measurement group 37, wherein at least one product specification value or a proposal value therefor may be determined such that a medical equipment 6 suitable for treating a health condition of the body part 13 described by the body part health information is provided. In this evaluation process, additionally, wearer information 33, 34 may preferably be used. For example, another medical equipment category may be recommended, in particular even replacing a user-received product specification value 35 or derived product specification value 36 of the first stage 3a.
[0117] In stage 3b, analogously to stage 3a, product specification values 40 of the non-measurement group that are still missing may be requested and received correspondingly from the user. Hence, as an output of stage 3b, a complete set 41 of product specification values 35, 36, 38, 39, 40 is provided, such that, in a step 42, a corresponding medical equipment 6 may be manufactured or selected. In the case of manufacturing a custom-tailored wearable medical equipment 6, in particular a compression garment 8, production values for controlling a garment production apparatus, in particular a knitting machine, may be automatically derived and the garment 8 may be automatically produced.
[0118] It is noted that, while speech recognition and acoustical dialogue with the user are very advantageous during the measurement process, the input of product specification values 35, 40 of the non-measurement group may, of course, also be performed in a voice-controlled manner.
[0119] As shown in
[0120] Depending on the experience information 46, a beginner's operating mode 43, a normal operating mode 44 or an expert operating mode 45 is chosen. In the operating mode 43, a communication subprocess 47 of the communication process may comprise multiple, thoroughly explained communication steps 48 associated with guidance steps. In the normal operating mode 44, the number of communication steps 48 in the communication subprocess 47 is already considerably reduced, while in the expert operating mode 45, the communication subprocess 47 may be reduced to a single communication step 49. In a concrete example, when measuring circumference values at a certain RAL measurement position, in the beginner's operating mode 43, the user may be guided step-by-step from the anatomical feature 30 to the measurement position 29. In the expert operating mode 45, the RAL measurement position may simply be named.
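The mode selection and the reduction of communication steps can be sketched as follows. The experience metric (number of completed measurement sessions) and the thresholds are assumptions for illustration; the application only specifies that a mode is chosen depending on an experience information 46.

```python
BEGINNER, NORMAL, EXPERT = "beginner", "normal", "expert"

def select_operating_mode(sessions_completed):
    """Sketch: derive the operating mode from a simple experience metric.
    The metric and thresholds are invented assumptions."""
    if sessions_completed < 3:
        return BEGINNER
    if sessions_completed < 10:
        return NORMAL
    return EXPERT

def communication_steps(mode, guidance_steps):
    """Reduce the communication steps of a subprocess depending on the mode:
    every step for beginners, a condensed dialogue for normal users, and a
    single step (naming the position) for experts."""
    if mode == BEGINNER:
        return list(guidance_steps)
    if mode == NORMAL:
        if len(guidance_steps) > 1:
            return [guidance_steps[0], guidance_steps[-1]]
        return list(guidance_steps)
    return [guidance_steps[-1]]  # expert: just name the measurement position
```

In this sketch an expert measuring at a known RAL position would receive only the final step, while a beginner is walked through every guidance step of the rule.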
[0121] Finally,
[0122] The therapy controlling subfunction 50 may also comprise outputting information, for example instructions for use of the medical equipment 6, skin care information and/or information regarding additional therapy actions to be performed by the wearer.
[0123] A reminder subfunction 51 may be used to remind a user, in particular the wearer, of events associated with the medical equipment, for example certain therapy actions to be performed, like a gymnastic exercise or checking compression levels. The use assistance function may also use a communication interface already used by the product properties assistance function 3, as described above, to access the electronic health record 32, such that information may be exchanged, in particular regarding therapy, and/or even an electronic prescription may be triggered and/or implemented.
[0124] The use assistance function 4 may further comprise a feedback subfunction 53, such that feedback information, in particular a complaint, may be submitted to a manufacturer (producer) and/or a provider (seller) of the medical equipment 6.
[0125] Feedback information may, preferably, comprise at least one, in particular annotated, picture taken with the camera 22 of the mobile device 18.
[0126] Finally, the use assistance function 4 may comprise an interactive help assistant 54, which may preferably comprise a question interpreter to provide general support and assistance regarding the medical equipment 6.