METHOD AND SYSTEM FOR TRACKING MOVEMENT OF A PERSON WITH WEARABLE SENSORS

20220183592 · 2022-06-16

    Abstract

    Methods and systems for tracking movement of a person, including a method comprising: arranging a plurality of sensors of a motion tracking system on a body of the person; tracking movement of the person with the plurality of sensors at least while the person performs a movement; digitally estimating, with a computing device, a position of either the first joint or the sensor on the second body member, wherein the position of the first joint is estimated using measurements of the first sensor and the position of the sensor is estimated using measurements of both the first sensor and the sensor arranged on the second body member, and wherein the position is estimated while movement of the person is tracked; digitally computing, with the computing device, an acceleration of the estimated position while movement of the person is tracked; digitally computing, with the computing device, a first comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the second body member; and digitally determining, with the computing device, the movement performed by the person based on the first comparison.

    Claims

    1. A method for tracking movement of a person, the method including the following steps: arranging a plurality of sensors of a motion tracking system on a body of the person, the plurality of sensors at least comprising first, second, and third sensors, each of the plurality of sensors at least comprising an accelerometer and a gyroscope, wherein the first sensor is arranged on a first body member of the person, a first one of the second and third sensors is arranged on a second body member of the person, and a second one of the second and third sensors is arranged on a third body member of the person, the first and second body members being connected by a first joint; tracking movement of the person with the plurality of sensors at least while the person performs a movement; digitally estimating, with a computing device, a position of either the first joint or the sensor on the second body member, wherein the position of the first joint is estimated using both measurements of the first sensor and a first predetermined length, and the position of the sensor is estimated using measurements of both the first sensor and the sensor arranged on the second body member and further using both the first and a second predetermined lengths, and wherein the position is estimated while movement of the person is tracked; digitally computing, with the computing device, an acceleration of the estimated position while movement of the person is tracked; digitally computing, with the computing device, a first comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the second body member; and digitally determining, with the computing device, the movement performed by the person based on the first comparison.

    2. The method of claim 1, further comprising: digitally computing, with the computing device, a second comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the third body member; and digitally determining, with the computing device, whether the second and third sensors have been arranged on the second and third body members, respectively, based on both the first and second comparisons.

    3. The method of claim 2, wherein if the computing device digitally determines that the second and third sensors have been arranged on the third and second body members, respectively, the computing device digitally substitutes at least one of measurements of the second sensor for measurements of the third sensor and/or vice versa while movement of the person is tracked.

    4. The method of claim 2, wherein the step of digitally determining whether the second and third sensors have been arranged on the second and third body members, respectively, is carried out prior to the step of digitally determining the movement performed by the person based on the first comparison.

    5. The method of claim 1, wherein each comparison is digitally computed so that a result thereof is indicative of a level of similarity or dissimilarity between the computed acceleration of the estimated position and the acceleration measurements of the corresponding sensor of the plurality of sensors.

    6. The method of claim 5, further including the following steps: digitally computing, with the computing device, a second comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the third body member; and digitally determining, with the computing device, whether the second and third sensors have been arranged on the second and third body members, respectively, based on both the first and second comparisons, wherein the computing device digitally determines that the second and third sensors have been arranged on the second and third body members, respectively, if the result of the first comparison is: greater than the result of the second comparison if both results are indicative of a level of similarity; or lower than the result of the second comparison if both results are indicative of the level of dissimilarity; otherwise the computing device digitally determines that the second and third sensors have been arranged on the third and second body members, respectively.

    7. The method of claim 1, wherein the position of the sensor arranged on the second body member is digitally estimated by further using a factor in the form of a decimal number greater than 0 and smaller than 1.

    8. The method of claim 1, wherein the step of digitally determining the movement comprises determining that a first predetermined movement has been performed by the person if the first comparison is above a predetermined threshold.

    9. A system for tracking movement of a person, comprising: a plurality of sensors at least comprising first, second and third sensors each at least comprising an accelerometer and a gyroscope, wherein the first sensor is arrangeable on a first body member of the person, a first one of the second and third sensors is arrangeable on a second body member of the person, and a second one of the second and third sensors is arrangeable on a third body member of the person; and a computing device comprising at least one processor, at least one memory and means for transmitting and receiving data; wherein the computing device is programmed to: estimate either a position of a first joint connecting the first and second body members or a position of the sensor on the second body member when the person has the plurality of sensors arranged thereon tracking the movement of the person, wherein the position of the first joint is estimated using both measurements of the first sensor and a first predetermined length, and the position of the sensor on the second body member is estimated using measurements of both the first sensor and the sensor arranged on the second body member and further using both the first and a second predetermined lengths; compute an acceleration of the estimated position while movement of the person is tracked; compute a first comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the second body member; and determine the movement performed by the person based on the first comparison.

    10. The system of claim 9, wherein the computing device is further programmed to: compute a second comparison between the computed acceleration of the estimated position and acceleration measurements of the sensor arranged on the third body member; and determine whether the second and third sensors have been arranged on the second and third body members, respectively, based on both the first and second comparisons.

    11. The system of claim 10, wherein the computing device is further programmed to substitute at least one of measurements of the second sensor for measurements of the third sensor and vice versa while movement of the person is tracked and when the computing device determines that the second and third sensors have been arranged on the third and second body members, respectively.

    12. The system of claim 9, wherein the computing device computes each comparison so that a result thereof is indicative of a level of similarity or dissimilarity between the computed acceleration of the estimated position and the acceleration measurements of the corresponding sensor of the plurality of sensors.

    13. The system of claim 9, wherein the computing device determines that a first predetermined movement has been performed by the person if the first comparison is above a predetermined threshold.

    14. The system of claim 9, wherein the computing device estimates the position of the sensor arranged on the second body member by further using a factor in the form of a decimal number greater than 0 and smaller than 1.

    15. A method for tracking movement of a person, the method including the following steps: arranging a plurality of sensors of a motion tracking system on a body of the person, the plurality of sensors at least comprising first, second and third sensors each at least comprising an accelerometer and a gyroscope, a first one of the first, second and third sensors is arranged on a first body member of the person, a second one of the first, second and third sensors is arranged on a second body member of the person, and a third one of the first, second and third sensors is arranged on a third body member of the person, the first and second body members being connected by a first joint; and tracking movement of the person with the plurality of sensors at least while the person performs a movement, the movement performed by the person involving the first body member and at least one of the second and third body members, and the first body member undergoing a rotation greater than rotation of at least one of the second and/or third body members during the movement; for each sensor of the plurality of sensors, digitally computing, with a computing device, the rotation undergone by the sensor while movement of the person is tracked; digitally determining, with the computing device, that the sensor with the greatest computed rotation is arranged on the first body member; digitally estimating, with the computing device while movement of the person is tracked: a position of the first joint, the position being estimated using both measurements of the determined sensor and a first predetermined length; or a position of each of the two other sensors thereby providing first and second estimated positions, the first position being estimated using measurements of both the determined sensor and the first one of the two other sensors and further using both the first and a second predetermined lengths, and the second position being estimated using measurements of both the determined sensor and the second one of the two other sensors and further using both the first and the second predetermined lengths; digitally computing, with the computing device, an acceleration of each estimated position while movement of the person is tracked; digitally computing, with the computing device: if the estimated position is the position of the first joint, a first comparison between the computed acceleration of the estimated position and acceleration measurements of the first one of the two other sensors, and a second comparison between the computed acceleration of the estimated position and acceleration measurements of the second one of the two other sensors; or if the estimated position is the first and second estimated positions, a first comparison between the computed acceleration of the first estimated position and acceleration measurements of the first one of the two other sensors, and a second comparison between the computed acceleration of the second estimated position and acceleration measurements of the second one of the two other sensors; and digitally determining, with the computing device, which sensor has been arranged on the second body member and which sensor has been arranged on the third body member based on each of the first and second comparisons.

    16. The method of claim 15, wherein the second and third body members are connected by a second joint.

    17. The method of claim 15, further comprising digitally determining, with the computing device, the movement performed by the person based on one of the first and second comparisons corresponding to the computed acceleration of the estimated position of the second one of the first, second, and third sensors.

    18. The method of claim 17, wherein the computing device determines that at least one of: a first predetermined movement has been performed by the person if the one of the first and second comparisons is above a predetermined threshold; and either the first predetermined movement has not been performed by the person or a second predetermined movement has been performed by the person if the one of the first and second comparisons is below the predetermined threshold.

    19. The method of claim 15, wherein all the body members having a sensor arranged thereon form a kinematic chain.

    20. The system of claim 13, wherein the computing device determines that either the first predetermined movement has not been performed by the person or a second predetermined movement has been performed by the person if the first comparison is below the predetermined threshold.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0137] To complete the description and in order to provide for a better understanding of the invention, a set of drawings is provided. Said drawings form an integral part of the description and illustrate embodiments of the invention, which should not be interpreted as restricting the scope of the invention, but just as examples of how the invention can be carried out. The drawings comprise the following figures:

    [0138] FIG. 1 diagrammatically shows a motion tracking system in accordance with an embodiment.

    [0139] FIGS. 2-3 show a person performing predetermined movements while wearing motion tracking sensors.

    [0140] FIGS. 4 and 5 illustrate the estimation of the position in procedures in accordance with the present disclosure.

    [0141] FIG. 6 illustrates a method for identifying sensors arranged on a person in accordance with an embodiment.

    DESCRIPTION OF WAYS OF CARRYING OUT THE INVENTION

    [0142] FIG. 1 diagrammatically shows a motion tracking system 10 in accordance with an embodiment. The motion tracking system 10 includes a plurality of sensors 21-23 and a computing device 40.

    [0143] The sensors 21-23 are sensors that at least include a gyroscope 31 and an accelerometer 32. The sensors 21-23 also include at least one processor 36 and at least one memory 37. In preferred embodiments such as the one of FIG. 1, the sensors 21-23 further include a first communications module 38 for transmitting and receiving data that enables the sensors 21-23 to transmit (through a wired or wireless communications technology and protocol known by a skilled person, for instance but without limitation, Bluetooth communications, cellular network communications such as GSM, UMTS or LTE, wireless LAN communications, etc.) measurements of each of the sensing devices 31, 32 to the computing device 40. The same first communications modules 38 enable the sensors 21-23 to receive data from the computing device 40. In less preferred embodiments, the sensors 21-23 are not provided with the first communications module 38; in these embodiments, data can be extracted from the sensors 21-23 and/or provided to the sensors 21-23 by means of a computer readable storage medium.

    [0144] The computing device 40 includes at least one processor 42 and at least one memory 44. Preferably, the computing device 40 further includes a second communications module 46 for transmitting and receiving data. When the computing device 40 is not provided with the second communications module 46, data can be extracted therefrom and/or introduced therein by means of a computer readable storage medium.

    [0145] FIG. 2 shows a person 50 performing a first predetermined movement while wearing sensors 21-23 of the motion tracking system 10.

    [0146] The person 50 has arranged thereon three sensors 21-23 tracking the movement of said person: the first sensor 21 is on the right upper leg 51, the second sensor 22 is on the right lower leg 52, and the third sensor 23 is on the chest 53. These body members 51-53 form a kinematic chain.

    [0147] In this example, the first predetermined movement is a squat, which involves the lower legs or shanks 52, and the upper legs or thighs 51, and the chest 53. In this movement, the end 57 of the lower legs 52 that connects to the ankle has a known position that remains still or almost still during the entire movement.

    [0148] FIG. 3 shows the person 50 performing a second predetermined movement while wearing sensors 21-23 of the motion tracking system 10.

    [0149] The person 50 has the three sensors 21-23 arranged as in FIG. 2, the difference of this example being that the second predetermined movement is a knee-up, which involves the lower leg 52 and the upper leg 51 of a same leg. In this movement, the end 56 of the upper leg 51 that connects to the hip has a known position that remains still or almost still during the entire movement.

    [0150] As can be seen from FIGS. 2 and 3, at the most flexed position in the two physical exercises the chest 53 and the right upper and lower legs 51, 52 of a same leg have a similar angular arrangement despite the differences between the two movements.

    [0151] If, as in the prior art, the orientations of the sensors 21-23 are used for determining the movement performed by the person, the angular relationship existing between the sensors 21-23 when the person 50 performs a first one of the two movements would be similar to that when the person 50 performs a second one of the two movements. In that case, it would not be possible to know whether, for example, the foot has actually been raised so that it leaves the ground, or whether the hip has remained still. With only the orientations measured while the movements are performed, it is not possible to determine which one of the two movements has been performed. Methods and systems according to the present disclosure, however, make it possible to determine the movement actually performed with a reduced probability of error.

    [0152] FIG. 4 illustrates the estimation of the position of a sensor 22 of the person 50 having arranged the sensors of the motion tracking system 10 thereon in accordance with a predetermined sensor arrangement, namely the sensors have not been swapped by mistake.

    [0153] In this example, the person 50 is performing the movement of FIG. 3, namely a knee-up exercise.

    [0154] The computing device 40 of FIG. 1 receives acceleration and orientation measurements from each sensor 21-23. By means of first and second orientations of the respective first and second sensors 21, 22, the position of the second sensor 22 on the right lower leg 52 is estimated.

    [0155] Owing to the attachment of the first sensor 21 to the right upper leg 51, the first orientation is indicative of the orientation of the right upper leg 51. With the first orientation, the position of the knee 55 is estimated with reference to the hip 56. This is achieved by processing the first orientation together with a value corresponding to a possible length of the right upper leg 51; this may be represented as the first vector 61, which is the result of combining said orientation and said length value. For the sake of clarity only, it can be considered that the orientation results in a unit vector (or an analogous mathematical tool) and the length value is the magnitude thereof.

    [0156] Even though in this example the position of the hip 56 is known, the position of the knee 55 can also be estimated without knowledge of the position of the hip 56, in which case the position of the knee 55 is a position relative to the other endpoint of the first vector 61 (preferably made to coincide with the position of the hip 56 when this is known). Thus, as mentioned, the position of the knee 55 (which is a second endpoint of the first vector 61) is determinable by the computing device 40 relative to a first endpoint of the first vector 61.

    [0157] From this estimated knee position, the position of the second sensor 22 is then estimated. This is achieved by processing the second orientation together with a value corresponding to a possible length of the right lower leg 52 times a factor (a decimal value between 0 and 1, endpoints excluded) indicating the estimated position of the sensor 22 along the extension of the right lower leg 52. For instance, if the second sensor 22 is to be placed at the middle point of the right lower leg 52, the factor is 0.50, whereas if it is closer to the knee than to the ankle, the factor is less than 0.50 (but greater than 0.00). This combination may be represented as the second vector 62, which has its starting point at the estimated position of the knee. The estimated position of the second sensor 22 is then the combination of both vectors 61, 62; the position is thus referenced to the end of the right upper leg 51 with known position, i.e. the hip 56.

    [0158] As is readily apparent, the first and second vectors 61, 62 have a direction based on the orientations provided by the respective sensors 21, 22. As the movement is performed, the orientations change and so does one or both of the estimated positions. Although in this example the computation for determining the movement performed is based on the estimated position of the second sensor 22, in other examples an equivalent computation could be based on the estimated position of the knee 55, because the estimated position thereof may vary similarly to the estimated position of the second sensor 22 (but if there is a further movement performed by the right lower leg 52, then the evolution of these two estimated positions differs to a greater degree).

    [0159] The computing device 40 double differentiates the estimated position so as to obtain an acceleration of the estimated position. As the movement is performed by the person 50, the computing device 40 computes said acceleration. Then, by computing a comparison between the computed acceleration and the acceleration measurements of the second sensor 22, the computing device 40 determines that the movement actually performed by the person 50 is the knee-up instead of the squat of FIG. 2. This is so because the computed acceleration will be similar to the acceleration measurements of the second sensor 22, and thus the result of the comparison will be above a predetermined threshold set for determining that the movement performed is the knee-up, whereas if the squat had been performed the result of the comparison would not be above that threshold. What is more, in this latter example, the computing device 40 could determine that since the predetermined threshold has not been exceeded, the movement performed by the person 50 is a squat, not just that the movement performed by the person 50 is not a knee-up.
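    A minimal sketch of the double differentiation and comparison described above, assuming uniformly sampled positions and a cosine-style similarity measure (the patent does not mandate a particular comparison; this choice is an assumption for illustration):

```python
import numpy as np

def acceleration_of_position(positions, dt):
    """Double differentiate a sampled position trajectory (shape N x D)
    with a second-order finite difference."""
    return np.diff(positions, n=2, axis=0) / dt**2

def similarity(a, b):
    """Cosine-style similarity in [-1, 1] between a computed acceleration
    trace and a measured one; higher means more similar."""
    a, b = np.asarray(a).ravel(), np.asarray(b).ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

    A trajectory with constant acceleration recovers that acceleration exactly, and comparing the result against measured accelerations with `similarity` gives the quantity that is checked against the predetermined threshold.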

    [0160] The predetermined threshold is preferably set by carrying out tests for setting and/or adjusting the value of the predetermined threshold for given movements.

    [0161] In this example, the computing device 40 has stored in the memory thereof that in a knee-up exercise the right lower leg 52 is to have an acceleration going upwards as the right upper leg 51 moves to a higher position, whereas in a squat exercise the right lower leg 52 has a reduced acceleration which, furthermore, does not go upwards but rather towards the front. Hence, if the person 50 is requested to perform a knee-up but instead the person 50 performs a squat, upon computing the comparison the computing device 40 will determine that no knee-up movement has been performed due to the differing accelerations. The same determination would be made by the computing device 40 if the estimated position to be double-differentiated were that of the knee 55 rather than the position of the second sensor 22.

    [0162] In another example not illustrated, the computing device is configured to determine whether the movement performed by the person is a squat. In that example, the computing device preferably considers the lower leg as the first body member since one end thereof (i.e. the ankle) has a known position, which furthermore undergoes little or no movement at all while the squat is performed. The ankle position is thus preferably used as the reference for the processing of sensor measurements in order to determine if the movement performed is a squat.

    [0163] Likewise, in yet another example in which it is attempted to determine whether the person performs a squat, the position of the knee is known or can be estimated thanks to, for example, measurements of the sensor on the lower leg. With the position of the knee as a reference, the computing device estimates the position of the hip or the position of the sensor on the chest. Then, the computing device computes an acceleration for said position and a first comparison between the computed acceleration and acceleration measurements of the sensor on the chest. Therefore, in this example, the upper leg and the chest are regarded as the first and second body members, respectively.

    [0164] FIG. 5 illustrates the estimation of the position of a sensor 23 of the person 50 in order to determine whether two sensors 22, 23 have been swapped when placed on the person.

    [0165] The person 50 is performing the knee-up movement once again. However, in this example the person 50 has inadvertently swapped the second and third sensors 22, 23; accordingly, the second sensor 22 is placed on the chest 53 whereas the third sensor is placed on the right lower leg 52.

    [0166] If the computing device 40 is expecting a different sensor arrangement, e.g. the second sensor 22 on the right lower leg 52 and the third sensor 23 on the chest 53, because a different sensor and body member correspondence is stored in the memory of the computing device 40, the interchange of sensors 22, 23 will affect the determination of the movement performed by the person 50.

    [0167] In this example, the computing device 40 attempts to estimate the position of the sensor arranged on the lower leg 52 using both sensors 22, 23 as potential candidates. The computing device 40 thus estimates a first position for that sensor by using measurements of both the sensor 21 that is arranged on the right upper leg 51 and a first one of the two remaining sensors, e.g. the second sensor 22, and also estimates a second position for the sensor arranged on the lower leg 52 by using measurements of both the sensor 21 that is arranged on the right upper leg 51 and a second one of the two remaining sensors, e.g. the third sensor 23. For clarity purposes only, the first estimated position at the particular time instant represented in FIG. 5 is at the end of the vector 64 (the direction of the vector 64 corresponds to the measurements provided by the second sensor 22; the length of the vector 64 corresponds to a second predetermined length intended to represent the distance at which the second sensor 22 is expected to be from the knee 55 should it have been arranged on the right lower leg 52; and the origin of the vector 64 is at the estimated position of the knee 55, given by the first vector 61, because the knee 55 is the joint connecting the right upper and lower legs 51, 52 that are involved in the movement), and the second estimated position at the particular time instant represented in FIG. 5 is at the end of the vector 63.

    [0168] The accelerations of each of the first and second estimated positions are computed (taking the second derivative), and then each is compared to the acceleration measurements of the respective one of the two sensors 22, 23, thereby producing first and second comparisons. The computing device identifies which one of the two remaining sensors 22, 23 is placed on the right lower leg 52 by finding the comparison that has a greater degree of similarity between the computed acceleration and the acceleration measurements of one of the sensors (in this example, the third sensor 23), or conversely the comparison that has a lower degree of dissimilarity between the computed acceleration and the acceleration measurements of one of the sensors (i.e. the third sensor 23).
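    The swap test of paragraph [0168] can be sketched as follows; the candidate labels and the similarity measure are assumptions for illustration, not the patent's required implementation:

```python
import numpy as np

def identify_lower_leg_sensor(acc_est_a, acc_est_b, acc_meas_a, acc_meas_b):
    """Decide which candidate sensor ('a' or 'b') is on the lower leg.

    acc_est_a / acc_est_b: computed accelerations of the positions estimated
        with candidate a and candidate b, respectively.
    acc_meas_a / acc_meas_b: the candidates' measured accelerations.
    """
    def sim(x, y):
        x, y = np.asarray(x).ravel(), np.asarray(y).ravel()
        return x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)
    first = sim(acc_est_a, acc_meas_a)    # first comparison
    second = sim(acc_est_b, acc_meas_b)   # second comparison
    # The candidate whose comparison shows the greater similarity is taken
    # as the sensor actually placed on the lower leg.
    return 'a' if first >= second else 'b'
```

    If candidate a's measurements track the computed acceleration while candidate b's do not, the function attributes the lower-leg placement to a, and vice versa.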

    [0169] In some embodiments, after identifying which sensor is placed on the (second) body member connected to the (first) body member for estimating the position of the joint connecting both body members, the computing device 40 substitutes the measurements of, for example, the second sensor 22 for those of the third sensor 23 and/or vice versa, so that despite the incorrect placement of the sensors on the body of the person 50, the motion tracking procedure is continued without rendering incorrect the determination of the movement performed by the person 50. This process can be carried out before the process described with reference to FIG. 4.

    [0170] With identified positions of the sensors 21-23, for example by way of the above procedure, the computing device of the motion tracking system is also capable of determining the movement performed by the person 50. In this case, the processing described with reference to FIG. 4 results in the determination of the movement, in this case a knee-up.

    [0171] FIG. 6 illustrates a method for identifying the sensors 21-23 arranged on the person 50 in accordance with an embodiment.

    [0172] The person 50 is performing the predetermined movement of a squat. Even though the person 50 intended to arrange the first, second and third sensors 21-23 on the right upper leg 51, right lower leg 52, and chest 53, respectively, the person 50 inadvertently swapped the sensors 21-23 and arranged the first sensor 21 on the right lower leg 52, the second sensor 22 on the chest 53, and the third sensor 23 on the right upper leg 51. The computing device of the corresponding motion tracking system 10 has stored thereon the intended sensor arrangement, so the motion tracking procedure would not be correct.

    [0173] The computing device 40 has stored thereon data indicative of how the body members would have to move during a squat movement. Accordingly, the data indicates that the right upper leg 51 is to be subject to a rotation 70 greater than the rotations of both the right lower leg 52 and the chest 53; in this case the rotation of the right upper leg 51, whose orientation is shown as 65, is about the knee 55 as shown with the dashed arrow 70.

    [0174] The first, second and third sensors 21-23 provide measurements to the computing device 40, which in turn processes them to compute the rotations undergone by each one of the sensors 21-23 while the person 50 is performing the squat movement. The computing device 40 determines that the third sensor 23 is the sensor actually placed on the right upper leg 51 instead of the first sensor 21.
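    An illustrative sketch of the rotation computation of paragraph [0174], assuming the rotation undergone by each sensor is approximated by integrating the magnitude of its gyroscope samples (the patent does not prescribe this particular approximation):

```python
import numpy as np

def total_rotation(gyro_samples, dt):
    """Approximate total rotation angle (rad) of one sensor by integrating
    the magnitude of its angular-velocity samples (shape N x 3)."""
    return float(np.sum(np.linalg.norm(gyro_samples, axis=1)) * dt)

def sensor_with_greatest_rotation(gyro_by_sensor, dt):
    """Index of the sensor whose integrated rotation is largest; under the
    stored squat model this is the sensor on the upper leg."""
    return int(np.argmax([total_rotation(g, dt) for g in gyro_by_sensor]))
```

    Applied to the three sensor streams, the index returned identifies which physical sensor is actually on the right upper leg 51, regardless of how the sensors were labeled.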

    [0175] The computing device 40 estimates the position of the knee 55 by testing the two possibilities, because the position of the knee 55 is to be referenced to a known position, in this case the position of the ankle 57; the first possibility is the estimation using the measurements of the first sensor 21 and the second possibility is the estimation using the measurements of the second sensor 22. Alternatively, the computing device 40 computes first and second estimated positions corresponding to the expected position of the sensor on the right lower leg 52 by using the measurements of each of the two sensors different from the sensor identified as the sensor arranged on the right upper leg 51 (which in this example are the first and second sensors 21, 22) for the respective first and second positions. For estimating each of these positions, a predetermined length value corresponding to an estimated or measured length of the right lower leg 52 is used, and in the case of the first and second positions a scale factor corresponding to the expected position of the sensor along the extension of the right lower leg 52 is also used. Then, the acceleration of the estimated position or the accelerations of the first and second estimated positions are computed.

    [0176] The computed acceleration or accelerations are compared with the acceleration measurements of said two other sensors, thereby providing first and second comparisons. These comparisons are used for identifying the first and second sensors 21, 22. In this example, the computing device determines that the first sensor 21 is arranged on the right lower leg 52 because the acceleration measurements of that sensor are similar to the computed acceleration (of the estimated position for the knee 55), and determines that the second sensor 22 is the only sensor that may have been arranged on the chest 53.

    [0177] Although the identification of the third sensor 23 as being the sensor arranged on the right upper leg 51 may not be attained until the movement is halfway performed or even completely performed by the person 50, all of the position estimation(s), acceleration computation(s) and comparison computation(s) may be carried out with past measurements, since the sensor measurements are stored in the memory of the computing device 40. Therefore, the computing device 40 is capable of completing the whole procedure once the movement has been performed by the person 50, i.e. in the first squat each of the first, second and third sensors 21-23 is identified. In some embodiments, however, the computing device 40 may only identify the sensor on the right upper leg 51 when the movement is first performed, and identify the remaining sensors when the movement is repeated and, thus, additional sensor measurements are provided, i.e. in the first squat the third sensor 23 is identified, and in the second or further squats the first and second sensors 21, 22 are identified.

    [0178] Also, in some embodiments, for the estimation of positions and subsequent computation of acceleration(s) as described with reference to FIGS. 4-6, a number of different measurements provided by the sensors are used for estimating the different positions so that the acceleration may be computed; for example, five, ten, twenty, or even more measurements of each sensor are used. Further, in some of these embodiments, said different measurements are taken when the movement performed by the person has reached or is about to reach an end position (such as the positions illustrated in FIGS. 2-6) so that the determination is made at or about completion of the movement. To this end, the computing device may compare the orientation measurements provided by one or more sensors with a predetermined movement limit threshold so as to derive whether the movement has reached or is about to reach such position. By way of example, if the knee-up movement is being performed, the predetermined movement limit threshold is e.g. 90 degrees, and the computing device compares the orientation of the right upper leg as measured by the sensor arranged thereon with said predetermined movement limit threshold to check if the leg has been raised as expected; when that occurs, the computing device takes the last e.g. five, ten, twenty, etc. measurements of the sensors to estimate the positions, compute the corresponding acceleration(s) and make the determination.
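    The threshold check and window selection of paragraph [0178] can be sketched as follows (a minimal illustration; the function name, the 90-degree default and the window size are assumptions taken from the example in the text):

```python
def measurements_for_determination(orientation_deg, samples, limit_deg=90.0, window=10):
    """Once the measured orientation reaches the predetermined movement limit
    threshold (e.g. 90 degrees for a knee-up), return the last `window`
    samples to use for position estimation and acceleration computation."""
    for i, angle in enumerate(orientation_deg):
        if angle >= limit_deg:
            return samples[max(0, i - window + 1):i + 1]
    return None  # the end position of the movement has not been reached yet
```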

    [0179] The comparison(s) described in relation to the process of FIGS. 4, 5 and 6 can be, for example but without limitation, any one of the following: the dot product between the two signals (i.e. the computed acceleration and the acceleration measurements of the corresponding sensor) in a given time window, the dot product in the frequency domain, the cross correlation between the two signals (evaluated at t=0), the mean squared difference, etc.
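    The comparison measures listed in paragraph [0179] can be sketched as follows. This is an illustrative implementation of those standard measures, not code from the patent; the function name and dispatch keys are assumptions.

```python
import numpy as np

def compare_signals(a, b, method="mse"):
    """Compare a computed acceleration signal `a` with measured accelerations
    `b` over the same time window, using one of the measures named in the text."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if method == "dot":
        return float(np.dot(a, b))                         # dot product in a time window
    if method == "dot_freq":
        return float(np.real(np.vdot(np.fft.rfft(a), np.fft.rfft(b))))  # dot product in the frequency domain
    if method == "xcorr0":
        return float(np.correlate(a, b, mode="valid")[0])  # cross-correlation evaluated at t=0
    if method == "mse":
        return float(np.mean((a - b) ** 2))                # mean squared difference
    raise ValueError(f"unknown method: {method}")
```

For the dot-product and cross-correlation measures a larger value indicates greater similarity, whereas for the mean squared difference a smaller value does; the decision logic of paragraph [0176] would pick the candidate sensor accordingly.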

    [0180] It is readily apparent that even though the examples described herein with reference to FIGS. 2-6 refer to sensors 21-23 placed on the chest 53, right upper leg 51 and right lower leg 52, and the exercises are squats or knee-ups, other sensor placements and movements are possible within the scope of the present disclosure. In this respect, the computing device is to have stored in a memory thereof on which body members the plurality of sensors 21-23 is to be placed, and which movements are to be performed, together with a predetermined threshold and/or data indicative of the motion involved by the body members while the movement is performed. In this way, the computing device is capable of determining the movement performed by the person 50 (i.e. by computing and comparing rotations, estimating position(s), computing the acceleration(s) associated therewith and comparing the computed acceleration(s) with the acceleration measurements in the manner described).

    [0181] The following table shows a non-exhaustive list of placements of the first and second sensors 21, 22 and movements within the scope of the present disclosure; each row corresponds to a configuration for arranging the first and second sensors 21, 22 and the possible movement(s) in such configuration:

    TABLE-US-00001

    First body member (first sensor) | Second body member (second sensor) | Possible movements
    Upper chest | Head | Any torso movement for which the hip position is constant (e.g. tilt, rotate, sit-ups)
    Upper chest | Upper arm | Any torso movement for which the hip position is constant (e.g. tilt, rotate, sit-ups)
    Lower chest | Upper chest | Any torso movement for which the hip position is constant (e.g. tilt, rotate, sit-ups)
    Upper arm | Lower arm | Any upper arm movement for which the shoulder position is constant (e.g. flexion, abduction, rotation)
    Lower arm | Hand | Any lower arm movement for which the elbow position is constant (e.g. flexing/extending the elbow, pronation/supination)
    Hand | Fingers | Any hand movement for which the wrist position is constant (e.g. flexion/extension, ulnar/radial deviation)
    Upper leg | Lower leg | Any upper leg movement for which the position of the hip is constant (e.g. hip flexion, hip abduction, donkey kicks)
    Lower leg | Foot | Flex/extend the knee while the knee position is constant
    Foot | Toes | Any foot movement for which the ankle position is constant (e.g. plantar flexion/dorsiflexion, inversion/eversion)
    Lower leg | Upper leg | Any lower leg movement for which the position of the foot is constant (e.g. squat, lunges)
    Lower arm | Upper arm | Any lower arm movement for which the position of the hand is constant (e.g. push-ups, sitting push-ups)
    Upper leg | Chest | Any chest movement for which the position of the knee is constant (e.g. knee push-ups)
    Upper arm | Chest | Any chest movement for which the position of the shoulder is constant (e.g. plank)
    Head | Chest | Any chest movement for which the head is still (e.g. headstand)

    [0182] In each of the above possible placements of the first and second sensors 21, 22, the third sensor 23 may not be arranged on the person, or may be arranged on any body member of the person, including a body member connected to the body member of the first sensor 21 or the second sensor 22 by a joint different from the joint connecting the body members of the first and second sensors 21, 22.

    [0183] The above configurations and movements are possible in respect of methods and systems according to any one of the first to the sixth aspects of the invention.

    [0184] Even though the terms first, second, third, etc. have been used herein to describe several parameters, variables or devices, it will be understood that the parameters, variables or devices should not be limited by these terms since the terms are only used to distinguish one parameter, variable or device from another. For example, the first sensor could as well be named second sensor, and the second sensor could be named first sensor without departing from the scope of this disclosure.

    [0185] In this text, the term “comprises” and its derivations (such as “comprising”, etc.) should not be understood in an excluding sense, that is, these terms should not be interpreted as excluding the possibility that what is described and defined may include further elements, steps, etc.

    [0186] On the other hand, the invention is obviously not limited to the specific embodiment(s) described herein, but also encompasses any variations that may be considered by any person skilled in the art (for example, as regards the choice of materials, dimensions, components, configuration, etc.), within the general scope of the invention as defined in the claims.