Joint Axis Direction Estimation
20220409097 · 2022-12-29
Inventors
CPC classification
A61B5/6801 (HUMAN NECESSITIES)
A61B2560/0223
A61B5/1121
A61B2562/0219
International classification
Abstract
A method for calibrating respective estimated joint axis directions for each of a pair of body mounted sensors, one of the pair of sensors being located to each side of the joint comprising a joint axis, the sensors each calculating a pitch angle about respective first sensor axes and a roll angle about respective second sensor axes, the first and second sensor axes together with a third sensor axis orthogonal to the first and second sensor axes forming a sensor frame, the method comprising: receiving orientation data for each of the two sensors, the orientation data being associated with at least two different poses of the joint for each of the two sensors and the orientation data comprising the pitch angle and the roll angle of the sensor for each pose; calculating a sensor frame estimated gravity vector for each pose associated with each sensor based on the pitch and roll angles for each pose associated with each sensor and a gravity vector running along a vertical direction; and determining the estimated joint axis directions for the joint axis, relative to the first and second sensor axes, for each sensor that minimise a loss function concerning projections of each sensor frame estimated gravity vector for each pose associated with each sensor on to the estimated joint axis direction for the respective sensor.
Claims
1. A method for calibrating estimated joint axis directions for each of a pair of sensors, one of the pair of sensors being mounted to each side of a joint comprising a joint axis, each sensor of the pair of sensors calculating a pitch angle about a first sensor axis and a roll angle about a second sensor axis, the first sensor axis and second sensor axis together with a third sensor axis orthogonal to the first sensor axis and the second sensor axis forming a sensor frame, the method comprising: receiving orientation data for each of the pair of sensors, the orientation data being associated with at least two different poses of the joint for each of the pair of sensors and the orientation data comprising the pitch angle and the roll angle of the sensor for each pose; calculating a sensor frame estimated gravity vector for each pose associated with each sensor based on the pitch and roll angles for each pose associated with each sensor and a gravity vector running along a vertical direction; and determining the estimated joint axis directions for the joint axis, relative to the first sensor axis and the second sensor axis, for each sensor that minimise a loss function concerning projections of each sensor frame estimated gravity vector for each pose associated with each sensor on to the estimated joint axis direction for the respective sensor.
2. The method according to claim 1, wherein the gravity vector runs along the vertical direction in a negative direction.
3. The method according to claim 1, wherein the gravity vector runs along the vertical direction in an upward direction.
4. The method according to claim 1, wherein calculating the sensor frame estimated gravity vector for each pose associated with each sensor comprises forming a rotation matrix for each pose associated with each sensor using the pitch angle and the roll angle for each pose associated with each sensor.
5. The method according to claim 4, wherein forming the rotation matrix for each pose associated with each sensor using the pitch and roll angles for each pose associated with each sensor comprises assuming the rotation about the third sensor axis is zero.
6. The method according to claim 4, wherein each rotation matrix defines the rotation of the respective sensor about the first sensor axis, the second sensor axis, and the third sensor axis.
7. The method according to claim 4, wherein calculating the sensor frame estimated gravity vector for each pose associated with each sensor comprises applying the rotation matrix for each pose associated with each sensor to the gravity vector to transform the direction in which the gravity vector acts to that of the respective pitch and roll angle of the sensor.
8. The method according to claim 1, wherein the sensor frame estimated gravity vectors are vectors defining the direction along which the gravity vector acts for the respective pitch and roll angles of the sensor.
9. The method according to claim 1, the method comprising: receiving a register pose signal which indicates the joint is in one of the poses of the at least two different poses; and in response to the register pose signal, storing the orientation data for each of the pair of sensors from when the register pose signal is received as the orientation data for that one of the poses of the at least two different poses.
10. The method according to claim 9, the method comprising repeating the steps of claim 9 for each pose of the at least two different poses.
11. The method according to claim 1, wherein the orientation data is associated with four different poses of the joint for each of the pair of sensors.
12. The method according to claim 1, wherein the poses are selected from, or are all of, a sitting extension pose, a sitting pose, a standing pose and a standing flexion pose.
13. The method according to claim 1, wherein the estimated joint axis directions are each three-dimensional vectors in a coordinate system for each sensor of the pair of sensors, the coordinate system being defined by the first sensor axis, the second sensor axis, and the third sensor axis for each sensor of the pair of sensors.
14. The method according to claim 1, wherein the projections of each sensor frame estimated gravity vector for each pose associated with each sensor on to the estimated joint axis direction for the respective sensor are calculated by taking a scalar product of the estimated joint axis direction for a particular sensor with the respective sensor frame estimated gravity vector.
15. The method according to claim 1, wherein the loss function combines each of the projections on to the estimated joint axis direction with the sensor frame estimated gravity vector for one sensor of the pair of sensors with the projection on to the estimated joint axis direction with the sensor frame estimated gravity vector for a remaining sensor of the pair of sensors.
16. The method according to claim 15, wherein the combination of the projections of the pair of sensors is taking a difference between the projections.
17. The method according to claim 15, wherein the loss function aggregates the combination of the projections for each pose.
18. The method according to claim 17, wherein the loss function aggregates the combination of the projections for each pose by summing together the combined projections.
19. The method according to claim 17, wherein the loss function aggregates a square of the combination of the projections for each pose.
20. The method according to claim 1, the method comprising calculating an angle of the joint about the joint axis using the estimated joint axis directions for the joint axis for each sensor and orientation data for each of the pair of sensors.
21. (canceled)
Description
[0017] The present invention will now be described by way of example with reference to the accompanying drawings. In the drawings:
[0018]
[0019]
[0020]
[0021]
[0022]
[0023]
[0024] The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art.
[0025] The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
[0026] The present invention relates to a method for calibrating respective estimated joint axis directions for each of a pair of body mounted sensors, one of the pair of sensors being located to each side of the joint comprising a joint axis, the sensors each calculating a pitch angle about respective first sensor axes and a roll angle about respective second sensor axes, the first and second sensor axes together with a third sensor axis orthogonal to the first and second sensor axes forming a sensor frame. The method comprises receiving orientation data for each of the two sensors, the orientation data being associated with at least two different poses of the joint for each of the two sensors and the orientation data comprising the pitch angle and the roll angle of the sensor for each pose. The method further comprises calculating a sensor frame estimated gravity vector for each pose associated with each sensor based on the pitch and roll angles for each pose associated with each sensor and a gravity vector running along a vertical direction. The method further comprises determining the estimated joint axis directions for the joint axis, relative to the first and second sensor axes, for each sensor that minimise a loss function concerning projections of each sensor frame estimated gravity vector for each pose associated with each sensor on to the estimated joint axis direction for the respective sensor. The sensor frame may be a coordinate system of the respective sensor, the coordinate system being defined by the three sensor axes of the respective sensor.
[0027]
[0028]
[0029] The orientation of any sensor associated with the knee typically has two components. A rotation of the sensor about the x-axis is a roll motion, identified by arrow 18, and defines a roll angle. A rotation of the sensor about the y-axis is a pitch motion, identified by arrow 19, and defines a pitch angle. Each sensor has its own sensor frame of reference which defines the rotational direction of the sensor about each of the axes.
[0030]
[0031] However, as will be appreciated, the shape and form of a human leg does not generally permit such alignment to be possible, so when the sensors 10a, 10b are in place as shown in
[0032]
[0033] The sensor device 10 comprises a processor 24 and a non-volatile memory 25. The sensor device 10 may comprise more than one processor 24 and more than one memory 25. The memory 25 stores a set of program instructions that are executable by the processor, and reference data such as look-up tables that can be referenced by the processor in response to those instructions. The processor 24 may be configured to operate in accordance with a computer program stored in non-transitory form on a machine-readable storage medium. The memory 25 may be the machine-readable storage medium. The computer program may store instructions for causing the processor to perform the method described herein.
[0034] The processor 24 may be connected to the wireless communication unit(s) 22 to permit communication between them. The processor 24 may use at least one wireless communication unit to send and/or receive data over a wireless communication network. For instance, the wireless communication unit(s) 22 may be:
[0035] A cellular communication unit configured to send and receive data over a cellular communication network. The cellular communication unit may be configured to communicate with cellular base stations to send and receive data.
[0036] A Wi-Fi communication unit configured to send and receive data over a wireless communication network such as a Wi-Fi network. The Wi-Fi communication unit may be configured to communicate with wireless base stations to send and receive data.
[0037] A Bluetooth communication unit configured to send and receive data over a Bluetooth communication network. The Bluetooth communication unit may be configured to communicate with other Bluetooth devices to send and receive data.
[0038] It will be appreciated that the wireless communication unit(s) may be configured to communicate using other wireless protocols.
[0039] One or more of the wireless communication units 22 may be part of processor 24. Part or all of a wireless communication unit's function may be implemented by processor 24 running software to process signals received by an antenna 23.
[0040] The sensor device 10 may use the wireless communication unit(s) 22 to communicate between the sensor device 10 and another sensor device 10 to share information concerning the sensed rotation of the sensor device 10 with the other sensor device 10. In this case, the sensor devices 10 may make use of a short range communication protocol such as Bluetooth or Zigbee. The sensor device 10 may also communicate with another form of device such as a smartphone. This communication may be used to share data concerning the knee angle estimates over time either in the form of aggregated data collected over time or by streaming live knee angle estimates to the smartphone as they are calculated. In this case, the sensor device 10 may use a short-range communication protocol such as Bluetooth or Zigbee if the other device is located nearby, or a longer-range communication protocol such as Wi-Fi or even cellular-based communications.
[0041] The sensor device 10 may comprise a power source 29 such as a battery. The sensor device 10 may accept an external power supply to enable the power source 29 to be charged. The sensor device 10 may be wirelessly chargeable. The sensor device 10 may also comprise a display. The sensor device 10 may be configured to display information on the display. The sensor device 10 may also comprise a user interface. The user interface may be configured to permit a user of the device to interact with the sensor device 10. The user interface may at least in part be formed as part of the display. For instance, the display may be a touch screen and display buttons and other interactive features of the display that the user can interact with by touching the touch screen.
[0042] The sensor device 10 comprises at least one movement sensor. The processor 24 is connected to the movement sensors to permit communication between them. The processor 24 can receive movement data from the movement sensors. The movement sensors may comprise at least one accelerometer, a magnetometer, and/or a gyroscope. The processor 24 can use the movement data from the movement sensors to derive information about the current movement and, in particular, the current orientation of the sensor device 10. Advantageously, the sensor device 10 comprises a triaxial accelerometer and a gyroscope. Some or all of the movement sensors may be packaged together in an inertial measurement unit (IMU).
[0043] The sensor device 10 comprises at least one accelerometer 26. The accelerometer may measure the rate of acceleration of the device 10 in a given direction. The accelerometer may output time series data of acceleration readings in the direction in which the accelerometer 26 gathers data. The device 10 may comprise more than one accelerometer 26. The accelerometers 26 may be orientated so as to gather acceleration readings in different directions. The accelerometers 26 may gather acceleration readings in orthogonal directions. The device may comprise three accelerometers 26, each gathering acceleration readings in different, orthogonal directions; thus the device may comprise a triaxial accelerometer. The processor 24 can receive the time series data of acceleration readings from the at least one accelerometer.
[0044] The sensor device 10 may comprise a magnetometer 27. The magnetometer may calculate the orientation of the device 10 relative to the local magnetic field of the Earth. This can be used to derive data concerning the movement of the device 10 relative to the surface of the Earth. The magnetometer may be a Hall effect sensor that detects the magnetic field of the Earth. The magnetometer 27 may output time series data of rotational movement readings relative to the magnetic field of the Earth. The processor 24 can receive the time series data of rotational movement readings.
[0045] The sensor device 10 comprises a gyroscope 28. The gyroscope 28 may be a MEMS gyroscope. The gyroscope 28 may calculate the rate of rotation about a rotation axis about which the device 10 is being moved. The gyroscope 28 may output time series data of rotation rate readings about the rotation axis for which the gyroscope 28 gathers data. The time series data of rotation rate readings may be rotation rate data. The gyroscope 28 may gather data about the rate of rotation about more than one rotation axis about which the device 10 is being moved. The gyroscope 28 may calculate the rate of rotation about two rotation axes. The gyroscope 28 may calculate the rate of rotation about three rotation axes; thus, the gyroscope 28 may be a triaxial gyroscope. The rotation axes may be orthogonal to each other. The gyroscope 28 may output time series data of rotation rate readings about each rotation axis for which the gyroscope 28 gathers data. The time series comprise rotation reading(s) at each time step in the time series. The processor 24 can receive the time series data of rotation rate readings about one or more axes.
[0046] As discussed herein, two sensor devices 10a, 10b are used to calculate an estimate of knee angle at each time step. One sensor device 10a acts as a master sensor device and one sensor device 10b acts as a slave sensor device. The slave sensor device 10b sends orientation data for each time step to the master sensor device 10a. The master sensor device 10a then processes the slave's orientation data together with its own orientation data to estimate the knee angle at each time step. The orientation data is sent from the slave to the master using the wireless communication units 22 present in each sensor device 10a, 10b.
[0047] The orientation data comprises a pitch angle and a roll angle that has been sensed by the sensor device 10 at a particular time step. The pitch angle sensed by the sensor device 10 is about a first sensor axis. This first sensor axis may be known as a pitch axis. The roll angle sensed by the sensor device 10 is about a second sensor axis. This second sensor axis may be known as a roll axis. Each sensor device 10 senses its orientation (and thus rotation) about their own respective first and second sensor axes. The first and second sensor axes are orthogonal to each other. There is a third sensor axis about which the sensor 10 can move. This sensor axis is orthogonal to the first and second sensor axes. In the case of a knee joint, the sensors may be attached to the body about the joint so that the third sensor axis runs in a generally vertical direction when the user is standing up with the leg fully extended. Thus, in this position the third sensor axis may run generally along the third global coordinate frame axis. The sensors may be attached to the body so that the first sensor axis runs generally parallel to the joint axis, however as described herein there may be some difference between the joint axis and the sensor axis which needs to be corrected for. The sensors may be attached to the body so that the second sensor axis points in a forward direction and runs perpendicular to the first sensor axis. The three sensor axes define the sensor frame of reference.
[0048] The pitch and roll angles are derived from the data calculated by the movement sensors. For instance, data from the accelerometers and the gyroscope may be combined to give the current pitch and roll angles of the sensor device 10. The sensor device 10 may use current and historic data from the movement sensors to derive the current pitch and roll angles for the sensor device 10. The pitch and roll angles may be calculated using any conventional method. By way of example, one such method is described in “Estimation of IMU and MARG orientation using a gradient descent algorithm”, S. Madgwick et al, 2011 IEEE International Conference on Rehabilitation Robotics, Rehab Week Zurich Science City, Switzerland, Jun. 29-Jul. 1, 2011.
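The cited Madgwick filter fuses accelerometer and gyroscope (and optionally magnetometer) data. As a much simpler illustration of the same sensor-fusion idea, the sketch below derives pitch and roll from a static accelerometer reading and blends them with integrated gyroscope rates using a complementary filter. This is an illustrative stand-in, not the method of the patent; the axis convention (pitch about the y-axis, roll about the x-axis) and the blending weight `alpha` are assumptions.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static accelerometer
    reading (ax, ay, az), assuming the only specific force is gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def complementary_filter(prev_pitch, prev_roll, gyro_y, gyro_x,
                         ax, ay, az, dt, alpha=0.98):
    """Fuse gyroscope rates with accelerometer-derived angles.

    alpha weights the integrated gyro estimate; (1 - alpha) pulls the
    estimate toward the accelerometer-derived angles, correcting drift.
    """
    acc_pitch, acc_roll = pitch_roll_from_accel(ax, ay, az)
    pitch = alpha * (prev_pitch + gyro_y * dt) + (1 - alpha) * acc_pitch
    roll = alpha * (prev_roll + gyro_x * dt) + (1 - alpha) * acc_roll
    return pitch, roll
```

As with the Madgwick filter, the blended estimate depends less on the user holding still (as a pure accelerometer estimate would) and does not drift over time (as a pure gyroscope integration would).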
[0049] A method for calculating an estimated direction of the joint axis in the sensor measurement frames will now be described with reference to
[0050] As shown at step 30, the sensors are attached to the body of a user about a joint. The joint has a joint axis and one of the pair of sensors is located to each side of the joint. The sensors are switched on and paired together so that one of the sensors 10b sends its orientation data to the other sensor 10a. As discussed herein, the sensor that sends data to the other sensor is a slave sensor 10b and the sensor that receives data from the other sensor is a master sensor 10a.
[0051] As shown at step 31, the user is instructed to orient the leg that has the sensors attached to it into one of the poses of at least two different poses. As shown at step 32, the master sensor receives orientation data for the pose from both itself and the slave sensor. In the case of the master sensor, it may receive the orientation data from a separate process running on the processor 24 which takes the movement sensor raw data and processes it to get the orientation data for that pose. As shown at step 33, the user sends a signal to the master sensor to indicate that the leg has been oriented in one of the poses of at least two different poses, and in response to this signal the master sensor stores the orientation data as being associated with that particular pose. This signal may be sent to the master sensor by pressing a button on the master sensor. Alternatively, the master sensor may be in communication with another device, such as a smartphone, and the signal is sent from the other device to the master sensor. The user may have pressed a button on the other device to cause the signal to be sent to the master sensor. The process of steps 31 to 33 is repeated until the orientation data from both sensors for each of the poses has been received by the master sensor.
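Steps 31 to 33 can be sketched as a small registration loop on the master device. The class and method names below are illustrative, not from the patent: the registry buffers the most recent orientation pair and commits it to storage when a register-pose signal arrives.

```python
class PoseRegistry:
    """Minimal sketch of steps 31-33: store the latest master/slave
    orientation pair whenever a register-pose signal is received."""

    def __init__(self, n_poses):
        self.n_poses = n_poses   # number of poses required (e.g. 4)
        self.poses = []          # committed (master, slave) orientation pairs
        self.latest = None       # most recent (master, slave) orientation pair

    def on_orientation(self, master_orient, slave_orient):
        # Called each time step with (pitch, roll) tuples from both sensors.
        self.latest = (master_orient, slave_orient)

    def on_register_signal(self):
        # Called when the user presses the button for the current pose.
        if self.latest is None:
            raise RuntimeError("no orientation data received yet")
        self.poses.append(self.latest)

    def complete(self):
        return len(self.poses) >= self.n_poses
```

Once `complete()` returns True, the stored pose data can be passed on to the gravity-vector and loss-function calculations described below.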
[0052] The poses that the user is instructed to put the leg in are used to orient the sensors in different directions to enable an estimate of the joint axis, as seen by each sensor, to be determined. The poses are chosen so that each pose gives some different information about the rotation of the sensors relative to the joint axis. For instance, the sensors are placed in the same or different rotational positions relative to each other so that the rotation axis of the joint runs in particular directions relative to the sensor position in a given pose.
[0053] Advantageously, the user is instructed to orient the leg in four poses. These poses are shown in
[0054] Once the orientation data for each pose has been received from each of the two sensors, the orientation data can be processed to form estimated gravity vectors in the sensor frames for the poses. This is as shown in step 34. A sensor frame estimated gravity vector gives the orientation of the sensor relative to the gravity vector that would be recorded by the sensor based on the current rotation of the sensor about the roll and pitch axes assuming that gravity acts along a vertical direction (i.e. along the third global coordinate frame axis). Thus, the sensor frame estimated gravity vector may be based on the pitch and roll angles and a gravity vector running along a vertical direction. The sensor frame estimated gravity vectors are three dimensional vectors.
[0055] As described herein, the orientation data for each pose from each sensor comprises a pitch angle and a roll angle. These describe the orientation of each sensor whilst the leg is in a particular pose. A rotation matrix is formed for each associated pitch angle and roll angle, i.e. there is one rotation matrix formed for the pitch angle and roll angle recorded for a particular pose by one of the sensors. Therefore, a rotation matrix is formed for each pair of pitch and roll angles associated with a respective pose for a respective sensor. The rotation matrix defines the rotation of the sensor about the three sensor axes. As only the roll and pitch measurements are important for calculating the knee joint angle, it is assumed that there is no rotation about the third sensor axis. An example rotation matrix for a pitch angle of θ and a roll angle of α about the first and second sensor axes respectively is:

R.sub.i=[cos θ, sin θ sin α, sin θ cos α; 0, cos α, −sin α; −sin θ, cos θ sin α, cos θ cos α]

[0056] where R.sub.i is the sensor i rotation matrix formed from the pitch and roll angles for a particular pose, i denotes either the first or second sensor device, θ is the pitch angle, and α is the roll angle.
[0057] The rotation matrices are used to form the sensor frame estimated gravity vectors. This uses the assumption that gravity acts in a vertical direction and thus along the third global coordinate frame axis. In the example given herein, the third global coordinate frame axis is assumed to point towards the ground meaning that the acceleration due to gravity acts in an upward (negative) direction. The rotation matrices act upon the gravity vector to produce the sensor frame estimated gravity vectors. The rotation matrices rotate the gravity vector using the roll and pitch angles to calculate the direction in which gravity acts on the sensor whilst in that orientation defined by the roll and pitch angles. The sensor frame estimated gravity vectors may be calculated by:
a.sub.i=R.sub.i.sup.T g
[0058] where a.sub.i is the sensor frame estimated gravity vector for sensor i based on particular roll and pitch angles, R.sub.i.sup.T is the transpose of the sensor i rotation matrix and g is the gravity vector g=[0, 0, −9.81].sup.T. i is either the master (M) or slave (S) sensor device to which the roll and pitch angles relate.
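The two calculations above can be sketched as follows, assuming NumPy and a rotation order of R.sub.i = Ry(pitch)·Rx(roll) with the third (yaw) angle fixed at zero. The multiplication order is an illustrative convention; the description only states that rotation about the third sensor axis is taken to be zero.

```python
import numpy as np

def rotation_matrix(pitch, roll):
    """Sensor rotation matrix R_i from pitch (about the first sensor
    axis) and roll (about the second sensor axis), assuming zero
    rotation about the third sensor axis."""
    ct, st = np.cos(pitch), np.sin(pitch)
    ca, sa = np.cos(roll), np.sin(roll)
    Ry = np.array([[ct, 0.0, st],
                   [0.0, 1.0, 0.0],
                   [-st, 0.0, ct]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, ca, -sa],
                   [0.0, sa, ca]])
    return Ry @ Rx

def sensor_frame_gravity(pitch, roll, g=np.array([0.0, 0.0, -9.81])):
    """a_i = R_i^T g: rotate the global-frame gravity vector into the
    sensor frame for the pose defined by the pitch and roll angles."""
    return rotation_matrix(pitch, roll).T @ g
```

Because R.sub.i is a rotation matrix, the sensor frame estimated gravity vector always has the same magnitude as g; only its direction in the sensor frame changes with the pose.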
[0059] The use of the rotation matrices to transform the gravity vector to produce the sensor frame estimated gravity vectors is advantageous because it only depends on the pitch and roll angles. These have been derived from the motion sensors inside the sensor devices and so are based on more data than an accelerometer alone can provide, and so should provide a more accurate value for the pitch and roll angles. It then follows that the sensor frame estimated gravity vectors should also be more accurate than if accelerometer readings were used directly. This compound calculation to produce the pitch and roll angles also means that there is less dependency on the user being static at each pose than if the accelerometer outputs were used directly. It likewise means that there is less dependence on the user being able to move quickly than if a gyroscope output was used directly. In addition, by converting the pitch and roll angles to rotations and then to sensor frame estimated gravity vectors, less data needs to be input into the loss function, making it more efficient to calculate. This thus provides advantages over the method described in “On motions that allow for identification of hinge joint axes from kinematic constraints and 6D IMU data”, Danny Nowka et al, available at https://www.control.tu-berlin.de/wiki/images/b/b3/Nowka2019_ECC.pdf.
[0060] As shown in step 35, estimated joint axis directions relative to the first and second sensor axes for each sensor device 10 are determined. The estimated joint axis directions are each three-dimensional vectors in the coordinate system of the respective sensor device. The coordinate system of the sensor device is defined by the three sensor axes. The estimated joint axis directions are determined by finding the joint axis directions that minimise a loss function concerning the projections of each sensor frame estimated gravity vector for each pose associated with each sensor on to the estimated joint axis direction for the respective sensor. The sensor frame estimated gravity vectors are projected on to the estimated joint axis direction. This projection may involve taking the scalar product of the estimated joint axis direction for a particular sensor with the sensor frame estimated gravity vector for a particular sensor associated with a particular pose. The loss function may combine the projection on to the estimated joint axis direction with the sensor frame estimated gravity vector for one sensor with the projection on to the estimated joint axis direction with the sensor frame estimated gravity vector for the other sensor. The combination of the projections of the two sensors may be the difference between the two projections.
[0061] The loss function may combine together the combined projections for each pose. In other words the loss function aggregates the combined projections for each pose. This combination may be the sum of the combined projections. The combination of the combined projections for each pose may involve combining the square of the combined projections for each pose together. The square of each of the combined projections may be summed together. Instead of the square of the combined projections, the loss function may take the magnitude of the combined projections.
[0062] The loss function may be calculated by the equation:

L=Σ.sub.k=1.sup.N(j.sub.M·a.sub.M.sup.k−j.sub.S·a.sub.S.sup.k).sup.2

[0063] where L is the loss function, j.sub.i is the estimated joint axis direction for sensor i, a.sub.i.sup.k is the sensor frame estimated gravity vector for sensor i in pose k, k indexes the poses for which orientation data has been recorded, and N is the number of poses. In the advantageous example described herein, the number of poses may be four.
[0064] The estimated joint axis directions for each sensor that provide the minimum of the loss function may be determined by any relevant method. For instance, an iterative approach may be used to approach the minimum value for the loss function whilst varying the direction of the two estimated joint axes.
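A minimal sketch of such an iterative minimisation is given below, assuming NumPy and SciPy are available. Each unit-length joint axis is parameterised by two spherical angles so the optimiser runs in an unconstrained four-dimensional space; this parameterisation, the particular loss form (squared differences of projections, summed over poses, consistent with claims 14 to 19), and the use of `scipy.optimize.minimize` are illustrative choices, not mandated by the description.

```python
import numpy as np
from scipy.optimize import minimize

def axis_from_spherical(phi, theta):
    """Unit joint-axis direction parameterised by two angles."""
    return np.array([np.cos(phi) * np.cos(theta),
                     np.cos(phi) * np.sin(theta),
                     np.sin(phi)])

def loss(params, a_M, a_S):
    """Sum over poses of the squared difference between the projections
    of the two sensors' estimated gravity vectors onto their respective
    estimated joint axes."""
    j_M = axis_from_spherical(params[0], params[1])
    j_S = axis_from_spherical(params[2], params[3])
    return np.sum((a_M @ j_M - a_S @ j_S) ** 2)

def calibrate(a_M, a_S, x0=None):
    """a_M, a_S: (N, 3) arrays of sensor frame estimated gravity
    vectors for the N poses, one row per pose, one array per sensor.
    Returns the estimated joint axis direction for each sensor."""
    if x0 is None:
        x0 = np.array([0.1, 0.2, 0.3, 0.4])  # arbitrary starting guess
    res = minimize(loss, x0, args=(a_M, a_S))
    return (axis_from_spherical(res.x[0], res.x[1]),
            axis_from_spherical(res.x[2], res.x[3]))
```

Any other descent scheme (e.g. hand-rolled gradient descent) would serve equally well; the essential point is that the two axis directions are varied jointly until the projections agree across all recorded poses.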
[0065] As shown at step 36, once the estimated joint axis directions for each sensor have been determined, the master sensor device 10a can use these estimated joint axis directions to calibrate the calculations associated with the joint angle. The master sensor device receives orientation data from the slave device and also from its own orientation processing section. The roll and pitch angles comprised in the orientation data for each time step can be transformed based on the estimated joint axis directions to determine the rotation of each of the two sensors about the estimated joint axis direction. The master device can then take the difference between the angle of one of the sensor devices relative to the other to determine the current knee joint angle about the joint axis. Corrections to the calculated knee joint angle may be made to account for misplacement of the sensors.
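One illustrative way to carry out this final step is a small-angle approximation in which each sensor's rotation is treated as the vector [pitch, roll, 0] in its own frame and projected onto that sensor's estimated joint axis. The patent does not fix the exact transform, so this linearised projection is an assumption for illustration only.

```python
import numpy as np

def rotation_about_axis(pitch, roll, j):
    """Project the sensor's (pitch, roll, 0) small-angle rotation
    vector onto the estimated joint axis j (a unit 3-vector).
    Valid only as a linearised approximation for modest angles."""
    return np.array([pitch, roll, 0.0]) @ j

def knee_angle(pitch_M, roll_M, pitch_S, roll_S, j_M, j_S):
    """Joint angle estimate as the difference between each sensor's
    rotation about its own estimated joint axis direction."""
    return (rotation_about_axis(pitch_M, roll_M, j_M)
            - rotation_about_axis(pitch_S, roll_S, j_S))
```

When both axes coincide with the first sensor axis and the rolls are zero, this reduces to the simple pitch difference between the two sensors, which is the uncalibrated estimate the method improves upon.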
[0066] The above method therefore provides the advantage of providing a correction method to make the calculation of the knee joint angle, or other joint angle, more accurate. This can improve the accuracy of the data gathered by these devices and thus permit better analysis of the movement of the joint.
[0067] The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that aspects of the present invention may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.