Providing visual motion feedback based on sensor data

09827478 · 2017-11-28

Abstract

The present disclosure is directed to a body-worn sensor-based system for evaluating the biomechanics and the motor adaptation characteristics of postural control during a sport activity such as a golf swing. Various embodiments use sensors such as accelerometers, gyroscopes, and magnetometers to measure the three-dimensional motion of ankle and hip joints. In several embodiments, additional sensors attached to other body segments are used to improve the accuracy of the data or detect particular instants during the swing (e.g., top of back swing, instant of the maximum speed of arm, and instant of ball impact). In a golf embodiment, the system combines the measured data in conjunction with a biomechanical model of the human body to: (1) estimate the two-dimensional sway of the golfer's center of mass; (2) quantify and evaluate the golfer's balance via his/her postural compensatory strategy; and (3) provide visual feedback to the golfer for improving dynamic postural control.

Claims

1. A method for providing data for use in displaying visual feedback to a person engaged in a physical activity, comprising: electronically receiving with one or more computing devices data collected from one or more sensors supported by a body of a person; determining, with the one or more computing devices, one or more center of mass positions of the person based on the data collected from the one or more sensors, the data reflecting movement and position of the person while performing the activity; generating first visual display data corresponding to a first visual indication reflecting said one or more determined center of mass positions; generating second visual display data corresponding to a second visual indication reflecting a target center of mass position; and providing the first and second visual display data for use in displaying the first and second visual indications to the person, wherein the first and second visual indications enable the person to determine a difference between (1) at least said one or more center of mass positions of the person while performing the activity reflected by the first visual indication and (2) the target center of mass position reflected by the second visual indication, and wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

2. The method of claim 1, wherein the one or more sensors are worn by the person.

3. The method of claim 1, wherein the one or more sensors are in a device that is carried by the person or embedded into clothing of the person.

4. The method of claim 1, wherein the activity is training.

5. The method of claim 1, wherein said target center of mass position that has been determined to contribute to a desired performance of the activity is determined based on data collected from the sensors when the person is in a standing position and not performing the activity.

6. A system for providing data for use in displaying visual feedback relating to a physical activity, the system comprising: a computing system configured to receive sensor data from one or more sensors supported by a body of a person, the data reflecting movement and position of the person during the activity; wherein the computing system is further configured to: determine one or more center of mass positions of the person based on the data collected from said one or more sensors; generate visual display data corresponding to a visual feedback about the activity, the visual feedback reflecting a difference between said one or more determined center of mass positions and a target center of mass position; and provide the visual display data for use in displaying the visual feedback to the person, and wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

7. The system of claim 6, wherein said target center of mass position that has been determined to contribute to a desired performance of the activity is determined based on data collected from said one or more sensors when the person is in a standing position and not engaged in the activity.

8. The system of claim 6, wherein said one or more sensors comprise sensors embedded in a mobile device carried by the person.

9. A method for providing data for use in presenting feedback relating to a physical activity, the method comprising: electronically receiving with one or more computing devices data collected from one or more sensors supported by a body of a person engaging in an activity; determining, with the one or more computing devices, one or more center of mass positions of the person based on the data collected from the one or more sensors, the data reflecting movement and position of the person while performing the activity; generating feedback data corresponding to feedback indicating a difference between said one or more determined center of mass positions and a target center of mass position; and providing the feedback data for use in presenting the feedback, wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

10. The method of claim 9, wherein the target center of mass position is determined based on sensor data received when the person is in a pre-defined position.

11. The method of claim 9, wherein the activity is training.

12. The method of claim 9, wherein the activity is a sport.

13. The method of claim 10, wherein said pre-defined position is specified by a user input.

14. The method of claim 10, wherein said pre-defined position is a standing position and the person is not engaged in said activity.

15. A method for providing data for use in displaying visual feedback relating to a physical activity, comprising: electronically receiving with one or more computing devices data collected from one or more sensors supported by a body of a person; prompting for positioning of a body segment of the person in a pre-defined static position; determining, with the one or more computing devices, a target body segment position based on sensor data received when the body segment of the person is in the pre-defined static position such that the target body segment position is calibrated for the particular person based on the pre-defined static position; determining, with the one or more computing devices, one or more body segment positions of the person based on the data collected from the one or more sensors, the data reflecting movement and position of the person while performing said activity; generating first visual display data corresponding to a first visual indication reflecting said one or more determined body segment positions; generating second visual display data corresponding to a second visual indication reflecting the target body segment position; and providing the first and second visual display data for use in displaying the first and second visual indications to the person, wherein the first and second visual indications enable the person to determine a difference between (1) at least said one or more body segment positions of the person reflected by the first visual indication and (2) the target body segment position reflected by the second visual indication, and wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

16. The method of claim 15, wherein the activity is training.

17. The method of claim 15, wherein said target body segment position is determined based on sensor data received when the body segment of the person is in the pre-defined static position while not performing the activity.

18. A system for providing data for use in displaying visual feedback relating to a physical activity, the system comprising: a computing system configured to receive sensor data from one or more sensors supported by a body of a person, the data reflecting movement and position of the person during said activity; wherein the computing system is further configured to: prompt positioning of a body segment of the person in a pre-defined static position; determine a target body segment position based on sensor data received when the body segment of the person is in the pre-defined static position such that the target body segment position is calibrated for the particular person based on the pre-defined static position; determine one or more body segment positions of the person based on the data collected from said one or more sensors; generate visual display data corresponding to a visual feedback about the activity, the visual feedback reflecting a difference between said one or more determined body segment positions and the target body segment position; and provide the visual display data for use in displaying the visual feedback to the person, wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

19. The system of claim 18, wherein said target body segment position is determined based on sensor data received when the body segment of the person is in the pre-defined static position while not performing the activity.

20. The system of claim 18, wherein said one or more sensors include sensors embedded in a mobile device carried by the person.

21. A method for providing data for use in presenting feedback relating to a physical activity, the method comprising: electronically receiving with one or more computing devices data collected from one or more sensors supported by a body of a person engaged in the activity; prompting for positioning of a body segment of the person in a pre-defined static position; determining, with the one or more computing devices, a target body segment position based on sensor data received when the body segment of the person is in the pre-defined static position such that the target body segment position is calibrated for the particular person based on the pre-defined static position; determining, with the one or more computing devices, one or more body segment positions of the person based on the data collected from the one or more sensors, the data reflecting movement and position of the person while performing the activity; generating feedback data corresponding to feedback indicating a difference between said one or more determined body segment positions and the target body segment position; and providing the feedback data for use in presenting the feedback, wherein said one or more sensors comprise at least one of: an accelerometer, gyroscope, and magnetometer.

22. The method of claim 21, wherein the target body segment position is determined based on sensor data received when the body segment of the person is in the pre-defined static position while not performing the activity.

23. The method of claim 21, wherein the activity is training.

24. The method of claim 21, wherein the activity is a sport.

25. The method of claim 21, wherein said pre-defined static position is specified by a user input.

26. The method of claim 21, wherein said pre-defined static position is a standing position and the person is not engaged in said activity.

27. The method of claim 21, wherein one of said one or more determined body segment positions is determined when said body segment of the person has reached maximum speed during the activity.

28. The method of claim 15, wherein said pre-defined static position corresponds to an initial target associated with the physical activity.

29. The method of claim 15, wherein said pre-defined static position corresponds to an address position or a standing position.

30. The system of claim 18, wherein said pre-defined static position corresponds to an initial target associated with the physical activity.

31. The system of claim 18, wherein said pre-defined static position corresponds to an address position or a standing position.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Certain embodiments of the invention will now be described with reference to the drawings summarized below. These drawings and the associated description are provided to illustrate various embodiments of the invention, and not to limit the scope of the invention.

(2) FIG. 1 is a diagram that illustrates the basic axes angles measured by the sensors according to one embodiment.

(3) FIG. 2 is a diagram that depicts an example two-link and three-link model for estimating COM trajectory according to one embodiment.

(4) FIG. 3 is a flow diagram depicting a process for estimating the position of COM based on a single-link model in accordance with one embodiment.

(5) FIG. 4 illustrates an example estimation of COM based on a two-link model and a three-link model in accordance with one embodiment.

(6) FIG. 5 shows a graph that depicts the pattern of speed estimated from integration of acceleration in a fixed frame in accordance with one embodiment.

(7) FIG. 6 shows an example user interface of a program that provides visual feedback on a golf swing in accordance with one embodiment.

(8) FIG. 7 shows an example user interface of a program that helps users identify a proper swing motion in accordance with one embodiment.

(9) FIG. 8 illustrates an example COM-based feedback provided to a user to help improve the user's swing motion in accordance with one embodiment.

(10) FIG. 9 illustrates a physical setup of the sensors in accordance with one embodiment.

(11) FIG. 10 shows two graphs depicting how a COM area is determined and how a non-linearity metric is determined according to one embodiment.

(12) FIGS. 11A and 11B are graphs depicting the COM area of motion for various example swing motions.

(13) FIG. 12 is a graph depicting the norm of COM in the course of several example swing trials.

(14) FIG. 13 is a block diagram illustrating the postural and balance evaluation system in accordance with one embodiment.

(15) FIG. 14 is a flow diagram illustrating a postural and balance evaluation method in accordance with one embodiment.

DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS

(16) A sensor-based system and method for evaluating and enhancing postural control and balance in sports activities will now be described with reference to the drawings. Where possible, the same reference numbers are used throughout the drawings to refer to the same or like components. This description is intended to illustrate certain embodiments, but other embodiments that are apparent to those of ordinary skill in the art, including embodiments which do not provide all of the benefits and features set forth herein, are also within the scope of this invention. Thus, nothing in this detailed description is intended to suggest that any particular feature or component is essential. The invention is defined by the claims.

1. Overview

(17) Embodiments of the present invention relate to sensor-based systems and methods for evaluating and enhancing postural control and balance in sports activities (e.g., golf, tennis, baseball, hockey, etc.). From the postural-control standpoint, with respect to golf for example, it has been suggested that the ground reaction force (GRF), the position of the center of mass (COM), and the fluctuation of foot pressure could be used to quantify postural control during the swing. Although measuring GRF has been described in several publications, very few studies have explored the variation of the COM position in a sport activity such as a golf swing. On the other hand, COM position is generally a key parameter in characterizing appropriate postural control. At this point, the state of the art does not provide a solution on how to:

(18) 1. Accurately measure the movement of a person's COM during a swing in real-life conditions and/or in real-time using a low-cost and easy-to-use system that minimally interferes with the sport activity itself. In-the-field measurements of COM are currently difficult to implement. Further, while some techniques can accurately measure the COM position during quiet standing, they have significant inaccuracies in measuring COM position during a sport motion (e.g., a swing motion).

(19) 2. Objectively and reliably separate a sport activity or motion into distinct temporal phases (e.g., for golf: the back swing, the down swing, the top of the back swing, the time of contact with the ball, the time of maximum speed of the COM, and the location at which the arm has its maximum speed in reference to COM).

(20) 3. Use the information gained from the above steps to help a person improve his or her postural control to have a better motion such as a better golf swing.

(21) Until recently, measurement of body motion under real-life conditions has been difficult and impractical. Traditionally, balance has been assessed using laboratory-based systems such as optical motion measurement systems and force platforms in a gait laboratory. While these systems are clinically accepted as “the gold standard,” several drawbacks render them unsuitable for clinical and sports applications. Specifically, currently-available motion-analysis systems are time-consuming and expensive, and they require an existing infrastructure and highly-trained personnel.

(22) The recent emergence of body-worn sensor technology has provided new ways to measure and assess motor functions, including balance and physical activity. In particular, the combination of accelerometers with angular-rate sensors (gyroscopes) holds great promise for hybrid kinematic sensor modules that measure the three-dimensional kinematics of body segments. Three key advantages render body-worn sensors ideal tools for developing clinical and sports-training applications: (1) they are inexpensive; (2) they are light-weight and portable; and (3) they do not require a specific environment or installation of any particular infrastructure. Body-worn sensors consequently have tremendous potential for enabling physicians/trainers to evaluate postural control under real conditions by quantifying its subcomponents, such as body-segment kinematics data, kinetic data, and (feedforward) motor adaptation.

2. Posture Control and Balance Evaluation System Overview

(23) Embodiments of the invention are directed to an innovative, portable, and cost-effective body-worn sensor-based system to evaluate the biomechanics and the motor adaptation characteristics of postural control during a sport activity such as a golf swing. Various embodiments use the above described recently available sensor technology such as accelerometers, gyroscopes, and magnetometers to measure the three-dimensional motion of ankle and hip joints. In one embodiment, the sensors may be embedded in a device such as a smartphone carried by the person or embedded into the clothing of the person. In several embodiments, additional sensors could be attached to other body segments to improve the accuracy or to measure particular instants during a golf swing, such as the top of back swing, the instant of the maximum speed of arm, and the instant of impacting the ball. In one embodiment applied to golf, the system combines the measured data in conjunction with a biomechanical model of the human body to:

(24) 1. Estimate the two-dimensional sway (i.e., movements back and forth, or sideways) of the golfer's COM. This is a by-product of finding the position of the COM over time using the n-link models (n = 1, 2, 3, …).

(25) 2. Quantify and evaluate the golfer's overall balance via his/her postural compensatory strategy (i.e., how the movement of the upper limbs compensates for the movement of lower limbs in an attempt to optimize postural control during the swing and to maximize the power of the swing).

(26) 3. Provide feedback to the golfer for improving his or her dynamic postural control, using an interface (e.g., a visual interface; see, e.g., FIG. 6). In one embodiment, the system uses the golfer's estimated COM position at specified time points during the swing phase. Examples of chosen time points are: the instant when the hand velocity during the downswing is at its maximum; the top of the down-swing; and the time of impact with the ball.

(27) FIG. 14 is a flow diagram illustrating a postural and balance evaluation method 1400 in accordance with one embodiment. At block 1402, sensor data is received. The sensor data may include data collected from sensor(s) worn by the person engaged in a sport activity. At block 1404, the system may determine postural data including COM position(s) of the person engaging in the sport activity. The term “determine” as used herein should be broadly interpreted to mean any method of determining the COM positions, including deriving the actual COM positions as well as estimating the COM positions. As will be further described, the COM positions may be determined based on a single-link model (treating the body as a single segment) or a multi-link model (separating the body into multiple segments with postural/movement/position data associated with each). In one embodiment, a multi-link model is used to account for movement around certain body joints such as hips, ankles, and knees. In one embodiment, the COM position(s) are determined at various temporal phases of a particular sport motion within the activity (e.g., swing of the golf club, swing of the baseball bat, swing of the tennis racket, etc.). The temporal phases may be pre-defined to capture certain portions of the motion that have been deemed critical to improved performance or user-defined (e.g., to allow a user to focus on his or her areas of weaknesses). For example, in baseball and tennis, one COM position that is particularly relevant to improved performance is the COM position when the arm speed is at a maximum. At 1406, in one embodiment, feedback data is outputted by the system. The feedback data may be provided in one embodiment via a user interface display. The feedback data in one embodiment includes a comparison of a determined COM position with an ideal COM position. 
The ideal COM position may be determined based on sensor data collected when the user is in a standing position (e.g., standing upright for a few seconds and not engaged in the sport activity/motion). In one embodiment, the postural data and/or feedback data may be stored in mass storage within the system or remotely on a server. The stored postural and/or feedback data may be combined with data related to a future performance of the sport activity to support data tracking and analysis for improved performance of the sport activity.
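The comparison performed in the feedback step can be illustrated with a minimal Python sketch. The function name, the dictionary layout, and the 2 cm tolerance are illustrative assumptions, not taken from the source; the source only specifies that the determined COM position is compared against an ideal (target) position.

```python
def com_feedback(determined_com, target_com, tolerance=0.02):
    """Compare a determined 2-D COM position (x, y), in meters, against a
    target COM position captured during quiet standing, and report the
    offset a visual display would present.

    `tolerance` (2 cm here) is a hypothetical threshold, not from the source.
    """
    dx = determined_com[0] - target_com[0]
    dy = determined_com[1] - target_com[1]
    within = (dx ** 2 + dy ** 2) ** 0.5 <= tolerance
    return {"offset": (dx, dy), "within_tolerance": within}
```

A display layer could then draw the two markers (determined vs. target) and color them based on the `within_tolerance` flag.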

(28) FIG. 2 depicts two biomechanical models of the human body that could be used for any or all of the above-listed operations. Depicted in the figure are a two-link model 200 and a three-link model 202. For example, during a golf swing, the motion of the hip joint cannot be neglected compared to the ankle motion. To overcome this shortcoming, a two- or three-segment model of the body is used in some embodiments to calculate the anterior-posterior (A-P) and medial-lateral (M-L) angles during the movement. The use of these models in various embodiments is further described below. In addition, any or all of the above operations may be performed either in real-time or offline.

(29) Embodiments of the invention allow physicians and/or trainers to carry out several tasks under real-life and/or real-time conditions which have not been possible before. First, they can obtain accurate and objective assessments of postural control by the user in real-life conditions. Second, they can use the various embodiments of the present invention to analyze feedforward processes that shape motor action on the basis of previously executed actions. In addition to golf, various embodiments of this invention are useful for assessing the balance of patients suffering from a variety of balance control disorders. For example, an embodiment of this invention may be a useful tool in studying learning and memory disorders and the design of optimally-efficient training paradigms.

(30) As another example, various embodiments of this invention can implement motor-adaptation paradigms designed to analyze specific feedforward processes. For example, an important obstacle in exploring the motor learning process in golfers is the unavailability of a low-cost and easy-to-use technology that allows the analysis of motor response to sensory feedback cues. Various embodiments of this invention employ MEMS technology (e.g., the sensors can be MEMS-based) combined with a novel biomechanical model of the human body to quantify and improve postural control during the golf swing. One embodiment of the invention 1) estimates the user's COM position at particular moments during the down-swing phase; 2) characterizes the COM trajectory to assess the quality of the swing; and 3) provides visual feedback for the user on the position of the COM at particular moments during the swing (e.g., the position of the COM at maximum arm speed during the down-swing phase) with respect to predefined positions. In one embodiment, the predefined positions can include (1) the position of the COM during the upright position, (2) the position at the “address” position (i.e., when the player is standing ready at the ball and about to begin the swing), (3) the position at the top of the backswing, and (4) the position at the time of impact. In one embodiment, the position can be user-defined.

3. Mechanics of the Model(s)

(31) 3.1. Estimation of the Angles

(32) With reference to FIG. 1, in one embodiment, each sensor is configured to provide quaternions (qw, qx, qy, qz) that are subsequently converted to Euler angles (θ, φ, ψ). The resulting three-dimensional angles are used to estimate the trajectory of the user's hand, ankle, hip, and COM during the swing. The Euler angles, used to describe a sequence of three rotations determining the orientation of a rigid body in three dimensions, are (in their order of application): i) Heading 106 (θ); ii) Attitude 110 (φ); and iii) Bank 108 (ψ). It should be noted that the choice of the X-Y-Z coordinate system about which the heading, attitude, and bank angles are defined is arbitrary, and FIG. 1 illustrates one such choice.

(33) In one embodiment, the quaternion output of the calibrated sensor during the swing, q_FINAL, is converted to the Euler angles as follows:

(34) q_FINAL = q_w + q_x·i + q_y·j + q_z·k

θ = atan[(2·q_y·q_w − 2·q_x·q_z) / (1 − 2·q_y² − 2·q_z²)]

φ = asin(2·q_x·q_y − 2·q_z·q_w)

ψ = atan[(2·q_x·q_w − 2·q_y·q_z) / (1 − 2·q_x² − 2·q_z²)]

(35) In the above equations, q_x, q_y, q_w, and q_z represent the components of the quaternion output q_FINAL.
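The conversion above can be sketched in a few lines of Python. This is a minimal illustration, not the patented implementation; it uses `math.atan2` in place of the plain arctangent in the text so that the heading and bank angles land in the correct quadrant, and it follows the sign convention of the equations as given.

```python
import math

def quaternion_to_euler(qw, qx, qy, qz):
    """Convert a unit quaternion to heading/attitude/bank Euler angles
    (radians), following the conversion given in the text.

    atan2 is used for quadrant-correct results; otherwise the formulas
    mirror the theta/phi/psi expressions above.
    """
    heading = math.atan2(2 * qy * qw - 2 * qx * qz,
                         1 - 2 * qy ** 2 - 2 * qz ** 2)
    attitude = math.asin(2 * qx * qy - 2 * qz * qw)
    bank = math.atan2(2 * qx * qw - 2 * qy * qz,
                      1 - 2 * qx ** 2 - 2 * qz ** 2)
    return heading, attitude, bank
```

For example, the identity quaternion (1, 0, 0, 0) maps to all-zero angles, and a 90° rotation about the vertical axis yields a heading of π/2.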

(36) 3.2. Estimation of the Displacement from the Angles

(37) Several ways exist for estimating the COM of the user during the swing. For example, in a well-known method in gait analysis, one can use the acceleration signal obtained from a sensor placed on the sacrum of the user, often the best position from which to monitor the COM. Although this approach may produce accurate results during quiet standing or walking in a straight line, it may be inappropriate for assessing the COM during a swing. This approach assumes a single inverted-pendulum model in which the body mass rotates around the ankle joint (assuming negligible motion of the hip joint).

(38) During a golf swing, however, the motion of the hip joint cannot be neglected compared to the ankle motion. To overcome this shortcoming, in one embodiment, a two- or three-segment model of the body is used to calculate the anterior-posterior (A-P) and medial-lateral (M-L) angles during the movement.

(39) 3.2.1. The Single-Link Model for Estimating the Position of the COM

(40) FIG. 3 depicts an example method for estimating the position of the COM based on a single-link model. Although the depicted example method is based on a single-link model, portions of the method are applicable to multi-link models such as the two-link and three-link models, as further described below. Block 300 shows the receipt of a sensor signal (e.g., from a sensor affixed to the lower back region of the user). The sensor signal is fed to processes 302 and 304 to extract the global quaternions and accelerations. Angles are estimated in block 306 based on the global quaternions. The calculations result in fixed acceleration in block 308, since the acceleration in each axis needs to be expressed in a fixed frame. To accomplish this, in one embodiment, the quaternion output of each sensor is transformed via a rotation matrix M_ROTATION. For a local calibration of the sensor placed at the sacrum of the user (Y-axis oriented downwards—see FIG. 1), the acceleration vector can be expressed as:

(41) (a_x-fixed, a_y-fixed, a_z-fixed)ᵀ = M_ROTATION · (a_x, a_y, a_z)ᵀ + (1, 0, 0)ᵀ, where

M_ROTATION =
[ 1 − 2·q_y² − 2·q_z²     2·q_x·q_y − 2·q_z·q_w   2·q_x·q_z + 2·q_y·q_w ]
[ 2·q_x·q_y + 2·q_z·q_w   1 − 2·q_x² − 2·q_z²     2·q_y·q_z − 2·q_x·q_w ]
[ 2·q_x·q_z − 2·q_y·q_w   2·q_y·q_z + 2·q_x·q_w   1 − 2·q_x² − 2·q_y² ]

(42) The acceleration values are normalized by the value of the gravitational constant g, and thus are unitless.
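The fixed-frame transformation can be sketched as follows. This is a minimal illustration assuming the standard quaternion rotation matrix and the constant gravity-compensation term (1, 0, 0)ᵀ from the equation above (accelerations normalized by g); the function name is illustrative.

```python
import numpy as np

def to_fixed_frame(q, a_local):
    """Rotate a locally-measured, g-normalized acceleration vector into
    the fixed frame via the quaternion rotation matrix, then add the
    gravity-compensation term along the fixed X axis as in the text."""
    qw, qx, qy, qz = q
    M = np.array([
        [1 - 2*qy**2 - 2*qz**2, 2*qx*qy - 2*qz*qw,     2*qx*qz + 2*qy*qw],
        [2*qx*qy + 2*qz*qw,     1 - 2*qx**2 - 2*qz**2, 2*qy*qz - 2*qx*qw],
        [2*qx*qz - 2*qy*qw,     2*qy*qz + 2*qx*qw,     1 - 2*qx**2 - 2*qy**2],
    ])
    return M @ np.asarray(a_local) + np.array([1.0, 0.0, 0.0])
```

With the identity quaternion (no rotation), a sensor at rest reading −1 g along its local X axis transforms to a zero fixed-frame acceleration, which is the intent of the compensation term.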

(43) In one embodiment, next, in block 310, the fixed accelerations (three signals in the x, y, and z directions, respectively) are filtered with a band-pass filter (e.g., a wavelet filter) to remove the DC component. After filtering, in block 312, a trapezoidal integration routine is used to compute the velocity of the movement. Even for this relatively fast motion, the resulting integration can exhibit some drift. The drift can be represented as an offset ε added to the value of the acceleration:

v(t) = ∫_{t_0}^{t} (a(u) + ε) du

(44) Drift can be removed in several ways. In one embodiment, it is assumed that velocity is nearly zero at the beginning and the end of the swing motion. To identify the zero-velocity segments at the beginning (i.e., the “address position,” conventionally defined as the position a golfer takes while standing over the ball, ready to swing) and the end of the swing (i.e., impact), one embodiment searches for periods in which a segment of velocity data composed of at least 10 consecutive samples produces a standard deviation below a fixed threshold (e.g., less than 0.001 m/s). In one embodiment, the beginning and the end of the swing are automatically identified using the signals extracted from the sensor attached to the arm; in one embodiment, this is based on the arm position at address. In one embodiment, the beginning of the swing is defined as the first movement of the arm, and impact determines the end of the swing. The top of the backswing is defined as the point where the COM changes direction in the side-to-side motion. These assumptions and definitions enable the definition of three segments during the swing: 1) the beginning of the swing; 2) the swing itself; and 3) impact, as depicted by the three parts in the graph shown in FIG. 5. The graph in FIG. 5 shows the pattern of speed estimated from integration of acceleration in a fixed frame. One embodiment thus includes an algorithm designed to identify those parts by assessing the standard deviation of the velocity at the beginning and the end of the swing. These parts of the signal are used to estimate the drift of integration.
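The zero-velocity search described above can be sketched as a sliding-window scan. The window length of 10 samples and the 0.001 m/s threshold come from the text; the function name and the window-growing behavior are illustrative assumptions.

```python
import numpy as np

def find_rest_segment(velocity, min_len=10, max_std=0.001):
    """Scan a 1-D velocity trace (m/s) for the first run of at least
    `min_len` consecutive samples whose standard deviation is below
    `max_std`, as described for locating the address and impact phases.

    Returns (start, end) sample indices (end exclusive), or None if no
    sufficiently quiet segment exists.
    """
    n = len(velocity)
    for start in range(n - min_len + 1):
        if np.std(velocity[start:start + min_len]) < max_std:
            # grow the window while the low-variance condition still holds
            end = start + min_len
            while end < n and np.std(velocity[start:end + 1]) < max_std:
                end += 1
            return start, end
    return None
```

Running the scan once forward from the start of the trace and once from the end would yield the two rest segments (address and post-impact) used for drift estimation.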

(45) The detection of the movement at block 314 helps identify the three parts as discussed above. The detection of the movement assumes that before starting the swing, the user is at the rest position (no movement). In one embodiment, identifying this period is critical for removing the drift that gets added during the integration step. To identify the rest period, the system in one embodiment detects the initiation of movement via the estimated velocity and then assumes that, during the few points prior to that instant, the user was at rest (velocity=0). With the three parts identified, returning to FIG. 3, in block 316, the drift of the first part of the speed (beginning of the swing) is calculated and removed from the whole signal. Then, in block 318, the drift of the third part (at impact) is estimated and removed. In block 320, a band-pass filter is used for the second and last remaining part of the speed (the part that covers the swing itself).

(46) Finally, once drift has been removed from the velocity, the position of the COM (324) is found by integrating the velocity vector (block 322).

(47) 3.2.2. The Multi-Link Models for Estimating the Position of the COM

(48) FIG. 4 illustrates a two-link and a three-link model of the human body that can be used to estimate a person's COM once the joint angles are estimated, as described, for example, in Section 3.2 above. Anthropometric data provides the values for the parameters used for each segment. FIG. 9 illustrates an example of sensor attachment 900 for a three-link model. The sensors attached to the user's shin (906), thigh (908), and back (904) provide, respectively, the angles of the ankle (θa), knee (θk), and hip (θh) joints in the anterior-posterior (A-P) plane, as illustrated in the representative human body figure 402 within FIG. 4. Note that in other embodiments, the positions of the sensors may be different, or the sensors may be embedded into clothing worn by the person at those specific positions. Given the anthropometric data of each user (e.g., body mass m and height H), the position of each link's COM in the sagittal plane can be described as:

(49)
$$\vec{i} = (T_1 \sin\theta_a,\ T_1 \cos\theta_a)$$
$$\vec{j} = (L_1 \sin\theta_a + T_2 \sin\theta_k,\ L_1 \cos\theta_a + T_2 \cos\theta_k)$$
$$\vec{k} = (L_1 \sin\theta_a + L_2 \sin\theta_k + T_3 \sin\theta_t,\ L_1 \cos\theta_a + L_2 \cos\theta_k + T_3 \cos\theta_t)$$
$$\overrightarrow{COM} = \left( \frac{m_1 \vec{i}_x + m_2 \vec{j}_x + m_3 \vec{k}_x}{m_1 + m_2 + m_3},\ \frac{m_1 \vec{i}_y + m_2 \vec{j}_y + m_3 \vec{k}_y}{m_1 + m_2 + m_3} \right)$$

(50) Here, $\theta_a$, $\theta_k$, and $\theta_t$ represent, respectively, the angular displacement of the ankle, knee, and trunk (as shown in the human body figure 402 in FIG. 4). The first component of the COM corresponds to the frontal direction, or movement in the anterior-posterior (A-P) direction, which can be expressed as:

(51)
$$\overrightarrow{COM}_{AP} = \frac{1}{m_1 + m_2 + m_3}\Big(m_1 T_1 \sin\theta_a + m_2(L_1 \sin\theta_a + T_2 \sin\theta_k) + m_3(L_1 \sin\theta_a + L_2 \sin\theta_k + T_3 \sin\theta_t)\Big)$$
$$= \frac{1}{m_1 + m_2 + m_3}\Big((m_1 T_1 + m_2 L_1 + m_3 L_1)\sin\theta_a + (m_2 T_2 + m_3 L_2)\sin\theta_k + m_3 T_3 \sin\theta_t\Big)$$

(52) The equations can be rewritten with three constants, as follows:

(53)
$$\overrightarrow{COM} = K_1 \sin\theta_a + K_2 \sin\theta_k + K_3 \sin\theta_t$$
where
$$K_1 = \frac{m_1 T_1 + m_2 L_1 + m_3 L_1}{m_1 + m_2 + m_3},\quad K_2 = \frac{m_2 T_2 + m_3 L_2}{m_1 + m_2 + m_3},\quad K_3 = \frac{m_3 T_3}{m_1 + m_2 + m_3}$$

(54) The equation of the COM in the medial-lateral (M-L) direction can be derived in an analogous fashion, with the angles expressed in the M-L direction.

(55) A similar analysis for the two-link model provides the following relationships:

(56)
$$\overrightarrow{COM} = K_1 \sin\theta_a + K_2 \sin\theta_t$$
where
$$K_1 = \frac{m_1 T_1 + m_2 L_1}{m_1 + m_2},\quad K_2 = \frac{m_2 T_2}{m_1 + m_2}$$

(57) The values of $m_i$ and $T_i$ (i=1-3), and $L_j$ (j=1, 2), can be estimated from the user's body mass and height as explained by D. Winter et al., Biomechanics and Motor Control of Human Movement, New York: Wiley, 1990, the disclosure of which is hereby incorporated by reference. In one embodiment, the system described herein obtains the user's body mass and height as part of an initial setup process (e.g., via an input in a displayed user interface). Those skilled in the art will appreciate that aspects of the sample single-link-based COM determination method shown in FIG. 3, such as drift correction, are applicable to the two- or three-link model-based COM calculations as well.
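As a sketch of the three-link relation COM = K1·sin θa + K2·sin θk + K3·sin θt above, the following uses placeholder anthropometric fractions; in a real system the segment masses, lengths, and COM offsets would come from Winter's tables and the user's measured body mass and height, so the numeric fractions here are assumptions for illustration only.

```python
import math

def three_link_com_ap(theta_a, theta_k, theta_t, m=70.0, H=1.75):
    """A-P COM excursion for the three-link model (angles in radians).
    The mass/length fractions below are illustrative placeholders, not
    Winter's published values."""
    m1, m2, m3 = 0.093 * m, 0.200 * m, 0.678 * m               # shank, thigh, trunk masses (assumed)
    L1, L2 = 0.246 * H, 0.245 * H                              # shank and thigh lengths (assumed)
    T1, T2, T3 = 0.567 * L1, 0.567 * L2, 0.374 * (0.288 * H)   # per-segment COM offsets (assumed)
    M = m1 + m2 + m3
    K1 = (m1 * T1 + m2 * L1 + m3 * L1) / M
    K2 = (m2 * T2 + m3 * L2) / M
    K3 = (m3 * T3) / M
    return K1 * math.sin(theta_a) + K2 * math.sin(theta_k) + K3 * math.sin(theta_t)
```

At zero joint angles the excursion vanishes, and small forward rotations of all three segments shift the COM forward, as the model predicts.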

4. Providing a Visual Feedback of the Swing to the User

(58) An embodiment of the present invention provides the user with visual feedback about his or her postural control during the golf swing. Embodiments of the invention take advantage of the observation that the position of the COM at certain time points can be used to describe an ideal swing, and provide those positions of the COM to the user to help improve the swing. These time points include, but are not limited to: (1) the instant of maximum arm speed; (2) the instant at which the golfer assumes the address position; and (3) the instant of starting the down-swing phase.

(59) In one embodiment, these time points can be identified using a sensor attached to the user's arm (e.g., sensor 902 as shown in FIG. 9). In one embodiment, the system computes the position of the user's COM at such specified time points, using data measured by the sensors during the swing. Next, the system compares the position of the COM at the specified times with the “reference”/“target” positions of the COM. In one embodiment, the reference position is defined as the COM position when the user is in a neutral standing position. This position can be estimated, for example, during an initial calibration in which the user is requested to remain in an upright position without any motion for a period of approximately 5 seconds. The user is then provided with feedback through an interface (e.g., visual and/or audio) about the estimated actual position of his or her COM relative to the reference COM position. The system may then instruct the user on how to reduce the difference between the estimated actual position and the reference/target COM position.
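The comparison between the measured COM and the reference COM, and the resulting instruction to the user, might be sketched as below. The per-axis dead-band `tol`, the sign conventions, and the wording of the corrective cues are assumptions for illustration, not the patent's actual feedback logic.

```python
def com_feedback(com_est, com_ref, tol=0.01):
    """Compare an estimated COM position (A-P, M-L), in meters, against the
    reference (neutral-stance) COM and return a simple corrective cue.
    The cue directions assume positive A-P is forward and positive M-L is
    rightward; both conventions are hypothetical."""
    d_ap = com_est[0] - com_ref[0]
    d_ml = com_est[1] - com_ref[1]
    cues = []
    if abs(d_ap) >= tol:
        cues.append("shift backward" if d_ap > 0 else "shift forward")
    if abs(d_ml) >= tol:
        cues.append("shift left" if d_ml > 0 else "shift right")
    return ", ".join(cues) if cues else "on target"
```

For example, a COM estimate 5 cm forward of the reference would produce the cue "shift backward", while an estimate within the dead-band on both axes reports "on target".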

(60) FIG. 6 illustrates an exemplary interface 600 used to provide visual feedback to the user, including the above information. In one embodiment, the interface 600 includes one or more of: an interface 602 based on data from a sensor worn on the person's arm, and interfaces 610 and 612 reflecting the COM of the person during various parts of the swing. In the example depicted, the interface 602 shows the position of the player's hand (604), which is used to assist him/her to hit a virtual ball (target) 606 displayed on the same screen. While a virtual ball is used in this example, in other embodiments, a sensor may be attached to an actual golf ball or to some other object, such as the golf club, and the interface 602 may be adapted to display the position of that sensor.

(61) In one embodiment, at the end of each trial, the positions of the COM at specified instants of the backswing phase (612) and the downswing phase (610) are displayed to assist the player in improving his or her postural control in subsequent trials. Note, for example, that the downswing phase interface 610 displays the positions of the COM from two trials 614 (previous) and 616 (current), along with the optimum position of the COM 622. A similar display in the backswing phase interface 612 shows the trial positions 618 (previous) and 620 (current) with the optimum position 624.

(62) FIG. 7 illustrates an example user interface 700 showing an example arm trajectory during the swing. The interface identifies the initiation of the swing, the estimate of the top of the backswing, and the termination of the backswing (reaching back to the initial target). Based on the initial calibration of the sensor on the user's arm, the position of the arm is set inside the initial target circle. This may be done by asking the user to bring the club to the address position and then having the system automatically set this position as the initiation of the swing. The same process is done (optionally) for the backswing. In one embodiment, this calibration process needs to be performed only for the first trial and does not need to be re-performed for subsequent trials (except when the arm sensor has been moved).

(63) As shown in FIG. 7, the “INITIAL TARGET” 706 represents a virtual golf ball, while the “ARM CURSOR” 704 (square cursor) reflects the position of the arm. At the beginning of the swing (i.e., Position “1”), the arm cursor rests in the middle of the INITIAL TARGET (which represents the approximate position of the ball). Once the arm cursor is in this position, the “TOP BACKSWING TARGET” 702 appears (Position “2”). With this visual indicator 702 now on display within the user interface 700, the user can then initiate his swing by moving his arm back and strive to reach the second target (i.e., “TOP BACKSWING TARGET”). Once the user reaches the second target with his motion, he can then initiate the second portion of the swing (the “downswing”) and will move his arm forward and try to reach Position “1” (i.e., ball position).

(64) FIG. 8 shows example user interfaces 800 and 810 that display visual feedback to the user at the end of the user's swing. In one embodiment, the combined user interfaces show the position of the user's COM at two times during the swing on the computer screen. As shown in the user interface 810, the COM position at the top of the backswing 814 (t1) is shown to the user. In the user interface 800, the COM position at the moment when the arm has its maximum speed 806 (t2) is shown to the user.

(65) In one embodiment, a target is displayed to show the user the ideal COM position (804 and 814). The display may also show the position of the COM during the last trial (802 and 812). At the end of each try, if the user has reduced the distance between his COM at times t1 and t2 and the ideal positions at those times, the user is notified via one or more visual and/or audio cues. For example, the COM target may change color, the COM target may “explode,” and/or a sound may be played. By contrast, if the relevant distance at times t1 and t2 in the last try increases as compared with the second-to-last try, the user is notified in another manner (e.g., the COM target changes into another color).

5. Performance Metrics

(66) 5.1. COM Area and Norm of COM

(67) FIG. 10 shows a graph 1000 depicting how a COM area is determined according to one embodiment. Both graphs 1000 and 1018 are plotted with COM in the A-P direction on the Y-axis and COM in the M-L direction on the X-axis. Both illustrate the COM trajectory of a user from the beginning to the end of the golf swing. As depicted, the swing starts at the address position 1016, then the backswing curve 1014 shows the trajectory of the backswing going to the top of the backswing at 1008. Then the downswing curve 1006 shows the trajectory of the COM during the downswing from 1008, through the location at which the arm speed reaches a maximum (1004), all the way to impact with the golf ball at 1002. The ideal COM position 1012 is indicated by the star. One measurement of whether the swing is optimal is the COM area as shown. The COM area is obtained in one embodiment by multiplying the A-P distance 1009 between the ideal COM and a particular COM during the swing by the M-L distance 1010 between the ideal COM and that particular COM (note that 1008 is chosen here as an example only).

(68) As an example, FIGS. 11A-11B show, respectively, the “COM area” (ACOM) at 1) the instant when the arm has its maximum speed during the downswing phase (FIG. 11A); and 2) the time of initiating the downswing phase (FIG. 11B) for advanced golfers (bars 1102 and 1152 in the depicted bar charts; handicap 9-14), intermediate golfers (bars 1104 and 1154; handicap 14-20), and novice golfers (bars 1106 and 1156; no handicap score). The figures show that the COM area at these two instants is significantly lower for advanced and intermediate players than for novice players. This suggests that advanced and intermediate players have better postural control at these two instants, since their COM remains closer to the upright position, which is assumed to represent perfect postural stability. FIG. 12 shows a typical example in which a player, using visual feedback of the COM at the time of maximum arm speed during the swing, improved his postural control after very few attempts (i.e., the norm of the COM was reduced). The norm of the COM is defined in one embodiment as:
$$\mathrm{norm}(COM) = \sqrt{COM_{AP}^{2} + COM_{ML}^{2}}$$
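Both metrics reduce to one-line computations; the sketch below assumes COM coordinates expressed as (A-P, M-L) offsets in meters, with the function names chosen for illustration.

```python
import math

def com_area(com, com_ideal):
    """'COM area': product of the A-P and M-L distances between a swing
    COM sample and the ideal (upright) COM, per paragraph (67)."""
    return abs(com[0] - com_ideal[0]) * abs(com[1] - com_ideal[1])

def com_norm(com_ap, com_ml):
    """norm(COM) = sqrt(COM_AP^2 + COM_ML^2)."""
    return math.hypot(com_ap, com_ml)
```

A COM sample 3 cm forward and 2 cm lateral of the ideal yields a COM area of 6 cm², and its norm is simply the Euclidean distance from the ideal position.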
5.2. Non-Linearity

(69) Returning to FIG. 10, the graph 1018 illustrates another metric that may be used to measure whether a particular swing is optimal. In one embodiment, this metric quantifies the non-linearity of the user's downswing curve. In particular, the non-linearity of the early downswing is used as a metric of optimal performance. This is based on the assumption that a flat and straight trajectory into impact achieves the best energy efficiency: the COM moves on a linear trajectory from the top of the backswing toward the address point. As a result, calculating the non-linearity of the downswing COM trajectory can provide an assessment of the quality of the swing, i.e., the more linear the downswing COM trajectory, the more optimal the swing.

(70) In one embodiment, to estimate the non-linearity (1022), first the position of the COM at address (1028) is projected in the A-P direction onto the downswing COM curve, as shown by point 1020. Then, the maximum difference (1022) between the curve of the COM during the downswing phase and the straight line joining the top of the backswing (1024) and the projected point (1020) is calculated. In one embodiment, this estimated maximum distance between the straight line and the downswing curve is deemed to be the maximum non-linearity of the downswing curve. One embodiment normalizes this value by the range of motion of the COM in the A-P direction (1030) to estimate the percentage of non-linearity.
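The non-linearity estimate can be sketched as below, assuming the downswing COM trajectory is available as an (N, 2) array of (M-L, A-P) samples whose first row is the top of the backswing and whose last row is the projected address point; measuring the deviation as perpendicular distance from the chord is an interpretive assumption about the "maximum difference" described above.

```python
import numpy as np

def nonlinearity_pct(downswing, com_range_ap):
    """Maximum deviation of the downswing COM curve from the chord joining
    its first and last samples, expressed as a percentage of the A-P COM
    range (element 1030 in FIG. 10)."""
    p0, p1 = downswing[0], downswing[-1]
    chord = p1 - p0
    chord_len = np.hypot(chord[0], chord[1])
    # Perpendicular distance of every sample from the chord (2-D cross product).
    dev = np.abs((downswing[:, 0] - p0[0]) * chord[1]
                 - (downswing[:, 1] - p0[1]) * chord[0]) / chord_len
    return 100.0 * dev.max() / com_range_ap
```

A perfectly linear downswing scores 0%, while a trajectory that bows away from the chord by the full A-P range scores 100%.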

6. Computer System Embodiment

(71) FIG. 13 is a block diagram illustrating the postural and balance evaluation system 1300 in accordance with one embodiment. The system 1300 may be implemented as a programmed computer system that comprises one or more computers or computing devices (e.g., application servers, database servers, workstations, storage servers, etc.) that execute code modules. The postural and balance evaluation system 1300 includes, for example, one or more personal computers that are IBM, Macintosh, or Linux/Unix compatible. In one embodiment, the postural and balance evaluation system 1300 comprises one or more servers, desktop computers, laptop computers, personal digital assistants, kiosks, or mobile devices, for example. In one embodiment, the sample postural and balance evaluation system 1300 includes a central processing unit (“CPU”) 1390, which may include one or more conventional microprocessors. The postural and balance evaluation system 1300 further includes a memory 1372, such as random access memory (“RAM”) for temporary storage of information and read only memory (“ROM”) for permanent storage of information, and a mass storage device 1382, such as a hard drive, diskette, solid-state drive, or optical media storage device. The mass storage device 1382 may store data collected from a plurality of sensors or remotely collected sensor data, and/or calculated posture data from various trials. Typically, the components and modules of the postural and balance evaluation system 1300 are connected to the computer using a standards-based bus system 1340. In different embodiments, the standards-based bus system 1340 could be Peripheral Component Interconnect (“PCI”), Microchannel, Small Computer System Interface (“SCSI”), Industry Standard Architecture (“ISA”), or Extended ISA (“EISA”) architectures, for example.
In addition, the functionality provided for in the components and modules of postural and balance evaluation system 1300 may be combined into fewer components and modules or further separated into additional components and modules.

(72) The postural and balance evaluation system 1300 is generally controlled and coordinated by operating system software, such as Windows Server, Linux Server, Windows XP, Windows Vista, Windows 7, Unix, Linux, SunOS, Solaris, Android, iOS, or other compatible server, desktop, or mobile operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the postural and balance evaluation system 1300 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.

(73) The sample postural and balance evaluation system 1300 may include one or more commonly available input/output (I/O) devices and interfaces 1368, such as a keyboard, mouse, touchpad, and printer. The I/O devices may also include the one or more sensors 1364 worn on a user's body, as described above. In one embodiment, these devices may be linked physically to the system 1300, or may be linked wirelessly via interfaces such as Bluetooth. In one embodiment, the I/O devices and interfaces 1368 include one or more display devices, such as a monitor, that allow the visual presentation of data to a user (e.g., the visual feedback user interface described above). More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. The postural and balance evaluation system 1300 may also include one or more multimedia devices 1362, such as speakers, video cards, graphics accelerators, and microphones, for example. In other embodiments, such as when the postural and balance evaluation system 1300 comprises a network server, for example, the computing system may not include any of the above-noted man-machine I/O devices.

(74) In the embodiment of FIG. 13, the I/O devices and interfaces 1368 provide a communication interface to various external devices. For example, the postural and balance evaluation system 1300 is electronically coupled to the network 1340, which may comprise one or more of a LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link 1363. The network 1340 facilitates communications among various computing devices and/or other electronic devices via wired or wireless communication links. The postural and balance evaluation system may use the network 1340 to receive sensor data collected remotely and transmit the resulting data back to the user. For example, a user may wear sensors while he is playing at a golf course or driving range. The sensors may be configured to transmit data (through a wired or wireless connection) to a mobile computing device (e.g., a smartphone, a laptop computer, a tablet, etc.). The mobile computing device may in turn transmit the collected sensor data via the network 1340 to the postural and balance evaluation system 1300, which may, as described above, process the received data and provide feedback data back to the mobile computing device. The feedback data may then be used by the mobile computing device to display visual feedback to the user (e.g., via a user interface described above). In this manner, the user can receive near-instantaneous feedback on his swings while he is at the golf course or driving range.

(75) In addition to the devices that are illustrated in FIG. 13, the postural and balance evaluation system 1300 may communicate with other data sources or other computing devices. For example, collected data may be stored in a local or remote database by the postural and balance evaluation system 1300, so that a user's performance can be tracked over time.

(76) The postural and balance evaluation system 1300 may also include one or more software modules to perform the functionalities described herein, for example, the methods and processes depicted in FIGS. 3 and 14 and in Sections 3 and 4. The software modules may be stored in mass storage 1382 or memory 1372, and implemented as one or more modules, which may include, by way of example, components such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Alternatively, the software modules may be implemented as separate devices, such as computer servers. In alternate embodiments, the postural and balance evaluation system can be implemented by multiple physical computers that are interconnected, with different functions or tasks optionally handled by different machines.

(77) In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.

7. Conclusion

(78) All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors. The code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware. In addition, the components referred to herein may be implemented in hardware, software, firmware, or a combination thereof.

(79) Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.

(80) Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.

(81) It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure.