A GRIP ANALYSIS SYSTEM AND METHOD

20230356050 · 2023-11-09

    Abstract

    One of the most important factors affecting the performance of athletes in club, bat or racket based sports is the athlete's grip on their club, bat or racket. Minor changes in grip position and force can have a significant effect on the outcome of a shot or other sporting action. Typically, athletes receive feedback on their grip, and the resulting shot, through coaching or practice. However, it is difficult for inexperienced athletes and coaches to correctly diagnose and fix grip faults. The present invention provides a grip analysis system, and method of use thereof, including an array of pressure sensors configured to detect a grip of a user on an object and a processor operable to analyse data from the array of pressure sensors and output at least one grip quality indicator corresponding to the grip of the user on the object.

    Claims

    1. A grip analysis system comprising: a sleeve positionable, in use, on an object configured to be gripped by a user; a distributed array of pressure sensors arranged to detect a pressure applied to the sleeve; and a processor operable to: detect, with the array of pressure sensors, a grip of a user on the sleeve; analyse the grip of the user on the sleeve, by: receiving input data from the array of pressure sensors; weighting the input data with a predetermined weight array to determine weighted pressure data; and determining at least one grip quality indicator corresponding to the grip of the user on the sleeve based on the weighted pressure data; and output the at least one grip quality indicator corresponding to the grip of the user on the sleeve.

    2. The grip analysis system of claim 1, wherein the object is a golf club and the sleeve is a golf club grip.

    3. The grip analysis system of claim 1, wherein the predetermined weight array is determined via a trained neural network, a random forest algorithm, and/or a gradient boosted decision tree.

    4. The grip analysis system of claim 3, wherein the predetermined weight array is determined at least in part via a convolutional neural network.

    5. The grip analysis system of claim 1, wherein the at least one grip quality indicator is related to one or more selected from the range: a relative strength or neutrality of hand placement on the grip, a force level, a force position, a maximum force value, a hand angle, a relative angle between two hands, a relative angle between two fingers, a maximum force applied by each hand and a maximum force applied by each finger.

    6. The grip analysis system of claim 1, further comprising a feedback device configured to receive the at least one grip quality indicator output by the processor, wherein the feedback device is operable to provide a user gripping the object with feedback related to the at least one grip quality indicator corresponding to the grip of the user on the sleeve.

    7. The grip analysis system of claim 6, wherein the feedback device comprises a haptic feedback device operable to provide a user gripping the object with haptic feedback.

    8. The grip analysis system of claim 6, wherein the feedback device comprises a visual and/or audible feedback device operable to provide a user gripping the object with visual and/or audible feedback.

    9. The grip analysis system of claim 6, wherein the processor is operable to determine a difference between the determined at least one grip quality indicator and a predetermined grip quality indicator corresponding to a predetermined desired grip, and wherein a quality of the feedback is related to a required grip change to achieve the predetermined desired grip.

    10. The grip analysis system of claim 6, wherein the feedback device is configured to provide a first feedback related to a first grip quality indicator and provide a second feedback related to a second grip quality indicator.

    11. The grip analysis system of claim 1, wherein the processor is configured to: separate the input data into a plurality of input data subsets; attribute each input data subset to a portion of a user's hand with multiclass classification; identify a position of each user hand portion on the sleeve based on the input data subset attributed to each user hand portion; and compare the identified positions of each user hand portion to a predetermined desired position of each hand portion in order to identify a difference between the identified positions of each user hand portion and the predetermined desired position of each hand portion; and wherein the at least one grip quality indicator is related to the difference between the identified positions of each user hand portion and the predetermined desired positions of each user hand portion.

    12. The grip analysis system of claim 1, wherein the processor is operatively connected to the array of pressure sensors and is adjacent to the sleeve.

    13. The grip analysis system of claim 12, wherein a predetermined labelled dataset and/or the predetermined weight array is stored on a remote server and the processor is in communication with the remote server.

    14. The grip analysis system of claim 13, wherein the remote server is in communication with at least one other grip analysis system.

    15. The grip analysis system of claim 12, further comprising a rechargeable battery configured to supply power to the processor.

    16. The grip analysis system of claim 1, wherein the array of pressure sensors comprises at least 8 pressure sensor elements.

    17. The grip analysis system of claim 16, wherein the array of pressure sensors comprises at least 368 pressure sensor elements.

    18. The grip analysis system of claim 1, wherein the processor is operable to continually output grip quality indicators corresponding to grips of the user on the sleeve.

    19. The grip analysis system of claim 1, wherein the predetermined weight array is one of a plurality of predetermined weight arrays, wherein each of the plurality of predetermined weight arrays is categorised according to hand size and/or shape, and wherein the processor is configured to: determine, based on the input data, a hand size and/or shape categorisation of a hand of a user gripping the sleeve; and select a predetermined weighted array which corresponds to the same hand size and/or shape categorisation as the determined user hand size and/or shape categorisation.

    20. A grip analysis method comprising the steps: detecting, by an array of pressure sensors, a grip of a user on a sleeve; analysing the grip of the user on the sleeve by: receiving input data from the array of pressure sensors; weighting the input data with a predetermined weight array to determine weighted pressure data; and determining at least one grip quality indicator corresponding to the grip of the user on the sleeve based on the weighted pressure data; and outputting the at least one grip quality indicator corresponding to the grip of the user on the sleeve.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0035] FIG. 1 is a schematic view of a grip analysis system;

    [0036] FIG. 2 is a first flow diagram showing a method of training a neural network to provide a weighted array for use by the grip analysis system shown in FIG. 1; and

    [0037] FIG. 3 is a second flow diagram showing a method of providing feedback to a user of the grip analysis system shown in FIG. 1.

    DETAILED DESCRIPTION

    [0038] FIG. 1 is a schematic view of a grip analysis system 100. The system includes a processor 110 that is in communication with a cloud-based server 120 via a smart device 130. The processor 110 may be physically or wirelessly connected to the smart device 130, such as a smart phone or a smart watch. For example, the processor 110 and smart device 130 may communicate wirelessly via Wi-Fi or Bluetooth.

    [0039] The grip analysis system also includes an array of pressure sensors, shown schematically by sensor elements 142, 144, 146. Although only three sensor elements 142, 144, 146 are shown, any number of sensor elements may be provided. For example, 368 sensor elements may be provided in a grid pattern. The array of pressure sensors 140 is configured to be arranged on an object to be gripped by a user, such as a golf club. In this case, the array of pressure sensors may be on, under or embedded in the grip of the golf club or any other connected location. Each sensor element 142, 144, 146 is operable to provide pressure data to the processor 110.

    [0040] Furthermore, the grip analysis system 100 also includes a haptic feedback device 150. Other types of feedback device 150 are envisaged such as a visual or audible feedback device. The processor 110 is operable to receive pressure data from the array of pressure sensors 140, process the pressure data with a method, to be discussed in more detail with reference to FIG. 3, to obtain a grip quality indicator and operate the feedback device 150 accordingly. The haptic feedback device 150 may be operable to vibrate, heat or cool to indicate to the user that their grip requires adjustment.

    [0041] FIG. 2 is a first flow diagram 200 showing a method of training a neural network to provide a weighted array for use by the grip analysis system 100 shown in FIG. 1. The first step of the method 200 is to collect labelled data 210 by having test subjects grip the grip analysis system 100 shown in FIG. 1. The collection of labelled data 210 includes collecting pressure data from the sensor array 220 and collecting pressure data from a further sensor 230. The further sensor may be a glove mounted sensor which is precisely positioned such that a position of the glove, and therefore the user's hand, relative to the array of pressure sensors may be determined. Alternatively, the labelled data 210 may be collected without a further sensor 230; instead, a user, such as an expert athlete or coach, may provide a user input. The user input may be provided during or after the gripping action is performed. For example, a user may review a video recording of a golf swing and classify the grip as strong, neutral or weak.
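    The labelled-data collection of step 210 may be illustrated with the following minimal Python sketch. It is an illustration only: the sensor-reading function is a stand-in for real hardware, and the label values (strong, neutral, weak) follow the golf-grip example above.

```python
import random

LABELS = ("strong", "neutral", "weak")

def read_sensor_array(n_elements=368):
    """Stand-in for reading one pressure frame from the sensor array.
    Real hardware would return a calibrated pressure value per element."""
    return [random.random() for _ in range(n_elements)]

def collect_labelled_sample(label):
    """Pair one pressure frame with an expert-supplied grip label (step 210)."""
    if label not in LABELS:
        raise ValueError(f"unknown grip label: {label}")
    return {"pressures": read_sensor_array(), "label": label}

# Build a small labelled dataset, one entry per recorded grip.
dataset = [collect_labelled_sample(random.choice(LABELS)) for _ in range(10)]
```

    Each entry pairs a full pressure frame with a single expert label; a further-sensor variant would add a measured hand position to each entry instead of, or alongside, the label.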

    [0042] Once the labelled data has been collected 210, a model structure is specified 240. Specification 240 of the model structure may be an iterative process of trial and error. Space and/or scope may be provided to vary the model architecture and hyperparameters, such as the learning rate, the regularisation parameter used to control overfitting, or any other parameter. The space and/or scope may be user determined and/or determined automatically, such as with automated machine learning. Several models may be specified 240 and trained, then compared to determine relative performance. A relatively better performing model may be chosen. The model structure may be a neural network such as a convolutional neural network. Such a model requires training 250, which is the next step. Once the model has been trained 250, it is tested and its accuracy is compared to a threshold accuracy value 260. If the model does not meet the threshold accuracy value, the model is retrained 250 and retested as described above. The model may be retrained 250 with a larger or otherwise better dataset and/or may be retrained with a different architecture. Once the accuracy of the model meets the threshold accuracy value, the model is stored 270 in a datastore 280. The datastore is connected to the cloud-based server 120 such that the processor 110 of the grip analysis system 100 can access it.
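    The train-test-threshold loop of steps 250 and 260 may be sketched as follows. For the sake of a self-contained example, a toy nearest-centroid classifier stands in for the convolutional neural network, and the synthetic data, threshold and retry count are illustrative assumptions, not values from the disclosure.

```python
import random

random.seed(0)

def train_model(train_set):
    """Step 250: fit a toy nearest-centroid model (a stand-in for the
    neural network described in the text)."""
    centroids = {}
    for label in {s["label"] for s in train_set}:
        rows = [s["x"] for s in train_set if s["label"] == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(model, x):
    """Classify a pressure frame by its nearest centroid."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(model, key=lambda lbl: dist(model[lbl], x))

def accuracy(model, test_set):
    hits = sum(predict(model, s["x"]) == s["label"] for s in test_set)
    return hits / len(test_set)

def train_until_threshold(make_data, threshold=0.9, max_rounds=5):
    """Steps 250-260: retrain on fresh (possibly larger) data until the
    measured accuracy meets the threshold accuracy value."""
    for _ in range(max_rounds):
        train_set, test_set = make_data()
        model = train_model(train_set)
        if accuracy(model, test_set) >= threshold:
            return model
    return model  # best effort after max_rounds

def make_data():
    """Synthetic, separable data: 'strong' grips press harder than 'weak'."""
    def sample(label):
        base = 0.8 if label == "strong" else 0.2
        return {"x": [base + random.uniform(-0.05, 0.05) for _ in range(8)],
                "label": label}
    data = [sample(random.choice(("strong", "weak"))) for _ in range(40)]
    return data[:30], data[30:]

model = train_until_threshold(make_data)
```

    Only once the loop exits with the threshold met would the model be stored 270 in the datastore 280 for later use by the processor 110.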

    [0043] FIG. 3 is a second flow diagram 300 showing a method of providing feedback to a user of the grip analysis system 100 shown in FIG. 1. The method 300 includes loading the model 305 from a datastore 310, such as the model determined and trained by the method 200 of FIG. 2.

    [0044] The method 300 also includes collecting pressure data from the sensor array 315 when a user is gripping the object to which the sensor array is applied. The pressure data collected may be adjusted or otherwise processed to reduce or remove sensor noise and/or to take account of sensor degradation over time.
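    One simple way to reduce per-frame sensor noise, offered here as an illustrative assumption rather than the disclosed method, is a moving average over recent pressure frames:

```python
def smooth(frames, window=3):
    """Average each sensor element over the last `window` pressure frames
    to suppress per-frame noise (window size is illustrative)."""
    smoothed = []
    for i in range(len(frames)):
        recent = frames[max(0, i - window + 1): i + 1]
        smoothed.append([sum(col) / len(recent) for col in zip(*recent)])
    return smoothed

# Three two-element frames; later frames are averaged with earlier ones.
frames = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
out = smooth(frames)
```

    Degradation over time could be handled similarly, for example by rescaling each element against a slowly updated per-element baseline.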

    [0045] The next step is to predict the user's hand position 320 on the object.

    [0046] Predicting the hand position 320 includes giving the collected pressure data to the input layer 325 of the trained neural network, forward propagation by the neural network 330, and reading a prediction from the output layer of the neural network 335. Accordingly, a position and force applied by each pressure-applying element, such as each finger, finger portion and/or palm portion, may be predicted or determined.
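    Steps 325 to 335 may be sketched as a forward pass through a small feed-forward network. The layer sizes, weights and biases below are illustrative placeholders, not trained values, and a plain fully connected network stands in for the convolutional network described above.

```python
import math

def forward(pressures, weights, biases):
    """Steps 325-335: feed a pressure frame to the input layer, forward
    propagate through each layer, and read the output-layer prediction."""
    activations = pressures
    for layer_w, layer_b in zip(weights, biases):
        activations = [
            math.tanh(sum(w * a for w, a in zip(row, activations)) + b)
            for row, b in zip(layer_w, layer_b)
        ]
    return activations

# Toy network: 4 sensor inputs -> 3 hidden units -> 2 outputs
# (e.g. a predicted position value and a predicted force level).
weights = [
    [[0.5, -0.2, 0.1, 0.3], [0.4, 0.4, -0.1, 0.0], [-0.3, 0.2, 0.2, 0.1]],
    [[0.6, -0.4, 0.2], [0.1, 0.5, -0.2]],
]
biases = [[0.0, 0.1, -0.1], [0.05, 0.0]]

prediction = forward([0.9, 0.1, 0.4, 0.7], weights, biases)
```

    In the full system the output layer would carry one such prediction per pressure-applying element, so that a position and force can be read off for each finger, finger portion and palm portion.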

    [0047] Once a position and applied force for each pressure-applying element has been determined, a comparison can be made between the determined force and position and a force and position relating to a desired grip. For example, the user may determine that they desire a neutral golf grip, and the method 300 may determine that the user's grip is currently weak. The relative difference between the current and desired grip of the user may be determined.

    [0048] The next step includes cycling through the pressure-applying elements 340 to determine which of the pressure-applying elements, such as the user's fingers, are currently positioned incorrectly, or are applying an incorrect force, when compared to the user's desired grip. Cycling through the pressure-applying elements 340 includes determining whether a correct pressure is applied 345, determining whether each finger is correctly placed 350, and determining whether all fingers and hand portions have been analysed 355. Alternatively, the output may be related to a single grip parameter, such as a strong, neutral or weak golf grip, and may be calculated with a single pass with no cycling required. Determining whether the correct pressure is applied 345 and determining whether the fingers are correctly placed 350 may be carried out in any order. The step of determining whether a correct pressure is applied 345 may include determining whether an applied pressure is too high and/or too low.

    [0049] Each pressure-applying element is considered in turn. If the method 300 determines that the correct pressure is applied 345, the placement of the finger is then considered 350. If the method 300 determines that the finger is correctly placed 350, then a determination is made regarding whether all fingers have been analysed 355. The cycling through the pressure-applying elements 340 continues until a determination is made that all fingers have been analysed 355.
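    The cycling of steps 340 to 355 may be illustrated with the following minimal sketch. The element names, pressure units and tolerances are illustrative assumptions; only the control flow, checking pressure 345 and placement 350 for each element in turn, reflects the method described above.

```python
def check_grip(measured, desired, pressure_tol=0.1, position_tol=5.0):
    """Steps 340-355: cycle through each pressure-applying element and
    flag any whose pressure or placement deviates from the desired grip.
    Tolerance values here are illustrative, not from the disclosure."""
    corrections = []
    for name, target in desired.items():
        actual = measured[name]
        # Step 345: is the correct pressure applied (neither too high nor too low)?
        if abs(actual["pressure"] - target["pressure"]) > pressure_tol:
            direction = ("reduce" if actual["pressure"] > target["pressure"]
                         else "increase")
            corrections.append((name, f"{direction} pressure"))
        # Step 350: is the element correctly placed?
        if abs(actual["position_mm"] - target["position_mm"]) > position_tol:
            corrections.append((name, "reposition"))
    # Step 355 is implicit: the loop ends once every element is analysed.
    return corrections  # empty list -> no feedback required

desired = {
    "index": {"pressure": 0.5, "position_mm": 20.0},
    "thumb": {"pressure": 0.7, "position_mm": 35.0},
}
measured = {
    "index": {"pressure": 0.9, "position_mm": 21.0},   # pressing too hard
    "thumb": {"pressure": 0.68, "position_mm": 50.0},  # placed too far down
}
corrections = check_grip(measured, desired)
```

    Each returned correction could then drive the feedback device, for example one haptic pattern per flagged element.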

    [0050] If any pressure-applying element is determined to be providing an incorrect pressure, or is incorrectly positioned, the next step is to provide feedback 360. The provision of feedback 360 includes transmitting an activation command to a feedback device 365 and activating the feedback device 370 based on the activation command. The feedback device may then provide feedback to the user to adjust their grip such that they may achieve their predetermined desired grip. The method 300 continues to operate, from the step of collecting pressure data from the sensor array 315 to the step of activating the feedback device 370, to continually provide feedback related to the user's grip, which may be adjusted accordingly.

    [0051] In use, the array of pressure sensors 140 may be arranged on an object. A user may then grip the object in the region covered by the array of pressure sensors 140. Each sensor element 142, 144, 146 may provide a pressure value to the processor 110, which, via processes and methods described herein, is able to determine and output at least one grip quality indicator corresponding to the user's grip on the object. The user may then use the at least one grip quality indicator to adjust their grip. The process may then repeat to provide feedback related to the user's adjusted grip. For example, the array of pressure sensors 140 may be arranged on a golf club grip. A golfer may grip the golf club and address a golf ball. The system 100 may then determine that the golfer is gripping the golf club with a weak grip. However, the golfer may wish to use a neutral grip and adjust their grip accordingly. The system 100 may then reassess the golfer's grip and determine that the golfer is gripping the golf club with a strong grip, having adjusted their grip incorrectly. The golfer may continue to adjust their grip and receive feedback from the system 100 until they are gripping the golf club with their desired grip.

    [0052] The processor 110 shown in FIG. 1 may be adjacent to the array of pressure sensors 140, or remote from the array of pressure sensors 140. For example, the processor 110 and the array of pressure sensors 140 may both be positioned in the grip of a golf club. Alternatively, the array of pressure sensors 140 may be positioned in the grip of the golf club and the processor 110 may be positioned away from the golf club.

    [0053] Although the server 120 is described as being cloud-based, it is to be understood that the server 120 may be located alternatively, such as centrally on a private network or locally on a local area network. Furthermore, although a smart phone and a smart watch have been given as examples of a smart device 130, it is to be understood that the smart device 130 may be any device capable of communicating with the processor 110.

    [0054] The array of pressure sensors 140 may be arranged in a regular grid pattern. Alternatively, the array of pressure sensors 140 may be arranged in an irregular pattern. The array of pressure sensors 140 being configured to be arranged on an object to be gripped by a user may mean that the sensor elements 142, 144, 146 may be in, on or under a portion of the object. Furthermore, although the object has been described as a golf club, it is to be understood that any sporting equipment or other object may be used.

    [0055] The haptic feedback device 150 is described as being operable to vibrate, heat or cool to indicate to the user that their grip requires adjustment. However, other modes of operation are envisaged. In addition, when alternative feedback devices, such as visual or audible feedback devices, are provided, they may be operable to provide visual or audible feedback respectively.

    [0056] The methods shown in the flow diagrams 200, 300 of FIGS. 2 and 3 are not limited to the steps shown and described above. Additional, or alternative, steps may be undertaken.

    [0057] Although FIGS. 2 and 3 describe the use of a neural network, other models are envisaged, such as a random forest algorithm or a gradient boosted decision tree. Furthermore, although the further sensor is described as being glove mounted, other positions are envisaged, such as wrist mounted.

    [0058] Although FIG. 3 includes predicting both the hand position and force applied by the user to the object, only one of these parameters may be predicted and considered. Furthermore, although the pressure-applying elements are described as fingers, finger portions or palm portions, it is to be understood that the pressure-applying elements may be other items, human or non-human, such as portions of a robotic hand.

    [0059] In addition, although the feedback device 370 is said to provide feedback continually, it is to be understood that the feedback device 370 may provide feedback only once, a predetermined number of times, or intermittently over a period of time, such as the duration of a swing or hit.