AUGMENTED REALITY FOR DETECTING ATHLETIC FATIGUE
20210252339 · 2021-08-19
Assignee
Inventors
CPC classification
A61B5/0077
HUMAN NECESSITIES
G16H50/20
PHYSICS
A61B5/7264
HUMAN NECESSITIES
A63B2220/05
HUMAN NECESSITIES
A63B24/0062
HUMAN NECESSITIES
A61B5/7278
HUMAN NECESSITIES
A61B5/0022
HUMAN NECESSITIES
A61B5/02
HUMAN NECESSITIES
A61B5/318
HUMAN NECESSITIES
A63B2225/50
HUMAN NECESSITIES
A61B5/1123
HUMAN NECESSITIES
A63B2024/0068
HUMAN NECESSITIES
A61B5/746
HUMAN NECESSITIES
A61B2562/0219
HUMAN NECESSITIES
International classification
A63B24/00
HUMAN NECESSITIES
A63B71/06
HUMAN NECESSITIES
Abstract
An augmented reality system, and a method of using the same, for real-time assessment of athletic performance is described. The system includes a digital platform with a display, at least one camera, and a communications module. The system further includes a performance monitor carried by a garment and configured to communicate wirelessly with the digital platform, a logic engine, and an interactive user interface presenting real-time data and images of athletic performance. The real-time data and images include images obtained by the at least one camera and athletic performance data collected via the performance monitor. The augmented reality system provides a real-time augmented reality environment combining analysis of performance with live images of a subject of observation.
Claims
1. An augmented reality system for real-time assessment of an athletic performance, the system comprising: a digital platform including a display, at least one camera, and a communications module; a performance monitor carried by a garment and configured to communicate wirelessly with the digital platform; a logic engine; and an interactive user interface, presenting real-time data and images of the athletic performance in an augmented reality environment, the real-time data and images including— images obtained by the at least one camera; and athletic performance data received from the performance monitor.
2. The system of claim 1, wherein the interactive user interface further presents: historical performance data; and aggregated performance data.
3. The system of claim 2, wherein historical performance data comprises real-time data collected from an identified individual over a period of time.
4. The system of claim 2, wherein aggregated performance data comprises real-time data collected from a plurality of anonymized individuals.
5. The system of claim 4, further comprising at least one predictive model indicating a likelihood of one or more of fatigue or injury in correlation to real-time data.
6. The system of claim 1, wherein the performance monitor comprises: a plurality of sensors positioned in one or more regions of the garment configured to measure signals generated during the athletic performance; and a performance monitor controller, comprising: an onboard analytics module configured to receive and process signals from the plurality of sensors; and an onboard communications module in wireless communication with the digital platform.
7. The system of claim 6, wherein the performance monitor comprises sensors to measure orientation, acceleration, heart response, and muscle response.
8. The system of claim 1, wherein the logic engine comprises an implementation of machine learning.
9. The system of claim 1, wherein the augmented reality system further comprises a data storage system in communication with the digital platform and the performance monitor, the data storage system including an implementation of machine learning.
10. A method of assessing athletic performance in real-time through an augmented reality environment, the method comprising: selecting a subject of observation; identifying the subject of observation using a digital platform; presenting an augmented reality environment including an interactive user interface and data including— images of the subject of observation collected via a camera; and real-time data collected via a performance monitor; and receiving commands from a user via the interactive user interface, wherein the commands modify one or more of the interactive user interface, the operation of the performance monitor, the selection of the subject of observation, and the presentation of data.
11. The method of claim 10, wherein the data further comprise: historical performance data collected from the subject of observation; and aggregated performance data collected from multiple anonymized subjects.
12. The method of claim 10, further comprising: accessing real-time analytics provided by an external data storage system; and processing the real-time data using model-predictions of athletic performance.
13. The method of claim 10, further comprising: identifying multiple subjects engaging in simultaneous athletic performances; presenting one or more available subjects via the interactive user interface; and prompting a selection of one or more of the available subjects for observation in real-time.
14. The method of claim 10, further comprising: indicating, via a visual or auditory signal, when the subject of observation has a high likelihood of adverse outcome from athletic performance.
15. An augmented reality system for real-time assessment of a physical rehabilitation treatment, the system comprising: a digital platform including a display, at least one camera, and a communications module; a performance monitor carried by a garment and configured to communicate wirelessly with the digital platform; a logic engine; and an interactive user interface, presenting real-time data and images of the physical rehabilitation treatment in an augmented reality environment, the real-time data and images including— images obtained by the at least one camera; and physical rehabilitation treatment data received from the performance monitor.
16. The system of claim 15, wherein the interactive user interface further presents: historical performance data; and aggregated performance data.
17. The system of claim 16, wherein historical performance data comprises real-time data collected from an identified individual over a period of time.
18. The system of claim 16, wherein aggregated performance data comprises real-time data collected from a plurality of anonymized individuals.
19. The system of claim 16, further comprising at least one predictive model indicating a likelihood of one or more of fatigue or injury in correlation to real-time data.
20. The system of claim 15, wherein the performance monitor comprises: a plurality of sensors positioned in one or more regions of the garment configured to measure signals generated during the physical rehabilitation treatment; and a performance monitor controller, comprising: an onboard analytics module configured to receive and process signals from the plurality of sensors; and an onboard communications module in wireless communication with the digital platform.
21. The system of claim 20, wherein the performance monitor comprises sensors to measure orientation, acceleration, heart response, and muscle response.
22. The system of claim 15, wherein the logic engine comprises an implementation of machine learning.
23. The system of claim 15, wherein the augmented reality system further comprises a data storage system in communication with the digital platform and the performance monitor, the data storage system including an implementation of machine learning.
Description
DESCRIPTION OF THE DRAWINGS
[0032] The foregoing aspects and attendant advantages of the inventive technology will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
DETAILED DESCRIPTION
[0040] The following disclosure describes various embodiments of systems and associated methods for real-time assessment of athletic performance through an augmented reality environment. A person skilled in the art will understand that the technology may have additional embodiments, and that the technology may be practiced without several of the details of the embodiments described below with reference to the accompanying figures.
[0042] In some embodiments, the interactive user interface 112 includes an augmented reality display including real-time data 118a of the subject 140 while the subject 140 is engaging in physical activity. The real-time data 118a may include vertical position, lateral position, acceleration, orientation, etc., as well as bioelectrical information. The bioelectrical information may include muscle activity signals, heart-rate signals, etc., as described in more detail below.
[0043] In addition to the real-time data 118a, the interactive user interface 112 may present selected athletic performance data 118b, such as a personal-best metric or a record-setting metric, to compare the subject 140 with an external measure of activity. The selected performance data 118b may also include a range of values within which the subject 140 is less likely to sustain an injury while engaging in physical activity. In some embodiments, an implementation of machine learning determines the range of data values, as described in more detail below.
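The disclosure does not specify how a machine-learning implementation would determine such a range. As one purely illustrative sketch (the function `safe_range` and the percentile cut-offs are assumptions, not part of the disclosure), an empirical band can be derived from a subject's prior observations by trimming the extreme tails:

```python
def safe_range(samples, lower_pct=10.0, upper_pct=90.0):
    """Estimate an empirical 'safe' band for a performance metric
    using nearest-rank percentiles over prior observations
    (illustrative only; not from the disclosure)."""
    ordered = sorted(samples)
    n = len(ordered)
    if n == 0:
        raise ValueError("no samples")

    def pct(p):
        # Nearest-rank index into the sorted samples
        k = max(0, min(n - 1, round(p / 100.0 * (n - 1))))
        return ordered[k]

    return pct(lower_pct), pct(upper_pct)

# Hypothetical heart-rate samples (bpm) from a subject's earlier sessions
history = [112, 118, 121, 125, 129, 133, 137, 142, 150, 171]
lo, hi = safe_range(history)
```

A trained model could replace the fixed percentiles, for example by conditioning the band on the type of activity or on recent training load.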
[0044] In some embodiments, the data 118 includes historical performance data 118c, collected from the subject 140 over a given period of time, such as during a period of peak condition, or during a period preceding an injury. The interactive user interface 112 may display the historical performance data 118c alongside other data 118. The user 102 may select and modify data 118 as desired.
[0045] While the real-time data 118a is collected from the subject 140 directly, the data 118 may include aggregated performance data 118d collected from a number of anonymized subjects, subsequent to processing to provide useful indicators for the subject 140. For example, aggregated performance data 118d may provide correlations between various measured parameters of the real-time data 118a and likelihood of injury, such as asymmetric load on one hamstring 144, uneven exertion between two legs 142, etc.
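As an illustration of the kind of correlation mentioned above between measured parameters and injury likelihood, a simple left/right imbalance index over paired EMG readings might be computed as follows. The metric, the sample values, and the 0.15 threshold are hypothetical assumptions, not taken from the disclosure:

```python
def load_asymmetry(left, right):
    """Symmetric imbalance index in [0, 1): 0 means perfectly even
    exertion; values near 1 mean one side carries almost all load."""
    total = left + right
    if total <= 0:
        return 0.0
    return abs(left - right) / total

# Hypothetical EMG RMS amplitudes (arbitrary units), left/right hamstring
imbalance = load_asymmetry(0.42, 0.61)
at_risk = imbalance > 0.15  # hypothetical risk threshold
```

In practice such a threshold would come from the aggregated, anonymized data described above rather than from a fixed constant.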
[0046] To pair a subject 140 with data 118a-d, the subject's face 148 may be recognized by the digital platform 110 through facial recognition 160.
[0047] In some embodiments, the interactive user interface 112 allows a user 102 to manipulate the augmented reality environment by selecting the type of data 118 to be presented and the manner of its presentation in a way most favorable for the user 102.
[0049] Additionally or alternatively to the color-map 200a, the interactive user interface 112 may present the data 118 in other visual formats.
[0051] In some embodiments, the system can employ cloud learning that enables the subject 140, the user 102, and others to evaluate performance and compare it to that of other subjects, including anonymous subjects. The digital platform 110 may communicate with the controller 305 wirelessly via the wireless transceiver 335, which may include Bluetooth and RFID capabilities, as discussed in more detail below.
[0052] In general, the word “engine,” as used herein, refers to logic embodied in hardware or software instructions, which can be written in a programming language such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™, PYTHON, and/or the like. An engine may be compiled into an executable program or written in an interpreted programming language. Software engines may be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines or divided into sub-engines. The engines can be stored in any type of computer-readable medium or computer storage device, and can be stored on and executed by one or more general-purpose computers, thus creating a special-purpose computer configured to provide the engine or the functionality thereof.
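To make the notion of composable, callable engines concrete, the following toy sketch treats each engine as a callable that may delegate to sub-engines. All names here are invented for illustration; the disclosure contains no code:

```python
class Engine:
    """Minimal sketch of an 'engine': a callable logical module that
    can be composed with, or call, other engines (illustrative only)."""

    def __init__(self, fn, subengines=()):
        self.fn = fn
        self.subengines = list(subengines)

    def __call__(self, data):
        # Run each sub-engine in order, then this engine's own logic
        for sub in self.subengines:
            data = sub(data)
        return self.fn(data)

# Compose a smoothing sub-engine with a thresholding engine
smooth = Engine(lambda xs: [round(x, 1) for x in xs])
classify = Engine(lambda xs: [x > 0.5 for x in xs], subengines=[smooth])
result = classify([0.44, 0.67, 0.51])
```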
[0056] The ECG and EMG sensors 323a and 323b may include dry-surface electrodes distributed throughout the subject's clothing 345 and positioned to make the necessary skin contact beneath the clothing at predetermined locations on the body. In some embodiments, the sensors can include an optical detector, such as an optical sensor for measuring heart rate. The fit of the clothing can be selected to be sufficiently tight to provide continuous skin contact with the individual sensors 323a and 323b, allowing for accurate readings, while still maintaining a high level of comfort comparable to that of traditional compression-fit shirts, pants, and similar clothing. In various embodiments, the clothing 345 can be made from compressive-fit materials, such as polyester and other materials (e.g., Elastane), for increased comfort and functionality. In some embodiments, the controller 305 and the sensors 323 can have sufficient durability and water-resistance that they can be washed with the clothing 345 in a washing machine without damage. In these and other embodiments, the presence of the controller 305 and/or the sensors 323 within the clothing 345 may be virtually unnoticeable to the subject. In one aspect of the technology, the sensors 323 can be positioned on the subject's body without the use of tight, awkward-fitting sensor bands, which are typically uncomfortable and which subjects can be reluctant to wear.
[0057] The ECG sensors 323a can include right arm RA, left arm LA, and right leg RL (floating ground) sensors positioned on the subject's chest and waist. The EMG sensors 323b can be positioned adjacent to targeted muscle groups, such as the large muscle groups of the pectoralis major, rectus abdominis, quadriceps femoris, biceps, triceps, deltoids, gastrocnemius, hamstring, and latissimus dorsi. The EMG sensors 323b can also be coupled to a floating ground near the subject's waist or hip.
[0058] The orientation and acceleration sensors 323c and 323d can be disposed at a central position 349 between the athlete's shoulders, in the upper back region. In some embodiments, the central upper back region can be an optimal location for the orientation and acceleration sensors 323c and 323d because of the relatively small amount of muscle tissue in this region of the body, which prevents muscle movement from interfering with the accuracy of the orientation and acceleration readings. In other embodiments, the orientation sensor 323c and/or the acceleration sensor 323d can be positioned centrally on the user's chest, tail-bone, or other suitable locations on the body. In various embodiments, the orientation and acceleration sensors 323c and 323d can be positioned adjacent to the controller 305, or integrated into the same packaging (e.g., housing) 322 as the controller 305.
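As a hypothetical illustration of how a single torso-mounted sensor can yield orientation estimates, the standard accelerometer tilt equations recover pitch and roll from gravity when the subject is momentarily still. The function name and axis convention are assumptions for this sketch, not part of the disclosure:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    at rest, assuming gravity dominates the measurement and the
    sensor's z-axis points out of the wearer's back (illustrative)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Sensor flat on the upper back: gravity entirely on the z-axis
p, r = tilt_from_accel(0.0, 0.0, 1.0)
```

During vigorous motion, gravity no longer dominates, which is one reason a dedicated orientation sensor 323c may accompany the acceleration sensor 323d.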
[0059] In one aspect of this embodiment, the use of a single orientation sensor and a single acceleration sensor can reduce the computational complexity of the various analytics performed by the digital platform 110.
[0066] In some embodiments, the method 500 includes the digital platform 110 consulting a data storage system 540 that receives the information gathered in block 504. The data storage system 540, as previously described, can provide aggregated performance data 118d from multiple anonymous subjects and historical performance data 118c from the subject 140, as well as analytics and model-predictive adjustments of indicators of fatigue. In such cases, the method 500 displays the data 118 in the augmented reality environment as part of the interactive user interface 112.
[0067] In some embodiments, the interactive user interface 112 presents visual or auditory feedback to the user 450 when real-time data 118a indicates a high likelihood of an adverse outcome, as shown in block 516. For example, the performance monitor 300 may detect a bioelectric signal indicating a high likelihood of hamstring injury, based on model predictions, and the interactive user interface 112 may provide a blinking indicator over the relevant muscle group on the subject 440. For example, the digital platform 410 may provide feedback when the performance monitor 300 detects that a left hamstring is bearing an excess load, based on models of healthy and effective load balancing. In some embodiments, the user 450 designates values of real-time data 118a for which feedback will be provided. In some embodiments, the values for which feedback will be provided are designated automatically via a model-prediction, as previously described.
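A minimal sketch of the highlighting logic described above might flag each muscle group whose real-time reading falls outside its designated range. The function name, readings, and limits below are illustrative assumptions, not from the disclosure:

```python
def muscles_to_highlight(readings, limits):
    """Return the muscle groups whose real-time reading falls outside
    its designated (lo, hi) range, for overlay highlighting in the
    augmented reality view (illustrative only)."""
    flagged = []
    for muscle, value in readings.items():
        lo, hi = limits.get(muscle, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            flagged.append(muscle)
    return flagged

# Hypothetical normalized load readings and designated safe ranges
readings = {"left hamstring": 0.81, "right hamstring": 0.44}
limits = {"left hamstring": (0.0, 0.6), "right hamstring": (0.0, 0.6)}
alerts = muscles_to_highlight(readings, limits)
```

The returned list could drive the blinking overlay indicator, while the limits themselves would be set by the user 450 or by the model predictions described above.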
[0068] Many embodiments of the technology described above may take the form of computer- or controller-executable instructions, including routines executed by a programmable computer or controller. Those skilled in the relevant art will appreciate that the technology can be practiced on computer/controller systems other than those shown and described above. The technology can be embodied in a special-purpose computer, application specific integrated circuit (ASIC), controller or data processor that is specifically programmed, configured or constructed to perform one or more of the computer-executable instructions described above. Of course, any logic or algorithm described herein can be implemented in software or hardware, or a combination of software and hardware.
[0069] From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but that various modifications may be made without deviating from the disclosure. Moreover, while various advantages and features associated with certain embodiments have been described above in the context of those embodiments, other embodiments may also exhibit such advantages and/or features, and not all embodiments need necessarily exhibit such advantages and/or features to fall within the scope of the technology. Accordingly, the disclosure can encompass other embodiments not expressly shown or described herein.