ATHLETIC PERFORMANCE DATA ACQUISITION SYSTEMS, APPARATUS, AND METHODS
20170272703 · 2017-09-21
Inventors
Cpc classification
A63B2220/14
HUMAN NECESSITIES
A63B2225/50
HUMAN NECESSITIES
G06Q10/0639
PHYSICS
A63B24/0062
HUMAN NECESSITIES
A63B69/3605
HUMAN NECESSITIES
International classification
H04N7/18
ELECTRICITY
A63B24/00
HUMAN NECESSITIES
Abstract
Systems, apparatus, and methods for capturing data related to athletic performances. Some methods comprise displaying a site map for the performance and accepting a selection of a target for the performance. Various methods comprise estimating, based on the user location, the initial location of the object and recording data regarding the performance. Moreover, some methods also comprise using a satellite-based map as the site map. If desired, the estimating comprises sensing a pause in the changing user location. Some methods further comprise estimating a trajectory of the object using the initial location of the object and a portion of the trajectory. In some scenarios, methods comprise capturing a location of the object at the ending of the performance. The recording can further comprise capturing a video of the performance. Furthermore, these methods can comprise capturing subjective information regarding the performance.
Claims
1. A system for capturing athletic performance data comprising: a remote control configured to be carried by a user and further comprising a geo-location circuit configured to determine a geo-location of the remote control and a user input which when activated causes the remote control to transmit a signal indicative thereof and a signal indicative of the geo-location; and a controller to be in communication with the remote control and further comprising a processor and a memory storing processor executable instructions which when executed by the processor cause the processor to perform a method further comprising detecting the signal indicative of a user input activation and responsive thereto storing the geo-location in the memory and sending a signal indicating that video of the athletic performance should be captured.
2. The system of claim 1 further comprising a video capture device configured to receive the signal indicating that video should be captured.
3. The system of claim 2 wherein the video capture device is selected from the group consisting of a smart telephone, a computing device, and a device specifically constructed for the purpose described herein.
4. The system of claim 1 further comprising a pan and tilt unit and wherein the method further comprises causing the pan and tilt unit to point toward a location selected from the group consisting of the geo-location, a desired subject, and a user-selected location on the playing field.
5. The system of claim 4 wherein the method further comprises receiving a plurality of time stamped geo-locations from the remote control and analyzing the plurality of time stamped geo-locations to determine a particular geo-location at which the movement of the remote control paused for a pre-selected time.
6. The system of claim 5 wherein the method further comprises capturing video of an initial trajectory of a ball.
7. The system of claim 6 wherein the method further comprises receiving a geo location of the ball at the end of the trajectory of the ball.
8. The system of claim 7 wherein the method further comprises estimating the trajectory of the ball based on the geo location at which the remote control paused, the video of the initial trajectory of the ball, and the geo location of the ball at the end of the trajectory.
9. The system of claim 1 wherein the method further comprises accepting a user selected intended target for a ball.
10. The system of claim 1 wherein the method further comprises accepting a user selected intended shot shape for a ball.
11. A method for capturing athletic performance data comprising: detecting the signal indicative of a user input activation from a remote control configured to be carried by a user and further comprising a geo-location circuit configured to determine a geo-location of the remote control and a user input which when activated causes the remote control to transmit the signal indicative of a user activation thereof and a signal indicative of the geo-location; and responsive to the signal indicative of the user activating the user input storing the geo-location in a memory and sending a signal indicating that video of the athletic performance should be captured using a processor in communication with the remote control and memory.
12. The method of claim 11 further comprising receiving the signal indicating that video should be captured using a video capture device.
13. The method of claim 12 wherein the video capture device is selected from the group consisting of a smart telephone and a computing device.
14. The method of claim 11 further comprising causing a pan and tilt unit to point toward a location selected from the group consisting of the geo-location, a desired subject, and a user-selected location on the playing field.
15. The method of claim 14 further comprising receiving a plurality of time stamped geo-locations from the remote control and analyzing the plurality of time stamped geo-locations to determine a particular geo location at which the remote control paused for a pre-selected time.
16. The method of claim 14 further comprising capturing video of an initial trajectory of a ball.
17. The method of claim 16 further comprising receiving a geo-location of the ball at the end of the trajectory of the ball.
18. The method of claim 11 further comprising accepting a user selected intended target for a ball.
19. The method of claim 11 further comprising accepting a user selected intended shot shape for a ball.
20. The method of claim 11 further comprising accepting a user selected type of club for a shot.
21. The method of claim 11 further comprising accepting a user selected set of observed natural conditions.
22. The method of claim 11 further comprising determining one or more types of clubs and presenting the determined types of clubs to a user.
23. The method of claim 11 further comprising determining a type of shot played by the user during a shot.
24. The method of claim 11 further comprising presenting a visual cue to a user whereby the user can orient a video capture device in such a manner that the video capture device frames a ball.
25. The method of claim 11 further comprising accepting a user-input score for at least a hole.
26. The method of claim 11 further comprising determining a score of a user for at least one hole.
27. A system for capturing athletic performance data comprising: a remote control configured to be carried by a user and further comprising a geo-location circuit configured to determine a geo-location of the remote control and a user input which when activated causes the remote control to transmit a signal indicative thereof and a signal indicative of the geo-location; a controller to be in communication with the remote control and further comprising a processor and a memory storing processor executable instructions which when executed by the processor cause the processor to perform a method further comprising detecting the signal indicative of a user input activation and responsive thereto storing the geo-location in the memory and sending a signal indicating that video should be captured; a video capture device configured to receive the signal indicating that video should be captured wherein the video capture device is a smart telephone; and a pan and tilt unit and wherein the method further comprises causing the pan and tilt unit to point toward the geo-location; wherein the method further comprises receiving a plurality of time stamped geo-locations from the remote control and analyzing the plurality of time stamped geo-locations to determine a particular geo-location at which the remote control paused for a pre-selected time, capturing video of an initial trajectory of a ball, receiving a geo-location of the ball at the end of the trajectory of the ball, estimating the trajectory of the ball based on the geo-location at which the remote control paused, the video of the initial trajectory of the ball, and the geo-location of the ball at the end of the trajectory, accepting a user selected intended target for a ball, accepting a user selected intended shot shape for a ball, accepting user-selected and observed conditions for a given shot, determining a club to be recommended for a shot, outputting an indication of the recommended club, accepting an indication of a club selection for the shot, determining a type of shot played by the user during the shot, calculating a score for a particular hole, and accepting a user-input score.
Description
BRIEF DESCRIPTION OF THE FIGURES
[0042] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number usually identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
DETAILED DESCRIPTION
[0050] The current disclosure provides systems, apparatus, methods, etc. for capturing data of athletic performances of users and more specifically for capturing video recordings of golf swings, related subjective data, and data such as the initial location of the golf balls and the intended targets for those golf balls.
[0051]
[0052] With continuing reference to
[0053]
[0054]
[0055] A golf swing is a complex physical feat involving many motions that the player wants to coordinate. At one perhaps simplistic level, a golf swing can be modeled as a double pendulum with the shoulders (acting as a pivot point) and arms being one pendulum and the wrists (or thereabouts) and the hands/club 134 forming the other pendulum. These pendulums are attached to each other in a complex bio-mechanical system involving nearly all parts of the athlete's body, particularly considering that the player's legs, hips, torso, etc. generate a significant portion of the motive force powering the swing. Thus, all aspects of the player's body can come into play including the player's stance, alignment, grip, timing, etc. This is true to varying extents no matter the type of swing or shot involved: chip, pitch, putt, drive, etc. For instance, it is known that the timing of the "uncocking" of the wrists or the "release" (if performed at all) can have a significant effect on a drive.
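The double-pendulum view above can be sketched numerically. The following is a minimal kinematic sketch, not the disclosure's method: the arm and club are two rigid links with prescribed angles, and clubhead speed is obtained by finite differences. All lengths, rotation rates, and the release timing are illustrative assumptions.

```python
import math

def clubhead_pos(theta1, theta2, L1=0.7, L2=1.1):
    """Clubhead position for a planar double-pendulum swing model.

    theta1: arm-link angle from straight down (radians)
    theta2: wrist-cock angle between arm and club (radians, 0 = aligned)
    L1, L2: assumed arm and club lengths in metres (illustrative values)
    """
    x = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    y = -L1 * math.cos(theta1) - L2 * math.cos(theta1 + theta2)
    return x, y

def clubhead_speed(th1, th2, t, dt=1e-4):
    """Clubhead speed at time t via a central finite difference."""
    x0, y0 = clubhead_pos(th1(t - dt), th2(t - dt))
    x1, y1 = clubhead_pos(th1(t + dt), th2(t + dt))
    return math.hypot(x1 - x0, y1 - y0) / (2 * dt)

# Prescribed (illustrative) downswing: the arm sweeps at a constant rate
# while the wrists stay cocked, then "release" late in the downswing.
def theta1(t):
    return -math.pi / 2 + 4.7 * t             # arm rotates at 4.7 rad/s

def theta2(t):
    uncock = -math.pi / 2 + 12.0 * (t - 0.2)  # release starts at t = 0.2 s
    return max(-math.pi / 2, min(0.0, uncock))
```

With this prescribed motion, the speed during the late release phase markedly exceeds the speed while the wrists remain cocked, illustrating why release timing has such a large effect on a drive.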
[0056] With continuing reference to
[0057] And once the ball leaves the tee/club 134 further complexities affect the shot. For instance, the dimples present on many golf balls act to increase turbulence in the boundary layer of air around the ball and/or to hasten the onset of turbulence. Combined with the back spin imparted on many balls, the dimples create an effect often termed "Magnus lift" that literally causes the ball to lift higher than would otherwise be the case. Given the increased lift and given the decrease in horizontal velocity due to air resistance, golf balls typically do not follow ballistic, parabolic paths. Rather, they tend to follow "impetus trajectories."
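The non-ballistic "impetus trajectory" described above can be illustrated with a toy 2-D flight model in which quadratic drag slows the ball and a Magnus-lift term, perpendicular to the velocity, holds it aloft. The coefficients below are rough assumed values for a dimpled, back-spinning ball, not figures from the disclosure.

```python
import math

def simulate_flight(v0=70.0, launch_deg=11.0, drag=0.0047, lift=0.0028, dt=0.001):
    """Toy 2-D golf-ball flight with quadratic drag and Magnus lift.

    v0: ball speed off the club (m/s); drag/lift: per-metre coefficients
    roughly sized for a spinning golf ball (illustrative assumptions).
    Returns the flight path as a list of (x, y) pairs until touchdown.
    """
    g = 9.81
    vx = v0 * math.cos(math.radians(launch_deg))
    vy = v0 * math.sin(math.radians(launch_deg))
    x = y = 0.0
    path = [(x, y)]
    while y >= 0.0:
        v = math.hypot(vx, vy)
        ax = -drag * v * vx - lift * v * vy      # drag opposes the motion,
        ay = -g - drag * v * vy + lift * v * vx  # lift acts perpendicular to it
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        path.append((x, y))
    return path
```

Because the lift term decays with speed while gravity does not, the simulated path climbs long and shallow and then falls steeply: the apex lies well past the midpoint of the carry, unlike a parabola.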
[0058] However, to differing extents, hitting the ball correctly and consistently evades many players at many levels of play. As a result, the ball often leaves the club 134 with less (or more) energy than desired, less/more back spin than desired, at an unwanted angle, with unwanted (counter) clockwise spin, etc. The result is that the ball drops short, overshoots, and/or deviates from the intended trajectory by traveling either to the right or left therefrom. Colloquially, the ball hooks, slices, pulls, draws, fades, etc. That is not to say that these trajectories are necessarily bad. Indeed, at times a player might intentionally want to cause a ball to follow one of these trajectories to (for instance) avoid a hazard, counteract some environmental factor 114, obtain a desired lie, etc.
[0059] With reference again to
[0060]
[0061] Of course, the visual obstructions 324 and/or terrain 326 can complicate attempts to obtain video of the golfer 328. More particularly, the visual obstruction 324 can be any number of features typically found on golf courses and/or other sites for athletic performances. In some cases, a particular visual obstruction 324 might be a berm, a tree, shrubbery, other athletes, etc. As a game/round develops, particularly when the golf bag 304/monopod 306 are cart-mounted, these visual obstructions 324 can be found between the camera 312 and the golfer 328 from time to time. The monopod 306 and goose neck 308 can allow the golfer 328 to reposition/reorient the camera 312 to have a field of view of the golfer despite the visual obstructions 324.
[0062] On a somewhat different note, the terrain 326 might also, or in the alternative, complicate attempts to obtain video of the golfer 328. For instance, in situations in which the golf bag 304/monopod 306 are cart-mounted, the terrain 326 itself might become/be a visual obstruction 324. In other situations, the terrain 326 might impose limits on the positioning/orientation of a camera directly mounted to a cart. For instance, the cart might be restrained (by course rules) to remain on a paved path and/or the terrain 326 might tilt the cart (and camera 312) into a particular orientation. In the current embodiment, though, the height/length of the monopod 306 and the flexibility of the goose neck 308 or flex arm allows the camera 312 to see over/around many such visual obstructions 324 despite the terrain 326 on many courses, at many holes, on many fairways, near many hazards, etc. Indeed, the flex arm can also serve to allow the camera to be oriented such that it has a clear view despite being mounted on/in the golf bag. At this juncture, it might be helpful to now consider some aspects of system 300 in more detail.
[0063] For instance,
[0064] With continuing reference to
[0065] Yet, in part perhaps due to aspects further disclosed elsewhere herein, systems of the current embodiment provide better shot tracking capabilities than heretofore available. For instance, at least some machine vision based systems will track the divot of particular shots rather than the ball. Some other heretofore available systems will lose track of the ball in the sun (when sun-facing). In these (and/or other) situations, those heretofore available systems will track objects other than the ball. Systems of embodiments avoid these situations. Moreover, they do so without the cumbersome and/or expensive course instrumentation systems that ProTracer™ and other heretofore available systems require. Indeed, in accordance with embodiments, systems identify the starting and ending locations of the ball involved in particular play(s) and use them as mathematical "initial conditions" for curve fitting routines adapted to estimate the trajectories of the balls. These routines, programs, modules, etc., furthermore, use data regarding the trajectory of the ball captured in the initial frames of a given shot (in which machine vision capabilities in the system recognize and/or find the image of the ball) to estimate the trajectory of the ball. This estimated trajectory can then be played back after the fact for player review.
[0066] The remote control 314 illustrated by
[0067] In some embodiments, the system includes a machine vision application(s) for finding the image of the golfer in the incoming video stream. These machine vision applications can be based on artificial intelligence circuits such as neural networks, fuzzy logic, pattern recognition functions, inference engines, heuristic functions, etc. Moreover, they can be trained to find the image of the golfer and/or estimate the range from the camera to the golfer. With knowledge gained from the geo-positioning circuits and/or orientation circuits in the system, these machine vision applications (and/or other components of the system of the current embodiment) can determine the geo-location of the golfer and/or golf ball. Thus, the system can obtain a relatively accurate initial location for the golf ball.
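Given the camera's geo-position, its compass heading, and a machine-vision range estimate, the golfer's (or ball's) geo-location can be projected as the paragraph above describes. The sketch below uses a small-distance spherical approximation; the function name and the simplified projection are assumptions, not the disclosure's literal method.

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in metres

def project(lat, lon, bearing_deg, range_m):
    """Project a point `range_m` metres from (lat, lon) along `bearing_deg`.

    Here this turns the camera's geo-position, its heading toward the
    golfer, and a machine-vision range estimate into an approximate
    geo-location for the golfer/ball. Valid for short ranges only.
    """
    brg = math.radians(bearing_deg)
    dlat = (range_m * math.cos(brg)) / EARTH_R
    dlon = (range_m * math.sin(brg)) / (EARTH_R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)
```

For the ranges involved here (tens of metres from camera to tee), the flat-projection error is negligible relative to consumer GPS accuracy.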
[0068] In some embodiments, these functions can also be trained to recognize the golf club or perhaps its shaft and/or head. More specifically, the linear shape of the shaft can stand out from the background of the image, which will typically be vegetation, an audience, the sky, the ground, etc. The often metallic composition of the shaft might also lead to a sharp color contrast with the background. Such features of the club can therefore enable machine vision recognition of the shaft. Similarly, the machine vision functions can be trained to recognize the limited number of club head shapes and/or colors to enable their identification. Further still, these functions can be trained to identify the motion of the club associated with typical drives, chip shots, putts, etc. Thus, the machine vision circuits could be trained to recognize swings and to "mark" the location of the ball at the bottom of the downstroke and/or the location of the golfer (with or without some offset).
[0069]
[0070]
[0071] The mobile device 402 comprises, hosts, houses, etc. the mobile application 410, the camera 411, the operating system 412, the geo-positioning application 414, the WiFi application 416, the graphical user interface (GUI) 418, and the orientation application 420, as well as other features. Indeed, many of these features can be native/built into the mobile device 402. These features are supported by various circuits, IC (integrated circuit) chips, sensors, etc. as those skilled in the art will appreciate. For instance, the camera 411 can be an onboard camera 411 built into the mobile device 402 by its manufacturer. Of course, it could be an after-market modification, addition, etc. As such, it might operate at frame rates and/or with resolution/clarity relatively poorer than the high frame rate/high resolution capabilities of many heretofore available cameras used to capture video of golf ball flights. That is not to say that these cameras necessarily have relatively poor performance. But that can be the case.
[0072] As
[0073] Regarding the WiFi application 416, in the current embodiment, it allows the mobile device 402 and applications resident thereon to communicate with other WiFi devices within range of the mobile device 402. More specifically, the WiFi application 416 allows the mobile device 402 to communicate with the remote control 404. Thus, via the WiFi application, the mobile application 410 can exchange data with the remote control application 424 which can be resident on the remote control 404. Of course, BlueTooth® and/or other communications technologies could be used instead of or in addition to WiFi technology.
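As an illustration of the kind of payload the remote control 404 might exchange with the mobile application 410 over the WiFi (or BlueTooth®) link, the sketch below encodes a time-stamped geo-position and button state as JSON. The field names and the choice of JSON framing are assumptions for illustration only.

```python
import json
import time

def encode_geo_message(device_id, lat, lon, button_pressed=False):
    """Encode a remote-control status message as a JSON string.

    device_id lets one controller distinguish multiple remote controls
    (each with a unique identifier). Field names are assumed, not taken
    from the disclosure.
    """
    return json.dumps({
        "id": device_id,
        "t": time.time(),       # timestamp for the geo-position array
        "lat": lat,
        "lon": lon,
        "pressed": button_pressed,
    })

def decode_geo_message(raw):
    """Decode a message back into (id, timestamp, (lat, lon), pressed)."""
    msg = json.loads(raw)
    return msg["id"], msg["t"], (msg["lat"], msg["lon"]), msg["pressed"]
```

The receiving side can append each decoded (timestamp, position) pair to the time-stamped array of geo-positions discussed elsewhere herein.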
[0074] With continuing reference to
[0075] The orientation application 420 can be configured to determine the orientation of the mobile device 402. For instance, it can sense accelerometers, compasses, etc. in the mobile device 402 to determine in which direction it points (for instance, north, east, south, west, and/or “points on the compass” there between). It can also inform the mobile application 410 whether the mobile device 402 is pointing up/down and/or to what degree.
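As a small illustration of reporting the "points on the compass" mentioned above, a heading in degrees can be bucketed to the nearest named point. The eight-point resolution is an assumption; finer tables work the same way.

```python
def compass_point(heading_deg,
                  points=("N", "NE", "E", "SE", "S", "SW", "W", "NW")):
    """Map a compass heading in degrees to the nearest named point.

    Headings wrap modulo 360, so -10 and 350 both map to "N".
    """
    idx = int((heading_deg % 360) / 45.0 + 0.5) % len(points)
    return points[idx]
```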
[0076] Furthermore, the pan/tilt unit 422 can be mechanically and/or rigidly coupled to the mobile device 402 and an object likely to remain relatively near the player as the player traverses the golf course (or other terrain). It can therefore cause the mobile device to pan and/or tilt when it reorients itself in a corresponding manner. Furthermore, it can be configured to receive pan/tilt commands from the mobile application 410 and to respond thereto with/without feedback of its orientation to the same.
[0077] As to the mobile application 410, it controls the other features of the mobile device thereby allowing users to capture objective and subjective data regarding the user's play. And, in doing so, it often communicates with and/or relies on the operating system 412 to deliver data to it and/or to execute its commands. For instance, the mobile application communicates with the geo-positioning application 414 through the operating system 412 to obtain the geo-position of the mobile device 402. Further still, it communicates with the orientation application 420 to obtain data regarding the orientation of the mobile device 402 and/or camera 411. It also communicates with the WiFi application 416 to send commands/data to the remote control 404 and to receive commands/data therefrom. Furthermore, in the current embodiment, it communicates with the GUI 418 to exchange commands/data with the user and it communicates with the pan/tilt unit to exchange commands/data therewith.
[0078]
[0079] With reference still to
[0080]
[0081] Often, golfers 328 will carefully consider the lay of the ball 318, the terrain 326, the location of the hole, etc. and select a target location for the shot. Moreover, players sometimes consider the potential presence of visual obstructions 324, other obstructions, hazards, etc. and, based thereon (and using their experience, knowledge, etc. of the game/course), will select a shot shape for the shot. For instance, should an obstruction like a tree block a direct trajectory to the hole/target, the golfer 328 might aim the ball to one side of the obstruction and plan the shot such that the trajectory of the ball will take it around the obstruction and then cause it to come back toward the intended target. Reference 503 illustrates aspects of such considerations.
[0082] The intended target, shot shape, etc. represent subjective information about the shot not captured by heretofore available systems. In contrast, the mobile application 410 can be configured to allow the golfer 328 to enter such subjective information via the GUI 418 for storage in memory and/or subsequent review. See reference 504.
[0083] With continuing reference to
[0084] Moreover, the golfer 328 will typically be carrying the selected club 320 with a particular RFID circuit 322 attached thereto. That RFID circuit 322 will be transmitting a code identifying the club being carried and of course other RFID circuits on the other golf clubs 320 might be transmitting their codes as well. Furthermore, as the golfer 328 approaches the tee 316, the signals from those other RFID circuits 322 will decrease as the distance between them and the remote control (with its RFID transponder) increases. The signal strength from the RFID circuit on the carried golf club 320 will therefore, in most situations, remain roughly constant and/or begin exceeding the signal strengths from the other RFID circuits 322 (see reference 507).
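The signal-strength behavior described above suggests a simple heuristic for identifying the carried club: the carried club's RSSI stays strong and steady while the bagged clubs' signals fall off. The scoring rule below (recent mean strength minus overall spread) is an assumed sketch, not the literal method of the disclosure.

```python
def identify_carried_club(rssi_history):
    """Pick the club whose RFID signal stays strongest and steadiest.

    rssi_history maps a club id to a list of RSSI samples (dBm) taken as
    the golfer walks toward the tee. Score each club by its most recent
    mean strength minus its variability over the walk, so a club whose
    signal fades (still in the bag) scores poorly even if it started strong.
    """
    def score(samples):
        recent = samples[-3:]                   # latest few readings
        mean = sum(recent) / len(recent)
        spread = max(samples) - min(samples)    # fading signals spread widely
        return mean - spread

    return max(rssi_history, key=lambda club: score(rssi_history[club]))
```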
[0085] In addition, or in the alternative, the remote control 314 will be transmitting its geo-position to the mobile device 310. The mobile application 410 therein will, in embodiments, be storing a time stamped array of these geo-positions. See reference 508. The mobile application 410 can be communicating with the geo-positioning application 414 of the mobile device 402 to determine the location of the camera 411. It can also be communicating with the orientation application 420 of the mobile device 402 to determine the orientation of the mobile device 402 (and camera 411). Thus, the mobile application 410 can know the camera's location, the remote control's location, and the orientation of the camera 411. Since the golfer 328 will often be carrying the remote control 404, the mobile application 410 can equate the location of the remote control 404 with that of the golfer 328. As a result, the mobile application 410 can determine how to re-orient the camera 411 to frame the player and send corresponding commands to the pan/tilt unit 422. See reference 512.
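The re-orientation computation described above amounts to taking the bearing from the camera's geo-location to the remote control's geo-location and differencing it against the camera's current heading. A sketch of the pan component follows (tilt handling omitted); the signed-angle convention is an assumption.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def pan_command(camera_heading_deg, camera_pos, remote_pos):
    """Signed pan angle (degrees, -180..180) that turns the camera from
    its current heading toward the remote control's geo-location."""
    target = bearing_deg(camera_pos[0], camera_pos[1],
                         remote_pos[0], remote_pos[1])
    return (target - camera_heading_deg + 180) % 360 - 180
```

The resulting angle can be sent to the pan/tilt unit 422, with or without orientation feedback closing the loop.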
[0086] At some time before/after the shot, the golfer 328 can enter subjective data regarding the shot via the GUI 418. More specifically, the player can enter subjective data such as the intended target location, the intended shot shape, their confidence/uncertainty level, notes regarding the wind and/or other environmental data, etc.
[0087] Moreover, method 500 in accordance with embodiments, includes the golfer 328 lining up on the shot as illustrated at reference 516. Usually, this activity involves the player positioning themselves with respect to the tee 316, adjusting their grip on the golf club 320, positioning and/or orienting the golf club 320 with respect to the ball 318, etc. These activities take time and as a result the geo-position of the remote control 314 is likely to remain nearly stationary/constant for some time. Indeed, it is not unusual for a golfer 328 to spend 15 seconds or so “at the tee 316.” These, and/or other inputs, can be gathered via touch screens, soft controls, cursors, keyboards, trackballs, etc. See reference 514.
[0088] Of course, video capture might be ongoing during the foregoing activities and/or at other times. Thus, objective data regarding the shot (for instance, the video data) can be captured, stored, etc. as reference 516 indicates. Other objective data can include the location of the remote control 404/golfer 328, the location and/or orientation of the camera 411, etc. Of course, as reference 518 indicates, the golfer 328 will most often swing thereby propelling the ball with a drive, chip shot, putt, etc.
[0089] The location of the ball 318 can also be determined and stored as objective information. More specifically, the mobile application 410 can examine the time-stamped array of geo-positions for the remote control 404 and determine a relatively accurate approximation of the location of the ball 318. More specifically, the mobile application 410 can take advantage of characteristics of typical golfer 328 behavior. For instance, the mobile application 410 can examine the time-stamped geo-position data (for the remote control) and identify relatively lengthy pauses wherein these positions are approximately equal. Should there be more than one such pause, the mobile application can further examine the positions to determine the last such pause before the positions suddenly change and/or head back toward the mobile device 402. In other words, the mobile application can determine when the player has paused over the ball 318 and/or when the golfer 328 turns and heads back toward their golf bag 304/the mobile device 402. Since the golfer 328 and remote control 404 are in close proximity to the ball 318 at that time, the mobile application can use the remote control's location as a close enough approximation of the location of the ball 318 at the beginning of the shot. Reference 520, accordingly, indicates the mobile application 410 determining the starting location for the ball 318.
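The pause-detection idea described in this paragraph can be sketched as follows: scan the time-stamped track for the last window during which the remote control stayed nearly still long enough, and take its centre as the ball's address location. The thresholds (a 10-second pause, a 2-metre radius) and the local metre grid are illustrative assumptions, not values from the disclosure.

```python
def ball_address_location(track, pause_s=10.0, radius_m=2.0):
    """Estimate where the ball was addressed from a time-stamped track.

    track: list of (t_seconds, x_m, y_m) remote-control positions in a
    local metre grid, in time order. Returns the centre of the LAST pause:
    the last window of at least `pause_s` seconds during which every
    position stays within `radius_m` of the window's first point, matching
    the "last pause before heading back to the bag" behavior.
    """
    best = None
    i = 0
    while i < len(track):
        t0, x0, y0 = track[i]
        j = i
        # grow the window while positions remain near the window's start
        while (j + 1 < len(track)
               and ((track[j + 1][1] - x0) ** 2
                    + (track[j + 1][2] - y0) ** 2) <= radius_m ** 2):
            j += 1
        if track[j][0] - t0 >= pause_s:
            pts = track[i:j + 1]
            best = (sum(p[1] for p in pts) / len(pts),
                    sum(p[2] for p in pts) / len(pts))
        i = j + 1
    return best
```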
[0090] As play proceeds, the golfer 328 will eventually proceed to the lay of the ball 318 which they previously hit. See reference 522. Moreover, at some point, the golfer 328 can enter additional/new subjective data regarding the shot. For instance, the player could enter an indication of how well they felt that they executed the shot via the GUI 418. This information can be considered as additional subjective data regarding the shot, swing, play, performance, etc. Such activities are indicated at reference 524.
[0091] Furthermore, method 500 of
[0092] As a result, the mobile application 410 has access to three pieces of objective data that enable it to estimate the trajectory that the ball 318 took/followed during the shot far better and with fewer processing (and related) resources than heretofore possible. Indeed, the mobile application has reasonably good estimates of both the starting and ending locations of the ball 318 as well as many video frames of the ball as it is hit, leaves the tee 316, and begins to respond to the various phenomena that affect its trajectory. In other words, the mobile application of embodiments has an initial condition, an ending condition, and sample data for a portion of the trajectory from which it can estimate the ball's trajectory. The estimation can be by way of a machine vision application determining the 3-dimensional locations of the ball in each (or some) of the video frames and a curve fitting application using the resulting data to estimate the trajectory. See reference 532.
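The curve-fitting step can be sketched with a least-squares fit that uses the start and end locations as constraints alongside the early video samples. The example below fits a simple quadratic height profile in pure Python via the normal equations; a real implementation would fit a 3-D aerodynamic model rather than a parabola, so treat this purely as an illustration of the initial/ending-condition idea.

```python
def fit_trajectory(start, end, early_samples):
    """Least-squares fit of h(x) = a*x^2 + b*x + c through the known start
    and end ground positions plus early-frame samples.

    start, end: (x, h) pairs with h = 0 on the ground; early_samples: list
    of (x, h) points recovered from the first video frames. Returns a
    callable height profile. (An illustrative sketch only.)
    """
    pts = [start, end] + list(early_samples)
    # Normal equations for [a, b, c] minimising sum (a x^2 + b x + c - h)^2
    s = [sum(x ** k for x, _ in pts) for k in range(5)]
    rhs = [sum(h * x ** k for x, h in pts) for k in range(3)]
    A = [[s[4], s[3], s[2]],
         [s[3], s[2], s[1]],
         [s[2], s[1], s[0]]]
    b = [rhs[2], rhs[1], rhs[0]]
    # Gaussian elimination with partial pivoting on the 3x3 system
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c2 in range(col, 3):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back substitution
        coeffs[r] = (b[r] - sum(A[r][c2] * coeffs[c2]
                                for c2 in range(r + 1, 3))) / A[r][r]
    a, b2, c = coeffs
    return lambda x: a * x * x + b2 * x + c
```

Because the start and end points pin both ends of the curve, only a few early samples are needed to constrain the fit, which is why this approach needs so little of the flight on video.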
[0093] Thus, the system 300/400 can capture both subjective and objective data regarding particular shots. Of course, it can also capture such information regarding an entire game for a particular player, a particular game for a group of players (with each remote control 404 possessing/transmitting a unique identifier), sets of games for players, etc. Moreover, it can store this information and allow it to be played back on the mobile device 402 or devices which receive the information it captures/generates. More specifically, as indicated at reference 536, the player can replay a shot, game, etc. as the mobile application 410 (or related applications) maps it to the hole, course, etc. on which the play occurred. This re-play can include displaying the:
[0094] Information regarding where the play occurred (for instance, identifying the course, hole, location on the fairway, etc.);
[0095] Subjective data captured regarding the play (for instance, the intended target and/or the intended shot shape); and
[0096] Objective information regarding the play (for instance, the starting location of the ball 318 or golfer 328, the ending location of the ball 318, the captured video of the player's swing, the video of the ball as it leaves the tee, the estimated trajectory, etc.).
[0097] Moreover, if desired, method 500 can be repeated in whole or in part as indicated at reference 540. In accordance with some embodiments, method 500 can even be performed in a different order than illustrated by
[0098]
[0099] The GUI 600 can display the satellite map 602 obtained from any available mapping service via the geo-positioning application 414 or WiFi application 416. In some embodiments, the GUI 600 displays a schematic and/or simplified map of the play instead of or in addition to a photographic map of the play. In the current embodiment, the satellite map 602 happens to show an entire hole from the teeing ground 626 to the cup 632 as well as some of the surrounding terrain. Thus, it shows the rough 623, various hazards 628, various obstructions 630, the tee 633 location, etc. as well. Thus, the golfer 328 can see where they played the particular hole shown.
[0100] Moreover, as
[0101] During the currently displayed play 631, the golfer 328 appears to have then chosen to lay the ball up on the green as indicated by greens target 640. The golfer 328 then took a chip shot as indicated by the chip shot trajectory 643 and chip shot lay 646. Again, the greens target 640 is shown for player evaluation purposes and/or for other purposes. GUI 600 also displays the greens lay 648 and the putt path 650 that scored the hole for that player. In regard to tracking the ball, the putt path can be considered to be somewhat similar to the various ball trajectories as disclosed further elsewhere herein. Note that, in some embodiments, the GUI 600 can display the play 631 of more than one golfer 328. Again, the golfers 328 can use these plays 631 as displayed to evaluate each other's performance, for enjoyment, etc.
[0102] With continuing reference to
[0103] In contrast, the range control 608 allows a user to request various data related to the range (or distance) to various points on the hole. More specifically, activating that control can cause the GUI 600 to display, update, estimate, etc. the range to the cup 610, the range to the (current) target 612, the range difference 614 there between, etc.
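The ranges behind the range control 608 reduce to great-circle distances between two geo-locations; a standard haversine sketch:

```python
import math

def range_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula), e.g. from the
    ball's geo-location to the cup 610 or to the current target 612."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))
```

The range difference 614 is then simply the difference between two such ranges (for instance, range to the cup minus range to the current target).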
[0104] With ongoing reference to
[0105] Furthermore, GUI 600 of the current embodiment displays various information that might be of interest to the golfer 328 during and/or after the play 631. For instance, GUI 600 can display data regarding where each shot took place. That location data 618 can include the country, state, city, golf course, hole, position on the hole, etc. if desired. Weather data 620 such as temperature, humidity, visibility, wind speed, wind direction, wind variability, etc. can be displayed by GUI 600. The weather data 620 can be obtained from weather forecasting/reporting services while the location and related data (e.g. elevation) can be obtained from various geo-positioning services. Moreover, while a shot/play 631 is being played back from memory, such location and weather data 618 and 620 can be displayed if desired.
[0106]
[0107] At this juncture a few words might be in order about the computer(s) 706 and/or other systems, apparatus, etc. used to design, store, host, recall, display, transmit, receive, etc. programs, applications, controllers, algorithms, routines, codes, GUIs, etc. of embodiments. The type of computer 706 used for such purposes does not limit the scope of the disclosure but certainly includes those now known as well as those which will arise in the future. But usually, these computers 706 will include some type of display 708, keyboard 710, interface 712, processor 714, memory 716, and bus 718.
[0108] Indeed, any type of human-machine interface (as illustrated by display 708 and keyboard 710) will do so long as it allows some or all of the human interactions with the computer 706 as disclosed elsewhere herein. Similarly, the interface 712 can be a network interface card (NIC), an RFID transceiver, a WiFi transceiver, an Ethernet interface, etc. allowing various components of computer 706 to communicate with each other and/or other devices. The computer 706, though, could be a stand-alone device without departing from the scope of the current disclosure.
[0109] Moreover, while
[0110] Again with reference to
[0111] Thus, systems have been provided which allow users to capture video of their athletic performances in convenient, user-friendly, hands-free manners. Moreover, systems of embodiments estimate the trajectory of the balls, pucks, and/or other projectiles used in these athletic performances with more accuracy while using fewer computing resources than heretofore possible. Systems of embodiments also capture objective data (such as video) and user observed data, as well as subjective information thereby providing additional capabilities and a holistic capturing of the athletic performances. As a result, users can review an integrated set of data regarding their performance and, hopefully, improve their game to a greater degree and in less time than heretofore possible.
CONCLUSION
[0112] Although the subject matter has been disclosed in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts disclosed above. Rather, the specific features and acts described herein are disclosed as illustrative implementations of the claims.