Sports events broadcasting systems and methods
11130019 · 2021-09-28
Assignee
Inventors
CPC classification
H04N21/21805
ELECTRICITY
A63B2220/833
HUMAN NECESSITIES
H04N5/262
ELECTRICITY
H04N21/23418
ELECTRICITY
H04N21/235
ELECTRICITY
H04N23/90
ELECTRICITY
International classification
A63B24/00
HUMAN NECESSITIES
Abstract
A system and method for recording and broadcasting motion of a sports ball and individual players includes a sports ball with cameras embedded within for recording trajectory and locomotion, a sensor module for storing video footage and telemetry metadata, and cameras mounted on the sports equipment of individual players. The sensor module includes an Inertial Measurement Unit, a transceiver, a memory, a power source, and a processor, all operatively connected to one another. The sports ball sends data to a wireless data transmission grid mounted under a sports pitch and/or to antennas for transfer to a data processing server, which determines a real ball direction that is thereafter sent to a stadium camera system that generates further 360 degree action-focused broadcasting data. The data is sent to the processing server, which processes the raw footage received from the sports ball and individual players to produce clean footage for various applications.
Claims
1. A system for recording and broadcasting during an athletic activity, the system comprising: one or more image capturing devices mounted within a sports ball and recording raw video footage from the perspective of said sports ball; a sensor module mounted within the sports ball for receiving the raw video footage of said sports ball, generating telemetry metadata of said sports ball, and wirelessly transmitting data; one or more image capturing devices mounted on sports equipment wearable by individual players for providing compensatory raw video data for processed video footage; a wireless transceiver hub communicatively connected to said sports ball and image capturing devices on the sports equipment worn by individual players to receive data for upload to a data processing server for subsequent processing and broadcasting; a data processing server communicatively connected to the wireless transceiver hub for processing uploaded data and producing directional data that is sent to a stadium camera system; and a stadium camera system communicatively connected to the data processing server that uses said directional data sent by said data processing server to obtain real-time 360 degree action-focused broadcast data that is sent back to the data processing server, the stadium camera system comprising a plurality of action-focused cameras that, based on the directional data, generate action coverage covering 360 degrees field of view around a spherical focus zone determined by the location of the sports ball, the location of one or more individual players, or a combination thereof, wherein the spherical focus zone comprises a dynamic spherical focus zone having a variable diameter that increases or decreases depending on the action taking place; wherein the data processing server applies camera footage selection and discarding rules and data-broadcasting triggering rules to synthesize and filter the received footage from the sports ball, individual players, and stadium camera system and reconstruct scenes for broadcasting to an audience.
2. The system of claim 1, wherein the wireless transceiver hub comprises a wireless sports pitch transmission grid, transceivers located around a sports pitch, or a combination of a transmission grid and transceivers located around the sports pitch, and wherein the wireless transceiver hub comprises mmW-based communication systems, a combination of mmW-based and sub 6 GHz-based communication systems, or wireless local area networking (WiFi) systems.
3. The system of claim 1, wherein said plurality of action-focused cameras auto-compensate and regulate rotation, focus, and zoom.
4. The system of claim 3, wherein the stadium camera system further comprises Light Detection and Ranging (LIDAR) devices configured to provide precise distance and depth information of action taking place on said sports pitch.
5. The system of claim 3, wherein the stadium camera system further includes electroacoustic transducers, including microphones and loudspeakers, respectively configured to record and reproduce sound data originating from the action taking place on said sports pitch.
6. The system of claim 1, wherein processed footage for broadcasting includes image data, 3D geometries, video data, textual data, haptic data, audio data, or a combination thereof.
7. The system of claim 1, wherein one or more layers of soft polymer material are mounted on top of camera lenses of the image capturing devices for providing damage and impact protection to said camera lenses.
8. The system of claim 1, wherein the sensor module further comprises: an Inertial Measurement Unit (IMU); one or more millimeter wave (mmW) transceivers; a memory; a power source; and a processor.
9. The system of claim 8, wherein said IMU measures and reports velocity, acceleration, and angular momentum of said sports ball using a combination of one or more accelerometers and one or more gyroscopes.
10. The system of claim 9, wherein said one or more gyroscopes are configured to maintain rotation of the sensor module independent of rotation of the sports ball.
11. The system of claim 8, wherein said one or more mmW transceivers enable positional tracking of the sports ball.
12. The system of claim 8, wherein said one or more mmW transceivers are configured to allow the sports ball to upload raw data to the data processing server via the wireless transceiver hub, said raw data comprising: raw video footage captured from the perspective of the sports ball; and the telemetry metadata of the sports ball.
13. The system of claim 8, wherein said memory is adapted to store data comprising: application program instructions; the telemetry metadata of the sports ball; and raw video footage taken by the plurality of image capturing devices mounted in the sports ball.
14. A method for processing and synthesizing video footage performed by a data processing server, the method comprising: obtaining raw video data from image capturing devices mounted in a sports ball and on individual players along with telemetry metadata, wherein at least one image capturing device mounted in the sports ball is configured to capture raw video data from the perspective of the sports ball; determining a real ball direction within world space; sending directional data to a stadium camera system that, based on the directional data, auto-compensates and regulates rotation, focus, and zoom of cameras in order to generate uniform action coverage covering 360 degrees field of view around a spherical focus zone determined by the location of the sports ball, the location of one or more individual players, or combinations thereof, wherein the spherical focus zone comprises a dynamic spherical focus zone having a variable diameter that increases or decreases depending on the action taking place; obtaining 360 degree action-focused broadcast data from the stadium camera system; applying camera footage selection/discarding rules; applying data-broadcast triggering rules; and processing raw data received from the sports ball, sports equipment of individual players and the 360 degree action-focused broadcast data to reconstruct scenes for broadcasting to an audience.
15. The method of claim 14, wherein determining a real ball direction within world space further comprises: detecting a movement of the sports ball; determining an initial spatial orientation of the sports ball; determining a change in the spatial orientation of the sports ball; determining telemetry metadata of the sports ball; and utilizing the telemetry metadata in conjunction with high speed video footage, initial spatial orientation, and changes in spatial orientation of the sports ball for obtaining the real ball direction in world space.
16. The method of claim 14, wherein camera footage selection/discarding rules include: rules for selecting footage from cameras recording images to add to a final video footage to be broadcast to an audience; and rules for discarding footage from cameras recording images to omit from a final video footage to be broadcast to an audience.
17. The method of claim 16, wherein footage may be selected from one camera within ball space for a given set of video frames.
18. The method of claim 16, wherein more than one set of footage recorded simultaneously by more than one camera from different locations and angles within ball space may be selected for a given set of video frames.
19. A system comprising: a sensor module mounted within a sports ball for generating telemetry metadata of said sports ball and wirelessly transmitting data; one or more image capturing devices mounted on sports equipment wearable by individual players for providing raw video data for processed video footage; a wireless transceiver hub communicatively connected to the sports ball and image capturing devices on the sports equipment worn by individual players to receive data for upload for subsequent processing and broadcasting; a data processing server communicatively connected to the wireless transceiver hub for processing uploaded data and producing directional data; and a stadium camera system communicatively connected to the data processing server that uses said directional data to obtain real-time 360 degree action-focused broadcast data that is sent back to the data processing server, the stadium camera system comprising a plurality of action-focused cameras that, based on directional data sent by the data processing server, auto-compensate and regulate rotation, focus, and zoom of cameras in order to generate action coverage covering 360 degrees field of view around a spherical focus zone determined by the location of the sports ball, individual players, or combinations thereof, wherein the spherical focus zone comprises a dynamic spherical focus zone having a variable diameter that increases or decreases depending on the action taking place; wherein the data processing server applies camera footage selection and discarding rules and data-broadcasting triggering rules to process received footage from the image capturing devices mounted on sports equipment wearable by individual players and the stadium camera system to generate experiences in augmented, virtual or mixed reality for interaction with users via user devices.
20. The system of claim 19, wherein the augmented, virtual or mixed reality experiences based on the events of a stadium are shared with one or more cloud servers to simulate and broadcast the sports event to at least one other stadium.
Description
DESCRIPTION OF THE DRAWINGS
(1) The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
DETAILED DESCRIPTION
(19) In the following description, reference is made to drawings which show by way of illustration various embodiments. Also, various embodiments will be described below by referring to several examples. It is to be understood that the embodiments may include changes in design and structure without departing from the scope of the claimed subject matter.
(21) Raw data 112, which includes raw video footage captured by the image capturing devices 110 mounted in the sports ball 104 and on individual players 106, in addition to the telemetry metadata captured by the sensor module in the sports ball 104, are transmitted to a wireless transceiver hub, which may include, e.g., a wireless pitch data transmission grid 114 mounted under the sports pitch and/or antennas 116 mounted around the sports pitch, which in turn transmit the raw data 112 to a data processing server 118 for analysis and processing.
(22) After obtaining the raw video data and telemetry metadata, the data processing server 118 may proceed by determining a real ball direction within world space. Information of the real ball direction within world space along with telemetry metadata sent by the sensor module included in the image capturing devices 110 on sports equipment 108 worn by individual players 106 provides a stadium camera system 120 with directional data 122 that action-focused cameras 124 included in the stadium camera system 120 may focus on. The stadium camera system 120 may send real-time 360 degree action-focused broadcast data 126 back to the data processing server 118 for further processing.
(23) In some embodiments, the action-focused cameras 124 include Light Detection and Ranging (LIDAR) devices mounted thereon. The LIDAR devices may provide precise distance and depth information of action taking place on the sports pitch. Data obtained from the LIDAR devices may be included in the 360 degree action-focused broadcast data 126 that is sent back to the data processing server 118 for further processing.
(24) In some embodiments the plurality of action-focused cameras 124 of the stadium camera system 120 includes electroacoustic transducers, including microphones and loudspeakers, respectively configured to record and reproduce sound data originating from the action taking place on the sports pitch.
(25) Subsequently, the data processing server 118, applying footage selection/discarding rules to footage sent by the image capturing devices 110 mounted in the sports ball 104 and on individual players 106, along with data-broadcast triggering rules, noise filtering methods, and other video processing techniques, synthesizes the raw data 112 and 360 degree action-focused broadcast data 126 into processed data 128. A processed data signal is then transmitted for subsequent presentation to an audience, e.g., via display 130. Through the broadcasting system 100, footage from the raw data 112 and 360 degree action-focused broadcast data 126 is received at a high frame rate of at least 100 frames per second (FPS) and is converted into clean, processed data 128 at a lower frame rate of at least 24 FPS that may be viewed as a processed footage broadcasting 132, e.g., on a viewing means such as display 130, for enjoyment of an audience. The processed data 128 may be viewed as a reconstructed replay from different angles and perspectives for a specific scene of the sports event.
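The frame-rate conversion described above (raw capture at 100+ FPS reduced to a 24+ FPS broadcast) can be pictured as nearest-timestamp frame selection. The function below is an illustrative simplification, not the disclosed processing pipeline:

```python
from typing import List

def downsample_frames(raw_timestamps: List[float], out_fps: float = 24.0) -> List[int]:
    """Select indices of raw frames (captured at >= 100 FPS) that best
    match a lower broadcast rate. Nearest-timestamp selection only; a
    real pipeline would also filter and blend frames."""
    if not raw_timestamps:
        return []
    selected = []
    t = raw_timestamps[0]
    period = 1.0 / out_fps
    i = 0
    end = raw_timestamps[-1]
    while t <= end:
        # advance to the raw frame whose timestamp is closest to target time t
        while i + 1 < len(raw_timestamps) and \
                abs(raw_timestamps[i + 1] - t) <= abs(raw_timestamps[i] - t):
            i += 1
        selected.append(i)
        t += period
    return selected

# one second of 100 FPS capture reduced to a 24 FPS broadcast track
raw = [k / 100.0 for k in range(100)]
idx = downsample_frames(raw, 24.0)
```

Each output frame is simply the raw frame nearest in time; dropped frames remain available for slow-motion replays reconstructed at the full capture rate.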
(26) In some embodiments, processed footage broadcasting 132 may include image data, 3D geometries, video data, textual data, haptic data, audio data, or a combination thereof.
(27) In some embodiments, systems and methods of the current disclosure may additionally be adapted for other types of events that may take place in stadiums, such as concerts, plays, and other entertainment events. For application in concerts and plays, real-time motion-capturing techniques known in the art may be applied to performers.
(29) In an embodiment, the sports equipment 108 includes a sports uniform 202 and sports shoes 204. The image capturing devices 110 that may be mounted upon the sports equipment 108 may preferably include video-recording cameras, and may be employed to record raw video footage from the perspective of the individual players. The video-recording cameras may capture video footage at a high rate of at least 100 FPS, each covering a field of view of at least 120 degrees. The sports equipment 108 additionally includes a sensor module connected to the image capturing devices 110 for capturing individual players' motion telemetry metadata and enabling data transfer.
(34) The sensor module 402 may be physically coupled to the sports ball 104 by a variety of coupling means depending on the nature of the sports ball 104. For example, the sensor module 402 may be physically coupled to a sports ball 104 by being attached to the exterior of the sports ball 104, by being attached to an interior surface of a hollow sports ball 104, by being suspended by a suspension system in the interior of a hollow sports ball 104, or by being integrated into the outer layer or other layer of a multi-layer, non-hollow sports ball 104.
(35) Exemplary techniques that may be employed to mount the sensor module 402 to the sports ball 104 are disclosed in U.S. Pat. Nos. 7,740,551, and 8,517,869, both filed on Nov. 18, 2009 and which are incorporated herein by reference.
(37) The number of image capturing devices 110 may vary according to the design of the sports ball. Thus, in the case of a soccer ball 302 with 12 black pentagonal patches 504 and 20 white hexagonal patches 506, a total of 12 image capturing devices 110 may be mounted in the sports ball, with one image capturing device 110 mounted per black pentagonal patch 504. The image capturing devices 110 may capture video footage at a high rate of at least 100 FPS, each covering a field of view of at least 90 degrees. Thus, for a soccer ball 302 with 12 image capturing devices 110, a combined field of view of at least 1080 degrees may be covered.
(39) Suitable materials for the soft protective lens 604 may include neoprene, which is highly elastic and is not easily cracked or damaged upon impact.
(40) According to yet other embodiments, other suitable methods for protecting an image capturing device may be employed. For example, suitable methods for protecting an image capturing device are mentioned in United States Pre-Grant Publication No. 2013/0129338, filed on Oct. 15, 2012, which is herein incorporated by reference, where a camera protection system includes three layers of protective material. The outer layer is made of a firm yet flexible material such as Santoprene™ (Santoprene is a trademark of the ExxonMobil Corporation for its proprietary line of thermoplastic vulcanizate (TPV)) or a vinyl- or nitrile-based compound. The second, inner layer is made of a softer material such as neoprene or other similar soft and spongy/foam materials that have good compression and decompression properties. The third, interior layer is made of jersey or another suitable soft cloth material designed to protect the finish of a lens barrel wherein the camera lens may be located.
(42) The IMU 702 measures and reports the velocity, acceleration, angular momentum, and other telemetry metadata of the sports ball and/or individual players using a combination of accelerometers and gyroscopes.
(43) Accelerometers within the IMU 702 may be capable of measuring the acceleration of the sports ball and individual players, including the acceleration due to the Earth's gravitational field. In one embodiment, accelerometers within the IMU 702 may include a tri-axial accelerometer that is capable of measuring acceleration in three orthogonal directions. In other embodiments one, two, three, or more separate accelerometers may be included within IMU 702.
(44) Gyroscopes included in the IMU 702, or gyroscopes provided in addition to those in the IMU 702, apart from measuring the angular momentum of a sports ball in motion, may also serve to maintain the rotation of the sensor module 402 independent of the rotation of the sports ball.
(45) MmW transceivers 704 may allow the sports ball and sports equipment on individual players to receive mmW wireless signals and to upload data including raw video footage and telemetry metadata for subsequent processing and broadcasting. The mmW transceivers 704 may also be configured to enable positional tracking of the sports ball and sports equipment. Data transfer and receiving of the mmW transceiver 704 to and from other devices may take place over a personal area network or local area network using, for example, one or more of the following protocols: ANT, ANT+ by Dynastream Innovations, Bluetooth, Bluetooth Low Energy Technology, BlueRobin, or suitable wireless personal or local area network protocols.
(46) In an embodiment, mmW-based communication systems, a combination of mmW-based and sub-6 GHz-based communication systems, or wireless local area networking (WiFi), preferably but not limited to providing data at 16 GHz, are used for data transfer and receiving by the mmW transceivers 704. In other embodiments, 4G antenna systems may be used to support the mmW/sub-6 GHz antenna systems.
(47) The memory 706 may be adapted to store application program instructions and to store telemetry metadata of the sports ball and individual players from the IMU 702 as well as raw footage taken by the image capturing device.
(48) The power source 708 is configured to provide power to the image capturing device and to the sensor module 402.
(49) In one embodiment, the power source 708 may be a battery. The power source 708 may be built into the sensor module 402 or removable from the sensor module 402, and may be rechargeable or non-rechargeable. In one embodiment, the sensor module 402 may be repowered by replacing one power source 708 with another power source 708. In another embodiment, the power source 708 may be recharged by a cable attached to a charging source, such as a universal serial bus (“USB”), FireWire, Ethernet, Thunderbolt, or headphone cable, attached to a personal computer. In yet another embodiment, the power source 708 may be recharged by inductive charging, wherein an electromagnetic field is used to transfer energy from an inductive charger to the power source 708 when the two are brought in close proximity, but need not be plugged into one another via a cable. In another embodiment, a docking station may be used to facilitate charging.
(50) The processor 710 may be adapted to implement application programs stored in the memory 706 of the sensor module 402. The processor 710 may also be capable of implementing analog or digital signal processing algorithms such as raw data reduction or filtering. For example, the processor 710 may be configured to receive and process raw data from the IMU 702 and raw video footage from image capturing device.
(51) In an embodiment, combining the capabilities of the IMU 702 with the positional tracking provided by the mmW transceivers 704 may enable sub-centimeter or sub-millimeter positional and orientational tracking, which may increase accuracy when tracking the real-time position and orientation of the client devices and may improve the general user experience.
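One way to picture the combination of IMU data with mmW positional tracking described above is a complementary filter that blends high-rate incremental IMU motion with lower-rate absolute mmW fixes. The class below is a hypothetical sketch; a production tracker would more likely use a Kalman filter:

```python
class PositionFuser:
    """Complementary-filter sketch blending absolute mmW position fixes
    with incremental IMU displacement. Illustrative only; not the
    tracking implementation of the disclosure."""

    def __init__(self, initial, alpha=0.98):
        self.pos = list(initial)
        self.alpha = alpha  # weight given to IMU dead reckoning

    def update(self, imu_delta, mmw_fix):
        # dead-reckon forward with the IMU displacement...
        predicted = [p + d for p, d in zip(self.pos, imu_delta)]
        # ...then pull the estimate toward the absolute mmW fix
        self.pos = [self.alpha * pr + (1 - self.alpha) * m
                    for pr, m in zip(predicted, mmw_fix)]
        return self.pos
```

With a high `alpha`, the smooth IMU path dominates between fixes while the mmW measurements bound long-term drift, which is the usual motivation for fusing the two sensor types.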
(52) Tracking of the client devices may be performed employing several techniques known in the art. For example, tracking may be performed by employing time of arrival (TOA) tracking technique, which uses information gathered from three or more antennas. The client device then sends out a signal that is received by all of the antennas within range. Then, each antenna measures the amount of time it has taken to receive the signal from the time the signal was sent, triangulating the position of the client device. In other embodiments, tracking of client devices may be performed by using an angle of arrival (AOA) technique which, instead of using the time it takes for a signal to reach three base stations like TOA does, uses the angle at which a client device signal arrives at the antennas. By comparing the angle-of-arrival data among multiple antennas (at least three), the relative location of a client device can be triangulated. In further embodiments, other tracking techniques known in the art may be employed (e.g., visual imaging, radar technology, etc.).
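The TOA technique above can be sketched as a small multilateration solver. This is an illustrative 2D version assuming synchronized clocks and known antenna positions, not the tracking implementation of the disclosure:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def toa_position(antennas: np.ndarray, toas: np.ndarray) -> np.ndarray:
    """2D time-of-arrival multilateration sketch.
    antennas: (n, 2) known positions; toas: (n,) one-way arrival times.
    Linearizes the range equations against the first antenna and solves
    by least squares (requires n >= 3)."""
    d = C * toas                      # range from each antenna
    x0, y0 = antennas[0]
    A, b = [], []
    for (xi, yi), di in zip(antennas[1:], d[1:]):
        # ||p - a_i||^2 = d_i^2 minus the same equation for antenna 0
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d[0]**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos
```

An AOA solver would replace the range equations with bearing lines from each antenna and intersect them; both reduce to solving a small overdetermined linear system.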
(54) According to an embodiment, the ball charging system 800 may provide power through inductive charging, in which case an inductive coil may be mounted in the sports ball 104 and coupled to the power source 708 of sensor module 402 for charging with a ball charging device 802. The sports ball 104 may have exterior markings 804 to indicate the location of the inductive coil or to otherwise facilitate optimum orientation of the sports ball 104 for charging with the ball charging device 802.
(56) According to other embodiments, a combination of the wireless pitch data transmission grid 114 and antennas 116 may also be implemented for providing enhanced connectivity and ensuring data upload. The antennas 116 may be mounted around the sports pitch 902.
(57) According to yet other embodiments, antennas 116 may be employed in place of the wireless pitch data transmission grid 114.
(58) The wireless pitch data transmission grid 114 and antennas 116 enable communication via different wireless systems, preferably mmW-based communication, a combination of mmW-based and sub-6 GHz-based communication, or wireless local area networking (WiFi), preferably but not limited to providing data at 16 GHz.
(59) Transmission speed from the sports ball 104 to the wireless pitch data transmission grid 114 and/or to the antennas 116 may be at least about 5 gigabytes per second.
(61) The step of processing raw data through other video processing techniques 1014 may include synthesizing/filtering raw data 112 coming from image capturing devices 110 mounted in the sports ball 104 and on the sports equipment 108 of individual players 106, as well as 360° action-focused broadcast data from the stadium camera system.
(63) The method for determining a real ball direction in world space 1002 may begin by detecting the movement of the sports ball at step 1102, which may be performed based on acceleration data captured by IMU 702 of the sensor module 402 described in
(64) Subsequently, and in response to the determination of the occurrence of a movement to track, an initial spatial orientation of the sports ball may be determined at step 1104, which may be made by reference to a coordinate axis system. The determination of the initial spatial orientation of the sports ball at step 1104 may be made with respect to a gravity vector or with respect to an Earth magnetic field vector. In the case of a soccer ball, the initial spatial orientation of the sports ball relative to the specific movement to be tracked may be defined, for example, as the spatial orientation of the soccer ball just before, at the moment of, or just after the soccer ball is kicked by an individual, depending on the algorithm employed.
(65) Afterwards, a change in the spatial orientation may be determined at step 1106 in a similar way as the determination of the initial spatial orientation of the sports ball at step 1104, except that information about changes in the orientation of the gravity vector or magnetic field vector is additionally factored in.
(66) Then, a determination of the sports ball telemetry metadata at step 1108 may be performed. Telemetry metadata may refer to ball speed, ball spin rate, ball spin axis, and ball launch angle data, all being information captured by the IMU or inferred by the data processing server from data captured by the IMU. Telemetry metadata, in conjunction with high speed video footage taken by the image capturing devices, the initial spatial orientation, and the changes in spatial orientation of the sports ball, may be analyzed by the data processing server to obtain a real ball direction in world space at step 1110. Suitable analysis techniques for obtaining a real ball direction in world space include regression analysis, amongst others.
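As a rough illustration of tracking a direction through orientation changes, the sketch below integrates gyroscope rates to rotate an initial direction estimate. The disclosed method combines telemetry with high-speed footage (e.g., via regression analysis), so this is only a hypothetical fragment of that pipeline:

```python
import numpy as np

def real_ball_direction(initial_dir, gyro_samples, dt):
    """Rotate an initial direction estimate by integrated gyroscope
    angular rates (small-angle approximation per sample).
    initial_dir: 3-vector; gyro_samples: iterable of rad/s 3-vectors."""
    d = np.asarray(initial_dir, dtype=float)
    d /= np.linalg.norm(d)
    for omega in gyro_samples:
        w = np.asarray(omega, dtype=float) * dt   # rotation over this sample
        d = d + np.cross(w, d)                    # small-angle: d += w x d
        d /= np.linalg.norm(d)                    # keep it a unit vector
    return d
```

Renormalizing after each step keeps the estimate on the unit sphere; over a quarter-turn of accumulated rotation the small-angle error stays far below a degree at typical IMU sample rates.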
(69) According to an embodiment, the stadium camera system 120 is configured to capture a spherical focus zone 1204, which includes the area within the sports field where the most relevant action of the sports event takes place. The diameter of the spherical focus zone 1204 may depend on the initial configuration of the stadium camera system 120. This configuration may depend on different sports event priorities. In an embodiment, a stadium camera system 120 configured for a smaller spherical focus zone 1204 may focus on an area within about 2 to 10 meters around the sports ball 104. In another embodiment, a stadium camera system 120 configured for a larger spherical focus zone 1204 may focus on an area within about 10 to about 50 meters around the sports ball 104 when taking into account individual players 106 located relatively close to the action.
(70) According to another embodiment, the spherical focus zone 1204 may be static or dynamic. For a static spherical focus zone 1204, the diameter of the spherical focus zone 1204 may be fixed, meaning that the focus of the stadium camera system 120 may, independently of the type of action taking place, always be on the action taking place within about 2 to 10 meters around the sports ball 104, or within about 10 to 50 meters around the sports ball 104 when taking into account individual players 106 located relatively close to the action. On the other hand, for a dynamic spherical focus zone 1204, the spherical focus zone 1204 may increase or decrease in diameter depending on the action taking place. For example, in the case of a soccer match, a goal kick or a free kick may trigger an expansion of the spherical focus zone 1204, while a penalty kick may trigger a contraction of the spherical focus zone 1204.
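The dynamic behavior described above could be expressed as a simple event-to-diameter rule. The event names, scaling factors, and bounds below are illustrative assumptions; only the 2-50 m range comes from the description:

```python
# Hypothetical event-to-diameter mapping for a dynamic spherical focus zone.
EXPAND_EVENTS = {"goal_kick", "free_kick"}
CONTRACT_EVENTS = {"penalty_kick"}

def focus_zone_diameter(event: str, current: float,
                        min_d: float = 2.0, max_d: float = 50.0) -> float:
    """Return an updated zone diameter (meters) for the current event,
    clamped to the roughly 2-50 m range given in the description."""
    if event in EXPAND_EVENTS:
        return min(current * 2.0, max_d)
    if event in CONTRACT_EVENTS:
        return max(current * 0.5, min_d)
    return current  # unlisted events leave the zone unchanged
```

A static zone is simply the degenerate case in which this function always returns `current`.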
(71) According to an embodiment, the stadium camera system 120 may be configured to compensate and modify the rotation, focus, and zoom of each of the action-focused cameras 124 depending on where the spherical focus zone 1204 is located with respect to the action-focused cameras 124. Compensation and modification of the rotation, focus, and zoom of each of the action-focused cameras 124 is performed in order to obtain highly uniform 360 degree action-focused broadcast data 126 to be sent back to the data processing server. In this embodiment, each of the action-focused cameras 124 may, at any specific moment in time, record the action within the spherical focus zone 1204 employing different levels of rotation, focus, and zoom. For example, if the spherical focus zone 1204 is located close to a specific group of action-focused cameras 124, the zoom and focus parameters of each of these action-focused cameras 124 may be lower than the zoom and focus parameters of action-focused cameras 124 located farther away from the spherical focus zone 1204. Likewise, when the spherical focus zone 1204 moves away from the specific group of action-focused cameras 124 initially located close to it, the zoom and focus parameters of each of these action-focused cameras 124 may be higher than before. Rotation of the action-focused cameras 124 may likewise depend on the location of the spherical focus zone 1204. In order to perform the compensation and modification of the rotation, zoom, and focus parameters, the distance and position between each of the action-focused cameras 124 and the spherical focus zone 1204 are measured by the data processing server and provided to the stadium camera system 120 as a set of real-time instructions to achieve the rotation, zoom, and focus compensation. Additionally, the action-focused cameras 124 may need to be calibrated before or at the beginning of the sports event.
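The per-camera compensation can be illustrated with a small geometric sketch in which farther cameras apply more zoom so the focus zone subtends the same fraction of every camera's view. The field-of-view value and the zoom model are assumptions, not from the disclosure:

```python
import math

def camera_settings(cam_pos, zone_center, zone_diameter, fov_deg=60.0):
    """Pan angle (degrees) and zoom factor for one camera so that the
    spherical focus zone fills its field of view. 2D sketch only."""
    dx = zone_center[0] - cam_pos[0]
    dy = zone_center[1] - cam_pos[1]
    distance = math.hypot(dx, dy)
    pan = math.degrees(math.atan2(dy, dx))   # rotate toward the zone center
    # angle the zone subtends as seen from this camera
    subtended = 2.0 * math.degrees(math.atan2(zone_diameter / 2.0, distance))
    zoom = fov_deg / subtended               # fill the view with the zone
    return pan, zoom
```

Running this for every camera around the pitch yields the kind of per-camera instruction set the data processing server is described as sending: nearby cameras get small zoom factors, distant ones get large factors, and all deliver a comparably framed view of the zone.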
(73) The set of rules upon which the camera footage selection/discarding within ball space 1300 is based may be more easily applied after a real ball direction has been determined (e.g., by the method for determining a real ball direction in world space 1002 described with reference to
(75) According to an embodiment, footage may be selected from one image capturing device 110 within the ball space for a given set of video frames.
(76) According to yet another embodiment, different footage recorded simultaneously by more than one image capturing device 110 from different locations and angles within the ball space may be selected for a given set of video frames. Recording footage from more than one image capturing device 110 for a given set of frames may provide a video control operator (not shown) with a greater number of sets of possible processed footage to select and broadcast to an audience.
(77)
(78) As may be noted, camera footage selection rules 1402 may ensure that only footage from image capturing devices recording relevant images is taken into consideration for final video broadcasting.
(79) The processed data obtained after analysis and processing of raw data may be used for reconstructing specific scenes (e.g., for providing a replay of a penalty shootout, a sequence related to a referee's ruling, etc.) that can then be broadcast on a display or other viewing means for enjoyment of an audience.
(80) Likewise, a set of camera footage discarding rules 1410 may also be considered. The set of camera footage discarding rules 1410 may include discarding camera footage from image capturing devices that are directed towards: the sky/rooftop 1412; the ground surface in direct contact with the sports ball 1414; and an obstruction 1416 such as an advertising board in close proximity to the sports ball.
(81) As may be appreciated, camera footage discarding rules 1410 work based on eliminating images that may not provide any value to an audience.
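As a minimal sketch of how the discarding rules 1410 might be applied per frame, the orientation tags and dictionary layout below are illustrative assumptions, not the patent's specified data format:

```python
# Hypothetical orientation tags corresponding to discarding rules 1410:
# sky/rooftop, ground surface in contact with the ball, and obstructions.
DISCARD_TAGS = {"sky", "ground_contact", "obstruction"}

def select_footage(frames):
    """Apply the camera footage discarding rules: drop frames from devices
    pointed at the sky/rooftop, the ground in contact with the ball, or an
    obstruction; keep the rest for broadcast consideration.

    frames: list of dicts like {"camera_id": 3, "orientation": "sky"}.
    """
    return [f for f in frames if f["orientation"] not in DISCARD_TAGS]
```

Only footage whose orientation tag is outside the discard set survives, which mirrors the goal of eliminating images that provide no value to an audience.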
(82)
(83) Examples of data-broadcast triggering rules 1500, as shown in
(84) As may be appreciated, in general, data-broadcast triggering rules 1500 are based on and connected to events that are specific to the rules of the sports event being watched.
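A data-broadcast triggering rule of this kind can be sketched as a simple event-membership check. The event names below are illustrative examples for a soccer match; the patent does not enumerate a specific trigger set:

```python
# Hypothetical sport-specific trigger events (illustrative, soccer-flavored).
TRIGGER_EVENTS = {"goal", "penalty_kick", "free_kick", "corner_kick"}

def should_trigger_broadcast(event):
    """Return True when a detected match event satisfies one of the
    data-broadcast triggering rules 1500 (illustrative rule set)."""
    return event in TRIGGER_EVENTS
```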
(85)
(86) According to various embodiments, produced processed data 128 may be selected from the perspective of the sports ball, from the perspective of individual players, from a 360 degree action-focused view obtained from the stadium camera system, or from combinations thereof. For example, a broadcast replay of a penalty shootout may include a few seconds of footage from the perspective of a soccer ball, a few seconds from the perspective of the individual player kicking the penalty, a few seconds from the perspective of the goalkeeper attempting to save the soccer ball, and a few seconds from the perspective obtained by the stadium camera system.
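Stitching such a multi-perspective replay can be sketched as building a simple edit decision list that assigns each perspective clip a start time on the replay timeline. The function and field names are illustrative assumptions:

```python
def build_edit_list(segments):
    """Build a simple edit decision list (EDL): assign each perspective
    clip a start time so the clips play back-to-back in the replay.

    segments: list of (perspective_label, duration_seconds) tuples.
    """
    edl, t = [], 0.0
    for perspective, duration in segments:
        edl.append({"perspective": perspective, "start": t, "duration": duration})
        t += duration
    return edl

# A penalty-shootout replay as described above (durations are examples):
replay = build_edit_list([
    ("ball", 3.0),
    ("kicker", 2.0),
    ("goalkeeper", 2.5),
    ("stadium_360", 4.0),
])
```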
(87)
(88) In the embodiment of an AR/VR/MR broadcasting system 1700, an action perspective data capturing module 1702 includes a sports ball 1704 and one or more individual players 1706 with image capturing devices 1708 mounted upon their sports equipment 1710 and configured to capture and transmit raw data 1712, such as raw video footage and telemetry metadata, for subsequent analysis and processing. The raw data 1712 is transmitted to a processing/rendering server 1714 after being uploaded to the wireless pitch data transmission grid 1716 and/or antennas 1718. At this stage, the processing/rendering server 1714 analyzes and processes the received raw data 1712 and generates directional data 1720 that is then sent to the stadium camera system 1722. The stadium camera system 1722 comprises a plurality of action-focused cameras that, based on the directional data, auto-compensate and regulate rotation, focus, and zoom in order to generate uniform action coverage covering a 360 degree field of view around a spherical focus zone determined by the location of the sports ball, individual players, or combinations thereof, thereby creating 360 degree action-focused broadcast data 1724 that is sent back to the processing/rendering server 1714.
(89) Then, the processing/rendering server 1714, applying data-broadcast triggering rules, noise filtering methods, and other video processing and rendering techniques, processes the raw data 1712 and 360 degree action-focused broadcast data 1724 into processed data 1726 to create context-full scenarios. These context-full scenarios may be used for generating AR/VR/MR experiences 1728 for enjoyment of users 1730 such as members of an audience through suitable user devices, such as AR/VR/MR headsets 1732.
(90) According to an embodiment, in the case of AR/VR/MR experiences 1728 with individual players 1706, the AR/VR/MR broadcasting system 1700 may be employed not only to train athletic skills of said individual players 1706 but also to provide psychological conditioning, such as mentally preparing one or more individual players 1706 for a penalty shootout. Through the AR/VR/MR broadcasting system 1700, the processing/rendering server 1714 may recreate scenarios from previous matches, serving as a personalized training program that may be used in a virtual reality/augmented reality training center.
(91) According to another embodiment, in the case of AR/VR/MR experiences 1728 provided to members of an audience, the AR/VR/MR broadcasting system 1700 may allow members of an audience to view and experience the broadcast of a replay from the perspective of one or more individual players 1706 and/or from the perspective of a sports ball 1704.
(92) According to another embodiment, the processing/rendering server 1714 is able to create augmented reality volumes that users 1730, such as members of an audience, may interact with to enjoy further AR/VR/MR experiences 1728. These interactive augmented reality volumes may be created by distance interpolation methods applied on the sports equipment 1710 to calculate the height and shape of an individual player 1706. In this embodiment, a content database 1736 includes information from all of the individual players 1706 as well as from the sports pitch 1734. Forms of AR interactions with individual players 1706 may include viewing player statistics, highlights, biography, and the like. Forms of AR interactions with the sports pitch 1734 may include viewing further details about the sports pitch 1734 and stadium, including history, dimensions and capacity, statistics, highlights, and the like. Forms of AR interactions with the sports ball 104 may include current sports ball 104 locomotion data such as ball speed, ball spin rate, ball spin axis, and ball launch angle data, amongst others. In further embodiments, initial AR interactions may as well include options that lead to VR/MR interactions, such as being able to view, through AR/VR/MR headsets 1732, the sports event from the perspective of one or more individual players 1706 by utilizing data captured by the image capturing devices 1708 on the one or more individual players 1706.
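The distance-interpolation idea for deriving a player's height and shape can be sketched as fitting a bounding cylinder to the 3-D positions of the equipment-mounted devices. The function, the cylinder representation, and the margin value are illustrative assumptions for this sketch, not the patent's specified interpolation method:

```python
import math

def player_ar_volume(device_positions, margin=0.15):
    """Estimate a player's interactive AR volume from the 3-D positions of
    equipment-mounted capture devices (positions in meters, z = height).

    Returns a bounding cylinder (center_xy, radius, height) that an AR
    layer could hit-test for interactions such as viewing player stats.
    """
    xs = [p[0] for p in device_positions]
    ys = [p[1] for p in device_positions]
    zs = [p[2] for p in device_positions]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    # Radius: farthest device from the horizontal center, plus a margin.
    radius = max(math.hypot(x - cx, y - cy) for x, y in zip(xs, ys)) + margin
    # Height: the topmost device (e.g., headgear) approximates stature.
    height = max(zs) + margin
    return (cx, cy), radius, height
```

For example, devices at boot level (z ≈ 0.1 m) and headgear level (z ≈ 1.75 m) would yield a cylinder roughly 1.9 m tall enclosing the player.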
(93) According to another embodiment, compensation and modification of the rotation, focus, and zoom of each action-focused camera 1738 within the stadium camera system 1722 generates 360 degree action-focused broadcast data 1724 that the processing/rendering server 1714 processes and renders in order to create 360 degree AR/VR/MR experiences 1728 around the action taking place in the sports pitch 1734 that may be viewed by users 1730 through AR/VR/MR headsets 1732. Yet further in this embodiment, the 360 degree AR/VR/MR experiences 1728 may also be shared with one or more remote sports stadiums through a cloud server in order to simulate the action of the sports event taking place in the original stadium into the one or more remote stadiums. For AR/VR/MR experiences 1728 to be shared between stadiums, the sports pitch 1734 of the one or more target remote stadiums may be mapped against the sports pitch 1734 of the original stadium and overlay against the processed data 1726, providing users 1730 with a sensation that may be similar to being present at the original stadium.
(94) A system and method for providing augmented reality through a simulation engine is disclosed in U.S. Pre-Grant Publication No. 2013/0218542, filed on Nov. 29, 2012 by the same inventor of the current disclosure, and is herein incorporated by reference. Moreover, U.S. Pre-Grant Publication No. 2013/0215229, filed on Aug. 22, 2013 by the same inventor of the current disclosure, discloses a system and method for creating a virtual reality environment based on the recording of a real scene, and is herein also incorporated by reference.
(95) The subject matter of this application is also related to the subject matter of U.S. Pre-Grant Patent Publication No. 2015/0328516 A1 entitled “Sports Ball Athletic Activity Monitoring Methods and Systems,” filed on May 14, 2014. That application, including Appendices, is herein incorporated by reference.
(96) While certain embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the invention is not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those of ordinary skill in the art. The description is thus to be regarded as illustrative instead of limiting.