Method and infrastructure for synchronized streaming of content
09848221 · 2017-12-19
Assignee
Inventors
CPC classification (Section H: Electricity)
H04N21/242 · H04N21/6582 · H04N21/43076 · H04N21/42202 · H04N21/4302 · H04N21/44209 · H04N21/47202 · H04N21/44004
International classification (Section H: Electricity)
H04N7/16 · H04N21/242 · H04N21/258 · H04N21/422 · H04N21/43 · H04N21/44 · H04N21/442 · H04N21/858 · H04N21/472
Abstract
Systems and methods for synchronizing the playback of network media across multiple content playback devices, termed herein “playback devices”, “clients”, or “client devices”. In one implementation, client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may send signals causing the client devices to start playback in a synchronized fashion. The controller adjusts the timing of the signals so that the outputs are displayed in synchronization on each client device. In other implementations, device lag times may be measured. In still other implementations, a master device may synchronize playback of media content on slave devices. In yet other implementations, devices may buffer and join playback of media content occurring on other devices. In further implementations, the systems and methods may be expanded to include steps of processing authentication with service providers prior to arranging synchronized playback.
Claims
1. A method of synchronizing the playback of a content item among a plurality of content playback devices, the content item available through a service provider requiring an affiliation process, comprising: a. coupling a plurality of content playback devices in data communication with a controller, the controller configured to at least partially control playback of a content item on the plurality of content playback devices through a service provider, the plurality of content playback devices constituting a synchronization group; b. sending a signal from the controller to each of the plurality to cause each of the plurality to contact the service provider to obtain access to the content item; c. in the event one of the plurality is not allowed access to the content item, then notifying the controller of the event and responsive to the notifying, the controller removing the one from the synchronization group; and d. sending a signal to each of the remaining content playback devices in the synchronization group to begin playback of the content item, such that the one of the plurality that is not allowed access does not receive a signal to begin playback; e. receiving data about device lag times associated with at least a first and a second content playback device in the plurality; f. calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag times; and g. wherein the sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item includes sending signals to the first and second content playback devices to begin playback of the content item, a time of each sending separated by the time differential.
2. The method of claim 1, wherein at least a portion of the plurality are in data communication with a proxy device, and wherein the sending a signal to cause each of the plurality to contact the service provider includes sending a signal to cause each of the portion of the plurality to contact the service provider through the proxy device, such that the content item is obtained at least in part by throughput through the proxy device.
3. The method of claim 2, wherein the proxy device is a second display.
4. The method of claim 1, wherein the controller configures the plurality of content playback devices for synchronized playback through a second display.
5. The method of claim 4, wherein the second display indicates a list of content items for which access may be obtained by each of the plurality, or a list of content playback devices within the plurality that can obtain access to a given content item.
6. The method of claim 1, further comprising sending each of the content playback devices in the synchronization group a unique URL with which to access the content item.
7. A non-transitory computer-readable medium, comprising instructions for causing a computing device to implement the method of claim 1.
8. A method of synchronizing the playback of a content item among a plurality of content playback devices, the content item available through a service provider requiring an affiliation process, comprising: a. coupling a plurality of content playback devices in data communication with a controller, the controller configured to at least partially control playback of a content item on the plurality of content playback devices through a service provider, the plurality of content playback devices constituting a synchronization group; b. sending a signal from the controller to each of the plurality to cause each of the plurality to contact the service provider to obtain access to the content item; c. in the event one of the plurality is not allowed access to the content item, then notifying the controller of the event and removing the one from the synchronization group; d. sending a signal to each of the remaining content playback devices in the synchronization group to begin playback of the content item, such that the one of the plurality that is not allowed access does not receive a signal to begin playback; e. receiving data about device lag times associated with at least a first and a second content playback device in the plurality; f. calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag times; and g. wherein the sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item includes sending signals to the first and second content playback devices to begin playback of the content item, a time of each sending separated by the time differential.
9. The method of claim 8, wherein at least a portion of the plurality are in data communication with a proxy device, and wherein the sending a signal to cause each of the plurality to contact the service provider includes sending a signal to cause each of the portion of the plurality to contact the service provider through the proxy device, such that the content item is obtained at least in part by throughput through the proxy device.
10. The method of claim 9, wherein the proxy device is a second display.
11. The method of claim 8, wherein the controller configures the plurality of content playback devices for synchronized playback through a second display.
12. The method of claim 11, wherein the second display indicates a list of content items for which access may be obtained by each of the plurality, or a list of content playback devices within the plurality that can obtain access to a given content item.
13. The method of claim 8, further comprising sending each of the content playback devices in the synchronization group a unique URL with which to access the content item.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Like reference numerals denote like elements throughout.
DETAILED DESCRIPTION
(14) A remote control 22 may be employed to control the content playback device, or control may be exercised by way of the second display 16. The use of second display devices in such contexts has certain benefits because the same provides complementary functionality to the IPTV, but generally does not require additional investment by the user because the same makes use of a device, e.g., a smartphone, tablet computer, or the like, which most users already have in their possession. Additional details about such second displays and their interactions with content playback devices, e.g., through proxy servers and otherwise, may be seen from Applicants' co-pending U.S. patent application Ser. No. 13/077,181, filed Mar. 31, 2011, entitled “PERSONALIZED SECOND DISPLAY BROWSING EXPERIENCE DUE TO MULTIPLE SESSION FEATURE”, owned by the assignee of the present application and incorporated herein by reference in its entirety.
(16) In a general method, including use of a second display, a user has a user account with a source or clearinghouse of services. Here, the source or clearinghouse is represented as a management server, but it should be understood that the user account may be with a service provider directly. The management server communicates with at least one content server (generally associated with the service provider) such that the content server provides content items such as streaming assets for presentation or access at the content playback device. The user account has information stored thereon related to what content playback devices are associated with the user account. When a user logs on, they may see this list of content playback devices and may choose a particular content playback device. Once a content playback device has been chosen, a list of services may be displayed from which the user may choose. From a chosen service, a user may select a content item for viewing, undergoing an affiliation or authentication step if required by the service. Additional details may be found in the application incorporated by reference above.
(17) A number of synchronization controllers 36-54, also termed just “controllers”, are also illustrated. Controllers may be in one or all of the content playback devices, second displays, or servers controlling content delivery. In general, at least one controller is required, and the controller may be implemented in hardware, software, firmware, or the like. The controller can even be in an external device 18, devoted to controller functionality, or providing other functionality in addition to controller functions.
(18) A typical situation is one in which a plurality of content playback devices are intended to play back the same content item in a synchronized fashion.
(19) Consequently, the controllers 36-54 are employed to coordinate such playback. All of the client devices participating in the synchronized playback establish data communication with a controlling device or controller that coordinates the playback timing across all participating clients. The controller can be one of the client devices, or it may be a separate device. Generally, some client devices will be capable of operating as controlling devices, and others will not.
(20) Referring to the flowchart 20 of the drawings, the content playback devices are first caused to separately parse and buffer the content item.
(21) Once the content playback devices are ready to play back the content item, they signal their readiness to the controller (step 68), e.g., to controller 42 in the second display 16. In particular, once all of the content playback devices have decoded the index and header information they need and have buffered enough data that they may start playback, their readiness is communicated to the controller. At this point, the client devices are waiting for a start signal to begin playback of the content item. Once the controller has received a signal from all clients indicating their readiness to begin playback, a start signal may be sent to each client to begin playback. Upon receiving the signal from the controller, each client device should be in a state where playback may begin immediately or at a specified future time, so as to account for the local network lag of communications from the controller to all clients. The controller 42 may adjust the timing of start signals (step 62) so that the output is displayed in synchronization on each client device. For example, the controller 42 may delay the sending of start signals based on the network lag of each client device, or may send all the start signals but indicate within each signal a respective delay after which playback should begin. In this latter alternative, all the playback devices have an opportunity to cache content while they are waiting through the delay.
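The readiness barrier and the embedded-delay variant described above may be sketched as follows. This is an illustrative sketch only; the class, method, and field names are assumptions and do not appear in the disclosure.

```python
class SyncController:
    """Minimal sketch of the readiness barrier described above:
    every client must report ready before any start signal is sent."""

    def __init__(self, client_ids):
        self.pending = set(client_ids)   # clients still parsing/buffering
        self.ready = set()

    def report_ready(self, client_id):
        """Called when a client has decoded headers and buffered enough data."""
        self.pending.discard(client_id)
        self.ready.add(client_id)
        return not self.pending          # True once the whole group is ready

    def make_start_signals(self, network_lag_ms):
        """Build one start signal per client.  Rather than delaying each send,
        this variant embeds a per-client delay so every device begins playback
        at the same instant; clients may cache content while waiting."""
        assert not self.pending, "not all clients are ready"
        worst = max(network_lag_ms[c] for c in self.ready)
        return {c: {"cmd": "start", "delay_ms": worst - network_lag_ms[c]}
                for c in self.ready}
```

For example, with a TV whose signals take 40 ms to arrive and a tablet at 10 ms, the tablet would be told to wait an extra 30 ms so both begin together.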
(22) A number of steps may be employed in determining the timing of the start signals. For example, if the controller is based at the server level, e.g., within the management server, the same may be aware of and account for differences in location of the service provider or source server relative to the client devices (step 64). In other words, some client devices may be located much closer to the source of content than others, and thus will experience less network lag or delay in receiving content.
(23) Device lags may also be accounted for, such as the device lag between when a playback signal is generated and when that signal is actually displayed to the user. Such device lags may be measured using techniques described below, and in any case data about such lags may be communicated to the controller (step 66). Client devices may also employ a step of attempting to measure their network lag, and communicating the same to the controller (step 76), by measuring how long it takes for a test signal to traverse to a network location and back, e.g., to the management server.
(24) Once data is obtained about network lags and device lags, the one or more controllers may use the data to order the start times at which signals will be sent to client devices to begin playback (step 72). For example, the controller may compensate for the differing lag times of the clients by giving a start command to the client with the most lag first and giving a start command to the other clients with enough delay so that the final display of the content will occur in a synchronized fashion. Once the ordering is done, and timing differentials calculated between the various start times, start signals may be sent to client devices (step 74).
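The ordering of start times in steps 72-74 may be sketched as follows, under the assumption that each client's network lag and device lag have been combined into a single total lag figure; the function name is illustrative.

```python
def schedule_start_signals(total_lag_ms, now_ms=0):
    """Sketch of steps 72-74: the client with the most total lag is given its
    start command first, and each other client's command is offset by the
    lag differential, so all outputs are displayed at the same moment."""
    worst = max(total_lag_ms.values())
    # send time = now + (worst lag - this client's lag);
    # display time = send time + lag = now + worst lag, for every client.
    return {client: now_ms + (worst - lag)
            for client, lag in sorted(total_lag_ms.items(),
                                      key=lambda kv: -kv[1])}
```

With lags of 120 ms and 50 ms, the first client is signaled at t=0 and the second at t=70, so both display at t=120.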
(25) Referring to the system 30 of the drawings, device lags may differ between content playback devices, such that outputs intended to be simultaneous are rendered at different times.
(26) To adjust for this, the device of the system 30 may measure its own device lag, as described below.
(27) The content playback device 78 includes a playback signal generation circuit 84 that accepts a start signal from the network interface 82 which generally originates from a controller. The start signal indicates that playback should begin. An exemplary playback signal is illustrated in the graph 85. Once the playback signal is generated, a finite amount of time Δt passes before a user actually sees a corresponding signal on the display, illustrated as Δt_v, or hears a corresponding sound on the audio system, illustrated as Δt_a, in graphs 98 and 102, respectively. To determine these time differentials, an optical sensor 94, such as a camera, is disposed to receive displayed signals from the display 86. An audio sensor, such as a microphone 96, is disposed to receive rendered signals from the audio system 88. For example, a light detector may be placed in front of the display and a microphone in front of a speaker. The same provides signals to a measurement circuit 104, which also receives an indication of the playback signal 85 from the playback signal generation circuit 84. By measuring the time between the playback signal 85 and signals 98 and 102, a measurement of the device lag may be calculated.
(28) It will be understood that the type of sensor may vary, and the only requirement is that they be positioned such that the same can detect the playback being output by the device. As audio is not highly directional, a built-in microphone may not need any special positioning if the device is located in the same room as the playback. A light intensity sensor or detector should be located so that the same is facing the screen where the video output is playing. Generally, such optical detectors should have a narrow field of vision and may employ shielding, such as a flat black tube, to reduce the amount of stray light from other angles being picked up by the sensor.
(29) In general, the sensors need not be of particularly high quality, as the same only need to respond quickly to the overall intensity they are receiving. For example, inexpensive microphones, as are commonly used in telephones, will generally be sufficient for detecting overall sound intensity in real-time. For the light detector, any camera sensor may be employed, even those lacking optics necessary to produce a clear picture. The light detector may also simply detect overall light intensity and need not employ multiple pixels or be able to detect different intensities for different wavelengths of visible light.
(30) The above system provides various advantages. For example, the system measures the overall lag, the same being a primary parameter required to synchronize the output. No matter how complex the signal processing pathway is, the overall result is measured. In this way, complex cases where significant signal processing exists may still be afforded synchronized playback, e.g., in professional broadcast environments where signals may be routed through many pieces of equipment. In this connection, it is noted that the measurement may be for a lag time through an arbitrary signal path, and may not necessarily include rendering of the signal at the end of the path. For such implementations, an intermediate sensor 106 may be employed to monitor the signal at the end of the signal path being measured, to determine the time at which the generated signal reaches that point.
(31) In variations of the above, the lag measurement may be automated such that device lags are automatically measured each time a change in signal path is detected, such as when a new device is attached to an HDMI output. Such automation may be provided in any of the embodiments described above.
(32) A method that may be employed by the system of the drawings begins with the generation of a test signal by the playback signal generation circuit.
(33) The test signal is then rendered, e.g., visually and/or aurally (step 118), and the same is detected by the optical sensor and/or microphone (step 122), respectively. Indication of receipt of the test signal is sent to the measurement circuit (step 124). The difference between the time of arrival of the start signal (or initiation of test signal) and the time of detection yields the lag time for the signal (step 126). This “device lag time” may then be sent to one or more controllers in data communication with the content playback device (step 128).
(34) For a sound intensity sensor, the device may begin by outputting a silent audio signal and then outputting a loud signal. The audio signal that is used may vary, but should substantially immediately increase from silence to a steady volume. A single tone, such as a sine wave or square wave, can be used, or the output may include white noise. Musical outputs may be employed if the first note is of a sufficiently consistent loud amplitude. As with the optical detector, the lag may be calculated from the difference in timing from when the sound being output went from silence to the audio signal and when the sound intensity detector picked up the sudden increase in sound intensity.
(35) The device may calculate the display lag using only one of the sensors, e.g. optical or audio, or it may use both. In the case where the device uses both, both measurements may occur simultaneously as they do not generally interfere with each other. It is noted that in such cases, the measurements of rendered signals may occur at different times. For example, if the audio and video synchronization of the output device is off, there may be a variation in the device lag for the audio and video outputs. In the case of a difference in device lag, the controller may employ different timings for the audio and video to compensate for that difference.
(36) The measurements may be repeated, e.g., by cycling from low to high intensity several times, to ensure that the changes picked up were from the playback of the output and not from environmental interference. Statistical methods may be employed to ensure that enough points have been collected to obtain a true measurement.
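The repeated low-to-high measurement described above may be sketched as follows, given timestamp pairs gathered from several cycles. This is an illustrative sketch; the function name and the use of the median as the statistical method are assumptions, the disclosure saying only that statistical methods may be employed.

```python
from statistics import median

def device_lag_ms(signal_times_ms, detection_times_ms):
    """Sketch of the repeated measurement above: the test signal is cycled
    several times, each cycle yielding one (signal generated, change detected
    by the sensor) timestamp pair, and the median of the differences rejects
    outliers caused by environmental interference."""
    lags = [d - s for s, d in zip(signal_times_ms, detection_times_ms)]
    if any(lag < 0 for lag in lags):
        raise ValueError("detection cannot precede the test signal")
    return median(lags)
```

For three cycles with signal times 0, 1000, 2000 ms and detections at 83, 1081, 2085 ms, the per-cycle lags are 83, 81, and 85 ms and the reported lag is the median, 83 ms.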
(37) In another implementation, a first content playback device 132 may already be playing back a content item 138, and another device may buffer and join that playback in progress.
(38) A second content playback device 134 is illustrated, and the second content playback device has been indicated as desiring to join the playback of the content item 138. The first and second content playback devices 132 and 134, respectively, are illustrated as part of the local network 15. It will be understood that there is no requirement the two are on the same local network. In addition, a separate synchronization controller 136 is illustrated, and the same may form a portion of the second display, may form a portion of either content playback device, or may be a separate device entirely.
(39) The second content playback device 134 has a buffer 135 and upon indication that the second content playback device wishes to join the playback of the first, the buffer 135 may begin to receive the content item through the Internet and/or the local network.
(40) A method of joining playback in progress is now described.
(41) In particular, the controller causes the second content playback device to begin buffering content (step 146), starting with the portion of the content item data that it estimates will contain the portion that will be played at the point in time when it has buffered enough data to start playing, e.g., at a first target point. The second content playback device buffers the content until it has sufficient to join the playback (step 148). In so doing it may employ data about known network and device lags and delays (step 162).
(42) Once the second content playback device has buffered enough data to start playback, it may then compare the portion of data it has with the current playback point, e.g., point 139. Additional communication with the controller may be made during buffering to double check that the playback timing information received by the second content playback device is still correct and was not affected by, e.g., abnormally high network lag on the part of either or both content playback devices or other such interruptions. If buffering happened quickly and the playback point has not yet reached the start of the content being buffered, the second content playback device may wait until the playback position reaches the playback point, and then begin playing the beginning of the content it has buffered (step 158).
(43) If the current playback point has already passed the beginning of the data that was buffered, the client may determine at what point the current playback is, within the buffered data, and will check to see if there is adequate data buffered beyond that to start playback at that position. If there is sufficient data, then playback begins at the position within the data that corresponds with the current playback point. If there is not enough data buffered, playback will not begin at this point, and the client will continue to buffer the media (step 154), repeating the check each time a new segment of content item data is received. Once enough data is received, such that the buffer includes the playback point, the second content playback device may join the playback (step 158).
(44) In some cases, a sufficiently disruptive network interruption may occur. In this case, the latest data in the buffer may be behind the current playback point, in which case the second content playback device may start over from the beginning with its attempt to begin synchronized playback (step 159).
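The join decision described in the preceding paragraphs may be sketched as a single function of the current playback point and the buffered range. This is an illustrative sketch; the function and outcome names are assumptions, and all positions are in content time, e.g., milliseconds from the start of the content item.

```python
def join_action(playback_point, buf_start, buf_end, min_startup):
    """Sketch of the join-in-progress logic above.  min_startup is the
    minimum amount of buffered data needed before playback can begin."""
    if playback_point < buf_start:
        # Buffering outpaced playback: wait until the playback position
        # reaches the buffer, then play from the start of the buffered data.
        return "wait_then_join"
    if buf_end - playback_point >= min_startup:
        # Adequate data buffered beyond the current point: join there.
        return "join_at_playback_point"
    if buf_end < playback_point:
        # The latest buffered data is behind the playback point: start over.
        return "restart"
    # Not enough data yet: keep buffering and re-check per segment received.
    return "keep_buffering"
```

For a buffer covering 20-50 with a 5-unit startup requirement, a playback point of 10 waits, 30 joins immediately, 48 keeps buffering, and 60 restarts the attempt.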
(45) The system and method of the foregoing implementation thus allow a content playback device to join, in a synchronized fashion, playback of a content item already occurring on another content playback device.
(46) In some cases of synchronization, it may be desired to set up a direct relationship such that one content playback device acts as a master device and another a slave. Systems and methods according to the principles described here, in particular with respect to the implementations described below, provide such master and slave arrangements.
(47) For example, a master content playback device may obtain a content item, e.g., using an internal tuner, and transmit the same to one or more slave content playback devices.
(48) The master content playback device may also receive content items from another device, such as through an HDMI input 167. Where the input is a protected signal, as through an HDMI connection, the master content playback device may need to encrypt the transmitted signal to the slave content playback device in order to ensure continued protection of the signal. Moreover, the master may need to encode the source material for transmittal to the slave device over the network if the source is not already in a suitable format. In some cases, the encoding may employ stronger compression, based on the available bandwidth between the master and the slave device.
(49) Referring to the flowchart 80 of the drawings, one or more slave content playback devices first subscribe to a master content playback device, which obtains the content item to be played.
(50) The master content playback device then transmits the synchronized content to the slave content playback device (step 184). Such may be done immediately if no lags are expected, or with delays or lags to accommodate for such as has been described above.
(51) The transmission of synchronized content may have a number of variations associated. For example, the master content playback device may provide content using one or more internal tuners (step 186). The master content playback device may encode content (step 188) to ensure that slave content playback devices can use the content. The master content playback device may further encrypt the content if required by the system (step 194). In yet other implementations, the master content playback device may send the content using a physical input, e.g., HDMI, NTSC, etc.
(52) Other variations will also be seen.
(53) In some implementations, one or more slave content playback devices may be given permission to control the master content playback device. In this case, the slave device may be enabled to issue control commands to the master, such as to change the channel or to switch to an external input. The master device may execute these commands, which may change what is being displayed, and therefore what is being sent to all the subscribed client or slave devices. The master content playback device may have privacy settings configured to allow the user to allow all client connections, disallow all client connections, allow only certain clients to connect, or allow clients to connect only if they supply proper authentication credentials. Other such settings will also be understood.
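The privacy settings enumerated above may be sketched as a connection check; the policy names and dictionary layout are illustrative assumptions, not taken from the disclosure.

```python
def may_connect(client_id, settings, credentials_ok=False):
    """Sketch of the master device's privacy settings above: allow all
    client connections, disallow all, allow only listed clients, or allow
    clients only when they supply proper authentication credentials."""
    policy = settings["policy"]
    if policy == "allow_all":
        return True
    if policy == "disallow_all":
        return False
    if policy == "allow_list":
        return client_id in settings["allowed"]
    if policy == "credentials":
        return credentials_ok
    raise ValueError(f"unknown policy: {policy}")
```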
(54) It is noted that the master device need not display the content that it supplies to the client or slave device. This allows slave devices to access external inputs, e.g., a TV tuner, disc player, or other content source in the master device, even if there is no desire for the master device to also display that content. The master device may display other content while supplying the desired content or the master device may have the display portion of its circuitry in an off state to conserve power. In some implementations, the master device may supply more than one separate content stream to its connected slave or client devices. It is further noted that a particular content playback device may act as a master device relative to some devices, and as a client to others, even at the same time.
(55) In some implementations, the user may or may not be concerned about the synchronization of the playback between the master device and the slave device, or between a plurality of slave devices. For example, where devices are not in close proximity, such synchronization is not necessary. Where synchronization is employed, the master content playback device may need to delay the playback of its own signal relative to when it transmits a signal to one or more slave devices to account for lag in the transmission of the signal and the processing of the signal by the slave devices. Each device would generally add enough delay so that the content item would be played at the same playback point as the device with the most lag would play the same with no delay.
(56) It is understood that the term “display” is interpreted to be inclusive of playing an audio signal through speakers in the case where the media being played contains audio information, regardless of whether the media also contains video or image information. An audio device, such as a home audio receiver, may synchronize to a device with an audio and video signal, such as a TV, in which case the home audio device may only request and receive the audio portion of the information.
(57) In another variation of the above implementations, if the slave devices that are subscribed to a master device are connected within the same local network, such that multicast network communications are enabled between the devices, the master device may choose to use multicast communications so that the content item data only needs to be transmitted once in a single stream, thus saving significant bandwidth over having to broadcast the same data in multiple separate communications to each client device.
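The bandwidth saving of the multicast variation above may be sketched as a transmission plan; the function name and the administratively scoped group address are illustrative assumptions.

```python
def plan_transmission(subscribers, same_local_network, stream_kbps):
    """Sketch of the multicast variation above.  If every subscribed slave
    device is on the same local network, so that multicast communications
    are enabled, the master sends the content item data once to a group
    address; otherwise it sends a separate unicast copy to each device."""
    if same_local_network and subscribers:
        return {"mode": "multicast",
                "destinations": ["239.255.42.1"],   # hypothetical group address
                "uplink_kbps": stream_kbps}         # one copy, regardless of N
    return {"mode": "unicast",
            "destinations": list(subscribers),
            "uplink_kbps": stream_kbps * len(subscribers)}
```

For a 4 Mbps stream and three subscribed slaves, multicast keeps the master's uplink at 4 Mbps where separate unicast streams would require 12 Mbps.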
(58) In yet another implementation, systems and methods according to the principles described here relate to providing synchronized playback even when content playback devices must access management server infrastructures to access content, including undergoing affiliation and authentication procedures. For example, the system 90 of the drawings includes first and second content playback devices 196 and 206.
(59) Through the Internet 25, the first and second content playback devices 196 and 206, respectively, may communicate with a content or service provider 214 through, in some cases, a management server 212. For example, the management server 212 may arrange for the presentation of services and assets, including an asset 202 having an asset ID 202′, on a user interface of the second display or content playback device. Users may browse content and identify assets through the use of the asset ID. The users of the content playback devices may select the asset 202 for playback, in which case the asset 202 from the service provider is downloaded and played back or streamed to the content playback devices.
(60) Generally, to access content from a content or service provider, steps of affiliation are required to ensure access by a particular device is allowed and enabled. Steps of such affiliation processes are described in co-pending applications: U.S. patent application Ser. No. 13/077,298, filed Mar. 31, 2011, entitled “Direct Service Launch On A Second Display”; U.S. patent application Ser. No. 13/207,581, filed Aug. 11, 2011, entitled “System And Method To Easily Return To A Recently Accessed Service On A Second Display”; U.S. patent application Ser. No. 13/233,398, filed Sep. 15, 2011, entitled “System And Method To Store A Service Or Content List For Easy Access On A Second Display”; and U.S. patent application Ser. No. 13/217,931, filed Aug. 25, 2011, entitled “System And Method Providing A Frequently Accessed Service Or Asset List On a Second Display”; all of which are owned by the assignee of the present application and herein incorporated by reference in their entireties.
(61) In systems and methods according to this implementation, the controller may arrange for each content playback device to contact the service provider and undergo such authentication prior to synchronized playback.
(62) In more detail, and referring to a flowchart 100 in the drawings, each content playback device in the synchronization group first contacts the service provider to obtain access to the content item, undergoing an authentication step as required.
(63) The synchronization group may then be filtered based on various factors, if such filtering has not been performed at the authentication step (step 224). Examples of such factors include that certain content may employ differing formats that may require hardware support of codec software that is not available on all clients. Another factor may be that some content distribution licenses only allow the content to be displayed in certain geographical regions. Another factor that may prevent playback is if a device has a rating limit set that would prevent the playback of the content item with the given rating. If playback is not allowed on the client, the controller informs the client and the client is removed from the synchronization group (step 226). Synchronized playback may then begin, as arranged and coordinated by the controller (step 228), with each client device obtaining and using its own unique URL to access the media.
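The filtering of the synchronization group by format support, license region, and rating limit, as described above, may be sketched as follows. The field names are illustrative assumptions; a removed client is informed and, per the claims, receives no signal to begin playback.

```python
def filter_sync_group(clients, content):
    """Sketch of the filtering step above.  Each client dict lists its
    supported codecs, geographical region, and parental rating limit;
    the content dict carries the corresponding requirements."""
    kept, removed = [], []
    for c in clients:
        allowed = (content["codec"] in c["codecs"]            # format support
                   and c["region"] in content["regions"]      # license region
                   and content["rating"] <= c["rating_limit"])  # rating cap
        (kept if allowed else removed).append(c["id"])
    return kept, removed
```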
(64) Variations of the above system and method will be understood given the teachings herein. For example, combinations of the above synchronization techniques may be employed. As another example, if a source device obtains content to play from a service provider, that source device may use the service provider as it would any other content source and transmit the content item to any subscribed client devices, as noted above with reference to the master and slave implementations.
(65) Where a second display controls playback, the second display may operate software that allows it to choose from a plurality of target devices for playback. The content navigation on the second display may indicate which content is playable by each device currently targeted for playback, or may even filter the content choices presented to the user so that the user sees and chooses only from content that can be played on all targeted devices. If playback is initiated by a second display, the second display can designate one of the content playback devices to be the controller, in which case the content playback devices to be synchronized establish communication among themselves to synchronize with the controller content playback device. Alternatively, the second display device may itself act as the controller even though it is not one of the playback devices, in which case the content playback devices to be synchronized communicate with the second display device. The content playback devices may address their communications directly to the controller or may communicate with an external server that is in data communication with all of them.
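The content-choice filtering described above (presenting only content playable on every targeted device) amounts to an intersection of device capabilities. A minimal sketch, assuming hypothetical catalog and device dictionaries keyed by supported codecs:

```python
def playable_on_all(catalog, targeted_devices):
    """Return only catalog items whose codec is supported by every
    targeted playback device (set intersection of capabilities)."""
    common = set.intersection(*(d["codecs"] for d in targeted_devices))
    return [item for item in catalog if item["codec"] in common]
```

A real second-display application would likely intersect additional attributes (region licensing, rating limits) in the same way.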
(66) Systems and methods have been disclosed that improve the user experience of the IPTV without adding to the hardware costs of the unit. As disclosed above, users may employ the system and method to play back content in a synchronized fashion, allowing enjoyment of content items without the disadvantages suffered by prior attempts at coordinated playback.
(67) One implementation includes one or more programmable processors and corresponding computing system components to store and execute computer instructions, such as to execute the code that provides the various server functionality, e.g., that of the management server or content server, second display, or content playback device. Referring to
(68) The computing environment includes a controller 234, a memory 236, storage 242, a media device 246, a user interface 254, an input/output (I/O) interface 256, and a network interface 258. The components are interconnected by a common bus 262. Alternatively, different connection configurations can be used, such as a star pattern with the controller at the center.
(69) The controller 234 includes a programmable processor and controls the operation of the servers, second displays, content playback devices, controllers, and their components. The controller 234 loads instructions from the memory 236 or an embedded controller memory (not shown) and executes these instructions to control the system.
(70) Memory 236, which may include non-transitory computer-readable memory 238, stores data temporarily for use by the other components of the system. In one implementation, the memory 236 is implemented as DRAM. In other implementations, the memory 236 also includes long-term or permanent memory, such as flash memory and/or ROM.
(71) Storage 242, which may include non-transitory computer-readable memory 244, stores data temporarily or long-term for use by other components of the system, such as for storing data used by the system. In one implementation, the storage 242 is a hard disc drive or a solid state drive.
(72) The media device 246, which may include non-transitory computer-readable memory 248, receives removable media and reads and/or writes data to the inserted media. In one implementation, the media device 246 is an optical disc drive or disc burner, e.g., a writable Blu-ray® disc drive 252.
(73) The user interface 254 includes components for accepting user input, e.g., the user indications of streaming content items, and presenting service lists, asset lists and categories, and individual assets to the user. In one implementation, the user interface 254 includes a keyboard, a mouse, audio speakers, and a display. The controller 234 uses input from the user to adjust the operation of the computing environment.
(74) The I/O interface 256 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices, e.g., a printer or a PDA. In one implementation, the ports of the I/O interface 256 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports. In another implementation, the I/O interface 256 includes a wireless interface for wireless communication with external devices. These I/O interfaces may be employed to connect to one or more content playback devices.
(75) The network interface 258 allows connections with the local network and optionally with content playback devices and second displays and includes a wired and/or wireless network connection, such as an RJ-45 or Ethernet connection or “Wi-Fi” interface (802.11). Numerous other types of network connections will be understood to be possible, including WiMax, 3G or 4G, 802.15 protocols, 802.16 protocols, satellite, Bluetooth®, or the like.
(76) The servers, second displays, and content playback devices may include additional hardware and software typical of such devices, e.g., power and operating systems, though these components are not specifically shown in the figure for simplicity. In other implementations, different configurations of the devices can be used, e.g., different bus or storage configurations or a multi-processor configuration.
(77) Aspects specific to certain computing environments are discussed below.
(78) The content playback device can take many forms, and multiple content playback devices can be coupled to and selected from within a given local network. Exemplary content playback devices may include, e.g., an IPTV, a digital TV, a digital sound system, a digital entertainment system, a digital video recorder, a video disc player, a combination of these, or any number of other electronic devices addressable by a user on the local network 16 and capable of playing content delivered over the Internet. The same may also include more traditional video and audio systems that have been appropriately configured for connectivity. For the sake of simplicity, in this specification, the content playback device has generally been exemplified by an IPTV, in which case the same will generally include a processor that controls a visual display and an audio renderer such as a sound processor and one or more speakers. The processor may access one or more computer-readable storage media such as but not limited to RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage. Software code implementing present logic executable by the content playback device may also be stored on various memories to undertake present principles. The processor can receive user input signals from various input devices including a second display, a remote control device, a point-and-click device such as a mouse, a keypad, etc. A TV tuner may be provided in some implementations, particularly when the content playback device is an IPTV, to receive TV signals from a source such as a set-top box, satellite receiver, cable head end, terrestrial TV signal antenna, etc. Signals from the tuner are then sent to the processor for presentation on the display and sound system. A network interface such as a wired or wireless modem communicates with the processor to provide connectivity to the Internet through the local network.
It will be understood that communications between the content playback device and the Internet, or between the second display and the Internet, may also take place through means besides the local network. For example, the second display may communicate with the content playback device through a separate mobile network.
(79) The second displays may include any device that can run an application that communicates with a content playback device, including, but not limited to, personal computers, laptop computers, notebook computers, netbook computers, handheld computers, personal digital assistants, mobile phones, smart phones, tablet computers, hand-held gaming devices, gaming consoles, Internet appliances, and also devices specifically designed for these purposes, in which case the special device would include at least a processor and sufficient resources and networking capability to run the second display application. The second displays may each bear a processor and components necessary to operate an application for service provider and content selection. In particular, the processor in the second display may access one or more computer-readable storage media such as but not limited to RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage. Software code implementing present logic executable by the second display may also be stored on various memories to undertake present principles. The second display can receive user input signals from various input devices including a point-and-click device such as a mouse, a keypad, a touch screen, a remote control, etc. A network interface such as a wired or wireless modem communicates with the processor to provide connectivity to wide area networks such as the Internet 26 as noted above.
(80) The servers, e.g., the management server and content server, have respective processors accessing respective computer-readable storage media which may be, without limitation, disk-based and/or solid state storage. The servers communicate with a wide area network such as the Internet via respective network interfaces. The servers may mutually communicate via the Internet. In some implementations, two or more of the servers may be located on the same local network, in which case they may communicate with each other through the local network without accessing the Internet.
(81) Various illustrative implementations of the present invention have been described. However, one of ordinary skill in the art will recognize that additional implementations are also possible and are within the scope of the present invention. For example, service and asset choices may be made by a client device, i.e., a content playback device, e.g., an IPTV, or the same may also be made by a second display presenting appropriate authentication credentials to a management server, as disclosed in assignee's co-pending US patent applications incorporated by reference above.
(82) The description above may pertain to any digital content, including streamed, live streaming, video-on-demand, and stored digital content. Any type of digital content file is contemplated, including media files in live streaming formats, e.g., .m3u8 files. The terms “content item”, “content”, and “asset” have been used interchangeably, unless the context dictates otherwise.
(83) In the system where master devices drive slave devices, the master device may provide to the slave device alternate versions of presented content, the alternate versions incorporating video of lower quality, different codecs, different subtitles, different captions, as well as alternate audio tracks such as descriptive audio for the blind, etc. Further in such systems, a master device may simultaneously transmit a plurality of content items to multiple content playback devices, instead of just a common content item. For example, the master device may receive network content or DVR content and transmit the same to one content playback device while the master device is simultaneously receiving content from a tuner and transmitting such tuner content to another content playback device. In a further implementation, it is noted that a content playback device may act simultaneously as both a master and a slave, connecting to two separate devices. The content that the master device is transmitting may be the content it is receiving or content from another source, such as a tuner, that it has access to.
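The master-drives-slaves arrangement above, in which a master simultaneously routes different content items (e.g., tuner content to one playback device and DVR content to another), can be sketched as a simple routing table. The class and method names here are hypothetical illustrations, not part of the disclosed system:

```python
class MasterDevice:
    """Hypothetical master that maps each attached slave device to a
    content source (tuner, DVR, network stream) and fans out frames."""

    def __init__(self):
        self.routes = {}  # slave_id -> source name

    def attach(self, slave_id, source):
        """Subscribe a slave device to one of the master's sources."""
        self.routes[slave_id] = source

    def dispatch(self, frames_by_source):
        """Deliver the latest frame from each source to its subscribed
        slaves; returns a slave_id -> frame mapping."""
        return {slave: frames_by_source[src]
                for slave, src in self.routes.items()}
```

A device acting as both master and slave, as the paragraph above contemplates, would simply run this routing logic while also appearing as a `slave_id` in some other master's table.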
(84) Not all steps described above (or in any of the flowcharts below) need be undertaken in any particular implementation, and the order of steps may vary to a certain extent as well.
(85) Accordingly, the present invention is not limited to only those implementations described above.