DYNAMIC MEDIA CONTENT RUNTIME MODIFICATION
20230217077 · 2023-07-06
CPC classifications: H04N21/41407 · H04N21/8456 · H04N21/47217 · H04N21/4532 (Electricity)
International classification: H04N21/45 (Electricity)
Abstract
Systems and methods for modifying runtimes of media assets are disclosed. A first input is received indicating a request to modify a current runtime of a media asset being generated for display on a media device. The media asset includes segments. A second input is received and based on the second input, a shortened version of the media asset having a shortened runtime is generated. One or more segments of the media asset are removed from the media asset to generate the shortened version and the shortened version of the media asset is generated for display on the media device.
Claims
1.-20. (canceled)
21. A method comprising: generating for display a content item on a display device; in response to receiving a first request from an input device operatively connected to the display device during the display of the content item, accessing a schedule; determining whether at least one of an activity, a task, or an event from the schedule occurs during the display of the content item; and in response to determining that the at least one of the activity, the task, or the event from the schedule occurs during the display of the content item, generating for display with the content item a progress bar of the content item and an indicator of the at least one of the activity, the task, or the event from the schedule on the progress bar.
22. The method of claim 21, wherein the schedule is not associated with the content item.
23. The method of claim 22, wherein the schedule further includes a start time, a duration, and an end time for the at least one of the activity, the task, or the event.
24. The method of claim 23, wherein the at least one of the activity, the task, or the event is work-related or home-related.
25. The method of claim 21, comprising: in response to receiving a request from the input device after the display of the indicator, generating for display a selectable option to generate a shortened version of the content item on the progress bar.
26. The method of claim 25, comprising: in response to receiving via a screen of the display device dragging or swiping of two touch inputs away from each other while continuing to make contact with the screen of the display device in a distancing direction, increasing a runtime of the shortened version of the content item; and in response to receiving via a screen of the display device dragging or swiping two touch inputs toward each other while continuing to make contact with a screen of the display device in a closing direction, decreasing a runtime of the shortened version of the content item.
27. The method of claim 25, comprising: in response to selection of the selectable option, generating for display the shortened version of the content item.
28. The method of claim 25, comprising: in response to receiving on-screen inputs, dynamically scheduling and displaying on the progress bar start and end times of the shortened version of the content item on the progress bar.
29. The method of claim 25, wherein the shortened version of the content item on the progress bar is generated based on a media content preference model trained with input representing past user selections of media content.
30. The method of claim 25, comprising: identifying a segment of the content item having a lowest level of importance; and removing the identified segment to generate the shortened version of the content item.
31. A system comprising: circuitry configured to: generate for display a content item on a display device; in response to receiving a first request from an input device operatively connected to the display device during the display of the content item, access a schedule; determine whether at least one of an activity, a task, or an event from the schedule occurs during the display of the content item; and in response to determining that the at least one of the activity, the task, or the event from the schedule occurs during the display of the content item, generate for display with the content item a progress bar of the content item and an indicator of the at least one of the activity, the task, or the event from the schedule on the progress bar.
32. The system of claim 31, wherein the schedule is not associated with the content item.
33. The system of claim 32, wherein the schedule further includes a start time, a duration, and an end time for the at least one of the activity, the task, or the event.
34. The system of claim 33, wherein the at least one of the activity, the task, or the event is work-related or home-related.
35. The system of claim 31, wherein the circuitry is configured to: in response to receiving a request from the input device after the display of the indicator, generate for display a selectable option to generate a shortened version of the content item on the progress bar.
36. The system of claim 35, wherein the circuitry is configured to: in response to receiving via a screen of the display device dragging or swiping of two touch inputs away from each other while continuing to make contact with the screen of the display device in a distancing direction, increase a runtime of the shortened version of the content item; and in response to receiving via a screen of the display device dragging or swiping two touch inputs toward each other while continuing to make contact with a screen of the display device in a closing direction, decrease a runtime of the shortened version of the content item.
37. The system of claim 35, wherein the circuitry is configured to: in response to selection of the selectable option, generate for display the shortened version of the content item.
38. The system of claim 35, wherein the circuitry is configured to: in response to receiving on-screen inputs, dynamically schedule and display on the progress bar start and end times of the shortened version of the content item on the progress bar.
39. The system of claim 35, wherein the shortened version of the content item on the progress bar is generated based on a media content preference model trained with input representing past user selections of media content.
40. The system of claim 35, wherein the circuitry is configured to: identify a segment of the content item having a lowest level of importance; and remove the identified segment to generate the shortened version of the content item.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The above and other objects and advantages of the disclosure will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout.
DETAILED DESCRIPTION
[0019] The present disclosure is, in some embodiments, directed to methods and systems for media asset runtime modification, and more particularly to techniques for generating for display a shortened version of a media asset based on certain user action.
[0021] In an embodiment of the disclosure, as shown in
[0022] With continued reference to
[0023] In some embodiments, R1 may be a first input and R2 may be a second input, and the location or position of R2 on a display screen of the media device may indicate the (shortened) runtime of the shortened version of the media asset. In some embodiments, R1 and R2 may collectively be the first input and r1 and r2 may collectively be the second input (e.g., after a user swipes R1 and R2 to the display screen positions of r1 and r2, respectively). In yet other embodiments, r1 and r2 may collectively be the first input, and the positions on the display screen of the media device to which r1 and r2 are swiped may collectively be the second input. In still other embodiments, R1 may be the first input and r1 and r2 may collectively be the second input (e.g., each of r1 and r2 being a single (e.g., simultaneous) touch point on the display screen). The start and stop times of the shortened version of the media asset displayed on the display screen may be indicated by one or a combination of R1, R2, r1, and r2. For example, the start time of the shortened version of the media asset may be represented by R1 or r1 and, in some cases, R2 or r2. Similarly, the stop time of the shortened version of the media asset may be represented by R2 or r2 and, in some cases, R1 or r1. In some embodiments, input r1 is effectively the start time of the shortened version of the media asset and input r2 is effectively the end or stop time of the shortened version of the media asset displayed on the display screen. In some embodiments, input R1 is effectively the start time of the shortened version of the media asset and input R2 is effectively the end or stop time of the shortened version of the media asset displayed on the display screen. In some embodiments, either R1 or r1 is effectively the stop time of the shortened version of the media asset.
Each of the two inputs, i.e., the first input and the second input, may include a first touch input, r1, indicating a start time of the media asset and a second touch input, r2, indicating an end time of the media asset. In some embodiments, the second input, r2, may trigger generating a shortened version of the media asset. Either the first input, r1, or the second input, r2, may be a swipe to a new start or end time, respectively. The first touch input and the second touch input may be received simultaneously or at different times. For example, a user may make contact with two distinct points of the media device screen using two distinct user fingers at about the same time to indicate the start and end times of the shortened version of the media asset. Alternatively, or additionally, the user may make contact with two distinct points of the media device using two distinct fingers at two distinct times to indicate the start and end times of the selected shortened version of the media asset. In some embodiments, in response to the second input, as described above, media device 102 automatically determines a runtime 124 of the shortened version of the media asset based on one or more of a user schedule or a user profile.
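The mapping from the two touch points to start and end times described above may be sketched as follows, assuming a simple linear progress bar; the coordinate scheme and all names here are illustrative, not part of the claimed interface:

```python
def touch_points_to_times(x1: float, x2: float,
                          bar_left: float, bar_width: float,
                          total_runtime_s: float) -> tuple:
    """Map two touch x-coordinates on a progress bar to (start, end) times.

    The leftmost touch becomes the start time and the rightmost the end
    time, so the order in which r1 and r2 arrive does not matter.
    """
    def to_time(x: float) -> float:
        # Clamp the touch to the bar, then scale linearly to the runtime.
        frac = min(max((x - bar_left) / bar_width, 0.0), 1.0)
        return frac * total_runtime_s

    t1, t2 = to_time(x1), to_time(x2)
    return (min(t1, t2), max(t1, t2))
```

With a 400-pixel bar representing a 2-hour asset, for example, touches at the bar's left edge and midpoint would select the first hour of the asset.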
[0024] Touch inputs r1 and r2, may be facilitated by user haptic movements such as with two human finger touches on the display screen of media device 102, as shown at the bottom of the screen of media device 102 in
[0025] In some embodiments, media device 102 generates for display a media progress bar 122 simultaneously with the depiction of the media asset for ease of summarized media asset selection. For example, progress bar 122 may assist the user with a visual comparison of the runtime 124 of the shortened version of the media asset relative to the total media asset runtime. Further, the location of the summarized media asset within the media asset may be made more pronounced via progress bar 122. For example, runtime 124 of the shortened version of the media asset, defined by the distance between inputs r1 and r2, may be displayed by progress bar 122 simultaneously with the displaying of the media asset to aid the user in dynamic selection of the shortened version of the media asset, as further described below.
[0026] To generate the shortened version of the media asset, media device 102 may select a media asset segment based on a level of importance. For example, and without limitation, the segment to be removed may be identified as the segment of the media asset with the lowest level of importance. Other criteria may form the basis of the media asset segment selection, such as a user profile. For example, user preferences in a social media user profile may indicate a user dislike (a thumbs-down icon) of a particular actor, which may cause media device 102 to flag scenes featuring that actor as unimportant or as candidates for removal. Next, media device 102 removes the identified segment to generate the modified version of the media asset and determines whether the modified version of the media asset has a runtime greater than the user-desired runtime. In response to determining that the runtime of the modified version of the media asset is not greater than the desired runtime, media device 102 generates for display the shortened version of the media asset.
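The importance-based candidate selection above may be sketched as follows; the segment record (with `actors` and `importance` fields) is a hypothetical schema, and forcing disliked-actor scenes to zero importance is one possible policy:

```python
def flag_unimportant_segments(segments, disliked_actors):
    """Rank segments by ascending importance for removal.

    Scenes featuring an actor the user has disliked (e.g., via a
    thumbs-down in a social media profile) are forced to the lowest
    importance so they become the first removal candidates.
    """
    for seg in segments:
        if disliked_actors & set(seg["actors"]):
            seg["importance"] = 0.0  # mark as a removal candidate
    return sorted(segments, key=lambda s: s["importance"])
```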
[0027] In embodiments with a user thumb touch and a user index finger touch serving as inputs on the display screen of media device 102, a user media asset summary selection may be indicated, such as shown in the embodiment of
[0028] In some embodiments, in response to the first input, R1, one or more reference time points on the media progress bar 122 may be generated for display on a display screen based on a schedule of the user. Receipt of the second input, R2, is indicative of receiving a selected reference time point among the displayed reference time points. In an embodiment of the disclosure, as shown in
[0029] Each reference time point may be taken from a user calendar, a user social media profile, or a combination thereof, and represents a point in time when a user of media device 202 is scheduled to perform a certain task or when an event involving the user is due to occur. For instance, reference time point 214 may be taken from a user daily calendar and denote a time when the user is due to report to or start work for the day; reference time point 216 may be taken from the user social media profile, preferences, or a scheduled event, such as a scheduled group activity involving the user; reference time point 218 may be taken from the user calendar, denoting a time when the user is due to return from or stop work for the day; and reference time point 220 may also be taken from the user calendar, denoting a time when the user is expecting guests to arrive at home.
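Deriving such reference time points from a user schedule may be sketched as follows, assuming the calendar exposes events as (datetime, label) pairs; the schema and names are hypothetical:

```python
from datetime import datetime, timedelta

def reference_time_points(playback_start, total_runtime_s, events):
    """Map schedule events that fall within the playback window to
    (offset_seconds, label) pairs for display along the progress bar."""
    playback_end = playback_start + timedelta(seconds=total_runtime_s)
    points = [((when - playback_start).total_seconds(), label)
              for when, label in events
              if playback_start <= when <= playback_end]
    return sorted(points)  # ascending position along the progress bar
```

Events outside the playback window (e.g., tomorrow's work start) are simply not displayed.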
[0030] In some embodiments, one or more of the reference time points is based on a schedule of a user. In an embodiment of the disclosure, as shown in
[0031] As earlier discussed, media device 302 may generate for display, in response to a first input, such as R1 or a first touch input, such as r1, one or more reference time points on a media progress bar 322 based on a schedule of a user. In some embodiments, receiving a second input, R2, may include receiving a selection from the reference time points 314-320. For example, a user selection of reference time point 320 may represent receiving the second input, R2. The selection of any other of the reference time points 314-318 may alternatively indicate receipt of the second input, R2.
[0032] As in the embodiment of
[0033] In some embodiments, in response to the second input, R2, the shortened runtime is automatically determined based on one or more of a schedule of a user or a user profile. For example, in response to selection of reference time point 320, the shortened runtime (of a shortened version of the media asset displayed by media device 302) may be automatically determined. Reference time point 320 may be automatically taken from a user calendar and may represent a scheduled time for when the user expects a house guest. Accordingly, the system may automatically self-adjust, setting the end time of the shortened version of the media asset to a convenient time, either immediately before or a user-programmable number of minutes before the house guest is due to arrive.
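The self-adjusting end time above amounts to the time remaining before the scheduled event minus a user-programmable buffer, sketched here with illustrative names:

```python
from datetime import datetime, timedelta

def auto_shortened_runtime(now, event_time, buffer_minutes=5):
    """Runtime available before a scheduled event, ending a
    user-programmable number of minutes early (never negative)."""
    available = (event_time - now) - timedelta(minutes=buffer_minutes)
    return max(available, timedelta(0))
```

For instance, with guests due in one hour and a 5-minute buffer, the shortened version would be limited to 55 minutes.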
[0034] As with
[0035] The media asset runtime modification systems of various embodiments may include a remote or local connection between a media device displaying the media asset and a content database maintaining segments of the media asset to be shortened. In an embodiment of the disclosure, as shown in
[0036] In
[0037] In some embodiments, such as shown in the embodiment of
[0038] In some embodiments and without limitation, database 434 may comprise one or more flat, hierarchical, relational or network types of databases. In some embodiments, database 434 may be, in part or in whole, incorporated in or coupled to a server, a networking device, such as the server illustrated and discussed relative to
[0039]
[0040] At step 502, media device 102 receives a first input, such as input R1, which includes a request to modify a current runtime of a media asset currently displaying on a display screen of media device 102. The media asset includes media asset segments. Input R1 with the request to modify the runtime of the media asset may be received in various ways. For example, as earlier noted, input R1 may be a touch input on the display screen of media device 102 indicative of a request when received at a designated area of the display screen or when the touch input is detected for a minimum threshold period of time. Input R1 may be generated from a click or depression of a particular button of a remote control device communicatively coupled to media device 102. Input R1 may be any input recognizable by media device 102 for facilitating the request to modify the media asset runtime.
[0041] At step 504, a second input, such as input R2, is received by media device 102 that triggers the generation of a shortened version of the media asset displaying on the display screen of media device 102. The shortened version of the media asset has a shortened runtime relative to the total runtime of the media asset. For example, the total runtime of the media asset may be 2 hours whereas the shortened runtime may be 15 minutes. At step 506, media device 102 generates for display on the display screen of media device 102 the shortened version of the media asset.
[0042] In
[0043] At step 602, media device 302 receives a first input, such as input R1, which may be a first touch input, r1, and a second input, which may be a second touch input, r2. In some embodiments, inputs R1 and R2 may be received in a manner as described relative to input R1 of
[0044] At step 604, media device 302 generates for display on its display screen a media progress bar, such as progress bar 322. In some embodiments, media device 302 displays the progress bar when or shortly after it starts to display the media asset in which case step 604 may precede step 602. At step 606, in response to receiving the first input, for example touch input r1, media device 302 generates for display one or more reference time points, for example, reference time points 314-320, on or in the vicinity of progress bar 322. As earlier discussed, reference time points 314-320 may be based on the schedule of a user of media device 302.
[0045] At step 608, media device 302 waits for a second input, such as touch input r2, and in response to receiving the second touch input, media device 302 proceeds to step 610. In some embodiments, touch input r2 is as described above relative to the embodiments of
[0046] In
[0047] At step 702, media device 102 identifies a segment of a media asset being generated for display on the display screen of media device 102. For example, media device 102 may retrieve a media asset segment from segments 436 (
[0048] At step 706, media device 102 compares the runtime of the modified version of the media asset of step 704 to a desired runtime. The desired runtime may be selectable and programmable. For example, media device 102 may store a user-selected desired runtime in the user preferences (of user profile 426) in database 434, or the desired runtime may be received as a desired runtime input, from the user, in real time. At step 706, media device 102 determines whether to perform step 708 or step 712 based on the outcome of the comparison. In response to media device 102 determining the runtime of the modified version of the media asset is greater than the desired runtime, media device 102 performs step 708, and in response to media device 102 determining the runtime of the modified version of the media asset is less than the desired runtime, media device 102 performs step 712.
[0049] At step 712, media device 102 generates (outputs) the modified version of the media asset as the shortened version of the media asset. At step 708, media device 102 identifies the next segment of the media asset, analogously to identifying a segment at step 702. That is, the identification of the next segment at step 708 may be based on the next lowest level of importance. The level of importance may be based on a user preference as earlier discussed. At step 710, the modified version of the media asset is updated by removing the next segment of the media asset identified at step 708, and process 700 repeats from step 706. Media device 102 repeats steps 706-710 until a modified version of the media asset with a runtime smaller than the desired runtime is determined at step 706, at which point process 700 proceeds to step 712.
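The loop of steps 706-710 may be sketched as follows, assuming each segment carries hypothetical `importance`, `duration_s`, and `start_s` fields:

```python
def shorten_to_runtime(segments, desired_runtime_s):
    """Drop the least-important remaining segment until the total runtime
    no longer exceeds the desired runtime (mirrors the step 706-712 loop)."""
    # Keep most-important segments first; removal candidates sit at the end.
    remaining = sorted(segments, key=lambda s: s["importance"], reverse=True)
    while remaining and sum(s["duration_s"] for s in remaining) > desired_runtime_s:
        remaining.pop()  # removes the current lowest-importance segment
    # Restore playback order before returning the shortened version.
    return sorted(remaining, key=lambda s: s["start_s"])
```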
[0050] The order of steps of each of the processes 500-700, as shown in the flowcharts of
[0051] A user device may access, process, transmit, and receive signals, among other features, for example to carry out the functions and implementations shown and described herein with one or more media devices (i.e., user equipment or user devices), such as the generalized embodiments of an illustrative user device.
[0052] In the embodiment of
[0053] In some embodiments, server 802 is, in part or in whole, incorporated in communication network 814. In some embodiments, communication network 814 is configured as communication network 432 of
[0054] In some embodiments, computing device 800 may be configured, in whole or in part, as a media device. In some embodiments, computing device 800 may include any user electronic device that performs media asset shortening or modification operations as disclosed herein. In some embodiments, computing device 800 may incorporate, in part or in whole, or be communicatively coupled to, each of media devices 102, 202, 302, or 402 of
[0055] Computing device 800 is shown to generally include control circuitry 828, hardware interface 842, speaker 832, display 834, and computing device interface 836. In some embodiments, display 834 is configured as or analogous to, in whole or in part, media devices 102, 202, 302, or 402 of
[0056] In some embodiments, display 834 (or display screen 834) may include a touchscreen, a television display or a computer display. In a practical example, display 834 may display a media asset or a shortened version of media asset, as processed by devices 102, 202, 302, or 402. Display 834 may further display a media progress bar, for example progress bars 122, 222, or 322 of
[0057] In some embodiments, computing device 800 is part of a system along with a server 802 and a communication network 814. It is understood that while a single instance of a component may be shown and described relative to
[0058] Communication network 814 may comprise one or more network systems, such as, without limitation, the Internet, a LAN, Wi-Fi, or other network systems suitable for audio processing applications. In some embodiments, the system of
[0059] Server 802 includes control circuitry 820 comprising processing circuitry 826 and storage 824. Each of storages 824 and 838 may be an electronic storage device. Each storage 824, 838 may be used to store various types of content, metadata, and or other types of data. Non-volatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage may be used to supplement storages 824, 838 or instead of storages 824, 838. In some embodiments, control circuitry 820 and/or 828 executes instructions for an application stored in memory (e.g., storage 824 and/or storage 838). Specifically, control circuitry 820 and/or 828 may be instructed by the application to perform the functions discussed herein. In some implementations, any action performed by control circuitry 820 and/or 828 may be based on instructions received from the application. For example, the application may be implemented as software or a set of executable instructions that may be stored in storage 824 and/or 838 and executed by control circuitry 820 and/or 828. In some embodiments, the application may be a client/server application where only a client application resides on computing device 800, and a server application resides on server 802.
[0060] The application may be implemented using any suitable architecture. For example, it may be a stand-alone application wholly implemented on computing device 800. In such an approach, instructions for the application are stored locally (e.g., in storage 838), and data for use by the application is downloaded on a periodic basis (e.g., from an out-of-band feed, from an Internet resource, or using another suitable approach). Control circuitry 828 may retrieve instructions for the application from storage 838 and process the instructions to perform the functionality described herein. Based on the processed instructions, control circuitry 828 may, for example, perform processes 500-700 of
[0061] In client/server-based embodiments, control circuitry 828 may include communication circuitry suitable for communicating with an application server (e.g., server 802) or other networks or servers. The instructions for carrying out the functionality described herein may be stored on the application server. Communication circuitry may include a cable modem, an Ethernet card, or a wireless modem for communication with other equipment, or any other suitable communication circuitry. Such communication may involve the Internet or any other suitable communication networks or paths (e.g., communication network 814). In another example of a client/server-based application, control circuitry 828 implements a web browser that interprets web pages provided by a remote server (e.g., server 802). For example, the remote server may store the instructions for the application in a storage device. The remote server may process the stored instructions using circuitry (e.g., control circuitry 828) and/or generate displays. Computing device 800 may receive the displays generated by the remote server and may display the content of the displays locally via display 834. This way, the processing of the instructions is performed remotely (e.g., by server 802) while the resulting displays, such as the display windows described elsewhere herein, are provided locally on computing device 800. Computing device 800 may receive inputs from the user via input circuitry 850 and transmit those inputs to the remote server for processing and generating the corresponding displays. Alternatively, computing device 800 may receive inputs from the user via input circuitry 850 and process and display the received inputs locally, by control circuitry 828 and display 834, respectively.
[0062] Server 802 and computing device 800 may transmit and receive content and data such as media content data, representing a media asset, media asset segments, a modified version of a media asset, or a shortened version of the media asset, via communication network 814. For example, server 802 may be configured as a media content processor, and computing device 800 may be configured as a media content device to transmit media content data in media content files to and receive media content files from server 802. In some embodiments, server 802 may be configured as a server communicatively coupled to media devices 102, 202, 302, and 402 of
[0063] In some embodiments, processing circuitry 840, control circuitry 828, or a combination thereof, may implement one or more of the processes in media devices 102, 202, 302, or 402 of
[0064] Control circuitry 820 and/or 828 may be based on any suitable processing circuitry such as processing circuitry 826 and/or 840, respectively. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores). In some embodiments, processing circuitry may be distributed across multiple separate processors, for example, multiple of the same type of processors (e.g., two Intel Core i9 processors) or multiple different processors (e.g., an Intel Core i7 processor and an Intel Core i9 processor). In some embodiments, control circuitry 820 and/or control circuitry 828 are configured to implement a media asset management system, such as systems 100, 200, 300, or 400 of
[0065] Computing device 800 receives a user input at input circuitry 850. For example, computing device 800 may receive text data and user input, as previously discussed. Computing device 800 transmits data through output circuitry 852. For example, computing device 800 may transmit audio data through output circuitry 852. In some embodiments, computing device 800 is a user device (or player) configured as devices 102, 202, 302, or 402 of
[0066] In some embodiments, input circuitry 850 and output circuitry 852 may be configured as a part of or coupled to a media device, such as the media devices of
[0067] Processing circuitry 840 may receive input from input circuitry 850. Processing circuitry 840 may convert or translate the received user input, which may be in the form of a haptic movement, a touch, or a gesture, to digital signals. In some embodiments, input circuitry 850 performs the translation to digital signals. In some embodiments, processing circuitry 840 (or processing circuitry 826, as the case may be) carries out disclosed processes and methods. For example, processing circuitry 840 or processing circuitry 826 may perform processes 500-700 of
[0068] In some embodiments, display 834 is caused by generation of a display by devices 102, 202, 302, and 402 of
[0069] Speaker 832 may be provided as integrated with other elements of user device 800 or may be a stand-alone unit. The audio component of videos and other content displayed on display 834 may be played through speaker 832. In some embodiments, the audio may be distributed to a receiver (not shown), which processes and outputs the audio via speaker 832. In some embodiments, for example, control circuitry 828 is configured to provide audio cues to a user, or other audio feedback to a user, using speaker 832. In some embodiments, the audio receiver of computing device 800 may be a microphone configured to receive audio input such as voice utterances or speech. For example, a user may speak letters or words that are received by the microphone and converted to text by control circuitry 828. In a further example, a user may voice commands that are received by the microphone and recognized by control circuitry 828.
[0070] An application may be implemented using any suitable architecture. For example, a stand-alone application may be wholly implemented on computing device 800. In some such embodiments, instructions for the application are stored locally (e.g., in storage 838), and data for use by the application is downloaded on a periodic basis (e.g., from an out-of-band feed, from an Internet resource, or using another suitable approach). Control circuitry 828 may retrieve instructions of the application from storage 838 and process the instructions to generate any of the displays discussed herein. Based on the processed instructions, control circuitry 828 may determine what action to perform when input is received from input circuitry 850. For example, when input circuitry 850 indicates that a displayed word was selected by a double-click, the processed instructions may cause the word to be highlighted on screen. An application and/or any instructions for performing any of the embodiments discussed herein may be encoded on computer-readable media. Computer-readable media includes any media capable of storing data. The computer-readable media may be transitory, including, but not limited to, propagating electrical or electromagnetic signals, or it may be non-transitory including, but not limited to, volatile and non-volatile computer memory or storage devices such as a hard disk, floppy disk, USB drive, DVD, CD, media cards, register memory, processor caches, Random Access Memory (“RAM”), etc.
[0071] The systems and processes discussed above are intended to be illustrative and not limiting. One skilled in the art would appreciate that the actions of the processes discussed herein may be omitted, modified, combined, and/or rearranged, and any additional actions may be performed without departing from the scope of the invention. More generally, the above disclosure is meant to be exemplary and not limiting. Only the claims that follow are meant to set bounds as to what the present disclosure includes. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, done in different orders, or done in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.