SYSTEMS AND METHODS FOR AUTOMATED EVALUATION OF DIGITAL SERVICES
20230140605 · 2023-05-04
Inventors
CPC classification
G06Q10/06393
PHYSICS
H04L41/5009
ELECTRICITY
H04L43/08
ELECTRICITY
G06Q10/0639
PHYSICS
H04L41/509
ELECTRICITY
H04L43/20
ELECTRICITY
International classification
Abstract
A digital service evaluation system evaluates services and the client sessions provided by a service to produce an overall score for the service. The digital service evaluation system detects client sessions associated with one or more devices. The digital service evaluation system obtains a first plurality of scores associated with performance metrics of a client session, and calculates a first overall score for the client session. The digital service evaluation system obtains a second plurality of scores and calculates a second overall score. The digital service evaluation system determines a weight for each performance metric based on the first and second pluralities of scores and the first and second overall scores. The digital service evaluation system uses the weights to determine which performance metric caused a change between the overall scores. The digital service evaluation system takes an action based on the determination that a performance metric caused a change in the overall scores.
Claims
1. A service evaluation system, comprising: at least one processor; and at least one memory coupled to the at least one processor, the memory having computer-executable instructions stored thereon that, when executed by the at least one processor, cause the system to: detect a client session associated with a device, the device having access to a service via the client session; determine a first overall score describing an overall metric of the client session at a first time based on at least a first plurality of scores, each score of the first plurality of scores describing a different performance metric of a plurality of performance metrics of the client session; determine a second overall score describing the overall metric of the client session at a second time based on at least a second plurality of scores, each score of the second plurality of scores describing a different performance metric of the plurality of performance metrics of the client session; determine a weight of each performance metric of the plurality of performance metrics based on at least the first plurality of scores, the second plurality of scores, the first overall score, and the second overall score; identify a performance metric of the plurality of performance metrics of the client session that caused a change in the overall metric between the first time and the second time based on at least the weight of each performance metric of the plurality of performance metrics, the first plurality of scores, and the second plurality of scores; and take an action to change the identified performance metric, such that the overall metric of the client session increases.
2. The service evaluation system of claim 1, wherein to take an action to change the performance metric, the computer-executable instructions further cause the system to: identify at least one aspect of the service based on the identified performance metric; and determine at least one action to change the at least one aspect of the service based on the identified performance metric and the at least one aspect of the service.
3. The service evaluation system of claim 1, wherein one or more scores of the first plurality of scores are obtained via user input obtained from the device.
4. The service evaluation system of claim 1, wherein the different performance metrics of the client session comprise performance metrics related to the performance of the device.
5. The service evaluation system of claim 4, wherein the performance metrics related to the performance of the device include performance metrics related to at least one of: a crash rate of the client session; a quality of content provided by the service; a time spent buffering content provided by the service; a time spent accessing the client session; or a likelihood a user of the device would recommend the service.
6. The service evaluation system of claim 1, wherein the computer-executable instructions further cause the system to: receive an indication of whether one or more performance metrics should be prioritized; and alter the weight of each performance metric based on the indication of whether one or more performance metrics should be prioritized.
7. The service evaluation system of claim 1, wherein the different performance metrics of the client session comprise performance metrics related to the performance of an application receiving the client session.
8. The service evaluation system of claim 1, wherein the different performance metrics include information obtained from crash reports and application logs.
9. A non-transitory processor-readable storage medium that stores at least one of instructions or data, the instructions or data, when executed by at least one processor, cause the at least one processor to: identify a client session associated with a device, the device having access to a service via the client session; determine a first overall score of the client session at a first time based on at least a first plurality of scores, each score of the first plurality of scores describing at least one performance metric of a plurality of performance metrics of the client session; determine a second overall score of the client session at a second time based on at least a second plurality of scores, each score of the second plurality of scores describing at least one performance metric of the plurality of performance metrics of the client session; determine a weight for each performance metric of the plurality of performance metrics based on at least the first plurality of scores, the second plurality of scores, the first overall score, and the second overall score; identify one or more performance metrics of the plurality of performance metrics of the client session that caused a change in an overall score of the client session based on at least the weight of each performance metric of the plurality of performance metrics, the first plurality of scores, and the second plurality of scores; and cause at least one performance metric of the one or more performance metrics to be changed, such that the overall score of the client session increases.
10. The non-transitory processor-readable storage medium of claim 9, wherein to cause at least one performance metric to be changed, the processor is further caused to: identify at least one aspect of the service based on the identified one or more performance metrics; and determine at least one action to change the at least one aspect of the service based on the identified one or more performance metrics and the at least one aspect of the service.
11. The non-transitory processor-readable storage medium of claim 10, wherein the at least one action includes causing the processor to: receive diagnostic data related to the client session; identify at least one cause of the change in the overall score of the client session based on the diagnostic data, the identified one or more performance metrics, and the at least one aspect of the service; identify at least one entity associated with the at least one aspect of the service; and cause the diagnostic data and the at least one cause to be transmitted to the at least one entity.
12. The non-transitory processor-readable storage medium of claim 11, wherein the diagnostic data includes at least one of: information regarding the performance of the client session; and information received from a user of the device.
13. The non-transitory processor-readable storage medium of claim 10, wherein the at least one action includes causing the processor to: receive diagnostic data related to the client session; identify at least one cause of the change in the overall score of the client session based on the diagnostic data, the identified one or more performance metrics, and the at least one aspect of the service; determine whether the cause of the change in the overall score of the client session is related to an aspect of a subscription to the service by a user of the device; and based on a determination that the cause of the change is related to an aspect of the subscription, cause an aspect of the subscription to the service to be changed based on the at least one cause.
14. The non-transitory processor-readable storage medium of claim 10, wherein to identify the at least one aspect of the service, the processor is further caused to: receive a plurality of overall scores for client sessions other than the client session; and identify at least one cause of the change in the overall score of the client session based on a comparison of the overall score for the client session to the plurality of overall scores, the identified one or more performance metrics, and the at least one aspect of the service.
15. A system, comprising: at least one processor; and at least one memory coupled to the at least one processor, the memory having computer-executable instructions stored thereon that, when executed by the at least one processor, cause the system to: detect a plurality of client sessions, each client session being provided to at least one device by a service; determine a first overall score of the service at a first time based on at least a first plurality of overall client session scores, each respective score of the first plurality of overall client session scores being determined based on one or more performance metrics of a respective client session of the plurality of client sessions; determine a second overall score of the service at a second time based on at least a second plurality of overall client session scores, each respective score of the second plurality of overall client session scores being determined based on one or more performance metrics of a respective client session of the plurality of client sessions; determine a weight of each client session of the plurality of client sessions based on at least the first plurality of overall client session scores, the second plurality of overall client session scores, the first overall score, and the second overall score; identify one or more client sessions of the plurality of client sessions which had an impact on the change between the first overall score and the second overall score based on the determined weights for each client session; and take an action based on the one or more identified client sessions and the impact of the one or more identified client sessions on the change between the first overall score and the second overall score.
16. The system of claim 15, wherein to take the action based on the one or more identified client sessions, the computer-executable instructions further cause the system to: identify at least one aspect of the service based on the one or more client sessions; and determine the action based on the at least one aspect of the service.
17. The system of claim 16, wherein the action comprises one or more of: obtaining diagnostic information from one or more devices to which a client session was provided; causing the at least one aspect of the service to be changed; and changing one or more subscriptions associated with one or more client sessions of the plurality of client sessions.
18. The system of claim 16, wherein to identify one or more client sessions, the computer-executable instructions further cause the system to: for each respective client session of the plurality of client sessions: identify a plurality of performance metrics for measuring an overall score of the respective client session; identify one or more performance metrics of the plurality of performance metrics that had a greater impact on the overall score of the respective client session than other performance metrics of the plurality of performance metrics; and identify the one or more client sessions based on the identified one or more performance metrics for each client session and the determined weights for each client session.
19. The system of claim 16, wherein to determine the action, the computer-executable instructions further cause the system to: receive a plurality of overall scores for services other than the service; identify at least one cause of the change in the overall score of the service based on a comparison of the overall score for the service to the plurality of overall scores and the at least one aspect of the service; and determine the action based on the at least one aspect of the service and the cause of the change in the overall score of the service.
20. The system of claim 16, wherein to determine the action, the computer-executable instructions further cause the system to: for each respective client session of the identified one or more client sessions: receive information regarding the performance of the client session from a user of a device associated with the respective client session; and determine the action based on the at least one aspect of the service and the information received for each of the identified one or more client sessions.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
DETAILED DESCRIPTION
[0015] Developers are generally unable to determine the impact of changes and adjustments made to a service. This impact could include user satisfaction, performance of the service on certain devices, user retention, growth of the user base, etc. Additionally, a comparison of the impact of changes is difficult to perform because developers lack an overall score to use as a reference point when comparing changes in the service over time. Furthermore, it is difficult to compare the service to other services, such as those provided by the same service provider, similar services from other service providers, etc., without an overall score calculated based on the performance metrics of a service.
[0016] The embodiments disclosed herein address the issues above, help solve the above technical problems, and improve the evaluation of services by providing a technical solution that obtains performance data describing the performance of a service and calculates an overall score based on that performance data. The system to evaluate services (or “digital service evaluation system”) additionally uses the performance data to determine which performance metrics, of a plurality of performance metrics used to determine the overall score, caused the greatest changes to the overall score over time. In some embodiments, a digital service evaluation system electronically detects a plurality of client sessions associated with one or more devices, obtains a first plurality of scores each describing one or more performance metrics of a client session, and uses the scores to determine a first overall score of the service. The digital service evaluation system receives a second plurality of scores associated with the client session and uses the second plurality of scores to determine a second overall score of the service. The digital service evaluation system then determines a weight for each performance metric based on the first overall score, the second overall score, the first plurality of scores, and the second plurality of scores. The digital service evaluation system uses the weight of each performance metric to determine which performance metric caused a change between the first overall score and the second overall score, and takes an action to change that performance metric such that the overall score increases. The digital service evaluation system is thus able to evaluate a service based on a large number of client sessions to give an accurate score describing the overall performance of the service.
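The weighting and attribution described in this paragraph can be illustrated with a minimal sketch. All function names, the 0-100 score scale, and the weighted-delta contribution heuristic are illustrative assumptions for exposition, not the claimed method:

```python
# Illustrative sketch only: names, scales, and the attribution
# heuristic are assumptions, not the patented method.

def overall_score(scores, weights):
    """Weighted average of per-metric scores (0-100 scale assumed)."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

def metric_driving_change(first_scores, second_scores, weights):
    """Return the metric whose weighted score delta contributes most
    to the change between the two overall scores."""
    def contribution(metric):
        return abs(weights[metric] * (second_scores[metric] - first_scores[metric]))
    return max(first_scores, key=contribution)

first = {"crash_rate": 90, "buffering": 80, "nps": 70}
second = {"crash_rate": 88, "buffering": 40, "nps": 71}
weights = {"crash_rate": 0.5, "buffering": 0.3, "nps": 0.2}

# buffering's weighted drop (0.3 * 40 = 12) dominates the change
assert metric_driving_change(first, second, weights) == "buffering"
```

A real system would attribute changes across many sessions and metrics at once; this sketch only shows the single-session shape of the computation.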
In some embodiments, the performance metrics include data related to a user's experience, such as user engagement, churn rate, user satisfaction, desire to recommend the service, etc. The performance metrics related to a user's experience may be part of a net promoter score (a score describing the likelihood of a user to recommend, or promote, a service). The performance metrics may include scores related to the functioning of the service, such as time spent on the service, quality of the content (such as video quality, audio quality, etc.), the crash rate, the number of client sessions within a certain time period, etc. In some embodiments, the digital service evaluation system obtains the data making up the performance metrics from log files generated by the service. The digital service evaluation system may obtain the data making up the performance metrics from crash reports. The digital service evaluation system may obtain the data making up the performance metrics from user input. The digital service evaluation system may utilize a survey presented to a user to obtain data for the performance metrics.
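For reference, the net promoter score mentioned above is conventionally computed from 0-10 survey responses as the percentage of promoters (ratings of 9 or 10) minus the percentage of detractors (ratings of 0 through 6). A minimal sketch:

```python
def net_promoter_score(ratings):
    """Standard NPS: percent promoters (9-10) minus percent detractors (0-6).

    `ratings` is a list of integer survey responses on a 0-10 scale.
    """
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / n

# Two promoters (10, 9) and two detractors (6, 3) out of six cancel out.
assert net_promoter_score([10, 9, 8, 7, 6, 3]) == 0.0
```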
[0017] In some embodiments, the digital service evaluation system uses the performance metrics to obtain an overall score for the service. The digital service evaluation system may use the performance metrics to obtain a score for a specific portion of a service. The digital service evaluation system may use the performance metrics to obtain a score for individual client sessions. The digital service evaluation system may use the performance metrics to obtain a score for specific devices, or groups of devices, associated with a user.
[0018] In some embodiments, one or more of the performance metrics are classified as a prioritized performance metric. The digital service evaluation system may alter the weight of a performance metric based on a determination that it is a prioritized performance metric. The digital service evaluation system may obtain user input indicating that a performance metric should be prioritized. The digital service evaluation system may utilize the prioritized performance metrics to create a customized score for a specific user, device, group of devices, etc.
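The weight alteration for prioritized performance metrics might be sketched as follows; the multiplicative boost factor and the renormalization step are illustrative assumptions:

```python
def apply_priorities(weights, prioritized, boost=2.0):
    """Scale the weights of prioritized metrics, then renormalize so the
    weights again sum to 1. `boost` is an assumed tuning parameter."""
    raw = {m: w * (boost if m in prioritized else 1.0) for m, w in weights.items()}
    total = sum(raw.values())
    return {m: w / total for m, w in raw.items()}

# Prioritizing "crash_rate" doubles its raw weight before renormalizing.
adjusted = apply_priorities({"crash_rate": 0.5, "buffering": 0.5}, {"crash_rate"})
assert adjusted["crash_rate"] > adjusted["buffering"]
```

Renormalizing keeps the overall score on the same scale before and after prioritization, which makes customized scores for different users or devices comparable.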
[0019] In some embodiments, the client session may include one or more other sessions, such as a video session, an audio session, a game session, etc. (collectively “sub-sessions”). The digital service evaluation system may obtain performance metrics from each sub-session within a client session. The digital service evaluation system may determine an overall score for the sub-sessions based on the obtained performance metrics from each sub-session.
[0020] In some embodiments, the digital service evaluation system obtains data specifying a device type of a device receiving the client session. The digital service evaluation system may use the device type to determine an overall score for client sessions related to specific devices. The digital service evaluation system may obtain data specifying a type of service associated with a client session. The digital service evaluation system may use the service type to determine an overall score for a specific service. The digital service evaluation system may compare the overall score for a specific service to the overall score of another service. The overall score may be a linear score. The overall score may be a matrix score, such that the overall score reflects a score for two or more parameters.
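A matrix score over two parameters, such as device type and service type, could be sketched as below; the record fields and grouping keys are assumed names:

```python
from collections import defaultdict

def matrix_score(session_records):
    """Average session scores grouped by (device_type, service_type).

    `session_records` is a list of dicts with assumed keys
    "device_type", "service_type", and "score".
    """
    sums = defaultdict(lambda: [0.0, 0])
    for rec in session_records:
        key = (rec["device_type"], rec["service_type"])
        sums[key][0] += rec["score"]
        sums[key][1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

records = [
    {"device_type": "tv", "service_type": "video", "score": 80},
    {"device_type": "tv", "service_type": "video", "score": 60},
    {"device_type": "phone", "service_type": "video", "score": 90},
]
# Each cell of the matrix is the mean score for that device/service pair.
assert matrix_score(records)[("tv", "video")] == 70.0
```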
[0021] In some embodiments, the digital service evaluation system takes an action after determining that a performance metric caused a change in the overall score. The action may include alerting a developer that the performance metric caused a change in the overall score. The action may include a message to a user of a device which received a client session. The digital service evaluation system may request information from a user to determine why the performance metric changed the overall score. The action may include comparing overall scores for the service related to different devices. The action may include comparing overall scores for different services.
[0022] In some embodiments, the digital service evaluation system transmits a client session indicating a digital video to a device. The device may obtain performance data describing performance metrics of the client session. The digital service evaluation system may obtain the performance data from the device. The digital service evaluation system may use the performance data to determine whether there was a change in the overall performance of the client session. The digital service evaluation system may transmit an indication of an action to take to the device based on the determination of whether there was a change in the overall performance of the client session.
[0023] In some embodiments, the digital service evaluation system includes a service evaluation data structure. The service evaluation data structure may be used to determine an overall performance metric of a client session. The service evaluation data structure may be used to determine whether one or more performance metrics caused a change in the overall performance metric of the client session. The service evaluation data structure may include information specifying a client session of a plurality of client sessions. The service evaluation data structure may include information specifying a digital video associated with the client session. The service evaluation data structure may include information specifying one or more scores representing performance metrics of the client session and an overall score based on the scores representing the performance metrics. The service evaluation data structure may include information specifying a weight of each performance metric based on the scores representing performance metrics and the overall score.
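One possible shape for such a service evaluation data structure, sketched as a Python dataclass; the field names and the simple weighted-average derivation are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class SessionEvaluation:
    """Hypothetical record tying a client session to its scores and weights."""
    session_id: str
    video_id: Optional[str]        # digital video associated with the session, if any
    metric_scores: Dict[str, float]  # performance metric name -> score
    weights: Dict[str, float]        # performance metric name -> weight

    @property
    def overall_score(self) -> float:
        """Overall score derived from the weighted per-metric scores."""
        total_weight = sum(self.weights.values())
        return sum(self.metric_scores[m] * self.weights[m]
                   for m in self.metric_scores) / total_weight

ev = SessionEvaluation(
    session_id="s1",
    video_id="v1",
    metric_scores={"crash_rate": 80, "buffering": 60},
    weights={"crash_rate": 0.75, "buffering": 0.25},
)
assert ev.overall_score == 75.0
```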
[0024] Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open, inclusive sense, for example “including, but not limited to.”
[0025] Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0026] As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. The term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
[0027] The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
[0029] The user premises may also include an optional network, communication system, or networked system (not shown), to which the server 101, as well as user devices 103a-103c and other endpoint devices (not shown), may be coupled. Non-limiting examples of such a network or communication system include, but are not limited to, an Ethernet system, twisted pair Ethernet system, an intranet, a local area network (“LAN”) system, short range wireless network (e.g., Bluetooth®), a personal area network (e.g., a Zigbee network based on the IEEE 802.15.4 specification), a Consumer Electronics Control (CEC) communication system, Wi-Fi, satellite communication systems and networks, cellular networks, cable networks, or the like. One or more endpoint devices, such as PCs, tablets, laptop computers, smartphones, personal assistants, Internet connection devices, wireless LAN, WiFi, Worldwide Interoperability for Microwave Access (“WiMax”) devices, or the like, may be communicatively coupled to the network and/or to each other so that the plurality of endpoint devices are communicatively coupled together. Thus, such a network enables the server 101, user devices 103a-103c, and any other interconnected endpoint devices, to communicate with each other.
[0030] The user devices 103a-103c may include devices such as cellular telephones, smartphones, tablets, personal computers, laptop computers, wireless peripheral devices such as headphones, microphones, mice, keyboards, etc., security cameras, Internet of Things (or “smart”) devices, televisions, smart televisions, smart television devices—such as FireTV, Roku, AppleTV, etc.,—routers, digital assistants, personal assistant devices—such as Amazon Alexa, Google Home, etc.,—drones, etc. The user devices 103a-103c may interconnect to one or more communications media or sources, such as routers, network switches, modems, etc., to transmit communications to other devices. The user devices 103a-103c may transmit data representing performance metrics, data describing the user devices 103a-103c, etc., to the server 101.
[0031] The above description of the digital service evaluation system 100, and the various devices therein, is intended as a broad, non-limiting overview of an example environment in which various embodiments of a digital service evaluation system 100 can operate. The digital service evaluation system 100, and the various devices therein, may contain other devices, systems and/or media not specifically described herein.
[0032] Example embodiments described herein provide applications, tools, data structures and other support to implement systems and methods for evaluating services. The example embodiments described herein additionally provide applications, tools, data structures and other support to implement systems and methods for providing an overall score of a service, client session, sub-session, etc. Other embodiments of the described techniques may be used for other purposes, including for determining whether one or more performance metrics caused a change in the overall score and taking an action based on that determination. In the description provided herein, numerous specific details are set forth in order to provide a thorough understanding of the described techniques. The embodiments described also can be practiced without some of the specific details described herein, or with other specific details, such as changes with respect to the ordering of processes or devices, different processes or devices, and the like. Thus, the scope of the techniques and/or functions described are not limited by the particular order, selection, or decomposition of steps described with reference to any particular module, component, or routine.
[0034] While a server 101 configured as described above is typically used to support the operation of the digital service evaluation system 100, those skilled in the art will appreciate that the digital service evaluation system 100 may be implemented using devices of various types and configurations, and having various components. The memory 201 may include a service evaluation controller 210 which contains computer-executable instructions that, when executed by the CPU 202, cause the server 101 to perform the operations and functions described herein. The memory 201 may also include a client session detector 212 which contains computer-executable instructions that, when executed by the CPU 202, cause the server 101 to perform the operations and functions described herein. For example, the programs referenced above, which may be stored in computer memory 201, may include or be comprised of such computer-executable instructions.
[0035] The service evaluation controller 210 and client session detector 212 perform the core functions of the server 101, as discussed herein and with reference to the figures.
[0036] The client session detector 212 electronically detects digital signals to detect client sessions for use by a user device, such as user devices 103a-103c, and facilitates communication between the user device and the server for purposes of operating the client session. The client session detector 212 may additionally include instructions to receive an indication of a sub-session and facilitate communication between a user device and the server for purposes of operating the sub-session. For example, a user device may be communicating with the server via a client session and request several videos, which may require the creation of and transmission of at least one video session to the user device via the client session. The client session detector 212 may additionally contain computer-executable instructions to cause the digital service evaluation system 100 to perform some or all of the operations further described herein.
[0037] In an example embodiment, the service evaluation controller 210, client session detector 212, and/or computer-executable instructions stored on memory 201 of the server 101 are implemented using standard programming techniques. For example, the service evaluation controller 210, client session detector 212, and/or computer-executable instructions stored on memory 201 of the server 101 may be implemented as a “native” executable running on CPU 202, along with one or more static or dynamic libraries. In other embodiments, the service evaluation controller 210, client session detector 212, and/or computer-executable instructions stored on memory 201 of the server 101 may be implemented as instructions processed by a virtual machine that executes as some other program.
[0039] In addition, programming interfaces to the data stored as part of the service evaluation controller 210 and client session detector 212 can be available by standard mechanisms such as through C, C++, C#, and Java APIs; libraries for accessing files, databases, or other data repositories; through scripting languages such as JavaScript and VBScript; or through Web servers, FTP servers, or other types of servers providing access to stored data. The service evaluation controller 210 and client session detector 212 may be implemented by using one or more database systems, file systems, or any other technique for storing such information, or any combination of the above, including implementations using distributed computing techniques.
[0040] Different configurations and locations of programs and data are contemplated for use with techniques described herein. A variety of distributed computing techniques are appropriate for implementing the components of the embodiments in a distributed manner including but not limited to TCP/IP sockets, RPC, RMI, HTTP, Web Services (XML-RPC, JAX-RPC, SOAP, and the like). Other variations are possible. Also, other functionality could be provided by each component/module, or existing functionality could be distributed amongst the components/modules in different ways, yet still achieve the functions of the server 101 and/or user devices 103a-103c.
[0041] Furthermore, in some embodiments, some or all of the components/portions of the service evaluation controller 210, client session detector 212, and/or functionality provided by the computer-executable instructions stored on memory 201 of the server 101 may be implemented or provided in other manners, such as at least partially in firmware and/or hardware, including, but not limited to, one or more application-specific integrated circuits (“ASICs”), standard integrated circuits, controllers (e.g., by executing appropriate instructions, and including microcontrollers and/or embedded controllers), field-programmable gate arrays (“FPGAs”), complex programmable logic devices (“CPLDs”), and the like. Some or all of the system components and/or data structures may also be stored as contents (e.g., as executable or other machine-readable software instructions or structured data) on a computer-readable medium (e.g., as a hard disk; a memory; a computer network or cellular wireless network; or a portable media article to be read by an appropriate drive or via an appropriate connection, such as a DVD or flash memory device) so as to enable or configure the computer-readable medium and/or one or more associated computing systems or devices to execute or otherwise use or provide the contents to perform at least some of the described techniques. Such computer program products may also take other forms in other embodiments. Accordingly, embodiments of this disclosure may be practiced with other computer system configurations.
[0042] In general, a range of programming languages may be employed for implementing any of the functionality of the servers, user devices, etc., present in the example embodiments, including representative implementations of various programming language paradigms and platforms, including but not limited to, object-oriented (e.g., Java, C++, C#, Visual Basic.NET, Smalltalk, and the like), functional (e.g., ML, Lisp, Scheme, and the like), procedural (e.g., C, Pascal, Ada, Modula, and the like), scripting (e.g., Perl, Ruby, PHP, Python, JavaScript, VBScript, and the like) and declarative (e.g., SQL, Prolog, and the like).
[0043]
[0044] At start-up block 401, the client session 400 begins and the digital service evaluation system 100 may collect performance data related to the time required to start the client session 400. At loading delay block 403, the interface of the client session 400 is loaded, and the system may collect performance data related to the time required to load the interface. At video session blocks 450a and 450b, the client session 400 receives a video session 450, and the digital service evaluation system 100 may collect performance data related to the video session 450. The digital service evaluation system 100 may collect performance data related to other aspects of the client session (not shown), such as logging in, crashing, idle time, search time, loading delays for specific pages, content discovery time, customer feedback, etc.
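The stage timings described above (session start-up at block 401, interface loading at block 403) could be captured with a simple timer keyed by stage name. This is a minimal illustrative sketch, not the disclosed implementation; the class and stage names are assumptions.

```python
import time

class StageTimer:
    """Records elapsed wall-clock time for each named stage of a
    client session (e.g. start-up, interface loading). Illustrative."""

    def __init__(self):
        self.durations = {}   # stage name -> elapsed seconds
        self._starts = {}     # stage name -> monotonic start time

    def begin(self, stage):
        self._starts[stage] = time.monotonic()

    def end(self, stage):
        # Duration is measured against a monotonic clock so that
        # system clock adjustments do not corrupt the metric.
        self.durations[stage] = time.monotonic() - self._starts.pop(stage)

timer = StageTimer()
timer.begin("start_up")
# ... session start-up work would happen here ...
timer.end("start_up")
timer.begin("loading_delay")
# ... interface loading would happen here ...
timer.end("loading_delay")
```

The resulting `durations` dictionary is one plausible form for the raw performance data the evaluation system collects at each block.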
[0045]
[0046] At start delay block 451, the video session begins and the digital service evaluation system 100 may collect performance data related to the time required to start the video session. At buffering block 453, the video session buffers, or loads, the video onto a user device, and the digital service evaluation system 100 may collect performance data related to the time required to load the video. At playing video block 455, the user device plays the buffered video indicated by the video session, and the digital service evaluation system 100 may collect performance data related to playing the video. The digital service evaluation system 100 may also collect other performance data related to the video session (not shown), such as failure rate, black frames, video quality, audio quality, ad transitions, re-buffering times, crashing, loading program information, etc. Collection of such performance data includes electronic detection, in real time as the session is occurring, of the video session buffering or loading the video onto a user device, as well as of the failure rate, black frames, video quality, audio quality, ad transitions, re-buffering times, crashing, and loading of program information; because the session occurs at data rates of millions of bits per second, such collection cannot be performed in the human mind.
[0047] The digital service evaluation system 100 may collect performance data related to the client session, video session, or other sub-sessions, by prompting the user for feedback during, after, or before a session. The digital service evaluation system 100 may collect performance data related to the client session, video session, or other sub-sessions via logs created during the life of the session. The digital service evaluation system 100 may collect performance data related to the client session, video session, or other sub-sessions via external Application Programming Interfaces (APIs) configured to obtain performance data related to the sessions. The digital service evaluation system 100 may collect performance data related to the client session, video session, or other sub-sessions via an API configured to operate or manage the sessions. In some embodiments, the digital service evaluation system 100 requests the performance data from an API configured to operate or manage the sessions. In some embodiments, the API configured to operate or manage the sessions pushes the performance data to the digital service evaluation system 100.
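One plausible shape for the collection paths described above (session logs and a session-management API) is sketched below. The function, parameter, and log-format names are assumptions for illustration, not part of the disclosure.

```python
def collect_performance_data(session_id, api_client=None, log_lines=None):
    """Gather performance records for a session from whichever
    sources are available. All names here are illustrative."""
    records = []
    if api_client is not None:
        # Pull model: request the data from a session-management API
        # (hypothetical client object and method).
        records.extend(api_client.get_session_metrics(session_id))
    if log_lines is not None:
        # Log model: parse simple "metric=value" pairs from logs
        # created during the life of the session.
        for line in log_lines:
            key, _, value = line.partition("=")
            records.append({"session": session_id, key.strip(): float(value)})
    return records

logs = ["buffering_ms=420", "start_delay_ms=130"]
data = collect_performance_data("sess-1", log_lines=logs)
```

A push model, in which the session-management API sends records to the evaluation system unprompted, would simply invert the direction of the same exchange.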
[0048]
[0049] In some embodiments, the overall score of the client session, stored in column 516, is calculated based on the data stored within the service evaluation data structure. The digital service evaluation system 100 may receive the data after each type of data has been converted to conform to the same scale. The digital service evaluation system 100 may convert the data within the service evaluation data structure to conform to the same scale before calculating the overall score of the client session. The digital service evaluation system 100 may calculate a video session score based on data gathered during a video session 450, and may use that video session score as part of calculating the client score. The digital service evaluation system 100 may calculate a customer score based on data regarding customer feedback, and may use that customer score as part of calculating the client score. The digital service evaluation system 100 may use each of the client scores stored in the service evaluation data structure to calculate an overall score for the service. The digital service evaluation system 100 may use a similar data structure (not shown) to store performance data regarding, and calculate an overall score for, a service as a whole, a customer, a device or group of devices, a sub-session, a device type, etc.
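The score calculation described in this paragraph — converting each type of data to a common scale, then combining the per-metric scores — might be sketched as a weighted average. The metric names, scales, and weights below are illustrative assumptions.

```python
def normalize(value, worst, best):
    """Rescale a raw metric onto a common 0-100 scale, where `worst`
    maps to 0 and `best` maps to 100 (works for either direction)."""
    return 100.0 * (value - worst) / (best - worst)

def overall_score(scores, weights):
    """Weighted average of per-metric scores already on a common scale.
    `scores` and `weights` are dicts keyed by metric name."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

scores = {
    "start_delay": normalize(2.0, worst=10.0, best=0.0),    # 2 s startup
    "video_quality": 90.0,                                  # already 0-100
    "crash_rate": normalize(0.01, worst=0.05, best=0.0),    # 1% crashes
}
weights = {"start_delay": 0.3, "video_quality": 0.5, "crash_rate": 0.2}
client_score = overall_score(scores, weights)  # 85.0 for these inputs
```

The same weighted-average form could roll client scores up into a service-level score, or roll sub-session (e.g. video session) scores up into a client score.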
[0050] As shown in
[0051]
[0052]
[0053]
[0054] The graphical depictions of
[0055]
[0056] In some embodiments, a portion of the first plurality of scores is obtained via user input. The user input may be obtained via a survey presented on the device associated with the client session.
[0057] In some embodiments, the performance metrics relate to the performance of one or more devices having access to the service. The performance metrics may be related to: a crash rate of the client session, a video quality of the digital video, a time spent buffering the digital video, a time spent accessing the client session, a likelihood that a user of the device would recommend the service, etc.
[0058] In some embodiments, the digital service evaluation system 100 receives an indication that one or more performance metrics should be prioritized. The digital service evaluation system 100 may alter the weights of each performance metric based on the indication that one or more performance metrics should be prioritized. The digital service evaluation system 100 may alter the weighted formula based on the indication that one or more performance metrics should be prioritized.
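The re-weighting described above could, for instance, scale up the weights of the prioritized metrics and then renormalize so the weights still sum to one. The boost factor and metric names below are illustrative assumptions.

```python
def reprioritize(weights, prioritized, boost=2.0):
    """Scale up the weights of prioritized metrics, then renormalize
    so all weights again sum to 1. The boost factor is illustrative."""
    adjusted = {m: w * (boost if m in prioritized else 1.0)
                for m, w in weights.items()}
    total = sum(adjusted.values())
    return {m: w / total for m, w in adjusted.items()}

weights = {"start_delay": 0.25, "video_quality": 0.5, "crash_rate": 0.25}
new_weights = reprioritize(weights, prioritized={"crash_rate"})
# crash_rate's share rises from 0.25 to 0.4; the others shrink proportionally.
```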
[0059] In some embodiments, the performance metrics include performance data related to an application receiving a client session or sub-session. The performance metrics may include data obtained from crash reports and application logs related to an application receiving a client session or sub-session.
[0060] In some embodiments, the digital service evaluation system 100 uses a plurality of overall scores for a plurality of client sessions to determine an overall score for a service. The digital service evaluation system 100 may obtain a second plurality of overall scores for each of the plurality of client sessions at a future time. The digital service evaluation system 100 may use the first plurality of overall scores for a plurality of client sessions and the second plurality of overall scores for a plurality of client sessions to determine which performance metric caused a change between the first plurality of overall scores and the second plurality of overall scores. The digital service evaluation system 100 may then take an action to change the performance metric which caused a change between the first plurality of overall scores and the second plurality of overall scores.
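One way to attribute a change between the first and second overall scores to a single performance metric, as described above, is to compare each metric's weighted score delta and pick the largest contributor. The metric names and weights below are illustrative assumptions.

```python
def metric_causing_change(first_scores, second_scores, weights):
    """Attribute the change in the overall score to the metric whose
    weighted per-metric score moved the most between two snapshots."""
    contributions = {
        m: weights[m] * (second_scores[m] - first_scores[m])
        for m in weights
    }
    # The metric with the largest absolute weighted delta is treated
    # as the cause of the change in the overall metric.
    return max(contributions, key=lambda m: abs(contributions[m]))

first = {"start_delay": 80.0, "video_quality": 90.0, "crash_rate": 80.0}
second = {"start_delay": 78.0, "video_quality": 60.0, "crash_rate": 82.0}
weights = {"start_delay": 0.3, "video_quality": 0.5, "crash_rate": 0.2}
culprit = metric_causing_change(first, second, weights)  # "video_quality"
```

An action targeting the identified metric (here, the degraded video quality) could then be taken to raise the overall metric of the client session.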
[0061]
[0062] At act 904, the computing device obtains a second set of performance data describing performance metrics of the client session in a similar manner to act 902. At act 905, the computing device transmits the second performance data to the service evaluation server. At act 906, the computing device receives an indication of an action to take from the service evaluation server. In some embodiments, the indication of an action to take is based on a determination by the service evaluation server that a performance metric caused a change in an overall score calculated by the service evaluation server based on the performance data and the second performance data.
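The device-side sequence of acts 902–906 (collect, transmit, collect again, transmit, receive an action) might be sketched as a single report cycle. The callables below are illustrative stand-ins for the real collection and network I/O, not the disclosed implementation.

```python
def client_report_cycle(collect, transmit, receive_action):
    """One report cycle on the computing device: gather performance
    data twice, send both snapshots to the service evaluation server,
    then return whatever action the server indicates."""
    first = collect()       # act 902: first set of performance data
    transmit(first)         # act 903 (implied): send first snapshot
    second = collect()      # act 904: second set of performance data
    transmit(second)        # act 905: send second snapshot
    return receive_action() # act 906: indication of an action to take

sent = []
action = client_report_cycle(
    collect=lambda: {"start_delay_ms": 130},
    transmit=sent.append,
    receive_action=lambda: "reduce_bitrate",
)
```

Here the hypothetical `"reduce_bitrate"` action stands in for whatever change the server determines will improve the metric that caused the score change.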
[0063] In some embodiments, the computing device prompts the user for information related to the performance of the client session. The digital service evaluation system 100 may use the information related to the performance of the client session as at least a portion of the performance data.
[0064] In some embodiments, the computing device obtains, via user input, an indication of one or more prioritized performance metrics. The computing device may transmit the indication of one or more prioritized performance metrics to the service evaluation server.
[0065] In some embodiments, where the computing device receives a video session, the computing device obtains video performance data describing one or more performance metrics of the video session. The computing device may transmit the video performance data to the service evaluation server. The computing device may receive an indication of an action to take from the service evaluation server based on a determination that a performance metric of the video session caused a change in the overall performance of the video session.
[0066]
[0067] In some embodiments, the service evaluation data structure includes information specifying one or more video scores representing performance metrics of one or more video sessions. The digital service evaluation system 100 may use the video scores when determining the overall score. The digital service evaluation system 100 may utilize the video scores to determine an overall video metric describing the performance of one or more video sessions.
[0068] In some embodiments, the service evaluation data structure includes information specifying a service performance metric based on the overall metric for a plurality of client sessions. The service evaluation data structure may use the service performance metric to determine an overall score for the service.
[0069] In some embodiments, the service evaluation data structure includes information specifying a device type for each client session. The service evaluation data structure may use the device type to compare the performance of client sessions with different device types. The service evaluation data structure may use the device type to compare the performance of client sessions with the same device type.
[0070] In some embodiments, the service evaluation data structure includes information specifying a service type for each client session. The service evaluation data structure may use the service type to compare the performance of client sessions with different service types. The service evaluation data structure may use the service type to compare the performance of client sessions with the same service type.
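The device-type and service-type comparisons described in the paragraphs above amount to grouping session records by a type field and comparing overall scores across groups. The field names below are illustrative assumptions about the service evaluation data structure.

```python
from collections import defaultdict

def average_score_by_type(sessions, type_key):
    """Group client-session records by a type field (e.g. device type
    or service type) and average their overall scores per group."""
    buckets = defaultdict(list)
    for s in sessions:
        buckets[s[type_key]].append(s["overall_score"])
    return {t: sum(v) / len(v) for t, v in buckets.items()}

sessions = [
    {"device_type": "tv", "overall_score": 85.0},
    {"device_type": "tv", "overall_score": 75.0},
    {"device_type": "phone", "overall_score": 90.0},
]
by_device = average_score_by_type(sessions, "device_type")
# Comparing the group averages shows how the same service performs
# on different device types (or, with the same key value, within one).
```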
[0071] The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
[0072] These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.