DISTRIBUTED SCORING SYSTEM
20220343351 · 2022-10-27
Inventors
CPC Classification
International Classification
Abstract
Systems and methods are provided for computing user scores or ratings for an application program. The method comprises receiving, by a survey computer system, a survey request from a scoring computer system and providing multiple instances of a survey template to a plurality of end-user-computers. The survey computer system receives survey data from the plurality of end-user-computers and sends the survey data to the scoring computer system. The survey data is free of data allowing identification of individual end-users or end-user-computers. The scoring computer system computes a user experience score for the one application program as a function of the survey data and outputs a graphical representation of the computed score. The survey data is provided in a structure defined by one template, which makes the user experience of all end-users comparable.
Claims
1. A method for computing a score for an application program, the method comprising: providing a scoring computer system operatively coupled to a data repository, the data repository comprising registration data of multiple customers and their software applications; providing a survey computer system comprising one or more survey templates, wherein the data repository is protected against access by the survey computer system; receiving, by the survey computer system, a survey request from the scoring computer system, the survey request comprising an application-ID of one of the registered software applications; providing, by the survey computer system, multiple instances of the one of the survey templates to a plurality of end-user-computers; receiving, by the survey computer system, survey data from the plurality of end-user-computers via a network connection, the survey data being indicative of the user experience of end-users in respect to the one application program whose application-ID is comprised in the survey request, the survey data being provided in a structure defined by the template; sending the survey data from the survey computer system to the scoring computer system, whereby the sent survey data is free of data allowing identification of individual end-users or end-user-computers; and computing, by the scoring computer system, a score for the one application program as a function of the survey data received for at least the one application program, the score being indicative of the aggregated user experience of the end-users in respect to the one application program.
2. The method of claim 1 further comprising outputting a graphical representation of the computed score by the scoring computer system.
3. The method of claim 1, wherein the survey data received by the survey computer system comprises sensitive end-user data, in particular an IP-address and/or user-ID of the end-user-computers and wherein the method further comprises: storing, by the survey computer system, the received survey data such that the received survey data is protected against access by the scoring computer system; and removing, by the survey computer system, the sensitive end-user-data from the received survey data for providing anonymized survey data, wherein the anonymized survey data is sent from the survey computer system to the scoring computer system.
4. The method of claim 1, wherein the computing of the score for the one application program comprises: aggregating the totality of survey data currently comprised in the data repository which relates to the one application program and which was received via the requested survey; automatically identifying one or more other ones of the application programs being similar to the one application program in respect to one or more application program features and/or belonging to one or more other ones of the registered customers which are similar to the customer owning the one application program; and comparing the aggregated survey data of the one application program with the aggregated survey data of the one or more other identified application programs, wherein the score is computed as a function of the result of the comparison, whereby the score indicates the user experience of end-users in respect to the one application program relative to the user experience of end-users with the one or more identified other application programs.
5. The method according to claim 4, wherein: the scoring computer system is configured to perform, in response to receiving survey data of one or more end-users from the survey computer system and/or in response to a repeatedly generated scheduler signal, the computing of the score; wherein the identification of the one or more similar application programs is performed solely based on a comparison of metadata and without decrypting any customer identification data; and wherein preferably the score computation is performed irrespective of whether the admin-user of the customer owning the one application program is currently successfully authenticated at a key store unit comprising a customer-specific cryptographic key of the said customer.
6. The method according to claim 1, wherein the data repository comprises multiple sub-repositories respectively assigned to a different one of the registered customers, whereby the data stored in the sub-repositories are isolated from each other and wherein each sub-repository comprises: customer identification data of the customer to whom it is assigned; application-IDs of one or more applications owned by the said customer; application metadata of the one or more applications owned by the said customer; customer metadata of the said customer, the customer metadata being free of information allowing identification of the said customer; and survey data gathered for the one or more application programs owned by the customer, the survey data being free of customer identification data, wherein selectively the customer identification data in each of the sub-repositories is encrypted with a customer-specific cryptographic key, and wherein the survey data and the application metadata and the customer metadata are stored in cleartext form.
7. The method according to claim 5, wherein the scoring computer system is operatively coupled to a data storage unit referred to as key store, wherein the customer-specific cryptographic keys are stored in the key store; wherein the key store is configured to grant the scoring computer system access to the cryptographic key of a particular customer only after a successful authentication of an admin-user of the particular customer at the key store; and wherein the key store is configured to deny the scoring computer system access to the cryptographic key of the particular customer automatically upon a log-out event of the admin-user from the key store.
8. The method of claim 1, further comprising: creating, by the scoring computer system, the survey request for one of the registered customers; in response to receiving the request, creating, by the survey-computer system, a URL being unique for the requested survey; sending the URL from the survey-computer system to the scoring computer system; and providing, by the scoring computer system, the URL directly or via the one of the registered customers for which the survey request was created to the end-users, in particular via a printout, an e-mail, a webpage or via an app on an end-user-device.
9. The method of claim 8, wherein the survey computer system or the scoring computer system is configured to encode the URL in a 2D code, in particular a matrix code, and/or wherein the survey template instance is a web-form provided via a network.
10. The method according to claim 1, wherein the customers are companies and the data repository (220) comprises customer metadata, the customer metadata comprising one or more property values being indicative of a respective property of the customer; wherein the data repository comprises application metadata of the application programs, the application metadata comprising one or more of: the name of the application program, the type of the application, the version of the application, the program libraries used by the application, the number of end-users of the application program, the programming language of the application program, the deployment-type of the application program, the operating system required by the application program; the method further comprising: performing, by the scoring system, a cluster analysis for assigning the multiple application programs to different clusters, the application programs in the same cluster being similar in respect to their application metadata and/or belonging to companies being similar in respect to the customer metadata; comparing the score of the one application program selectively with the score obtained for the ones of the application programs being in the same cluster; and outputting a result of the cluster-specific comparison.
11. The method according to claim 1, wherein the data repository comprises application-metadata of the application programs, the application metadata comprising one or more of: the name of the application program, the type of the application, the version of the application, the program libraries used by the application, properties of the IT-environment of the application program, the number of end-users of the application program, the programming language of the application program, the deployment-type of the application program, the operating system required by the application program; wherein the scoring computer system comprises a trained machine-learning model, the trained machine learning model having learned to correlate application metadata with respectively computed scores being indicative of end-user experience with the respective software application; wherein the method further comprises using, by the scoring system, the trained machine learning model for predicting one or more software application modifications which will improve the end-user experience; and outputting the predicted improvement action.
12. The method according to claim 11, wherein the predicted improvement action is selected from a group comprising: the adding, replacement or removal of a software library; the use of a different type or version of a DBMS used by the application program for storing or reading data; the use of a different hardware component, in particular network interfaces, device drivers and/or data storage devices; the use of a different webserver or application server for deploying and/or distributing the software application; the re-programming of the software application, in particular the optimization of specific source code sections.
13. The method of claim 1, wherein the score is a combination of a set of sub-scores, each of the sub-scores belonging to one out of 2-6, preferably 4 user-experience-categories, the method further comprising creating the graphical representation of the score in the form of a pie-chart (400), wherein each of the user-experience-categories is represented as a pie-chart-segment (402, 404, 406, 408) with a unique color, each segment comprising a plurality of sub-segments having the same color as the segment comprising the sub-segment, the radius of each sub-segment being indicative of a respective one of the sub-scores.
14. The method according to claim 1, the method comprising: in response to receiving, from any one of the multiple end-user-computers, survey data, automatically sending the received survey data from the survey computer system to the scoring computer system; in response to receiving the survey data, re-computing, by the scoring computer system, the score as a function of the received survey data; and outputting a graphical representation of the re-computed score by the scoring computer system.
15. The method according to claim 1, further comprising authenticating, by the scoring computer system, at the survey computer system, wherein the survey computer system is configured to process survey requests only in case the request is received from an authenticated scoring computer system.
16. The method according to claim 1, further comprising authenticating, by the survey computer system, at the scoring computer system, wherein the scoring computer system is configured to receive and process survey data only in case the survey data is received from an authenticated survey computer system.
17. The method of claim 1, wherein at least some of the templates are customized specifically to the one of the application programs to which they are assigned, the method further comprising: providing, by the scoring computer system, an interface enabling an admin-user of the customers to create, modify and/or delete the ones of the templates being assigned to application programs owned by the respective customer; receiving, by the scoring computer system, a newly created or modified survey template or a survey template deletion command from one of the admin-users via the interface; sending the newly created or modified survey template or the template deletion command from the scoring computer system to the survey computer system; and updating, by the survey computer system, the one or more survey templates in accordance with the received newly created or modified survey template or the template deletion command.
18. A computer system comprising: a scoring computer system operatively coupled to a data repository comprising registration data of multiple customers and their software applications; a survey computer system comprising one or more survey templates, wherein the data repository is protected against access by the survey computer system, wherein the survey computer system is configured to: receive a survey request from the scoring computer system, the survey request comprising an application-ID of one of the registered software applications; provide, by the survey computer system, multiple instances of the one of the survey templates to a plurality of end-user-computers; receive, by the survey computer system, survey data from the plurality of end-user-computers via a network connection, the survey data being indicative of the user experience of end-users in respect to the one application program whose application-ID is comprised in the survey request, the survey data being provided in a structure defined by the template; and send the survey data from the survey computer system to the scoring computer system, whereby the sent survey data is free of data allowing identification of individual end-users or end-user-computers; wherein the scoring computer system is configured to: compute a score for the one application program as a function of the survey data received for at least the one application program, the score being indicative of the aggregated user experience of the end-users in respect to the one application program; and output a graphical representation of the computed score.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0055] The present disclosure provides devices, systems and methods for computing a score for an application program. The embodiments and examples described herein are to be understood as illustrative examples of the invention. Further embodiments of the invention are envisaged. Although the invention has been described by way of example with reference to a specific combination and distribution of software programs and computer systems, it is to be understood that any feature described in relation to any one embodiment may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments, as long as these features are not mutually exclusive.
[0056] Accordingly, some embodiments of the present application are directed to a computer program product. Other embodiments of the present application include a corresponding computer-implemented method and software programs to perform any of the method embodiment steps and operations summarized above and disclosed in detail below.
[0057] Any software program described herein can be implemented as a single software application or as a distributed multi-module software application. The software program or programs described herein may be carried by one or more carriers. A carrier may be a signal, a communications channel, a non-transitory medium, or a computer readable medium amongst other examples. A computer readable medium may be a tape; a disc for example a CD or DVD; a hard disc; an electronic memory; or any other suitable data storage medium. The electronic memory may be a ROM, a RAM, Flash memory or any other suitable electronic memory device whether volatile or non-volatile.
[0058] Each of the different features, techniques, configurations, etc. discussed herein can be executed independently or in combination and via a single software process or in a combination of processes, such as in a client/server configuration.
[0059] It is to be understood that the computer system and/or the computer-implemented method embodiments described herein can be implemented strictly as a software program or application, as software and hardware, or as hardware alone such as within a processor, within an operating system or within a software application.
[0060] The operations of the flow diagrams are described with references to the systems/apparatus shown in the block diagrams. However, it should be understood that the operations of the flow diagrams could be performed by embodiments of systems and apparatus other than those discussed with reference to the block diagrams, and embodiments discussed with reference to the systems/apparatus could perform operations different than those discussed with reference to the flow diagrams.
[0061] A “key store” as used herein is a data container configured to store cryptographic keys such that the use and/or access to any one of the keys stored therein is strictly controlled. Once keys are in the keystore, they can be used for cryptographic operations with the key material completely or partially remaining non-exportable. According to some embodiments, the key store offers facilities to restrict when and how keys can be used, such as requiring user authentication for key use or restricting keys to be used only in certain cryptographic modes. In particular, the keys are protected from unauthorized use. For example, a key store can mitigate unauthorized use of key material by allowing the scoring software to use the key of a particular customer for decrypting sensitive customer-specific data only within a session between the scoring computer system and an authenticated admin-user of this company and preferably after this admin-user has explicitly agreed to the decryption of the sensitive data. According to preferred embodiments, the keys, or at least the private keys, are never released by the key store. Rather, the encrypted data is entered into the key store and the key store uses the appropriate key to decrypt the encrypted data and to return the input data in decrypted form. This has the advantage that in case the scoring computer system is compromised, the attacker may be able to use the key store but cannot extract the key material to a different computer system. According to embodiments, the key store binds the keys stored therein to a secure hardware, e.g., a hardware security module (HSM). When this feature is enabled for a key, its key material is never exposed outside of secure hardware. An HSM typically comprises its own CPU, a secure storage and often also a true random-number generator. Often, an HSM comprises additional mechanisms to resist package tampering and unauthorized sideloading of apps.
According to some embodiments, one or more of the following algorithms and key sizes are used by the key store for creating and using cryptographic keys: RSA 2048, AES 128 and 256, ECDSA P-256, HMAC-SHA256 and/or Triple DES 168.
[0062] According to some embodiments, the key store lets the admin-user of a customer and/or the admin of the scoring computer system specify authorized uses of a customer-specific key when generating or importing the key. Once a key is generated or imported, its authorizations cannot be changed. Authorizations are then enforced by the key store whenever the key is used.
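The key-store behaviour described in paragraphs [0061] and [0062] can be sketched as follows. This is an illustrative Python sketch, not part of the claimed subject matter: the class and method names are assumptions, and HMAC-SHA256 (one of the algorithms listed above) stands in for the cryptographic operation performed inside the store. Key material never leaves the store; callers submit data and receive only the cryptographic result, authorizations are fixed at key generation, and access ends automatically on log-out.

```python
import hashlib
import hmac
import secrets

class KeyStore:
    """Illustrative key store: keys are non-exportable; operations are
    performed inside the store and gated by session and authorization."""

    def __init__(self):
        self._keys = {}          # customer_id -> key bytes (never returned)
        self._authorized = {}    # customer_id -> frozen set of allowed uses
        self._sessions = set()   # customer_ids with an authenticated admin

    def generate_key(self, customer_id, authorized_uses):
        # Authorizations are specified at generation time and cannot be
        # changed afterwards; they are enforced whenever the key is used.
        self._keys[customer_id] = secrets.token_bytes(32)
        self._authorized[customer_id] = frozenset(authorized_uses)

    def login(self, customer_id):
        self._sessions.add(customer_id)

    def logout(self, customer_id):
        # Key access is denied automatically upon a log-out event.
        self._sessions.discard(customer_id)

    def mac(self, customer_id, data: bytes) -> bytes:
        # The operation runs only inside the store, only for an
        # authenticated admin session with the matching authorization.
        if customer_id not in self._sessions:
            raise PermissionError("admin-user not authenticated")
        if "mac" not in self._authorized.get(customer_id, ()):
            raise PermissionError("operation not authorized for this key")
        return hmac.new(self._keys[customer_id], data, hashlib.sha256).digest()
```

Because only the digest leaves the store, a compromised scoring computer system could use the store but could not carry the key material to a different system.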
[0063] A “survey request” as used herein is a request to collect information from a population of end-users, in particular information about the experience the end-users had when using and interacting with a particular application program.
[0064] An “admin computer” as used herein is a computer system assigned to a user referred to herein as “admin” or “admin user”, whereby an admin user represents a customer having registered at the scoring computer system. Typically, the customer (and the admin user representing the customer) has obtained the right to initiate the creation of survey requests and/or to receive the computed user experience score of the application programs owned by the said customer. According to some embodiments, the scoring computer system may perform these steps only after a successful authentication of the admin user at the scoring computer system. The admin computer system can be, for example, a standard computer system, e.g., a desktop computer system or a portable computer system such as a notebook or tablet computer or smartphone.
[0065] An “end-user computer” as used herein is a computer system assigned to a user referred to herein as “end-user”. The end-user is a user whose feedback in respect to the usability of a software application is to be obtained and analyzed in order to compute the user experience score. Typically, an end-user does not have permission to trigger the creation of a survey request or the computation of a user experience score. The end-user computer system can be, for example, a standard computer system, e.g., a desktop computer system or a portable computer system such as a notebook or tablet computer or smartphone.
[0066] An “application program” as used herein is a program or group of programs designed for end-users. Examples of an application include a program for controlling a manufacturing process, a program for simulating a technical process, e.g., fluid dynamics, a word processor, a spreadsheet, an accounting application, an email client, a media player, a file viewer, or a photo editor.
[0067] A “customer” as used herein is a digital representation of a natural or legal person, in particular a company.
[0068] A “data repository” as used herein is a logical data store which may be based on one or more physical data stores and which is used for storing a particular type of data, e.g., registration data. The data repository can be a file directory, a single file, a database operated by a database management system (DBMS) or the like.
[0069] The term “registration data” as used herein refers to data provided by a customer during the customers' registration at the scoring computer system. The registration data can comprise data being indicative of the identity of the customer, e.g., name and address, and may comprise customer-metadata and/or application metadata of one or more applications owned by the customer. The metadata may be provided during or after completion of the registration process.
[0070] A “survey computer system” as used herein is a computer system, in particular a server computer system. The survey computer system is configured to provide template instances to a plurality of end-user devices and to receive survey data from the end-user devices.
[0071] A “scoring computer system” as used herein is a computer system, in particular a server computer system. The scoring computer system is configured to receive survey data from the survey computer system and to compute a user experience score for a survey.
[0072] A “template” as used herein is a standardized file type, typically a non-executable file type, used by computer software as a pre-formatted example on which to base other files, especially surveys. Typically, the template instances and the survey data are exchanged via a network, in particular the Internet.
[0073] The expression “computer system” as used herein is a machine or a set of machines that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called “programs”, “software programs”, “applications” or “software applications”. These programs enable computers to perform a wide range of tasks. According to some embodiments, a computer system includes hardware (in particular, one or more CPUs and memory), an operating system (main software), and additional software programs and/or peripheral equipment. The computer system can also be a group of computers that are connected and work together, in particular a computer network or computer cluster, e.g., a cloud computer system. Hence, a “computer system” as used herein can refer to a monolithic, standard computer system, e.g., a single server computer, or a network of computers, e.g., a cloud computer system. In other words, one or more computerized devices, computer systems, controllers or processors can be programmed and/or configured to operate as explained herein to carry out different embodiments of the invention.
[0074] Referring now to
[0075] In a first step 102, a scoring computer system 214 is provided. The scoring computer system can be a monolithic computer system, e.g., a single server, or can be a distributed computer system, e.g., a cloud computer system and/or a virtualized computer system. The scoring computer system is operatively coupled to a data repository 220. The data repository comprises registration data of multiple customers having registered at the scoring computer system. Furthermore, the data repository comprises registration data and/or metadata of software applications owned by the registered customers. Each customer may own one or more application programs. The data repository 220 can be, for example, a DBMS hosted by a database server which is linked to the scoring computer system 214 via a network connection, e.g., an intranet or Internet connection. According to other embodiments, the data repository 220 can be an integral part of the scoring computer system, e.g., a DBMS instantiated on the same computer system as an application program 208, 218 used for performing a cluster analysis of customers and/or application programs and for computing the user experience score.
[0076] Next in step 104, the method comprises providing a survey computer system 236. The survey computer system comprises one or more survey templates 302, 304. For example, each template can be a file, e.g., an XML file or a JSON file, or a database record or any other type of data structure. According to a preferred embodiment, each template has a format which allows editing by a user (e.g., XML or JSON, etc.). The data repository 220 is protected against access by the survey computer system. This means that sensitive customer-related information such as the customer's name and customer metadata such as company size, number of employees, number and type of software applications owned by the customer, user base and the like are not disclosed to the survey computer system.
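By way of illustration, a survey template of the kind described above could be a JSON structure such as the following Python sketch. All field names and question texts are illustrative assumptions, not taken from the disclosure; the point is that every instance shares the structure defined by the template, which keeps survey data from different application programs comparable.

```python
import json

# Illustrative, user-editable survey template (field names are assumptions).
SURVEY_TEMPLATE = {
    "template_id": "ux-standard-v1",
    "questions": [
        {"id": "q1", "text": "How easy was the application to use?",
         "scale": [1, 5]},
        {"id": "q2", "text": "How quickly did the application respond?",
         "scale": [1, 5]},
    ],
}

def instantiate(template: dict, application_id: str, request_id: str) -> str:
    """Create one template instance for a survey request as JSON text.

    The instance carries the application-ID and request-ID from the survey
    request but leaves the shared question structure untouched.
    """
    instance = dict(template,
                    application_id=application_id,
                    request_id=request_id)
    return json.dumps(instance)
```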
[0077] Next in step 106, the survey computer system receives a survey request 310 from the scoring computer system 214. The survey request comprises the application-ID of the one of the registered software applications for which a user experience score is to be obtained and computed.
[0078] In response to receiving the request, the survey computer system in step 108 creates multiple instances 306, 308 of the one of the survey templates which is assigned to the one of the application programs whose application-ID is comprised in the survey request and provides these instances to a plurality of end-user-computers 250, 252, 254. For example, the survey computer system can be configured to analyze the received survey request and to extract the application-ID and a request-ID comprised in the survey request. The survey computer system then identifies a template which is to be instantiated. For example, according to some embodiments, the survey computer system may comprise a single template which is used for generating survey template instances for all the application programs registered at the scoring computer system. According to other embodiments, the survey computer system may comprise different templates for different types of applications or may in some cases even comprise templates which are particular to a particular application program and/or which have been customized for a particular customer or application program. However, according to preferred embodiments, the same survey template is used for all application programs to ensure that the survey data received for the different application programs is comparable.
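The template-selection step 108 may be sketched as follows; the function name, the dictionary layout and the default template identifier are illustrative assumptions. An application-specific template is used when one exists, otherwise the shared default template is selected, so that results remain comparable:

```python
def select_template(templates: dict, application_id: str,
                    default_id: str = "ux-standard-v1") -> dict:
    """Pick the template assigned to the application-ID from the survey
    request, falling back to the shared default template otherwise."""
    return templates.get(application_id, templates[default_id])
```

Usage under these assumptions: with `templates = {"ux-standard-v1": default, "app-7": custom}`, a request for "app-7" yields the customized template and a request for any other application-ID yields the default.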
[0079] Typically, the survey computer system does not distribute the template instances directly to the end-users, because the survey computer system has no access to the data repository 220 or to any other data source being indicative of the name of the customer for whom the survey is to be conducted or being indicative of the identity of the end-users or the addresses of the end-user devices. For example, the survey computer system may be configured to create, in response to receiving the request, a survey-specific URL which allows any client device which is in possession of this URL to instantiate a survey template via a call of the URL. The survey computer system sends this URL to the scoring computer system 214 and the scoring computer system provides the URL directly or via an admin user 268, 270 and respective admin-devices 202, 204 to the end-users 262, 264, 266 and end-user devices 250, 252, 254.
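The creation of a survey-specific URL described above may be sketched as follows; the path layout is an illustrative assumption. The random token makes the URL practically unguessable, and the URL itself carries no information about the customer, the end-users or the end-user devices:

```python
import uuid

def create_survey_url(base_url: str, request_id: str) -> str:
    """Create a URL unique to one survey request.

    Any client device in possession of this URL can instantiate the
    survey template by calling it; the URL contains no end-user identity.
    """
    token = uuid.uuid4().hex  # 32-character random token
    return f"{base_url}/survey/{request_id}/{token}"
```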
[0080] Next in step 110 the survey computer system 236 receives survey data 316 from the plurality of end-user computers via a network connection. For example, the survey template instances can be HTML forms which are downloaded via the Internet by the plurality of end-user devices. The end-users fill in survey data into the form and submit the survey data via the network back to the survey computer system. The survey data is indicative of the user experience of the end-users in respect to the one application program whose application ID is indicated in the survey request.
[0081] Next in step 112, the survey computer system sends the survey data to the scoring computer system 214. The sent survey data is free of data allowing identification of individual end-users or end-user computers, such as end-user names or end-user-computer IP-addresses. Hence, any end-user can be sure that his or her feedback data regarding usability of the software application of interest cannot be traced back. This may ensure that the end-user provides honest feedback and does not try to please expectations of the owner or provider of the software application program of interest (for which the score is to be computed).
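The anonymization performed before step 112 may be sketched as follows; the field names are illustrative assumptions. Sensitive fields are stripped from each survey record before it leaves the survey computer system:

```python
# Illustrative set of identifying fields (assumed names, not from the claims).
SENSITIVE_FIELDS = {"ip_address", "user_id", "user_name"}

def anonymize(survey_record: dict) -> dict:
    """Return a copy of a survey record with all fields removed that could
    identify an individual end-user or end-user computer."""
    return {key: value for key, value in survey_record.items()
            if key not in SENSITIVE_FIELDS}
```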
[0082] Next in step 114, the scoring computer system computes a score 242 for the one application program whose usability score is to be computed. The score is computed as a function of the survey data received for at least the one application program of interest. According to preferred embodiments, the score is computed as a function of the survey data received for a plurality of different application programs owned by different customers. Preferably, only survey data of application programs which are similar to the application program of interest and/or survey data of application programs owned by customers which are similar to the customer owning the application program of interest are used for computing the usability score 242. For example, the application program of interest can be an application program for in-house organization and documentation of vacation days for a plurality of employees. In this case, according to one embodiment, only survey data obtained for application programs for organizing and/or documenting vacation days are used as input for computing the score value. According to some embodiments, the set of survey data used as input for the score computation is further limited to survey data received for vacation planning programs used by customers which are similar to the customer owning the application program of interest, e.g., in respect to the number of employees, in respect to the technical field of operation, etc.
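The score computation of step 114, restricted to similar application programs, may be sketched as follows. Representing survey data as (application-ID, rating) pairs and scoring as the difference to the peer mean are illustrative assumptions, not the claimed computation; they merely show how a score can become indicative of user experience relative to similar applications:

```python
from statistics import mean

def compute_score(survey_data, app_id, similar_app_ids):
    """Compute a relative usability score for one application program.

    survey_data: list of (application_id, rating) pairs.
    A positive result indicates better-than-peer user experience; with no
    peer data the own mean rating is returned as an absolute score.
    """
    own = [rating for app, rating in survey_data if app == app_id]
    peers = [rating for app, rating in survey_data if app in similar_app_ids]
    own_mean = mean(own)
    return own_mean - mean(peers) if peers else own_mean
```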
[0083] The computed score is indicative of the aggregated user experience of the end-users in respect to the one application program of interest. In case aggregated survey data of other applications is taken into account, the computed score is also indicative of the user experience relative to the user experience obtained for other application programs of other customers, in particular of similar and/or comparable application programs and customers.
[0084] Next in step 116, the scoring computer system outputs a graphical representation of the computed score. For example, the output can be provided in the form of a printout or in the form of a graphical user interface element which is displayed via a display of the scoring computer system or via a display of a client computer system receiving the graphical representation of the score via a network. For example, the graphical representation of the score can be displayed in a browser of an admin computer 202, 204, 206 of an admin-user 268, 270 and can be provided by the scoring computer system via a network connection. An example of a graphical representation of a usability score is presented in
[0085]
[0086] According to some embodiments, the system further comprises a data repository 220 operatively coupled to the scoring computer system. According to some embodiments, the system further comprises one or more admin computers 202, 204, 206 and/or one or more end-user computers 250, 252 and 254.
[0087] The scoring computer system comprises survey data aggregation functions 245 configured to aggregate the totality of survey data received from all end-users in respect to a particular survey request at a given time when the score is to be computed. For example, the aggregation can comprise computing the mean, min, max and/or median of a question-specific score based on the totality of survey data received from a plurality of end-users for a particular survey request. According to some embodiments, the aggregation of survey data obtained for a particular survey can be repeated automatically, e.g., after a predefined time period has lapsed (e.g., a minute, an hour, a day, etc.) or in response to receiving survey data for a survey request from any single end-user via the survey computer system.
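A minimal sketch of the per-question aggregation performed by functions 245; the response structure is an assumption for illustration:

```python
from statistics import mean, median

# Per-question answers (scale 1-5) received from several end-users for
# one survey request; the dict layout is illustrative.
responses = {
    "q1": [4, 5, 3, 4],
    "q2": [2, 3, 3, 5],
}

def aggregate(per_question_answers):
    """Compute mean, min, max and median per question, as described
    for the survey data aggregation functions."""
    return {
        q: {"mean": mean(v), "min": min(v), "max": max(v), "median": median(v)}
        for q, v in per_question_answers.items()
    }

agg = aggregate(responses)
print(agg["q1"], agg["q2"])
```

Re-running `aggregate` after each newly received response, or on a timer, corresponds to the automatic re-aggregation mentioned above.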
[0088] The survey data aggregation functions 245 can be an application program hosted by the scoring computer system 214 and/or can be a submodule of the scoring application program 208.
[0089] The scoring computer system comprises an application program or program module 208 configured for computing a usability score 242, also referred to as “UX-score”, from aggregated survey data 244 of one or more application programs registered at the scoring computer system.
[0090] According to some embodiments, the score 242 for a particular application program is computed based on the user experience data obtained for similar application programs (not from the usability data of all application programs registered at the scoring computer system). For example, the scoring computer system can comprise a software program or software module 218 for performing a cluster analysis of application metadata having been aggregated for a plurality of application programs 244 owned by registered customers. The clustering is performed for identifying groups (clusters) of similar application programs 243. Only the aggregated survey data of application programs having been identified to be similar to the application program of interest are used as input for computing the usability score of the application program of interest. According to embodiments, the clustering analysis performed by module 218 is performed based on application metadata and/or customer metadata 246. The metadata allows identifying, for a particular application program of interest which is owned by a particular customer, a plurality of application programs which are similar to the application program of interest and/or which are owned by a customer which is similar to the said particular customer. Only the aggregated survey data 244 of the application programs identified by the cluster analysis module 218 are provided to the scoring application 208 and used as input for computing the score.
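The grouping performed by the cluster analysis module 218 can be sketched with a trivial exact-match clustering over anonymous metadata; a real system might instead use k-means or hierarchical clustering on numeric metadata. All identifiers and metadata fields below are illustrative:

```python
# Anonymous application metadata only; no customer-identifying fields,
# matching the requirement stated for module 218.
apps = {
    "T1-AP1": {"purpose": "vacation", "employees_bucket": "100-500"},
    "T2-AP3": {"purpose": "vacation", "employees_bucket": "100-500"},
    "T2-AP4": {"purpose": "expenses", "employees_bucket": "100-500"},
    "T3-AP5": {"purpose": "vacation", "employees_bucket": ">5000"},
}

def similar_apps(app_id):
    """Return the cluster of applications sharing all metadata values
    with the application of interest; only their aggregated survey
    data would then be fed into the scoring application 208."""
    target = apps[app_id]
    return sorted(a for a, meta in apps.items() if meta == target)

cluster = similar_apps("T1-AP1")
print(cluster)  # cluster contains T1-AP1 and T2-AP3
```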
[0091] According to preferred embodiments, the customer metadata and/or application metadata used by the clustering module 218 is free of customer-identifying information. This may allow identifying similar application programs without risking disclosure of sensitive customer-related data such as company size in association with the company's name or address.
[0092] According to some embodiments, the scoring computer system can comprise a module or application program 238 for creating and/or modifying one or more survey templates 302, 304. The survey templates are then provided to the survey computer system 236 and stored in a survey template repository 239.
[0093] According to further embodiments, the scoring computer system comprises or is operatively coupled to a key store unit 212. The key store unit is a software and/or hardware unit configured to store a plurality of customer-specific cryptographic keys which have been created for each customer having registered at the scoring computer system individually.
[0094] For example, for each of the registered customers, a symmetric cryptographic key can be stored which acts as encryption and decryption key for customer-specific data, in particular sensitive customer-specific data which would allow determining the identity of the customer (such as the name and/or address of the customer). According to other embodiments, for each of the registered customers, an asymmetric cryptographic key pair with a public encryption key and a private decryption key is created upon registration of the customer, whereby at least the private decryption key is stored in the key store unit 212 such that only admin-users of the customer owning the key(s) are allowed to access and use the stored key(s). The key store unit is configured to provide the keys of a particular customer to the scoring application 208 and to any other component of the scoring computer system 214 only in case an admin user of the said customer has successfully authenticated at the key store unit, has requested an action which requires the customer's encryption or decryption key and only if the customer has not yet logged out. This may have the advantage that it is ensured that the admin users of the different customers having registered at the scoring computer system are not able to see and/or manipulate the names of the other companies having registered at the scoring computer system. Even the technical admins of the scoring computer system 214 are not able to decrypt the customer-related encrypted data stored in the data repository 220 and hence do not know the identity of the registered customers. The technical admins only have access to the unencrypted metadata which may comprise a customer-ID, e.g., a number or random character string, but is free of any data which identifies the customer.
[0095] Each time sensitive, encrypted information is read from the customer-specific database 222, 224, 226, the cryptographic decryption key for the currently requesting customer is determined by the key store 212 to decrypt the encrypted parts of the information retrieved from the database, in particular the customer's name. Likewise, every time sensitive information is to be stored in the customer-specific database 222, 224, 226, in particular the customer's name and optionally further data which could reveal the identity of the customer, the cryptographic encryption key for the currently requesting customer is determined by the key store 212 to encrypt the sensitive parts of the information to be stored into the database, in particular the customer's name.
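A sketch of the session-gated key store behavior described in the two paragraphs above. The XOR cipher is a deliberately simple placeholder for illustration only; a real deployment would use an authenticated cipher such as AES-GCM, and all identifiers below are assumptions:

```python
import secrets

class KeyStore:
    """Sketch of key store unit 212: one symmetric key per registered
    customer, released only while an admin of that customer has a live
    authenticated session."""
    def __init__(self):
        self._keys = {}         # customer-ID -> symmetric key bytes
        self._sessions = set()  # customer-IDs with an authenticated admin

    def register_customer(self, customer_id):
        self._keys[customer_id] = secrets.token_bytes(32)

    def login(self, customer_id):
        self._sessions.add(customer_id)

    def logout(self, customer_id):
        self._sessions.discard(customer_id)

    def key_for(self, customer_id):
        # Key is refused after logout, as described for unit 212.
        if customer_id not in self._sessions:
            raise PermissionError("no authenticated admin session")
        return self._keys[customer_id]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Placeholder symmetric cipher: same function encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ks = KeyStore()
ks.register_customer("tenant-1")
ks.login("tenant-1")
# Encrypt the sensitive field (the customer's name) before storage:
token = xor_cipher(b"ACME Corp", ks.key_for("tenant-1"))
print(xor_cipher(token, ks.key_for("tenant-1")).decode())  # ACME Corp
ks.logout("tenant-1")
```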
[0096] According to some embodiments, the key store is implemented based on an HSM (hardware security module). According to other embodiments, the key store is implemented as a service, e.g., the SAP Cloud Platform Credential Store service.
[0097] The data repository 220 comprising customer-related and application-program-related data can be implemented, for example, as a database management system (DBMS) which can be hosted on the scoring computer system 214 or on a database server 216 operatively coupled to the scoring computer system. According to preferred embodiments, the data repository 220 comprises multiple customer-specific sub-repositories which are isolated from each other. For example, the DBMS 220 can comprise a plurality of different databases 222, 224, 226, whereby each of the databases is assigned to exactly one of the registered customers and selectively comprises data of this particular customer and of the one or more applications owned by this customer. For example, database 222 selectively comprises data for the customer “tenant 1” represented by admin user 268, database 224 selectively comprises data for the customer “tenant 2” represented by admin user 270, and database 226 selectively comprises data for the customer “tenant 3”.
[0098] Each database comprises metadata 246, whereby the metadata comprises customer metadata and application metadata of the one or more applications owned by this customer. The customer metadata comprises sensitive customer-specific information such as information allowing identification of the customer, e.g., the name or address of the customer. Typically, the customer-identifying information is stored in the database in encrypted form, whereby preferably a customer-specific encryption key was used for the encryption. The customer metadata may comprise one or more property values being indicative of a respective property of the customer. These properties can be, for example, the technical field in which the customer operates, number of employees, company size, annual profit, annual turnover, etc. These property values are preferably stored in non-encrypted form in the customer-specific database. As the customer-specific databases are isolated from each other, no customer can see any data of another customer. The technical admin of the scoring computer system and also the score computing and clustering application programs 208 and 218 can access the non-encrypted metadata, but they do not know to which customer this metadata belongs, because customer-identifying information is stored in the database only in encrypted form.
[0099] In addition, each customer-specific database comprises, for each of one or more application programs owned by the customer, anonymized survey data which was received from the survey computer system and which is free of any information being indicative of the identity of the end-users having submitted the survey data. According to preferred embodiments, also aggregated survey data 244 having been computed by module 245 for each of the one or more application programs owned by the customer is stored in the customer-specific database.
[0100] For example, database 222 comprises metadata 246.1 related to “tenant 1” (T1) and the application program(s) T1-AP1, T1-AP2 owned by “tenant 1”. In addition, it comprises survey data 227 in respect to the application program T1-AP1 obtained in a first survey request, and comprises survey data 228 in respect to the application program T1-AP2 obtained in a further survey request.
[0101] Database 224 comprises metadata 246.2 related to “tenant 2” and the application program(s) T2-AP3, T2-AP4 owned by “tenant 2” (T2). In addition, it comprises survey data 229 in respect to an application program T2-AP3 obtained in one survey request for program T2-AP3, and comprises survey data 230 in respect to the application program T2-AP4 obtained in a further survey request.
[0102] Database 226 comprises metadata 246.3 related to “tenant 3” (T3) and the application program(s) T3-AP5 owned by “tenant 3”. In addition, it comprises survey data 231 in respect to an application program T3-AP5 obtained in a survey request for program T3-AP5.
[0103] According to embodiments, the scoring computer system is configured to store customer-related data in the data repository 220 using Client-Side Field Level Encryption with a customer-specific encryption key. This means that customer-related data is stored in different fields such as “customer-name”, “customer-ID”, “customer-metadata”, “application-name”, “application-ID”, “application metadata”, “application-version”, etc. Only the content of the field “customer-name” is encrypted as it comprises customer-identifying information, while the content of the other fields is stored in the data repository 220 in cleartext (unencrypted) form.
[0104] Survey data 228, 230 and 231 of different application programs deemed to be “similar” by the cluster analysis program 218 are indicated by identical hatchings.
[0105] According to some embodiments, the scoring computer system comprises a caching subsystem 213 which is configured to receive and temporarily store survey data in case the data repository 220 is not available, e.g., because the database server 216 is down. The caching subsystem repeatedly checks whether the data repository 220 is available again and, in this case, automatically stores the cached survey data in the sub-repository specifically assigned to the customer owning the application program whose survey data has been received and cached. This may ensure that no feedback data is lost in case the database server 216 is off-line or out of service.
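The buffer-and-flush behavior of the caching subsystem 213 can be sketched as follows; class and attribute names are illustrative assumptions:

```python
from collections import deque

class Repo:
    """Stand-in for the data repository / database server."""
    def __init__(self):
        self.available = False
        self.rows = []
    def save(self, customer_id, data):
        self.rows.append((customer_id, data))

class SurveyCache:
    """Sketch of caching subsystem 213: buffer survey data while the
    repository is down, flush once it is reachable again."""
    def __init__(self, repository):
        self.repository = repository
        self.pending = deque()

    def store(self, customer_id, survey_data):
        if self.repository.available:
            self.repository.save(customer_id, survey_data)
        else:
            self.pending.append((customer_id, survey_data))

    def flush(self):
        # Invoked by the periodic availability check; drains the cache.
        while self.pending and self.repository.available:
            self.repository.save(*self.pending.popleft())

repo = Repo()
cache = SurveyCache(repo)
cache.store("tenant-1", {"q1": 4})  # repository down: data is cached
repo.available = True
cache.flush()                       # cached data lands in the repository
print(len(repo.rows))
```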
[0106] The survey computer system 236 can comprise a module 241 for creating a survey-ID, for creating a survey-specific URL in response to receiving a survey request from the scoring computer system, for instantiating a survey template, for distributing the template instance to multiple end-user computers and/or for collecting the survey data obtained from the end-user computers 250, 252, 254. For example, the module 241 can be configured to create, in response to receiving a survey request for a particular application program, a QR code comprising a URL being unique to this survey. The QR code is returned to the scoring computer system to enable the scoring computer system to provide the QR code—typically via the admin user of the customer having requested the survey—to a plurality of end users. The module 241 can comprise a web server which is configured to create, upon receiving a call to the above-mentioned survey-request-specific URL, an HTML page with a multi-page form, wherein each page comprises a plurality of questions regarding the usability of a particular application program. The web server provides the HTML page with the multi-page form via a network connection to the browser of the end user computer having called the survey-request-specific URL.
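The survey-ID and survey-specific URL creation performed by module 241 might look as follows; the host name and URL layout are illustrative assumptions:

```python
import uuid

def create_survey(application_id):
    """Sketch of module 241: mint a unique survey-ID and a
    survey-request-specific URL under which end-user browsers can
    retrieve the instantiated survey template."""
    survey_id = uuid.uuid4().hex
    url = f"https://survey.example.com/s/{survey_id}?app={application_id}"
    return survey_id, url

sid, url = create_survey("T1-AP1")
print(sid in url, url.endswith("?app=T1-AP1"))
```

The returned URL could then be encoded in a QR code and handed to the scoring computer system for distribution, as the paragraph above describes.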
[0107] The web form typically comprises 20-60 questions. According to some embodiments, the web form allows the end-user to enter a question-specific score value, e.g., one out of a predefined set of different numerical values. Preferably, the end-user is enabled to select, for each of the questions of the form, a question-specific score within a range covering 3-10, preferably 5, different score values such as “0”, “1”, “2”, “3”, “4”. For example, the user may enter the question-specific score by selecting one item from a radio button group, or by selecting a check-box in a group of check-boxes allowing only a single box to be selected. The form may enable the end-user to navigate between the different pages. The form preferably comprises a survey-request-ID which is returned, together with the survey data entered by the end-user, once the end-user submits the filled-out form. The survey-request-ID enables the survey computer system to identify the survey request to which the received survey data needs to be assigned. The survey computer system sends the survey data received from each of the end-user computers in association with the survey request ID to the scoring computer system.
[0108] According to preferred embodiments, the survey computer system is configured to anonymize the survey data before the data is sent to the scoring computer system. This means that any information being indicative of the identity of the end-user or end-user computer having provided the survey data (such as the IP address, place and/or time of survey data submission, etc.) is removed from the survey data.
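A minimal sketch of that anonymization step; the exact set of fields treated as identifying is an assumption for illustration:

```python
# Fields that could identify the end-user or end-user computer.
IDENTIFYING_FIELDS = {"ip_address", "user_name", "submitted_at", "location"}

def anonymize(raw_response: dict) -> dict:
    """Strip identifying information before the survey data is
    forwarded to the scoring computer system."""
    return {k: v for k, v in raw_response.items()
            if k not in IDENTIFYING_FIELDS}

raw = {"survey_id": "abc123", "q1": 4, "q2": 5,
       "ip_address": "192.0.2.7", "submitted_at": "2022-10-27T09:00"}
clean = anonymize(raw)
print(sorted(clean))  # only survey_id and the answers remain
```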
[0109] Preferably, the survey computer system does not store the survey data permanently or is configured to delete the survey data after a predefined time, e.g., some hours, days or weeks, after having sent the survey data to the scoring computer system.
[0110]
[0111] The scoring computer system 214 can comprise an interface which enables an admin user 268 of a registered customer referred to herein as “tenant 1” (T1) to submit a request 301 indicating that a survey should be started in order to compute a score for the user experience of a plurality of end-users with a particular application program, e.g., application program T1-AP1. The application program T1-AP1 may not be the only application program owned by the customer T1, so the request 301 may comprise the name or an identifier of the application program of interest. The application program name can be the official name of the application program and the application-ID can be a numerical value or a random character string created by the scoring computer system upon registration of the application program T1-AP1 for the customer T1.
[0112] In response to receiving the request 301, the scoring computer system identifies at least the application-ID of the application of interest. Optionally, further parameters are identified which can be of relevance in the context of the new survey request, e.g., the customer-ID, the version number of the application program, etc. Preferably, the following steps are performed by the scoring computer system 214 only in case the admin-user 268 has successfully authenticated at the scoring computer system before or while submitting the request 301.
[0113] The scoring computer system creates a “create new survey request” 310 comprising at least the application-ID of the application program whose user experience score is to be computed. In addition, the request can optionally comprise the name of the application program, the version number, the customer-ID (referred to as “tenant-ID”), a review number to indicate the number of times the customer has requested a survey of this particular application program, etc. According to preferred embodiments, the “create new survey request” 310 is free of customer-identification data, so the survey computer system does not know for which customer/company the survey is to be conducted. This greatly increases security, because in case the request 310 should be disclosed to an unauthorized party, no sensitive information would be revealed. In addition, according to preferred embodiments, the data exchange between the scoring computer system and the survey computer system 236 is performed via a cryptographically secured communication channel, as indicated by the key symbols in
[0114] In response to receiving the request 310, the survey computer system creates a survey request ID. The survey request ID is unique for the requested survey. The survey computer system is configured to store the request 310, or at least the application-program-ID comprised in the request 310, in association with the request-ID. In addition, the survey computer system creates a URL 317 which is unique for this survey request. The URL comprises an address via which a plurality of end-user computers can obtain a survey template instance with data input means for providing survey data to the survey computer system. For example, the URL can be an http or https Internet address via which a web form can be opened in a web browser.
[0115] The survey computer system 236 is configured to send the URL 317 together with the request-ID to the scoring computer system 214.
[0116] According to some embodiments, the survey computer system or the scoring computer system is configured to encode the URL in a graphical code, in particular a barcode or a 2D matrix code, e.g., a QR code. According to other embodiments, the survey computer system only provides the URL to the scoring computer system and the graphical code is created by the scoring computer system or by the computer system of the admin-user having requested the survey.
[0117] Then, the scoring computer system 214 distributes the URL to a plurality of end-users who are supposed to provide the survey feedback data. Depending on the embodiment, different distribution methods can be implemented. According to one embodiment, the scoring computer system outputs the URL as a printout which is then provided to the address of the customer T1 having requested the survey by mail. Another option is to send the URL in electronic form, e.g., by email or by an application interface, to an admin computer 202 of the customer T1. The customer T1/the admin-user 268 then distributes the URL to the end-users who are supposed to provide the user experience data in respect to the software application of interest. For example, the admin-user 268 may post a notice with the URL on a bulletin board in a company building, such as a cafeteria or coffee room, and invite employees in the notice to participate in the survey. The admin user can also send an email with the URL to selected employees of the company to participate in the survey. The URL can be provided as a string or in the form of a 2D code.
[0118] In response to receiving a notification of the survey with the URL, a plurality of end-users 262, e.g., the employees of customer T1 having requested the survey, will access the URL 317 via their end-user devices 250 comprising a browser. The opening of the URL by the browser will trigger the survey computer system to provide an instance of a survey template to the end-user device via a network connection. For example, the survey template instance 306 can be an instance of a default survey template 302 used for acquiring user experience feedback data for a plurality of different application programs of different types.
[0119] According to embodiments, the URL comprises a parameter being indicative of the survey-ID and the survey data entered by the end-users also comprises the survey-ID. The instantiated template integrates the survey-ID such that the survey-ID is provided by the end-users together with the survey data to the survey computer system. The survey-ID which is received by the survey computer system together with the survey data 316 enables the survey computer system to identify the survey to which the survey data belongs.
[0120] According to some embodiments, the survey computer system comprises two or more different templates 302, 304 and the URL which is created in response to receiving the “create new survey request” 310 comprises a parameter which determines which one of the survey templates is to be instantiated and distributed to the end-user computers for collecting user experience survey data 316.
[0121] According to some embodiments, the survey computer system 236 immediately forwards, upon receiving survey data 316 from any one of the end-user devices, the survey data to the scoring computer system 214. Preferably, the survey data is anonymized before it is forwarded in the form of an anonymized survey response 314 to the scoring computer system and is free of any information revealing the identity of the end-user or end-user device having submitted the survey data. The survey response 314 comprises a survey-ID and may comprise one or more further optional parameters such as the tenant-ID, the application-ID, a version-ID of the application under review, the survey template version and the survey response data.
[0122] According to other embodiments, the survey computer system pools survey data obtained from two or more end-user computers for the same survey and forwards the pooled survey data in the form of batch-wise survey responses 314 to the scoring computer system. As the survey computer system is not permitted to access the data repository 220, the survey computer system cannot store the survey data into the data repository. Rather, the scoring computer system 214 is configured to receive the survey responses 314 and store the survey data in association with the survey-ID in the data repository. For performing the storing, the scoring computer system analyzes the survey-ID and other parameters comprised in the survey response in order to identify the customer for whom the survey was conducted and for identifying the one 222 of the customer-specific databases/data sub-repositories which is specifically assigned to this customer. The survey data is stored selectively in the identified customer-specific database/data sub-repository to ensure it cannot be accessed by any admin-user of other customers.
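The survey-ID-based routing into the isolated customer-specific sub-repositories might look as follows; the mapping tables and identifiers are illustrative assumptions:

```python
# Resolution tables held by the scoring computer system: which customer
# a survey was conducted for, and each customer's isolated sub-repository.
SURVEY_TO_TENANT = {"s-001": "tenant-1", "s-002": "tenant-2"}
TENANT_DB = {"tenant-1": [], "tenant-2": [], "tenant-3": []}

def store_response(response: dict):
    """The scoring computer system resolves the survey-ID to the owning
    customer and writes only into that customer's database, so no other
    customer's admin-user can access the data."""
    tenant = SURVEY_TO_TENANT[response["survey_id"]]
    TENANT_DB[tenant].append(response)

store_response({"survey_id": "s-001", "q1": 4})
print(len(TENANT_DB["tenant-1"]), len(TENANT_DB["tenant-2"]))
```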
[0123] According to some embodiments, the scoring computer system can send a stop survey request 312 comprising a survey-ID or other parameters allowing the identification of an ongoing survey to the survey computer system. For example, the sending of the stop survey request 312 can be triggered by the admin-user having initiated the survey to be stopped or can be triggered by the scoring computer system automatically upon determining that the predefined minimum number of end-user survey data has been received or upon determining that a predefined time period has lapsed.
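The two automatic stop triggers named above can be sketched as a simple predicate; the concrete thresholds are assumptions for illustration:

```python
from datetime import datetime, timedelta

def should_stop(responses_received, started_at, now,
                min_responses=30, max_duration=timedelta(days=14)):
    """Stop a survey once a predefined minimum number of end-user
    responses has been received, or once a predefined time period
    has lapsed (thresholds are illustrative)."""
    return (responses_received >= min_responses
            or now - started_at >= max_duration)

start = datetime(2022, 10, 1)
print(should_stop(12, start, datetime(2022, 10, 5)))  # neither trigger met
print(should_stop(30, start, datetime(2022, 10, 5)))  # response quota reached
```

When the predicate becomes true, the scoring computer system would send the stop survey request 312 with the corresponding survey-ID.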
[0124] The scoring computer system 214 comprises a score computation module 208 configured to compute a user experience score 242 based on the totality of survey data 314 received so far from one or more end-users having participated in the requested survey. The score can be computed after a survey was stopped or can be computed even for an ongoing survey and may be recomputed later upon receiving additional feedback data. This may have the advantage that it is not necessary to wait until a survey, which may take several days, several weeks or even a month, is completed. Rather, it is possible to obtain a preliminary usability score already during an ongoing survey.
[0125] Then, the scoring computer system computes a graphical representation of the computed score which preferably provides a qualitative, quantitative and reproducible score for the user experience provided by the software application of interest. An example of the graphical score representation is depicted in
[0126]
[0127] Each segment comprises a plurality (in this case: three) of sub-segments having the same color as the segment comprising this sub-segment. For example, the sub-segments 410 “consistency” and 412 “clear structure” have the same (here: green) color as the segment 402 comprising the sub-segments 410, 412.
[0128] The radius of each of the sub-segments correlates with and/or is indicative of a respective one of the sub-scores. For example, the sub-segment 410 representing the sub-score “consistency” has a sub-score value of 3.8 and the sub-segment representing the sub-score “intuitive handling” has a sub-score value of 4.1. As a consequence, the radius of sub-segment 410 “consistency” is smaller than the radius of the sub-segment “intuitive handling”.
[0129] According to some implementations, the survey template and each survey form created as an instance of the template comprises a predefined number of questions for each of the sub-scores. Every question provides a predefined set of options for answering the question, e.g., enables a user to select one out of a predefined number of numerical values such as “1”, “2”, “3”, “4”, and “5”.
[0130] For each of the sub-scores, e.g., for the sub-score of the sub-category m (e.g., “consistency” 410), and for a number n of questions per sub-category m, the respective sub-score value can be computed as the mean of the n question-specific answers: sub-score.sub.m=(answer.sub.m,1+answer.sub.m,2+ . . . +answer.sub.m,n)/n
[0131] In an example where the user has the option to select an integer value between 1 and 5 for each of the n questions for a particular sub-category m, the sub-score.sub.m obtained from any single end-user will always be between 1.0 and 5.0.
[0132] In order to compute an aggregated sub-score sub-score.sub.m.agg for sub-category m based on feedback data obtained from a number of e end-users, the following formula can be used: sub-score.sub.m.agg=(sub-score.sub.m,1+sub-score.sub.m,2+ . . . +sub-score.sub.m,e)/e
[0133] The user experience score may then be computed based on the aggregated sub-scores computed for the totality of sub-categories f as follows: UX-score=(sub-score.sub.1.agg+sub-score.sub.2.agg+ . . . +sub-score.sub.f.agg)/f
[0134] The radius for the visualization of the individual sub-scores sub-score.sub.1.agg, sub-score.sub.2.agg, sub-score.sub.f.agg can then be calculated using the rule of three based on a given maximum radius in pixels, which corresponds to the maximum possible aggregated sub-score value for a sub-category (e.g., “5”).
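The sub-score, aggregated sub-score, UX-score and rule-of-three radius computations described above can be sketched as follows; the example answers and the maximum radius are illustrative:

```python
def sub_score(answers):
    """Mean of the n per-question answers for one sub-category m."""
    return sum(answers) / len(answers)

def aggregated_sub_score(per_user_sub_scores):
    """Mean of one sub-category's sub-scores over the e end-users."""
    return sum(per_user_sub_scores) / len(per_user_sub_scores)

def ux_score(aggregated_sub_scores):
    """Mean over the f aggregated sub-scores."""
    return sum(aggregated_sub_scores) / len(aggregated_sub_scores)

def radius_px(agg_sub_score, max_radius_px=100, max_score=5.0):
    """Rule of three: the maximum possible aggregated sub-score maps
    onto the given maximum radius in pixels."""
    return agg_sub_score * max_radius_px / max_score

# Two end-users answer three questions of the "consistency" sub-category:
user_a = sub_score([4, 4, 3])
user_b = sub_score([5, 4, 4])
agg = aggregated_sub_score([user_a, user_b])
print(agg, radius_px(agg))
```

With answers between 1 and 5, each per-user sub-score lies between 1.0 and 5.0, consistent with paragraph [0131].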
[0135] Thus, the depicted graphical representation provides a reproducible, quantitative as well as qualitative indicator of user experience for a plurality of different aspects of user interaction with a given application program. The complex usability score comprising a plurality of different sub-scores allows identifying strengths and weaknesses of each individual software application and allows manually, semi-automatically or automatically improving the usability of a particular application program. For example, by comparing the graphical score representations of two different application programs used for in-house vacation management, the admin user of a customer using a specific vacation management software can easily determine whether the software which is used provides a better user experience than most of the other software applications used for the same or a similar purpose by similar companies.
[0136] Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiment disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the embodiment being indicated by the following claims.