Patent classifications
G06F11/368
TECHNIQUES FOR VISUAL SOFTWARE TEST AUTOMATION MANAGEMENT
Various embodiments of the present invention provide methods, apparatuses, systems, computing devices, computing entities, and/or the like for executing efficient and reliable techniques for generating automated testing workflow data entities based at least in part on session data entities, integrating nestable automated testing workflow data entities into integrative automated testing workflow data entities, and generating execution logs for automated testing workflow data entities.
TECHNIQUES FOR DECOUPLED MANAGEMENT OF SOFTWARE TEST EXECUTION PLANNING AND CORRESPONDING SOFTWARE TEST EXECUTION RUNS
Various embodiments of the present invention provide methods, apparatuses, systems, computing devices, computing entities, and/or the like for executing efficient and reliable techniques for software test execution planning by utilizing at least one of static execution plan data entities, dynamic execution plan data entities, worksheet execution run data entities, automated execution run data entities, and manual execution run data entities.
Method of, and apparatus for, testing computer hardware and software
A method for generating an automated test configured to test a system under test. The system under test has a plurality of operational states, at least one operational state having one or more executable actions associated therewith operable to execute predetermined operations and/or transition the system under test between operational states. The method includes the steps of: defining a model of the system under test comprising a plurality of model states; defining one or more selectable model actions; associating one or more test description sections with one or more model actions; selecting a sequence of model actions; and utilising the test description sections associated with the selected sequence of model actions to define a sequence of operation commands for execution on the system under test as an automated test.
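The abstract above describes associating test description sections with model actions and concatenating them along a selected action sequence. A minimal sketch of that idea, with entirely hypothetical model actions and command strings, might look like:

```python
# Sketch of model-based test generation: each model action carries a
# "test description section" (the concrete operation commands needed to
# drive the system under test through that action). Action names and
# commands below are hypothetical.
MODEL_ACTIONS = {
    "login":  ["open /login", "type user", "type pass", "click submit"],
    "search": ["type query", "click search"],
    "logout": ["click logout"],
}

def build_automated_test(action_sequence):
    """Concatenate the command sections of the selected model actions."""
    commands = []
    for action in action_sequence:
        commands.extend(MODEL_ACTIONS[action])
    return commands

# Selecting a sequence of model actions yields a runnable command sequence.
print(build_automated_test(["login", "search", "logout"]))
```

The selected action sequence corresponds to a path through the model states; the resulting flat command list is what would be executed against the system under test as the automated test.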
Mobile log heatmap-based auto testcase generation
A system is provided for mobile log heatmap-based auto test case generation. In particular, the system may continuously track and log user actions and data flows for applications within the production environment. Based on the logs, the system may generate a navigation network graph through which the system may identify all possible navigation paths that may be taken by the user to access certain functions or screens of the application. Once the paths have been identified, the system may collect and sanitize testing data based on user session and system interaction data in the production environment. The testing data may then be used to drive the development of the next release or version of the application.
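The navigation-graph step above, identifying all possible paths a user can take to reach a given screen, can be sketched with a simple adjacency list and depth-first path enumeration. The screen names and log tuples below are hypothetical:

```python
# Sketch: derive a navigation graph from logged screen transitions and
# enumerate all simple (cycle-free) paths from an entry screen to a
# target screen. Log format and screen names are hypothetical.
from collections import defaultdict

logs = [("home", "search"), ("home", "profile"),
        ("search", "results"), ("profile", "results")]

graph = defaultdict(list)
for src, dst in logs:
    graph[src].append(dst)

def all_paths(graph, start, goal, path=None):
    """Return every simple path from start to goal in the graph."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    paths = []
    for nxt in graph[start]:
        if nxt not in path:  # skip already-visited screens to avoid cycles
            paths.extend(all_paths(graph, nxt, goal, path))
    return paths

print(all_paths(graph, "home", "results"))
```

Each enumerated path is a candidate navigation flow for which session-derived test data can then be collected and sanitized, per the abstract.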
METHOD AND SYSTEM FOR MANAGING LIFE CYCLE ITERATION OF TEST CASE, AND MEDIUM
The disclosure provides a method and system for managing a life cycle iteration of a test case, where the method includes the following steps: writing a test case according to a mind map template; importing the written test case in batches based on the mind map template; storing the imported test case, and marking the test case, where the marking includes marking the test case as a manual test case; and creating an automated test case, and associating the automated test case with the manual test case.
RISK-BASED ROOT CAUSE IDENTIFICATION METHODS AND RELATED AUTOBUILD SYSTEMS
Database systems and methods are provided for identifying a change associated with an update to executable code resulting in test failure. One method involves calculating risk scores for different changes associated with the update based on change characteristics associated with the respective changes, identifying a change from among the different changes associated with the update based on the risk scores associated with the respective changes, generating a modified update to the executable code that includes the identified change and excludes remaining changes of the update from the modified update, and initiating execution of one or more tests with respect to a compiled version of the modified update to the executable code. When execution of the one or more tests against the modified update results in a test failure, the change is identified as a potential root cause of the test failure associated with the update.
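The isolation step described above, scoring each change, rebuilding the update with only the highest-risk change, and re-testing, can be sketched as follows. The scoring formula and change fields are assumptions for illustration only:

```python
# Sketch of risk-based change isolation. A hypothetical risk score:
# more changed lines with lower test coverage implies higher risk.
def risk_score(change):
    return change["lines_changed"] * (1.0 - change["coverage"])

def isolate_riskiest(update):
    """Build a modified update containing only the riskiest change."""
    riskiest = max(update["changes"], key=risk_score)
    return {"changes": [riskiest]}  # remaining changes are excluded

update = {"changes": [
    {"id": "c1", "lines_changed": 5,  "coverage": 0.9},
    {"id": "c2", "lines_changed": 40, "coverage": 0.2},
]}
modified = isolate_riskiest(update)
# If tests against the compiled modified update fail, this change is
# flagged as the potential root cause.
print(modified["changes"][0]["id"])
```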
Operation management server, development operation support system, method thereof, and non-transitory computer readable medium storing program thereof
An operation management server (20) includes: an associating unit (21) configured to associate an updated content of configuration information of a CMDB (30) with a test scenario corresponding to the updated content; a setting unit (22) configured to set a target system (40) according to the updated content of the configuration information of the CMDB (30); and a test executing unit (23) configured to execute a system test on the set target system (40) based on the test scenario associated with the updated content. This configuration provides an operation management server and a development operation support system capable of easily executing a system test, and a method and a program thereof.
Analysis of code coverage differences across environments
Methods, systems, and computer-readable media for analysis of code coverage differences across environments are disclosed. A code coverage profiling system determines a first code coverage profile associated with execution of program code in a first environment. The first code coverage profile indicates one or more portions of the program code that were executed in the first environment. The code coverage profiling system determines a second code coverage profile associated with execution of the program code in a second environment. The second code coverage profile indicates one or more portions of the program code that were executed in the second environment. The code coverage profiling system performs a comparison of the first code coverage profile and the second code coverage profile. The comparison determines a difference between the portions of the program code that were executed in the first and second environments.
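Representing each coverage profile as a set of executed (file, line) locations makes the comparison above a pair of set differences. The environment names and locations below are hypothetical:

```python
# Sketch: diff two code coverage profiles, each a set of (file, line)
# pairs that were executed in that environment. Profiles are hypothetical.
profile_env1 = {("app.py", 10), ("app.py", 11), ("util.py", 3)}
profile_env2 = {("app.py", 10), ("util.py", 3), ("util.py", 4)}

only_in_env1 = profile_env1 - profile_env2  # executed in env1 but not env2
only_in_env2 = profile_env2 - profile_env1  # executed in env2 but not env1

print(sorted(only_in_env1))
print(sorted(only_in_env2))
```

The two difference sets directly answer which portions of the program code were exercised in one environment but not the other, e.g. code paths hit in production that a test environment never reaches.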
Distributed canary testing with test artifact caching
Methods, systems, and computer-readable media for distributed canary testing with test artifact caching are disclosed. Using one or more storage components, a test client stores one or more software artifacts for testing of a software product. The client initiates a first test of the software product using the software artifact(s) stored in the storage component(s). In the first test, the client sends a first set of requests to the software product at a first point in time. The client initiates a second test of the software product using the software artifact(s) stored in the storage component(s). In the second test, the client sends a second set of requests to the software product at a second point in time. The software artifact(s) are maintained in the storage component(s) between the first point in time and the second point in time.
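The caching behavior described, fetching artifacts once and reusing them across tests at different points in time, can be sketched with a minimal client whose storage component is an in-memory dictionary. The class, artifact names, and payloads are hypothetical:

```python
# Sketch of a canary test client that maintains software artifacts in a
# storage component between test runs, so a later run reuses the same
# artifacts rather than re-fetching them. All names are hypothetical.
class CanaryClient:
    def __init__(self):
        self._cache = {}      # the storage component
        self.fetch_count = 0  # counts expensive artifact fetches

    def get_artifact(self, name):
        if name not in self._cache:
            self.fetch_count += 1  # fetch only on a cache miss
            self._cache[name] = f"payload-for-{name}"
        return self._cache[name]

    def run_test(self, name):
        """Send a set of requests built from the cached artifact."""
        artifact = self.get_artifact(name)
        return {"sent": artifact}

client = CanaryClient()
first = client.run_test("checkout-requests")   # first point in time
second = client.run_test("checkout-requests")  # second point in time
print(client.fetch_count)  # artifact was fetched only once
```

Because the artifact persists in the storage component between the two points in time, both test runs exercise the software product with identical inputs while avoiding a second fetch.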
Testing systems and methods
A computer implemented method, system and computing device for identifying a test option associated with an application for a user is described. The method comprises selecting a predefined test indicated by a test identifier associated with the requested application, the test having more than one test option associated therewith, generating a hash of the test identifier and a user identifier associated with the user, processing the hash to generate an index, comparing said index with a distribution of numbers divided into multiple ranges, each range being associated with a test option, and selecting a test option associated with the range into which the index falls. The applications may be computer gaming applications.
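The hash-to-range assignment described above can be sketched concretely: hash the test identifier together with the user identifier, reduce the digest to an index, and walk weighted ranges until the index falls inside one. SHA-256 and the option names and weights are assumptions not stated in the abstract:

```python
# Sketch of hash-based test-option selection. The index is deterministic
# per (test, user) pair, so a user always lands in the same option range.
# Option names and weights are hypothetical.
import hashlib

def pick_option(test_id, user_id, options):
    """options: list of (name, weight) pairs defining the ranges."""
    total = sum(weight for _, weight in options)
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    index = int(digest, 16) % total   # index into the number distribution
    cursor = 0
    for name, weight in options:      # find the range the index falls in
        cursor += weight
        if index < cursor:
            return name

options = [("variant_a", 50), ("variant_b", 50)]
print(pick_option("new-lobby-test", "user-42", options))
```

Because the assignment depends only on the hash of the two identifiers, no per-user state needs to be stored, yet each user consistently sees the same test option for a given test.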