Patent classifications
G06F11/3668
Orchestration for automated performance testing
Methods, systems, and devices supporting orchestration for automated performance testing are described. A server may orchestrate performance testing for software applications across multiple different test environments. The server may receive a performance test indicating an application to test and a set of test parameters. The server may determine a local or a non-local test environment for running the performance test. The server may deploy the application to the test environment, where the deploying involves deploying a first component of the performance test to a first test artifact in the test environment and deploying a second component of the performance test different from the first component to a second test artifact in the test environment. The server may execute the performance test to obtain a result set, where the executing involves executing multiple performance test components as well as orchestrating results across multiple test artifacts to obtain the result set.
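The claimed flow (deploy distinct components to distinct artifacts, execute, then orchestrate partial results into one result set) can be sketched as follows. This is an illustrative sketch only; the class and method names (`TestArtifact`, `Orchestrator`, etc.) are hypothetical and not taken from the patent.

```python
class TestArtifact:
    """A deployable unit inside a test environment (e.g., a load generator)."""
    def __init__(self, name):
        self.name = name
        self.component = None

    def deploy(self, component):
        self.component = component

    def run(self, parameters):
        # Stand-in for actually executing the deployed component.
        return {"artifact": self.name, "component": self.component,
                "latency_ms": parameters.get("baseline_ms", 100)}


class Orchestrator:
    def __init__(self, environment):
        self.environment = environment  # list of TestArtifact

    def deploy(self, components):
        # Deploy each performance-test component to its own artifact.
        for artifact, component in zip(self.environment, components):
            artifact.deploy(component)

    def execute(self, parameters):
        # Run every artifact, then orchestrate the partial results
        # across artifacts into a single result set.
        runs = [a.run(parameters) for a in self.environment]
        return {"runs": runs,
                "max_latency_ms": max(r["latency_ms"] for r in runs)}


env = [TestArtifact("artifact-1"), TestArtifact("artifact-2")]
orch = Orchestrator(env)
orch.deploy(["load-generator", "metrics-collector"])
result_set = orch.execute({"baseline_ms": 120})
```

The local/non-local environment choice from the abstract would simply select which list of artifacts is handed to the orchestrator.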
Techniques for conformance testing computational operations
Examples described herein generally relate to performing conformance testing of a computational operation. A reference result including one or more reference intermediate products and a reference accumulator output at a first level of precision can be generated for the computational operation and based on one or more inputs. A hardware result can similarly be created using hardware at a second level of precision. The reference result can be compared to the hardware result to determine a variance value. A conformance result can be output based on whether the variance value is within a threshold range.
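A minimal sketch of this comparison, assuming a dot product as the computational operation: the reference result keeps the intermediate products and accumulator output at full double precision, a rounding function stands in for the lower-precision hardware, and conformance holds when the variance falls inside a threshold. All names and the quantization scheme are illustrative assumptions.

```python
import math

def reference_dot(a, b):
    # Reference result at the first (full) level of precision:
    # intermediate products plus an accumulator output.
    products = [x * y for x, y in zip(a, b)]
    return products, math.fsum(products)

def hardware_dot(a, b, mantissa_bits=10):
    # Stand-in for a hardware unit at a second, lower level of precision:
    # round every intermediate product and accumulation step.
    def quantize(v):
        if v == 0:
            return 0.0
        exp = math.floor(math.log2(abs(v)))
        scale = 2 ** (exp - mantissa_bits)
        return round(v / scale) * scale
    acc = 0.0
    for x, y in zip(a, b):
        acc = quantize(acc + quantize(x * y))
    return acc

def conformance(a, b, threshold=1e-2):
    # Compare reference and hardware results; conform if the
    # variance value is within the threshold range.
    _, ref = reference_dot(a, b)
    hw = hardware_dot(a, b)
    variance = abs(ref - hw)
    return variance <= threshold, variance
```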
Software pipeline configuration
In certain embodiments, a software pipeline (“pipeline”) is configured using gates for progressing an application from one stage to another (e.g., from a development stage to a production stage). A configuration file having a set of attribute values descriptive of an application, and a gate mapping file having information associated with the gates to be invoked for different combinations of attribute values, are obtained. The configuration file is processed using the gate mapping file to determine a set of gates to be invoked for progressing the application in the pipeline based on the attribute values of the application. The set of gates is then invoked to cause a corresponding set of software routines to be executed for progressing the application.
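The configuration-file-plus-gate-mapping lookup can be sketched as below. The attribute names, gate names, and rule format are hypothetical; the patent does not prescribe a concrete file format.

```python
# Gate mapping: attribute values that must match -> gates to invoke.
GATE_MAPPING = [
    ({"criticality": "high"}, ["security-scan", "manual-approval"]),
    ({"language": "python"}, ["unit-tests", "lint"]),
    ({"stage": "production"}, ["smoke-tests"]),
]

def select_gates(config):
    """Return the ordered, de-duplicated set of gates whose rule
    attributes are all satisfied by the application's configuration."""
    gates = []
    for required, rule_gates in GATE_MAPPING:
        if all(config.get(k) == v for k, v in required.items()):
            for g in rule_gates:
                if g not in gates:
                    gates.append(g)
    return gates

# Attribute values as they might appear in an application's configuration file.
app_config = {"language": "python", "criticality": "high", "stage": "production"}
gates = select_gates(app_config)
# Each selected gate would then trigger its corresponding software routine.
```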
Intelligent services for application dependency discovery, reporting, and management tool
Techniques for monitoring operating statuses of an application and its dependencies are provided. A monitoring application may collect and report the operating status of the monitored application and each dependency. Through use of existing monitoring interfaces, the monitoring application can collect operating status without requiring modification of the underlying monitored application or dependencies. The monitoring application may determine a problem service that is a root cause of an unhealthy state of the monitored application. Dependency analyzer and discovery crawler techniques may automatically configure and update the monitoring application. Machine learning techniques may be used to determine patterns of performance based on system state information associated with performance events and provide health reports relative to a baseline status of the monitored application. Also provided are techniques for testing a response of the monitored application through modifications to API calls. Such tests may be used to train the machine learning model.
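The root-cause step (finding the problem service behind an unhealthy monitored application) amounts to walking the dependency graph for an unhealthy service whose own dependencies are all healthy. A minimal sketch, with an invented graph and statuses:

```python
# service -> services it depends on (hypothetical example graph).
DEPENDENCIES = {
    "web-app": ["auth", "orders"],
    "orders": ["database", "payments"],
    "auth": [],
    "database": [],
    "payments": [],
}

# Statuses as collected through existing monitoring interfaces.
STATUS = {"web-app": "unhealthy", "auth": "healthy", "orders": "unhealthy",
          "database": "unhealthy", "payments": "healthy"}

def problem_services(service):
    """Return unhealthy services with no unhealthy dependencies of
    their own -- candidates for the root cause of the monitored
    application's unhealthy state."""
    roots, seen, stack = [], set(), [service]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        unhealthy_deps = [d for d in DEPENDENCIES[s]
                          if STATUS[d] == "unhealthy"]
        if STATUS[s] == "unhealthy" and not unhealthy_deps:
            roots.append(s)  # unhealthy, but its dependencies are fine
        stack.extend(DEPENDENCIES[s])
    return roots
```

In the patent's scheme, a discovery crawler would build `DEPENDENCIES` automatically rather than it being declared by hand.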
Recommending programmatic descriptions for test objects
A technique includes receiving, by a computer, user input representing creation of a first programmatic description of a first test object of source code to be tested. The technique includes, in response to receiving the user input, determining, by the computer, based on other programmatic descriptions of other test objects, a recommendation of a parameter to be used in the first programmatic description to identify the first test object. The technique includes causing, by the computer, a display of the recommendation.
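The recommendation step can be sketched as a frequency count over parameters used in other programmatic descriptions of similar test objects. The description format and object types below are illustrative assumptions.

```python
from collections import Counter

# Other programmatic descriptions of test objects (hypothetical data).
EXISTING_DESCRIPTIONS = [
    {"type": "button", "params": {"name": "submit", "index": 0}},
    {"type": "button", "params": {"name": "cancel", "css": ".btn"}},
    {"type": "button", "params": {"name": "ok"}},
    {"type": "textbox", "params": {"xpath": "//input[1]"}},
]

def recommend_parameter(object_type):
    """Recommend the parameter most frequently used to identify test
    objects of the same type in the other descriptions."""
    counts = Counter()
    for desc in EXISTING_DESCRIPTIONS:
        if desc["type"] == object_type:
            counts.update(desc["params"].keys())
    if not counts:
        return None
    return counts.most_common(1)[0][0]
```

On user input creating a new `button` description, the computer would display `name` as the recommended identifying parameter.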
Method and system for testing an avionic computer
A method for testing an avionic computer having internal parameters, only a subset of which is accessible to a test bench. The method includes connecting the avionic computer to the test bench; equipping the test bench with a test computer running software similar to that of the avionic computer, all of whose internal parameters are accessible to the test bench; executing the software of the avionic computer in interaction with the test bench while simultaneously executing the software of the test computer; and visualizing the internal parameters of the avionic computer belonging to the accessible subset, together with the internal parameters of the test computer that correspond to avionic-computer parameters outside that subset, to check the conformity of operation of the software.
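The visualization step merges two sources: the avionic computer supplies the accessible parameters, and the test computer supplies stand-in values for the rest. A minimal sketch, with invented parameter names and tolerance:

```python
# Subset of internal parameters accessible to the bench (hypothetical).
ACCESSIBLE = {"airspeed", "altitude"}

def bench_view(avionic_params, test_computer_params):
    """Prefer the real avionic value when the parameter is accessible;
    otherwise fall back to the test computer running similar software."""
    view = {}
    for name in test_computer_params:
        if name in ACCESSIBLE:
            view[name] = ("avionic", avionic_params[name])
        else:
            view[name] = ("test-computer", test_computer_params[name])
    return view

def conforms(avionic_params, test_computer_params, tolerance=0.5):
    """Check conformity of operation on the accessible subset only,
    where both computers can be compared directly."""
    return all(abs(avionic_params[n] - test_computer_params[n]) <= tolerance
               for n in ACCESSIBLE)

avionic = {"airspeed": 250.0, "altitude": 10000.0}
test_cpu = {"airspeed": 250.2, "altitude": 10000.0, "fuel_flow": 1200.0}
view = bench_view(avionic, test_cpu)
```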
Remote software usage monitoring and entitlement analysis
A computational instance of a remote network management platform may execute a remote access call for a license consolidation server. The remote access call may contain instructions for obtaining, from the license consolidation server, concurrent license usage statistics for a concurrent software application. In response to obtaining the concurrent license usage statistics, the computational instance may update a software configuration with those statistics, where the software configuration contains a license rights allocation for the concurrent software application. Based on the concurrent license usage statistics and the license rights allocation, the computational instance may generate a representation of a graphical user interface that contains an overview pane indicating a utilization of the concurrent software application. The computational instance may then transmit the representation of the graphical user interface to a client device.
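The utilization shown in the overview pane can be sketched as peak concurrent usage divided by the allocated license rights. The field names and sample figures are illustrative assumptions, not taken from the patent.

```python
def update_software_configuration(config, usage_samples):
    """Record concurrent license usage statistics on the software
    configuration, as obtained from the license consolidation server."""
    config["peak_concurrent"] = max(usage_samples)
    config["avg_concurrent"] = sum(usage_samples) / len(usage_samples)
    return config

def utilization(config):
    """Fraction of the concurrent license rights actually consumed."""
    return config["peak_concurrent"] / config["license_rights"]

# Software configuration holding the license rights allocation.
config = {"application": "cad-suite", "license_rights": 50}
update_software_configuration(config, usage_samples=[12, 30, 45, 38])
# utilization(config) -> 45 / 50 = 0.9, surfaced in the overview pane
```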
Systems and methods for provisioning and decoupled maintenance of cloud-based database systems
Methods and systems are described for provisioning cloud-based database systems and performing decoupled maintenance. For example, conventional systems may rely on database management systems to provision and modify databases hosted by a service provider. However, for entities operating complex database systems that need highly customized cloud infrastructure, database management systems fail to provide the granular customization and control necessary to create and service these systems. In contrast, the described solutions improve over conventional database management system architecture by providing direct communication between an entity and its cloud-based database systems via command line prompts or API calls, decoupling database system maintenance from the database system provisioning process to increase the speed and granular customization of the database system. Moreover, the disclosed solution leverages machine learning to predict optimal database system provisioning and maintenance processes and resources.
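The decoupling can be sketched as two independent services: provisioning creates the database system, and maintenance commands (as they might arrive via command line prompts or API calls) act on it directly without re-entering the provisioning path. All class names and command shapes are hypothetical.

```python
class ProvisioningService:
    """Creates a database system from a specification."""
    def provision(self, spec):
        return {"db_id": spec["name"], "storage_gb": spec["storage_gb"],
                "status": "running"}

class MaintenanceService:
    """Applies maintenance to an already-provisioned system,
    independently of the provisioning service."""
    def apply(self, db, command):
        if command["action"] == "resize":
            db["storage_gb"] = command["storage_gb"]
        elif command["action"] == "patch":
            db["version"] = command["version"]
        return db

provisioner = ProvisioningService()
maintainer = MaintenanceService()

db = provisioner.provision({"name": "orders-db", "storage_gb": 100})
# A later maintenance command resizes storage without re-provisioning.
db = maintainer.apply(db, {"action": "resize", "storage_gb": 250})
```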
Generation of an issue response analysis evaluation regarding a system aspect of a system
A method includes determining, by an analysis system, a system aspect of a system for an issue response analysis evaluation. The method further includes determining, by the analysis system, at least one evaluation perspective and at least one evaluation viewpoint for use in performing the issue response analysis evaluation on the system aspect. The method further includes obtaining, by the analysis system, issue response analysis data regarding the system aspect in accordance with the at least one evaluation perspective and the at least one evaluation viewpoint. The method further includes calculating, by the analysis system, an issue response analysis rating as a measure of system issue response analysis maturity for the system aspect based on the issue response analysis data, the at least one evaluation perspective, the at least one evaluation viewpoint, and at least one evaluation rating metric.
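One plausible reading of the rating calculation is a metric score aggregated over the collected data, weighted by evaluation perspective and filtered by evaluation viewpoint. The perspective names, viewpoint names, weights, and scores below are all invented for illustration.

```python
# Weight per evaluation perspective (hypothetical values).
PERSPECTIVES = {"disclosed": 1.0, "discovered": 0.8}
# Evaluation viewpoints considered in this run (hypothetical).
VIEWPOINTS = {"understanding", "implementation"}

def issue_response_rating(data):
    """Weighted average of metric scores over the issue response
    analysis data, as a measure of issue response analysis maturity."""
    total = weight_sum = 0.0
    for point in data:
        if point["viewpoint"] not in VIEWPOINTS:
            continue
        w = PERSPECTIVES[point["perspective"]]
        total += w * point["score"]
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

data = [
    {"perspective": "disclosed", "viewpoint": "understanding", "score": 0.9},
    {"perspective": "discovered", "viewpoint": "implementation", "score": 0.5},
]
rating = issue_response_rating(data)  # (1.0*0.9 + 0.8*0.5) / 1.8
```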
Penetration testing for API service security
According to some embodiments, a method comprises: obtaining an application programming interface (API) specification for an API service; performing one or more tests on the API service to determine an amount of deviation between the API service and the API specification; and determining a deviation score based on the amount of deviation between the API service and the API specification. The method may include transmitting the deviation score to a scoring agent.
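A minimal sketch of the deviation score: probe each operation in the API specification and compute the fraction whose observed behavior deviates from it. The probe results are hard-coded here as a stand-in for live requests, and the endpoints are invented examples.

```python
# Expected behavior per operation, from the API specification.
API_SPEC = {
    ("GET", "/users"): {"status": 200},
    ("POST", "/users"): {"status": 201},
    ("DELETE", "/users/{id}"): {"status": 204},
}

# Observed behavior from the tests run against the API service.
OBSERVED = {
    ("GET", "/users"): {"status": 200},
    ("POST", "/users"): {"status": 200},      # deviates from the spec
    ("DELETE", "/users/{id}"): {"status": 204},
}

def deviation_score(spec, observed):
    """Fraction of specified operations whose observed behavior
    deviates from the specification."""
    deviations = sum(1 for op, expected in spec.items()
                     if observed.get(op) != expected)
    return deviations / len(spec)

score = deviation_score(API_SPEC, OBSERVED)
# The score would then be transmitted to a scoring agent.
```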