G06F11/3461

Devices, systems and methods for optimizing workload performance of user-facing web applications during high-load events

Disclosed are devices, systems, apparatuses, methods, products, and other implementations for optimizing system performance of user-facing web applications with load testing scripts. According to some embodiments, the system includes an analytics engine and a workload model including one or more load variables. The workload model generates a distribution of values for each of the one or more load variables. The system further includes a script engine and a load test controller that controls load generators to simulate internet traffic to a website. The load test controller determines an amount of computer resources needed to meet a high-load scenario based on the performance of the system in response to the simulated internet traffic to the website.
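The claimed flow (a workload model producing per-variable value distributions, and a controller that sizes resources for a high-load scenario) can be sketched roughly as follows. Everything here is illustrative: the `(mean, stddev)` model spec, the `requests_per_sec` variable name, the percentile sizing rule, and the capacity figures are all assumptions, not details from the patent.

```python
import math
import random

def sample_load_variables(model, n=1000, seed=0):
    """Draw n values for each load variable from its (mean, stddev) spec.

    `model` maps a variable name (e.g. "requests_per_sec") to a
    (mean, stddev) pair -- a hypothetical stand-in for the workload
    model's learned distributions.
    """
    rng = random.Random(seed)
    return {name: [max(0.0, rng.gauss(mu, sigma)) for _ in range(n)]
            for name, (mu, sigma) in model.items()}

def servers_for_high_load(samples, capacity_per_server, percentile=0.95):
    """Size the deployment for a high-load scenario: take the chosen
    percentile of simulated requests/sec and divide by per-server capacity."""
    values = sorted(samples["requests_per_sec"])
    peak = values[int(percentile * (len(values) - 1))]
    return math.ceil(peak / capacity_per_server)
```

A controller in this sketch would feed the sampled distribution to the load generators and use the percentile-based estimate as its provisioning target.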

PRE-SILICON CHIP MODEL OF EXTRACTED WORKLOAD INNER LOOP INSTRUCTION TRACES

A system is provided to validate a computer processor. The system includes a computing system configured to obtain core dump data including executable instructions corresponding to a code stored in a legacy processor. An instruction-level simulator is installed in the computing system and is configured to simulate the executable instructions to generate a plurality of instruction traces. The system further includes a pre-silicon chip model simulator configured to execute the instruction traces to generate performance data. The computer processor is verified based at least in part on the performance data.
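The validation pipeline described above (instruction-level simulation producing traces, a pre-silicon chip model consuming them, and a performance check) can be outlined as below. The callables, the `ipc` metric, and the threshold are hypothetical stand-ins for the simulators and the verification criterion, which the abstract does not specify.

```python
def extract_traces(instructions, simulate_step):
    """Run each executable instruction (e.g. recovered from core dump
    data) through an instruction-level simulator step and collect the
    resulting instruction traces."""
    return [simulate_step(ins) for ins in instructions]

def verify_processor(traces, chip_model, threshold_ipc):
    """Execute the traces on the pre-silicon chip model (a callable
    stand-in here) and verify the processor against a performance
    target -- an assumed minimum instructions-per-cycle figure."""
    perf = chip_model(traces)  # e.g. {"ipc": ...}
    return perf["ipc"] >= threshold_ipc
```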

ARCHITECTURE-AGNOSTIC REPLAY VERIFICATION
20210191841 · 2021-06-24

According to aspects of the disclosure, a method is provided, comprising: generating a live execution trace log corresponding to a live execution of a computer program, the live execution being performed by using both hardware emulation and hardware acceleration; generating a first trace entry corresponding to a replay execution of the computer program, the replay execution being performed by using hardware emulation without hardware acceleration, the replay execution being performed based on a set of events that are recorded during the live execution of the computer program; detecting whether the first trace entry is valid based on the live execution trace log; and in response to detecting that the first trace entry is not valid, transitioning into a safe state.
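The core check (replay-generated trace entries validated against the live execution trace log, with a transition to a safe state on the first mismatch) reduces to a simple comparison loop. This is a minimal sketch assuming entries are comparable values and that positional equality defines validity; the real validity criterion is not given in the abstract.

```python
def validate_replay(live_trace_log, replay_entries, enter_safe_state):
    """Compare each trace entry produced during replay against the live
    execution trace log. On the first invalid entry, invoke the
    safe-state transition (a caller-supplied callback here) and stop."""
    for i, entry in enumerate(replay_entries):
        if i >= len(live_trace_log) or entry != live_trace_log[i]:
            enter_safe_state(i, entry)  # divergence: replay is not valid
            return False
    return True
```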

Method and apparatus for a parallel, metadata-based trace analytics processor
11126532 · 2021-09-21

Method and apparatus for a parallel, metadata-based trace analytics processor is disclosed. The trace analytics processor is able to asynchronously parallelize the processing operation and use metadata about each parallel operation intelligently. The result is the ability to get analytics results quickly, efficiently, and in real time.
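One plausible reading of "asynchronously parallelize the processing operation and use metadata about each parallel operation" is a fan-out/merge pipeline in which each worker returns per-chunk metadata that steers the combine step. The chunk schema, the `chunk_id` merge key, and the latency statistics below are all assumptions for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(chunk):
    """Process one trace chunk; return partial results plus metadata
    about the operation (here: the chunk id and event count) so the
    merge step can combine partials intelligently."""
    events = chunk["events"]
    return {
        "chunk_id": chunk["chunk_id"],
        "count": len(events),
        "max_latency": max(e["latency"] for e in events),
    }

def parallel_trace_analytics(chunks, workers=4):
    """Fan the chunks out across a worker pool, then merge the partial
    results using each chunk's metadata."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(analyze_chunk, chunks))
    partials.sort(key=lambda p: p["chunk_id"])  # metadata-driven ordering
    return {
        "total_events": sum(p["count"] for p in partials),
        "max_latency": max(p["max_latency"] for p in partials),
    }
```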

SOFTWARE BUG REPRODUCTION

Example methods and systems for software bug reproduction. One example method may comprise obtaining log information associated with multiple transactions processed by a control-plane node to configure a set of data-plane nodes and transform an initial network state to a first network state; and configuring a replay environment that is initialized to the initial network state and includes a mock control-plane node and a set of mock data-plane nodes. The method may also comprise, based on the log information, replaying the multiple transactions using the mock control-plane node to configure the set of mock data-plane nodes and transform the replay environment from the initial network state to a second network state. Based on a comparison between the first network state and the second network state, a determination may be made as to whether a software bug is successfully reproduced in the replay environment.
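The replay-and-compare idea can be sketched in a few lines: apply the logged transactions to a mock environment starting from the initial state, then test whether the resulting second state matches the first state captured in production. The transaction tuple shape and the flat per-node key/value state are assumptions; a real control plane's transactions and state would be far richer.

```python
import copy

def replay_and_compare(initial_state, transactions, first_network_state):
    """Replay logged transactions against mock data-plane nodes starting
    from the initial network state, then compare the resulting (second)
    state with the first state observed in production.

    Each transaction is assumed to be a (node, key, value) tuple.
    """
    state = copy.deepcopy(initial_state)  # the mock replay environment
    for node, key, value in transactions:
        state.setdefault(node, {})[key] = value
    second_network_state = state
    reproduced = (second_network_state == first_network_state)
    return reproduced, second_network_state
```

An equal-state outcome suggests the replay faithfully reproduced the production run; a mismatch localizes where the replayed behavior diverged.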

Graphical user interface (GUI) for representing instrumented and uninstrumented objects in a microservices-based architecture

A method of rendering a graphical user interface (GUI) comprising an application topology graph for a microservice architecture comprises generating a plurality of traces from a first plurality of spans generated by instrumented services in the architecture and generating a second plurality of spans for uninstrumented services using information extracted from the first plurality of spans. The method further comprises grouping the second plurality of spans with the plurality of traces. Subsequently, the method comprises traversing the traces and collecting a plurality of span pairs from the plurality of traces, wherein each pair of the span pairs is associated with a call between two services. The method also comprises aggregating information across the plurality of span pairs to reduce duplicative information associated with multiple occurrences of a same span pair from the plurality of span pairs. Finally, the method comprises rendering the application topology graph using the aggregated information.
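The span-pair collection and aggregation steps can be sketched as below: walk each trace's parent links to yield (caller, callee) service pairs, then count them so repeated occurrences of the same pair collapse into one weighted topology edge. The span dictionary fields (`span_id`, `parent_id`, `service`) are assumed names, not the patent's schema.

```python
from collections import Counter

def collect_span_pairs(trace):
    """Walk one trace (a list of spans with parent ids) and yield a
    (caller_service, callee_service) pair for each call edge."""
    by_id = {s["span_id"]: s for s in trace}
    for span in trace:
        parent = by_id.get(span.get("parent_id"))
        if parent:
            yield (parent["service"], span["service"])

def aggregate_edges(traces):
    """Aggregate span pairs across all traces so duplicative occurrences
    of the same caller->callee pair become a single counted edge,
    ready for rendering as a topology graph."""
    edges = Counter()
    for trace in traces:
        edges.update(collect_span_pairs(trace))
    return edges
```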

Simulating Human Usage of a User Interface

A method, computer program, and computer system are provided for testing a user interface. A previously trained machine learning model trained with traces of interactions between one or more users and a user interface is accessed. The interactions include one or more timestamps of user interactions with the user interface, actions by each user associated with the user interface, and metadata associated with user interactions. A simulated interaction of a simulated agent utilizing the user interface is generated using the previously trained machine learning model. The simulated interaction is encoded as an input trace to a user interface. The encoded simulated interaction is input into the user interface for automated testing of the user interface. Results of the automated testing of the user interface are received.
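The encoding step (turning the model's simulated agent behavior into an input trace of timestamped actions with metadata) could look roughly like this. The model here is a plain callable returning an (action, delay) pair, which is an assumption; the patent's model is a trained ML model, not specified further.

```python
def simulate_interaction(model, ui_state, steps):
    """Use a trained model (a callable stand-in here) to generate a
    simulated agent's interaction and encode it as an input trace of
    (timestamp, action, metadata) tuples for automated UI testing."""
    trace = []
    t = 0.0
    for _ in range(steps):
        action, delay = model(ui_state)  # model predicts next action + think time
        t += delay
        trace.append((t, action, {"agent": "simulated"}))
    return trace
```

The resulting trace would then be fed to the user interface under test in place of real user input.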

Observer for simulation test and verification

Systems and methods validate the operation of a component of an executable model without inadvertently altering the behavior of the component. The model may be partitioned into a design space and a verification space. The component may be placed in the design space, while an observer for validating the component may be placed in the verification space, and linked to the component. During execution of the model, input or output values for the component may be computed and buffered. Execution of the observer may follow execution of the component. The input or output values may be read out of the buffer, and utilized during execution of validation functionality defined for the observer. Model compilation operations that may inadvertently alter the behavior of the component, such as back propagation of attributes, are blocked between the observer and the component.
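The key separation above (validation logic in a verification space that observes buffered values but has no feedback path into the component) can be illustrated with a small class. The buffering discipline is the point of the sketch; the check predicate and class name are hypothetical.

```python
class BufferedObserver:
    """An observer kept in a separate 'verification space': the
    component's input/output values are buffered during execution, and
    the validation checks run only afterwards, so observation cannot
    alter the component's behavior."""

    def __init__(self, check):
        self._buffer = []
        self._check = check  # validation predicate defined for the observer

    def record(self, value):
        """Called as the model executes: buffer only, no feedback path."""
        self._buffer.append(value)

    def run(self):
        """Execute the observer after the component: read the buffered
        values out and apply the validation functionality to each."""
        return [self._check(v) for v in self._buffer]
```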

Hardware In Loop Testing and Generation of Latency Profiles for Use in Simulation
20210049243 · 2021-02-18

Systems, methods, tangible non-transitory computer-readable media, and devices associated with testing, simulation, or operation of an autonomous device including an autonomous vehicle are provided. For example, a service entity computing system can perform operations including obtaining operating software data associated with operating software of the autonomous vehicle. Log data associated with one or more real-world scenarios can also be obtained. One or more first simulations of the operating software can be performed based on the one or more real-world scenarios. A latency distribution profile associated with the operating software can be generated based on the one or more first simulations. One or more second simulations of the operating software can be performed based on the latency distribution profile and one or more artificially generated scenarios. Furthermore, a real-world behavior of the autonomous vehicle can be predicted based on the one or more second simulations.
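The latency-profile step (summarize latencies observed in the first round of simulations, then draw from that profile during the second, artificially generated round) can be sketched with a simple empirical distribution. Representing the profile as a sorted sample and sampling it uniformly are assumptions; the patent does not specify the profile's form.

```python
import random

def build_latency_profile(latencies):
    """Summarize latencies measured in the first simulations as a sorted
    empirical sample -- a minimal stand-in for a latency distribution
    profile."""
    return sorted(latencies)

def profile_percentile(profile, q):
    """Read a percentile (0..1) off the empirical profile."""
    return profile[int(q * (len(profile) - 1))]

def sample_latency(profile, rng):
    """Draw a latency for a second-round simulation step by sampling
    the empirical profile."""
    return rng.choice(profile)
```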

PROCESSING SCREENSHOTS OF AN APPLICATION USER INTERFACE TO DETECT ERRORS
20210081294 · 2021-03-18

A technique is introduced for detecting errors and other issues in an application graphical user interface (GUI) by applying machine learning to process screenshots of the GUI. In an example embodiment, the introduced technique includes crawling a GUI of a target application as part of an automated testing process. As part of the crawling, an executing computer system can interact with various interactive elements of the GUI and capture various screenshots of the GUI that depict the changing state of the GUI based on the interaction. These screenshots can then be processed using one or more machine learning models to detect errors and/or other issues with the GUI of the application. In some embodiments, the machine learning models can be trained using previously captured and labeled screenshots from other application GUIs.
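The crawl-interact-capture-classify loop can be outlined as follows. The driver callables, the score threshold, and the idea that the model returns an error probability are all assumptions standing in for the GUI automation layer and the trained model.

```python
def crawl_and_screen(elements, interact, capture, classify, threshold=0.5):
    """Crawl a GUI as part of automated testing: interact with each
    interactive element, capture a screenshot of the resulting GUI
    state, and flag screenshots the model scores as likely erroneous.

    `interact`, `capture`, and `classify` are stand-ins for the UI
    driver and the trained screenshot-classification model.
    """
    findings = []
    for element in elements:
        interact(element)          # drive the GUI into a new state
        shot = capture()           # screenshot depicting that state
        score = classify(shot)     # assumed: model returns P(error)
        if score > threshold:
            findings.append((element, score))
    return findings
```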