Patent classifications
G06F9/44578
Systems, media, and methods for identifying loops of or implementing loops for a unit of computation
Systems, media, and methods may identify loops of a unit of computation for performing operations associated with the loops. The systems, media, and methods may receive textual program code that includes a unit of computation comprising a loop (e.g., an explicit or implicit loop). The unit of computation may be identified by an identifier (e.g., a variable name within the textual program code, a text string embedded in the unit of computation, and/or a syntactical pattern that is unique within the unit of computation). A code portion and/or a section thereof may include an identifier referring to the unit of computation, where the code portion and the unit of computation may be at locations independent of each other. The systems, media, and methods may semantically identify a loop that corresponds to the identifier and perform operations on the textual program code using the code portion and/or section.
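As a minimal sketch of the idea (not the patented method), the snippet below uses Python's `ast` module to locate for-loops whose loop variable matches a given identifier, so that code referring to that identifier elsewhere could be associated with the loop. The source string and function name are illustrative assumptions.

```python
import ast

SOURCE = """
total = 0
for value in data:       # loop bound to the identifier 'value'
    total += value
"""

def find_loops_for_identifier(source, identifier):
    """Return line numbers of for-loops whose target variable
    matches the given identifier."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, ast.For):
            target = node.target
            if isinstance(target, ast.Name) and target.id == identifier:
                hits.append(node.lineno)
    return hits

print(find_loops_for_identifier(SOURCE, "value"))  # [3]
```

A real implementation would also resolve identifiers that appear in code portions located far from the loop itself, which a simple AST walk like this does not attempt.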
Combined active and preinitialized resource management for rapid autoscaling
A scaling manager manages deques that track groups of preinitialized instances used to scale respective groups of active compute instances. Various techniques for deque management include one in which a total instance quantity is preconfigured for the total number of instances assigned to both the group and the deque of preinitialized instances. As the group grows during scale-ups, the deque shrinks correspondingly: the deque is not replenished when the group scales up, but does expand when the group scales down. The total instance quantity may be bounded in some examples, and an additional “buffer amount” of preinitialized instances may provide a safety margin for burst scaling, which can be further enhanced in some cases by transferring instances between the data structures of different groups of instances.
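A toy model of the fixed-total technique, under assumed names (`ScalingGroup`, `scale_up`, `scale_down` are illustrative, not the patent's API): the active group and the preinitialized deque share one instance budget, so scale-ups consume from the deque without replenishing it, and scale-downs return instances to it.

```python
from collections import deque

class ScalingGroup:
    """Sketch: active instances and a pre-initialized deque share a
    fixed total instance quantity, plus an optional burst buffer."""

    def __init__(self, total, buffer_amount=0):
        self.active = []
        # Every not-yet-active instance waits pre-initialized in the deque.
        self.preinit = deque(f"inst-{i}" for i in range(total + buffer_amount))

    def scale_up(self):
        # Draw from the deque; it is NOT replenished, keeping the total fixed.
        inst = self.preinit.popleft()
        self.active.append(inst)
        return inst

    def scale_down(self):
        # Returning an instance expands the deque again.
        inst = self.active.pop()
        self.preinit.append(inst)

group = ScalingGroup(total=3, buffer_amount=1)
group.scale_up()
group.scale_up()
print(len(group.active), len(group.preinit))  # 2 2
group.scale_down()
print(len(group.active), len(group.preinit))  # 1 3
```

The `buffer_amount` argument models the abstract's safety margin for burst scaling; transferring instances between groups would amount to moving entries between two such deques.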
Gateway pull model
A computer system comprising: (i) a computer subsystem configured to act as a work accelerator, and (ii) a gateway connected to the computer subsystem, the gateway enabling the transfer of data to the computer subsystem from external storage at pre-compiled data exchange synchronization points attained by the computer subsystem, which act as a barrier between a compute phase and an exchange phase of the computer subsystem. The computer subsystem is configured to pull data from a gateway transfer memory of the gateway in response to a pre-compiled data exchange synchronization point being attained by the subsystem, and the gateway comprises at least one processor configured to pre-load at least some of the data from a first memory of the gateway to the gateway transfer memory in advance of that synchronization point.
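The staging-then-pull pattern can be sketched as follows; this is a toy software model of the hardware flow, with all class and method names (`Gateway`, `Accelerator`, `preload`, `sync_point`) invented for illustration.

```python
class Gateway:
    """Toy model: data is staged from a first memory into a transfer
    memory ahead of each pre-compiled synchronization point."""

    def __init__(self, batches):
        self.first_memory = list(batches)   # data arrived from external storage
        self.transfer_memory = None

    def preload(self):
        # Performed in advance of the next sync point.
        if self.first_memory:
            self.transfer_memory = self.first_memory.pop(0)

class Accelerator:
    def __init__(self, gateway):
        self.gateway = gateway
        self.received = []

    def sync_point(self):
        # Barrier between compute and exchange phases: pull staged data.
        self.received.append(self.gateway.transfer_memory)
        self.gateway.preload()  # gateway immediately stages the next batch

gw = Gateway(["batch-0", "batch-1"])
gw.preload()                    # initial staging before the first barrier
acc = Accelerator(gw)
acc.sync_point()                # compute phase ends; exchange phase pulls
acc.sync_point()
print(acc.received)             # ['batch-0', 'batch-1']
```

The point of pre-loading into the transfer memory is that the accelerator's pull at the barrier never waits on the slower first memory or on external storage.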
Pre-loading of user applications including skipping of selected launch actions
A user device includes a memory and one or more processors. The memory is configured to store one or more user applications installed in the user device. The one or more processors are configured to select one or more operations that are performed during normal launching of a user application but not during background pre-loading of the application; to pre-load the application, skipping the one or more selected operations, before the application is accessed by a user; and to complete the one or more skipped operations in response to the user accessing the application.
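A minimal sketch of deferred launch actions, with hypothetical action names (`play_intro_animation` etc.) chosen for illustration: pre-loading runs every launch action except the selected ones, which are completed only when the user actually opens the app.

```python
class App:
    """Sketch: selected launch actions are skipped during background
    pre-loading and completed when the user opens the app."""

    LAUNCH_ACTIONS = ["load_code", "init_ui", "play_intro_animation"]
    SKIP_DURING_PRELOAD = {"play_intro_animation"}  # e.g. user-visible steps

    def __init__(self):
        self.done = []
        self.skipped = []

    def preload(self):
        for action in self.LAUNCH_ACTIONS:
            if action in self.SKIP_DURING_PRELOAD:
                self.skipped.append(action)   # defer for later
            else:
                self.done.append(action)

    def on_user_access(self):
        # Complete the deferred actions now that the user opened the app.
        self.done.extend(self.skipped)
        self.skipped = []

app = App()
app.preload()
print(app.done)          # ['load_code', 'init_ui']
app.on_user_access()
print(app.done)          # ['load_code', 'init_ui', 'play_intro_animation']
```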
Method for application processing, storage medium, and electronic device
A method for application processing, a storage medium, and an electronic device are provided. The method includes: obtaining historical operation information of the electronic device; obtaining triggering probability values of a plurality of applications in an application platform installed in the electronic device based on the historical operation information; selecting an application with a triggering probability value greater than a first preset probability value as a target application; downloading resource files of the target application; buffering the resource files into a storage area corresponding to the application platform; and loading the resource files stored in the storage area and corresponding to the target application, in response to detecting a triggering operation on the target application.
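The selection step can be sketched with a deliberately simple probability model (launch frequency over the whole history; the real method presumably uses richer historical operation information):

```python
from collections import Counter

def select_targets(launch_history, threshold):
    """Estimate each app's triggering probability from launch history
    and pick those above the first preset probability value."""
    counts = Counter(launch_history)
    total = len(launch_history)
    probs = {app: n / total for app, n in counts.items()}
    return sorted(app for app, p in probs.items() if p > threshold)

history = ["chat", "chat", "chat", "maps", "chat", "game", "maps", "chat"]
print(select_targets(history, threshold=0.3))  # ['chat']  (5/8 = 0.625)
```

The apps returned here would be the "target applications" whose resource files get downloaded and buffered ahead of any triggering operation.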
Background pre-rendering of user applications
A user device includes a display screen and one or more processors. The display screen is configured to display content to a user. The one or more processors are configured to pre-load a user application by running at least part of a program code of the user application in a background, including enabling the program code of the user application to pre-render a visual display of the user application in the background, and, in response to the user accessing the user application, to transfer the pre-rendered visual display to a foreground, thereby displaying the visual display to the user on the display screen.
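The render-offscreen-then-transfer flow resembles double buffering; the sketch below models it with plain strings standing in for rendered surfaces (all names are illustrative, not the patented mechanism).

```python
class PrerenderedApp:
    """Sketch: the first screen is rendered in the background during
    pre-load, then the finished surface is moved to the foreground."""

    def __init__(self):
        self.background_surface = None
        self.foreground_surface = None

    def preload(self):
        # Run enough of the app's code to draw its first screen off-screen.
        self.background_surface = "rendered:home_screen"

    def on_user_access(self):
        # Transfer, don't redraw: the pre-rendered surface becomes visible.
        self.foreground_surface = self.background_surface
        self.background_surface = None

app = PrerenderedApp()
app.preload()
app.on_user_access()
print(app.foreground_surface)  # rendered:home_screen
```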
MEMORY OVERLAY USING A HOST MEMORY BUFFER
Two or more overlay sections are copied from a non-volatile memory device of a memory sub-system to a host memory buffer residing on a first volatile memory device of a host system in communication with the memory sub-system. Each overlay section includes a respective set of executable instructions. A first overlay section is copied from the host memory buffer to a second memory buffer residing on a second volatile memory device of the memory sub-system. A first set of executable instructions included in the first overlay section residing in the second memory buffer is executed. A second overlay section is then copied from the host memory buffer to the second memory buffer, and a second set of executable instructions included in the second overlay section residing in the second memory buffer is executed.
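A toy model of the overlay flow, with callables standing in for executable sections (the names `nonvolatile`, `host_memory_buffer`, and `device_buffer` are illustrative): the small device-side buffer holds one section at a time, while the larger host buffer caches them all.

```python
# Toy model of overlaying: the device-side buffer holds only one section
# at a time, while the larger host memory buffer caches them all.
nonvolatile = {
    "overlay_0": lambda state: state + ["ran section 0"],
    "overlay_1": lambda state: state + ["ran section 1"],
}

host_memory_buffer = dict(nonvolatile)   # step 1: copy all sections to host
device_buffer = None                     # small volatile buffer on the device
state = []

for name in ("overlay_0", "overlay_1"):
    device_buffer = host_memory_buffer[name]  # step 2: copy one section over
    state = device_buffer(state)              # step 3: execute its instructions

print(state)  # ['ran section 0', 'ran section 1']
```

The design point is that swapping overlays pulls from fast host DRAM rather than from the non-volatile device, at the cost of the initial bulk copy.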
SORTING OPTIMIZATION BASED ON USER'S TIME PREFERENCES AND HABITS
Systems and methods are discussed for automatically optimizing application and notification delivery based on user preferences and historical application usage. Applications that a user is likely to want at the present time or in the near future are displayed in an organizationally distinct way in an application catalog, so they are easy to find, and are pre-loaded on an application delivery server so they are available with minimal system lag from application loading processes. Application notifications are similarly optimized: notifications likely to be relevant to the user at the current time are identified and presented in an organizationally distinct way.
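One simple way to model "likely to want at the present time" is to rank apps by how often the user launched them during the same hour of day; the sketch below assumes that model (the real system may weigh preferences and habits differently).

```python
from collections import defaultdict

def rank_apps_for_hour(usage_log, hour):
    """Rank apps by how often the user launched them in this hour of
    the day, so likely-wanted apps can be surfaced and pre-loaded first."""
    scores = defaultdict(int)
    for app, h in usage_log:
        if h == hour:
            scores[app] += 1
    return sorted(scores, key=lambda app: (-scores[app], app))

log = [("mail", 9), ("news", 9), ("mail", 9), ("game", 21), ("mail", 21)]
print(rank_apps_for_hour(log, hour=9))   # ['mail', 'news']
print(rank_apps_for_hour(log, hour=21))  # ['game', 'mail']
```

The top-ranked apps would be the ones displayed distinctly in the catalog and pre-loaded on the delivery server.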
Application preloading method and device, storage medium and terminal
Embodiments of the disclosure provide an application (APP) preloading method and device, a storage medium, and a terminal. The method includes: when an APP preloading event is detected to be triggered, determining an APP to be preloaded; judging whether a first APP is currently being started in the foreground; and, if not, preloading the APP to be preloaded. With this technical solution, the slow and non-fluent foreground starting that preloading can cause may be effectively avoided, and the APP to be preloaded may be preloaded, and therefore later started, more quickly.
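The yield-to-foreground check reduces to a small guard; the function and return values below are illustrative placeholders, not the patent's interface.

```python
def maybe_preload(app_to_preload, foreground_starting):
    """Preload only when no app is currently being started in the
    foreground, so the visible launch is never slowed down."""
    if foreground_starting is not None:
        return f"deferred:{app_to_preload}"   # yield to the foreground start
    return f"preloaded:{app_to_preload}"

print(maybe_preload("browser", foreground_starting="camera"))  # deferred:browser
print(maybe_preload("browser", foreground_starting=None))      # preloaded:browser
```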
SYSTEM AND METHOD FOR AUTOMATIC GENERATION AND MANAGEMENT OF FEATURE LEVEL APPLICATION DIRECTORY
Embodiments of the present invention provide a system for automatically generating and managing application directories of one or more applications associated with an entity. The system is configured for identifying initiation of packaging of one or more program codes associated with at least one application, scanning the one or more program codes to identify one or more parameters associated with the one or more program codes, and automatically generating an application directory associated with the at least one application based at least on the one or more parameters identified by scanning the one or more program codes, wherein the one or more parameters comprise one or more dependencies, one or more screens, one or more permissions, one or more services, one or more navigational parameters, one or more base classes, one or more logging frameworks, and one or more static analyzers.