G06F16/24568

INFORMATION PROCESSING METHOD, NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM, AND STREAM PROCESSING SYSTEM

An information processing method includes accepting a query including first information for specifying a target period and a condition for extracting a processing result; selecting, based on the query, a processing unit whose processing result was updated in the target period from among a plurality of processing units included in a stream processing infrastructure, by referring to correspondence information representing, in association with each other, second information indicating each of the plurality of processing units and third information indicating a timing of a last update of a processing result by the processing unit; and transmitting, to the selected processing unit, a request to execute a predetermined process on the processing result that has been updated in the target period and satisfies the condition.
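The selection step above can be illustrated with a minimal sketch. The correspondence information is modeled as a mapping from processing-unit identifiers to last-update timestamps; all names (`correspondence`, `select_updated_units`, the unit ids) are hypothetical and not taken from the patent.

```python
from datetime import datetime

# Hypothetical correspondence information: processing-unit id -> timing of
# the last update of that unit's processing result.
correspondence = {
    "unit-a": datetime(2024, 1, 10, 12, 0),
    "unit-b": datetime(2024, 1, 15, 9, 30),
    "unit-c": datetime(2024, 1, 20, 18, 45),
}

def select_updated_units(correspondence, period_start, period_end):
    """Select the processing units whose last update falls within the target period."""
    return [unit for unit, last_update in correspondence.items()
            if period_start <= last_update <= period_end]

# A query whose target period spans Jan 12 - Jan 18 selects only unit-b;
# a request would then be transmitted to that unit alone.
selected = select_updated_units(correspondence,
                                datetime(2024, 1, 12),
                                datetime(2024, 1, 18))
```

Filtering on the correspondence table avoids querying units whose results cannot have changed in the period of interest.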

MULTILAYER PROCESSING ENGINE IN A DATA ANALYTICS SYSTEM

Methods, systems, and computer storage media for providing a multilayer processing engine of a multilayer processing system. The multilayer processing engine supports an event layer, a metadata layer, and a multi-tier processing layer. The metadata layer is a functional layer that operates within a sequential hierarchy of functional layers (i.e., the event layer and the multi-tier processing layer) to analyze incoming event streams and configure a downstream processing configuration. The metadata layer provides for dynamic metadata-based configuration of downstream processing of data associated with the event layer and the multi-tier processing layer. The multilayer processing system can be a data analytics system operating via a serverless distributed computing system. The data analytics system implements the multilayer processing engine as a serverless data analytics management engine for processing high-frequency data at scale based on dynamically generated processing code, generated from a downstream processing configuration, that supports automatically processing the data.

Interfaces for data monitoring and event response

A computing device is coupled to a display device, and includes a data monitoring software application program executing on a processor within a data monitoring system. Via the data monitoring software application program, various techniques are performed for generating user interfaces for data monitoring and event response. In a first technique, the data monitoring software application program displays a user interface that includes a first region including a data visualization and a second region including one or more images of a video stream. In a second technique, the data monitoring software application program generates a user interface associated with an event, receives an input corresponding to interaction with a user interface element in the user interface, and initiates an event channel associated with the event in response to the input.

Method and system for implementing subscription barriers in a distributed computation system

Embodiments of the invention relate to a method for managing subscriptions. The method includes initiating execution of a first subscription; in response to the initiating, obtaining a first subscription barrier associated with the first subscription; making, using the first subscription barrier, a first determination to block execution of a first query request associated with the first subscription; and, in response to the first determination, ceasing execution of the first subscription.
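The barrier mechanism described can be sketched as a small gate object that a query-request path consults before executing; the class and function names here are hypothetical, not from the patent.

```python
class SubscriptionBarrier:
    """Blocks or permits query-request execution for an associated subscription."""
    def __init__(self):
        self.blocked = False

    def block(self):
        self.blocked = True

    def release(self):
        self.blocked = False

def execute_query_request(barrier, request):
    # The "first determination": if the barrier blocks execution,
    # cease executing the subscription's query request.
    if barrier.blocked:
        return None
    return f"executed {request}"

barrier = SubscriptionBarrier()
barrier.block()
blocked_result = execute_query_request(barrier, "query-1")   # execution ceased
barrier.release()
released_result = execute_query_request(barrier, "query-1")  # execution proceeds
```

In a distributed system the barrier state would live in shared metadata rather than a local flag, but the control flow is the same.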

Method and apparatus for processing data

A user device has a plurality of modules which support an application such as a gaming application. The user device has a stream processing module which is able to stream-process events that are generated, for example, when the application is run. The events generated by the modules are passed to an event module which distributes the events to the other modules.
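The event-distribution pattern described resembles a simple in-process event bus: each module hands its events to a central event module, which forwards them to every other module. A minimal sketch, with all names (`EventModule`, `Module`, the event string) invented for illustration:

```python
class Module:
    """A hypothetical device module that records the events it receives."""
    def __init__(self, name):
        self.name = name
        self.received = []

class EventModule:
    """Receives events from modules and distributes them to the other modules."""
    def __init__(self):
        self.modules = []

    def register(self, module):
        self.modules.append(module)

    def publish(self, source, event):
        # Distribute the event to every registered module except its source.
        for module in self.modules:
            if module is not source:
                module.received.append(event)

bus = EventModule()
game, audio, stats = Module("game"), Module("audio"), Module("stats")
for m in (game, audio, stats):
    bus.register(m)

# The game module generates an event; the event module fans it out.
bus.publish(game, "level-complete")
```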

Hybrid clouds

Systems and methods may create and manage hybrid clouds including both standard compute nodes and edge devices. Edge devices can be enrolled in a hybrid cloud by deploying a lightweight container to the edge device.

Distribution of data packets with non-linear delay

A computer system receives a data stream with a plurality of packets. In response to receiving the data stream with the plurality of packets, the computer system distributes individual packets of the plurality of packets to the inputs of each of a plurality of processing nodes. Each respective processing node has a local queue storing a respective number of packets to be processed by the respective processing node. Distributing a respective packet of the plurality of packets to the inputs of each of the plurality of processing nodes includes delaying sending the respective packet to each of the plurality of processing nodes by a delay time that is a non-linear function of an average number of packets in the local queues of the respective processing nodes.
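The key point of the abstract above is that the per-packet send delay is a non-linear function of the average local-queue depth across the processing nodes. A minimal sketch of one such function (quadratic growth, with the function name and the `base_delay` constant chosen purely for illustration):

```python
def delay_time(local_queue_lengths, base_delay=0.001):
    """Delay (seconds) that grows quadratically, i.e. non-linearly,
    with the average number of packets in the nodes' local queues."""
    avg = sum(local_queue_lengths) / len(local_queue_lengths)
    return base_delay * avg ** 2

# Deeper queues increase the delay faster than linearly:
shallow = delay_time([1, 2, 3])   # average depth 2.0
deep = delay_time([4, 5, 6])      # average depth 5.0 -> 6.25x the delay
```

With a linear function, a 2.5x deeper average queue would yield a 2.5x delay; the quadratic form back-pressures loaded nodes much harder.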

Search infrastructure

A system for real-time search, including: a set of partitions, each including a set of segments, each segment corresponding to a time slice of messages posted to a messaging platform, and a real-time search engine configured to receive a search term in parallel with the other partitions in the set of partitions and to search at least one of the set of segments in reverse chronological order of the corresponding time slice to identify document identifiers of messages containing the search term; and a search fanout module configured to: receive a search query including the search term; send the search term to each of the set of partitions for parallel searching; and return, in response to the search query, at least one of the identified document identifiers of messages containing the search term.
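The partition/segment/fanout structure can be sketched as follows. Each partition is modeled as a mapping from time-slice number to a segment of (document id, text) pairs; segments are scanned newest-first. All identifiers and the sequential stand-in for parallel fanout are assumptions for illustration.

```python
# Two hypothetical partitions; keys are time slices, newest have larger numbers.
partitions = [
    {1: [("m1", "stream processing"), ("m2", "edge devices")],
     3: [("m5", "stream search")]},
    {2: [("m3", "hybrid cloud")],
     4: [("m6", "search fanout stream")]},
]

def search_partition(partition, term):
    """Search a partition's segments in reverse chronological order of time slice."""
    return [doc_id
            for time_slice in sorted(partition, reverse=True)
            for doc_id, text in partition[time_slice]
            if term in text]

def fanout_search(partitions, term):
    # The fanout module sends the term to every partition (a real system
    # does this in parallel) and merges the identified document ids.
    results = []
    for partition in partitions:
        results.extend(search_partition(partition, term))
    return results

hits = fanout_search(partitions, "stream")
```

Within each partition, newer messages surface before older ones, which is the behavior a real-time messaging search wants.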

Automated honeypot creation within a network

Systems and methods for managing Application Programming Interfaces (APIs) are disclosed. Systems may involve automatically generating a honeypot. For example, the system may include one or more memory units storing instructions and one or more processors configured to execute the instructions to perform operations. The operations may include receiving, from a client device, a call to an API node and classifying the call as unauthorized. The operations may include sending the call to a node-imitating model associated with the API node and receiving, from the node-imitating model, synthetic node output data. The operations may include sending a notification based on the synthetic node output data to the client device.
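The routing logic described, classify the call, divert unauthorized calls to a node-imitating model, and return its synthetic output, can be sketched as below. The authorization check and the model stub are hypothetical stand-ins (a real node-imitating model would be trained to mimic the actual API node).

```python
# Hypothetical set of authorized API keys used to classify incoming calls.
AUTHORIZED_KEYS = {"key-123"}

def node_imitating_model(call):
    """Stand-in for a model that imitates the real API node's responses."""
    return {"status": "ok", "data": f"synthetic response to {call['endpoint']}"}

def handle_api_call(call):
    # Classify the call; authorized calls reach the real node.
    if call.get("api_key") in AUTHORIZED_KEYS:
        return {"status": "ok", "data": "real node output"}
    # Unauthorized calls are sent to the node-imitating model, and a
    # response based on its synthetic output goes back to the client,
    # so the caller cannot tell it hit a honeypot.
    return node_imitating_model(call)

honeypot_resp = handle_api_call({"api_key": "bad-key", "endpoint": "/accounts"})
```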

Switch event ordering

Examples disclosed herein relate to a method comprising detecting a plurality of changes in a database, wherein the database is used to configure a switch operating traffic on a network. The method may include determining that a subset of the plurality of changes are to be deferred before being used to configure the switch, wherein each change in the subset has a potential dependency with at least one other change in the subset. The method may also include iterating through each change in the subset. The iteration may include confirming that a target change has a dependency with another change in the subset, resolving the dependency, and transmitting the target change to an object manager for configuration of the switch.
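The iteration described, resolve each deferred change's dependencies before transmitting it, amounts to a dependency-ordered traversal. A minimal sketch, assuming the dependency graph is acyclic and representing "transmitting to the object manager" as appending to a list; all names are invented:

```python
def order_changes(changes, depends_on):
    """Iterate deferred changes, resolving each change's dependencies
    before handing it to the (here, simulated) object manager."""
    transmitted = []

    def resolve(change):
        if change in transmitted:
            return  # already resolved and transmitted
        # Confirm the target change's dependencies and resolve them first.
        for dep in depends_on.get(change, []):
            resolve(dep)
        # Then transmit the change for switch configuration.
        transmitted.append(change)

    for change in changes:
        resolve(change)
    return transmitted

# "set-vlan" depends on "create-vlan"; "set-port" depends on "set-vlan".
order = order_changes(
    ["set-port", "set-vlan", "create-vlan"],
    {"set-port": ["set-vlan"], "set-vlan": ["create-vlan"]},
)
```

Regardless of detection order, a change is never transmitted before the changes it depends on.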