H04L67/289

SERVICE PROCESSING METHOD AND APPARATUS, AND STORAGE MEDIUM
20230049501 · 2023-02-16

A service processing method, performed by a cloud application management server, includes: upon receiving an allocation request from a target terminal, acquiring N pieces of selection reference information corresponding to a pending edge server and related to the target terminal and running reference information, the pending edge server being one of P edge servers connected to the cloud application management server; upon determining that the pending edge server meets a requirement of providing a running service of a target cloud application for the target terminal, determining a connection reference score corresponding to the pending edge server; storing the connection reference score and identification information about the pending edge server into a candidate set; and transmitting the candidate set to the target terminal.
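The candidate-set construction described above can be sketched as follows. This is a minimal illustrative sketch, not the patented method: the class names, the admission thresholds, and the scoring formula are all assumptions standing in for the unspecified "selection reference information" and "connection reference score".

```python
from dataclasses import dataclass

@dataclass
class EdgeServer:
    server_id: str       # identification information about the pending edge server
    latency_ms: float    # selection reference info related to the target terminal
    cpu_load: float      # running reference info, 0.0 (idle) to 1.0 (saturated)

def meets_requirement(server: EdgeServer, max_latency_ms: float = 50.0,
                      max_load: float = 0.8) -> bool:
    """Check whether the server can provide the running service (illustrative thresholds)."""
    return server.latency_ms <= max_latency_ms and server.cpu_load <= max_load

def connection_reference_score(server: EdgeServer) -> float:
    """Illustrative scoring: lower latency and lower load yield a higher score."""
    return 1.0 / (1.0 + server.latency_ms * (1.0 + server.cpu_load))

def build_candidate_set(servers: list[EdgeServer]) -> list[tuple[str, float]]:
    """Score each qualifying pending server and store (id, score) pairs, best first."""
    candidates = [(s.server_id, connection_reference_score(s))
                  for s in servers if meets_requirement(s)]
    return sorted(candidates, key=lambda c: c[1], reverse=True)
```

The resulting list of (identifier, score) pairs is what would be transmitted to the target terminal, which can then pick a server itself.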

PROXIED PUSH
20180013851 · 2018-01-11

A system and method are described for establishing two-way push communication between an intermediate or companion device and a mobile device. Mobile devices register to listen for push notifications delivered through a push notification service from a specified set of providers. The presence of the mobile devices is delivered to the push notification service that maps the mobile devices to connections made between their respective companion devices and the push notification service. If the push notification service determines that a mobile device is “online,” in response to receiving a push notification for the mobile device, a current network connection over which a companion device is listening for push notifications is identified and the push notification is forwarded to the companion device. The companion device then can deliver the push notification to the mobile device.
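The forwarding step can be sketched as a routing table kept by the push service: each mobile device is mapped to the network connection its companion device is listening on. Everything here (class and method names, the `deliver` callback) is an illustrative assumption, not the actual service's API.

```python
class PushNotificationService:
    """Illustrative sketch: map mobile devices to their companions' connections."""

    def __init__(self):
        self.companion_connection = {}   # mobile_id -> companion's current connection
        self.online = set()              # mobile devices whose presence was delivered

    def register(self, mobile_id, companion_conn):
        """Record that a companion is listening for pushes on behalf of a mobile."""
        self.companion_connection[mobile_id] = companion_conn
        self.online.add(mobile_id)

    def push(self, mobile_id, payload):
        """Forward a push for an 'online' mobile via its companion's connection."""
        if mobile_id not in self.online:
            return False
        conn = self.companion_connection[mobile_id]
        conn.deliver(mobile_id, payload)   # companion then relays to the mobile
        return True
```

The key design point the abstract describes is that the push service never needs a direct connection to the mobile device; the companion's existing connection is reused for delivery.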

Distributed transmission of messages in a communication network with selective multi-region replication
11711437 · 2023-07-25

To distribute messages to subscribers that are located in multiple regions, a data center will receive messages that are to be published to the subscribers. The data center will then perform a limited replication process to other data centers in other regions by accessing an interest map indicating additional data centers at which subscribers have recently expressed interest. The data center will transmit the messages to only that group of additional data centers for replication, but not to other data centers at which no interest has been recently expressed.
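The selective-replication step can be sketched as below: consult an interest map of data centers where subscribers recently expressed interest (here, within a TTL window, an assumed freshness policy), and replicate only to those. The names and the TTL mechanism are illustrative, not taken from the patent.

```python
import time

class InterestMap:
    """Illustrative interest map: data center -> time interest was last expressed."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self.last_interest = {}   # data_center -> timestamp of last interest

    def record_interest(self, data_center: str, now: float = None):
        self.last_interest[data_center] = time.time() if now is None else now

    def interested_centers(self, now: float = None) -> set:
        """Centers whose interest is recent enough to warrant replication."""
        now = time.time() if now is None else now
        return {dc for dc, t in self.last_interest.items() if now - t <= self.ttl}

def replicate(message: str, all_centers: set, interest: InterestMap,
              send, now: float = None) -> set:
    """Transmit the message only to centers with recently expressed interest."""
    targets = interest.interested_centers(now) & all_centers
    for dc in targets:
        send(dc, message)
    return targets
```

Centers with no recent interest simply never receive the message, which is the bandwidth saving the abstract claims over full multi-region replication.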

EDGE ARTIFICIAL INTELLIGENCE (AI) COMPUTING IN A TELECOMMUNICATIONS NETWORK

Disclosed herein is the integration of a client computer system and a server computer system into edge nodes of a telecommunications network system, where the server computer system includes a pool of shareable accelerators and the client computer system runs an application program that is assisted by the pool of accelerators. The edge nodes connect to user equipment, and some of the user equipment can themselves act as client computer systems. In some embodiments, the accelerators are GPUs; in other embodiments, they are artificial intelligence accelerators.
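A minimal sketch of the shareable accelerator pool on the edge-node server, assuming a simple acquire/release protocol; the class name and the idea of returning `None` when the pool is exhausted are illustrative assumptions rather than anything specified in the abstract.

```python
class AcceleratorPool:
    """Illustrative pool of shareable accelerators (e.g. GPU or AI-accelerator handles)."""

    def __init__(self, accelerators):
        self.free = list(accelerators)   # handles available to client applications
        self.in_use = set()

    def acquire(self):
        """Hand a free accelerator to a client application, or None if exhausted."""
        if not self.free:
            return None
        acc = self.free.pop()
        self.in_use.add(acc)
        return acc

    def release(self, acc):
        """Return an accelerator to the pool once the client's work is done."""
        self.in_use.discard(acc)
        self.free.append(acc)
```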

Edge computing platform capability discovery

Systems and methods for establishing a connection with an edge application server are provided. A user equipment (UE) in a wireless communication network establishes a connection with an edge application server to offload the data processing of an application executing on the UE to the edge application server. The UE communicates key performance indicators (KPIs) associated with the application to the edge data network. The KPIs indicate the resources that the application uses to process the data. In response, the UE receives edge application server parameters from multiple servers in the edge data network that meet or exceed the KPIs. The parameters include compute, graphical compute, memory, and storage parameters with various levels of specificity. The UE selects one of the edge application servers to process the data on behalf of the application based on the parameters.

Gracefully handling endpoint feedback when starting to monitor

A method, system and computer-usable medium for adaptively assessing risk associated with an endpoint, comprising: determining a risk level corresponding to an entity associated with an endpoint; selecting a frequency and a duration of an endpoint monitoring interval; collecting user behavior associated with the entity for the duration of the endpoint monitoring interval via the endpoint; processing the user behavior to generate a current risk score for the entity; comparing the current risk score of the user to historical risk scores to determine whether the risk score of the user has changed; and changing the risk score of the user to the current risk score when it has changed.
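The compare-and-update step at the end of the claim can be sketched as below. The scoring function (a plain average of event severities) is a toy stand-in; the abstract does not say how behavior is turned into a score.

```python
def current_risk_score(behavior_events: list) -> float:
    """Toy scoring assumption: average severity of events observed in the interval."""
    if not behavior_events:
        return 0.0
    return sum(e["severity"] for e in behavior_events) / len(behavior_events)

def update_risk(history: list, behavior_events: list) -> list:
    """Compare the current score to the latest historical score; record it only on change."""
    score = current_risk_score(behavior_events)
    if not history or history[-1] != score:
        return history + [score]
    return history
```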
