G06F21/62

Subject-Level Granular Differential Privacy in Federated Learning
20230052231 · 2023-02-16 ·

Group-level privacy preservation is implemented within federated machine learning. An aggregation server may distribute a machine learning model to multiple users, each having a respective private dataset. A private dataset may include multiple items associated with a single group. Individual users may train the model using their local, private dataset to generate one or more parameter updates and to determine a count of the largest number of items associated with any single group in the dataset. Parameter updates generated by the individual users may be modified by applying respective noise values to individual ones of the parameter updates according to the respective counts, to ensure differential privacy for the groups of the dataset. The aggregation server may aggregate the updates into a single set of parameter updates to update the machine learning model.
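
The count-dependent noising described above can be sketched in a few lines of Python. This is a minimal illustration, not the patented method: the function name, the clipping step, and the linear scaling of noise by the largest group count are all illustrative assumptions.

```python
import random

def privatize_update(update, max_group_count, clip_norm=1.0, base_sigma=0.5):
    """Clip a local parameter update, then add Gaussian noise scaled by the
    largest number of items any single group contributes to the dataset
    (illustrative sketch of the count-dependent noising in the abstract)."""
    # Clip the update so no single client dominates the aggregate.
    norm = sum(v * v for v in update) ** 0.5
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    clipped = [v * scale for v in update]
    # More items from one group means higher sensitivity, hence more noise.
    sigma = base_sigma * max_group_count
    return [v + random.gauss(0.0, sigma) for v in clipped]
```

A client would call this on its locally computed update before transmission, so the aggregation server only ever sees noised values.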

CORRELATION OF A VIRTUAL MACHINE TO A HOST WITHIN A VIRTUAL DOMAIN

Aspects of the subject disclosure may include, for example, identifying a request to install a guest virtual machine on a physical host; identifying a UUID of the physical host; generating a virtual machine reference value; defining a modified UUID of the guest virtual machine comprising the UUID of the physical host and the virtual machine reference value; and assigning the modified UUID to the guest virtual machine, the physical host being identifiable via the modified UUID of the guest virtual machine. Other embodiments are disclosed.
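
One simple way to realize such a modified UUID is to concatenate the host's UUID with a per-VM reference value, so the host remains recoverable from the guest's identifier. The encoding below is a hypothetical sketch; the patent does not specify this format.

```python
import uuid

def modified_vm_uuid(host_uuid: str, vm_ref: int) -> str:
    """Embed the physical host's UUID plus a per-VM reference value
    (hypothetical encoding: host UUID, then a hex reference suffix)."""
    return f"{host_uuid}-{vm_ref:04x}"

def host_of(vm_uuid: str) -> str:
    # The host UUID is everything before the trailing reference value.
    return vm_uuid.rsplit("-", 1)[0]
```

With this scheme, an administrator holding only a guest's UUID can identify the physical host by stripping the reference suffix.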

User-level Privacy Preservation for Federated Machine Learning

User-level privacy preservation is implemented within federated machine learning. An aggregation server may distribute a machine learning model to multiple users, each having a respective private dataset. Individual users may train the model using their local, private dataset to generate one or more parameter updates. Prior to sending the generated parameter updates to the aggregation server for incorporation into the machine learning model, a user may modify the parameter updates by applying respective noise values to individual ones of the parameter updates to ensure differential privacy for the dataset private to the user. The aggregation server may then receive the respective modified parameter updates from the multiple users and aggregate them into a single set of parameter updates to update the machine learning model. The federated machine learning may further include iteratively performing the sending, training, modifying, receiving, aggregating, and updating steps.
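
The train-then-noise-then-aggregate round can be sketched as follows. The one-parameter linear model, the learning rate, and the noise scale are illustrative assumptions, not details from the abstract; the key point is that noise is added on the client before the update leaves it, and the server only averages.

```python
import random

def client_update(model, data, lr=0.1, sigma=0.1):
    # Local training step: gradient of squared error for a 1-D linear
    # model y = model * x, with Gaussian noise added before the update
    # leaves the client (user-level differential privacy sketch).
    grad = sum(2 * (model * x - y) * x for x, y in data) / len(data)
    return -lr * grad + random.gauss(0.0, sigma)

def federated_round(model, clients):
    # The server averages the already-noised updates into one global step.
    updates = [client_update(model, data) for data in clients]
    return model + sum(updates) / len(updates)
```

Iterating `federated_round` implements the repeated sending/training/modifying/aggregating loop the abstract describes.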

SYSTEM AND PLATFORM FOR DEIDENTIFIED AND DECENTRALIZED SOCIAL GAMING VIA THE BLOCKCHAIN

A system, method, device, and platform for performing gaming utilizing a blockchain. A player profile is created in response to information received from a player. Gaming information associated with one or more games is received from the player. Selections are assigned to the player for the one or more games utilizing the player profile and the gaming information. Winners and a host of each of the one or more games are compensated once the one or more games are completed. The player profile, the gaming information, the selections, and the winners are documented utilizing the blockchain.

ODOMETER FRAUD DETECTION VIA DATA STORAGE ON COMPONENTS

An example operation includes one or more of: incrementing an epoch value related to a transport event, transmitting the incremented epoch value to at least one component on the transport, receiving an odometer reading comprising the epoch value, and comparing the epoch value of the odometer reading with the incremented epoch value to determine whether the odometer reading is valid.
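
The epoch cross-check can be sketched as a small state machine: the transport increments an epoch on each event, mirrors it onto its components, and later rejects any odometer reading stamped with a stale epoch. Class and method names here are illustrative only.

```python
class Transport:
    """Illustrative sketch of the epoch scheme: a counter incremented on
    transport events and transmitted to components, so a rolled-back
    odometer reading carries a mismatched (stale) epoch."""

    def __init__(self):
        self.epoch = 0
        self.components = {}  # component name -> last epoch transmitted

    def on_event(self):
        # Increment the epoch and transmit it to every registered component.
        self.epoch += 1
        for name in self.components:
            self.components[name] = self.epoch

    def reading_valid(self, component, reading_epoch):
        # A reading is valid only if its epoch matches what was transmitted;
        # a stale epoch suggests tampering or rollback.
        return self.components.get(component) == reading_epoch
```

An odometer reset or replacement would surface as a reading whose embedded epoch lags the transport's current epoch.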

AUTOMATICALLY ASSIGNING DATA PROTECTION POLICIES USING ANONYMIZED ANALYTICS
20230052851 · 2023-02-16 ·

Embodiments are described for a system and method of selecting data protection policies for a new system by collecting user, policy, and asset metadata for a plurality of other users storing data governed by one or more protection policies. The collected metadata is anonymized with respect to personally identifying information and is stored in an anonymized analytics database. The system receives specific user, policy, and asset metadata for the new system from a specific user, and matches the received metadata against the collected metadata to identify an optimum protection policy of the one or more protection policies based on the assets and protection requirements of the new system. The new system is then configured with the identified optimum protection policy as an initial configuration.
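
The matching step might be realized as a simple similarity score over the anonymized metadata, as in the sketch below. The field names, the overlap-count scoring, and the database shape are all assumptions for illustration; the patent does not prescribe a particular matching algorithm.

```python
def match_policy(new_meta, anonymized_db):
    """Return the protection policy of the anonymized entry whose metadata
    best matches the new system (illustrative overlap-count scoring)."""
    def score(entry):
        # One point per metadata field shared with the new system.
        return sum(1 for k, v in new_meta.items() if entry["meta"].get(k) == v)
    return max(anonymized_db, key=score)["policy"]
```

The winning policy would then be applied as the new system's initial configuration.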

Extensible platform for orchestration of data with enhanced security
20230046370 · 2023-02-16 ·

In a computer system, an orchestration platform includes extensible components that interact with external systems and technology. The platform is secured by means of architectural features, encryption, and access control.

System, method, and computer program for centralized consent management

A system, method, and computer program product are provided for centralized consent management. In operation, the consent management system receives user selections from a user indicating which user data is capable of being utilized for analysis by a company. The consent management system stores the user selections of which user data is capable of being utilized for analysis by the company in a consent database. The consent management system generates a consent vector corresponding to the user selections of which user data is capable of being utilized for analysis by the company. Additionally, the consent management system associates the consent vector with a consent vector identification. Further, the consent management system tags incoming data with the consent vector identification to associate a user consent with the incoming data. The consent management system stores and encodes the incoming data. Moreover, the consent management system enforces consent restrictions by conditionally allowing access to the stored data by the company based on corresponding consent vector identifications.
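
A consent vector lends itself to a bitmask representation: one bit per data category, an identification that tags incoming records, and an enforcement check against the stored vector. The category names and the bitmask encoding below are illustrative assumptions, not details from the abstract.

```python
CATEGORIES = ["location", "purchases", "browsing"]  # illustrative categories

def make_consent_vector(allowed):
    # One bit per data category the user has consented to share.
    return sum(1 << i for i, c in enumerate(CATEGORIES) if c in allowed)

consent_store = {}  # consent vector identification -> consent vector

def tag(record, vector):
    # Tag incoming data with an id that points back at the user's consent.
    vid = len(consent_store)
    consent_store[vid] = vector
    record["consent_id"] = vid
    return record

def access_allowed(record, category):
    # Enforce consent: allow access only if the category's bit is set
    # in the vector referenced by the record's tag.
    vector = consent_store[record["consent_id"]]
    return bool(vector & (1 << CATEGORIES.index(category)))
```

Because records carry only the identification, consent can be centrally updated without rewriting stored data.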

Intelligent data protection

A technological approach can be employed to protect data. Datasets from distinct computing environments of an organization can be scanned to identify data elements subject to protection, such as sensitive data. The identified elements can be automatically protected such as by masking, encryption, or tokenization. Data lineage including relationships amongst data and linkages between computing environments can be determined along with data access patterns to facilitate understanding of data. Further, personas and exceptions can be determined and employed as bases for access recommendations.
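
Two of the protection mechanisms named above, masking and tokenization, can be sketched briefly. These helpers are illustrative only: the masking rule (hide the local part of an email) and the salted-hash tokenization are common techniques, not the specific methods of this disclosure.

```python
import hashlib

def mask_email(value):
    # Masking: hide the local part of an email, keep the domain visible.
    local, _, domain = value.partition("@")
    return "*" * len(local) + "@" + domain

def tokenize(value, salt="demo-salt"):
    # Tokenization: replace a sensitive value with a stable surrogate
    # token (here, a truncated salted hash) that is useless on its own.
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]
```

Identified sensitive elements would be passed through helpers like these automatically after scanning.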

Query-based database redaction
11580251 · 2023-02-14 ·

Embodiments of the present disclosure describe systems, methods, and computer program products for redacting sensitive data within a database. An example method can include receiving a data query referencing unredacted data of a database; responsive to the data query, executing, by a processing device, a redaction operation to identify sensitive data within the unredacted data; and returning, in response to the data query, a redacted data set in which the sensitive data is replaced or removed.
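
A redaction pass over query results can be sketched as below. The pattern-based detection (a US Social Security number regex) and the replacement string are illustrative stand-ins for the redaction operation the abstract describes.

```python
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative sensitive-data pattern

def redact_rows(rows, replacement="[REDACTED]"):
    """Apply a redaction pass to query result rows before returning them:
    string fields matching the sensitive pattern are replaced."""
    return [
        {k: SSN.sub(replacement, v) if isinstance(v, str) else v
         for k, v in row.items()}
        for row in rows
    ]
```

The query path would run such a pass between executing the query against the unredacted data and returning the result set to the caller.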