Patent classifications
G06F21/6263
CRYPTOGRAPHICALLY SECURE REQUEST VERIFICATION
This disclosure relates to data security and cryptography. In one aspect, a method includes updating a user interface of a client device to present user interface controls that enable a user to specify data privacy settings that define how entities collect, store, and use data of the user. The data security system receives, from the client device, a request to modify a data privacy setting for one or more entities based on user interaction with one or more of the user interface controls. The request includes an ephemeral user identifier for the user and an attestation token. The data security system validates the request using at least the ephemeral user identifier and the attestation token. The data security system then transmits, to each of the one or more entities, data instructing the entity to modify its usage of the user data in accordance with the modified data privacy setting.
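The request-validation flow above might be sketched as follows. The disclosure does not specify the token format, so the HMAC-based attestation token, the shared `DEVICE_KEY`, and all field names are illustrative assumptions; production attestation schemes typically use device-bound asymmetric keys rather than a shared secret.

```python
import hashlib
import hmac
import json
import secrets
import time

DEVICE_KEY = secrets.token_bytes(32)  # assumed shared key for this sketch

def build_request(settings_change: dict) -> dict:
    """Client side: attach an ephemeral user identifier and an attestation token."""
    ephemeral_id = secrets.token_hex(16)  # short-lived, per-request identifier
    payload = {
        "ephemeral_id": ephemeral_id,
        "change": settings_change,
        "timestamp": int(time.time()),
    }
    body = json.dumps(payload, sort_keys=True).encode()
    token = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "attestation_token": token}

def validate_request(request: dict, max_age_s: int = 300) -> bool:
    """Data security system: verify token integrity and freshness before acting."""
    body = json.dumps(request["payload"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, request["attestation_token"]):
        return False  # payload (including the ephemeral id) was tampered with
    return time.time() - request["payload"]["timestamp"] <= max_age_s

req = build_request({"entity": "example-ad-network", "allow_tracking": False})
assert validate_request(req)
```

Binding the ephemeral identifier into the signed payload is what lets the system validate the request "using at least the ephemeral user identifier and the attestation token": changing either invalidates the token.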
DATA PROCESSING METHOD, APPARATUS, AND SYSTEM, DEVICE, AND MEDIUM
A data providing apparatus obtains first privacy data and second privacy data, encrypts the first privacy data by using an encryption algorithm to obtain a ciphertext of the first privacy data, and sends the ciphertext of the first privacy data and the second privacy data to a data processing apparatus. The data processing apparatus inputs the ciphertext of the first privacy data and the second privacy data into a ciphertext computation function to obtain a ciphertext of a data processing result. In this way, the first privacy data is used in computation in a ciphertext form, thereby ensuring security. In addition, the second privacy data is used in computation in a plaintext form, thereby reducing ciphertext input for the ciphertext computation function.
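A minimal sketch of the mixed ciphertext/plaintext computation, using a toy Paillier cryptosystem as the encryption algorithm. This is an assumption for illustration only: the abstract names no specific scheme, and the primes below are far too small for real use. Paillier's additive homomorphism lets the second privacy datum enter the ciphertext computation function in plaintext form, exactly the saving the abstract describes.

```python
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

p, q = 293, 433          # toy primes; real deployments use ~2048-bit primes
n = p * q
n2 = n * n
g = n + 1
lam = lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption factor

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Data providing apparatus: encrypt only the first privacy datum.
first_private = 42
second_private = 7            # sent in plaintext form
c1 = encrypt(first_private)

# Data processing apparatus: ciphertext computation function that mixes the
# ciphertext with the plaintext input (homomorphic addition of a plaintext).
c_result = (c1 * pow(g, second_private, n2)) % n2

assert decrypt(c_result) == first_private + second_private
```

The processing apparatus never sees `first_private`, yet the holder of the private key can decrypt the combined result; the plaintext operand costs one modular exponentiation instead of a full encryption.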
Anti-cyberbullying systems and methods
Some embodiments use text and/or image processing methods to determine whether a user of an electronic messaging platform is subject to an online threat such as cyberbullying, sexual grooming, or identity theft. In some embodiments, the text content of electronic messages is automatically harvested and aggregated into conversations. Conversation data are then analyzed to extract various threat indicators. A result of a text analysis may be combined with a result of an analysis of an image transmitted as part of the respective conversation. When a threat is detected, some embodiments automatically send a notification to a third party (e.g., a parent or teacher).
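The harvest-aggregate-score pipeline could be sketched as below. The keyword lexicon, the score weights, and the threshold are all hypothetical placeholders; a deployed system would use trained text and image classifiers rather than regular expressions.

```python
import re
from collections import defaultdict

# Hypothetical indicator lexicon, standing in for a trained text classifier.
THREAT_PATTERNS = [re.compile(p, re.I)
                   for p in (r"\bhate you\b", r"\bnobody likes you\b")]

def aggregate_conversations(messages):
    """Group harvested messages into conversations keyed by participant pair."""
    conversations = defaultdict(list)
    for sender, recipient, text in messages:
        conversations[frozenset((sender, recipient))].append(text)
    return conversations

def text_threat_score(texts):
    """Fraction of messages in a conversation matching a threat indicator."""
    hits = sum(1 for t in texts if any(p.search(t) for p in THREAT_PATTERNS))
    return hits / len(texts)

def assess(texts, image_score=0.0, w_text=0.7, w_image=0.3, threshold=0.5):
    """Combine text and image results; True would trigger the notification."""
    return w_text * text_threat_score(texts) + w_image * image_score >= threshold

msgs = [("bully", "kid", "nobody likes you"),
        ("bully", "kid", "I hate you"),
        ("kid", "bully", "leave me alone")]
texts = aggregate_conversations(msgs)[frozenset(("bully", "kid"))]
assert assess(texts, image_score=0.8)   # combined evidence crosses threshold
```

Combining the text score with an independently computed image score mirrors the abstract's point that either channel alone may be inconclusive.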
Data processing systems for verification of consent and notice processing and related methods
A system and method for determining consent user interface validity for a provided consent user interface of a web form presenting consent information, comprising: accessing a consent user interface presented on a web form; determining one or more configuration attributes of the consent user interface; accessing one or more privacy regulations associated with presenting consent information; comparing the one or more configuration attributes of the consent user interface to each of the one or more privacy regulations; determining whether the consent user interface is compliant with each of the one or more privacy regulations; and in response to determining that the consent user interface is not compliant with one or more privacy regulations, flagging the consent user interface.
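The compare-and-flag loop might look like the sketch below. The regulation names and the configuration attributes checked are invented for illustration; a real system would encode each applicable regulation's actual consent-interface requirements.

```python
# Hypothetical regulation rules: each maps a regulation to a predicate over
# the consent UI's configuration attributes.
REGULATION_RULES = {
    "reg_explicit_optin": lambda ui: ui.get("consent_default") == "unchecked",
    "reg_withdraw_link": lambda ui: ui.get("has_withdraw_link", False),
    "reg_purpose_listed": lambda ui: bool(ui.get("purposes")),
}

def check_consent_ui(ui_attributes, regulations=REGULATION_RULES):
    """Compare the UI's configuration attributes to each regulation and
    return the regulations it fails (an empty list means compliant)."""
    return [name for name, rule in regulations.items()
            if not rule(ui_attributes)]

ui = {"consent_default": "checked",      # pre-checked box
      "has_withdraw_link": True,
      "purposes": ["analytics"]}
flags = check_consent_ui(ui)
assert flags == ["reg_explicit_optin"]   # flagged: consent must not be pre-checked
```

Returning the specific failing regulations, rather than a single boolean, supports the flagging step: the interface can be reported together with the reasons it is non-compliant.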
Key pair platform and system to manage federated trust networks in distributed advertising
Systems and methods are provided for object identifier translation using a key pairs platform in a virtualized or cloud-based computing system. A key pair refers to a pair of identifiers held by an entity. Each key pair includes at least one anonymized object identifier. Advantageously, the key pair system protects privacy and provides anonymity for objects by not disclosing the identity of the objects or the underlying data associated with the objects.
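One way to realize a key pair with an anonymized half is keyed hashing, sketched below. The HMAC construction and the class shape are assumptions; the patent does not specify how the anonymized identifier is derived.

```python
import hashlib
import hmac
import secrets

class KeyPairRegistry:
    """Each entity holds key pairs (native_id, anonymized_id); the anonymized
    half is a keyed hash, so the native identity is never disclosed to peers."""

    def __init__(self):
        self._secret = secrets.token_bytes(32)  # per-entity pseudonymization key
        self._pairs = {}

    def key_pair(self, native_id: str):
        anon = hmac.new(self._secret, native_id.encode(),
                        hashlib.sha256).hexdigest()[:16]
        self._pairs[anon] = native_id
        return (native_id, anon)

    def translate(self, anon_id: str) -> str:
        """Only the entity holding the key can map an anonymized id back."""
        return self._pairs[anon_id]

registry = KeyPairRegistry()
native, anon = registry.key_pair("user-8675309")
assert anon != native and registry.translate(anon) == "user-8675309"
```

Because the HMAC key never leaves the entity, other federation members can exchange and join on the anonymized identifiers without learning the underlying object identity.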
Systems and methods of determining compromised identity information
A compromised data exchange system extracts data from websites using a crawler, detects portions within the extracted data that resemble personally identifying information (PII) data based on PII data patterns using a risk assessment module, and compares a detected portion to data within a database of disassociated compromised PII data to determine a match using the risk assessment module. A risk score may be assigned to a data item within the database in response to determining the match. In some embodiments, URL data may also be detected in the extracted data. The detected URL data represents further web sites that can be automatically crawled by the system to detect further PII data.
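The detect-compare-score-crawl cycle can be sketched as below. The regex patterns are deliberately simplified, and `COMPROMISED_DB` stands in for the database of disassociated compromised PII; production patterns and matching logic would be far stricter.

```python
import re

# Simplified PII data patterns (production patterns are stricter).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}
URL_PATTERN = re.compile(r"https?://[^\s\"'<>]+")

# Stand-in for the disassociated compromised-PII database: item -> risk score.
COMPROMISED_DB = {"123-45-6789": 0, "alice@example.com": 0}

def assess_page(text):
    """Detect PII-like portions, raise risk scores on database matches, and
    collect further URLs for the crawler to visit next."""
    for kind, pattern in PII_PATTERNS.items():
        for hit in pattern.findall(text):
            if hit in COMPROMISED_DB:
                COMPROMISED_DB[hit] += 1   # match found: bump the risk score
    return URL_PATTERN.findall(text)       # frontier of further web sites

page = 'leaked: 123-45-6789 alice@example.com more at https://paste.example/x'
next_urls = assess_page(page)
assert COMPROMISED_DB["123-45-6789"] == 1
assert next_urls == ["https://paste.example/x"]
```

Returning the detected URLs closes the loop described in the abstract: each crawled page both scores existing database items and extends the crawl to further sites that may expose more PII.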
Information processing method, apparatus, device, and storage medium
The present application discloses an information processing method. The method includes: acquiring a resource package parameter determined by a sender client, the resource package parameter specifying a first quantity of first-type resources and link information; invoking an interface with a second server and acquiring a current conversion rate, issued by the second server, between the first-type resource and a second-type resource; calculating, according to the conversion rate, a quantity of second-type resources equal in value to the first quantity of first-type resources, and using that quantity as a second quantity; deducting the second quantity of second-type resources from a second-type resource account of the sender client and generating at least one resource package, a sum of quantities corresponding to the at least one resource package being equal to the first quantity; and distributing the at least one resource package to at least one receiver client by using the link information.
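The convert-deduct-split steps can be sketched as follows. Here the conversion rate is passed as a plain parameter standing in for the second server's interface, and the random split is one common allocation choice; function names and the red-packet-style split are illustrative assumptions.

```python
import random

def split_packages(total: int, count: int):
    """Randomly split `total` units into `count` packages that sum to `total`
    (each at least 1), similar to random red-packet allocation."""
    cuts = sorted(random.sample(range(1, total), count - 1))
    return [b - a for a, b in zip([0] + cuts, cuts + [total])]

def send_resource_packages(first_quantity, rate, sender_balance, receiver_count):
    """Convert the first-type quantity at the issued rate, deduct the
    value-equivalent second quantity, and generate the packages."""
    second_quantity = first_quantity * rate        # the "second quantity"
    if second_quantity > sender_balance:
        raise ValueError("insufficient second-type resources")
    sender_balance -= second_quantity              # deduct from sender account
    packages = split_packages(first_quantity, receiver_count)
    return sender_balance, packages                # packages sum to first_quantity

balance, packages = send_resource_packages(
    first_quantity=100, rate=2, sender_balance=500, receiver_count=4)
assert balance == 300
assert sum(packages) == 100 and len(packages) == 4
```

The invariant worth noting is that the deduction happens in the second-type resource while the distributed packages are denominated in the first-type resource, with the two sides tied together only by the issued conversion rate.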
Data processing systems for orphaned data identification and deletion and related methods
In particular embodiments, an Orphaned Data Action System is configured to analyze one or more data systems (e.g., data assets), identify one or more pieces of personal data that are not associated with one or more privacy campaigns of the particular organization, and notify one or more individuals of the particular organization of those pieces of personal data.
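The identification step amounts to a set difference between the personal data found in the scanned assets and the data covered by some privacy campaign. A minimal sketch, with hypothetical record and campaign shapes:

```python
def find_orphaned_data(data_assets, privacy_campaigns):
    """Return records in the scanned data assets whose data subject is not
    referenced by any privacy campaign, i.e. orphaned personal data."""
    covered = {subject
               for campaign in privacy_campaigns
               for subject in campaign["subjects"]}
    return [(asset, record)
            for asset, records in data_assets.items()
            for record in records
            if record["subject"] not in covered]

assets = {
    "crm": [{"subject": "u1", "field": "email"},
            {"subject": "u3", "field": "phone"}],
    "billing": [{"subject": "u2", "field": "address"}],
}
campaigns = [{"name": "marketing", "subjects": {"u1", "u2"}}]
orphans = find_orphaned_data(assets, campaigns)
assert orphans == [("crm", {"subject": "u3", "field": "phone"})]
# notifying the organization's privacy team (and deletion) would follow here
```

Reporting the orphans together with their source asset gives the notified individuals enough context to decide between deletion and associating the data with a campaign.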
User device-based enterprise web filtering
Web-filtering operations may be implemented on the user device, rather than on a centralized proxy server, to improve reliability, performance, and/or security of the web-filtering operations. Some or all of the functions related to web filtering may be performed on the end-user device, removing the complexity and security issues inherent in the centralized approach. One technique for allowing proxy servers to operate on user devices is to install smart agents on the device. The smart agents, under control of a management server, may configure the proxy server, issue trust certificates to applications on the device, and/or provide proxy auto-configuration (PAC) files to applications on the device.
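One concrete task for such a smart agent is rendering the PAC file that steers applications to the on-device proxy. The sketch below generates one from a blocklist; the domains, the proxy address, and the idea that the management server pushes the blocklist are illustrative assumptions (`dnsDomainIs` and `FindProxyForURL` are standard PAC functions).

```python
BLOCKED_DOMAINS = ["ads.example.com", "tracker.example.net"]  # assumed push from management server

def render_pac(blocked, proxy="PROXY 127.0.0.1:8080"):
    """Render a proxy auto-configuration (PAC) file that routes blocked
    domains through the on-device filtering proxy, all else direct."""
    checks = " || ".join(f'dnsDomainIs(host, "{d}")' for d in blocked)
    return (
        "function FindProxyForURL(url, host) {\n"
        f'  if ({checks}) return "{proxy}";\n'
        '  return "DIRECT";\n'
        "}\n"
    )

pac = render_pac(BLOCKED_DOMAINS)
assert 'dnsDomainIs(host, "ads.example.com")' in pac
```

Because the proxy named in the PAC file runs on the loopback interface, filtered traffic never leaves the device unprotected, which is the reliability and privacy argument the passage makes for moving filtering off the centralized proxy.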
System and method for automatically masking confidential information that is input on a webpage
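The identify-then-mask step executed before recording can be sketched as follows. The patterns below (card-like and SSN-like numbers) are illustrative assumptions; what counts as masking information would be configurable, and the patent's instructions run in the web-accessing software rather than server-side.

```python
import re

# Simplified masking patterns; the set of masking information (card numbers,
# national ids, passwords, ...) is an assumed configuration.
MASK_PATTERNS = [
    re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"),  # card-like numbers
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                    # SSN-like ids
]

def mask_input(text: str, mask_char: str = "*") -> str:
    """Identify masking information in the input text and replace it before
    the text is recorded, so only masked values reach the session record."""
    for pattern in MASK_PATTERNS:
        text = pattern.sub(lambda m: mask_char * len(m.group()), text)
    return text

recorded = mask_input("card 4111 1111 1111 1111 exp 12/26")
assert recorded == "card " + "*" * 19 + " exp 12/26"
```

Masking before the recording call, rather than scrubbing stored logs afterwards, is the point of the claim: the confidential value never exists in the recorded form at all.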
A method for recording input text that is input in an input field of a webpage, the method comprising: providing the webpage to a client device, by processing circuitry, the webpage comprising: (a) the input field, and (b) instructions executable by web-accessing software for recording the input text; and wherein execution of the instructions results in: identification of masking information in the input text, if any, the masking information being information in the input text to be masked; and if the masking information is identified, masking of the masking information prior to the recording of the input text, so that the masking information is masked when recorded.