Patent classifications
G06F21/6263
Network resource privacy negotiation system and method
A method for accessing a network resource including detecting an attempt by a user via a computing device to access a service enabled by a computing system via a network and transmitting via the network to the computing system a first request to access the service in response to detecting the attempt by the user to access the service, the first request including at least one empty personally identifiable data structure. A failure to access the service responsive to the first request is determined. A second request to access the service is transmitted via the network to the computing system in response to the failure, the second request including artificial personally identifiable information, and access to the service from the computing system is received for the user.
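The two-step negotiation described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the request structure, the `mock_service` gatekeeper, and all field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class AccessRequest:
    service: str
    pii: dict = field(default_factory=dict)  # personally identifiable data structure

def mock_service(request):
    """Toy service that rejects requests whose PII fields are empty."""
    required = {"name", "email"}
    if required <= request.pii.keys() and all(request.pii.values()):
        return "ACCESS_GRANTED"
    return "ACCESS_DENIED"

def negotiate_access(service_name, send):
    # Step 1: the first request carries an intentionally empty PII structure.
    first = AccessRequest(service_name, pii={"name": "", "email": ""})
    if send(first) == "ACCESS_GRANTED":
        return first, "ACCESS_GRANTED"
    # Step 2: on failure, retry with artificial (non-identifying) PII.
    second = AccessRequest(service_name,
                           pii={"name": "Jane Roe", "email": "anon@example.invalid"})
    return second, send(second)

req, outcome = negotiate_access("news-service", mock_service)
```

The user thus gains access without ever disclosing real personal data: the service sees either nothing or a synthetic identity.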
Methods and systems for secure cross-platform token exchange
Systems and methods are disclosed for cross-platform token exchange. One method comprises receiving a primary token exchange request from an upstream entity, generating an ancillary detokenization request based on the primary token exchange request, and transmitting the ancillary detokenization request to an input token vault. An ancillary detokenization response comprising sensitive data may then be received from the input token vault, and one or more ancillary tokenization requests may be generated based on the ancillary detokenization response and the primary token exchange request. The one or more ancillary tokenization requests may be transmitted to one or more output token vaults. Subsequently, one or more ancillary tokenization responses may be received from the one or more output token vaults, each ancillary tokenization response comprising an output token. A primary token exchange response may be generated based on the one or more ancillary tokenization responses and transmitted to the upstream entity.
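The detokenize-then-retokenize flow above can be outlined in a few lines. The vault interface and token format here are illustrative assumptions, not the claimed system.

```python
class TokenVault:
    """Toy vault mapping tokens to sensitive data and back."""
    def __init__(self, prefix):
        self.prefix = prefix
        self.store = {}

    def tokenize(self, sensitive):
        token = f"{self.prefix}-{len(self.store)}"
        self.store[token] = sensitive
        return token

    def detokenize(self, token):
        return self.store[token]

def exchange(primary_request, input_vault, output_vaults):
    # 1. Ancillary detokenization: recover sensitive data from the input vault.
    sensitive = input_vault.detokenize(primary_request["input_token"])
    # 2. Ancillary tokenization: re-tokenize the data in each output vault.
    output_tokens = [vault.tokenize(sensitive) for vault in output_vaults]
    # 3. Primary response aggregates the output tokens for the upstream entity.
    return {"request_id": primary_request["request_id"],
            "output_tokens": output_tokens}

input_vault = TokenVault("in")
pan_token = input_vault.tokenize("4111-1111-1111-1111")
out_a, out_b = TokenVault("a"), TokenVault("b")
response = exchange({"request_id": 7, "input_token": pan_token},
                    input_vault, [out_a, out_b])
```

Note that the sensitive data only exists transiently inside `exchange`; the upstream entity sees tokens on both sides.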
System and a method for multisession analysis
A method and a system for arranging a user multi-session from a plurality of user sessions, where the sessions are received from a plurality of computerized client devices communicatively coupled via a communication network to at least one content server. At least some of the client devices may be operated by a same user, and the data content may include at least part of data communicated between any client device and any content server. The method includes dividing the received data content into a plurality of sessions, where at least two sessions are associated with the same user, selecting at least two sessions received from at least two respective client devices associated with the same user, and associating the selected at least two sessions to form a multi-session.
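The divide/select/associate steps can be sketched as below. The session record layout is an assumption; the abstract leaves the data format open.

```python
from collections import defaultdict

# Illustrative session records (the patent does not fix this schema).
sessions = [
    {"session_id": 1, "user": "alice", "device": "phone",  "events": 12},
    {"session_id": 2, "user": "bob",   "device": "laptop", "events": 3},
    {"session_id": 3, "user": "alice", "device": "tablet", "events": 8},
]

def build_multisessions(sessions):
    # Divide: group sessions by user.
    by_user = defaultdict(list)
    for s in sessions:
        by_user[s["user"]].append(s)
    # Select and associate: only users with sessions from at least two
    # distinct client devices yield a multi-session.
    multis = {}
    for user, group in by_user.items():
        if len({s["device"] for s in group}) >= 2:
            multis[user] = sorted(s["session_id"] for s in group)
    return multis

multis = build_multisessions(sessions)
```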
Systems and methods for controlling data exposure using artificial-intelligence-based periodic modeling
Systems and methods for periodically modifying data privacy elements are provided. The systems and methods may identify a set of data privacy elements. A data privacy element can characterize a feature of a computing device and can be detectable by a network host. A first artificial profile can be generated by modifying a first data privacy element based on an artificial profile model that defines a relationship associated with one or more constraints between the set of data privacy elements. Subsequent to generating the first artificial profile, a second artificial profile can be generated by periodically modifying a second data privacy element in accordance with the relationship defined by the artificial profile model. The computing device can be masked from being identified by the network host by sending the second artificial profile including the second data privacy element to a requested network location.
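One way to picture the constraint-respecting rotation is a generator that changes one privacy element per period while keeping dependent elements consistent. The profile model below (platform, user-agent, timezone) is a made-up example, not the patent's model.

```python
import itertools

# Hypothetical artificial profile model: a constraint ties the reported
# user-agent to the reported platform.
MODEL = {
    "platform":   ["Windows", "Linux"],
    "user_agent": {"Windows": "UA-Win", "Linux": "UA-Linux"},
    "timezone":   ["UTC", "UTC+1", "UTC+9"],
}

def profile_generator(model):
    """Yield successive artificial profiles, periodically rotating elements
    while honoring the platform/user-agent constraint."""
    platforms = itertools.cycle(model["platform"])
    timezones = itertools.cycle(model["timezone"])
    while True:
        platform = next(platforms)
        yield {
            "platform": platform,
            "user_agent": model["user_agent"][platform],  # constraint honored
            "timezone": next(timezones),
        }

gen = profile_generator(MODEL)
first, second = next(gen), next(gen)
```

Because each emitted profile is internally consistent, the host sees a plausible but shifting device, which defeats fingerprint-based identification.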
SYSTEMS AND METHODS FOR AUTOMATICALLY BLOCKING THE USE OF TRACKING TOOLS
Embodiments of the present invention provide methods, apparatus, systems, computing devices, computing entities, and/or the like for permitting or blocking tracking tools used by webpages. In particular embodiments, the method involves: scanning a webpage to identify a tracking tool configured for processing personal data; determining a data destination location that is associated with the tracking tool; and generating program code configured to: determine a location associated with a user who is associated with a rendering of the webpage; determine a prohibited data destination location based on the location associated with the user; determine that the data destination location associated with the tracking tool is not the prohibited data destination location; and responsive to the data destination location associated with the tracking tool not being the prohibited data destination location, permit the tracking tool to execute.
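The decision logic of the generated program code reduces to a destination check keyed on the user's location. The location labels and tracker record below are assumptions for illustration.

```python
# Hypothetical mapping from a user's location to prohibited data destinations.
PROHIBITED_BY_USER_LOCATION = {
    "EU": {"US-unshielded"},
    "US": set(),
}

def decide(tracker, user_location):
    """Permit the tracking tool only when its data destination is not a
    prohibited destination for the user rendering the page."""
    prohibited = PROHIBITED_BY_USER_LOCATION.get(user_location, set())
    return "PERMIT" if tracker["destination"] not in prohibited else "BLOCK"

tracker = {"name": "analytics.js", "destination": "US-unshielded"}
eu_decision = decide(tracker, "EU")
us_decision = decide(tracker, "US")
```

The same tracker is thus blocked for one user and permitted for another, purely as a function of where each user is located.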
Privacy protection for third party data sharing
A set of raw data relating to activity of one or more users in accordance with a communication network is obtained. The communication network is managed by a network operator. The obtained set of raw data is processed in accordance with at least one data isolation policy maintained by the network operator to generate a first set of data comprising at least a portion of the set of raw data with sensitive data associated with the one or more users removed; a second set of data comprising the sensitive data removed from the set of raw data; and a third set of data comprising a mapping between portions of the set of raw data and the first set of data. The first set of data is exposed to a third party, while the second set of data and the third set of data are isolated from the third party.
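The three-way split (cleaned data, removed sensitive data, and a mapping between them) can be sketched as follows. The field names and the identity mapping are illustrative assumptions.

```python
# Fields the network operator's isolation policy treats as sensitive.
SENSITIVE_FIELDS = {"subscriber_id", "imsi"}

def isolate(raw_records):
    first, second, third = [], [], {}  # exposed set, sensitive set, mapping
    for i, record in enumerate(raw_records):
        cleaned = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
        removed = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
        first.append(cleaned)
        second.append(removed)
        third[i] = i  # raw record i corresponds to cleaned record i
    return first, second, third

raw = [{"subscriber_id": "S1", "imsi": "310150123456789",
        "cell": "A12", "bytes": 4096}]
exposed, sensitive, mapping = isolate(raw)
# Only `exposed` is shared with the third party; `sensitive` and `mapping`
# stay inside the operator's boundary.
```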
Sanitization of content displayed by web-based applications
Embodiments enable a displayed webpage containing sensitive information to be accurately and efficiently sanitized. The sensitive information is contained within a text string of the webpage and displayed using a font specified in a style sheet. The text string that is to be sanitized is determined based on a tag for sanitization associated with the text string. When the tag is detected, the text string is rendered using a font from the style sheet that is not legible. Upon rendering, the text string of the webpage is redisplayed using the non-legible font, which effectively sanitizes the text string containing the sensitive information.
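The redisplay step amounts to swapping the font family for tagged strings. The sketch below generates HTML spans from tagged elements; the style-sheet entries and the "Illegible-Blocks" font name are invented for illustration.

```python
# Hypothetical style sheet with a legible default font and a non-legible one.
STYLE_SHEET = {"default": "Helvetica", "redacted": "Illegible-Blocks"}

def render(elements, style_sheet):
    html = []
    for el in elements:
        # The sanitization tag switches the element to the non-legible font.
        font = style_sheet["redacted" if el.get("sanitize") else "default"]
        html.append(f'<span style="font-family: {font}">{el["text"]}</span>')
    return "\n".join(html)

page = [
    {"text": "Account holder:", "sanitize": False},
    {"text": "123-45-6789", "sanitize": True},  # sensitive, tagged
]
output = render(page, STYLE_SHEET)
```

A notable property of this approach is that the page layout is preserved: the sensitive string still occupies space, it is simply unreadable.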
INFORMATION PROCESSING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM
An information processing system, method, and computer-readable medium that generate emotion values based on information related to interactions between one of a plurality of objects and other ones of the plurality of objects, the one of the plurality of objects being associated with a person, acquire at least one emotion value of the generated emotion values based on an identification of the person, and provide personal credit information of the person based on the acquired at least one emotion value.
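The generate/acquire/provide pipeline can be sketched as below. The interaction types, their weights, and the linear score mapping are invented for illustration; the abstract defines no concrete formulas.

```python
from collections import defaultdict

# Hypothetical weights for interaction kinds between objects.
INTERACTION_WEIGHT = {"thanked": 1.0, "assisted": 0.5, "damaged": -1.0}

def emotion_values(interactions):
    """Aggregate interaction history into a per-object emotion value."""
    values = defaultdict(float)
    for obj, kind in interactions:
        values[obj] += INTERACTION_WEIGHT[kind]
    return values

def credit_info(person_object, interactions, base_score=600):
    # Acquire the emotion value of the object associated with the person,
    # then derive credit information from it (toy linear mapping).
    value = emotion_values(interactions)[person_object]
    return {"object": person_object, "score": base_score + 10 * value}

log = [("car-42", "thanked"), ("car-42", "assisted"), ("car-42", "damaged")]
report = credit_info("car-42", log)
```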
Privacy-Preserving Image Distribution
Some embodiments enable distributing data (e.g., recorded video, photographs, recorded audio, etc.) to a plurality of users in a manner which preserves the privacy of the respective users. Some embodiments leverage homomorphic encryption and proxy re-encryption techniques to manipulate the respective data so that selected portions of it are revealed according to an identity of the user currently accessing the respective data.
Systems and methods for managing privacy policies using machine learning
Systems, methods, and devices for managing privacy policies are disclosed. In one embodiment, a method for management of a user's privacy preferences may include: identifying a computer application installed on a user electronic device, or a website accessed using a browser executed by the user electronic device; retrieving a privacy policy document analysis for a privacy policy document associated with the computer application or the website, the privacy policy document analysis comprising a valuation of a plurality of privacy policy segments within the privacy policy document; receiving a privacy preference analysis for the user, the privacy preference analysis comprising a valuation of a plurality of privacy preferences for the user; identifying a recommended action in response to the valuation of one of the privacy policy segments being outside the scope of the valuation of one of the plurality of privacy preferences; and executing the recommended action.
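The comparison step (policy segment valuations against user preference valuations) can be sketched as below. The 0-10 scale, the segment names, and the "outside the scope" test as a simple threshold are all assumptions.

```python
# Hypothetical valuations of privacy policy segments (higher = more invasive).
policy_analysis = {
    "data_sharing": 8,
    "retention": 3,
}
# The user's maximum tolerated valuation per segment.
preference_analysis = {
    "data_sharing": 5,
    "retention": 6,
}

def recommend_actions(policy, preferences):
    """Flag each policy segment whose valuation falls outside the scope of
    the corresponding user preference."""
    actions = []
    for segment, valuation in policy.items():
        if valuation > preferences.get(segment, 10):
            actions.append(("warn_user", segment))
    return actions

actions = recommend_actions(policy_analysis, preference_analysis)
```

Here only the data-sharing segment exceeds the user's tolerance, so only it triggers a recommended action; executing the action (warning the user, blocking the install, and so on) is the final step of the claimed method.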