H04L2209/42

Secure data storage for anonymized contact tracing

A network environment is described for securely storing data for anonymized contact tracing while an application is executing in a background state. An application can receive a message containing data while the application is executing in a background state. The data is encrypted using a public key. Next, the application can store the encrypted data in an alternate data store. Subsequently, and upon user authentication, the application can decrypt a secure data store and decrypt the encrypted data. The application can then store the decrypted data in the decrypted secure data store. The application can receive user input indicating a positive test result for a communicable disease with an incubation period, and anonymously upload the data stripped of any uniquely identifying information.
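The final step, uploading data stripped of any uniquely identifying information, can be sketched as follows. This is a minimal illustration, not the patented method: the field names and record layout are hypothetical, and the abstract does not specify which fields count as identifying.

```python
import json

# Hypothetical identifying fields; the abstract only says uniquely
# identifying information is stripped before upload.
IDENTIFYING_FIELDS = {"device_id", "phone_number", "account_email"}

def strip_identifiers(record: dict) -> dict:
    """Remove uniquely identifying fields from an exposure record."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

def anonymous_upload_payload(records: list[dict]) -> str:
    """Serialize stripped records for upload; no stable identifiers remain."""
    return json.dumps([strip_identifiers(r) for r in records], sort_keys=True)

record = {"device_id": "A1B2", "rolling_key": "9f3c", "day": 14}
payload = anonymous_upload_payload([record])
```

In a full embodiment the retained fields would be the ephemeral proximity data needed for exposure matching, while anything linkable to the user or device is dropped before the payload leaves the device.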

SYSTEMS AND METHODS FOR FUNCTIONALLY SEPARATING GEOSPATIAL INFORMATION FOR LAWFUL AND TRUSTWORTHY ANALYTICS, ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING

Various systems, computer-readable media, and computer-implemented methods of providing improved data privacy, anonymity and security by enabling subjects to which data pertains to remain “dynamically anonymous,” i.e., anonymous for as long as is desired—and to the extent that is desired—are disclosed herein. Embodiments include systems that create, access, use, store and/or erase data with increased privacy, anonymity, and security—thereby facilitating the availability of more qualified and accurate information. When personal data is authorized by data subjects to be shared with third parties, embodiments described herein may facilitate the sharing of information in a dynamically-controlled manner that also enables the delivery of temporally-, geographically-, and/or purpose-limited information to the receiving party. In one example, the disclosed techniques may be used to functionally separate geospatial information, such that it remains “dynamically anonymous,” i.e., anonymous for as long as is desired—and to the extent or degree that is desired.
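One way to picture functional separation of geospatial information is to deliver only a coarsened location together with a time-limited pseudonym, so the link back to the data subject exists only as long as a separately held secret exists. The sketch below is illustrative only; the cell size, token format, and function names are assumptions, not the patent's mechanism.

```python
import hashlib

def generalize(lat: float, lon: float, places: int = 2) -> tuple:
    """Coarsen coordinates to roughly 1 km cells by reducing precision."""
    return (round(lat, places), round(lon, places))

def dynamic_token(subject_id: str, secret: bytes, expires_at: float) -> dict:
    """Time-limited pseudonym: re-identification requires the held secret,
    and the link can be severed at any time by discarding the secret."""
    digest = hashlib.sha256(secret + subject_id.encode()).hexdigest()
    return {"pseudonym": digest[:16], "expires_at": expires_at}

def is_linkable(token: dict, now: float) -> bool:
    """The receiving party can use the token only within its time window."""
    return now < token["expires_at"]
```

The geographic generalization bounds what the receiving party learns, while the expiring pseudonym keeps the subject "dynamically anonymous": anonymous by default, linkable only to the extent and for the duration that is authorized.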

Differential privacy using a multibit histogram

Embodiments described herein ensure differential privacy when transmitting data to a server that estimates a frequency of such data amongst a set of client devices. The differential privacy mechanism may provide a predictable degree of variance for frequency estimations of data. The system may use a multibit histogram model or Hadamard multibit model for the differential privacy mechanism, both of which provide a predictable degree of accuracy of frequency estimations while still providing mathematically provable levels of privacy.
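A simplified local-differential-privacy mechanism of this kind can be simulated end to end: each client one-hot encodes its value and randomizes every bit, and the server debiases the noisy sums into frequency estimates. This is a textbook per-bit randomized-response sketch under an assumed flip probability derived from epsilon, not the specific multibit histogram or Hadamard variant claimed here.

```python
import math, random

def privatize(value: int, k: int, epsilon: float, rng: random.Random) -> list:
    """Client side: one-hot encode, then keep each bit with probability p."""
    p = math.exp(epsilon / 2) / (1 + math.exp(epsilon / 2))
    onehot = [1 if i == value else 0 for i in range(k)]
    return [b if rng.random() < p else 1 - b for b in onehot]

def estimate(reports: list, k: int, epsilon: float) -> list:
    """Server side: debias the noisy bit sums into frequency estimates.
    E[bit] = (1 - p) + (2p - 1) * b, so invert that affine map."""
    p = math.exp(epsilon / 2) / (1 + math.exp(epsilon / 2))
    n = len(reports)
    return [((sum(r[i] for r in reports) / n) - (1 - p)) / (2 * p - 1)
            for i in range(k)]

rng = random.Random(0)
true_values = [2] * 6000 + [0] * 4000   # 60% hold value 2, 40% hold value 0
reports = [privatize(v, 4, 4.0, rng) for v in true_values]
est = estimate(reports, 4, 4.0)
```

The variance of each estimate is predictable from p and the client count alone, which is the "predictable degree of variance" property the abstract emphasizes: accuracy can be budgeted in advance for a given privacy level.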

PRIVACY AWARENESS FOR PERSONAL ASSISTANT COMMUNICATIONS
20230052073 · 2023-02-16

Aspects of the technology described herein maintain the privacy of confidential information to be communicated to a user through a computing device. The technology keeps confidential information private by assessing the privacy context of the communication. The privacy context can be determined by determining a privacy level of the information to be communicated and the privacy level of the environment into which the information is to be communicated. The privacy context can be used to select an appropriate communication channel for the information. The privacy context can also be used to determine whether all available content is shared or just a portion of it.
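The channel-selection logic described above can be sketched as a small decision function. The level names and channel names below are hypothetical placeholders; the abstract specifies only that a privacy context combines an information privacy level with an environment privacy level and that it drives channel choice and partial sharing.

```python
# Hypothetical levels and channels; the abstract does not name specific values.
INFO_LEVELS = {"public": 0, "personal": 1, "confidential": 2}
ENV_LEVELS = {"private": 0, "semi_private": 1, "public": 2}

def select_channel(info_level: str, env_level: str) -> str:
    """Pick a communication channel from the combined privacy context."""
    risk = INFO_LEVELS[info_level] + ENV_LEVELS[env_level]
    if risk <= 1:
        return "speaker"            # read the full content aloud
    if risk == 2:
        return "speaker_partial"    # share only a redacted portion aloud
    return "screen_private"         # route to a private display/notification

channel = select_channel("confidential", "public")
```

Here the combined score captures the idea that the same confidential item may be read aloud at home but routed silently to a lock-screen notification in a public environment.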

Systems and methods for privacy-enabled biometric processing
11502841 · 2022-11-15

A set of distance-measurable encrypted feature vectors can be derived from any biometric data and/or physical or logical user behavioral data; by applying an associated deep neural network (“DNN”) to the output (i.e., biometric and/or behavioral feature vectors, etc.), an authentication system can determine matches or execute searches on encrypted data. Behavioral or biometric encrypted feature vectors can be stored and/or used in conjunction with respective classifications, or in subsequent comparisons, without fear of compromising the original data. In various embodiments, the original behavioral and/or biometric data is discarded responsive to generating the encrypted vectors. In another embodiment, distance-measurable or homomorphic encryption enables computations and comparisons on ciphertext without decryption of the encrypted feature vectors. Security of such privacy-enabled embeddings can be increased by implementing an assurance factor (e.g., liveness) to establish that a submitted credential has not been spoofed or faked.
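The key property, comparing vectors without decrypting them, can be illustrated with a toy distance-preserving transform: a secret permutation plus secret sign flips is an isometry, so Euclidean distances between "encrypted" vectors equal the distances between the originals. This stands in for the DNN embeddings and homomorphic schemes in the actual embodiments and is not a secure construction.

```python
import math, random

def keygen(dim: int, seed: int):
    """Secret key: a random permutation plus random sign flips (an isometry)."""
    rng = random.Random(seed)
    perm = list(range(dim))
    rng.shuffle(perm)
    signs = [rng.choice((-1.0, 1.0)) for _ in range(dim)]
    return perm, signs

def encrypt(vec, key):
    """Apply the secret isometry; distances are preserved, values hidden."""
    perm, signs = key
    return [signs[i] * vec[perm[i]] for i in range(len(vec))]

def dist(a, b):
    """Euclidean distance, computable directly on transformed vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

Because matching only requires distances, the original feature vectors can be discarded after transformation, mirroring the embodiment in which raw biometric data is deleted once the encrypted vectors exist.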

METHOD AND SYSTEM FOR GENERALIZED PROVENANCE SOLUTION FOR BLOCKCHAIN SUPPLY CHAIN APPLICATIONS

A method for conveying auditable information regarding provenance of a product that is cryptographically accurate while retaining complete anonymity of product and participant on a blockchain includes: receiving a product identifier; generating a digital token by applying a hashing algorithm to the product identifier; generating an entry value by applying the hashing algorithm to a combination of an event identifier and the digital token; generating a digital signature by digitally signing a data package using a private key of a cryptographic key pair, where the data package includes at least a blockchain address, the event identifier, and the digital token; and transmitting the blockchain address, the digital signature, and the entry value to a node in a blockchain network.
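The hashing steps of the claim can be traced in a few lines. One caveat: the claim calls for a digital signature with a private key of a cryptographic key pair; the HMAC below is only a stand-in for that signature, and the exact concatenation order inside the hashes is an assumption.

```python
import hashlib, hmac

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_entry(product_id: str, event_id: str, signing_key: bytes, address: str):
    """Follow the claim: token from the product identifier, entry value from
    the event identifier plus token, then sign the data package."""
    token = sha256_hex(product_id.encode())                 # digital token
    entry_value = sha256_hex((event_id + token).encode())   # ledger entry
    package = (address + event_id + token).encode()
    # Stand-in for the private-key signature over the data package.
    signature = hmac.new(signing_key, package, hashlib.sha256).hexdigest()
    return {"address": address, "signature": signature, "entry": entry_value}
```

Note that only hashes and the signature are transmitted: a verifier holding the product identifier can recompute the token and entry value, while observers of the blockchain learn nothing about the product or participant.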

System and method for general data protection regulation (GDPR) compliant hashing in blockchain ledgers

A computer-implemented system and method for providing general data protection regulation (GDPR) compliant hashing in blockchain ledgers. The invention guarantees a user's right to be forgotten, in compliance with the GDPR, utilizing blockchain technologies.
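One common pattern for reconciling immutable ledgers with a right to be forgotten is to put only keyed hashes on-chain and keep the per-user key off-chain; deleting the key renders the on-chain entries permanently unlinkable. The sketch below illustrates that general pattern under assumed names; the abstract does not disclose the patent's specific construction.

```python
import hashlib, hmac, secrets

class ErasableLedgerHash:
    """Per-user secret kept off-chain; only keyed hashes go on-chain.
    Deleting the secret 'forgets' the user without rewriting the chain."""

    def __init__(self):
        self._keys = {}  # off-chain key store, one secret per user

    def record(self, user: str, data: bytes) -> str:
        """Return the keyed hash that would be written to the ledger."""
        key = self._keys.setdefault(user, secrets.token_bytes(32))
        return hmac.new(key, data, hashlib.sha256).hexdigest()

    def verify(self, user: str, data: bytes, on_chain: str) -> bool:
        """Link data to an on-chain hash; impossible once the key is gone."""
        key = self._keys.get(user)
        return key is not None and hmac.compare_digest(
            hmac.new(key, data, hashlib.sha256).hexdigest(), on_chain)

    def forget(self, user: str) -> None:
        """Exercise the right to be forgotten by discarding the secret."""
        self._keys.pop(user, None)
```

After `forget`, the ledger entry still exists but is indistinguishable from random, which is how the hash can remain on an append-only blockchain while the personal data link is erased.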

RFID ICS WITH PRIVACY MODES

RFID tag ICs may be configured with privacy modes. When a tag IC is in a privacy mode, it will not respond to commands unless a previous command includes correct verification information or specifies a recycling indicator of the tag IC. If the previous command includes correct verification information, then the tag IC will respond to one or more subsequent commands as normal, for example by responding with one or more identifiers. If the previous command does not include correct verification information but specifies a recycling indicator and the privacy mode is recycling-enabled, the tag IC may respond to one or more subsequent commands with recycling information. The recycling information identifies whether or how an item associated with the RFID IC can be recycled or disposed but does not otherwise identify the RFID IC or item. Otherwise, the tag IC may remain silent.
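The command flow above is essentially a small state machine, which the following toy model traces. Class and parameter names are illustrative; real tag ICs implement this in an air-interface protocol, not Python.

```python
class PrivacyModeTagIC:
    """Toy model of a tag IC in privacy mode, per the described flow."""

    def __init__(self, identifier, password, recycling_info,
                 recycling_enabled=True):
        self._id = identifier
        self._password = password
        self._recycling_info = recycling_info
        self._recycling_enabled = recycling_enabled
        self._unlocked = False          # correct verification seen earlier
        self._recycling_query = False   # recycling indicator seen earlier

    def command(self, verification=None, recycling_indicator=False):
        """Return the response to this command, or None (remain silent)."""
        if self._unlocked:
            return self._id             # respond as normal with identifiers
        if self._recycling_query:
            return self._recycling_info # recycling info only, no identifiers
        if verification == self._password:
            self._unlocked = True       # subsequent commands answered normally
        elif recycling_indicator and self._recycling_enabled:
            self._recycling_query = True
        return None                     # privacy mode: silent for now
```

The recycling path is the interesting case: a reader with no credentials can still learn how to dispose of the item, but nothing that identifies the tag or the item itself.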

METHODS AND SYSTEMS FOR CRYPTOGRAPHICALLY SECURED DECENTRALIZED TESTING

A method of cryptographically secured decentralized testing includes receiving, by a computing device and from a secure test apparatus, an output of a cryptographic function of a secret test result identifier, authenticating the output, and recording, in a data repository, an indication of a test result as a function of the output.
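The receive-authenticate-record sequence can be sketched as below. The claim says only "a cryptographic function of a secret test result identifier"; a keyed hash is used here as one plausible instance (a real apparatus might use a digital signature instead), and all names are assumptions.

```python
import hashlib, hmac

SECRET_KEY = b"apparatus-key"  # provisioned to the secure test apparatus

def apparatus_output(test_result_id: str) -> str:
    """Secure test apparatus: cryptographic function of the secret
    test result identifier (here, a keyed hash)."""
    return hmac.new(SECRET_KEY, test_result_id.encode(),
                    hashlib.sha256).hexdigest()

def authenticate_and_record(repository: dict, test_result_id: str,
                            output: str) -> bool:
    """Computing device: authenticate the output, then record an
    indication of the test result in the data repository."""
    expected = hmac.new(SECRET_KEY, test_result_id.encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, output):
        return False  # forged or corrupted output; record nothing
    repository[output] = "result-recorded"
    return True
```

Because the recorded indication is a function of the output rather than the raw result identifier, the repository entry attests that a genuine apparatus produced the result without exposing the secret identifier itself.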

DATA ANONYMIZATION OF BLOCKCHAIN-BASED PROCESSING PIPELINE
20220360450 · 2022-11-10

An example operation may include one or more of anonymizing, via an anonymization service hosted within a trusted execution environment (TEE), raw data provided by a computing node to generate anonymized data, generating, via the anonymization service, an authenticator object that binds together a hash of the raw data and a hash of the anonymized data, transmitting the generated anonymized data to the computing node, and submitting the authenticator object to a blockchain ledger via a blockchain transaction.
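The anonymization and binding steps can be sketched outside a real TEE as follows. The dropped field names are hypothetical, and hashing the two digests together is one simple way to build an object that "binds" them; the actual embodiment runs inside a trusted execution environment and submits the result in a blockchain transaction.

```python
import hashlib, json

def anonymize(raw: dict) -> dict:
    """Stand-in for the TEE-hosted anonymization service: drop direct
    identifiers (field names are illustrative)."""
    return {k: v for k, v in raw.items() if k not in {"name", "ssn"}}

def authenticator(raw: dict, anon: dict) -> str:
    """Bind hash(raw data) and hash(anonymized data) into one object
    suitable for submission to a blockchain ledger."""
    h_raw = hashlib.sha256(json.dumps(raw, sort_keys=True).encode()).hexdigest()
    h_anon = hashlib.sha256(json.dumps(anon, sort_keys=True).encode()).hexdigest()
    return hashlib.sha256((h_raw + h_anon).encode()).hexdigest()
```

The point of the binding is auditability: anyone holding the raw and anonymized data can recompute the authenticator and confirm, against the ledger, that this anonymized output really was derived from that raw input, without either dataset appearing on-chain.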