H03M7/4043

SYSTEM AND METHOD FOR DATA COMPACTION AND SECURITY USING MULTIPLE ENCODING ALGORITHMS

A system and method for encoding data using a plurality of encoding libraries. Portions of the data are encoded by different encoding libraries, depending on which library provides the greatest compaction for a given portion of the data. This methodology not only provides substantial improvements in data compaction over use of a single data compaction algorithm with the highest average compaction, but also provides substantial additional security, in that multiple decoding libraries must be used to decode the data. In some embodiments, each portion of data may further be encoded using different sourceblock sizes, providing further security enhancements as decoding requires multiple decoding libraries and knowledge of the sourceblock size used for each portion of the data. In some embodiments, encoding libraries may be randomly or pseudo-randomly rotated to provide additional security.
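As an illustrative sketch of per-portion library selection, off-the-shelf compressors stand in for the patent's encoding libraries; the portion size, library identifiers, and the container format are assumptions, not the claimed method:

```python
import bz2
import lzma
import zlib

# Hypothetical stand-ins for the patent's plurality of encoding libraries.
LIBRARIES = {0: zlib.compress, 1: bz2.compress, 2: lzma.compress}
DECODERS = {0: zlib.decompress, 1: bz2.decompress, 2: lzma.decompress}

def encode(data: bytes, portion_size: int = 4096) -> list[tuple[int, bytes]]:
    """Encode each portion with whichever library yields the smallest output,
    recording the library id needed to decode that portion."""
    encoded = []
    for i in range(0, len(data), portion_size):
        portion = data[i:i + portion_size]
        lib_id, best = min(
            ((lid, fn(portion)) for lid, fn in LIBRARIES.items()),
            key=lambda pair: len(pair[1]),
        )
        encoded.append((lib_id, best))
    return encoded

def decode(encoded: list[tuple[int, bytes]]) -> bytes:
    """Decoding requires knowing which library encoded each portion."""
    return b"".join(DECODERS[lid](blob) for lid, blob in encoded)
```

Note how the security property follows directly: without the per-portion library ids, no single decompressor can recover the stream.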

FLEXIBLE COMPRESSION HEADER AND CODE GENERATION
20220200626 · 2022-06-23

An embodiment of an integrated circuit may comprise a hardware compressor to compress data, the hardware compressor including circuitry to store input data in a history buffer, compute one or more code tables based on the input data, and compute a compression stream header based on the computed one or more code tables. Other embodiments are disclosed and claimed.
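A software analogue of the idea can derive per-symbol Huffman code lengths from the input and emit them as a stream header, much as DEFLATE transmits code lengths so the decompressor can rebuild canonical codes. All names below are illustrative, not the claimed circuitry:

```python
import heapq
from collections import Counter

def code_lengths(data: bytes) -> dict[int, int]:
    """Huffman code length per byte symbol (depth of each leaf in the tree)."""
    freq = Counter(data)
    if not freq:
        return {}
    if len(freq) == 1:  # degenerate input: a single symbol still needs one bit
        return {next(iter(freq)): 1}
    # heap entries: (frequency, tie-breaker, {symbol: depth-so-far})
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (fa + fb, tie, merged))
        tie += 1
    return heap[0][2]

def stream_header(data: bytes) -> bytes:
    """Header = one code-length byte per possible symbol; canonical codes can
    be reconstructed from lengths alone, so the codes themselves need not be sent."""
    lengths = code_lengths(data)
    return bytes(lengths.get(sym, 0) for sym in range(256))
```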

SYSTEM AND METHOD FOR BLOCKCHAIN DATA COMPACTION

A system and method for faster communication between blockchain mining nodes and faster block validation. The system uses machine learning on data chunks to generate codebooks that compact the data to be stored, processed, or sent with a smaller data profile than uncompacted data. The system applies data compaction in an existing blockchain fork, or implements it in a new blockchain protocol, so that nodes that wish or need to use the blockchain can do so with a reduced storage requirement. The system uses network data compaction across all nodes to increase the speed, and decrease the size, of a blockchain’s data packets. The system uses data compaction firmware to increase the efficiency with which mining rigs can computationally validate new blocks on the blockchain. The system can be implemented using any combination of the three data compaction services to meet the needs of the desired blockchain technology.
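The codebook mechanism can be sketched as follows: frequent chunks observed in sample traffic are mapped to short codewords, and compaction replaces each known chunk with its codeword. The chunk size, escape bytes, and training procedure here are illustrative assumptions, not the patent's machine-learning pipeline:

```python
from collections import Counter

def train_codebook(samples: list[bytes], chunk: int = 8,
                   size: int = 255) -> dict[bytes, int]:
    """'Train' on sample traffic: map the most frequent fixed-size chunks
    to one-byte codewords."""
    counts: Counter = Counter()
    for s in samples:
        counts.update(s[i:i + chunk] for i in range(0, len(s) - chunk + 1, chunk))
    return {blk: cw for cw, (blk, _) in enumerate(counts.most_common(size))}

def compact(data: bytes, codebook: dict[bytes, int], chunk: int = 8) -> bytes:
    """Replace known chunks with 2-byte codeword references; escape the rest."""
    out = bytearray()
    for i in range(0, len(data), chunk):
        blk = data[i:i + chunk]
        if blk in codebook:
            out += bytes([0x00, codebook[blk]])   # codeword reference
        else:
            out += bytes([0x01]) + blk            # literal escape
    return bytes(out)
```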

SYSTEM AND METHOD FOR DATA STORAGE, TRANSFER, SYNCHRONIZATION, AND SECURITY USING AUTOMATED MODEL MONITORING AND TRAINING

A system and method for data storage, transfer, synchronization, and security using automated system efficacy monitoring and model training, wherein statistical analyses of test datasets are used to determine if the probability distribution of two datasets are within a pre-determined range, and responsive to that determination new encoding and decoding algorithms may be retrained in order to produce new data chunklets. The new data chunklets may then be processed and assigned new codewords which are compiled into an updated codebook which may be distributed back to encoding and decoding systems and devices.
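The drift check can be sketched with a simple distance between byte-frequency distributions standing in for the patent's statistical test; the function names and the threshold are illustrative:

```python
from collections import Counter

def tv_distance(a: bytes, b: bytes) -> float:
    """Total variation distance between the byte-frequency distributions
    of two datasets (0.0 = identical, 1.0 = disjoint)."""
    pa, pb = Counter(a), Counter(b)
    na, nb = len(a), len(b)
    return 0.5 * sum(abs(pa[s] / na - pb[s] / nb) for s in range(256))

def maybe_retrain(reference: bytes, observed: bytes,
                  threshold: float = 0.1) -> bool:
    """Signal retraining of the encoding/decoding model when observed
    traffic drifts beyond the pre-determined range."""
    return tv_distance(reference, observed) > threshold
```

When this returns True, the system would retrain on the observed data, derive new chunklets and codewords, and distribute the updated codebook to encoders and decoders.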

Compression device and control method

According to one embodiment, a compression device includes a coding information generation unit. The unit determines code lengths that are respectively associated with a plurality of symbols, based on a frequency of occurrence of each of the plurality of symbols. When the plurality of symbols include one or more first symbols that are respectively associated with one or more first code lengths exceeding an upper limit, the unit changes the first code lengths to the upper limit, selects, from one or more second symbols of the plurality of symbols that are respectively associated with one or more second code lengths shorter than the upper limit, at least one symbol in descending associated code length order, and changes the at least one code length associated with the selected symbol to the upper limit.
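The adjustment described above can be sketched as follows: clamping over-limit lengths breaks the Kraft inequality (the sum of 2^-length must not exceed 1 for a prefix code to exist), and lengthening the longest under-limit codes to the limit restores it. The names and the exact repair order are an illustrative reading of the abstract, not the claimed circuit:

```python
def limit_code_lengths(lengths: dict[int, int], limit: int) -> dict[int, int]:
    """Clamp code lengths exceeding `limit`, then lengthen the longest
    under-limit codes to `limit` until the Kraft sum fits again."""
    out = {s: min(l, limit) for s, l in lengths.items()}
    kraft = sum(2.0 ** -l for l in out.values())
    # select under-limit symbols in descending code length order
    for sym in sorted((s for s in out if out[s] < limit),
                      key=lambda s: out[s], reverse=True):
        if kraft <= 1.0:
            break
        kraft -= 2.0 ** -out[sym]
        out[sym] = limit
        kraft += 2.0 ** -limit
    return out
```

With an upper limit of at least ceil(log2(number of symbols)) a valid assignment always exists; below that, no prefix code can fit and the loop exits with the Kraft sum still over 1.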

Methods, systems, and computer readable media for adaptive metadata architecture

Methods, systems, and computer readable media for using variable metadata tags. A method occurs at a metadata processing system for enforcing security policies in a processor architecture. The method comprises: receiving, at the metadata processing system, a tag associated with a word in memory, wherein the tag indicates a memory location containing metadata associated with the word and wherein the length of the tag is determined at least in part using tag usage frequency; obtaining the metadata from the memory location; and determining, using the metadata, whether the word or a related instruction violates a security policy.
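One illustrative reading of frequency-dependent tag lengths: the most-used metadata values get short tags and rarer values get longer ones, with the high bit of the first byte distinguishing the two widths. The encoding below is a hypothetical sketch, not the claimed architecture:

```python
from collections import Counter

def assign_tags(usage: Counter) -> dict:
    """Frequent metadata values get 1-byte tags (high bit clear);
    the rest get 2-byte tags (high bit set)."""
    tags = {}
    for rank, (meta, _) in enumerate(usage.most_common()):
        if rank < 128:
            tags[meta] = bytes([rank])                        # short tag
        else:
            n = rank - 128
            tags[meta] = bytes([0x80 | (n >> 8), n & 0xFF])   # long tag
    return tags
```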

SYSTEM AND METHOD FOR DATA COMPRESSION USING GENOMIC ENCRYPTION TECHNIQUES

A system and method for data compression with genomic encryption, which uses frequency analysis on data blocks within an input data stream to produce a prefix table, representing a first layer of transformation, and which applies a Burrows-Wheeler transform (BWT) to the data inside the prefix table, representing a second layer of transformation, and which compresses the transformed data. In some implementations, the system and method may further include applying the BWT to a conditioned stream of genomic data, wherein the conditioned stream of genomic data is accompanied by an error stream comprising the differences between the original data and the encrypted data.
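The second transformation layer, the Burrows-Wheeler transform, can be illustrated with the textbook O(n² log n) construction; the patent does not specify an implementation, and a sentinel byte absent from the input is assumed:

```python
def bwt(data: bytes, sentinel: bytes = b"\x00") -> bytes:
    """Naive Burrows-Wheeler transform: last column of sorted rotations.
    Assumes the sentinel byte does not occur in the input."""
    s = data + sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return bytes(r[-1] for r in rotations)

def ibwt(last: bytes, sentinel: bytes = b"\x00") -> bytes:
    """Invert the BWT by repeatedly prepending the last column and sorting."""
    table = [b""] * len(last)
    for _ in range(len(last)):
        table = sorted(bytes([last[i]]) + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith(sentinel))
    return row[:-1]
```

The transform groups similar contexts together, which is what makes the subsequent compression stage effective on genomic data with many repeated motifs.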

SYSTEM AND METHOD FOR DATA COMPACTION AND SECURITY WITH EXTENDED FUNCTIONALITY

A system and method for highly efficient encoding of data that includes extended functionality for asymmetric encoding/decoding and network policy enforcement. In the case of asymmetric encoding/decoding, the original data is encoded by an encoder according to a codebook and sent to a decoder, but the output of the decoder depends on data manipulation rules applied at the decoding stage to transform the decoded data into a different data set from the original data. In the case of network policy enforcement, a behavior appendix is inserted into the codebook, such that the encoder and/or decoder at each node of the network complies with network behavioral rules, limits, and policies during encoding and decoding.

Information processing apparatus and information processing method

The present disclosure relates to an information processing apparatus and an information processing method that are capable of distributing higher-quality G-PCC streams. When G-PCC streams are generated by encoding Point Cloud data according to G-PCC, spatial position information is also generated, indicating the spatial positions of the respective pieces of partial Point Cloud data that represent the individual parts into which the Point Cloud data is segmented, together with grouping information that groups the partial Point Cloud data. The present technology is applicable, for example, to a generating apparatus that generates G-PCC streams.
