H03M7/6076

Data compression device, memory system and method

According to one embodiment, a data compression device includes a dictionary match determination unit, an extended matching generator, a match selector and a match connector. The dictionary match determination unit searches for first past input data matching first new input data. The extended matching generator compares second past input data subsequent to the first past input data with second new input data subsequent to the first new input data. The match selector generates compressed data by replacing a part of the input data with match information output from the dictionary match determination unit or the extended matching generator. The match connector replaces a plurality of pieces of match information in the compressed data with a single piece of match information.
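The pipeline described here is LZ77-like: find a past match, extend it forward, and substitute match information for the matched input. A minimal sketch with illustrative names, working at byte granularity (the patent's unit sizes and the match-connector merging step are not modeled):

```python
def compress(data: bytes, min_match: int = 3, window: int = 255):
    """Toy LZ77-style compressor: literal tokens plus (offset, length) matches."""
    tokens, i, n = [], 0, len(data)
    while i < n:
        best_len, best_off = 0, 0
        # Dictionary match determination: search past input for a first match,
        # then extended matching: compare the data that follows that match.
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < n and j + k < i and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= min_match:
            # Match selector: replace this part of the input with match info.
            tokens.append(("match", best_off, best_len))
            i += best_len
        else:
            tokens.append(("lit", data[i]))
            i += 1
    return tokens

def decompress(tokens) -> bytes:
    out = bytearray()
    for t in tokens:
        if t[0] == "lit":
            out.append(t[1])
        else:
            _, off, length = t
            for _ in range(length):
                out.append(out[-off])
    return bytes(out)
```

A match connector in this setting would additionally merge adjacent match tokens that reference contiguous past data into one longer token.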

SYSTEM AND METHOD FOR DATA COMPACTION UTILIZING MISMATCH PROBABILITY ESTIMATION

A system and method for encoding data utilizing mismatch probability estimates. A training data set can be statistically analyzed to calculate a mismatch probability estimate, which is the estimated frequency at which a data packet received during system runtime is not part of the training data set (i.e., a mismatch). A plurality of tokens may be created, based on the mismatch probability estimate, to represent potential mismatched data that may be encountered during runtime, and an entropy encoder may generate codewords for the tokens using the mismatch probability estimate. An opcode, indicating a mismatch, may be generated and appended to the generated codewords to form a mismatch codeword. During runtime when a mismatch occurs, the system can retrieve a mismatch codeword and assign it to the mismatched data, making the encoder system robust against previously unencountered data.
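One way to realize this is a Huffman codebook built over the training tokens with probability mass reserved for an escape (mismatch) token; at runtime, unseen packets are emitted as the mismatch codeword followed by raw bytes. A hedged sketch, with invented names and a token-level model standing in for the patent's entropy encoder:

```python
import heapq
from collections import Counter

def build_codebook(training_packets, mismatch_prob: float = 0.01):
    """Huffman codebook over training tokens plus a reserved mismatch token."""
    counts = Counter(training_packets)
    total = sum(counts.values())
    # Scale observed frequencies so the reserved escape mass equals the
    # estimated probability of a packet that is not in the training set.
    weights = {tok: (1 - mismatch_prob) * c / total for tok, c in counts.items()}
    weights["__MISMATCH__"] = mismatch_prob
    # Standard Huffman construction over [weight, [token, code], ...] nodes.
    heap = [[w, [tok, ""]] for tok, w in sorted(weights.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(map(tuple, heap[0][1:]))

def encode(packet: str, codebook) -> str:
    if packet in codebook:
        return codebook[packet]
    # Runtime mismatch: the mismatch codeword (opcode) prefixes a raw
    # 8-bit-per-byte encoding of the previously unencountered packet.
    return codebook["__MISMATCH__"] + "".join(format(b, "08b") for b in packet.encode())
```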

System and method for data compaction and security using multiple encoding algorithms

A system and method for encoding data using a plurality of encoding libraries. Portions of the data are encoded by different encoding libraries, depending on which library provides the greatest compaction for a given portion of the data. This methodology not only provides substantial improvements in data compaction over use of a single data compaction algorithm with the highest average compaction, but provides substantial additional security in that multiple decoding libraries must be used to decode the data. In some embodiments, each portion of data may further be encoded using different sourceblock sizes, providing further security enhancements as decoding requires multiple decoding libraries and knowledge of the sourceblock size used for each portion of the data. In some embodiments, encoding libraries may be randomly or pseudo-randomly rotated to provide additional security.
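The per-portion selection can be illustrated with standard-library compressors standing in for the patent's encoding libraries; the sourceblock-size variation and library rotation are not modeled here:

```python
import bz2
import lzma
import zlib

# Standard-library compressors stand in for the patent's encoding libraries.
LIBRARIES = {0: zlib.compress, 1: bz2.compress, 2: lzma.compress}
DECODERS = {0: zlib.decompress, 1: bz2.decompress, 2: lzma.decompress}

def encode_portions(data: bytes, portion_size: int = 4096):
    """Encode each portion with whichever library compacts it the most."""
    encoded = []
    for i in range(0, len(data), portion_size):
        portion = data[i:i + portion_size]
        lib_id, blob = min(
            ((lid, enc(portion)) for lid, enc in LIBRARIES.items()),
            key=lambda pair: len(pair[1]),
        )
        encoded.append((lib_id, blob))  # the decoder must know which library
    return encoded

def decode_portions(encoded) -> bytes:
    return b"".join(DECODERS[lid](blob) for lid, blob in encoded)
```

Note how decoding requires every library used during encoding, which is the source of the security argument in the abstract.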

Hybrid data reduction
11157189 · 2021-10-26

An information handling system may include at least one processor and a memory coupled to the at least one processor. The information handling system may be configured to receive data comprising a plurality of data chunks; perform deduplication on the plurality of data chunks to produce a plurality of unique data chunks; determine a compression ratio for respective pairs of the unique data chunks; determine a desired compression order for the plurality of unique data chunks based on the compression ratios; combine the plurality of unique data chunks in the desired compression order; and perform data compression on the combined plurality of unique data chunks.
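The dedup-then-order-then-compress sequence can be sketched as follows; the greedy ordering heuristic is one plausible reading of "desired compression order", since the abstract does not fix the rule:

```python
import zlib

def dedupe(chunks):
    """Deduplicate the incoming chunks while preserving first-seen order."""
    seen, unique = set(), []
    for chunk in chunks:
        if chunk not in seen:
            seen.add(chunk)
            unique.append(chunk)
    return unique

def pair_ratio(a: bytes, b: bytes) -> float:
    """Compression ratio achieved when two chunks are compressed adjacently."""
    return len(a + b) / len(zlib.compress(a + b))

def hybrid_reduce(chunks) -> bytes:
    unique = dedupe(chunks)
    # Greedy ordering: repeatedly append the chunk whose pairwise
    # compression ratio with the current tail is highest.
    ordered, remaining = unique[:1], unique[1:]
    while remaining:
        nxt = max(remaining, key=lambda c: pair_ratio(ordered[-1], c))
        remaining.remove(nxt)
        ordered.append(nxt)
    # Finally compress the combined unique chunks in that order.
    return zlib.compress(b"".join(ordered))
```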

Cognitive compression with varying structural granularities in NoSQL databases

Cognitive compression with varying structural granularities is provided in a NoSQL database by establishing a data training set for compressing and decompressing data stored within the NoSQL database. The data training set includes received user policy goals, compression parameters, and metered feedback associated with data usage and workload characteristics. A compression parameter model is dynamically implemented in real-time for the data selected according to the established data training set to compress and decompress the data at a given structural granularity.
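The parameter model's role can be illustrated with a toy decision rule; the goal names, feedback fields, and thresholds below are invented for illustration and not taken from the patent:

```python
def choose_compression(policy_goal: str, feedback: dict):
    """Toy 'compression parameter model': map a user policy goal plus
    metered workload feedback to a (structural granularity, level) pair."""
    hot = feedback.get("reads_per_s", 0) > 1000
    if policy_goal == "min_latency" or hot:
        return ("field", 1)      # fine granularity, fast level for hot data
    return ("document", 9)       # coarse granularity, maximum compression
```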

METHODS AND DEVICES FOR ON-THE-FLY CODER MAPPING UPDATES IN POINT CLOUD CODING
20210167795 · 2021-06-03

Methods and systems for encoding and decoding data, such as point cloud data. The methods may include using a coder map to map a range of discrete dependency states to a smaller set of binary coders, each having an associated coding probability. The selection of one of the discrete dependency states may be based on contextual or situational factors, which may include a prediction process, for a particular symbol, such as an occupancy bit. The coder map is updated after each symbol is coded, possibly altering which binary coder the selected discrete dependency state maps to.
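The many-states-to-few-coders mapping with a per-symbol update can be sketched as below; the coder probabilities, state count, and re-mapping rule are illustrative assumptions, and the cost returned is the ideal code length rather than actual arithmetic-coder output:

```python
import math

class BinaryCoder:
    def __init__(self, p_one: float):
        self.p_one = p_one  # the coder's associated coding probability

# Many discrete dependency states share a small set of binary coders.
CODERS = [BinaryCoder(p) for p in (0.1, 0.3, 0.5, 0.7, 0.9)]
NUM_STATES = 64
coder_map = [2] * NUM_STATES    # every state starts at the p=0.5 coder
ones = [1] * NUM_STATES         # Laplace-smoothed per-state bit counters
totals = [2] * NUM_STATES

def code_bit(state: int, bit: int) -> float:
    """Return the ideal code length for this bit, then update the coder map."""
    p = CODERS[coder_map[state]].p_one
    cost = -math.log2(p if bit else 1 - p)
    # On-the-fly update: re-map the state to the coder whose probability
    # best matches the state's observed statistics so far.
    ones[state] += bit
    totals[state] += 1
    observed = ones[state] / totals[state]
    coder_map[state] = min(range(len(CODERS)),
                           key=lambda i: abs(CODERS[i].p_one - observed))
    return cost
```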

METHOD, ELECTRONIC DEVICE AND COMPUTER PROGRAM PRODUCT FOR PROCESSING DATA
20210135683 · 2021-05-06

Embodiments of the present disclosure provide a method, electronic device and computer program product for processing data. The method comprises determining target data used to determine a target compression level for a user. The method also comprises compressing at least part of the target data using a plurality of compression levels of a compression algorithm, respectively, to obtain a plurality of compression ratios and a plurality of compression latencies corresponding to the plurality of compression levels. The method further comprises determining the target compression level for compressing the user's data based on the plurality of compression ratios and the plurality of compression latencies.
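The level sweep can be sketched with `zlib` standing in for the compression algorithm; the latency budget is an illustrative selection policy, since the abstract leaves the ratio/latency trade-off rule open:

```python
import time
import zlib

def pick_level(sample: bytes, latency_budget_s: float = 0.05) -> int:
    """Try every level of one algorithm on the target data, recording the
    compression ratio and latency at each, then pick the best ratio
    among the levels that fit within the latency budget."""
    results = []
    for level in range(1, 10):
        start = time.perf_counter()
        compressed = zlib.compress(sample, level)
        latency = time.perf_counter() - start
        results.append((level, len(sample) / len(compressed), latency))
    within = [r for r in results if r[2] <= latency_budget_s] or results
    return max(within, key=lambda r: r[1])[0]
```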

SEARCHING COMPRESSION PROFILES FOR TRAINED NEURAL NETWORKS

Compression profiles may be searched for trained neural networks. An iterative compression profile search may be performed in response to a search request. Different prospective compression profiles may be generated for trained neural networks according to a search policy. Performance of compressed versions of the trained neural networks according to the compression profiles may be tracked. The search policy may be updated according to an evaluation of the performance of the compression profiles for the compressed versions of the trained neural networks using compression performance criteria. When a search criterion is satisfied, a result for the compression profile search may be provided.
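The generate/evaluate/update loop can be sketched generically; here a "profile" is just per-layer pruning ratios, and `evaluate` is a stub standing in for compressing the real network and benchmarking it:

```python
import random

def search_profiles(layers, evaluate, iterations: int = 25, seed: int = 0):
    """Iterative compression-profile search with a trivial search policy.
    All policy details (uniform sampling, spread shrinking) are invented
    for illustration."""
    rng = random.Random(seed)
    best_profile, best_score = None, float("-inf")
    spread = 0.5                       # search-policy parameter
    for _ in range(iterations):
        # Generate a prospective profile according to the current policy.
        profile = {layer: round(rng.uniform(0.0, spread), 2) for layer in layers}
        score = evaluate(profile)      # track compressed-network performance
        if score > best_score:
            best_profile, best_score = profile, score
        else:
            # Update the policy from the evaluation: narrow the search.
            spread = max(0.1, spread * 0.9)
    return best_profile, best_score    # result once the search budget is met
```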

NESTED ENTROPY ENCODING
20210044801 · 2021-02-11

Methods and systems for improving the encoding and decoding efficiency of video by providing a syntax modeler, a buffer, and a decoder. The syntax modeler may associate a first sequence of symbols with syntax elements. The buffer may store tables, each represented by a symbol in the first sequence, and each used to associate a respective symbol in a second sequence of symbols with encoded data. The decoder decodes the data into a bitstream using the second sequence retrieved from a table.
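The nesting can be shown with a toy lookup: symbols of the first sequence select tables, and each selected table decodes the matching symbol of the second sequence. Table names and codes below are invented for illustration:

```python
# Each first-sequence symbol names a table; each table maps a coded
# second-sequence symbol to its decoded value.
TABLES = {
    "T0": {"00": "a", "01": "b", "10": "c"},
    "T1": {"0": "x", "1": "y"},
}

def decode(first_sequence, second_sequence) -> str:
    return "".join(TABLES[table][code]
                   for table, code in zip(first_sequence, second_sequence))
```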

HYBRID DATA REDUCTION
20210011644 · 2021-01-14

An information handling system may include at least one processor and a memory coupled to the at least one processor. The information handling system may be configured to receive data comprising a plurality of data chunks; perform deduplication on the plurality of data chunks to produce a plurality of unique data chunks; determine a compression ratio for respective pairs of the unique data chunks; determine a desired compression order for the plurality of unique data chunks based on the compression ratios; combine the plurality of unique data chunks in the desired compression order; and perform data compression on the combined plurality of unique data chunks.