H03M7/4062

Searching compression profiles for trained neural networks

Compression profiles may be searched for trained neural networks. An iterative compression profile search may be performed in response to a search request. Different prospective compression profiles may be generated for trained neural networks according to a search policy. Performance of versions of the trained neural networks compressed according to the compression profiles may be tracked. The search policy may be updated according to an evaluation of the performance of the compression profiles for the compressed versions of the trained neural networks using compression performance criteria. When a search criterion is satisfied, a result for the compression profile search may be provided.
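A toy sketch of the search loop described above. The weight representation, quantization-step profiles, loss proxy, and the random search policy are all illustrative assumptions, not the patent's method:

```python
import random
import zlib

# Toy model: a "trained network" is a list of weight byte strings, and a
# compression profile assigns each layer a quantization step size.
def evaluate(layers, profile):
    size = loss = 0
    for weights, step in zip(layers, profile):
        quantized = bytes(b - (b % step) for b in weights)
        size += len(zlib.compress(quantized))          # compressed size
        loss += sum(b % step for b in weights)         # accuracy-loss proxy
    return size, loss

def search_profiles(layers, rounds=50, max_loss=2000):
    # Search policy: a lossless baseline plus random proposals; the search
    # criterion is the round budget, the performance criterion is max_loss.
    candidates = [[1] * len(layers)]
    candidates += [[random.choice([1, 2, 4, 8]) for _ in layers]
                   for _ in range(rounds)]
    best = None
    for profile in candidates:
        size, loss = evaluate(layers, profile)         # track performance
        if loss <= max_loss and (best is None or size < best[1]):
            best = (profile, size, loss)
    return best                                        # search result

random.seed(0)
layers = [bytes(random.randrange(256) for _ in range(512)) for _ in range(3)]
best = search_profiles(layers)
print(best)
```

A real search policy would adapt its proposals from the tracked evaluations rather than drawing them uniformly at random.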

REORDERING DATASETS IN A TABLE FOR INCREASED COMPRESSION RATIO
20230092510 · 2023-03-23 ·

Tables are selected for compression by threshold statistical values. Identified tables are reordered according to the fields having the lowest cardinality, to increase the size of the character strings replaced by keys during compression. Field locations are mapped between the original table and the reordered table, and dictionary-based compression is performed on the reordered tables.
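One reading of this abstract, sketched below: columns are reordered so the lowest-cardinality (most repetitive) fields come first, a field-location map is kept, and values are dictionary-encoded. The table contents and key format are invented for illustration:

```python
# Order fields by ascending cardinality, record the field-location mapping,
# then replace repeated strings with small integer keys.
def reorder_and_compress(table):
    cols = list(zip(*table))                       # column-wise view
    order = sorted(range(len(cols)), key=lambda i: len(set(cols[i])))
    mapping = {new: old for new, old in enumerate(order)}   # field-location map
    reordered = [[row[i] for i in order] for row in table]
    dictionary, encoded = {}, []
    for row in reordered:
        encoded.append([dictionary.setdefault(v, len(dictionary)) for v in row])
    return encoded, dictionary, mapping

table = [
    ["alice", "US", "gold"],
    ["bob",   "US", "gold"],
    ["carol", "US", "silver"],
]
encoded, dictionary, mapping = reorder_and_compress(table)
print(encoded)     # repeated strings share one key
print(mapping)     # maps reordered positions back to original fields
```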

PARTITIONAL DATA COMPRESSION
20230370086 · 2023-11-16 ·

A system collects statistical data for a data page, divides the data page into parts, and analyzes the data page and the statistical data, based on the compression efficiency of one or more compression methods for each part, to determine a compression method for each part of the page. Based on the analysis, the system compresses the parts of the data page.
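A minimal sketch of the per-part selection, assuming the candidate methods are stock codecs and the per-part statistic is the number of distinct bytes; the part size and page contents are illustrative:

```python
import bz2
import zlib

# Candidate compression methods for each page part.
CODECS = {"zlib": zlib.compress, "bz2": bz2.compress}

def compress_page(page, part_size=64):
    parts = [page[i:i + part_size] for i in range(0, len(page), part_size)]
    out = []
    for part in parts:
        stats = len(set(part))                    # per-part statistic
        trials = {name: fn(part) for name, fn in CODECS.items()}
        name = min(trials, key=lambda n: len(trials[n]))  # best method wins
        out.append((name, stats, trials[name]))   # chosen method per part
    return out

page = b"A" * 64 + bytes(range(64))               # repetitive part + varied part
parts = compress_page(page)
for name, stats, blob in parts:
    print(name, stats, len(blob))
```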

COOPERATIVE COMPRESSION IN DISTRIBUTED DATABASES
20230344446 · 2023-10-26 ·

In various embodiments, a computer-implemented method manages use of a shared compression dictionary in a distributed database environment. The method includes determining that a given version of the shared compression dictionary should be designated as a current primary version of the shared compression dictionary. The method also includes receiving, from a client device, first write data compressed with a previous primary version of the shared compression dictionary and, in response to receiving the first write data, transmitting, to the client device, the current primary version of the shared compression dictionary and an instruction to compress new write data with the current primary version of the shared compression dictionary. Additionally, the method includes receiving, from the client device, second write data compressed with the current primary version of the shared compression dictionary and storing the second write data in a database.
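The version handshake can be sketched with zlib preset dictionaries standing in for the shared compression dictionary. The dictionary contents, message names, and class layout are invented for illustration:

```python
import zlib

class DictServer:
    def __init__(self):
        # Dictionary versions; version 2 is designated the current primary.
        self.versions = {1: b"common,fields,", 2: b"common,fields,tags,"}
        self.current = 2

    def handle_write(self, version, blob):
        if version != self.current:
            # Stale dictionary: return the current primary version and an
            # instruction to recompress new write data with it.
            return ("RECOMPRESS", self.current, self.versions[self.current])
        d = zlib.decompressobj(zdict=self.versions[version])
        self.stored = d.decompress(blob) + d.flush()   # store write data
        return ("OK", version, None)

def client_compress(payload, zdict):
    c = zlib.compressobj(zdict=zdict)
    return c.compress(payload) + c.flush()

server = DictServer()
payload = b"common,fields,hello"
# First write uses the previous primary version (1) and is rejected.
status, ver, zdict = server.handle_write(1, client_compress(payload, b"common,fields,"))
print(status, ver)
# The client recompresses with the current primary version and retries.
status, _, _ = server.handle_write(ver, client_compress(payload, zdict))
print(status, server.stored)
```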

Memory system

A memory system includes a storage device and a memory controller. The memory controller includes an encoder and a decoder. The encoder includes a first code table updating section configured to update an encoding code table and an encoding flow controlling section configured to control input to the first code table updating section by using a first data amount indicating a data amount of an input symbol. The first data amount is calculated based on the input symbol. The decoder includes a second code table updating section configured to update a decoding code table and a decoding flow controlling section configured to control input to the second code table updating section by using a second data amount indicating a data amount of an output symbol. The second data amount is calculated based on the output symbol in the same way as the calculation of the first data amount.
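The key property is that both sides compute the data amount the same way, so their code tables evolve identically. A software sketch with an invented size metric and threshold, standing in for the hardware sections:

```python
# Both sides keep a frequency-based code table and only fold a symbol into it
# once the accumulated "data amount" crosses a threshold; the data amount is
# computed identically from the symbol on each side, keeping tables in sync.
class CodeTable:
    def __init__(self, threshold=4):
        self.freq = {}
        self.pending = 0                     # accumulated data amount
        self.threshold = threshold

    def data_amount(self, symbol):
        return symbol.bit_length() or 1      # illustrative size metric

    def maybe_update(self, symbol):
        self.pending += self.data_amount(symbol)
        if self.pending >= self.threshold:   # flow-control gate
            self.freq[symbol] = self.freq.get(symbol, 0) + 1
            self.pending = 0

enc, dec = CodeTable(), CodeTable()
stream = [1, 7, 3, 255, 0, 7]
for s in stream:
    enc.maybe_update(s)                      # encoder side: input symbols
for s in stream:
    dec.maybe_update(s)                      # decoder side: output symbols
print(enc.freq == dec.freq, enc.freq)
```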

Dynamic Dictionary-Based Network Compression
20230006690 · 2023-01-05 ·

Methods and systems for providing dynamic dictionary-based compression and decompression are described herein. A computing device may receive, during a currently running session with a client device, a plurality of messages. The computing device may determine, based on the plurality of messages, one or more frames. The computing device may determine, based on the one or more frames, data samples. The computing device may compress the one or more frames based on a compression dictionary. The computing device may train, during the currently running session, the compression dictionary based on the determined data samples, to create a new compression dictionary. The computing device may determine, during the currently running session and based on receiving additional messages, one or more additional frames. In addition, the computing device may compress the one or more additional frames based on the new compression dictionary.
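The mid-session retraining flow can be sketched with zlib preset dictionaries. The frames, tokenization, and the naive dictionary "trainer" below are stand-ins, not the described implementation:

```python
import zlib
from collections import Counter

def compress_frame(frame, zdict):
    c = zlib.compressobj(zdict=zdict) if zdict else zlib.compressobj()
    return c.compress(frame) + c.flush()

def train_dictionary(samples, max_size=64):
    # Naive stand-in for a trainer: concatenate the most frequent samples.
    common = [tok for tok, _ in Counter(samples).most_common()]
    return b" ".join(common)[:max_size]

frames = [b"GET /index.html", b"GET /style.css", b"GET /index.html"]
session_dict, samples = b"", []
first_pass = []
for frame in frames:                          # currently running session
    first_pass.append(compress_frame(frame, session_dict))
    samples.extend(frame.split())             # determine data samples
session_dict = train_dictionary(samples)      # train during the session
second_pass = [compress_frame(f, session_dict) for f in frames]
print(sum(map(len, first_pass)), "->", sum(map(len, second_pass)))
```

Later frames back-reference the trained dictionary, so the second pass compresses smaller than the dictionary-less first pass.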

Methods and Apparatus for Compressing Data Streams

Methods and apparatus for compressing data streams. In an embodiment, a method includes calculating a decomposition of matrix data to generate eigenvectors and associated eigenvalues, determining clusters of the eigenvectors based on weighting the eigenvalues, calculating an eigenvector centroid for each cluster so that a dictionary of centroids is generated, and tagging the eigenvectors with tags, respectively, that identify an associated eigenvector centroid for each eigenvector. The method also includes counting a number of eigenvectors associated with each eigenvector centroid to construct a probability distribution function (PDF) of centroids, matching the PDF of centroids to PDF templates to determine a closest matching PDF template, determining an encoder corresponding to the closest matching PDF template wherein a corresponding encoder identifier is identified, encoding the tags with the encoder to generate an encoded data stream, and transmitting the encoded data stream, the encoder identifier, the dictionary of centroids, and the eigenvalues.
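The tagging and template-matching stages can be sketched as follows, assuming the eigendecomposition has already produced eigenvectors; the centroids, PDF templates, and encoder names are toy values, not the patented encoders:

```python
# Tag each eigenvector with its nearest centroid, build the PDF of centroid
# usage, then pick the encoder whose PDF template matches most closely.
def nearest(vec, centroids):
    return min(range(len(centroids)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vec, centroids[i])))

eigenvectors = [(1.0, 0.1), (0.9, 0.0), (0.0, 1.0), (0.1, 0.9), (1.0, 0.2)]
centroids = [(0.95, 0.1), (0.05, 0.95)]       # dictionary of centroids
tags = [nearest(v, centroids) for v in eigenvectors]

counts = [tags.count(i) for i in range(len(centroids))]
pdf = [c / len(tags) for c in counts]         # PDF of centroids
templates = {"skewed": [0.8, 0.2], "balanced": [0.5, 0.5]}  # per-encoder PDFs
encoder_id = min(templates,                   # closest matching PDF template
                 key=lambda t: sum(abs(p - q) for p, q in zip(pdf, templates[t])))
print(tags, pdf, encoder_id)
```

The transmitted stream would then carry the encoded tags plus the encoder identifier, the dictionary of centroids, and the eigenvalues, as the abstract lists.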

SEMI-SORTING COMPRESSION WITH ENCODING AND DECODING TABLES

A data processing platform, method, and program product perform compression and decompression of a set of data items. Suffix data and a prefix are selected for each respective data item in the set of data items based on data content of the respective data item. The set of data items is sorted based on the prefixes. The prefixes are encoded by querying multiple encoding tables to create a code word containing compressed information representing values of all prefixes for the set of data items. The code word and suffix data for each of the data items are stored in memory. The code word is decompressed to recover the prefixes. The recovered prefixes are paired with their respective suffix data.
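A simplified sketch of the prefix/suffix split: here 8-bit items use the top 2 bits as the prefix, and the sorted prefix multiset is packed into one code word as per-prefix counts, a stand-in for the patent's multiple encoding tables:

```python
PREFIX_BITS, COUNT_BITS = 2, 4                # illustrative field widths

def compress(items):
    pairs = sorted((x >> 6, x & 0x3F) for x in items)   # sort by prefix
    counts = [0] * (1 << PREFIX_BITS)
    for p, _ in pairs:
        counts[p] += 1
    word = 0
    for c in reversed(counts):                # pack counts into one code word
        word = (word << COUNT_BITS) | c
    return word, [s for _, s in pairs]        # code word + per-item suffix data

def decompress(word, suffixes):
    prefixes = []
    for p in range(1 << PREFIX_BITS):         # recover prefixes in sorted order
        c = (word >> (p * COUNT_BITS)) & ((1 << COUNT_BITS) - 1)
        prefixes += [p] * c
    # Pair recovered prefixes with their respective suffix data.
    return sorted((p << 6) | s for p, s in zip(prefixes, suffixes))

items = [0x12, 0xC4, 0x51, 0x17, 0xC0]
word, suffixes = compress(items)
print(hex(word), suffixes)
```

Because the items were sorted by prefix before packing, the recovered prefix sequence lines up with the stored suffixes and the original multiset of items comes back exactly.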

Feature dictionary for bandwidth enhancement

A system has multiple devices that can host different versions of an artificial neural network (ANN) as well as different versions of a feature dictionary. In the system, encoded inputs for the ANN can be decoded by the feature dictionary, which allows encoded input to be sent to a master version of the ANN over a network instead of the original version of the input, which usually includes more data than the encoded input. Thus, using the feature dictionary for training of a master ANN can reduce data transmission.
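The bandwidth saving can be sketched as follows: an edge device sends a dictionary index ("encoded input") instead of the full feature vector, and the master side decodes it via its copy of the feature dictionary. The vectors and sizes are illustrative assumptions:

```python
# Shared feature dictionary: both the edge device and the master ANN host
# hold a copy, so only small indices need to cross the network.
feature_dictionary = {
    0: [0.0, 0.1, 0.9, 0.2],
    1: [0.8, 0.7, 0.1, 0.0],
}
reverse = {tuple(v): k for k, v in feature_dictionary.items()}

def encode(features):                         # edge-device side
    return reverse[tuple(features)]

def decode(code):                             # master-ANN side
    return feature_dictionary[code]

raw = [0.8, 0.7, 0.1, 0.0]
code = encode(raw)                            # one index crosses the network
print(code, "instead of", len(raw), "floats")
```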