H03M7/6035

Parameter update method for entropy encoding and decoding of transform coefficient levels, and entropy encoding device and entropy decoding device for transform coefficient levels using the same

A video decoding apparatus includes a parser that obtains bit strings corresponding to current transform coefficient level information by arithmetically decoding a bitstream based on a context model; a parameter determiner that determines a current binarization parameter by updating or maintaining a previous binarization parameter based on a comparison between a threshold and the magnitude of a previous transform coefficient; and a syntax element restorer that obtains the current transform coefficient level information by de-binarizing the bit strings using the determined current binarization parameter and generates the magnitude of a current transform coefficient from that information. The current binarization parameter has a value equal to or smaller than a predetermined value.
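The abstract does not name the binarization, but the described parameter update matches the Golomb-Rice scheme used for coefficient level remainders in standards such as HEVC. A minimal sketch, assuming a Golomb-Rice binarization, an HEVC-style threshold of the form 3·2^k, and a cap of 4 (the threshold form and cap are assumptions, not taken from the abstract):

```python
def update_rice_param(prev_param: int, prev_level: int, max_param: int = 4) -> int:
    """Update or maintain the binarization parameter by comparing the
    previous coefficient magnitude against a threshold (the 3 * 2^k form
    and the max_param cap are assumptions), never exceeding the cap."""
    if prev_level > (3 << prev_param):
        return min(prev_param + 1, max_param)
    return prev_param

def rice_binarize(value: int, k: int) -> str:
    """Golomb-Rice binarization: unary quotient prefix, then k-bit suffix."""
    q, r = value >> k, value & ((1 << k) - 1)
    suffix = format(r, f"0{k}b") if k else ""
    return "1" * q + "0" + suffix

def rice_debinarize(bits: str, k: int) -> int:
    """Invert rice_binarize: count the unary prefix, read the k-bit suffix."""
    q = bits.index("0")
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r
```

The decoder de-binarizes each level with the current parameter, then feeds the restored magnitude back into `update_rice_param` for the next coefficient.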

Hierarchical data compression and computation

According to embodiments of the present invention, machines, systems, methods, and computer program products for hierarchical compression of data are presented. A compression hierarchy of compression nodes is created, wherein each compression node is associated with a compression operation that produces compressed data. The output of any compression node may be compressed again by another compression node or by the same compression node. A path of one or more compression nodes through the compression hierarchy is determined based upon compression statistics, and the data is compressed by the compression nodes along that path. Various computational techniques are presented herein for manipulating the compression hierarchy to defer or reduce computation during query evaluation.
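A minimal sketch of path selection through such a hierarchy, using two standard-library compressors as example nodes and the observed compressed size as a stand-in for the richer compression statistics the abstract refers to (node names and the exhaustive search are illustrative assumptions):

```python
import bz2
import itertools
import zlib

# Each compression node is associated with one compression operation.
NODES = {
    "deflate": zlib.compress,
    "bzip2": bz2.compress,
}

def best_path(data: bytes, max_depth: int = 2):
    """Choose a path of nodes through the hierarchy by a simple statistic:
    the resulting compressed size. A node's output may feed another node
    or the same node again, as the abstract allows."""
    chosen, best_out = [], data
    for depth in range(1, max_depth + 1):
        for path in itertools.product(NODES, repeat=depth):
            out = data
            for name in path:
                out = NODES[name](out)
            if len(out) < len(best_out):
                chosen, best_out = list(path), out
    return chosen, best_out
```

In practice a system would consult accumulated statistics rather than re-compressing along every candidate path, which is what allows computation to be deferred at query time.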

System and method to dynamically abbreviate data
20250317155 · 2025-10-09 ·

A system comprises a memory communicatively coupled to at least one processor. The at least one processor is configured to receive a request to store received data, determine whether the received data comprises unstructured data, evaluate the unstructured data in accordance with a machine learning algorithm in response to that determination, and perform an analysis operation to identify datapoints in the received data. The datapoints may be a signature representation of the unstructured data. Further, the at least one processor may be configured to generate a roadmap for storing the datapoints and to store the datapoints following the roadmap. The roadmap is a plan to store the datapoints in the memory in accordance with one or more quantum random number generator (QRNG) operations.
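A hedged sketch of the pipeline: content hashes stand in for the ML-derived signature datapoints, and `os.urandom` stands in for QRNG output when building the storage roadmap (chunk size, slot count, and linear probing are all illustrative assumptions):

```python
import hashlib
import os

def signature_datapoints(text: str, chunk: int = 64) -> list:
    """Fingerprint fixed-size chunks of unstructured text; the hashes
    stand in for the signature representation an ML analysis would produce."""
    return [hashlib.sha256(text[i:i + chunk].encode()).hexdigest()
            for i in range(0, len(text), chunk)]

def build_roadmap(datapoints: list, n_slots: int) -> dict:
    """Plan where each datapoint is stored: a slot index drawn from random
    bytes (os.urandom standing in for a QRNG), with linear probing to
    resolve collisions."""
    roadmap, used = {}, set()
    for dp in datapoints:
        slot = int.from_bytes(os.urandom(4), "big") % n_slots
        while slot in used:            # probe to the next free slot
            slot = (slot + 1) % n_slots
        used.add(slot)
        roadmap[dp] = slot
    return roadmap
```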

System and method for encrypted data compression with a hardware management layer

A system and method for encrypted data compression with hardware management combines data compression techniques with real-time hardware optimization. The system receives an input data stream comprising data blocks, analyzes frequency distributions within the stream, and generates a conditioned data stream by identifying portions that deviate from target frequency distributions and applying conditioning rules. A hardware management layer continuously monitors system resource utilization, including processing latency, memory consumption, and field-programmable gate array (FPGA) resource usage, during data processing operations. When resource utilization exceeds performance thresholds, the system generates hardware optimization plans and implements configuration changes, including memory reallocation, processor adjustments, and FPGA resource management. The system applies the Burrows-Wheeler transform (BWT) to improve data compressibility and creates error streams through XOR operations between the original and conditioned data. The hardware management layer enables dynamic adaptation to changing processing demands while maintaining optimal system performance for data compression operations.
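The BWT and the XOR error-stream idea can be sketched briefly. The conditioning rule below (replace rare bytes with the most frequent byte) is a toy assumption; the key property from the abstract is that the XOR error stream makes the original exactly recoverable from the conditioned stream:

```python
from collections import Counter

def bwt(data: bytes) -> bytes:
    """Naive Burrows-Wheeler transform via sorted rotations; assumes the
    0x00 sentinel does not occur in the input."""
    s = data + b"\x00"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return bytes(r[-1] for r in rotations)

def condition(data: bytes, keep: int = 4):
    """Toy conditioning rule: bytes outside the `keep` most frequent
    values are replaced by the most frequent byte, steering the stream
    toward a target frequency distribution."""
    common = [b for b, _ in Counter(data).most_common(keep)]
    conditioned = bytes(b if b in common else common[0] for b in data)
    # The XOR error stream lets the original be recovered exactly.
    error = bytes(a ^ b for a, b in zip(data, conditioned))
    return conditioned, error
```

Recovering the original is `bytes(c ^ e for c, e in zip(conditioned, error))`; the conditioned stream and the (typically sparse) error stream each compress better than the raw input.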

Data encoding method, data decoding method, and data processing apparatus

This application relates to the field of artificial intelligence, and discloses a data encoding method, a data decoding method, and data processing apparatuses. Both the data encoding method and the data decoding method relate to an invertible flow-based model. The invertible flow-based model includes a target invertible flow layer, a model parameter of the target invertible flow layer is used to constrain an auxiliary variable generated during inverse transform processing, and an operation corresponding to the target invertible flow layer includes a multiplication operation and a division operation that are determined based on the model parameter. The auxiliary variable is either the increment added to the product of the multiplication operation or the remainder produced by the division operation.
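One way to read the multiplication/division pair with an auxiliary variable is an exact rational scaling: the forward pass emits the division remainder as the auxiliary variable, and the inverse pass adds it back as an increment to the product so the final division is exact. A sketch under that interpretation (the parameters `num`/`den` are hypothetical):

```python
def flow_forward(x: int, num: int, den: int):
    """Forward transform of a target invertible flow layer (sketch):
    multiply by the model parameter, divide, and emit the division
    remainder as the auxiliary variable."""
    t = x * num
    return t // den, t % den          # (y, auxiliary)

def flow_inverse(y: int, aux: int, num: int, den: int) -> int:
    """Inverse transform: the auxiliary variable is the increment added
    to the product y * den, which makes the division by num exact."""
    return (y * den + aux) // num
```

Because `y * den + aux == x * num` holds by construction, the inverse recovers `x` bit-exactly, which is what lossless flow-based coding requires.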

System and Method for Distributed Node-Based Data Compaction with Dyadic Distribution-Based Compression and Encryption
20250350297 · 2025-11-13 ·

A system and method for distributed node-based data compaction. The system uses machine learning on data chunks to generate codebooks that compact the data to be stored, processed, or sent with a smaller data profile than uncompacted data. The system applies data compaction within an existing blockchain fork, or within a new blockchain protocol, so that nodes that wish to or need to use the blockchain can do so with a reduced storage requirement. The system uses network data compaction across all nodes to increase the speed of and decrease the size of a blockchain's data packets. The system uses data compaction firmware to increase the efficiency at which mining rigs can computationally validate new blocks on the blockchain. The system can be implemented using any combination of the three data compaction services to meet the needs of the desired blockchain technology.
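A minimal sketch of codebook-based compaction, with a frequency count standing in for the machine-learned codebook (chunk size, codebook size, and the literal fallback are illustrative assumptions):

```python
from collections import Counter

def train_codebook(samples, chunk: int = 4, size: int = 16):
    """Learn a codebook of the most frequent fixed-size chunks; a simple
    frequency model stands in for the ML training in the abstract."""
    counts = Counter()
    for s in samples:
        for i in range(0, len(s) - chunk + 1, chunk):
            counts[s[i:i + chunk]] += 1
    return [c for c, _ in counts.most_common(size)]

def compact(data: bytes, codebook, chunk: int = 4):
    """Replace known chunks with codebook indices; keep literals otherwise."""
    tokens, i = [], 0
    while i < len(data):
        piece = data[i:i + chunk]
        if piece in codebook:
            tokens.append(("C", codebook.index(piece)))
            i += chunk
        else:
            tokens.append(("L", data[i:i + 1]))
            i += 1
    return tokens

def expand(tokens, codebook) -> bytes:
    """Invert compact() so nodes can validate against original payloads."""
    return b"".join(codebook[v] if t == "C" else v for t, v in tokens)
```

Shared codebooks are what make this effective network-wide: every node that holds the codebook can expand the compacted packets and blocks.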

System and method for data compression with homomorphic encryption

Data compression with homomorphic encryption enables secure storage of private information in a database along with searching and comparison of encrypted data within the database. A stream conditioning system optimizes the contents of received data for lossless compression by a data encoder that performs the lossless compression. An encrypted search engine encrypts the compressed data according to a homomorphic encryption scheme and stores the encrypted data in a database. Data queries are received and encrypted according to the homomorphic encryption scheme. The encrypted data query is compared against an encrypted element in the database and an encryption score is generated. The encryption score may be compared against a set of criteria to determine whether a match is found. Matched data may be returned to the requesting entity.
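A hedged sketch of the compress-encrypt-search flow. Deterministic HMAC equality tokens stand in for the homomorphic comparison (a real system would use an actual homomorphic scheme; the key and the 0/1 match scoring are illustrative assumptions):

```python
import hashlib
import hmac
import zlib

KEY = b"demo-key"  # hypothetical shared key

def store(db: dict, record: bytes) -> None:
    """Compress losslessly, then index the ciphertext under a
    deterministic HMAC token (stand-in for homomorphic encryption)."""
    token = hmac.new(KEY, record, hashlib.sha256).digest()
    db[token] = zlib.compress(record)

def search(db: dict, query: bytes):
    """Encrypt the query the same way and compare tokens: a hit is an
    exact-match score of 1, a miss scores 0; matched data is returned."""
    token = hmac.new(KEY, query, hashlib.sha256).digest()
    return zlib.decompress(db[token]) if token in db else None
```

The design point the stand-in preserves is that the server compares only encrypted values and never sees the plaintext query.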

System and method for off-chip data compression and decompression for machine learning networks

There is provided a system and method for compression and decompression of a data stream used by machine learning networks. The method includes: encoding each value in the data stream by determining a mapping to one of a plurality of non-overlapping ranges, each value being encoded as a symbol representative of its range plus a corresponding offset, and arithmetically coding the symbol using a probability count; storing a compressed data stream comprising the arithmetically coded symbols and the corresponding offsets; decoding the compressed data stream with arithmetic decoding using the probability count, the arithmetically decoded symbols being combined with the offset bits to arrive at a decoded data stream; and communicating the decoded data stream for use by the machine learning networks.
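The range/symbol/offset mapping can be sketched with power-of-two ranges, a common choice for this kind of scheme (the abstract does not fix the range boundaries, so this partitioning is an assumption; the arithmetic-coding stage over the symbols is elided):

```python
def encode_value(v: int):
    """Map a non-negative value to one of a set of non-overlapping
    ranges [2^(s-1), 2^s - 1]: the symbol is the range index (the bit
    length of v) and the offset locates v within the range. The symbols
    would then be arithmetically coded with adaptive probability counts."""
    if v == 0:
        return 0, 0, 0                 # symbol, offset, offset-bit count
    s = v.bit_length()
    return s, v - (1 << (s - 1)), s - 1

def decode_value(symbol: int, offset: int) -> int:
    """Recombine an arithmetically decoded symbol with its offset bits."""
    return 0 if symbol == 0 else (1 << (symbol - 1)) + offset
```

Only the symbols go through the entropy coder; the offsets are stored as raw bits, which keeps the probability model small while still capturing the skewed magnitude distribution typical of ML network traffic.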