Patent classifications
H03M7/405
Code table generation device, memory system, and code table generation method
According to one embodiment, a code table generation device includes a table generation unit, a merge unit and a tree generation unit. The table generation unit generates a frequency table including symbols and frequencies of occurrence respectively associated with the symbols, based on a frequency of occurrence for each of the input symbols. The merge unit acquires the top K symbols in descending order of frequency of occurrence and the remaining symbols, divides the remaining symbols into one or more symbol sets, and determines a frequency of occurrence associated with the root node of each of the subtrees corresponding to the respective symbol sets. The tree generation unit generates a Huffman tree using the K symbols and the root node of each of the subtrees.
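A minimal Python sketch of the described flow. Two details the abstract leaves open are assumed here: the remaining symbols are distributed round-robin into the symbol sets, and each subtree root carries its set's total frequency of occurrence.

```python
import heapq
from collections import Counter

def build_code_table(symbols, k, num_sets):
    # Frequency table: symbol -> occurrence count.
    freq = Counter(symbols)
    ranked = sorted(freq.items(), key=lambda kv: -kv[1])
    top, rest = ranked[:k], ranked[k:]

    # Divide the remaining symbols into num_sets symbol sets (round-robin
    # is an assumption); each subtree root gets its set's total frequency.
    sets = [rest[i::num_sets] for i in range(num_sets)]
    leaves = [(f, s) for s, f in top]
    leaves += [(sum(f for _, f in grp), ('subtree', i))
               for i, grp in enumerate(sets) if grp]

    # Standard Huffman construction over the K symbols + subtree roots;
    # the unique index nid breaks frequency ties deterministically.
    heap = [(f, i, [[sym, '']]) for i, (f, sym) in enumerate(leaves)]
    heapq.heapify(heap)
    nid = len(heap)
    if len(heap) == 1:
        return {heap[0][2][0][0]: '0'}
    while len(heap) > 1:
        f1, _, p1 = heapq.heappop(heap)
        f2, _, p2 = heapq.heappop(heap)
        for pair in p1:
            pair[1] = '0' + pair[1]
        for pair in p2:
            pair[1] = '1' + pair[1]
        heapq.heappush(heap, (f1 + f2, nid, p1 + p2))
        nid += 1
    return {sym: code for sym, code in heap[0][2]}
```

Symbols inside a set would then be coded beneath their subtree's root code, which the sketch leaves out.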
Method and apparatus for processing matrix data through relaxed pruning
A matrix data processing method, performed by a computing device that performs a matrix multiplication operation, includes, with respect to each of one or more elements included in a matrix: when the value of the element satisfies a designated condition, determining the element to be a don't-care element and determining an output value of the don't-care element; generating a bitstream based on the output value of the don't-care element and index values of valid elements included in the matrix; equally dividing the bitstream into a designated number of pieces; and generating a Huffman code corresponding to each of the plurality of lower bitstreams generated as a result of the equal division.
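The pruning and splitting steps can be sketched as follows. The designated condition is assumed here to be a magnitude threshold and the don't-care output value is assumed to be zero; both are illustrative choices, not fixed by the abstract.

```python
def relaxed_prune(matrix, threshold=0.1):
    """Mark elements whose magnitude is below the threshold as
    don't-care (output value 0.0) and record a validity bit per
    element; valid-element index bits feed the bitstream."""
    out, valid_bits = [], []
    for v in matrix:
        if abs(v) < threshold:      # designated condition (assumed)
            out.append(0.0)         # output value of don't-care element
            valid_bits.append('0')
        else:
            out.append(v)
            valid_bits.append('1')
    return out, ''.join(valid_bits)

def split_equally(bitstream, n):
    """Equally divide the bitstream into n lower bitstreams."""
    size = -(-len(bitstream) // n)  # ceiling division
    return [bitstream[i * size:(i + 1) * size] for i in range(n)]
```

Each lower bitstream returned by `split_equally` would then be Huffman-coded independently, per the abstract.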
Low complexity optimal parallel Huffman encoder and decoder
A memory device includes a memory; and at least one processor configured to: obtain a symbol stream including a plurality of symbols; determine a Huffman tree corresponding to the symbol stream, wherein each symbol of the plurality of symbols is assigned a corresponding prefix code from among a plurality of prefix codes based on the Huffman tree; generate a prefix length table based on the Huffman tree, wherein the prefix length table indicates a length of the corresponding prefix code for each symbol; generate a logarithm frequency table based on the prefix length table, wherein the logarithm frequency table indicates a logarithm of a frequency count for each symbol; generate a cumulative frequency table which indicates a cumulative frequency count corresponding to each symbol; generate a compressed bitstream by iteratively applying an encoding function to the plurality of symbols based on the logarithm frequency table and the cumulative frequency table; and store the compressed bitstream in the memory.
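One plausible reading of the table pipeline: each symbol's frequency is modeled as a power of two derived from its prefix length, and the encoding function is an rANS-style step over those tables. This is an interpretation for illustration; the abstract does not name the encoding function.

```python
def tables_from_prefix_lengths(lengths):
    """From a prefix-code length per symbol, derive a power-of-two
    frequency model: freq(s) = 2**(L - len(s)) with L the maximum
    length.  Returns the logarithm frequency table, the cumulative
    frequency table, and the total frequency."""
    L = max(lengths.values())
    log_freq = {s: L - l for s, l in lengths.items()}
    cum, total = {}, 0
    for s in sorted(lengths):
        cum[s] = total
        total += 1 << log_freq[s]
    return log_freq, cum, total

def encode_step(x, s, log_freq, cum, total):
    """One iteration of an rANS-style encoding function (assumed):
    folds symbol s into state x using the two tables."""
    f = 1 << log_freq[s]
    return (x // f) * total + cum[s] + (x % f)
```

Because the frequencies are exact powers of two, the divisions reduce to shifts, which is consistent with the "low complexity" framing of the title.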
Compressing device and method using parameters of quadtree method
A device configured to compress a tensor including a plurality of cells includes: a quadtree generator configured to generate a quadtree by searching for non-zero cells included in the tensor and to extract at least one parameter value from the quadtree; a mode selector configured to determine a compression mode based on the at least one parameter value; and a bitstream generator configured to generate a bitstream by compressing the tensor based on the compression mode.
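A rough sketch of quadtree generation over a 2^n x 2^n slice of the tensor. The parameter values extracted here (node count and non-zero leaf count) are illustrative stand-ins; the abstract does not specify which parameters the mode selector consumes.

```python
def quadtree_params(grid):
    """Recursively split a square power-of-two grid into quadrants,
    stopping at all-zero blocks; returns (node_count, nonzero_leaves)
    as example parameter values extracted from the quadtree."""
    def occupied(r0, c0, size):
        return any(grid[r][c] for r in range(r0, r0 + size)
                   for c in range(c0, c0 + size))

    def walk(r0, c0, size):
        if not occupied(r0, c0, size):
            return 1, 0                 # all-zero block: leaf node
        if size == 1:
            return 1, 1                 # single non-zero cell
        h = size // 2
        nodes, leaves = 1, 0
        for dr, dc in ((0, 0), (0, h), (h, 0), (h, h)):
            n, l = walk(r0 + dr, c0 + dc, h)
            nodes += n
            leaves += l
        return nodes, leaves

    return walk(0, 0, len(grid))
```

A mode selector could then, for instance, pick a sparse mode when the leaf count is low relative to the node count.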
Systems, methods and devices for eliminating duplicates and value redundancy in computer memories
A computer memory compression method involves analyzing computer memory content with respect to occurrence of duplicate memory objects as well as value redundancy of data values in unique memory objects. The computer memory content is encoded by eliminating the duplicate memory objects and compressing each remaining unique memory object by exploiting data value locality of the data values thereof. Metadata is provided to represent the memory objects of the encoded computer memory content. The metadata reflects eliminated duplicate memory objects, remaining unique memory objects as well as a type of compression used for compressing each remaining unique memory object. A memory object in the encoded computer memory content is located using the metadata.
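The dedup-then-compress idea can be sketched as below; content hashing and zlib stand in for the unspecified duplicate detection and value-locality compression, and the metadata here is simply an index per object into the unique set.

```python
import hashlib
import zlib

def encode_memory(objects):
    """Eliminate duplicate memory objects by content hash, compress
    each remaining unique object, and build metadata mapping every
    original object to its unique compressed copy."""
    unique, index_of, metadata = [], {}, []
    for obj in objects:
        h = hashlib.sha256(obj).hexdigest()
        if h not in index_of:
            index_of[h] = len(unique)
            unique.append(zlib.compress(obj))  # stand-in compressor
        metadata.append(index_of[h])
    return unique, metadata

def locate(unique, metadata, i):
    """Locate and decode the i-th memory object via the metadata."""
    return zlib.decompress(unique[metadata[i]])
```

In the patent the metadata also records which compression type was applied per object; with a single stand-in compressor that field is omitted here.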
Methods and Devices for Binary Entropy Coding of Point Clouds
Methods and devices for encoding or decoding a point cloud. A bit sequence signalling an occupancy pattern for sub-volumes of a volume is coded using entropy coding. For a current sub-volume, probabilities of respective entropy coders for entropy coding the occupancy pattern may be selected based on occupancy data for a plurality of neighbouring sub-volumes of the current sub-volume and on occupancy data for subdivisions of the neighbouring sub-volumes.
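A minimal sketch of neighbour-driven coder selection: the occupancy flags of the neighbouring sub-volumes are packed into a context index that picks an entropy coder. The packing order and the coder set are assumptions.

```python
def select_coder(neighbour_occupied, coders):
    """Pack 0/1 occupancy flags of the neighbouring sub-volumes into
    a context index, then pick that context's entropy coder."""
    ctx = 0
    for bit in neighbour_occupied:
        ctx = (ctx << 1) | int(bit)
    return coders[ctx % len(coders)]
```

The abstract additionally conditions on occupancy of the neighbours' subdivisions, which would simply extend the flag list fed to the same packing.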
PERSONAL HEALTH MONITOR DATA COMPACTION USING MULTIPLE ENCODING ALGORITHMS
Health monitor data is encoded using a plurality of encoding libraries. Portions of the data are encoded by different encoding libraries, depending on which library provides the greatest compaction. This methodology not only provides substantial improvements in data compaction over use of a single data compaction algorithm with the highest average compaction, but also provides substantial additional security in that multiple decoding libraries must be used to decode the data. Optionally, each portion of data may further be encoded using different sourceblock sizes, providing further security enhancements as decoding requires multiple decoding libraries and knowledge of the sourceblock size used for each portion of the data.
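A sketch of best-of-N encoding per portion, using three stock Python codecs as stand-ins for the encoding libraries; `block_size` plays the role of the per-portion sourceblock size mentioned in the abstract.

```python
import bz2
import lzma
import zlib

# Stand-in "encoding libraries": (compress, decompress) pairs.
CODECS = {'zlib': (zlib.compress, zlib.decompress),
          'bz2':  (bz2.compress,  bz2.decompress),
          'lzma': (lzma.compress, lzma.decompress)}

def compact(data, block_size=4096):
    """Encode each portion with whichever library yields the greatest
    compaction; the codec name per portion is the metadata a decoder
    needs (illustrating why multiple decoding libraries are required)."""
    out = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        name, enc = min(((n, c[0](block)) for n, c in CODECS.items()),
                        key=lambda t: len(t[1]))
        out.append((name, enc))
    return out

def expand(portions):
    return b''.join(CODECS[name][1](enc) for name, enc in portions)
```

Varying `block_size` per portion, as the abstract optionally allows, would add that size to the per-portion metadata as well.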
K-D TREE ENCODING FOR POINT CLOUDS USING DEVIATIONS
An encoder includes a processor, a buffer, and a memory. The memory stores instructions that cause the processor to perform a number of steps. The steps include quantizing geometric data associated with a geometric construct, partitioning the geometric construct, determining a number of points in the partition, generating a deviation value based on the number of points in the partition, storing the deviation value in the buffer, and entropy encoding the deviation value.
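A 1-D sketch of deviation generation: at each midpoint partition, the count of points on one side is recorded as its deviation from an even split, and those small deviations are what get entropy coded. The even-split expectation is an assumed model, not taken from the abstract.

```python
def kd_deviations(xs, lo, hi, depth):
    """Partition 1-D points at the interval midpoint; at each split,
    emit the deviation of the left-half count from half the total,
    then recurse (a 1-D sketch of the k-d tree idea)."""
    if depth == 0 or len(xs) <= 1:
        return []
    mid = (lo + hi) / 2
    left = [x for x in xs if x < mid]
    right = [x for x in xs if x >= mid]
    dev = len(left) - len(xs) // 2   # deviation from an even split
    return ([dev]
            + kd_deviations(left, lo, mid, depth - 1)
            + kd_deviations(right, mid, hi, depth - 1))
```

For roughly uniform point clouds the deviations cluster near zero, which is why entropy coding them is cheaper than coding raw counts.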
Methods and devices using direct coding in point cloud compression
Methods and devices for coding point clouds using direct coding mode to code coordinates of a point within a sub-volume associated with a current node instead of a pattern of occupancy for child nodes. Eligibility for use of direct coding is based on occupancy data from another node. If eligible, then a flag is represented in the bitstream to signal whether direct coding is applied to points in the sub-volume or not.
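The signalling logic can be sketched as follows; the eligibility input comes from another node's occupancy as the abstract states, while the point-count threshold and the stream representation are illustrative stand-ins.

```python
def maybe_direct_code(points, eligible, max_direct=2):
    """If the node is eligible, emit a flag signalling whether direct
    coding is applied; when it is, the point coordinates themselves
    are written instead of a child-occupancy pattern (the occupancy
    recursion for the non-direct case is elided in this sketch)."""
    stream = []
    if eligible:
        use_direct = len(points) <= max_direct  # threshold is assumed
        stream.append(('flag', int(use_direct)))
        if use_direct:
            stream += [('coord', p) for p in points]
            return stream, True
    return stream, False
```

Note that when the node is not eligible, no flag is spent at all, which is the bit-saving point of the eligibility test.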
Methods and devices for entropy coding point clouds
Methods and devices for encoding a point cloud. A current node associated with a sub-volume is split into further sub-volumes, each further sub-volume corresponding to a child node of the current node, and, at the encoder, an occupancy pattern is determined for the current node based on occupancy status of the child nodes. A probability distribution is selected from among a plurality of probability distributions based on occupancy data for a plurality of nodes neighbouring the current node. The encoder entropy encodes the occupancy pattern based on the selected probability distribution to produce encoded data for the bitstream and updates the selected probability distribution. The decoder makes the same selection based on occupancy data for neighbouring nodes and entropy decodes the bitstream to reconstruct the occupancy pattern.
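The shared adaptive model can be sketched as a table of per-context counts: the neighbour-derived context selects a distribution over the 256 possible child-occupancy patterns, and both encoder and decoder apply the same update after each pattern so they stay in sync. The count-based model is an assumed implementation, not taken from the abstract.

```python
class OccupancyCoder:
    """Per-context adaptive probability model for 8-bit child
    occupancy patterns; contexts are selected from neighbour
    occupancy, and updates mirror on encoder and decoder."""
    def __init__(self, num_contexts):
        # Start from uniform counts (add-one / Laplace initialisation).
        self.counts = [[1] * 256 for _ in range(num_contexts)]

    def probability(self, ctx, pattern):
        row = self.counts[ctx]
        return row[pattern] / sum(row)

    def update(self, ctx, pattern):
        # Called after coding each pattern, on both sides.
        self.counts[ctx][pattern] += 1
```

An arithmetic coder would consume `probability(...)` to produce the bitstream; that stage is omitted here.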