H03M7/6047

DATA COMPRESSION AND DECOMPRESSION
20190132002 · 2019-05-02 ·

Apparatus comprises data compression circuitry to process a set of data values, the data compression circuitry comprising: detector circuitry to detect, for each of n complementary groups of m data values of the set of data values, a first subset of the groups for which the data values in the group have a predetermined pattern of data values, where m and n are integers and m×n is the number of data values in the set of data values; generator circuitry to generate a compressed data packet comprising at least: a representation of a second subset of the groups, the second subset being each of the n complementary groups other than groups in the first subset; and an indication of a group position, with respect to the set of data values, of each group in the second subset of groups. Complementary decompression apparatus is also described.
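The scheme above can be sketched in Python, assuming (hypothetically) that the "predetermined pattern" is an all-zero group — a common case for sparse data. The function names and packet layout are illustrative, not taken from the patent:

```python
def compress(values, m):
    """Sketch: drop groups of m values that match the pattern (all
    zero here); keep each remaining group with its group position."""
    groups = [values[i:i + m] for i in range(0, len(values), m)]
    return [(pos, g) for pos, g in enumerate(groups) if any(g)]

def decompress(packet, n, m):
    """Rebuild the full set of n*m values, filling dropped groups
    with the assumed all-zero pattern."""
    values = [0] * (n * m)
    for pos, g in packet:
        values[pos * m:(pos + 1) * m] = g
    return values

# n = 3 complementary groups of m = 4 values
data = [0, 0, 5, 7, 0, 0, 0, 0, 1, 2, 0, 0]
packet = compress(data, 4)       # only groups 0 and 2 are kept
assert decompress(packet, 3, 4) == data
```

Only the non-pattern groups and their positions travel in the packet, which is where the compression comes from when many groups match the pattern.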

APPARATUS AND METHOD FOR ACCELERATING MULTIPLICATION WITH NON-ZERO PACKETS IN ARTIFICIAL NEURON

An acceleration apparatus applied in an artificial neuron is disclosed. The acceleration apparatus comprises an AND gate array, a first storage device, a second storage device and a multiply-accumulate (MAC) circuit. The AND gate array with plural AND gates receives a first bitmap and a second bitmap to generate an output bitmap. The first storage device stores a first payload and outputs a corresponding non-zero first element according to a first access address associated with a result of comparing the first bitmap with the output bitmap. The second storage device stores a second payload and outputs a corresponding non-zero second element according to a second access address associated with a result of comparing the second bitmap with the output bitmap. The MAC circuit calculates a dot product of two element sequences from the first storage device and the second storage device.
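The dataflow above can be sketched in software, assuming the usual bitmap-plus-payload sparse format: each bitmap marks the non-zero positions of one operand, the payload stores only the non-zero values, and the access address for a surviving position is the running popcount of that operand's own bitmap. Names are illustrative:

```python
def sparse_dot(bitmap_a, payload_a, bitmap_b, payload_b):
    """Sketch: AND the two non-zero bitmaps, then multiply-accumulate
    only where both operands are non-zero, addressing each payload by
    the running count of set bits in its own bitmap."""
    out = [a & b for a, b in zip(bitmap_a, bitmap_b)]  # AND gate array
    acc = 0
    ia = ib = 0  # access addresses into the two payloads
    for a, b, o in zip(bitmap_a, bitmap_b, out):
        if o:
            acc += payload_a[ia] * payload_b[ib]  # MAC circuit
        ia += a
        ib += b
    return acc

# A = [3, 0, 4, 0, 5] and B = [0, 2, 6, 0, 7] in bitmap/payload form
assert sparse_dot([1, 0, 1, 0, 1], [3, 4, 5],
                  [0, 1, 1, 0, 1], [2, 6, 7]) == 59  # 4*6 + 5*7
```

Positions where either bitmap is zero never reach the MAC, which is the acceleration the abstract describes.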

Server node arrangement and method

A server node arrangement includes a plurality of server nodes. The server node arrangement is coupled via a communication network to a plurality of sources of input data, and to one or more output devices. The server node arrangement receives data content from the plurality of sources of input data, and processes the data content to supply to at least a subset of the output devices. The server node arrangement hosts one or more processes which process the data content into a form which is compatible with a native data rendering format of the subset of the output devices. The at least a subset of the output devices is operable to render the data content simultaneously. The server node arrangement provides a system which communicates content data in a more computationally efficient manner, which is capable of reducing energy utilization.

Deep learning numeric data and sparse matrix compression
12039421 · 2024-07-16 ·

An apparatus to facilitate deep learning numeric data and sparse matrix compression is disclosed. The apparatus includes a processor comprising a compression engine to: receive a data packet comprising a plurality of cycles of data samples, and for each cycle of the data samples: pass the data samples of the cycle to a compressor dictionary; identify, from the compressor dictionary, tags for each of the data samples, wherein the compressor dictionary comprises at least a first tag for data having a value of zero and a second tag for data having a value of one; and compress the data samples into compressed cycle data by storing the tags as compressed data, wherein the data samples identified with the first tag are compressed using the first tag and the data samples identified with the second tag are compressed using the second tag at the same time as values of the data samples identified with the first tag or the second tag are excluded from the compressed cycle data.
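One cycle of the tag-based compression above can be sketched as follows, assuming a minimal dictionary of three hypothetical tags: one for zero, one for one, and a literal tag for everything else. Tagged zeros and ones contribute no payload value, which is how their values are "excluded from the compressed cycle data":

```python
TAG_ZERO, TAG_ONE, TAG_LITERAL = 0, 1, 2  # hypothetical tag encodings

def compress_cycle(samples):
    """Sketch: tag each sample; zero- and one-valued samples are
    represented by their tag alone, other values go to the payload."""
    tags, payload = [], []
    for s in samples:
        if s == 0:
            tags.append(TAG_ZERO)
        elif s == 1:
            tags.append(TAG_ONE)
        else:
            tags.append(TAG_LITERAL)
            payload.append(s)
    return tags, payload

def decompress_cycle(tags, payload):
    it = iter(payload)
    return [0 if t == TAG_ZERO else 1 if t == TAG_ONE else next(it)
            for t in tags]

cycle = [0, 1, 0, 42, 1, 7]
assert decompress_cycle(*compress_cycle(cycle)) == cycle
```

For deep-learning tensors dominated by zeros (and, after some activations, ones), most samples reduce to a tag with no stored value.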

VLSI efficient Huffman encoding apparatus and method
10230393 · 2019-03-12 ·

A compression algorithm based on Huffman coding is disclosed that is adapted to be readily implemented using VLSI design. A data file may be processed to replace duplicate data with copy commands including an offset and a length, such as according to the LZ algorithm. A Huffman code may then be generated for parts of the file. The Huffman code may be generated according to a novel method that generates Huffman code lengths for literals in a data file without first sorting the literal statistics. The Huffman code lengths may be constrained to be no longer than a maximum length and the Huffman code may be modified to provide an acceptable overflow probability and be in canonical order. Literals, offsets, and lengths may be separately encoded. The different values for these data sets may be assigned to a limited number of bins for purposes of generating usage statistics used for generating Huffman codes.
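The patent's novel length-generation method is not detailed in the abstract, but the canonical-ordering step it references is standard (the same construction DEFLATE uses): once each symbol has a code length, the codes themselves follow deterministically. A sketch:

```python
def canonical_codes(lengths):
    """Sketch: assign canonical Huffman codes from per-symbol code
    lengths, ordering by (length, symbol) as in DEFLATE. `lengths[s]`
    is the code length of symbol s; 0 means the symbol is unused."""
    order = sorted((l, sym) for sym, l in enumerate(lengths) if l > 0)
    codes, code, prev_len = {}, 0, order[0][0]
    for l, sym in order:
        code <<= (l - prev_len)   # extend the code as lengths grow
        codes[sym] = format(code, '0{}b'.format(l))
        code += 1
        prev_len = l
    return codes

# Symbols 0..3 with lengths 2, 1, 3, 3 yield a prefix-free code
assert canonical_codes([2, 1, 3, 3]) == {1: '0', 0: '10',
                                         2: '110', 3: '111'}
```

Because canonical codes are fully determined by the lengths, only the length table needs to be stored or transmitted, which suits a hardware implementation.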

ENERGY EFFICIENT ADAPTIVE DATA ENCODING METHOD AND CIRCUIT
20190068218 · 2019-02-28 ·

Various energy efficient data encoding schemes and computing devices are disclosed. In one aspect, a method of transmitting data from a transmitter to a receiver connected by plural wires is provided. The method includes sending from the transmitter on at least one but not all of the wires a first waveform that has first and second signal transitions. The receiver receives the first waveform and measures a first duration between the first and second signal transitions using a locally generated clock signal not received from the transmitter. The first duration is indicative of a first particular data value.

Method and apparatus for under-sampled acquisition and transmission of photoplethysmograph (PPG) data and reconstruction of full band PPG data at the receiver

Certain aspects of the present disclosure relate to a method for compressed sensing (CS). CS is a signal processing concept wherein significantly fewer sensor measurements than those suggested by the Shannon/Nyquist sampling theorem can be used to recover signals with arbitrarily fine resolution. In this disclosure, the CS framework is applied to sensor signal processing in order to support low power robust sensors and reliable communication in Body Area Networks (BANs) for healthcare and fitness applications.

APPARATUS AND METHOD FOR DATA COMPRESSION IN A WEARABLE DEVICE
20190036545 · 2019-01-31 ·

Described is an apparatus and method for data compression using compressive sensing in a wearable device. Also described is a machine-readable storage medium having instructions stored thereon that, when executed, cause one or more processors to perform an operation comprising: receive an input signal from a sensor; convert the input signal to a digital stream; and symmetrically pad either end of the digital stream with a portion of the digital stream to form a padded digital stream.
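One plausible reading of "symmetrically pad either end with a portion of the stream" is mirror (reflect) padding, as used before block transforms to avoid edge discontinuities. A sketch under that assumption, with illustrative names:

```python
def symmetric_pad(stream, pad):
    """Sketch: mirror `pad` samples onto both ends of the stream
    (reflection about the first and last samples), one hypothetical
    reading of the abstract's symmetric padding step."""
    head = stream[pad:0:-1]              # reversed samples after the first
    tail = stream[-2:-pad - 2:-1]        # reversed samples before the last
    return head + stream + tail

s = [1, 2, 3, 4, 5]
assert symmetric_pad(s, 2) == [3, 2, 1, 2, 3, 4, 5, 4, 3]
```

The padded stream has no jump at either boundary, since each end continues with the stream's own reflected samples.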

COMPRESSION AND DECOMPRESSION ENGINES AND COMPRESSED DOMAIN PROCESSORS
20190013823 · 2019-01-10 ·

Compressed domain processors are configured to perform operations on data compressed in a format that preserves order. The compressed domain processors may include operations such as addition, subtraction, multiplication, division, sorting, and searching. In some cases, compression engines for compressing the data into the desired formats are provided.

Encoding data

An example method may include obtaining multiple symbols each associated with a different state in a quadrature amplitude modulation scheme used for transmission of a data signal. The method may further include determining a sequence of symbols of the multiple symbols for a first portion of the data signal. The determining may include selecting an index value from multiple index values for the first portion of the data signal. The determining may also include obtaining energy states of multiple sequences of the symbols using the energy levels of the symbols. The determining may further include obtaining a relationship between the energy levels of the symbols, the energy states of the multiple sequences of the symbols, and the multiple index values. The determining may further include selecting the sequence of symbols based on the relationship and the index value for the first portion of the data signal.