H03M13/3927

Systems and methods for multithreaded successive cancellation list polar decoding
11664828 · 2023-05-30 ·

A polar decoder circuit can execute successive cancellation list polar decoding on multiple threads concurrently. An LLR update engine of the polar decoder circuit and a sort engine of the polar decoder circuit can operate concurrently, with the LLR update engine computing updated path metrics for one codeword while the sort engine sorts candidates for one or more other codewords according to path metrics already computed by the LLR update engine. Threads corresponding to different codewords can cycle sequentially between the LLR update engine and the sort engine.
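
The extend-and-prune cycle that the LLR update engine and sort engine pipeline can be sketched as follows. This is an illustrative model only: the path-metric rule (penalize a candidate by |LLR| when its bit disagrees with the LLR's hard decision) is a common SCL approximation, and all names are mine, not the patent's implementation.

```python
import heapq

def extend_paths(paths, llrs):
    """Extend each SCL path with bit 0 and bit 1 for the current bit.

    `paths` is a list of (metric, bits) tuples; `llrs` gives each path's
    decision LLR for the current bit.  The metric grows by |llr| when the
    chosen bit disagrees with the LLR's hard decision (illustrative rule).
    """
    candidates = []
    for (metric, bits), llr in zip(paths, llrs):
        hard = 0 if llr >= 0 else 1
        for bit in (0, 1):
            penalty = 0.0 if bit == hard else abs(llr)
            candidates.append((metric + penalty, bits + [bit]))
    return candidates

def prune(candidates, list_size):
    """Keep the list_size candidates with the smallest path metrics --
    the sorting step that, in the multithreaded design, runs on one
    codeword while LLR updates proceed on another."""
    return heapq.nsmallest(list_size, candidates, key=lambda c: c[0])

paths = [(0.0, [])]            # start from the single empty path
for llr in [1.8, -0.4, 0.9]:   # toy decision LLRs for three bits
    paths = prune(extend_paths(paths, [llr] * len(paths)), list_size=2)
```

After the loop, the list holds the two surviving paths in ascending metric order.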

SOFT-DECISION DECODING
20220321149 · 2022-10-06 ·

A method of soft-decision decoding including training a machine learning agent with communication signal training data; providing to the trained machine learning agent a signal that has been received via a communications channel; operating the machine learning agent to determine respective probabilities that the received signal corresponds to each of a plurality of symbols; and, based on the determined probabilities, performing soft decision decoding on the received signal.
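
The bridge from the agent's per-symbol probabilities to a soft-decision input can be sketched for the simplest two-symbol (BPSK-like) case, where symbol 0 carries bit 0 and symbol 1 carries bit 1. The function name is an assumption; the probabilities would come from the trained agent described above.

```python
import math

def bit_llr(p_symbols):
    """Convert two symbol probabilities into a bit LLR for soft decoding.

    Positive LLR favors bit 0, negative favors bit 1; magnitude encodes
    confidence, which is exactly what a soft-decision decoder consumes.
    """
    p0, p1 = p_symbols
    return math.log(p0 / p1)
```

A confident detection such as `(0.9, 0.1)` yields a large positive LLR, while `(0.5, 0.5)` yields zero, i.e. an erasure-like input.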

METHOD OF OPERATING DECODER FOR REDUCING COMPUTATIONAL COMPLEXITY AND METHOD OF OPERATING DATA STORAGE DEVICE INCLUDING THE DECODER

A method of operating a decoder, which has variable nodes and check nodes, includes receiving variable-to-check (V2C) messages from the variable nodes using a first check node among the check nodes. The number of messages having a specific magnitude among the V2C messages is counted. The magnitude of a check-to-variable (C2V) message to be transmitted to a first variable node, among the variable nodes, is determined based on the count value and the magnitude of a V2C message of the first variable node.

POLAR CODE DECODING METHOD AND APPARATUS, STORAGE MEDIUM, AND TERMINAL
20220311453 · 2022-09-29 ·

A Polar code decoding method and apparatus, a storage medium, and a terminal are provided. The method includes: dividing a Polar code of length N into S groups, each group being data extracted from the length-N Polar code according to a preset rule, where S is an integer power of 2; performing a log-likelihood ratio (LLR) calculation on each of the S groups; and performing joint decoding on the calculation results of the S groups.
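
The grouping step can be illustrated with one simple candidate for the "preset rule" (the patent does not fix the rule; taking every S-th position is an assumption for illustration):

```python
def split_groups(llr_vector, s):
    """Extract S groups from a length-N Polar-code LLR vector by taking
    every S-th position, one assumed instance of a 'preset rule'.
    S must be a power of two and divide N."""
    n = len(llr_vector)
    assert s > 0 and s & (s - 1) == 0 and n % s == 0
    return [llr_vector[g::s] for g in range(s)]
```

Each group would then get its own LLR calculation before the joint decoding stage.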

Soft-aided decoding of staircase codes

A hard-decision (HD) forward error correcting (FEC) coded signal is decoded by a decoder to produce decoded bits using marked reliable bits of the HD-FEC coded signal and marked unreliable bits of the HD-FEC coded signal. The marked reliable and unreliable bits are computed by calculation and marking blocks based on an absolute value of log-likelihood ratios of the HD-FEC coded signal. The HD-FEC coded signal may be, for example, a staircase code coded signal or a product code coded signal.
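
The marking step reduces to a threshold on |LLR|. A minimal sketch, where the threshold is an assumed tuning parameter:

```python
def mark_bits(llrs, threshold):
    """Split bit positions into 'reliable' and 'unreliable' index lists
    by the absolute value of their LLRs, as in the soft-aided marking
    step before hard-decision decoding."""
    reliable = [i for i, llr in enumerate(llrs) if abs(llr) >= threshold]
    unreliable = [i for i, llr in enumerate(llrs) if abs(llr) < threshold]
    return reliable, unreliable
```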

Content aware decoding method and system

A method and apparatus for obtaining data from a memory; estimating a probability of data values of the obtained data based on at least one of a source log-likelihood ratio and a channel log-likelihood ratio, wherein each bit in the obtained data has an associated log-likelihood ratio; determining at least one data pattern parameter for the data; and performing a decoding process using the at least one data pattern parameter to determine a decoded data set.

METHOD FOR POLAR DECODING WITH DYNAMIC SUCCESSIVE CANCELLATION LIST SIZE AND POLAR DECODER

A method (300) is provided for polar decoding a received signal into a number, N, of bits with a Successive Cancellation List (SCL). The method (300) includes, at the i-th level of a binary tree for decoding the i-th bit of the N bits, where 1 ≤ i ≤ N: when the i-th bit is an information bit, calculating (310) a path metric for each of 2·L_{i−1} candidate paths at the i-th level, where L_{i−1} is the SCL size at the (i−1)-th level and L_0 = 1; setting (320) the SCL size at the i-th level, L_i, based on L_{i−1} and a statistical distribution of the path metrics calculated for the 2·L_{i−1} candidate paths; and selecting (330) L_i surviving paths from the 2·L_{i−1} candidate paths based on their respective path metrics.
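
One way such a size rule could look is sketched below. The mean-based criterion (grow the list when many candidates are near-tied, shrink it when one path dominates) is an assumed stand-in for the patent's statistical distribution test, not its actual rule.

```python
def dynamic_list_size(path_metrics, l_prev, l_max=8):
    """Choose L_i from the spread of the 2*L_{i-1} candidate path metrics.

    Count candidates at or below the mean metric: a tight cluster keeps
    many plausible paths alive, a wide spread lets the list shrink.
    L_i can never exceed the candidate count 2*l_prev or the cap l_max.
    """
    mean = sum(path_metrics) / len(path_metrics)
    near_best = sum(1 for pm in path_metrics if pm <= mean)
    return max(1, min(l_max, 2 * l_prev, near_best))
```

With two clearly dominant paths the list stays at 2; with four near-tied candidates it grows to 4.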

Error detection and correction using machine learning

A memory system including a memory device and a memory controller including a processor. The memory controller is configured to read outputs from memory cells of the memory device in response to a read command from a host and to convert the read outputs into a first codeword. The processor performs a first error correcting code (ECC) operation on the first codeword. The processor is further configured to apply, for each selected memory cell among the memory cells, a corresponding one of the read outputs and at least one related feature as input features to a machine learning algorithm to generate a second codeword, and the memory controller is configured to perform a second ECC operation on the second codeword when the first ECC operation fails.

Recovering from hard decoding errors by remapping log likelihood ratio values read from NAND memory cells

Hard errors are determined for an unsuccessful decoding of codeword bits read from NAND memory cells via a read channel and input to a low-density parity check (LDPC) decoder. A bit error rate (BER) for the hard errors is estimated and BER for the read channel is estimated. Hard error regions are found using a single level cell (SLC) reading of the NAND memory cells. A log likelihood ratio (LLR) mapping of the codeword bits input to the LDPC decoder is changed based on the hard error regions, the hard error BER, and/or the read channel BER.
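
The remapping itself can be sketched as shrinking the LLR magnitudes of bits in the suspected hard-error regions, so the LDPC decoder can flip them more easily on retry. The damping factor and function names are assumptions for illustration.

```python
def remap_llrs(llrs, hard_error_regions, damping=0.25):
    """Scale down the LLRs of bit positions flagged as hard-error regions
    (found via the SLC reading in the method above), leaving the rest of
    the codeword's soft inputs untouched."""
    return [llr * damping if i in hard_error_regions else llr
            for i, llr in enumerate(llrs)]
```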

ITERATIVE DECODER FOR DECODING A CODE COMPOSED OF AT LEAST TWO CONSTRAINT NODES

An iterative decoder comprises:

N variable nodes (VNs) v_n, n = 1 … N, each configured to receive an LLR I_n defined on an alphabet A_l of q_ch quantization bits, q_ch ≥ 2;

M constraint nodes (CNs) c_m, m = 1 … M, 2 ≤ M < N;

the v_n and c_m exchanging messages along the edges of a Tanner graph;

each v_n sending messages m_{v_n→c_m} to c_m, the set of constraint nodes connected to v_n being denoted V(v_n), and V(v_n)\{c_m} being V(v_n) without c_m; and

each c_m sending messages m_{c_m→v_n} to v_n;

the LLR I_n and the messages m_{v_n→c_m} and m_{c_m→v_n} are coded; and

each variable node v_n, at each iteration l, computes:

a sign-preserving factor:

[00001] s_{v_n→c_m} = ξ × sign(I_n) + Σ_{c ∈ V(v_n)\{c_m}} sign(m_{c→v_n})

where ξ is a positive or null integer; and the outgoing message:

[00002] m_{v_n→c_m} = I_n + (1/2) × s_{v_n→c_m} + Σ_{c ∈ V(v_n)\{c_m}} m_{c→v_n}
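
A minimal sketch of this variable-node update in Python. The left-hand sides of the two formulas were lost in extraction, so the names `s` (sign-preserving factor) and the outgoing message assignment are my reconstruction; all function names are assumptions.

```python
def sign(x):
    """Sign convention assumed: zero maps to +1."""
    return 1 if x >= 0 else -1

def vn_update(i_n, incoming, xi=1):
    """Variable-node update for one VN v_n at one iteration.

    `incoming` maps each connected constraint node c to its message
    m_{c->v_n}.  For each outgoing edge toward c_m, the sums run over
    V(v_n)\\{c_m}, i.e. c_m's own contribution is excluded (extrinsic).
    """
    out = {}
    for c_m in incoming:
        others = [m for c, m in incoming.items() if c != c_m]
        # sign-preserving factor: xi*sign(I_n) plus the other signs
        s = xi * sign(i_n) + sum(sign(m) for m in others)
        # outgoing message: I_n + (1/2)*factor + sum of the other messages
        out[c_m] = i_n + 0.5 * s + sum(others)
    return out
```

For example, with I_n = 2.0 and incoming messages {c1: 1.0, c2: −1.0}, the message toward c1 uses only c2's contribution, and vice versa.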