Patent classifications
H03M13/4169
CONVOLUTIONAL DECODER AND METHOD OF DECODING CONVOLUTIONAL CODES
A convolutional decoder includes a first storage, a second storage, a branch metric processor, an ACS processor, and trace-back logic. The branch metric processor determines branch metrics for state transitions from a start step to a last step according to input bit streams. The ACS processor selects maximum-likelihood path metrics to determine a survival path according to the branch metrics, and updates the states of the start step to the first storage and the second storage alternately based on that selection. The trace-back logic selectively traces back the survival path based on the states of the start step stored in whichever of the first storage and the second storage is selected.
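The branch-metric / add-compare-select / trace-back pipeline described in this abstract can be sketched in software. The following is a minimal hard-decision Viterbi decoder for the classic rate-1/2, K=3 code (generators 7 and 5 octal); the code choice, names, and single survivor table are illustrative simplifications, not details from the patent (which alternates start-step states between two storages).

```python
# Minimal hard-decision Viterbi sketch: rate-1/2, K=3, generators 111 / 101.
# State convention: s = (newest input bit << 1) | (second-newest input bit).

def encode(bits):
    """Encode with two flush zeros so the trellis terminates in state 0."""
    s, out = 0, []
    for b in bits + [0, 0]:
        out += [b ^ (s >> 1) ^ (s & 1),   # generator 111
                b ^ (s & 1)]              # generator 101
        s = (b << 1) | (s >> 1)
    return out

def viterbi(rx):
    NSTATES, INF = 4, 10**9
    pm = [0] + [INF] * (NSTATES - 1)      # path metrics; encoder starts in state 0
    surv = []                             # survivor predecessor per state, per step
    for t in range(0, len(rx), 2):
        r0, r1 = rx[t], rx[t + 1]
        npm, pred = [INF] * NSTATES, [0] * NSTATES
        for s in range(NSTATES):
            for b in (0, 1):
                o0, o1 = b ^ (s >> 1) ^ (s & 1), b ^ (s & 1)
                ns = (b << 1) | (s >> 1)
                bm = (o0 != r0) + (o1 != r1)        # Hamming branch metric
                if pm[s] + bm < npm[ns]:            # add-compare-select
                    npm[ns], pred[ns] = pm[s] + bm, s
        pm = npm
        surv.append(pred)
    s, bits = 0, []                       # trace back from state 0 (terminated)
    for pred in reversed(surv):
        bits.append(s >> 1)               # input bit entering state s is its high bit
        s = pred[s]
    bits.reverse()
    return bits[:-2]                      # drop the two flush bits
```

With free distance 5 this code corrects any single bit error in a short terminated block, which the trace-back recovers from the survivor table.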
TAILLESS CONVOLUTIONAL CODES
Certain aspects of the present disclosure relate to techniques and apparatus for increasing decoding performance and/or reducing decoding complexity. An exemplary method generally includes receiving, via a wireless medium, a codeword encoded using a tailless convolutional code (TLCC) with a known start state, evaluating a set of decoding candidate paths through a trellis decoder that originate at the known start state of the TLCC, performing, for each of a plurality of the decoding candidate paths, a back trace from a respective end state to the known start state, and selecting one of the decoding candidate paths based, at least in part, on path metrics generated while performing the back trace. Other aspects, embodiments, and features are also claimed and described.
TAIL BITING CONVOLUTIONAL CODE (TBCC) ENHANCEMENT WITH STATE PROPAGATION AND LIST DECODING
Certain aspects of the present disclosure relate to techniques and apparatus for enhanced decoding, for example, by providing a multi-phase tail biting convolutional code (TBCC) decoding algorithm. An exemplary method generally includes obtaining, via a wireless medium, a codeword encoded with a TBCC encoding scheme, generating metrics for candidate paths through trellis stages of a decoder, propagating information from at least one of the trellis stages to a later trellis stage, while generating the metrics, selecting a set of the candidate paths based on the propagated information, and decoding the encoded codeword by evaluating the selected set of candidate paths based, at least in part, on the generated metrics. Other aspects, embodiments, and features are claimed and described.
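The defining constraint in tail biting is that the encoder's start and end states are equal, so no flush bits are spent. The sketch below makes that circular constraint explicit for a tiny rate-1/2, K=3 code by exhaustive search over candidate messages; it is not the multi-phase, state-propagating algorithm of the disclosure, just an illustration of what a TBCC decoder must enforce. All names are invented.

```python
# Brute-force tail-biting decode sketch for the rate-1/2, K=3 (7,5) code.
from itertools import product

def tb_encode(bits):
    """Tail-biting encode: seed the state from the last two message bits, so the
    encoder ends in the same state it started in (no flush bits)."""
    s, out = (bits[-1] << 1) | bits[-2], []
    for b in bits:
        out += [b ^ (s >> 1) ^ (s & 1), b ^ (s & 1)]
        s = (b << 1) | (s >> 1)
    return out

def tb_decode(rx, n):
    """Return the n-bit message whose tail-biting codeword is nearest to rx
    in Hamming distance (feasible only for tiny n; real decoders use the trellis)."""
    best, best_d = None, None
    for cand in product((0, 1), repeat=n):
        d = sum(a != b for a, b in zip(tb_encode(list(cand)), rx))
        if best_d is None or d < best_d:
            best, best_d = list(cand), d
    return best
```

Because every candidate is generated by `tb_encode`, the start-equals-end-state constraint is satisfied by construction; trellis-based TBCC decoders instead have to search for a circularly consistent path, which is what motivates list decoding and state propagation.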
Parallel backtracking in Viterbi decoder
A Viterbi traceback processing method, system, and apparatus are provided. A first Viterbi traceback processing operation (MUX 514) is performed on a first survivor path metric (TMV1) by selecting, in response to a back-track state (INDEX 0), a first output data bit (Ti1) for the first survivor path metric. A plurality of Viterbi traceback processing operations (MUX 512, 513) are performed on respective portions of an additional survivor path metric (TMV2A, TMV2B) by selecting, in response to a shifted back-track state (INDEX 1), candidate data bits (Tn1, Tn2) for the additional survivor path metric. A multiplexer (MUX 518) controlled by the first output data bit then selects between the candidate data bits to generate an additional output data bit (Ti2) for the additional survivor path metric, such that the traceback operations are performed in parallel to produce the output data bits.
Receiver and internal TCM decoder and associated decoding method
The present invention discloses a Trellis-Coded-Modulation (TCM) decoder applied in a receiver, wherein the TCM decoder includes a branch metric unit, a path metric unit, a trace-back length selection circuit and a survival path management circuit. In operations of the TCM decoder, the branch metric unit is configured to receive multiple input codes to generate multiple sets of branch information. The path metric unit is configured to calculate multiple survival paths according to the multiple sets of branch information. The trace-back length selection circuit is configured to select a trace-back length, wherein the trace-back length is determined according to a signal quality of the receiver. The survival path management circuit is configured to return the multiple survival paths for the trace-back length in order to generate an output code.
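The idea of tying trace-back length to link quality can be sketched as a simple policy function. The SNR thresholds, depth multipliers, and the "5x encoder memory" rule of thumb below are illustrative assumptions, not values from the patent, which leaves the mapping from signal quality to trace-back length unspecified.

```python
# Illustrative trace-back length selection driven by a rough SNR estimate.
# Deeper trace-back improves decision reliability on poor channels at the
# cost of latency and survivor-memory usage.

def select_traceback_length(snr_db, constraint_length=7):
    """Return a trace-back depth in trellis steps for a given SNR estimate (dB)."""
    base = 5 * (constraint_length - 1)   # rule of thumb: ~5x the encoder memory
    if snr_db >= 10.0:
        return base                      # good channel: the rule of thumb suffices
    if snr_db >= 4.0:
        return 2 * base                  # moderate channel: deepen the trace-back
    return 3 * base                      # poor channel: deepest supported depth
```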
Convolutional code decoder and convolutional code decoding method
The invention discloses a convolutional code decoder and a convolutional code decoding method. The convolutional code decoder performs decoding operation according to a received data and an auxiliary data to obtain a target data and includes an error detection data generation circuit, a channel coding circuit, a selection circuit, and a Viterbi decoding circuit. The error detection data generation circuit performs an error detection operation on the auxiliary data to obtain an error detection data. The channel coding circuit, coupled to the error detection data generation circuit, performs channel coding on the auxiliary data and the error detection data to obtain an intermediate data. The selection circuit, coupled to the channel coding circuit, generates a to-be-decoded data according to the received data and the intermediate data. The Viterbi decoding circuit, coupled to the selection circuit, decodes the to-be-decoded data to obtain the target data.
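The error-detection stage of this pipeline can be illustrated with a small CRC over the auxiliary data. CRC-8 with polynomial 0x07 is an assumed choice here; the patent does not specify which error-detection code, channel code, or selection rule is used.

```python
# Sketch of the error-detection-data generation step: bit-serial CRC-8
# (polynomial x^8 + x^2 + x + 1 = 0x07, init 0, MSB first).

def crc8(bits, poly=0x07):
    """Return the 8-bit CRC of a list of 0/1 bits as a list of bits, MSB first."""
    reg = 0
    for b in bits:
        reg ^= b << 7                    # feed the next message bit into the MSB
        reg = ((reg << 1) ^ poly) & 0xFF if reg & 0x80 else (reg << 1) & 0xFF
    return [(reg >> i) & 1 for i in range(7, -1, -1)]
```

In the abstract's terms, the auxiliary data with `crc8(aux)` appended is what the channel coding circuit would encode into the intermediate data. A defining property of this CRC (init 0, no final XOR) is that re-running it over data plus its own CRC yields all zeros, which is how the check side detects corruption.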
Iterative equalization using non-linear models in a soft-input soft-output trellis
A method includes: generating a trellis; generating one or more predicted symbols using a first non-linear model; computing and saving two or more branch metrics using a priori log-likelihood ratio (LLR) information, a channel observation, and the one or more predicted symbols; if alpha forward recursion has not yet completed, generating alpha forward recursion state metrics using a second non-linear model; if beta backward recursion has not yet completed, generating beta backward recursion state metrics using a third non-linear model; if sigma forward recursion has not yet completed, generating sigma forward recursion state metrics using the branch metrics, the alpha state metrics, and the beta backward recursion state metrics; generating extrinsic information comprising a difference of a posteriori LLR information and the a priori LLR information; computing and feeding back the a priori LLR information; and calculating the a posteriori LLR information.
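The alpha/beta/sigma recursions and the extrinsic-information step above follow the BCJR (forward-backward) pattern. The sketch below runs a log-domain forward-backward pass on a toy 2-state trellis (rate-1/2, memory-1 code: coded bits `(b, b ^ state)`, next state `b`) with linear Gaussian branch metrics standing in for the patent's non-linear models; the code, SNR scaling, and unterminated-trellis assumption are all illustrative simplifications.

```python
# Toy log-domain forward-backward (BCJR-style) a-posteriori LLR computation.
import math

def logsumexp(a, b):
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def bcjr_llrs(rx, la, snr=4.0):
    """rx: list of (r0, r1) BPSK observations (+1 encodes bit 0); la: a-priori
    LLRs. Returns a-posteriori LLRs (positive favors bit 0)."""
    n, NEG = len(rx), -1e30

    def gamma(t, s, b):                    # branch metric in the log domain
        c0, c1 = 1 - 2 * b, 1 - 2 * (b ^ s)          # BPSK symbols of the coded bits
        r0, r1 = rx[t]
        return snr * (r0 * c0 + r1 * c1) + (la[t] / 2) * (1 - 2 * b)

    alpha = [[NEG, NEG] for _ in range(n + 1)]
    alpha[0][0] = 0.0                      # encoder starts in state 0
    for t in range(n):                     # alpha forward recursion
        for s in range(2):
            for b in range(2):
                alpha[t + 1][b] = logsumexp(alpha[t + 1][b], alpha[t][s] + gamma(t, s, b))
    beta = [[NEG, NEG] for _ in range(n + 1)]
    beta[n] = [0.0, 0.0]                   # unterminated: any end state allowed
    for t in range(n - 1, -1, -1):         # beta backward recursion
        for s in range(2):
            for b in range(2):
                beta[t][s] = logsumexp(beta[t][s], gamma(t, s, b) + beta[t + 1][b])
    llr = []
    for t in range(n):                     # sigma step: combine alpha, gamma, beta
        num, den = NEG, NEG
        for s in range(2):
            num = logsumexp(num, alpha[t][s] + gamma(t, s, 0) + beta[t + 1][0])
            den = logsumexp(den, alpha[t][s] + gamma(t, s, 1) + beta[t + 1][1])
        llr.append(num - den)
    return llr
```

Extrinsic information, as in the abstract, would then be the a-posteriori LLR minus the a-priori LLR (and, in an equalizer, minus the channel term) fed back for the next iteration.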
Reinforced list decoding
Certain aspects of the present disclosure relate to techniques and apparatus for increasing decoding performance and/or reducing decoding complexity. A transmitter may divide data of a codeword into two or more sections and then calculate redundancy check information (e.g., a cyclic redundancy check or a parity check) for each section and attach the redundancy check information to the codeword. A decoder of a receiver may decode each section of the codeword and check the decoding against the corresponding redundancy check information. If decoding of a section fails, the decoder may use information regarding section(s) that the decoder successfully decoded in re-attempting to decode the section(s) that failed decoding. In addition, the decoder may use a different technique to decode the section(s) that failed decoding. If the decoder is still unsuccessful in decoding the section(s), then the receiver may request retransmission of the failed section(s) or of the entire codeword.
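The per-section redundancy check can be sketched as follows. The disclosure mentions either a CRC or a parity check; this sketch uses a single even-parity bit per section, and the section layout and function names are invented for illustration.

```python
# Sketch of attaching and checking per-section redundancy (even parity here).

def attach_section_checks(data_bits, section_len):
    """Split data into sections of section_len bits and append one even-parity
    bit to each, mimicking the transmitter side of the abstract."""
    out = []
    for i in range(0, len(data_bits), section_len):
        sec = data_bits[i:i + section_len]
        out += sec + [sum(sec) % 2]
    return out

def check_sections(coded_bits, section_len):
    """Return a pass/fail flag per section, so the receiver can keep sections
    that verified and re-decode or request retransmission of the rest."""
    ok, step = [], section_len + 1
    for i in range(0, len(coded_bits), step):
        sec, parity = coded_bits[i:i + step - 1], coded_bits[i + step - 1]
        ok.append(sum(sec) % 2 == parity)
    return ok
```

Sections that fail the check localize the errors, which is what lets the decoder reuse the successfully decoded sections when re-attempting only the failed ones.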