UBC Theses and Dissertations


On the construction, dimensionality, and decoding of linear block code trellises
Kot, Alan D.

Abstract

In principle, the coding gain of a digital communications system can be improved through the use of longer error control codes and soft-decision maximum-likelihood (ML) decoding. Unfortunately, the implementation of such techniques is limited by the computational requirements of the decoder. In this thesis, we consider several aspects of trellis-based ML and 'near-ML' soft-decision decoding of general linear block codes. ML decoding is feasible for reasonably compact trellises using the Viterbi algorithm. For general linear block codes we present simple methods for trellis construction based on the methods of Wolf and Massey. It is confirmed that the trellises so constructed are minimal, and an improvement of Muder's lower bound on the maximum trellis dimension is found. It is shown that when equivalent codes are constructed by permutations of the symbol positions, the resulting trellis dimensions are fixed near either end, while in the central portion of the trellis the dimensions vary between Wolf's upper bound on the maximum dimension and a new lower bound on the minimum dimension. This lower bound and the improved bound on the maximum trellis dimension are exact for maximum distance separable codes. These bounds indicate that only codes (and their duals) whose smaller minimum distance min(d_min, d⊥_min) is significantly less than the corresponding Singleton bound can possibly have a compact trellis. For trellises that are impractically large for decoding by the Viterbi algorithm, we consider near-ML decoding using partial searches of a code trellis or tree. Although this approach is suboptimal in terms of the probability of error, it has the potential for an improved trade-off between coding gain and computational effort. The decoding problem faced by a partial trellis search is described in an alternative manner to Massey's variable-length code model.
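The trellis dimension profile discussed above can be illustrated concretely. As a rough sketch (this is not the thesis's construction, and the (7,4) Hamming generator matrix is an illustrative choice), the following Python code computes the state-space dimensions of a minimal trellis by brute-force codeword enumeration, using the standard past/future subcode formula s_i = k − p_i − f_i, and compares them against Wolf's per-depth bound min(i, n−i, k, n−k):

```python
from itertools import product

def min_trellis_dims(G, n):
    """State-space dimensions s_i of the minimal trellis of a binary
    linear block code, via s_i = k - p_i - f_i, where p_i and f_i are
    the dimensions of the past and future subcodes (codewords supported
    entirely on positions [0, i) and [i, n), respectively).
    Brute force over all 2^k codewords; fine for small k."""
    k = len(G)
    codewords = []
    for coeffs in product([0, 1], repeat=k):
        c = [0] * n
        for a, row in zip(coeffs, G):
            if a:
                c = [x ^ y for x, y in zip(c, row)]
        codewords.append(tuple(c))
    dims = []
    for i in range(n + 1):
        # subspace sizes are powers of two, so dim = log2(count)
        p = sum(1 for c in codewords if all(x == 0 for x in c[i:]))
        f = sum(1 for c in codewords if all(x == 0 for x in c[:i]))
        dims.append(k - (p.bit_length() - 1) - (f.bit_length() - 1))
    return dims

# Systematic generator matrix of the (7,4) Hamming code
G = [[1,0,0,0,1,1,0],
     [0,1,0,0,0,1,1],
     [0,0,1,0,1,1,1],
     [0,0,0,1,1,0,1]]

dims = min_trellis_dims(G, 7)
wolf = [min(i, 7 - i, 4, 3) for i in range(8)]
```

For this code and symbol ordering, the profile meets Wolf's bound at every depth, consistent with the observation above that the dimensions are pinned near either end of the trellis.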
As part of this approach, we specialize Massey's use of 'random tails' in order to exploit a priori knowledge of the code and of the information-symbol distribution used. As a result there are some minor changes to the usual metric, but only in atypical situations, such as the use of unequal a priori information-symbol probabilities, non-linear codes, and non-symmetric, non-binary discrete memoryless channels. We introduce a simple and efficient (linear-time) method for the selection of the best metrics from among a number of contenders in a breadth-first search. This method can provide a sorted set of survivor metrics during M-algorithm decoding of binary linear block codes using only M comparisons in the worst case. The application of the method to the decoding of convolutional codes is also discussed. We present a method for soft-decision decoding of general linear block codes that we refer to as Reconfigurable Trellis (RT) decoding. RT decoding carries out a partial search on a different and more easily searched trellis (or tree) for the code. The trellis used for decoding depends on the reliability of the received data, so it is determined 'on the fly'. Only a portion of the reconfigured trellis needs to be constructed, as guided by the partial search. While RT decoding requires some computational overhead before each search, the search effort itself can be reduced very significantly, so that overall an improvement can be achieved in the trade-off between coding gain and computational effort. With respect to the performance analysis of suboptimal soft-decision decoding schemes, we extend the uniform error property of ML decoding of linear codes on a binary-input, symmetric-output, memoryless channel to the more general case of partial trellis searches.
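The linear-time survivor-selection idea can be sketched as follows. Assuming (as a plausible reading, not necessarily the thesis's exact procedure) that the M survivors are kept sorted by metric and that, at a given depth of a binary trellis or tree, every extension adds one of two branch costs m0 or m1, the 2M children form two already-sorted lists; a standard two-list merge then selects the best M metrics with at most M comparisons:

```python
def best_m_children(sorted_parent_metrics, m0, m1, M):
    """Select the M smallest child metrics from the 2*len(parents)
    candidates formed by extending each sorted parent metric with
    branch cost m0 or m1. Adding a constant preserves sortedness,
    so a merge of the two child lists uses at most M comparisons.
    Requires M <= len(sorted_parent_metrics)."""
    a = [p + m0 for p in sorted_parent_metrics]  # children via branch 0
    b = [p + m1 for p in sorted_parent_metrics]  # children via branch 1
    out, i, j = [], 0, 0
    while len(out) < M:
        if a[i] <= b[j]:          # one comparison per selected survivor
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out
```

Note that the selected metrics come out already sorted, which is what allows the same merge to be repeated at the next depth; this matches the sorted-survivor property mentioned above.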
For RT-M decoding of general linear block codes the uniform error property is used together with a novel channel model and a very simple search technique to estimate the increase in the probability of error over that of ML decoding. This approach can yield accurate results while avoiding the complexity of considering numerous soft-decision output values, and without requiring detailed knowledge of the code structure.


Rights

For non-commercial purposes only, such as research, private study, and education. Additional conditions apply; see the Terms of Use at https://open.library.ubc.ca/terms_of_use.