
High-rate systematic recursive convolutional encoders: minimal trellis and code search

Abstract

We consider high-rate systematic recursive convolutional encoders to be adopted as constituent encoders in turbo schemes. Douillard and Berrou showed that, despite its complexity, the construction of high-rate turbo codes by means of high-rate constituent encoders is advantageous over the construction based on puncturing rate-1/2 constituent encoders. To reduce the decoding complexity of high-rate codes, we introduce the construction of the minimal trellis for a systematic recursive convolutional encoding matrix. A code search is conducted and examples are provided which indicate that a more finely grained decoding complexity-error performance trade-off is obtained.

1 Introduction

The typical turbo code configuration is the parallel concatenation of two systematic recursive constituent convolutional encoders of rate 1/2 connected via an interleaver, resulting in a code of rate 1/3. However, higher-rate turbo codes may be useful in modern wireless, magnetic recording, and fiber optics applications [1]. The usual approach to increase the overall rate is to puncture selected bits of the turbo codeword [2]. An alternative is to use high-rate constituent encoders [1, 3, 4], which, according to [3], offers several advantages, such as better convergence of the iterative process, higher throughput, reduced latency, and robustness of the decoder. If puncturing is still needed to achieve a required rate, fewer bits have to be discarded than with conventional rate-1/2 constituent encoders, resulting in less degradation of the error-correcting capability of the constituent code [3].

In [1], a class of systematic recursive convolutional encoders restricted to rate k/(k + 1) is proposed. The codes are optimized in terms of the pairs (d_i, N_i), where d_i is the minimum weight of codewords generated by input sequences of weight i and N_i is its multiplicity, and are thus suited to be used as constituent encoders of turbo codes [5]. Good systematic recursive encoding matrices with increasing encoder memory sizes for a fixed code rate are listed. Turbo codes constructed with the family of constituent encoders given in [1] are shown to outperform some high-rate turbo codes obtained by puncturing a rate-1/2 constituent encoder.

One major drawback of high-rate constituent encoders is the decoding complexity, since it increases exponentially with k and with the constraint length for various decoding algorithms. Motivated by the goal of proposing a class of turbo codes with low-complexity, high-rate constituent encoders, Daneshgaran et al. [4] constructed recursive constituent encoders of rate k/(k + 1) by puncturing a rate-1/2 recursive mother encoder. However, the reduction in decoding complexity is obtained at the expense of a spectrum (d_i, N_i) inferior to that of the best recursive encoder of rate k/(k + 1) [1].

An alternative way to reduce the decoding complexity is to consider trellis representations for the constituent codes other than the conventional trellis usually adopted. For nonrecursive convolutional codes, there is a trellis structure, the minimal trellis [6], which represents the coded sequences minimally under various complexity measures. The interest in the minimal trellis representation comes from its good error performance versus decoding complexity trade-off [7–10] and its potential reductions in power consumption and hardware utilization [11]. However, the minimal trellis construction presented in [6] cannot be readily applied to turbo codes. The reason is that the mapping between information bits and coded bits produced by the minimal trellis corresponds to nonrecursive convolutional encoders, while systematic, recursive mappings are required in turbo coding. This article presents a method which fills this gap.

In this article, we introduce the construction of the minimal trellis for a systematic recursive convolutional encoding matrix, the encoding required for the constituent codes of a turbo code. Our goal is to reduce the decoding complexity of a turbo decoder operating with high-rate constituent encoders. We also conduct a code search to show that a more finely grained decoding complexity-error performance trade-off is achieved with our approach. We tabulate several new encoding matrices with a larger variety of complexities than those in [1], as well as code rates other than k/(k + 1). The proposed minimal trellis can be constructed for systematic recursive convolutional encoders of any rate. Thus, our approach is more general than that in [1], while achieving a distance spectrum (d_i, N_i) better than that of the punctured codes in [4].

The rest of this article is organized as follows. In Section 2, we introduce some basic definitions and notations. Section 3 introduces the minimal trellis construction for a systematic recursive convolutional encoding matrix. In Section 4, we present code search results. Section 5 concludes the article.

2 Preliminaries

Consider a convolutional code C(n,k,ν), where ν, k, and n are the overall constraint length, the number of binary inputs, and the number of binary outputs, respectively, while the code rate is R = k/n. Every convolutional code can be represented by a semi-infinite trellis which (apart from a short transient at its beginning) is periodic, the shortest period being a trellis module. The conventional trellis module Φconv consists of a single trellis section with 2^ν initial states and 2^ν final states; each initial state is connected to final states by 2^k directed branches, and each branch is labeled with n bits.

The minimal trellis module, Φmin, for nonrecursive convolutional codes was developed in [6]. Such a structure has n sections, 2^ν̃_t states at depth t, 2^b̃_t branches emanating from each state at depth t, and one bit labeling each branch, for 0 ≤ t ≤ n − 1. The trellis complexity TC(Φ) of the module Φ, defined in [6], captures the complexity of trellis-based decoding algorithms [12]. It is shown in [6] that

TC(Φconv) = (n/k) 2^(ν+k)   and   TC(Φmin) = (1/k) Σ_{t=0}^{n−1} 2^(ν̃_t + b̃_t)

symbols per bit. The state and branch complexity profiles of the minimal trellis are denoted by ν̃ = (ν̃_0, …, ν̃_{n−1}) and b̃ = (b̃_0, …, b̃_{n−1}), respectively. It has been shown in [6] that for many nonrecursive convolutional codes the trellis complexity TC(Φmin) of the minimal trellis module is considerably smaller than the trellis complexity TC(Φconv) of the conventional trellis module.
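As a quick numeric check of these measures, the following sketch (an illustration, not code from the paper) computes both complexities in symbols per bit; the example values are those quoted later in Example 1 for the (n,k,ν) = (4,3,3) code.

```python
# Sketch: the two trellis-complexity measures defined above, in symbols per bit.

def tc_conventional(n: int, k: int, nu: int) -> float:
    """TC(Phi_conv) = (n/k) * 2^(nu + k)."""
    return (n / k) * 2 ** (nu + k)

def tc_minimal(k: int, state_profile, branch_profile) -> float:
    """TC(Phi_min) = (1/k) * sum over t of 2^(nu~_t + b~_t)."""
    return sum(2 ** (s + b) for s, b in zip(state_profile, branch_profile)) / k

# Values quoted in Example 1 for the (n,k,nu) = (4,3,3) code:
print(tc_conventional(n=4, k=3, nu=3))            # 256/3 ≈ 85.33
print(tc_minimal(3, (3, 4, 4, 4), (1, 1, 1, 0)))  # 32.0
```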

A generator matrix G(D) of a convolutional code C(n,k,ν) is a full-rank k × n polynomial (in D) matrix that encodes/generates C, i.e., is realizable by a linear sequential circuit (called an encoder for C) [13]. Let G(0) denote the binary matrix obtained by substituting D with 0 in the matrix G(D). If G(0) is full-rank, then G(D) is called an encoding matrix and is of particular interest. Two generator (encoding) matrices G(D) and G′(D) are equivalent if they generate the same code or, equivalently, if and only if there exists a k × k nonsingular polynomial matrix T(D) such that G′(D) = T(D)G(D). A generator matrix G(D) is called basic if it is polynomial and has a polynomial right inverse n × k matrix G^(−1)(D). A basic encoding matrix G(D) has a Smith form decomposition ([13], p. 44), ([14], Theorem 4.6)

G(D) = A(D)Γ(D)B(D),   (1)

where the non-zero elements of Γ(D) are called the invariant factors of G(D), and the k × k matrix A(D) and the n × n matrix B(D) are both polynomial with unit determinants. Let ν_i be the constraint length for the ith input of a polynomial generator matrix G(D), defined as ν_i = max_{1≤j≤n} deg G_{i,j}(D). Then the overall constraint length (already mentioned) is given by ν = ν_1 + ⋯ + ν_k. Define also the memory m of G(D) as m = max_{1≤i≤k} ν_i. A basic generator matrix G(D) is called minimal-basic if the overall constraint length ν is minimal over all equivalent basic encoding matrices.
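These quantities are easy to compute mechanically. The sketch below (an illustration, not code from the paper) represents each polynomial entry as a coefficient tuple over GF(2) and evaluates ν_i, ν, and m for the rate-3/4 matrix G(D) obtained later in Example 1, eq. (8).

```python
# Sketch: constraint lengths nu_i, overall constraint length nu, and memory m
# of a polynomial generator matrix. Each entry is a coefficient tuple over
# GF(2), lowest degree first, e.g. (0, 1, 1) stands for D + D^2.

def degree(poly) -> int:
    """Degree of a binary polynomial; the zero polynomial is treated as degree 0 here."""
    nonzero = [i for i, c in enumerate(poly) if c]
    return max(nonzero) if nonzero else 0

def constraint_lengths(G):
    """nu_i = max over j of deg G_{i,j}(D), one value per row."""
    return [max(degree(p) for p in row) for row in G]

# G(D) of Example 1, eq. (8): rows [1 1 1 1], [0 1+D D 1], [D+D^2 D+D^2 1 1+D]
G = [
    [(1,), (1,), (1,), (1,)],
    [(0,), (1, 1), (0, 1), (1,)],
    [(0, 1, 1), (0, 1, 1), (1,), (1, 1)],
]
nus = constraint_lengths(G)
print(nus)        # [0, 1, 2]
print(sum(nus))   # overall constraint length nu = 3
print(max(nus))   # memory m = 2
```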

A generator matrix G(D) can be decomposed as G(D) = G_0 + G_1 D + ⋯ + G_m D^m, where G_i, for i = 0,…,m, are the k × n (scalar) generator submatrices. The scalar generator matrix G_scalar is given by [6]

G_scalar = [ G_0  G_1  ⋯  G_m                   ]
           [       G_0  G_1  ⋯  G_m             ]
           [                  ⋱                 ]
(2)

The “matrix module” is the matrixĜ defined as[6]

Ĝ = [ G_m ]
    [  ⋮  ]
    [ G_0 ]
(3)

The generator matrix G(D) is said to be in left–right (LR) (or minimal-span, or trellis-oriented) form if no column of G_scalar contains more than one underlined entry (the Leftmost nonzero entry in its row) or more than one overlined entry (the Rightmost nonzero entry in its row). If G(D) is in LR form, then it produces the minimal trellis for the code [6].
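One way to test the LR property programmatically relies on the periodicity of G_scalar: since the rows of G_scalar are shifts of the rows of [G_0 G_1 ⋯ G_m] by multiples of n, a column can contain two leftmost (or two rightmost) nonzero entries only if two rows have leftmost (or rightmost) positions that agree modulo n. The sketch below applies this check; it is an illustration under that assumption, not code from the paper.

```python
# Sketch: LR-property check for a polynomial generator matrix, using the fact
# that G_scalar is periodic with period n. Entries are coefficient tuples over
# GF(2), lowest degree first.

def row_span(row_polys, n):
    """Leftmost/rightmost nonzero column of a row of [G_0 | G_1 | ... | G_m]."""
    support = [d * n + j
               for j, poly in enumerate(row_polys)
               for d, c in enumerate(poly) if c]
    return min(support), max(support)

def is_lr(G, n):
    """True iff no column of G_scalar gets two leading or two trailing entries."""
    spans = [row_span(row, n) for row in G]
    lefts = [left % n for left, _ in spans]
    rights = [right % n for _, right in spans]
    return len(set(lefts)) == len(lefts) and len(set(rights)) == len(rights)

# G(D) from eq. (8) of Example 1 is in LR form:
G = [
    [(1,), (1,), (1,), (1,)],
    [(0,), (1, 1), (0, 1), (1,)],
    [(0, 1, 1), (0, 1, 1), (1,), (1, 1)],
]
print(is_lr(G, n=4))  # True
```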

Next, we introduce a method to construct the minimal trellis for systematic recursive convolutional encoding matrices.

3 Construction of the minimal trellis for systematic recursive encoding matrices

The construction of the “minimal” trellis for the systematic recursive convolutional encoding matrix Gsys(D) involves two main steps. First, we find the minimal trellis of an equivalent nonsystematic nonrecursive minimal-basic generator matrix in LR form. Then, the mapping between the information bits and the coded bits in this trellis is changed in order to construct a systematic minimal trellis. The complete algorithm is summarized at the end of this section.

3.1 Minimal trellis for an equivalent encoder

Let Gsys(D) be a systematic recursive encoding matrix for a rate R = k/n convolutional code and let q(D) be the least common multiple of all denominators of the entries of Gsys(D). We construct a nonsystematic nonrecursive basic encoding matrix, denoted by Gb(D), equivalent to Gsys(D), as follows ([13], p. 44), ([14], Theorem 4.6):

  • Find the Smith form decomposition of the polynomial matrix q(D)Gsys(D):

    q(D)Gsys(D) = A(D)Γ′(D)B(D),   (4)

    where the non-zero elements of Γ′(D) are called the invariant factors of q(D)Gsys(D), and the k × k matrix A(D) and the n × n matrix B(D) are both polynomial with unit determinants. Thus, the invariant factor decomposition of Gsys(D) is

    Gsys(D) = A(D)Γ(D)B(D),   (5)

    where Γ(D) = Γ′(D)/q(D).

  • Form the desired encoding matrix Gb(D) as the k × n submatrix of B(D) in (5) consisting of its first k rows.

We can then perform a sequence of row operations on G b (D) to construct a nonsystematic nonrecursive minimal-basic encoding matrix G(D) in LR form.

Example 1

Consider the (n, k, ν) = (4,3,3) systematic recursive encoding matrix [1, Table IV]

Gsys(D) = [ 1  0  0  (1+D+D^3)/(1+D^3)       ]
          [ 0  1  0  (1+D^2+D^3)/(1+D^3)     ]
          [ 0  0  1  (1+D+D^2+D^3)/(1+D^3)   ]
(6)

The invariant factor decomposition of Gsys(D) is given by

Gsys(D) = A(D) Γ(D) B(D), where

A(D) = [ 1                  0  0 ]
       [ D^2+D^4+D^5        1  1 ]
       [ D^2+D^3+D^4+D^5    1  0 ],

Γ(D) = [ 1/(1+D^3)  0  0  0 ]
       [ 0          1  0  0 ]
       [ 0          0  1  0 ],

B(D) = [ 1+D^3              0  0  1+D+D^3        ]
       [ D^2+D^3+D^4+D^5    0  1  1+D+D^4+D^5    ]
       [ D^3                1  1  D+D^3          ]
       [ D^2                0  0  1+D^2          ].
(7)

The basic encoding matrix Gb(D) equivalent to Gsys(D) is readily obtained from (7). Using the greedy algorithm [15] we turn Gb(D) into LR form or, equivalently, into minimal-span form [15, Theorem 6.11], resulting in the following generator matrix

G(D) = [ 1        1        1  1    ]
       [ 0        1+D      D  1    ]
       [ D+D^2    D+D^2    1  1+D  ]
(8)

with overall constraint length ν = 3. The trellis complexity of the conventional module for Gsys(D) is TC(Φconv) = 85.33 symbols per bit.

The minimal trellis module for the convolutional code with G(D) given in (8), constructed with the method presented in [6], is shown in Figure 1. It has state and branch complexity profiles given by ν̃ = (3,4,4,4) and b̃ = (1,1,1,0), respectively. Thus, the trellis complexity of the minimal trellis is TC(Φmin) = 32 symbols per bit. In the first three trellis sections, the upper (resp. lower) branches correspond to information bit “0” (resp. “1”), i.e., the standard convention [6]. The solid branches represent “0” codeword bits and the dashed branches represent “1” codeword bits. The mapping between information bits and coded bits in Figure 1 is not systematic, so this trellis does not represent the same encoder as Gsys(D). The construction of a minimal trellis for systematic recursive encoders is discussed in the next section.

Figure 1

Minimal trellis module for the (n, k, ν) = (4,3,3) convolutional code with G(D) given in (8). The solid/blue branches represent “0” codeword bits and the dashed/red branches represent “1” codeword bits. In the first three transitions, the upper (resp. lower) branches correspond to information bit “0” (resp. “1”), i.e., the standard convention.

3.2 Minimal trellis for systematic recursive encoders

Originally, minimal trellises have been constructed for codes, not for matrices (or encoders). However, the convention that the upper branches refer to information bit 0 and the lower branches refer to information bit 1 yields a particular encoding which, in general, is not systematic. We note that the association of solid/dashed branches with codeword bits cannot be changed, as any change in this regard would result in a trellis which no longer represents the convolutional code. However, by enforcing a different convention on the association of the branches with the information bits, only a different encoding for the same code is obtained.

Since in the systematic part the information bit and the coded bit must have the same value, all we need to do is change, in the information sections of the minimal trellis, the standard convention to one where solid branches refer to information bit 0 and dashed branches refer to information bit 1. Let us refer to this convention as the systematic convention. Returning to the minimal trellis in Figure 1, for the nonsystematic nonrecursive convolutional encoding matrix G(D) given in (8), we only need to adopt the systematic convention in the first three sections to turn this minimal trellis into a systematic trellis.
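The relabeling itself is purely mechanical, as the following sketch illustrates on a toy two-state information section (hypothetical, not taken from Figure 1): the branch set is untouched, and only the rule assigning information bits changes.

```python
# Sketch: relabeling one information section of a trellis. A branch is a tuple
# (from_state, to_state, code_bit). Under the standard convention, the first
# branch leaving a state carries information bit 0 and the second carries 1;
# under the systematic convention, the information bit equals the code bit.

def standard_labels(branches):
    """Assign info bits by branch order per state (upper branch -> 0, lower -> 1)."""
    count_per_state = {}
    labeled = []
    for frm, to, code_bit in branches:
        info = count_per_state.get(frm, 0)
        count_per_state[frm] = info + 1
        labeled.append((frm, to, code_bit, info))
    return labeled

def systematic_labels(branches):
    """Systematic convention: information bit = code bit."""
    return [(frm, to, code_bit, code_bit) for frm, to, code_bit in branches]

# Toy information section (hypothetical): two states, two branches each.
section = [(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
print(standard_labels(section))
print(systematic_labels(section))
# Under the systematic relabeling, every branch satisfies info == code:
print(all(info == code for _, _, code, info in systematic_labels(section)))  # True
```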

It remains to show that the minimal trellis in Figure 1 with the systematic convention is the minimal trellis for the systematic recursive convolutional encoding matrix Gsys(D) in (6). We note that the generator matrices G(D) given in (8) and Gsys(D) in (6) are equivalent in the sense that both generate the same code. Therefore, except for the edge convention, the two trellises are exactly the same. Assuming that the information bits at the input of the encoder associated with Gsys(D) and the information bits associated with the minimal trellis in Figure 1 occupy the same positions, the systematic convention is unique. Consequently, the trellis in Figure 1 with the systematic convention in the first three sections is the minimal trellis for the systematic recursive convolutional encoding matrix Gsys(D) in (6).

The complete algorithm for the construction of the minimal trellis for a systematic recursive encoding matrix is summarized as:

  1. From Gsys(D), use the Smith form decomposition procedure to obtain the basic nonsystematic nonrecursive encoding matrix Gb(D);

  2. If Gb(D) does not have the LR property, apply the greedy algorithm described in [15] to turn it into LR form, and denote the new generator matrix by G(D); otherwise set G(D) ← Gb(D);

  3. Construct the minimal trellis module for G(D) with the method presented in [6];

  4. Adopt the systematic convention on the minimal trellis module obtained in the previous step. The resulting trellis is the desired systematic trellis.

Example 2

Consider the following systematic recursive (n, k, ν) = (5,4,3) encoding matrix from [1, Table IV], with TC(Φconv) = 160 symbols per bit:

Gsys(D) = [ 1  0  0  0  (1+D+D^2+D^3)/(1+D+D^3) ]
          [ 0  1  0  0  (1+D+D^2)/(1+D+D^3)     ]
          [ 0  0  1  0  (1+D^2+D^3)/(1+D+D^3)   ]
          [ 0  0  0  1  (1+D^3)/(1+D+D^3)       ]

Using the procedure described in Section 3, we construct the “minimal” trellis shown in Figure 2. This module has state and branch complexity profiles given by ν̃ = (3,4,4,4,4) and b̃ = (1,1,1,1,0), respectively. Thus, TC(Φmin) = 32 symbols per bit.

Figure 2

Minimal trellis module for the (n, k, ν) = (5,4,3) systematic recursive encoder. Solid/blue branches represent “0” codeword bits while dashed/red branches represent “1” codeword bits. The same convention (i.e., the systematic convention) applies to the first four trellis sections.

4 New codes

Graell i Amat et al. [1] tabulated good k × (k + 1) encoding matrices Gsys(D) to be used as component encoders of parallel concatenated turbo codes. We search for constituent systematic recursive convolutional encoding matrices that are good with respect to the pairs (d_i, N_i) and have a larger variety of complexities than those listed in [1]. The main idea is to propose templates for the polynomial generator matrix G(D) in trellis-oriented form [6, 7] with fixed TC(Φmin). This can be done by placing the leading (underlined) and trailing (overlined) 1’s of each row of the “matrix module” in specific positions, leaving the other positions free to assume any binary value.

Example 3

The following “matrix module”

1 ¯ 0 0 0 1 ¯ 1 ¯ 0 0 1 _ 0 1 _ 0 0 1 _

is associated with an ensemble of nonsystematic nonrecursive convolutional codes of rate 3/4 with minimal trellis module complexity TC(Φmin) = 21.33 symbols per bit.

Remark

In our code search we enforce that the underlined 1’s lie in the first k columns of the matrix module, in order to ensure that the information bits of the corresponding minimal trellis are in the first k sections. □

By varying the free positions (marked with an “*”) in this matrix, several polynomial generator matrices G(D) are produced. For each matrix G(D) in this ensemble, we apply Steps (3) and (4) of the algorithm, i.e., we construct the minimal trellis module for G(D), adopt the systematic convention in the information sections, and then calculate the pairs (d_i, N_i), for i = 2,…,6. The dominant term, d_2, is called the effective free distance of the turbo code [16].
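The enumeration step of this search can be sketched as follows. The 3 × 4 template with three free “*” positions below is hypothetical, chosen only to illustrate the expansion; it is not the template of Example 3.

```python
# Sketch of the search loop's enumeration step: given a matrix-module template
# whose leading/trailing 1's are pinned and whose '*' positions are free,
# generate every binary matrix obtained by filling the free positions.
from itertools import product

def expand_template(template):
    """Yield every binary matrix obtained by filling the '*' entries."""
    free = [(i, j) for i, row in enumerate(template)
            for j, v in enumerate(row) if v == "*"]
    for bits in product((0, 1), repeat=len(free)):
        filled = [list(row) for row in template]  # copy, keeping fixed 0/1 entries
        for (i, j), b in zip(free, bits):
            filled[i][j] = b
        yield filled

# Hypothetical 3x4 template, NOT the one from Example 3:
template = [
    [1, "*", 0, 1],
    [0, 1, "*", 1],
    ["*", 0, 1, 1],
]
candidates = list(expand_template(template))
print(len(candidates))  # 2**3 = 8 matrices in the ensemble
```

Each candidate matrix would then be run through Steps (3) and (4) and scored by its pairs (d_i, N_i).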

Codes of rate R = 2/4, 3/4, 3/5, and 4/5 are listed in Table 1, which indicates the relationship between TC(Φmin) and the error performance expressed in terms of the pairs (d_i, N_i). The matrix G(D) shown in the table, together with the systematic convention, is used to construct the minimal trellis that attains the corresponding (d_i, N_i), i = 2,…,6. The existing codes taken from [1] are also indicated in Table 1. Their (minimal) trellis complexity, shown in the table, was obtained by applying the complete algorithm of Section 3.2 to their respective systematic recursive encoding matrices. For example, the matrices Gsys(D) listed in [1, Table IV] for R = 3/4 yield TC(Φmin) = 10.67 (for ν = 2), TC(Φmin) = 32 (for ν = 3), and TC(Φmin) = 64 (for ν = 4). New codes with a variety of trellis complexities are found with our code search procedure. The effective free distance and/or multiplicity of the code gradually improve as more complex codes are sought.

Table 1 Minimal trellis complexity-performance trade-off for R = 2/4,3/4,3/5,4/5

5 Conclusions

We presented a method to construct the minimal trellis for a systematic recursive convolutional encoding matrix. Such a trellis minimizes the trellis complexity measure introduced by McEliece and Lin [6], which applies to trellis-based decoding algorithms. As a contribution of this work, several new convolutional encoding matrices having an equivalent systematic recursive encoding matrix, optimized for turbo codes, are tabulated. They provide a wide range of performance-complexity trade-offs to serve several practical applications.

Endnotes

^a This work was presented in part at the IEEE International Symposium on Information Theory (ISIT 2011), Saint Petersburg, Russia, July 2011.

References

  1. Graell i Amat A, Montorsi G, Benedetto S: Design and decoding of optimal high-rate convolutional codes. IEEE Trans. Inf. Theory 2004, 50: 867-881.

  2. Kousa MA, Mugaibel AH: Puncturing effects on turbo codes. IEE Proc. Commun 2002, 149: 132-138. 10.1049/ip-com:20020230

  3. Douillard C, Berrou C: Turbo codes with rate-m/(m+1) constituent convolutional codes. IEEE Trans. Commun 2005, 53: 1630-1638. 10.1109/TCOMM.2005.857165

  4. Daneshgaran F, Laddomada M, Mondin M: High-rate recursive convolutional codes for concatenated channel codes. IEEE Trans. Commun 2004, 52: 1846-1850.

  5. Benedetto S, Montorsi G: Design of parallel concatenated convolutional codes. IEEE Trans. Commun 1996, 44: 591-600. 10.1109/26.494303

  6. McEliece RJ, Lin W: The trellis complexity of convolutional codes. IEEE Trans. Inf. Theory 1996, 42: 1855-1864. 10.1109/18.556680

  7. Uchôa-Filho BF, Souza RD, Pimentel C, Jar M: Convolutional codes under a minimal trellis complexity measure. IEEE Trans. Commun 2009, 57: 1-5.

  8. Katsiotis A, Rizomiliotis P, Kalouptsidis N: New constructions of high-performance low-complexity convolutional codes. IEEE Trans. Commun 2010, 58: 1950-1961.

  9. Hug F, Bocharova IE, Johannesson R, Kudryashov B: Searching for high-rate convolutional codes via binary syndrome trellises. In Proc. IEEE Int. Symp. Inform. Theory. Seoul, Korea; 2009:1358-1362.

  10. Katsiotis A, Kalouptsidis N: On (n,n-1) punctured convolutional codes and their trellis modules. IEEE Trans. Commun 2011, 59: 1213-1217.

  11. Pedroni BU, Pedroni VA, Souza RD: Hardware implementation of a Viterbi decoder using the minimal trellis. In Proc. of the 4th Inter. Symp. on Commun., Control and Signal Processing (ISCCSP’2010). Limassol, Cyprus; 2010:1-4.

  12. Vucetic B, Yuan J: Turbo Codes: Principles and Applications. Kluwer Academic Publishers, Boston/Dordrecht/London; 2000.

  13. Johannesson R, Zigangirov KS: Fundamentals of Convolutional Coding. Wiley-IEEE Press, New York; 1999.

  14. Schlegel C, Pérez L: Trellis and Turbo Coding. Wiley-IEEE Press, New York; 2004.

  15. McEliece RJ: On the BCJR trellis for linear block codes. IEEE Trans. Inf. Theory 1996, 42: 1070-1092.

  16. Divsalar D, McEliece RJ: Effective free distance of turbo codes. Electron. Lett 1996, 32: 445-446. 10.1049/el:19960321


Acknowledgements

This work was partially supported by CAPES and CNPq (Brazil).^a

Author information


Corresponding author

Correspondence to Cecilio Pimentel.

Additional information

Competing interests

The authors declare that they have no competing interests.


Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Cite this article

Benchimol, I., Pimentel, C., Souza, R.D. et al. High-rate systematic recursive convolutional encoders: minimal trellis and code search. EURASIP J. Adv. Signal Process. 2012, 243 (2012). https://doi.org/10.1186/1687-6180-2012-243
