High-rate systematic recursive convolutional encoders: minimal trellis and code search

We consider high-rate systematic recursive convolutional encoders to be adopted as constituent encoders in turbo schemes. Douillard and Berrou showed that, despite its complexity, the construction of high-rate turbo codes by means of high-rate constituent encoders is advantageous over the construction based on puncturing rate-1/2 constituent encoders. To reduce the decoding complexity of high-rate codes, we introduce the construction of the minimal trellis for a systematic recursive convolutional encoding matrix. A code search is conducted and examples are provided which indicate that a more finely grained decoding complexity-error performance trade-off is obtained.


Introduction
The typical turbo code configuration is the parallel concatenation of two systematic recursive constituent convolutional encoders of rate 1/2 connected via an interleaver, resulting in a code of rate 1/3. However, higher rate turbo codes may be useful in modern wireless, magnetic recording, and fiber optics applications [1]. The usual approach to increase the overall rate is to puncture selected bits of the turbo codeword [2]. An alternative is to use high-rate constituent encoders [1,3,4], which, according to [3], offers several advantages, such as better convergence of the iterative process, higher throughput, reduced latency, and robustness of the decoder. If puncturing is still needed to achieve a required rate, fewer bits have to be discarded when compared to the conventional rate-1/2 constituent encoders, resulting in less degradation of the correcting capability of the constituent code [3].
In [1], a class of systematic recursive convolutional encoders restricted to be of rate k/(k + 1) is proposed. The codes are optimized in terms of the pairs (d_i, N_i), where d_i is the minimum weight of codewords generated by input sequences of weight i and N_i is the corresponding multiplicity, and are thus suited to be used as constituent encoders of turbo codes [5]. Good systematic recursive encoding matrices with increasing values of encoder memory size for a fixed code rate are listed. Turbo codes constructed with the family of constituent encoders given in [1] are shown to outperform some high-rate turbo codes obtained by puncturing a rate-1/2 constituent encoder.

*Correspondence: cecilio@ufpe.br. UFPE, Recife-PE, Brazil. Full list of author information is available at the end of the article.
One major drawback of high-rate constituent encoders is the decoding complexity, since it increases exponentially with k and with the constraint length for various decoding algorithms. With the motivation of proposing a class of turbo codes with low-complexity, high-rate constituent encoders, Daneshgaran et al. [4] constructed recursive constituent encoders of rate k/(k + 1) by puncturing a rate-1/2 recursive mother encoder. However, the reduction in decoding complexity is obtained at the expense of a poorer spectrum (d_i, N_i) when compared to that of the best recursive encoders of rate k/(k + 1) [1].
An alternative way to reduce the decoding complexity is to consider trellis representations for the constituent codes other than the conventional trellis usually adopted. For nonrecursive convolutional codes, there is a trellis structure, the minimal trellis [6], which represents the coded sequences minimally under various complexity measures. The interest in the minimal trellis representation comes from its good error performance versus decoding complexity trade-off [7-10] and its potential reductions in power consumption and hardware utilization [11]. However, the minimal trellis construction presented in [6] cannot be readily applied to turbo codes. The reason is that the mapping between information bits and coded bits produced by the minimal trellis corresponds to nonrecursive convolutional encoders, while systematic, recursive mappings are required in turbo coding. This article presents a method which fills this gap.

In this article, we introduce the construction of the minimal trellis for a systematic recursive convolutional encoding matrix, the encoding required for the constituent codes of a turbo code. Our goal is to reduce the decoding complexity of a turbo decoder operating with high-rate constituent encoders. We also conduct a code search to show that a more finely grained decoding complexity-error performance trade-off is achieved with our approach. We tabulate several new encoding matrices with a larger variety of complexities than those in [1], as well as code rates other than k/(k + 1). The proposed minimal trellis can be constructed for systematic recursive convolutional encoders of any rate. Thus, our approach is more general than that in [1], while allowing a distance spectrum (d_i, N_i) better than that of the punctured codes in [4] to be achieved.
The rest of this article is organized as follows. In Section 2, we introduce some basic definitions and notation. Section 3 introduces the minimal trellis construction for a systematic recursive convolutional encoding matrix. In Section 4, we present code search results. Section 5 concludes the article.

Preliminaries
Consider a convolutional code C(n, k, ν), where ν, k, and n are the overall constraint length, the number of binary inputs, and the number of binary outputs, respectively, while the code rate is R = k/n. Every convolutional code can be represented by a semi-infinite trellis which (apart from a short transient at its beginning) is periodic, the shortest period being a trellis module. The conventional trellis module M_conv consists of a single trellis section with 2^ν initial states and 2^ν final states; each initial state is connected by 2^k directed branches to final states, and each branch is labeled with n bits.
The minimal trellis module, M_min, for nonrecursive convolutional codes was developed in [6]. Such a structure has n sections, 2^{ν_t} states at depth t, and 2^{b_t} branches emanating from each state at depth t, with one bit labeling each branch, for 0 ≤ t ≤ n − 1. The trellis complexity TC(M) of a module M, defined in [6], captures the complexity of trellis-based decoding algorithms [12]. It is shown in [6] that

TC(M_conv) = (n/k) 2^{ν+k}   and   TC(M_min) = (1/k) Σ_{t=0}^{n−1} 2^{ν_t + b_t}

symbols per bit. The state and branch complexity profiles of the minimal trellis are denoted by (ν_0, . . . , ν_{n−1}) and (b_0, . . . , b_{n−1}), respectively. It has been shown in [6] that for many nonrecursive convolutional codes the trellis complexity TC(M_min) of the minimal trellis module is considerably smaller than the trellis complexity TC(M_conv) of the conventional trellis module.
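Both complexity measures are direct to compute. The sketch below evaluates them from (n, k, ν) and from the state/branch profiles; the function names are ours, and the numbers checked are the rate-3/4, ν = 3 values quoted in the example of Section 3:

```python
def tc_conv(n, k, nu):
    """TC of the conventional module [6]: (n/k) * 2^(nu + k) symbols per bit."""
    return n / k * 2 ** (nu + k)

def tc_min(k, nu_profile, b_profile):
    """TC of the minimal module [6]: (1/k) * sum_t 2^(nu_t + b_t) symbols per bit."""
    return sum(2 ** (nu_t + b_t) for nu_t, b_t in zip(nu_profile, b_profile)) / k

# Rate-3/4, nu = 3 example from the article:
print(round(tc_conv(4, 3, 3), 2))             # 85.33
print(tc_min(3, (3, 4, 4, 4), (1, 1, 1, 0)))  # 32.0
```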
A generator matrix G(D) of a convolutional code C(n, k, ν) is a full-rank k × n polynomial (in D) matrix that encodes/generates C, i.e., is realizable by a linear sequential circuit (called an encoder for C) [13]. Let G(0) denote the binary matrix obtained when substituting D with 0 in the matrix G(D). If G(0) is full-rank, then G(D) is called an encoding matrix and is of particular interest.
Every polynomial generator matrix G(D) admits a Smith form decomposition

G(D) = A(D) Γ(D) B(D),

where the non-zero elements of Γ(D) are called the invariant factors of G(D), and the k × k matrix A(D) and the n × n matrix B(D) are both polynomial with unit determinants. Let ν_i be the constraint length for the ith input of a polynomial generator matrix G(D), defined as ν_i = max_{1≤j≤n} deg g_{ij}(D). Then the overall constraint length (already mentioned) is given by ν = Σ_{i=1}^{k} ν_i. Writing G(D) = G_0 + G_1 D + · · · + G_m D^m, where G_l, l = 0, . . . , m, are the k × n (scalar) generator submatrices, the scalar generator matrix G_scalar is the semi-infinite binary matrix whose block rows are [G_0 G_1 · · · G_m], each successive block row shifted n columns to the right [6]. The "matrix module" is the matrix Ĝ formed by the rows of G_scalar restricted to one period of n columns. The generator matrix G(D) is said to be in left-right (LR) (or minimal-span, or trellis-oriented) form if no column of G_scalar contains more than one underlined entry (the leftmost nonzero entry in its row) or more than one overlined entry (the rightmost nonzero entry in its row). If G(D) is in LR form, then it produces the minimal trellis for the code [6].
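The LR condition is easy to test mechanically. The sketch below checks a binary matrix (e.g., the rows of G_scalar restricted to one period) for the LR property; the function name and the toy matrices are ours:

```python
def is_lr_form(M):
    """True if no column contains the leftmost nonzero entry (underlined) of
    more than one row, nor the rightmost nonzero entry (overlined) of more
    than one row."""
    lefts, rights = [], []
    for row in M:
        support = [j for j, v in enumerate(row) if v]
        if support:
            lefts.append(support[0])    # underlined position of this row
            rights.append(support[-1])  # overlined position of this row
    return len(set(lefts)) == len(lefts) and len(set(rights)) == len(rights)

print(is_lr_form([[1, 1, 0], [0, 1, 1]]))  # True: row spans [0,1] and [1,2]
print(is_lr_form([[1, 1, 0], [1, 0, 1]]))  # False: two rows start in column 0
```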

Construction of the minimal trellis for systematic recursive encoding matrices
The construction of the "minimal" trellis for the systematic recursive convolutional encoding matrix G_sys(D) involves two main steps. First, we find the minimal trellis of an equivalent nonsystematic nonrecursive minimal-basic generator matrix in LR form. Then, the mapping between the information bits and coded bits in this trellis is changed in order to construct a systematic minimal trellis. The complete algorithm is summarized at the end of this section.

Minimal trellis for an equivalent encoder
Let G_sys(D) be a systematic recursive encoding matrix for a rate R = k/n convolutional code and let q(D) be the least common multiple of all denominators of the entries in G_sys(D). We construct a nonsystematic nonrecursive basic encoding matrix, denoted by G_b(D), equivalent to G_sys(D), as follows ([13], p. 44; [14], Theorem 4.6):

• Find the Smith form decomposition of the polynomial matrix q(D)G_sys(D),

q(D) G_sys(D) = A(D) Γ(D) B(D),

where the non-zero elements of Γ(D) are called the invariant factors of q(D)G_sys(D), and the k × k matrix A(D) and the n × n matrix B(D) are both polynomial with unit determinants. Thus, the invariant factor decomposition of G_sys(D) is

G_sys(D) = A(D) Λ(D) B(D), (5)

where Λ(D) = Γ(D)/q(D).

• Form the desired encoding matrix G_b(D) as the k × n submatrix of B(D) in (5) consisting of the first k rows.
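Only the first, mechanical part of this procedure (clearing denominators to obtain the polynomial matrix q(D)G_sys(D)) is sketched below; computing the Smith form itself is best left to a computer algebra system. Polynomials over GF(2) are represented as integer bitmasks (bit l is the coefficient of D^l), and the example entries such as (1 + D + D^2)/(1 + D + D^3) are hypothetical, not taken from this article:

```python
def clmul(a, b):
    """Carry-less multiplication: product in GF(2)[D] of bitmask polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def polydivmod(a, b):
    """Quotient and remainder of bitmask polynomials a / b in GF(2)[D]."""
    q, db = 0, b.bit_length() - 1
    while a.bit_length() - 1 >= db:
        s = a.bit_length() - 1 - db
        q ^= 1 << s
        a ^= b << s
    return q, a

def polygcd(a, b):
    while b:
        a, b = b, polydivmod(a, b)[1]
    return a

def polylcm(a, b):
    return clmul(polydivmod(a, polygcd(a, b))[0], b)

def clear_denominators(G_sys):
    """G_sys: matrix of (num, den) bitmask pairs. Returns q(D) and the
    polynomial matrix q(D) * G_sys(D)."""
    q = 1
    for row in G_sys:
        for _, den in row:
            q = polylcm(q, den)
    return q, [[clmul(num, polydivmod(q, den)[0]) for num, den in row]
               for row in G_sys]

# Hypothetical rate-2/3 systematic rows:
# [1 0 (1+D+D^2)/(1+D+D^3)] and [0 1 D/(1+D+D^3)]
q, M = clear_denominators([[(1, 1), (0, 1), (0b111, 0b1011)],
                           [(0, 1), (1, 1), (0b010, 0b1011)]])
print(q)  # 11, i.e., 1 + D + D^3
print(M)  # [[11, 0, 7], [0, 11, 2]]
```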
We can then perform a sequence of row operations on G_b(D) to construct a nonsystematic nonrecursive minimal-basic encoding matrix G(D) in LR form.

Example 1. Consider the rate-3/4 systematic recursive encoding matrix G_sys(D) from [1, Table IV], given in (6). The invariant factor decomposition of G_sys(D) is given by (7). The basic encoding matrix G_b(D) equivalent to G_sys(D) is readily obtained from (7). Using the greedy algorithm [15] we put G_b(D) into LR form, or equivalently, into minimal-span form [15, Theorem 6.11], resulting in the generator matrix G(D) given in (8), with overall constraint length ν = 3. The trellis complexity of the conventional module for G_sys(D) is TC(M_conv) = 85.33 symbols per bit.
The minimal trellis module for the convolutional code with G(D) given in (8), constructed with the method presented in [6], is shown in Figure 1. It has state and branch complexity profiles (ν_0, ν_1, ν_2, ν_3) = (3, 4, 4, 4) and (b_0, b_1, b_2, b_3) = (1, 1, 1, 0), respectively. Thus, the trellis complexity of the minimal trellis is TC(M_min) = 32 symbols per bit. In the first three trellis sections, the upper (resp. lower) branches correspond to information bit "0" (resp. "1"), i.e., the standard convention [6]. The solid branches represent "0" codeword bits and the dashed branches represent "1" codeword bits. The mapping between information bits and coded bits in Figure 1 is not systematic, so this trellis does not represent the same encoder as G_sys(D). The construction of a minimal trellis for systematic recursive encoders is discussed in the next section.

[Figure 1. Minimal trellis module for the (n, k, ν) = (4, 3, 3) convolutional code with G(D) given in (8). Solid/blue branches represent "0" codeword bits and dashed/red branches represent "1" codeword bits. In the first three transitions, the upper (resp. lower) branches correspond to information bit "0" (resp. "1"), i.e., the standard convention.]

Minimal trellis for systematic recursive encoders
Originally, minimal trellises were constructed for codes, not for matrices (or encoders). However, the convention that the upper branches refer to the information bit 0 and the lower branches refer to the information bit 1 yields a particular encoding which, in general, is not systematic. We note that the association of solid/dashed branches with codeword bits cannot be changed, as any change in this regard would result in a trellis which no longer represents the convolutional code. However, by enforcing a different convention on the association of the branches to the information bits, only a different encoding for the same code is obtained. Since in the systematic part the information bit and the coded bit must have the same value, all we need to do is to change, in the information sections of the minimal trellis, the standard convention to one where solid branches refer to the information bit 0 and dashed branches refer to the information bit 1. We refer to this convention as the systematic convention. Returning to the minimal trellis in Figure 1, for the nonsystematic nonrecursive convolutional encoding matrix G(D) given in (8), we only need to adopt the systematic convention in the first three sections to turn this minimal trellis into a systematic trellis.
It remains to show that the minimal trellis in Figure 1 with the systematic convention is the minimal trellis for the systematic recursive convolutional encoding matrix G sys (D) in (6). We note that the generator matrices G(D) given in (8) and G sys (D) in (6) are equivalent in the sense that both generate the same code. Therefore, except for the edge convention, the two trellises are exactly the same. Assuming that the information bits at the input of the encoder associated with G sys (D) and the information bits associated with the minimal trellis in Figure 1 occupy the same positions, the systematic convention is unique. Consequently, the trellis in Figure 1 with the systematic convention in the first three sections is the minimal trellis for the systematic recursive convolutional encoding matrix G sys (D) in (6).
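The relabeling amounts to overwriting, in the information sections only, the information-bit label of every branch with its codeword-bit label. A minimal sketch follows; the branch-tuple representation is our assumption, not a structure from this article:

```python
def apply_systematic_convention(sections, info_sections):
    """sections: per-depth lists of branches (state, next_state, code_bit, info_bit),
    with info_bit initially assigned by the standard (upper/lower) convention.
    In each information section, re-label info_bit := code_bit."""
    return [
        [(s, ns, c, c) for (s, ns, c, _) in sec] if t in info_sections else list(sec)
        for t, sec in enumerate(sections)
    ]

# One information section: the branch carrying code bit 1 now also carries info bit 1.
relabeled = apply_systematic_convention([[(0, 0, 0, 0), (0, 1, 1, 0)]], {0})
print(relabeled)  # [[(0, 0, 0, 0), (0, 1, 1, 1)]]
```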
The complete algorithm for the construction of the minimal trellis for a systematic recursive encoding matrix is summarized as follows:

(1) Compute q(D), the least common multiple of the denominators of the entries of G_sys(D), and find the Smith form decomposition of q(D)G_sys(D).
(2) Form the basic encoding matrix G_b(D) from the first k rows of B(D) and reduce it, via row operations, to a minimal-basic encoding matrix G(D) in LR form.
(3) Construct the minimal trellis module for G(D) with the method in [6].
(4) Adopt the systematic convention in the information sections of the trellis.

[Figure 2. Minimal trellis module for an (n, k, ν) = (5, 4, 3) systematic recursive encoder. Solid/blue branches represent "0" codeword bits while dashed/red branches represent "1" codeword bits. The systematic convention applies to the first four trellis sections.]

New codes
Graell i Amat et al. [1] list good systematic recursive encoding matrices of rate k/(k + 1). In this section, we conduct a code search for new encoding matrices with a wider variety of complexities compared to those listed in [1]. The main idea is to propose templates with polynomial generator matrix G(D) in trellis-oriented form [6,7] with fixed TC(M_min). This can be done by placing the leading (underlined) and trailing (overlined) 1's of each row of the "matrix module" in specific positions, leaving the other positions free to assume any binary value.
Example 3. The following "matrix module" is associated with an ensemble of nonsystematic nonrecursive convolutional codes of rate 3/4 with trellis complexity of the minimal trellis module TC(M_min) = 21.33 symbols per bit.
Remark. In our code search, we enforce that the underlined 1's lie in the first k columns of the matrix module in order to ensure that the information bits of the corresponding minimal trellis are in the first k sections.
By varying the free positions (marked with an "*") in this matrix, several polynomial generator matrices G(D) are produced. For each matrix G(D) in this ensemble, we apply Steps (3) and (4) of the algorithm, i.e., we construct the minimal trellis module for G(D) and adopt the systematic convention in the information sections, and then calculate the pairs (d_i, N_i) for i = 2, . . . , 6. The dominant term, d_2, is called the effective free distance of the turbo code [16].
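The search over a template can be sketched as exhaustive enumeration of the free positions; '*' marks a free entry, and the representation of the template as a list of rows is our own choice:

```python
from itertools import product

def enumerate_matrix_modules(template):
    """Yield every binary matrix obtained by filling the '*' entries of the
    template with 0s and 1s; fixed 0/1 entries (including the placed leading
    and trailing 1's) are kept as given."""
    free = [(i, j) for i, row in enumerate(template)
            for j, v in enumerate(row) if v == '*']
    for bits in product((0, 1), repeat=len(free)):
        M = [list(row) for row in template]
        for (i, j), b in zip(free, bits):
            M[i][j] = b
        yield M

# A toy 2x4 template with three free positions -> 2^3 = 8 candidate modules.
mods = list(enumerate_matrix_modules([[1, '*', '*', 1], [0, 1, '*', 1]]))
print(len(mods))  # 8
```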
Codes of rates R = 2/4, 3/4, 3/5, and 4/5 are listed in Table 1, which indicates the relationship between TC(M_min) and the error performance expressed in terms of the pairs (d_i, N_i). The matrix G(D) shown in the table, together with the systematic convention, is used to construct the minimal trellis that attains the corresponding (d_i, N_i), i = 2, . . . , 6. The existing codes taken from [1] are also indicated in Table 1. Their (minimal) trellis complexity, shown in the table, was obtained by applying the complete algorithm of Section 3.2 to their respective systematic recursive encoding matrices. For example, the matrices G_sys(D) listed in [1, Table IV] for R = 3/4 yield TC(M_min) = 10.67 (for ν = 2), TC(M_min) = 32 (for ν = 3), and TC(M_min) = 64 (for ν = 4). New codes with a variety of trellis complexities are found with our code search procedure. The effective free distance and/or multiplicity of the code is gradually improved as more complex codes are sought.

Conclusions
We present a method to construct the minimal trellis for a systematic recursive convolutional encoding matrix. Such a trellis minimizes the trellis complexity measure introduced by McEliece and Lin [6], which applies to trellis-based decoding algorithms. As a further contribution of this work, several new convolutional encoding matrices having an equivalent systematic recursive encoding matrix, optimized for turbo codes, are tabulated. They provide a wide range of performance-complexity trade-offs, serving several practical applications.