In this paper, only the case in which the sparsifying matrix $\Psi$ is fixed while the projection matrix $\Phi$ can be arbitrary is considered. Hence, the objective is to optimize the $\Phi$ that minimizes the mutual coherence $\mu(\Phi\Psi)$. After reviewing the related prior work, the proposed algorithm is introduced.

### 3.1. Elad's Method [7]

Instead of the mutual coherence, Elad considered a different measure, the $t$-averaged mutual coherence, which reflects the average behavior. The $t$-averaged mutual coherence $\mu_t$ of $D$ is defined as the average of all absolute, normalized inner products between different columns of $D$ (denoted as $g_{ij}$) that are above $t$. Formally,

$$\mu_t(D) = \frac{\sum_{i \neq j,\ |g_{ij}| \ge t} |g_{ij}|}{\sum_{i \neq j,\ |g_{ij}| \ge t} 1}.$$

Put very simply, the objective is to minimize $\mu_t(\Phi\Psi)$ with respect to $\Phi$, assuming that $\Psi$ and the parameter $t$ are fixed and known. In this algorithm, the main goal is the reduction of the absolute inner products that are above $t$. The Gram matrix $G = \tilde{D}^T \tilde{D}$ of the normalized equivalent dictionary $\tilde{D}$ is computed, and the values above $t$ are shrunk by multiplying with a factor $\gamma$, $0 < \gamma < 1$. In order to preserve the order of the absolute values in the Gram matrix, entries in $G$ with magnitude below $t$ but above $\gamma t$ are shrunk by a smaller amount using the following function:

$$\hat{g}_{ij} = \begin{cases} \gamma g_{ij}, & |g_{ij}| \ge t, \\ \gamma t \cdot \operatorname{sign}(g_{ij}), & t > |g_{ij}| \ge \gamma t, \\ g_{ij}, & \gamma t > |g_{ij}|. \end{cases}$$

In general, the shrinking operation causes the resulting Gram matrix to become full rank. Thus, the next steps must mend this by forcing the rank back to $M$ and finding the matrix that best describes a square root of the obtained Gram matrix; this can be realized using the SVD. The details can be found in [7].
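As a rough sketch of the procedure just described, the following code (with illustrative sizes, threshold $t$, and shrink factor $\gamma$, none of which are values from [7]) shrinks the Gram matrix, forces its rank back to $M$, and recovers an updated projection by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 16, 64, 96            # illustrative sizes, not from the paper
t, gamma = 0.3, 0.7             # illustrative threshold and shrink factor

Phi = rng.standard_normal((M, N))   # projection matrix
Psi = rng.standard_normal((N, K))   # sparsifying dictionary

D = Phi @ Psi
D_tilde = D / np.linalg.norm(D, axis=0)   # normalize columns
G = D_tilde.T @ D_tilde                   # Gram matrix (rank M)

# Shrink: scale entries with |g| >= t by gamma; clip the band
# gamma*t <= |g| < t down to gamma*t to preserve the ordering.
G_s = G.copy()
big = np.abs(G) >= t
band = (np.abs(G) >= gamma * t) & ~big
G_s[big] *= gamma
G_s[band] = gamma * t * np.sign(G[band])
np.fill_diagonal(G_s, 1.0)                # restore the unit diagonal

# The shrunk Gram matrix is generally full rank; force rank M and take
# a square root via the eigendecomposition (the SVD of a symmetric matrix).
w, U = np.linalg.eigh(G_s)
top = np.argsort(w)[::-1][:M]             # M largest eigenvalues
S = np.diag(np.sqrt(np.maximum(w[top], 0.0))) @ U[:, top].T

# Recover an updated projection from S ≈ Phi_new @ Psi in least squares.
Phi_new = S @ np.linalg.pinv(Psi)
```

Iterating these steps, with the threshold gradually decreased, is the shape of Elad's loop; the exact schedule and stopping rule are those of [7].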

### 3.2. Duarte-Carvajalino and Sapiro's Method

Unlike the previous one, Duarte-Carvajalino and Sapiro's method is noniterative. Instead of targeting the $t$-averaged mutual coherence of the equivalent dictionary $D = \Phi\Psi$, this method addresses the problem by making any subset of columns in $D$ as orthogonal as possible or, equivalently, making the Gram matrix as close as possible to an identity matrix. Their approach was carried out as follows.

Consider the Gram matrix of the equivalent dictionary $D = \Phi\Psi$:

$$G = D^T D = \Psi^T \Phi^T \Phi \Psi. \tag{9}$$

The objective is to find $\Phi$ such that the Gram matrix is as close as possible to the identity matrix:

$$\Psi^T \Phi^T \Phi \Psi \approx I. \tag{10}$$

Multiplying both sides of (10) by $\Psi$ on the left and by $\Psi^T$ on the right, it becomes

$$\Psi \Psi^T \Phi^T \Phi \Psi \Psi^T \approx \Psi \Psi^T. \tag{11}$$

Now, consider the eigendecomposition of $\Psi\Psi^T$, which is

$$\Psi \Psi^T = V \Lambda V^T. \tag{12}$$

Then (11) becomes

$$V \Lambda V^T \Phi^T \Phi V \Lambda V^T \approx V \Lambda V^T, \tag{13}$$

which is equivalent to

$$\Lambda V^T \Phi^T \Phi V \Lambda \approx \Lambda. \tag{14}$$

By denoting $\Gamma = \Phi V$, they finally formulated the problem as minimizing the following function with respect to $\Gamma$:

$$\min_{\Gamma} \left\| \Lambda - \Lambda \Gamma^T \Gamma \Lambda \right\|_F^2. \tag{15}$$

By solving problem (15), they obtained the optimized projection matrix, recovered as $\Phi = \Gamma V^T$ since $V$ is orthogonal. The details of the solution can be found in [8].
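The chain of manipulations above suggests one simple, hedged construction (not the exact solver of [8]): take the top $M$ eigenpairs of $\Psi\Psi^T$ and set $\Phi = \Lambda_M^{-1/2} V_M^T$, which satisfies the final approximate identity exactly on the dominant $M$-dimensional eigenspace. All sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, K = 16, 64, 96                  # illustrative sizes
Psi = rng.standard_normal((N, K))     # sparsifying dictionary

# Eigendecomposition Psi Psi^T = V Lambda V^T, eigenvalues sorted descending.
w, V = np.linalg.eigh(Psi @ Psi.T)
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

# Simplified choice: Phi = Lambda_M^{-1/2} V_M^T from the top M eigenpairs.
Phi = np.diag(1.0 / np.sqrt(w[:M])) @ V[:, :M].T

# Then Lambda V^T Phi^T Phi V Lambda equals Lambda on its leading M x M block.
L = np.diag(w)
A = L @ V.T @ Phi.T @ Phi @ V @ L
```

On the leading block this makes $\Lambda \Gamma^T \Gamma \Lambda$ match $\Lambda$ exactly, which is the quantity the method drives toward; minimizing the mismatch over all of $\Lambda$ is what [8] actually solves.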

### 3.3. The Proposed Method

Elad's method is time-consuming, and the shrinkage function creates some large values that are not present in the original Gram matrix. Large off-diagonal values in the Gram matrix completely ruin the worst-case guarantees of the reconstruction algorithms. Duarte-Carvajalino and Sapiro's method is noniterative, but its relative reconstruction error is high. To overcome these drawbacks, a method based on ETF design is proposed in this paper. The objective is to find an equivalent dictionary that is as close as possible to an ETF, because of the minimum-coherence property of ETFs, and then to construct the optimized projection matrix from that equivalent dictionary. Solving the problem exactly is infeasible because of its complexity, so an alternating-minimization-type method is used to find a feasible solution.

First, the problem is modeled as an optimization problem. Let $G$ be the Gram matrix of $\tilde{D}$. The mutual coherence of $\tilde{D}$ is the maximum absolute value of the off-diagonal entries of $G$, supposing the columns of $\tilde{D}$ are normalized. For such a $\tilde{D}$, if the magnitudes of all off-diagonal entries of $G$ are equal, $\tilde{D}$ has minimum coherence [10]. This normalized dictionary is called an ETF. Although this type of frame has many nice properties, an ETF does not exist for every choice of dimensions. Therefore the optimization process aims at finding an admissible solution that is as close as possible to an ETF.

For the normalized equivalent dictionary $\tilde{D} = [\tilde{d}_1, \tilde{d}_2, \ldots, \tilde{d}_K]$, the mutual coherence of $\tilde{D}$ is defined as

$$\mu(\tilde{D}) = \max_{1 \le i, j \le K,\ i \neq j} \left| \tilde{d}_i^T \tilde{d}_j \right|. \tag{16}$$

A column-normalized dictionary $\tilde{D}$ is called an ETF when there is a constant $c$ such that

$$\left| \tilde{d}_i^T \tilde{d}_j \right| = c \quad \text{for all } i \neq j. \tag{17}$$

Strohmer and Heath Jr. showed in [16] that if there is an ETF in the set of uniform frames, it is the solution of

$$\min_{\tilde{D}} \max_{i \neq j} \left| \tilde{d}_i^T \tilde{d}_j \right|. \tag{18}$$

To study the lower bound of $\mu(\tilde{D})$, the existence of an ETF, and its Gram matrix, Strohmer showed that $\mu(\tilde{D})$ is lower bounded by

$$\mu(\tilde{D}) \ge \sqrt{\frac{K - M}{M(K - 1)}}, \tag{19}$$

with equality if and only if $\tilde{D}$ is an ETF.
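This lower bound (the Welch bound) is easy to evaluate, and a random Gaussian dictionary typically sits well above it; a quick numerical check with illustrative sizes:

```python
import numpy as np

def welch_bound(M, K):
    """Lower bound on the mutual coherence of an M x K dictionary."""
    return np.sqrt((K - M) / (M * (K - 1)))

rng = np.random.default_rng(2)
M, K = 16, 96                          # illustrative sizes
D = rng.standard_normal((M, K))
D /= np.linalg.norm(D, axis=0)         # normalize columns

G = D.T @ D
mu = np.max(np.abs(G - np.eye(K)))     # mutual coherence of D
```

For these sizes the bound is about 0.23, while the random dictionary's coherence is typically several times larger; that gap is what the proposed optimization tries to close.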

Let $\mathcal{H}$ be the set of Gram matrices of all ETFs. If $H \in \mathcal{H}$, then the diagonal elements of $H$ are one and the absolute values of the off-diagonal elements are $\sqrt{(K-M)/(M(K-1))}$, respectively. A nearness measure of $\tilde{D}$ to the set of ETFs can be defined as the minimum distance between the Gram matrix of $\tilde{D}$ and $\mathcal{H}$. To minimize the distance of a dictionary to an ETF, one needs to solve

$$\min_{\tilde{D},\ H \in \mathcal{H}} \left\| H - \tilde{D}^T \tilde{D} \right\|. \tag{20}$$

The matrix operator $\|\cdot\|$ above is defined as the maximum absolute value of the elements of the matrix. Instead, it is better to use a different norm, which simplifies the problem. An advantage of using $\|\cdot\|_F$ in the given problem is that it accounts for the errors of all elements. This yields the following formulation:

$$\min_{\tilde{D},\ H \in \mathcal{H}} \left\| H - \tilde{D}^T \tilde{D} \right\|_F, \tag{21}$$

where $\|\cdot\|_F$ is the Frobenius norm. This is a nonconvex optimization problem in general; it might have a set of solutions or no solution at all. Extend $\mathcal{H}$ to a convex set $\mathcal{H}_\xi$, which is not empty for any $\xi \ge \sqrt{(K-M)/(M(K-1))}$:

$$\mathcal{H}_\xi = \left\{ H \in \mathbb{R}^{K \times K} : H = H^T,\ h_{ii} = 1,\ \max_{i \neq j} |h_{ij}| \le \xi \right\}. \tag{22}$$
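Membership in the convex set $\mathcal{H}_\xi$ is easy to verify numerically, and projecting a symmetric matrix onto it in the Frobenius norm reduces to clipping the off-diagonal entries to $[-\xi, \xi]$ and resetting the diagonal; a minimal sketch with illustrative values:

```python
import numpy as np

rng = np.random.default_rng(4)
K, xi = 96, 0.25                       # illustrative size and threshold

A = rng.standard_normal((K, K))
G = (A + A.T) / 2                      # an arbitrary symmetric matrix

# Frobenius projection onto H_xi: clip off-diagonals, fix the diagonal at 1.
H = np.clip(G, -xi, xi)
np.fill_diagonal(H, 1.0)
```

Because the constraints defining $\mathcal{H}_\xi$ are entrywise and the Frobenius distance decomposes entrywise, this clipping is the exact projection for symmetric input.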

Relaxing (21) by replacing $\mathcal{H}$ with $\mathcal{H}_\xi$ gives the following optimization problem:

$$\min_{\tilde{D},\ H \in \mathcal{H}_\xi} \left\| H - \tilde{D}^T \tilde{D} \right\|_F. \tag{23}$$

A standard method to solve (23) is alternating projection [17]. In this work, a method that has similarities with alternating projection, but does not follow its steps exactly, is used. The difference lies in the stage of updating the current solution with respect to $\mathcal{H}_\xi$: a point between the current solution and its projection onto $\mathcal{H}_\xi$ is chosen, because after being projected onto $\mathcal{H}_\xi$ the structure of the Gram matrix changes significantly, and the selection of a new point in the following step becomes very difficult. After performing the alternating minimization, the optimized projection matrix is constructed from the output Gram matrix with a rank-revealing QR factorization with eigenvalue decomposition. The details can be found in [18].

The conditions under which the algorithm converges can be found in [17].

The following are the steps of the proposed algorithm for optimizing the projection matrix, supposing the sparsifying matrix $\Psi$ is known.

(1) Initialize the projection matrix $\Phi_0$, the sparsifying matrix $\Psi$, the equivalent dictionary $D_0 = \Phi_0 \Psi$, and the number of iterative steps $T$.

For $k = 1, \ldots, T$:

(2) Compute the Gram matrix $G_k = \tilde{D}_{k-1}^T \tilde{D}_{k-1}$ of the column-normalized equivalent dictionary $\tilde{D}_{k-1}$, and denote the $(i, j)$ element of $G_k$ as $g_{ij}$.

(3) Project the Gram matrix onto $\mathcal{H}_\xi$, that is,

$$h_{ij} = \begin{cases} g_{ij}, & i = j \text{ or } |g_{ij}| \le \xi, \\ \xi \cdot \operatorname{sign}(g_{ij}), & i \neq j \text{ and } |g_{ij}| > \xi. \end{cases}$$

(4) Choose a point between the current solution and its projection onto $\mathcal{H}_\xi$ to update the Gram matrix:

$$G_k \leftarrow \alpha H_k + (1 - \alpha) G_k, \qquad 0 < \alpha < 1.$$

(5) Update the projection matrix $\Phi_k$ from the rank-$M$ square root of the updated Gram matrix, using a QR factorization with eigenvalue decomposition.

end
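The loop above can be sketched in code. Here $\xi$ is set to the coherence lower bound, the step $\alpha$ for moving part-way toward the projection and the iteration count are assumed parameters, and the final update uses a plain eigendecomposition in place of the rank-revealing QR of [18]:

```python
import numpy as np

def coherence(Phi, Psi):
    """Mutual coherence of the column-normalized equivalent dictionary."""
    D = Phi @ Psi
    D = D / np.linalg.norm(D, axis=0)
    G = D.T @ D
    return np.max(np.abs(G - np.eye(G.shape[0])))

rng = np.random.default_rng(3)
M, N, K = 16, 64, 96                     # illustrative sizes
T, alpha = 50, 0.5                       # assumed iteration count and step
Psi = rng.standard_normal((N, K))
Phi = rng.standard_normal((M, N))
xi = np.sqrt((K - M) / (M * (K - 1)))    # coherence lower bound

mu0 = coherence(Phi, Psi)
for _ in range(T):
    D = Phi @ Psi
    D = D / np.linalg.norm(D, axis=0)    # normalized equivalent dictionary
    G = D.T @ D                          # Gram matrix

    H = np.clip(G, -xi, xi)              # project onto H_xi ...
    np.fill_diagonal(H, 1.0)             # ... keeping a unit diagonal

    G = alpha * H + (1 - alpha) * G      # move only part-way toward H_xi

    w, U = np.linalg.eigh(G)             # rank-M square root of updated Gram
    top = np.argsort(w)[::-1][:M]
    S = np.diag(np.sqrt(np.maximum(w[top], 0.0))) @ U[:, top].T
    Phi = S @ np.linalg.pinv(Psi)        # back to a projection matrix
mu1 = coherence(Phi, Psi)
```

On random data this drives the coherence of $\Phi\Psi$ well below its initial value and toward $\xi$; the convergence conditions are those discussed in [17].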