Due to Proposition 2.1, it is possible to design LR filters based on the HOSVD. This approach does not work when all p-ranks are full (i.e., $r_p = I_p$, $p = 1, \ldots, P$), since no projection can be performed. However, the data may still have a LR structure. This is the case for correlated data, where one or more ranks relative to a group of dimensions are deficient. Tensor decompositions that exploit this kind of structure have not yet been proposed. To fill this gap, we introduce a new tool able to extract this kind of information. This section contains the main contribution of this paper: the derivation of the AU-HOSVD and its principal properties.
3.1 Generalization of standard operators
Notation of indices In order to consider correlated information, we introduce a new notation for the indices of a tensor. We consider $\mathcal{A} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$, a $P$th-order tensor. We denote by $\mathcal{I} = \{1, \ldots, P\}$ the set of the dimensions and by $A_1, \ldots, A_L$ the $L$ subsets of $\mathcal{I}$ which define a partition of $\mathcal{I}$. In other words, $A_1, \ldots, A_L$ satisfy the following conditions:

$A_1 \cup A_2 \cup \cdots \cup A_L = \mathcal{I}$ and $A_l \cap A_{l'} = \emptyset$ for $l \neq l'$.

Moreover, $\prod_{p \in A_l} I_p$ is denoted $I_{A_l}$. For example, when $P = 4$ and $A_1 = \{1, 3\}$, $I_{A_1}$ means $I_1 I_3$.
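To make this notation concrete, here is a minimal Python sketch (the names `dims` and `partition` are ours, and indices are 0-based) that checks the partition conditions and computes the grouped sizes $I_{A_l}$:

```python
from math import prod

dims = (2, 3, 4, 5)            # (I_1, I_2, I_3, I_4), i.e., P = 4
partition = [(0, 2), (1, 3)]   # A_1 = {1, 3}, A_2 = {2, 4} in 0-based form

# Partition conditions: the subsets are disjoint and cover {1, ..., P}.
flat = sorted(p for A in partition for p in A)
assert flat == list(range(len(dims)))

# Grouped sizes: I_{A_l} is the product of the I_p for p in A_l.
I_A = [prod(dims[p] for p in A) for A in partition]
print(I_A)                     # [8, 15], i.e., I_{A_1} = I_1 I_3, I_{A_2} = I_2 I_4
```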
A generalization of unfolding in matrices In order to build our new decomposition, we need a generalized unfolding, adapted from [2]. This operator unfolds a tensor into a matrix whose row dimension may be any combination of the tensor dimensions. It is denoted $[\mathcal{A}]_{A_l}$, and it transforms $\mathcal{A} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$ into a matrix $[\mathcal{A}]_{A_l} \in \mathbb{C}^{I_{A_l} \times \prod_{l' \neq l} I_{A_{l'}}}$.
A new unfolding in tensors We denote by Reshape the operator which transforms a tensor $\mathcal{A} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$ into a tensor $\mathrm{Reshape}(\mathcal{A}) \in \mathbb{C}^{I_{A_1} \times \cdots \times I_{A_L}}$, and by $\mathrm{Reshape}^{-1}$ the inverse operator.
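Both operators reduce to transposes and reshapes of the underlying array. The following is a minimal NumPy sketch under our own conventions (the names `reshape_group`, `reshape_group_inv`, and `unfold_general` are ours, indices are 0-based, and the ordering of the merged indices is a convention that only needs to be used consistently); these helpers are reused in the later sketches:

```python
import numpy as np

def reshape_group(A, partition):
    """Reshape: merge each group of dimensions A_l into one dimension,
    turning a Pth-order tensor into an Lth-order tensor."""
    order = [p for group in partition for p in group]
    shape = [int(np.prod([A.shape[p] for p in group])) for group in partition]
    return A.transpose(order).reshape(shape)

def reshape_group_inv(B, partition, original_shape):
    """Reshape^{-1}: split the merged dimensions and restore the original order."""
    order = [p for group in partition for p in group]
    split = [original_shape[p] for p in order]
    return B.reshape(split).transpose(np.argsort(order))

def unfold_general(A, partition, l):
    """Generalized unfolding [A]_{A_l}: rows indexed by the dimensions in A_l,
    columns by all the remaining dimensions."""
    B = reshape_group(A, partition)
    return np.moveaxis(B, l, 0).reshape(B.shape[l], -1)
```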
A new tensor product The n-mode product multiplies a tensor by a matrix along one dimension. We propose to extend the n-mode product in order to multiply a tensor by a matrix along several dimensions combined in $A_l$. Let $\mathbf{U}^{(A_l)} \in \mathbb{C}^{I_{A_l} \times I_{A_l}}$ be a square matrix. This new product, called the multimode product, is defined as

$\mathcal{B} = \mathcal{A} \times_{A_l} \mathbf{U}^{(A_l)} \;\Longleftrightarrow\; [\mathcal{B}]_{A_l} = \mathbf{U}^{(A_l)} [\mathcal{A}]_{A_l}$. (11)
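As an illustration, the defining relation (11) may be implemented by merging the dimensions of $A_l$, applying the matrix, and splitting back; a minimal NumPy sketch with our own naming and 0-based conventions:

```python
import numpy as np

def multimode_product(A, U, partition, l):
    """Multimode product A x_{A_l} U, defined by [B]_{A_l} = U [A]_{A_l}."""
    group = list(partition[l])
    rest = [p for p in range(A.ndim) if p not in group]
    order = group + rest
    # Unfold: the dimensions of A_l are merged into the rows.
    mat = A.transpose(order).reshape(int(np.prod([A.shape[p] for p in group])), -1)
    mat = U @ mat                       # the defining relation (11)
    # Fold back and restore the original axis order.
    B = mat.reshape([A.shape[p] for p in order])
    return B.transpose(np.argsort(order))
```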
The following proposition shows the link between the multimode product and the n-mode product.

Proposition 3.1 (Link between n-mode product and multimode product).

Let $\mathcal{A} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$ be a $P$th-order tensor, $\{A_1, \ldots, A_L\}$ be a partition of $\mathcal{I}$, and $\mathbf{U}^{(A_l)} \in \mathbb{C}^{I_{A_l} \times I_{A_l}}$ be a square matrix. Then, the following equality is verified:

$\mathcal{A} \times_{A_l} \mathbf{U}^{(A_l)} = \mathrm{Reshape}^{-1}\!\left(\mathrm{Reshape}(\mathcal{A}) \times_l \mathbf{U}^{(A_l)}\right)$. (12)
Proof 3.1.

The proof of Proposition 3.1 relies on the following straightforward result: the generalized unfolding of a tensor coincides with the classical unfolding of its reshaped version. This leads to $[\mathcal{A}]_{A_l} = [\mathrm{Reshape}(\mathcal{A})]_{(l)}$ and $[\mathcal{B}]_{A_l} = [\mathrm{Reshape}(\mathcal{B})]_{(l)}$. Applying these two results to (11), we obtain

$[\mathrm{Reshape}(\mathcal{B})]_{(l)} = \mathbf{U}^{(A_l)} [\mathrm{Reshape}(\mathcal{A})]_{(l)}$. (13)

From Equation 2, Equation 13 is equivalent to

$\mathrm{Reshape}(\mathcal{B}) = \mathrm{Reshape}(\mathcal{A}) \times_l \mathbf{U}^{(A_l)}$.

Finally, one has

$\mathcal{B} = \mathcal{A} \times_{A_l} \mathbf{U}^{(A_l)} = \mathrm{Reshape}^{-1}\!\left(\mathrm{Reshape}(\mathcal{A}) \times_l \mathbf{U}^{(A_l)}\right)$.
Remark Thanks to the previous proposition and the commutative property of the n-mode product along distinct modes, multimode products along distinct subsets $A_l$ are also commutative.
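A quick numerical check of Proposition 3.1 and of this commutativity, reusing `reshape_group`, `reshape_group_inv`, and `multimode_product` from the sketches above (the sizes and the partition are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 4, 5))
partition = [(0, 2), (1, 3)]             # I_{A_1} = 8, I_{A_2} = 15
U1 = rng.standard_normal((8, 8))
U2 = rng.standard_normal((15, 15))

def nmode(T, U, n):                      # standard n-mode product
    return np.moveaxis(np.tensordot(U, T, axes=(1, n)), 0, n)

# Proposition 3.1: A x_{A_1} U1 = Reshape^{-1}(Reshape(A) x_1 U1)
lhs = multimode_product(A, U1, partition, 0)
rhs = reshape_group_inv(nmode(reshape_group(A, partition), U1, 0), partition, A.shape)
assert np.allclose(lhs, rhs)

# Commutativity: products along distinct groups can be applied in either order.
B1 = multimode_product(multimode_product(A, U1, partition, 0), U2, partition, 1)
B2 = multimode_product(multimode_product(A, U2, partition, 1), U1, partition, 0)
assert np.allclose(B1, B2)
```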
3.2 AU-HOSVD
With the new tools presented in the previous subsection, we are now able to introduce the AU-HOSVD. This is the purpose of the following theorem.
Theorem 3.1 (Alternative unfolding HOSVD).

Let $\mathcal{A} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$ and let $\{A_1, \ldots, A_L\}$ be a partition of $\mathcal{I}$. Then, $\mathcal{A}$ may be decomposed as follows:

$\mathcal{A} = \mathcal{K} \times_{A_1} \mathbf{U}^{(A_1)} \times_{A_2} \mathbf{U}^{(A_2)} \cdots \times_{A_L} \mathbf{U}^{(A_L)}$, (14)

where

- $\forall l \in [1, L]$, $\mathbf{U}^{(A_l)} \in \mathbb{C}^{I_{A_l} \times I_{A_l}}$ is unitary. The matrix $\mathbf{U}^{(A_l)}$ is given by the singular value decomposition of the $A_l$-dimension unfolding, $[\mathcal{A}]_{A_l} = \mathbf{U}^{(A_l)} \boldsymbol{\Sigma}^{(A_l)} \mathbf{V}^{(A_l)H}$.

- $\mathcal{K} \in \mathbb{C}^{I_1 \times \cdots \times I_P}$ is the core tensor. It has the same properties as the HOSVD core tensor.

Notice that there are several ways to decompose a tensor with the AU-HOSVD. Each choice of the partition $\{A_1, \ldots, A_L\}$ gives a different decomposition. For a $P$th-order tensor, the number of different AU-HOSVD is given by the Bell number, $B_P$:

$B_{P+1} = \sum_{k=0}^{P} \binom{P}{k} B_k, \qquad B_0 = 1.$

The AU-HOSVD associated to the partition $\{A_1, \ldots, A_L\}$ is denoted $\mathrm{AU\text{-}HOSVD}_{A_1 \cdots A_L}$.
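The Bell numbers are easily computed from the recurrence above; a short sketch (the helper name `bell` is ours):

```python
from math import comb

def bell(P):
    """Bell number B_P via B_{n+1} = sum_k C(n, k) B_k, with B_0 = 1."""
    B = [1]
    for n in range(P):
        B.append(sum(comb(n, k) * B[k] for k in range(n + 1)))
    return B[P]

print([bell(P) for P in range(1, 6)])    # [1, 2, 5, 15, 52]
```

For instance, a third-order tensor admits $B_3 = 5$ different AU-HOSVD, and a fourth-order tensor $B_4 = 15$.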
Proof 3.2.

First, let us consider $\{A_1, \ldots, A_L\}$, a partition of $\mathcal{I}$. $\mathrm{Reshape}(\mathcal{A}) \in \mathbb{C}^{I_{A_1} \times \cdots \times I_{A_L}}$ is an $L$th-order tensor and may be decomposed using the HOSVD:

$\mathrm{Reshape}(\mathcal{A}) = \mathcal{K}' \times_1 \mathbf{U}^{(1)} \times_2 \mathbf{U}^{(2)} \cdots \times_L \mathbf{U}^{(L)}$, (15)

where the matrix $\mathbf{U}^{(l)}$ is given by the singular value decomposition of the $l$-dimension unfolding, $[\mathrm{Reshape}(\mathcal{A})]_{(l)} = [\mathcal{A}]_{A_l}$; we therefore write $\mathbf{U}^{(A_l)} = \mathbf{U}^{(l)}$.

Since the matrices $\mathbf{U}^{(l)}$'s are unitary, Equation 15 is equivalent to

$\mathcal{K}' = \mathrm{Reshape}(\mathcal{A}) \times_1 \mathbf{U}^{(1)H} \times_2 \mathbf{U}^{(2)H} \cdots \times_L \mathbf{U}^{(L)H}$. (16)

Then, using Proposition 3.1, the following equality is true:

$\mathrm{Reshape}^{-1}\!\left(\mathrm{Reshape}(\mathcal{A}) \times_1 \mathbf{U}^{(1)H} \cdots \times_L \mathbf{U}^{(L)H}\right) = \mathcal{A} \times_{A_1} \mathbf{U}^{(A_1)H} \cdots \times_{A_L} \mathbf{U}^{(A_L)H}$, (17)

which leads to

$\mathcal{K} = \mathrm{Reshape}^{-1}(\mathcal{K}') = \mathcal{A} \times_{A_1} \mathbf{U}^{(A_1)H} \times_{A_2} \mathbf{U}^{(A_2)H} \cdots \times_{A_L} \mathbf{U}^{(A_L)H}$. (18)

Finally, the operator $\mathrm{Reshape}^{-1}$ is applied to Equation 15 and, by the unitarity of the $\mathbf{U}^{(A_l)}$'s,

$\mathcal{A} = \mathcal{K} \times_{A_1} \mathbf{U}^{(A_1)} \times_{A_2} \mathbf{U}^{(A_2)} \cdots \times_{A_L} \mathbf{U}^{(A_L)}$, (19)

which concludes the proof.
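The proof being constructive, it translates directly into code: reshape, compute an ordinary HOSVD, and reshape the core back. A minimal NumPy sketch reusing `reshape_group`, `reshape_group_inv`, and `multimode_product` from above (the name `au_hosvd` is ours):

```python
import numpy as np

def au_hosvd(A, partition):
    """AU-HOSVD of A for a given partition, following Eqs. (15)-(19)."""
    B = reshape_group(A, partition)               # Lth-order tensor
    K, Us = B, []
    for l in range(len(partition)):
        # U^(A_l): left singular vectors of the l-mode unfolding of Reshape(A)
        unf = np.moveaxis(B, l, 0).reshape(B.shape[l], -1)
        U = np.linalg.svd(unf, full_matrices=True)[0]
        Us.append(U)
        # Core accumulation as in Eq. (16): multiply by U^H along mode l.
        K = np.moveaxis(np.tensordot(U.conj().T, K, axes=(1, l)), 0, l)
    return reshape_group_inv(K, partition, A.shape), Us

# Sanity check of Eq. (14): the multimode products recover the tensor.
K, (U1, U2) = au_hosvd(A, partition)
A_rec = multimode_product(multimode_product(K, U1, partition, 0), U2, partition, 1)
assert np.allclose(A, A_rec)
```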
Example For a third-order tensor $\mathcal{A} \in \mathbb{C}^{I_1 \times I_2 \times I_3}$ with $A_1 = \{1, 2\}$ and $A_2 = \{3\}$, the AU-HOSVD will be written as follows:

$\mathcal{A} = \mathcal{K} \times_{A_1} \mathbf{U}^{(A_1)} \times_{A_2} \mathbf{U}^{(A_2)}$, (20)

with $\mathbf{U}^{(A_1)} \in \mathbb{C}^{I_1 I_2 \times I_1 I_2}$, $\mathbf{U}^{(A_2)} \in \mathbb{C}^{I_3 \times I_3}$, and $\mathcal{K} \in \mathbb{C}^{I_1 \times I_2 \times I_3}$.
Remark Let $\mathcal{R} \in \mathbb{C}^{I_1 \times \cdots \times I_P \times I_1 \times \cdots \times I_P}$ be a $2P$th-order Hermitian tensor. We consider $2L$ subsets of $\{I_1, \ldots, I_P, I_1, \ldots, I_P\}$ such that

- $\{A_1, \ldots, A_L\}$ and $\{A_{L+1}, \ldots, A_{2L}\}$ are two partitions of $\{I_1, \ldots, I_P\}$;

- $\forall l \in [1, L]$, $A_{l+L} = A_l$.

Under these conditions, the AU-HOSVD of $\mathcal{R}$ is written:

$\mathcal{R} = \mathcal{K} \times_{A_1} \mathbf{U}^{(A_1)} \cdots \times_{A_L} \mathbf{U}^{(A_L)} \times_{A_{L+1}} \mathbf{U}^{(A_1)*} \cdots \times_{A_{2L}} \mathbf{U}^{(A_L)*}$.
As discussed previously, the main motivation for introducing the new AU-HOSVD is to extract the correlated information when computing the low-rank decomposition. This is the purpose of the following proposition.
Proposition 3.2 (Low-rank approximation).

Let $\mathcal{X}$, $\mathcal{Y}$, $\mathcal{N}$ be three $P$th-order tensors such that

$\mathcal{X} = \mathcal{Y} + \mathcal{N}$, (21)

where $\mathcal{Y}$ is a low-rank tensor with ranks $r_{A_1}, \ldots, r_{A_L}$. Then, $\mathcal{Y}$ is approximated by

$\hat{\mathcal{Y}} = \mathcal{X} \times_{A_1} \left(\mathbf{U}^{(A_1)}_{r_{A_1}} \mathbf{U}^{(A_1)H}_{r_{A_1}}\right) \cdots \times_{A_L} \left(\mathbf{U}^{(A_L)}_{r_{A_L}} \mathbf{U}^{(A_L)H}_{r_{A_L}}\right)$, (22)

where $\mathbf{U}^{(A_1)}_{r_{A_1}}, \ldots, \mathbf{U}^{(A_L)}_{r_{A_L}}$ minimize the following criterion:

$\left\| \mathcal{X} - \mathcal{X} \times_{A_1} \left(\mathbf{U}^{(A_1)}_{r_{A_1}} \mathbf{U}^{(A_1)H}_{r_{A_1}}\right) \cdots \times_{A_L} \left(\mathbf{U}^{(A_L)}_{r_{A_L}} \mathbf{U}^{(A_L)H}_{r_{A_L}}\right) \right\|^2$. (23)

In this paper, the matrices $\mathbf{U}^{(A_l)}_{r_{A_l}}$'s are given by truncation of the matrices $\mathbf{U}^{(A_l)}$'s obtained by the $\mathrm{AU\text{-}HOSVD}_{A_1 \cdots A_L}$ of $\mathcal{X}$: $\mathbf{U}^{(A_l)}_{r_{A_l}} = \left[\mathbf{u}^{(A_l)}_1, \ldots, \mathbf{u}^{(A_l)}_{r_{A_l}}\right]$. This solution is not optimal in the least-squares sense but is easy to implement. However, thanks to the strong link with the HOSVD, it should provide an accurate approximation. That is why iterative algorithms for the AU-HOSVD will not be investigated in this paper.
Proof 3.3.

By applying Reshape to Equation 21, one obtains

$\mathrm{Reshape}(\mathcal{X}) = \mathrm{Reshape}(\mathcal{Y}) + \mathrm{Reshape}(\mathcal{N})$.

Then, $\mathrm{Reshape}(\mathcal{Y})$ is a low-rank $L$th-order tensor. Proposition 2.1 can now be applied, and this leads to

$\mathrm{Reshape}(\hat{\mathcal{Y}}) = \mathrm{Reshape}(\mathcal{X}) \times_1 \left(\mathbf{U}^{(1)}_{r_{A_1}} \mathbf{U}^{(1)H}_{r_{A_1}}\right) \cdots \times_L \left(\mathbf{U}^{(L)}_{r_{A_L}} \mathbf{U}^{(L)H}_{r_{A_L}}\right)$.

Finally, applying $\mathrm{Reshape}^{-1}$ to the previous equation and using Proposition 3.1 leads to the end of the proof:

$\hat{\mathcal{Y}} = \mathcal{X} \times_{A_1} \left(\mathbf{U}^{(A_1)}_{r_{A_1}} \mathbf{U}^{(A_1)H}_{r_{A_1}}\right) \cdots \times_{A_L} \left(\mathbf{U}^{(A_L)}_{r_{A_L}} \mathbf{U}^{(A_L)H}_{r_{A_L}}\right)$.
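The truncation strategy of Proposition 3.2 amounts to projecting each group of dimensions onto the span of the first $r_{A_l}$ left singular vectors. A minimal sketch reusing `au_hosvd` and `multimode_product` from above (the name `lr_approx` and the rank values are ours):

```python
def lr_approx(X, partition, ranks):
    """Truncation-based low-rank approximation of Eq. (22)."""
    Us = au_hosvd(X, partition)[1]
    Y_hat = X
    for l, (U, r) in enumerate(zip(Us, ranks)):
        proj = U[:, :r] @ U[:, :r].conj().T   # rank-r_{A_l} orthogonal projector
        Y_hat = multimode_product(Y_hat, proj, partition, l)
    return Y_hat

Y_hat = lr_approx(A, partition, ranks=[3, 6])  # e.g., r_{A_1} = 3, r_{A_2} = 6
```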
Discussion on choice of partition and complexity As mentioned previously, the total number of AU-HOSVD for a $P$th-order tensor is equal to $B_P$. Since this number can become large, it is important to have a procedure to find good partitions for the AU-HOSVD computation. We propose a two-step procedure. Since the AU-HOSVD has been developed for LR reduction, the most important criterion is to choose the partitions which emphasize deficient ranks. For some applications, it is possible to use a priori knowledge to select some partitions, as will be shown in Section 5 for polarimetric STAP. Next, another step is needed if several partitions induce an AU-HOSVD with a deficient rank. At this point, we propose to maximize a criterion (see Section 5.3 for examples) over the remaining partitions.
Concerning the complexity, the number of operations necessary to compute the HOSVD of a $P$th-order tensor is equal to $O\!\left(\left(\sum_{p=1}^{P} I_p\right) \prod_{p=1}^{P} I_p\right)$ [3]. Similarly, the complexity of the AU-HOSVD is equal to $O\!\left(\left(\sum_{l=1}^{L} I_{A_l}\right) \prod_{l=1}^{L} I_{A_l}\right)$, where $\prod_{l=1}^{L} I_{A_l} = \prod_{p=1}^{P} I_p$.