
Table 1 Notations and preliminaries

From: Three-dimensional SAR imaging with sparse linear array using tensor completion in embedded space

| Definitions | Notations and formulas | Notes |
| --- | --- | --- |
| Vector | \({\mathbf{b}}\) | |
| Matrix | \({\mathbf{B}}\) | |
| Tensor | \({\boldsymbol{\mathcal{A}}}\) | \({\boldsymbol{\mathcal{A}}}^{(i)}\) denotes the ith matrix in a sequence |
| Unfolding/matricization | The i-mode unfolding/matricization: \({\boldsymbol{\mathcal{A}}}_{(i)}\) | Unfolds a tensor into a matrix |
| Frobenius norm | \(\lVert {\boldsymbol{\mathcal{A}}} \rVert_{F} = \sqrt{\sum\nolimits_{m_{1}} \sum\nolimits_{m_{2}} \cdots \sum\nolimits_{m_{I}} \left| a_{m_{1} m_{2} \cdots m_{I}} \right|^{2}}\) | \({\boldsymbol{\mathcal{A}}} \in {\mathbb{C}}^{M_{1} \times M_{2} \times \cdots \times M_{I}}\); moduli are required since entries are complex |
| Tensor multiplication | The i-mode product of tensor \({\boldsymbol{\mathcal{A}}}\) and matrix \({\mathbf{B}}\): \(({\boldsymbol{\mathcal{A}}} \times_{i} {\mathbf{B}})_{m_{1} \cdots m_{i-1} k m_{i+1} \cdots m_{I}} = \sum\nolimits_{m_{i}=1}^{M_{i}} a_{m_{1} m_{2} \cdots m_{I}} b_{k m_{i}}\) | |
| Tucker decomposition | \({\boldsymbol{\mathcal{A}}} = {\boldsymbol{\mathcal{C}}} \times_{1} {\mathbf{F}}^{(1)} \times_{2} {\mathbf{F}}^{(2)} \cdots \times_{I} {\mathbf{F}}^{(I)}\) | \({\boldsymbol{\mathcal{C}}}\) is the core tensor; \({\mathbf{F}}^{(i)}\) is the ith factor matrix |
| Multi-linear tensor product | \({\boldsymbol{\mathcal{C}}} \times \{ {\mathbf{F}} \} = {\boldsymbol{\mathcal{C}}} \times_{1} {\mathbf{F}}^{(1)} \times_{2} {\mathbf{F}}^{(2)} \cdots \times_{I} {\mathbf{F}}^{(I)}\) | \(\{ {\mathbf{F}} \} = \{ {\mathbf{F}}^{(i)} \}_{i=1}^{I}\) is the set of factor matrices |
| | Multi-linear tensor product with the ith mode excluded: \({\boldsymbol{\mathcal{C}}} \times_{-i} \{ {\mathbf{F}} \} = {\boldsymbol{\mathcal{C}}} \times_{1} {\mathbf{F}}^{(1)} \cdots \times_{i-1} {\mathbf{F}}^{(i-1)} \times_{i+1} {\mathbf{F}}^{(i+1)} \cdots \times_{I} {\mathbf{F}}^{(I)}\) | |
| Rank-one tensor | \({\boldsymbol{\mathcal{A}}}\) is a rank-one tensor if it can be expressed as an outer product of vectors, i.e., \({\boldsymbol{\mathcal{A}}} = {\mathbf{x}}^{(1)} \circ {\mathbf{x}}^{(2)} \circ \cdots \circ {\mathbf{x}}^{(I)}\) | \(\circ\) denotes the outer product of vectors |
| Tensor rank | \({\text{rank}}({\boldsymbol{\mathcal{A}}})\) | The minimum number of rank-one tensors whose sum equals \({\boldsymbol{\mathcal{A}}}\) |
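The unfolding, Frobenius norm, and i-mode product defined above can be checked numerically. The sketch below (not from the paper; function names are illustrative, and the column ordering of the unfolding is a convention, C-order here) verifies the unfolding identity \((\mathcal{A} \times_i \mathbf{B})_{(i)} = \mathbf{B}\mathcal{A}_{(i)}\) against the elementwise definition:

```python
import numpy as np

def unfold(A, i):
    """Mode-i unfolding A_(i): mode i indexes the rows, the remaining modes the columns."""
    return np.moveaxis(A, i, 0).reshape(A.shape[i], -1)

def fold(M, i, shape):
    """Inverse of unfold, restoring a tensor of the given shape."""
    rest = [s for j, s in enumerate(shape) if j != i]
    return np.moveaxis(M.reshape([shape[i]] + rest), 0, i)

def mode_product(A, B, i):
    """i-mode product A x_i B, computed via the unfolding identity (A x_i B)_(i) = B @ A_(i)."""
    shape = list(A.shape)
    shape[i] = B.shape[0]
    return fold(B @ unfold(A, i), i, tuple(shape))

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4, 5)) + 1j * rng.standard_normal((3, 4, 5))
B = rng.standard_normal((6, 4))

# Frobenius norm of a complex tensor: square root of the sum of squared moduli.
assert np.isclose(np.sqrt(np.sum(np.abs(A) ** 2)), np.linalg.norm(A))

# The unfolding-based mode product matches the elementwise definition (einsum sums over m_i).
assert np.allclose(mode_product(A, B, 1), np.einsum('kb,abc->akc', B, A))
```

Note that mode-i multiplication replaces the size of mode i (here 4) with the row count of the matrix (here 6), so the result has shape (3, 6, 5).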
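The Tucker/multi-linear product and the rank-one definition admit a similarly short numerical sketch (again illustrative, not the paper's implementation; `mode_product` is realized with `np.tensordot` here). A Tucker reconstruction applies every factor matrix along its mode, \(\times_{-i}\) simply skips mode i, and a rank-one tensor is built as an outer product:

```python
import numpy as np

def mode_product(A, B, i):
    """i-mode product A x_i B: contract mode i of A with the columns of B."""
    return np.moveaxis(np.tensordot(B, A, axes=(1, i)), 0, i)

def multilinear_product(C, factors, skip=None):
    """C x {F} (or C x_{-skip} {F}): apply each factor matrix along its own mode."""
    A = C
    for i, F in enumerate(factors):
        if i != skip:
            A = mode_product(A, F, i)
    return A

rng = np.random.default_rng(1)
# Tucker reconstruction: A = C x_1 F^(1) x_2 F^(2) x_3 F^(3).
C = rng.standard_normal((2, 3, 2))
factors = [rng.standard_normal((5, 2)),
           rng.standard_normal((6, 3)),
           rng.standard_normal((4, 2))]
A = multilinear_product(C, factors)          # shape (5, 6, 4)

# Rank-one tensor for I = 3: outer product x1 ∘ x2 ∘ x3.
x1, x2, x3 = rng.standard_normal(3), rng.standard_normal(4), rng.standard_normal(5)
R = np.einsum('a,b,c->abc', x1, x2, x3)
# Every unfolding of a rank-one tensor has matrix rank one.
assert np.linalg.matrix_rank(R.reshape(3, -1)) == 1
```

The core tensor here has multilinear rank at most (2, 3, 2), so the reconstructed \(5 \times 6 \times 4\) tensor is low-rank; this is the structure that tensor-completion methods exploit.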