**2. Higher-order singular value decomposition and truncation**

## **2.1 Compact HOSVD**

The higher-order singular value decomposition (HOSVD) of a tensor is a specific orthogonal Tucker decomposition. The classical computation of a HOSVD was introduced by L. R. Tucker [5] and further developed by L. De Lathauwer et al. [6]. Robust computations and improvements have since been proposed [6–8]. For a tensor $\mathcal{T}$ of order $d$, the idea is to compute the singular value decomposition of each factor-$k$ flattening $T_{(k)}$ of the tensor, i.e. a "matricisation" of the tensor where the rows of the matrix are related to the $k$-th dimension.
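As an illustration, a mode-$k$ flattening can be sketched in a few lines of NumPy (a minimal sketch; the function name `flatten` and the example shapes are our own, not from the text):

```python
import numpy as np

def flatten(T, k):
    """Mode-k flattening (matricisation): dimension k indexes the rows,
    the remaining dimensions are stacked along the columns."""
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

T = np.random.rand(4, 3, 5)            # an order-3 tensor
print(flatten(T, 1).shape)             # rows follow the second dimension
```

For `T` of shape `(4, 3, 5)`, the mode-1 flattening has shape `(3, 20)`: the second dimension moves to the rows and the other two are merged into the columns.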

In our case, we consider a third-order tensor and successive spatial, parameter and temporal flattenings to find the singular vectors. The spatial flattening $T_{(X)} \in \mathbb{R}^{I \times (J \cdot N)}$ of the tensor $\mathcal{T}_x - A^0 \otimes e^N$ is

$$\left(T_{(X)}\right)_{ip} = x\left(X_i, \mu_j, t^n\right) - X_i, \qquad p = j + (n - 1)J \tag{7}$$

(meaning that the $\mu$ and $t$ dimensions are stacked in columns in the matrix). The SVD of $T_{(X)}$ provides $r_X$ nonzero singular values with $r_X$ corresponding orthonormal singular vectors $\mathbf{\Phi}_k$, $k = 1, \ldots, r_X$. Similarly, the parameter flattening leads to a rank $r_\mu$ with $r_\mu$ modes $\mathbf{\Psi}_\ell$, $\ell = 1, \ldots, r_\mu$, and the time flattening to a rank $r_t$ with $r_t$ modes $w_m$, $m = 1, \ldots, r_t$. The tuple $(r_X, r_\mu, r_t)$ is the multilinear rank of $\mathcal{T}_x$. The tensor $\mathcal{T}_x$ can then be written as

$$\mathcal{T}_{x} = A^{0} \otimes e^{N} + \sum_{k=1}^{r_{X}} \sum_{\ell=1}^{r_{\mu}} \sum_{m=1}^{r_{t}} a_{k\ell m}\, \mathbf{\Phi}_{k} \otimes \mathbf{\Psi}_{\ell} \otimes w_{m}. \tag{8}$$
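The compact HOSVD described above (one SVD per flattening, then projection of the tensor onto the factor matrices to obtain the coefficients $a_{k\ell m}$) can be sketched as follows. This is a minimal NumPy sketch under our own naming; the tolerance used to detect the numerical rank is an assumption:

```python
import numpy as np

def flatten(T, k):
    # mode-k matricisation: dimension k along the rows
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def mode_mult(T, A, k):
    # mode-k product: contract the columns of A with dimension k of T
    return np.moveaxis(np.tensordot(A, np.moveaxis(T, k, 0), axes=1), 0, k)

def hosvd(T, tol=1e-12):
    """Compact HOSVD: one SVD per flattening gives the orthonormal factor
    matrices; the core tensor holds the coefficients a_klm of (8)."""
    factors = []
    for k in range(T.ndim):
        u, s, _ = np.linalg.svd(flatten(T, k), full_matrices=False)
        r = int(np.sum(s > tol * s[0]))    # numerical rank of this mode
        factors.append(u[:, :r])
    core = T
    for k, u in enumerate(factors):
        core = mode_mult(core, u.T, k)     # project onto the factor bases
    return core, factors

# at full multilinear rank the decomposition reproduces the tensor exactly
T = np.random.rand(6, 4, 5)
core, factors = hosvd(T)
R = core
for k, u in enumerate(factors):
    R = mode_mult(R, u, k)
print(np.allclose(R, T))
```

A generic random tensor has full multilinear rank, so the reconstruction is exact up to round-off; truncating the per-mode ranks gives the approximation discussed next.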

## **2.2 Approximation**

Among its applications, the HOSVD can be used to define a low-order approximation of tensors. The so-called truncated HOSVD [9–11] consists in truncating the expansion (8) at a given multilinear rank $(K, L, M)$, with $K \le r_X$, $L \le r_\mu$, $M \le r_t$, leading to (6). Let $\mathrm{mlrank}(\mathcal{T})$ denote the multilinear rank of the tensor $\mathcal{T}$. It has been shown that the approximation (6) is a quasi-optimal solution of the nonlinear non-convex least-squares problem

$$\min_{\tilde{\mathcal{T}}} \quad \frac{1}{2} \| \mathcal{T}_x - \tilde{\mathcal{T}} \|_F^2 \tag{9}$$

(here $\|\cdot\|_F$ is the Frobenius norm for tensors) subject to $\mathrm{mlrank}(\tilde{\mathcal{T}}) = (K, L, M)$. The truncation ranks can be determined *a priori* according to the classical relative information content (RIC) of SVD theory.
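The RIC-based rank selection can be sketched as follows: for each mode, keep the smallest rank whose cumulative squared singular values reach a prescribed fraction of the total energy. The function name and the example spectrum are hypothetical:

```python
import numpy as np

def ric_rank(sigma, tol=0.999):
    """Smallest rank K such that RIC(K) = sum_{i<=K} sigma_i^2 /
    sum_i sigma_i^2 reaches the prescribed tolerance."""
    ric = np.cumsum(sigma**2) / np.sum(sigma**2)
    return int(np.searchsorted(ric, tol) + 1)

# hypothetical decaying spectrum: two dominant modes
sigma = np.array([10.0, 1.0, 0.1, 0.01])
K = ric_rank(sigma, tol=0.999)
print(K)  # → 2
```

Applied to the singular values of each flattening, this yields the truncation ranks $(K, L, M)$.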

Eq. (6) can already be used as a compressed representation of the data, allowing for lower storage complexity and simpler manipulation, with low information loss if the RIC is high.
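The storage gain is easy to quantify: the truncated form keeps a $K \times L \times M$ core plus three factor matrices instead of the $I \cdot J \cdot N$ entries of the full tensor. The dimensions and ranks below are hypothetical, chosen only to illustrate the count:

```python
# Storage of the truncated representation (6) vs the full tensor.
# All sizes below are illustrative assumptions, not values from the text.
I, J, N = 1000, 50, 200        # space, parameter, time sizes
K, L, M = 20, 10, 15           # truncation ranks
full = I * J * N
compressed = K * L * M + I * K + J * L + N * M
ratio = full / compressed
print(f"{full} vs {compressed} entries: {ratio:.0f}x smaller")
```

With these (assumed) sizes the compressed form is two orders of magnitude smaller, which is what makes Eq. (6) attractive as a storage format.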
