In minisymposium: Plenary Talks, Fri 08:30–09:15, Auditorium Max Weber, Chair: Eugene Tyrtyshnikov
We give an introduction to tensor decompositions and applications, highlighting some recent trends. We discuss basics such as higher-order variants of "rank" and fundamental decompositions such as the canonical polyadic decomposition (CPD) and the Tucker decomposition. The uniqueness of the CPD, under mild conditions that have no matrix counterpart, makes it a powerful tool for signal separation and data analysis. Block term decompositions allow us to retrieve components that are more general, and possibly more realistic, than rank-1 terms. The multilinear singular value decomposition and low multilinear rank approximation are key in multilinear extensions of subspace techniques. Coupled decompositions express complicated tasks as combinations of subproblems that can each be handled. We explain why numerical multilinear algebra is very different from numerical linear algebra. Tensor trains and hierarchical Tucker decompositions allow one to break the curse of dimensionality in a numerically reliable manner and show promise for big data analysis in combination with compressed sensing.
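As an illustration of the CPD mentioned above, the following is a minimal alternating least squares (ALS) sketch in NumPy for a third-order tensor. It is not the method of the talk or of Tensorlab; the function names `cpd_als` and `khatri_rao`, the random initialization, and the fixed iteration count are all illustrative assumptions.

```python
import numpy as np

def khatri_rao(X, Y):
    # Column-wise Kronecker product of X (I x R) and Y (J x R),
    # giving an (I*J) x R matrix.
    R = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, R)

def cpd_als(T, R, n_iter=500, seed=0):
    """Illustrative ALS for a rank-R CPD of a third-order tensor T,
    i.e. T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # Mode-n unfoldings: rows indexed by mode n, columns by the
    # remaining modes in their original order.
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        # Each factor update is a linear least-squares problem whose
        # coefficient matrix is the Khatri-Rao product of the other two.
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

On an exactly rank-R tensor this simple scheme typically recovers the factors up to the usual scaling and permutation indeterminacies; robust implementations add normalization, convergence checks, and better initializations.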
- A. Cichocki, D. Mandic, A.-H. Phan, C. Caiafa, G. Zhou, Q. Zhao and L. De Lathauwer, Tensor decompositions for signal processing applications. From two-way to multiway component analysis, IEEE Signal Processing Magazine 32(2) (2015), pp. 145–163.
- N. Vervliet, O. Debals, L. Sorber, M. Van Barel and L. De Lathauwer, Tensorlab v3.0, available online, March 2016. URL: http://www.tensorlab.net/.