Multilinear rank approximation problems are extremal eigenvalue problems
Bart De Moor


Multi-way arrays of data, or numerical tensors, are used in many application fields, such as statistics, biomedical and telecommunications signal processing, system identification, data mining and data assimilation, bioinformatics, and scientific computing (for a recent NSF workshop on the topic, see e.g. http://www.cs.cornell.edu/cv/TenWork/Home.htm). It is well known that a 2D data matrix can be approximated, in a least squares sense, by a rank-reduced one via the singular value decomposition (the Eckart-Young-Mirsky theorem). The result is a sum of rank-one matrices, the terms of which are mutually orthogonal.
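As a concrete illustration of the matrix case, here is a minimal Python/NumPy sketch (not part of the talk itself) of the truncated-SVD construction behind the Eckart-Young-Mirsky theorem; the function name `best_rank_r` is ours.

```python
import numpy as np

def best_rank_r(A, r):
    """Best rank-r approximation of A in the least squares (Frobenius)
    sense, via the truncated SVD (Eckart-Young-Mirsky theorem)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Keep the r dominant singular triplets: the result is a sum of r
    # mutually orthogonal rank-one matrices s[i] * u_i * v_i^T.
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

A = np.random.randn(8, 6)
print(np.linalg.matrix_rank(best_rank_r(A, 2)))  # 2
```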

However, for tensors the problem is less obvious, as there exist several different notions of tensor rank. In this talk, we discuss a multilinear generalization of the singular value decomposition, and also briefly touch upon other decompositions. We then discuss the approximation of a higher-order tensor, in a least squares sense, by a tensor of lower multilinear rank (a notion we will also elaborate on). Fitting a tensor, in a least squares sense, by some ‘structured, simpler’ tensor is in general a nonlinear, non-convex optimization problem.
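To make the notion of multilinear rank concrete: for a third-order tensor it is the triple of ranks of the three mode-n unfoldings. The sketch below (our own Python/NumPy illustration, not the decomposition presented in the talk) computes a truncated higher-order SVD, which yields a good, but in general not optimal, lower multilinear rank approximation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: the chosen mode indexes the rows."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def truncated_hosvd(T, ranks):
    """Approximate a third-order tensor T by a tensor of multilinear
    rank (r1, r2, r3), projecting each mode onto the leading left
    singular vectors of its unfolding (truncated higher-order SVD)."""
    U1, U2, U3 = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
                  for n, r in enumerate(ranks)]
    # Core tensor: contract T with each factor along its mode.
    S = np.einsum('ijk,ia,jb,kc->abc', T, U1, U2, U3)
    # Reassemble the lower multilinear rank approximation.
    return np.einsum('abc,ia,jb,kc->ijk', S, U1, U2, U3)

T = np.random.randn(5, 6, 7)
print(np.linalg.norm(T - truncated_hosvd(T, (2, 2, 2))))
```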

However, the main contribution of this talk is the demonstration that multilinear least squares rank approximations of tensors are in essence extremal eigenvalue problems. We will present an outline of this surprising result, using only linear algebra and system realization theory. It sheds new light on existing heuristic algorithms (such as higher-order orthogonal iterations or Newton-like optimization methods) and may open new research paths toward efficient algorithms for multilinear rank tensor approximations.
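For reference, here is a minimal sketch of the higher-order orthogonal iteration heuristic mentioned above, again in Python/NumPy as our own illustration; it alternately refines one mode's subspace while the other two are held fixed, and carries no global optimality guarantee, in contrast with the eigenvalue-based characterization of the talk.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: the chosen mode indexes the rows."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, iters=25):
    """Higher-order orthogonal iteration for the multilinear
    rank-(r1, r2, r3) approximation of a third-order tensor T:
    a heuristic alternating scheme."""
    r1, r2, r3 = ranks
    # Initialize each factor with the leading left singular vectors
    # of the corresponding unfolding (truncated HOSVD initialization).
    U1, U2, U3 = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
                  for n, r in enumerate(ranks)]
    for _ in range(iters):
        # Update one mode at a time, keeping the other two fixed.
        W = np.einsum('ijk,jb,kc->ibc', T, U2, U3)
        U1 = np.linalg.svd(unfold(W, 0), full_matrices=False)[0][:, :r1]
        W = np.einsum('ijk,ia,kc->ajc', T, U1, U3)
        U2 = np.linalg.svd(unfold(W, 1), full_matrices=False)[0][:, :r2]
        W = np.einsum('ijk,ia,jb->abk', T, U1, U2)
        U3 = np.linalg.svd(unfold(W, 2), full_matrices=False)[0][:, :r3]
    # Core tensor and the multilinear rank-(r1, r2, r3) approximant.
    S = np.einsum('ijk,ia,jb,kc->abc', T, U1, U2, U3)
    return np.einsum('abc,ia,jb,kc->ijk', S, U1, U2, U3)

T = np.random.randn(5, 6, 7)
print(np.linalg.norm(T - hooi(T, (2, 2, 2))))
```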