Tensor completion in hierarchical tensor representations

H. Rauhut, R. Schneider, Ž. Stojanac
Compressed Sensing and its Applications: MATHEON Workshop 2013, Springer, 2015
Abstract
Compressed sensing extends from the recovery of sparse vectors from undersampled measurements via efficient algorithms to the recovery of matrices of low rank from incomplete information. Here we consider a further extension to the reconstruction of tensors of low multi-linear rank in recently introduced hierarchical tensor formats from a small number of measurements. Hierarchical tensors are a flexible generalization of the well-known Tucker representation and have the advantage that the number of degrees of freedom of a low rank tensor does not scale exponentially with the order of the tensor. While corresponding tensor decompositions can be computed efficiently via successive applications of (matrix) singular value decompositions, some important properties of the singular value decomposition do not extend from the matrix to the tensor case. This results in major computational and theoretical difficulties in designing and analyzing algorithms for low rank tensor recovery. For instance, the tensor nuclear norm, a canonical analogue of the matrix nuclear norm, is NP-hard to compute in general, in stark contrast to the matrix case. In this book chapter we consider versions of iterative hard thresholding schemes adapted to hierarchical tensor formats. A variant builds on methods from Riemannian optimization and uses a retraction mapping from the tangent space of the manifold of low rank tensors back to this manifold. We provide first partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map. Moreover, an estimate of the number of measurements is provided that ensures the TRIP at a given tensor rank with high probability for Gaussian measurement maps.
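
To make the iterative hard thresholding scheme described in the abstract concrete, the following is a minimal numerical sketch, not the authors' implementation: it recovers a small third-order tensor of low multilinear rank from Gaussian measurements by alternating a gradient step with a rank truncation. For simplicity the truncation uses the Tucker format via a truncated higher-order SVD (successive matrix SVDs of the mode unfoldings) rather than a hierarchical tensor format, and the function names (mode_multiply, hosvd_truncate, tensor_iht) as well as all problem sizes are illustrative assumptions.

import numpy as np

def mode_multiply(X, M, mode):
    # Mode-n product: multiply tensor X by matrix M along axis `mode`.
    Xm = np.moveaxis(X, mode, 0)
    front, rest = Xm.shape[0], Xm.shape[1:]
    Y = (M @ Xm.reshape(front, -1)).reshape((M.shape[0],) + rest)
    return np.moveaxis(Y, 0, mode)

def hosvd_truncate(X, ranks):
    # Quasi-projection onto multilinear rank <= ranks via a truncated
    # higher-order SVD: one matrix SVD per mode unfolding.
    Y = X
    for mode, r in enumerate(ranks):
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U = np.linalg.svd(unfolding, full_matrices=False)[0][:, :r]
        Y = mode_multiply(Y, U @ U.T, mode)
    return Y

def tensor_iht(A, y, shape, ranks, n_iter=300, step=1.0):
    # Iterative hard thresholding: gradient step on ||A vec(X) - y||^2,
    # then truncate back to low multilinear rank.  Convergence hinges on
    # A satisfying a tensor restricted isometry property (TRIP).
    X = np.zeros(shape)
    for _ in range(n_iter):
        residual = y - A @ X.ravel()
        X = hosvd_truncate(X + step * (A.T @ residual).reshape(shape), ranks)
    return X

# Illustrative problem sizes (assumed, not taken from the chapter).
rng = np.random.default_rng(0)
shape, ranks = (8, 8, 8), (2, 2, 2)
X_true = rng.standard_normal(ranks)               # random Tucker core
for mode, (n, r) in enumerate(zip(shape, ranks)):
    X_true = mode_multiply(X_true, rng.standard_normal((n, r)), mode)
m = 250                                           # Gaussian measurements
A = rng.standard_normal((m, np.prod(shape))) / np.sqrt(m)
y = A @ X_true.ravel()
X_rec = tensor_iht(A, y, shape, ranks)
print(np.linalg.norm(X_rec - X_true) / np.linalg.norm(X_true))

With roughly 250 measurements against 512 ambient entries, and only about 56 parameters in a rank-(2, 2, 2) Tucker representation (an 8-entry core plus three 8-by-2 factors), a Gaussian map of this size is expected to satisfy the TRIP with high probability, in the spirit of the measurement bound discussed in the chapter; the step size and iteration count here are ad hoc choices.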