Wednesday, July 02, 2014

Stable, Robust and Super Fast Reconstruction of Tensors Using Multi-Way Projections - implementation -

Stable, Robust and Super Fast Reconstruction of Tensors Using Multi-Way Projections by Cesar F. Caiafa, Andrzej Cichocki (http://arxiv.org/abs/1406.3295)

We introduce an analytical reconstruction formula that allows one to recover an $N$th-order data tensor from a reduced set of multi-way compressive measurements by exploiting its low multilinear-rank structure. Moreover, it is shown that an interesting property of multi-way measurements allows us to build the reconstruction from compressive linear measurements of fibers taken in only two selected modes, independently of the tensor order $N$. In addition, it is proved that, in the matrix and 3rd-order tensor cases, the proposed reconstruction $\hat{\underline{\mathbf{X}}}_\tau$ is stable, in the sense that the approximation error is comparable to the one provided by the best low-multilinear-rank approximation, i.e. $\|\underline{\mathbf{X}} - \hat{\underline{\mathbf{X}}}_\tau\| \le K \|\underline{\mathbf{X}} - \underline{\mathbf{X}}_0\|$, where $\tau$ is a threshold parameter that controls the approximation error, $K$ is a constant, and $\underline{\mathbf{X}}_0$ is the best low-multilinear-rank approximation of $\underline{\mathbf{X}}$. Through the analysis of the upper bound of the approximation error, we find that in the 2D case an optimal value of the threshold parameter $\tau = \tau_0 \neq 0$ exists, whereas in the 3D case the optimal value is $\tau = 0$, which means that this parameter does not need to be tuned in order to obtain good reconstructions. We also present extensive simulation results indicating the stability and robustness of the method when applied to real-world 2D and 3D signals. A comparison with state-of-the-art Compressed Sensing (CS) methods specialized for multidimensional signals is also included. A very attractive characteristic of the proposed method is that it provides a direct computation, i.e. it is non-iterative, unlike existing CS algorithms, thus providing super fast computations even for large datasets.
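To make the matrix (2D) case concrete, here is a minimal NumPy sketch — my own illustration under assumed sizes and Gaussian sensing matrices, not the authors' Matlab code linked below — of a closed-form reconstruction from measurements taken in only two modes, with the threshold set to $\tau = 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, R = 200, 150, 5      # signal size and (low) rank; illustrative values
m1, m2 = 20, 20            # measurements per mode; requires m1, m2 >= R

# rank-R ground truth
X = rng.standard_normal((I, R)) @ rng.standard_normal((R, J))

# Gaussian sensing matrices: Y1 compresses mode-1 fibers (columns),
# Y2 compresses mode-2 fibers (rows), W is the small core measurement
Phi1 = rng.standard_normal((m1, I))
Phi2 = rng.standard_normal((m2, J))
Y1 = Phi1 @ X              # m1 x J
Y2 = X @ Phi2.T            # I  x m2
W = Phi1 @ X @ Phi2.T      # m1 x m2

# direct, non-iterative reconstruction: X_hat = Y2 pinv(W) Y1.
# For exactly rank-R X and generic Phi's this recovers X exactly;
# a nonzero threshold tau would truncate small singular values of W
# before pseudo-inversion to stabilize the noisy / approximate case.
X_hat = Y2 @ np.linalg.pinv(W) @ Y1
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))   # ~ machine precision
```

The whole computation is three matrix products and one small pseudo-inverse, which is what makes this kind of direct formula so much faster than iterative CS solvers.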
I recently asked the authors whether they would make their implementation available. Cesar Caiafa kindly responded with:

Dear Igor,
I just want to let you know that our preprint is now updated in ArXiv (http://arxiv.org/abs/1406.3295).
Now it includes a link to our Matlab code available at
http://web.fi.uba.ar/~ccaiafa/Cesar/Low-Rank-Tensor-CS.html
Thank you very much.
Best Regards


Cesar
Thank you, Cesar!



We review and introduce new representations of tensor train decompositions for large-scale vectors, matrices, or low-order tensors. We provide extended definitions of multilinear operations such as the Kronecker, Hadamard, and contracted products, together with their properties for tensor calculus. We then introduce an effective low-rank tensor approximation technique, the tensor train (TT) format, with a number of mathematical and graphical representations, and briefly review its mathematical properties as a low-rank approximation technique. With the aim of breaking the curse of dimensionality in large-scale numerical analysis, we describe basic operations on large-scale vectors and matrices in TT format. The suggested representations can be used to describe numerical methods based on the TT format for solving large-scale optimization problems such as systems of linear equations and eigenvalue problems.
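As a rough illustration of the TT format discussed above, here is a short NumPy sketch — a hedged example of my own, with function names and the truncation tolerance chosen for illustration, not code from the paper — of the standard TT-SVD procedure that splits a dense N-way array into a chain of third-order cores, plus the contraction that rebuilds the full array:

```python
import numpy as np

def tt_svd(x, tol=1e-10):
    """Split a dense N-way array into TT cores via sequential truncated SVDs."""
    shape = x.shape
    cores, r_prev = [], 1
    c = x.reshape(shape[0], -1)                    # unfold along mode 1
    for k in range(len(shape) - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))    # truncated TT rank
        cores.append(u[:, :r].reshape(r_prev, shape[k], r))
        c = (s[:r, None] * vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, shape[-1], 1))  # last core
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into a dense array."""
    full = cores[0]                                # shape (1, I1, r1)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)    # sum over the shared TT rank
    return full.reshape(full.shape[1:-1])

# quick check on a tensor with small TT ranks: x[i,j,k] = i + j + k
x = np.add.outer(np.add.outer(np.arange(6.0), np.arange(7.0)), np.arange(8.0))
cores = tt_svd(x)
print([c.shape for c in cores])                    # e.g. (1,6,2), (2,7,2), (2,8,1)
print(np.linalg.norm(tt_to_full(cores) - x))       # ~ machine precision
```

Storing the cores costs on the order of N·I·R² numbers instead of I^N, which is the sense in which the TT format breaks the curse of dimensionality.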


Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization, and calibration issues on LinkedIn.
