Sparse and Low-Rank High-Order Tensor Regression via Parallel Proximal Method

Recently, tensor data (i.e., multidimensional arrays) have been generated in many modern applications, such as functional magnetic resonance imaging (fMRI) in neuroscience and videos in video analysis. Many efforts have been made in recent years to predict the relationship between tensor features and univariate responses. However, previously proposed methods either lose structural information within the tensor data or incur prohibitive time costs, especially for large-scale data with high-order structures. To address these problems, we propose the Sparse and Low-rank Tensor Regression (SLTR) model. Our model enforces sparsity and low-rankness of the tensor coefficient by directly applying the $\ell_1$ norm and the tensor nuclear norm, so that the structural information of the tensor is preserved. To make the solution procedure scalable and efficient, SLTR uses the proximal gradient method, which can easily be implemented in parallel. We evaluate SLTR on several simulated datasets and one video action recognition dataset. Experimental results show that, compared with previous models, SLTR obtains a better solution at a much lower time cost. Moreover, our model's predictions exhibit meaningful interpretations on the video dataset.
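The abstract does not spell out the update rule, but the ingredients it names (an $\ell_1$ penalty, a tensor nuclear norm, and a parallelizable proximal gradient step) can be illustrated with a minimal sketch. The sketch below assumes the tensor nuclear norm is taken as the average of nuclear norms of the mode-wise unfoldings, and that the proximal operators of the separate penalties are combined by a simple proximal average; the function names (`sltr_step`, `svt`, etc.) and these modeling choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding: prox of tau * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unfold(tensor, mode):
    """Mode-k unfolding of a tensor into a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(mat, mode, shape):
    """Inverse of unfold for the same permutation convention."""
    full_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(mat.reshape(full_shape), 0, mode)

def svt(mat, tau):
    """Singular value thresholding: prox of tau * (matrix nuclear norm)."""
    U, s, Vt = np.linalg.svd(mat, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def sltr_step(W, X, y, step, lam1, lam_star):
    """One proximal-average gradient iteration (illustrative assumption).

    X: (n, *tensor_shape) tensor features, y: (n,) responses,
    W: current tensor coefficient of shape tensor_shape.
    """
    n = y.shape[0]
    # Gradient of the squared loss 0.5/n * ||y - <X, W>||^2 w.r.t. W.
    preds = np.tensordot(X, W, axes=W.ndim)            # one inner product per sample
    grad = -np.tensordot(y - preds, X, axes=1) / n
    Z = W - step * grad
    # Prox of the l1 (sparsity) term.
    candidates = [soft_threshold(Z, step * lam1)]
    # Prox of the nuclear norm on each mode unfolding; these SVDs are
    # independent of each other and could be computed in parallel.
    for mode in range(W.ndim):
        candidates.append(fold(svt(unfold(Z, mode), step * lam_star), mode, W.shape))
    # Proximal average of the individual prox outputs.
    return sum(candidates) / len(candidates)
```

The per-mode singular value thresholdings touch disjoint computations on the same intermediate tensor, which is what makes a parallel implementation of the proximal step natural under these assumptions.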
