Statistical inference from noisy data is a central problem in computer science, statistics, and machine learning. For a fixed statistical model, one may be tasked with the problems of distinguishability and recovery. Given an observation, the distinguishability problem asks whether we can determine if the observation is pure noise or contains a planted signal. Meanwhile, the recovery problem asks us to estimate the underlying planted signal. In some statistical learning settings, there is currently a statistical-computational gap: it is not known whether there exist computationally efficient distinguishing and/or recovery algorithms that achieve what is information-theoretically possible (via exponential-time algorithms).
In this thesis, we define and explore a statistical model called sparse tensor PCA. Our model generalizes the widely studied statistical models of sparse PCA (Wigner version) and tensor PCA, and is closely related to the planted k-densest sub-hypergraph model. We design and analyze distinguishing and recovery algorithms that interpolate within the statistical-computational gap based on a parameter t. When t is small, our algorithms run in polynomial time and match the guarantees of known polynomial-time algorithms. When t is large, our algorithms run in exponential time and match known information-theoretic lower bounds. We also prove an information-theoretic lower bound for approximate recovery and a low-degree polynomial lower bound for distinguishing. As a side contribution of our analysis, we generalize known bounds on tensor norms when restricted to interactions with sparse vectors, and also prove bounds on the covering number of the s-sparse unit sphere.
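To make the setting concrete, the following sketch samples one observation under a standard formulation of sparse tensor PCA, in which a noisy tensor either is pure Gaussian noise or hides a rank-one spike built from an s-sparse unit vector. The specific parameterization (signal strength `lam`, tensor order `order`, and the equal-magnitude support entries) is an illustrative assumption, not necessarily the exact scaling used in the thesis.

```python
import numpy as np

def sparse_tensor_pca_sample(n=20, s=3, lam=5.0, order=3, rng=None):
    """Draw Y = lam * v^(tensor power `order`) + W, where v is an s-sparse
    unit vector and W has i.i.d. standard Gaussian entries.
    Illustrative parameterization; the thesis's exact scaling may differ."""
    rng = np.random.default_rng(rng)
    # Plant an s-sparse unit vector: s random coordinates with equal
    # magnitude 1/sqrt(s) and random signs, all other coordinates zero.
    v = np.zeros(n)
    support = rng.choice(n, size=s, replace=False)
    v[support] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    # Build the rank-one order-`order` tensor v ⊗ v ⊗ ... ⊗ v
    # via repeated outer products.
    signal = v
    for _ in range(order - 1):
        signal = np.multiply.outer(signal, v)
    # Add i.i.d. Gaussian noise to every entry.
    noise = rng.standard_normal(signal.shape)
    return lam * signal + noise, v
```

Under this model, the distinguishing problem asks whether an observed tensor came from `lam = 0` (pure noise) or `lam > 0`, while recovery asks for an estimate of the planted vector `v` from `Y` alone.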