Time and Location: September 13 (Tuesday) 11:30am-12:30pm in CH 212
Speaker: Arnab Auddy (Columbia University)
Title: Why and how to use orthogonally decomposable tensors for statistical learning
Abstract: As we encounter more and more complex data generating mechanisms, it becomes necessary to model higher-order interactions among the observed variables. Orthogonally decomposable tensors provide a unified framework for such modeling in a number of interesting statistical problems. While they are a natural extension of the matrix SVD to tensors, they also enjoy much better identifiability properties. Moreover, a small perturbation affects each singular vector in isolation, so recovering a singular vector does not depend on the gap between consecutive singular values. In addition to these attractive statistical properties, the tensor decomposition problem in this setting presents intriguing computational challenges. To understand these better, we will explore some statistical-computational tradeoffs, and also describe tractable methods that provide rate-optimal estimators for the tensor singular vectors.
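
For readers unfamiliar with the object in the title, the following is a minimal illustrative sketch (not taken from the talk): it builds an orthogonally decomposable third-order tensor T = sum_i lambda_i u_i (x) u_i (x) u_i from orthonormal vectors u_i, and recovers one component with a generic tensor power iteration. All names and parameter choices here are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 10, 3
lams = np.array([3.0, 2.0, 1.0])                  # tensor "singular values" (assumed for the example)
U, _ = np.linalg.qr(rng.standard_normal((d, r)))  # orthonormal components u_1, ..., u_r

# T[a, b, c] = sum_i lams[i] * U[a, i] * U[b, i] * U[c, i]
T = np.einsum('i,ai,bi,ci->abc', lams, U, U, U)

def power_iteration(T, n_iter=100, seed=1):
    """One run of the tensor power map: v <- T(I, v, v) / ||T(I, v, v)||."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        w = np.einsum('abc,b,c->a', T, v, v)      # contract T along two modes with v
        v = w / np.linalg.norm(w)
    lam = np.einsum('abc,a,b,c->', T, v, v, v)    # value of T at (v, v, v)
    return lam, v

lam_hat, v_hat = power_iteration(T)
print(lam_hat)                                    # close to one of the lams
print(np.abs(U.T @ v_hat).max())                  # close to 1: v_hat aligns with some u_i
```

For an orthogonally decomposable tensor, the fixed points of this power map are (up to sign) exactly the components u_i, which is one way to see the identifiability advantage over the matrix case mentioned in the abstract.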