CS Events

Faculty Candidate Talk

Rethinking algorithms in Data Science: Scaling up optimization using non-convexity, provably

 


Monday, March 27, 2017, 10:30am

 

With the quantity of data generated in most research areas ever increasing, conventional data analytics run into severe computational, storage, and communication bottlenecks. These obstacles often force practitioners to rely on algorithmic heuristics in an attempt to convert data into useful information quickly. It is necessary to rethink algorithmic design and devise smarter, provable methods that flexibly balance the trade-offs among solution accuracy, efficiency, and data interpretability.

In this talk, I will focus on the problem of low-rank matrix inference in large-scale settings. Such problems appear in fundamental applications such as structured inference, recommendation systems, and multi-label classification. I will introduce a novel theoretical framework for analyzing the performance of non-convex first-order methods, which are often used as heuristics in practice. These methods offer computational gains over classic convex approaches, but their analysis remains open for most problems. This talk will provide precise theoretical guarantees, answering the long-standing question "why do such non-convex techniques behave well in practice?" for a wide class of problems. I will discuss implementation details of these ideas and, if time permits, show the superior performance we can obtain in applications in the physical sciences and machine learning.
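To give a concrete sense of the methods discussed, the sketch below (an illustration in Python/NumPy, not the speaker's algorithm or code) shows factored gradient descent, a representative non-convex first-order method: rather than optimizing over the full matrix X, it runs gradient descent on a rank-r factor U with X = UU^T, here on a toy matrix-completion objective with a symmetric positive semidefinite target.

```python
# A minimal sketch of a non-convex first-order method for low-rank matrix
# recovery: gradient descent on the factor U of X = U @ U.T (illustrative
# only; not the speaker's algorithm or implementation).
import numpy as np

def factored_gradient_descent(M_obs, mask, r, step=1e-3, iters=3000, seed=0):
    """Estimate a PSD rank-r matrix from partially observed entries.

    M_obs : (n, n) array with the observed entries of M (zeros elsewhere)
    mask  : (n, n) boolean array, True where an entry is observed
    r     : target rank
    """
    n = M_obs.shape[0]
    rng = np.random.default_rng(seed)
    U = 0.01 * rng.standard_normal((n, r))        # small random initialization
    for _ in range(iters):
        residual = mask * (U @ U.T - M_obs)       # error on observed entries only
        grad = 2.0 * (residual + residual.T) @ U  # gradient of ||P_Omega(UU^T - M)||_F^2 in U
        U -= step * grad
    return U @ U.T

# Toy usage: recover a random rank-2 PSD matrix from ~60% of its entries.
if __name__ == "__main__":
    n, r = 50, 2
    rng = np.random.default_rng(1)
    A = rng.standard_normal((n, r))
    M = A @ A.T                                   # ground-truth low-rank matrix
    mask = rng.random((n, n)) < 0.6               # random observation pattern
    M_hat = factored_gradient_descent(M * mask, mask, r)
    print("relative error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))
```

The non-convexity here comes from the factorization itself; the framework in the talk addresses precisely when such iterations provably recover the underlying low-rank matrix despite that non-convexity.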

Speaker: Anastasios Kyrillidis

Bio

Anastasios Kyrillidis received his PhD in Electrical and Computer Engineering from the École Polytechnique Fédérale de Lausanne (EPFL) in 2014. He is currently a Simons Foundation postdoctoral researcher at the University of Texas at Austin. His research interests include convex and non-convex optimization and machine learning.

Location: CoRE A 301

Committee

Vladimir Pavlovic and Dimitris Metaxas

Event Type: Faculty Candidate Talk

Organization

University of Texas at Austin