CS Events

PhD Defense

Optimization in Sparse Learning: from Convexity to Non-convexity

 


Tuesday, October 02, 2018, 12:00pm

 

Powerful machine learning models and large-scale training data have motivated the rapid adoption of AI methods in applications such as data science, computer vision, and natural language processing. The explosive growth in model complexity and training data scale creates an urgent need for highly efficient model training algorithms. As a fundamental issue in machine learning, optimization research for model training continues to receive extensive attention from both academia and industry.

In this presentation, I will introduce my research on the design and analysis of optimization algorithms for sparse model learning problems. The learning objectives include optimizing convex models with sparsity-inducing regularizers, as well as model cardinality constrained minimization, which is a non-convex problem. In addition to single-machine algorithms, I will also introduce my recent research progress on communication-efficient distributed sparse model learning. The algorithm designed for each specific problem significantly improves model training efficiency compared to baseline algorithms.
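To make the first class of problems concrete (this is a generic illustration, not the speaker's specific algorithm), a minimal sketch of proximal gradient descent (ISTA) for the l1-regularized least-squares (Lasso) objective, assuming NumPy; the function and variable names here are illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero,
    # which is what produces exact zeros (sparsity) in the solution.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    # Proximal gradient (ISTA) for:  min_w 0.5*||A w - b||^2 + lam*||w||_1
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    w = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ w - b)              # gradient of the smooth loss
        w = soft_threshold(w - step * grad, step * lam)  # prox step on the l1 term
    return w

# Tiny synthetic example: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[[2, 7]] = [1.5, -2.0]
b = A @ w_true + 0.01 * rng.standard_normal(50)
w_hat = ista(A, b, lam=0.5)
```

The cardinality-constrained variant mentioned in the abstract replaces the l1 penalty with a hard constraint ||w||_0 <= k, which makes the problem non-convex; a common surrogate step there is hard thresholding (keeping only the k largest-magnitude entries) instead of soft thresholding.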

Speaker: Bo Liu


Location: CBIM 22

Committee

Prof. Dimitris Metaxas (Chair), Prof. Vladimir Pavlovic, Prof. Ahmed Elgammal, Prof. Michael Katehakis (Management Science and Information Systems)

Event Type: PhD Defense


Organization

Dept. of Computer Science