Qualifying Exam
1/20/2017 10:00 am

Efficient k-Support-Norm Regularized Minimization via Fully Corrective Frank-Wolfe Method

Bo Liu, Ph.D. Student

Examination Committee: Professors Dimitris Metaxas, Kostas Bekris, Konstantinos Michmizos and Vinod Ganapathy


k-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite model; however, its per-iteration cost tends to be expensive, so solving the model can be time consuming. In our work, we reformulate the k-support-norm regularized problem as a constrained problem and propose a fully corrective Frank-Wolfe type algorithm to minimize the constrained model. We analyze the convergence behavior of the proposed algorithm. Experimental results demonstrate the utility of the k-support norm and the superior efficiency of the proposed algorithm.
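To illustrate the constrained reformulation, here is a minimal sketch of a standard (non-fully-corrective) Frank-Wolfe iteration over a k-support-norm ball, assuming a least-squares objective. It uses the fact that the k-support-norm unit ball is the convex hull of k-sparse vectors with unit Euclidean norm, so the linear minimization oracle reduces to a top-k selection on the gradient. The function names, the radius parameter `tau`, and the classic 2/(t+2) step size are illustrative choices, not the exact algorithm of the talk, which additionally performs a fully corrective re-optimization over the selected atoms.

```python
import numpy as np

def lmo_ksupport(grad, k, tau):
    # Linear minimization oracle over the k-support-norm ball of radius tau.
    # The ball is conv{x : ||x||_0 <= k, ||x||_2 <= tau}, so the minimizer of
    # <grad, s> is -tau times the normalized top-k-magnitude part of grad.
    idx = np.argsort(np.abs(grad))[-k:]
    s = np.zeros_like(grad)
    s[idx] = -grad[idx]
    norm = np.linalg.norm(s)
    return tau * s / norm if norm > 0 else s

def frank_wolfe(A, b, k, tau, iters=200):
    # Minimize 0.5 * ||Ax - b||^2 subject to the k-support-norm
    # constraint ||x||_{sp,k} <= tau (constrained reformulation of the
    # regularized problem).
    x = np.zeros(A.shape[1])
    for t in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the quadratic loss
        s = lmo_ksupport(grad, k, tau)    # cheap linear oracle, no prox step
        gamma = 2.0 / (t + 2.0)           # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x
```

The appeal over proximal gradient is that each iteration only needs this top-k oracle rather than the full k-support-norm proximal operator; the fully corrective variant would additionally refit x over the atoms gathered so far.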