I added a link to the paper “A Multilevel Framework for Sparse Optimization With Application to Inverse Covariance Estimation and Logistic Regression,” soon to appear in the SIAM Journal on Scientific Computing (SISC). The paper describes a multilevel framework that accelerates sparse optimization methods which use L1 regularization to obtain sparse solutions. We show how to apply this framework to sparse inverse covariance estimation (also known as GLASSO) and to L1-regularized logistic regression.
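For readers unfamiliar with the two problems mentioned above, here is a minimal sketch of what they look like using off-the-shelf scikit-learn solvers as a baseline. This is only an illustration of the problem setup, not the multilevel acceleration described in the paper; the data, parameter values, and solver choices below are my own assumptions.

```python
# Baseline illustration (not the paper's multilevel method) of the two
# L1-regularized problems the paper accelerates.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.covariance import GraphicalLasso

# L1-regularized logistic regression: the L1 penalty drives many coefficients to zero.
X, y = make_classification(n_samples=500, n_features=100, n_informative=10, random_state=0)
logreg = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
print("nonzero logistic regression coefficients:", np.count_nonzero(logreg.coef_))

# Sparse inverse covariance estimation (GLASSO): an L1 penalty on the entries
# of the precision (inverse covariance) matrix yields a sparse estimate.
Z = np.random.RandomState(0).randn(500, 20)
glasso = GraphicalLasso(alpha=0.1).fit(Z)
print("nonzero precision matrix entries:", np.count_nonzero(glasso.precision_))
```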
New work presented at the Optimization workshop at NIPS 2015
At the beginning of November, our work “A Multilevel Acceleration for l1-regularized Logistic Regression,” on accelerating the L1-regularized logistic regression problem, was accepted to the Optimization workshop at NIPS 2015. Last week, I presented the work at the workshop. The Optimization workshop grew considerably this year, with about 50 posters covering a range of optimization topics.
This work was a collaboration between Eran Treister (University of British Columbia) and myself (Intel Labs).