Posts Tagged ‘ Sparsity ’

Sparsity, Sparsity, Sparsity … Some interesting papers

  1. L. Jacob, G. Obozinski, and J.-P. Vert, “Group lasso with overlap and graph lasso,” Proceedings of the 26th Annual International Conference on Machine Learning (ICML ’09), 2009, pp. 1-8.
  2. D.M. Witten, R. Tibshirani, and T. Hastie, “A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis,” Biostatistics, vol. 10, 2009, pp. 515-534.
  3. R. Jenatton, G. Obozinski, and F. Bach, “Structured Sparse Principal Component Analysis,” Journal of Machine Learning Research: W&CP, vol. 9, 2010, pp. 366-373.
  4. M. Yuan and Y. Lin, “Model selection and estimation in regression with grouped variables,” Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 68, 2006, pp. 49-67.
  5. S. Mosci, S. Villa, A. Verri, and L. Rosasco, “A primal-dual algorithm for group sparse regularization with overlapping groups,” Advances in Neural Information Processing Systems, 2010.
  6. J. Mairal, F. Bach, J. Ponce, and G. Sapiro, “Online Learning for Matrix Factorization and Sparse Coding,” Journal of Machine Learning Research, vol. 11, 2010, pp. 19-60.

Questioning Sparsity

Went through Rigamonti’s CVPR 2011 paper “Are Sparse Representations Really Relevant for Image Classification?” (Rigamonti, Brown, and Lepetit).

With so many recent papers on sparsity, the question above is a very valid one. The authors report a large number of experiments and compare many different techniques. Their conclusion is that sparsity is important while learning the feature dictionary, but not helpful during classification. The only thing the paper managed to convince me of, though, is that perhaps in their setting the convexity is not working.
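To make the dictionary-vs-classification distinction concrete, here is a minimal sketch (not from the paper; the dictionary, signal, and the `alpha` value are my own illustrative choices) contrasting the two ways a feature vector can be computed on a fixed dictionary: an L1-penalised sparse code versus an unpenalised least-squares code. The classification-time question is essentially whether the former buys you anything over the latter.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical overcomplete dictionary: 20 unit-norm atoms in a 10-dim signal space.
D = rng.standard_normal((10, 20))
D /= np.linalg.norm(D, axis=0)

# A signal built from just 3 atoms, i.e. genuinely sparse in D.
true_code = np.zeros(20)
true_code[[2, 7, 15]] = [1.0, -0.5, 2.0]
x = D @ true_code

# Sparse code: lasso (L1-penalised) regression of x onto the atoms.
sparse_code = Lasso(alpha=0.05, fit_intercept=False, max_iter=10_000).fit(D, x).coef_

# Dense code: plain least-squares projection, no sparsity penalty.
dense_code, *_ = np.linalg.lstsq(D, x, rcond=None)

print("non-zeros, sparse code:", int(np.sum(np.abs(sparse_code) > 1e-6)))
print("non-zeros, dense code: ", int(np.sum(np.abs(dense_code) > 1e-6)))
```

The lasso code concentrates its weight on a handful of atoms, while the least-squares solution spreads small weights over essentially all of them; Rigamonti et al.'s claim, as I read it, is that a downstream classifier does about as well with the dense version.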

Looking forward to seeing rebuttals, or papers questioning or answering the issues raised by Rigamonti et al., in the coming year. Overall, this appears to be a paper that will be cited quite a lot.