An efficient sparse metric learning in high-dimensional space via l1-penalized log-determinant regularization

Published: 14 June 2009

Abstract

This paper proposes an efficient sparse metric learning algorithm in high-dimensional space via l1-penalized log-determinant regularization. Compared with most existing distance metric learning algorithms, the proposed algorithm exploits the sparsity nature underlying the intrinsic high-dimensional feature space. This sparsity prior on the learned distance metric serves to regularize the complexity of the distance model, especially in the "small sample number p and high dimension d" setting. Theoretically, by analogy to the covariance estimation problem, we show that the proposed algorithm is consistent at rate O(√(m² log d / n)) with respect to the target distance matrix with at most m nonzeros per row. Moreover, from the implementation perspective, this l1-penalized log-determinant formulation can be optimized efficiently in a block coordinate descent fashion, which is much faster than the standard semi-definite programming widely adopted in many other advanced distance learning algorithms. We compare this algorithm with other state-of-the-art methods on various datasets, and competitive results are obtained.
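The objective the abstract describes can be illustrated with a small sketch. A learned metric M defines distances of the form (x−y)ᵀM(x−y), and the l1 penalty drives entries of M to exactly zero. Below is a toy evaluation of an l1-penalized log-determinant objective for a 2×2 matrix in plain Python; all names (`S`, `M`, `lam`, `objective`) are illustrative assumptions, not the paper's implementation.

```python
import math

# Illustrative sketch (not the paper's code) of an l1-penalized
# log-determinant objective for a 2x2 metric matrix M:
#   f(M) = trace(S M) - log det(M) + lam * ||M||_1

def logdet2(M):
    """Log-determinant of a 2x2 positive-definite matrix."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return math.log(det)

def objective(S, M, lam):
    """l1-penalized log-determinant objective for 2x2 inputs."""
    trace_SM = sum(S[i][j] * M[j][i] for i in range(2) for j in range(2))
    l1_norm = sum(abs(M[i][j]) for i in range(2) for j in range(2))
    return trace_SM - logdet2(M) + lam * l1_norm

# With lam > 0, the penalty can favor sparse (many-zero) metrics:
S = [[1.0, 0.2], [0.2, 1.0]]          # e.g. an empirical second-moment matrix
M_sparse = [[1.0, 0.0], [0.0, 1.0]]   # identity: off-diagonals exactly zero
M_dense = [[1.0, -0.3], [-0.3, 1.0]]

print(objective(S, M_sparse, 0.5))    # objective of the sparse candidate
print(objective(S, M_dense, 0.5))     # objective of the dense candidate
```

In this toy instance the sparse candidate attains the lower penalized objective; a block coordinate descent solver, as in the graphical-lasso literature the paper cites, would minimize such an objective one row/column of M at a time.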


Published In

ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning
June 2009
1331 pages
ISBN:9781605585161
DOI:10.1145/1553374

Sponsors

  • NSF
  • Microsoft Research
  • MITACS

Publisher

Association for Computing Machinery

New York, NY, United States

Acceptance Rates

Overall Acceptance Rate 140 of 548 submissions, 26%
