
Learning a Distance Metric by Empirical Loss Minimization

Wei Bian, Dacheng Tao

In this paper, we study the problem of learning a distance metric and propose a loss-function-based metric learning framework, in which the metric is estimated by minimizing an empirical risk over a training set. Under mild conditions on the instance distribution and the loss function used, we prove that the empirical risk converges to its expected counterpart at rate O(1/sqrt(n)), where n is the cardinality of the training set. In addition, under the assumption that the best metric minimizing the expected risk is bounded, we prove that the learned metric is consistent. We present two example algorithms within the proposed framework, one using a log loss function and the other a smoothed hinge loss function. Experimental results on data sets from the UCI machine learning repository demonstrate the effectiveness of the proposed algorithms.
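The general idea of the framework — parameterizing a Mahalanobis metric and minimizing an empirical log loss over labeled pairs by gradient descent — can be sketched as follows. This is a minimal illustration under assumed conventions (pairs labeled +1 for similar and -1 for dissimilar, a hypothetical distance threshold `b`), not the authors' exact algorithm.

```python
import numpy as np

def learn_metric(X, pairs, labels, lr=0.1, epochs=200):
    """Sketch: learn a Mahalanobis metric M = L^T L by minimizing an
    empirical log loss over labeled pairs.
    labels: +1 for a similar pair, -1 for a dissimilar pair.
    Generic illustration only, not the paper's exact algorithm."""
    d = X.shape[1]
    L = np.eye(d)  # parameterize M = L^T L so M stays positive semidefinite
    b = 1.0        # hypothetical threshold between similar/dissimilar distances
    for _ in range(epochs):
        for (i, j), y in zip(pairs, labels):
            diff = X[i] - X[j]
            z = diff @ L.T @ L @ diff          # squared distance under M
            margin = y * (b - z)               # positive when pair is classified correctly
            # log loss: log(1 + exp(-margin)); chain rule through margin and z
            s = -1.0 / (1.0 + np.exp(margin))  # d(loss)/d(margin)
            grad_L = s * y * (-2.0) * np.outer(L @ diff, diff)
            L -= lr * grad_L / len(pairs)
    return L.T @ L
```

The PSD constraint on the metric is handled here by the factorization M = L^T L rather than by projection; the paper's smoothed hinge loss variant would replace only the loss and its derivative in the inner loop.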