%0 Conference Proceedings
%F weiaaai15
%A Liu, Wei
%A Mu, Cun
%A Ji, Rongrong
%A Ma, Shiqian
%A Smith, John R.
%A Chang, Shih-Fu
%T Low-Rank Similarity Metric Learning in High Dimensions
%B AAAI Conference on Artificial Intelligence (AAAI)
%C Austin, Texas, USA
%X Metric learning has become a widely used tool in machine learning. To reduce the high costs incurred by increasing dimensionality, low-rank metric learning has arisen as a more economical approach in both storage and computation. However, existing low-rank metric learning algorithms usually adopt nonconvex objectives and are hence sensitive to the choice of a heuristic low-rank basis. In this paper, we propose a novel low-rank metric learning algorithm that yields bilinear similarity functions. This algorithm scales linearly with input dimensionality in both space and time, and is therefore applicable to high-dimensional data domains. A convex objective free of heuristics is formulated by leveraging trace-norm regularization to promote low-rankness. Crucially, we prove that all globally optimal metric solutions must retain a certain low-rank structure, which enables our algorithm to decompose the high-dimensional learning task into two steps: an SVD-based projection and a metric learning problem with reduced dimensionality. The latter step can be tackled efficiently by employing a linearized Alternating Direction Method of Multipliers. The efficacy of the proposed algorithm is demonstrated through experiments performed on four benchmark datasets with tens of thousands of dimensions.
%U http://www.ee.columbia.edu/ln/dvmm/publications/15/AAAI15_metric.pdf
%D 2015