Constrained metric learning by permutation inducing isometries

Joel Bosveld, A. Mahmood, Du Huynh, Lyle Noakes

    Research output: Contribution to journal › Article › peer-review

    6 Citations (Scopus)
    10 Downloads (Pure)


    © 2015 IEEE. The choice of metric critically affects the performance of classification and clustering algorithms. Metric learning algorithms attempt to improve performance by learning a more appropriate metric. Unfortunately, most current algorithms learn a distance function that is not invariant to rigid transformations of images. Therefore, the distances between two images and their rigidly transformed pair may differ, leading to inconsistent classification or clustering results. We propose to constrain the learned metric to be invariant to geometry-preserving transformations of images that induce permutations in the feature space. The constraint that these transformations are isometries of the metric ensures consistent results and improves accuracy. Our second contribution is a dimension-reduction technique that is consistent with the isometry constraints. Our third contribution is the formulation of the isometry-constrained logistic discriminant metric learning (IC-LDML) algorithm, obtained by incorporating the isometry constraints into the objective function of the LDML algorithm. The proposed algorithm is compared with existing techniques on the publicly available Labeled Faces in the Wild, Viewpoint Invariant Pedestrian Recognition, and Toy Cars data sets. IC-LDML outperforms existing techniques for the tasks of face recognition, person identification, and object classification by a significant margin.
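    The core idea of the abstract — constraining a learned metric so that feature-space permutations induced by geometry-preserving image transforms are isometries — can be illustrated with a small sketch. This is not the authors' IC-LDML formulation; it is a minimal NumPy example assuming a Mahalanobis metric d_M(x, y)² = (x − y)ᵀ M (x − y) and a horizontal image flip as the permutation-inducing transform.

    ```python
    import numpy as np

    # Hedged sketch: enforce the isometry constraint P.T @ M @ P == M for a
    # permutation matrix P induced by a geometry-preserving image transform.

    rng = np.random.default_rng(0)

    # Features: a 2x2 grayscale image flattened row-major into R^4.
    # A horizontal flip of the image swaps feature indices 0<->1 and 2<->3.
    P = np.zeros((4, 4))
    for src, dst in enumerate([1, 0, 3, 2]):
        P[dst, src] = 1.0

    # Random positive semidefinite metric matrix M (unconstrained).
    A = rng.normal(size=(4, 4))
    M = A @ A.T

    # The flip is an involution (P @ P == I), so averaging over the
    # two-element group {I, P} projects M onto the constraint set
    # while preserving positive semidefiniteness.
    M_inv = 0.5 * (M + P.T @ M @ P)

    def mahalanobis_sq(x, y, M):
        """Squared Mahalanobis distance (x - y)^T M (x - y)."""
        d = x - y
        return float(d @ M @ d)

    x, y = rng.normal(size=4), rng.normal(size=4)
    d_orig = mahalanobis_sq(x, y, M_inv)
    d_flip = mahalanobis_sq(P @ x, P @ y, M_inv)
    assert np.isclose(d_orig, d_flip)  # distance unchanged under the flip
    ```

    Under the averaged metric, an image and its flipped version yield identical pairwise distances, which is the consistency property the paper's isometry constraints guarantee; the unconstrained M generally does not satisfy this.
    
    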
    Original language: English
    Article number: 7331653
    Pages (from-to): 92-103
    Number of pages: 12
    Journal: IEEE Transactions on Image Processing
    Issue number: 1
    Publication status: Published - Jan 2016


