Reverse training: An efficient approach for image set classification

Munawar Hayat, Mohammed Bennamoun, Senjian An

    Research output: Chapter in Book/Conference paper › Conference paper › peer-review

    17 Citations (Scopus)
    11 Downloads (Pure)


    This paper introduces a new approach, called reverse training, to efficiently extend binary classifiers to the task of multi-class image set classification. Unlike existing binary-to-multi-class extension strategies, which require multiple binary classifiers, the proposed approach is very efficient since it trains a single binary classifier to optimally discriminate the class of the query image set from all others. For this purpose, the classifier is trained with the images of the query set (labelled positive) and a randomly sampled subset of the training data (labelled negative). The trained classifier is then evaluated on the rest of the training images. The class whose images have the largest percentage classified as positive is predicted as the class of the query image set. The confidence level of the prediction is also computed and integrated into the proposed approach to further enhance its robustness and accuracy. Extensive experiments and comparisons with existing methods show that the proposed approach achieves state-of-the-art performance for face and object recognition on a number of datasets. © 2014 Springer International Publishing.
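    The procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the binary classifier here is a simple nearest-centroid rule standing in for the classifier the authors train, the `neg_frac` sampling ratio is an assumed parameter, and the confidence-level computation is omitted.

    ```python
    import numpy as np

    def reverse_train_predict(query_set, train_sets, neg_frac=0.2, seed=0):
        """Predict the class of `query_set` via reverse training (sketch).

        query_set  : (n, d) array of query-image feature vectors.
        train_sets : dict mapping class label -> (m, d) array of training images.

        A single binary classifier is trained with the query images as
        positives and a random subset of all training images as negatives;
        the class whose training images are most often classified positive
        is predicted.
        """
        rng = np.random.default_rng(seed)
        all_train = np.vstack(list(train_sets.values()))
        k = max(1, int(neg_frac * len(all_train)))
        neg = all_train[rng.choice(len(all_train), size=k, replace=False)]

        # Stand-in binary classifier: nearest centroid between the positive
        # (query) images and the sampled negative images.
        pos_c, neg_c = query_set.mean(axis=0), neg.mean(axis=0)

        def is_positive(x):
            return np.linalg.norm(x - pos_c) < np.linalg.norm(x - neg_c)

        # Fraction of each class's training images classified as positive;
        # the class with the largest fraction wins.
        scores = {c: float(np.mean([is_positive(x) for x in X]))
                  for c, X in train_sets.items()}
        return max(scores, key=scores.get)
    ```

    With two well-separated synthetic classes, a query set drawn near one class's cluster is assigned that class's label, mirroring the abstract's description of a single binary classifier replacing a bank of per-class classifiers.
    
    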
    Original language: English
    Title of host publication: Computer Vision – ECCV 2014. Lecture Notes in Computer Science
    Place of publication: Switzerland
    Volume: 8694 LNCS
    ISBN (Print): 9783319105987
    Publication status: Published - 2014
    Event: 13th European Conference on Computer Vision - Zurich, Switzerland
    Duration: 6 Sep 2014 – 12 Sep 2014
    Conference number: 13


    Conference: 13th European Conference on Computer Vision
    Abbreviated title: ECCV


