Learning-Based Confidence Estimation for Multi-modal Classifier Fusion

Uzair Nadeem, Mohammed Bennamoun, Ferdous Sohel, Roberto Togneri

Research output: Chapter in Book/Conference paper › Conference paper › peer-review

5 Citations (Scopus)

Abstract

We propose a novel confidence estimation method for predictions from a multi-class classifier. Unlike existing methods, we learn a confidence estimator on a held-out set from the training data. The confidence values predicted by the proposed system are used to improve the accuracy of multi-modal emotion and sentiment classification: the class scores from the individual modalities are superposed on the basis of their confidence values. Experimental results demonstrate that the accuracy of the proposed confidence-based fusion method is significantly higher than that of a classifier trained on any single modality, and that it outperforms other fusion methods.
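The fusion rule described in the abstract (per-modality class scores superposed according to learned confidence values) can be sketched as a confidence-weighted sum of score vectors. The sketch below is illustrative only, not the authors' implementation; the function name, the weighting scheme, and the toy inputs are assumptions.

```python
# Minimal sketch of confidence-weighted score fusion (not the paper's code).
# Assumes each modality's classifier yields a class-score vector and a learned
# confidence estimator assigns a scalar confidence to each modality's prediction.
import numpy as np

def fuse_scores(modality_scores, confidences):
    """Superpose per-modality class scores weighted by their confidences.

    modality_scores: list of 1-D arrays, one score vector per modality.
    confidences: list of scalars, one per modality.
    Returns the index of the winning class after fusion.
    """
    scores = np.stack(modality_scores)            # (num_modalities, num_classes)
    weights = np.asarray(confidences).reshape(-1, 1)
    fused = (weights * scores).sum(axis=0)        # confidence-weighted superposition
    return int(np.argmax(fused))

# Toy usage with two modalities (e.g. text and audio) and three classes.
text_scores = np.array([0.2, 0.5, 0.3])
audio_scores = np.array([0.6, 0.3, 0.1])
print(fuse_scores([text_scores, audio_scores], confidences=[0.9, 0.4]))
```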

Original language: English
Title of host publication: Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
Subtitle of host publication: Part II
Editors: Tom Gedeon, Kok Wai Wong, Minho Lee
Place of Publication: Australia
Publisher: Springer
Pages: 299-312
Number of pages: 14
ISBN (Print): 9783030367107
DOIs
Publication status: Published - 15 Dec 2019
Event: 26th International Conference on Neural Information Processing - Sydney, Australia
Duration: 12 Dec 2019 - 15 Dec 2019
http://ajiips.com.au/iconip2019/

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11954 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 26th International Conference on Neural Information Processing
Abbreviated title: ICONIP 2019
Country/Territory: Australia
City: Sydney
Period: 12/12/19 - 15/12/19
Internet address: http://ajiips.com.au/iconip2019/
