A decision cognizant Kullback-Leibler divergence

Author(s):
Ponti, Moacir ; Kittler, Josef ; Riva, Mateus ; de Campos, Teofilo ; Zor, Cemre
Total Authors: 5
Document type: Journal article
Source: PATTERN RECOGNITION; v. 61, 9 pp.; Jan. 2017.
Abstract

In decision making systems involving multiple classifiers there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named the decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the minority classes, which obscure the true degree of classifier incongruence. We investigate the properties of the novel divergence measure analytically and by simulation studies. The proposed measure is demonstrated to be more robust to minority-class clutter. Its sensitivity to estimation noise is also shown to be considerably lower than that of the classical KL divergence. These properties render the DC-KL divergence a much better statistic for discriminating between classifier congruence and incongruence in pattern recognition systems. © 2016 Elsevier Ltd. All rights reserved.
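The abstract describes DC-KL only at a high level. The following is a minimal Python sketch of the idea, under the assumption (consistent with the abstract's description of reducing minority-class contributions) that the dominant class of each classifier's output is kept intact while all remaining minority classes are pooled into one composite class before the standard KL divergence is computed. The function names, the epsilon smoothing, and the example posteriors are illustrative, not taken from the paper.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Classical Kullback-Leibler divergence D(p || q)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def dc_kl_divergence(p, q, eps=1e-12):
    """Sketch of a decision cognizant KL divergence (DC-KL).

    Assumption: the dominant class of each classifier output (its
    decision) is kept as a separate class, and all remaining minority
    classes are pooled into a single composite class, so minority-class
    clutter contributes only through its aggregate mass.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Decisions of the two classifiers (dominant hypotheses).
    dominant = sorted({int(np.argmax(p)), int(np.argmax(q))})
    minority = [i for i in range(len(p)) if i not in dominant]
    # Reduced distributions: dominant classes plus one pooled minority class.
    p_red = np.array([p[i] for i in dominant] + [p[minority].sum()])
    q_red = np.array([q[i] for i in dominant] + [q[minority].sum()])
    return kl_divergence(p_red, q_red, eps)

# Two posteriors that agree on the dominant class but spread mass
# differently over the minority classes:
p = [0.70, 0.05, 0.10, 0.15]
q = [0.65, 0.15, 0.05, 0.15]
print(kl_divergence(p, q))     # ~0.066: inflated by minority-class disagreement
print(dc_kl_divergence(p, q))  # ~0.006: minority classes pooled, clutter suppressed

In this toy example both classifiers make the same decision, yet the classical KL divergence is an order of magnitude larger than the pooled variant, illustrating how minority-class clutter can masquerade as incongruence.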

FAPESP's process: 15/13504-2 - Anomaly detection using divergence measures and feature decomposition in video
Grantee: Moacir Antonelli Ponti
Support Opportunities: Scholarships abroad - Research
FAPESP's process: 15/24652-2 - Anomaly detection using an incremental learning algorithm based on minimum spanning tree
Grantee: Mateus Riva
Support Opportunities: Scholarships abroad - Research Internship - Scientific Initiation