
A decision cognizant Kullback-Leibler divergence

Author(s):
Ponti, Moacir ; Kittler, Josef ; Riva, Mateus ; de Campos, Teofilo ; Zor, Cemre
Total number of authors: 5
Document type: Scientific article
Source: PATTERN RECOGNITION; v. 61, 9 pp., 2017-01-01.
Abstract

In decision making systems involving multiple classifiers there is the need to assess classifier (in)congruence, that is to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the minority classes, which obscure the true degree of classifier incongruence. We investigate the properties of the novel divergence measure analytically and by simulation studies. The proposed measure is demonstrated to be more robust to minority class clutter. Its sensitivity to estimation noise is also shown to be considerably lower than that of the classical KL divergence. These properties render the DC-KL divergence a much better statistic for discriminating between classifier congruence and incongruence in pattern recognition systems. (C) 2016 Elsevier Ltd. All rights reserved. (AU)
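The abstract contrasts the classical KL divergence with the proposed DC-KL, which suppresses the clutter contributed by minority classes. A minimal sketch of this idea, assuming (as one plausible reading of the abstract, not the paper's exact formulation) that DC-KL keeps only the dominant decision class(es) of the two classifier output distributions and pools the remaining minority classes into a single clutter class before computing KL:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Classical Kullback-Leibler divergence D(p || q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # eps guards against log(0) / division by zero for empty bins
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def dc_kl_divergence(p, q, eps=1e-12):
    """Decision-cognizant sketch: keep the class(es) that carry the
    decision (argmax) of either distribution, pool all remaining
    minority classes into one clutter class, then apply KL to the
    reduced distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    keep = sorted({int(np.argmax(p)), int(np.argmax(q))})
    rest = [i for i in range(len(p)) if i not in keep]
    p_red = np.append(p[keep], p[rest].sum())  # pooled clutter mass
    q_red = np.append(q[keep], q[rest].sum())
    return kl_divergence(p_red, q_red, eps)

# Two classifiers that agree on the decision (class 0) but disagree
# on how the minority mass is spread: KL reports spurious divergence,
# while the decision-cognizant variant ignores the clutter.
p = [0.70, 0.10, 0.10, 0.10]
q = [0.70, 0.15, 0.10, 0.05]
print(kl_divergence(p, q))     # > 0, inflated by minority classes
print(dc_kl_divergence(p, q))  # 0.0, decisions and pooled clutter agree
```

Here both outputs name class 0 with identical confidence, so the reduced distributions coincide and DC-KL is zero, illustrating the robustness to minority-class clutter that the abstract claims.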

FAPESP Process: 15/13504-2 - Anomaly detection via divergence measures and video feature decomposition
Grantee: Moacir Antonelli Ponti
Support type: Scholarships abroad - Research
FAPESP Process: 15/24652-2 - Anomaly detection using an incremental learning algorithm based on optimum-path forest
Grantee: Mateus Riva
Support type: Scholarships abroad - Research Internship - Scientific Initiation