A Comparison of Two Methods for Obtaining a Collective Posterior Distribution

Author(s):
Pulgrossi, Rafael Catoia; Oliveira, Natalia Lombardi; Polpo, Adriano; Izbicki, Rafael; Polpo, A; Stern, J; Louzada, F; Izbicki, R; Takada, H
Total number of authors: 9
Document type: Scientific article
Source: BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, MAXENT 37; v. 239, p. 10-pg., 2018-01-01.
Abstract

Bayesian inference is a powerful method that allows individuals to update their knowledge about any phenomenon when more information about it becomes available. In this paradigm, before data is observed, an individual expresses their uncertainty about the phenomenon of interest through a prior probability distribution. Then, after data is observed, this distribution is updated using Bayes' theorem. In many situations, however, one desires to evaluate the knowledge of a group rather than of a single individual. In this case, a way to combine information from different sources is by mixing their uncertainty. The mixture can be done in two ways: before or after the data is observed. Although both cases yield a collective posterior distribution, the two distributions can be substantially different. In this work, we present several comparisons between these two approaches with noninformative priors and use the Kullback-Leibler divergence to quantify the amount of information that is gained by each collective distribution.
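The two routes to a collective posterior described in the abstract can be made concrete with a small numerical sketch. The code below is purely illustrative and is not the paper's experiment: it assumes two hypothetical experts with arbitrary (informative) Beta priors for a Bernoulli parameter, equal pooling weights, and made-up binomial data, whereas the paper works with noninformative priors. It contrasts pooling the priors before updating with pooling the individual posteriors after updating, and reports a numerical Kullback-Leibler divergence between the two resulting collective distributions (Python with NumPy/SciPy).

import numpy as np
from scipy.stats import beta, betabinom

# Two hypothetical experts expressing Beta priors on a Bernoulli parameter theta
priors = [(2.0, 8.0), (8.0, 2.0)]   # (a_i, b_i), assumed for illustration only
weights = [0.5, 0.5]                # equal pooling weights (assumed)

# Observed data: k successes in n Bernoulli trials (made-up values)
n, k = 20, 14

# --- Route 1: pool the priors, then update with Bayes' theorem --------------
# The posterior under a mixture-of-Betas prior is again a mixture of Betas,
# with weights rescaled by each component's marginal likelihood (Beta-Binomial).
evidences = [w * betabinom.pmf(k, n, a, b) for w, (a, b) in zip(weights, priors)]
post_weights_1 = np.array(evidences) / sum(evidences)
post_params = [(a + k, b + n - k) for (a, b) in priors]

def posterior_pool_then_update(theta):
    return sum(w * beta.pdf(theta, a, b)
               for w, (a, b) in zip(post_weights_1, post_params))

# --- Route 2: update each prior separately, then pool the posteriors --------
def posterior_update_then_pool(theta):
    return sum(w * beta.pdf(theta, a, b)
               for w, (a, b) in zip(weights, post_params))

# --- Numerical KL divergence between the two collective posteriors ----------
theta = np.linspace(1e-6, 1 - 1e-6, 10_000)
p = posterior_pool_then_update(theta)
q = posterior_update_then_pool(theta)
kl_pq = np.sum(p * np.log(p / q)) * (theta[1] - theta[0])

print("mixture weights after updating:", post_weights_1.round(3))
print(f"KL(pool-then-update || update-then-pool) ~ {kl_pq:.4f}")

With these assumed numbers, the marginal likelihoods reweight the mixture components in the pool-then-update route, while the update-then-pool route keeps the original equal weights; that reweighting is precisely why the two collective posteriors can differ.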

FAPESP Process: 17/03363-8 - Interpretability and efficiency in hypothesis tests
Grantee: Rafael Izbicki
Support type: Regular Research Grant