Bayesian influence diagnostics using normalized functional Bregman divergence

Author(s):
Danilevicz, Ian M. ; Ehlers, Ricardo S.
Total Authors: 2
Document type: Journal article
Source: COMMUNICATIONS IN STATISTICS-THEORY AND METHODS; v. 51, n. 6, 16 pp., 2020-05-13.
Abstract

Ideally, any statistical inference should be robust to local influences. Although there are simple ways to check for leverage points in independent and linear problems, more complex models require more sophisticated methods. Kullback-Leibler and Bregman divergences have already been applied in Bayesian inference to measure the isolated impact of each observation on a model. We extend these ideas to models for dependent and independent data with normal and non-normal probability distributions. We also propose a strategy to rescale the functional Bregman divergence to lie in the (0,1) interval, thus facilitating interpretation and comparison. This is accomplished with minimal computational effort while maintaining all theoretical properties. For computational efficiency, we take advantage of Hamiltonian Monte Carlo methods to draw samples from the posterior distribution of model parameters. The resulting Markov chains are then directly connected with Bregman calculus, which results in fast computation. We evaluate our proposed strategies in both simulation and empirical studies. (AU)
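The workflow the abstract describes — reusing posterior draws to score each observation's influence, then rescaling the divergence to (0,1) — can be sketched for the Kullback-Leibler case. The sketch below uses the well-known case-deletion identity K(P, P(-i)) = log E[1/f(y_i|θ)] + E[log f(y_i|θ)] (expectations over the full posterior), and an illustrative `1 - exp(-K)` map to (0,1); the paper's normalized functional Bregman divergence is a different, more general construction, and the normal model with known variance stands in for actual HMC output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with one injected outlier (illustrative only)
y = rng.normal(0.0, 1.0, size=50)
y[0] = 6.0  # influential observation

# Stand-in for HMC draws: with known sd = 1 and a flat prior,
# the posterior of the mean is mu | y ~ N(ybar, 1/n)
n = len(y)
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(n), size=4000)

def loglik_point(yi, mu, sd=1.0):
    """Pointwise normal log-likelihood log f(y_i | mu)."""
    return -0.5 * np.log(2 * np.pi * sd**2) - 0.5 * ((yi - mu) / sd) ** 2

def kl_influence(yi, draws):
    """Case-deletion KL divergence estimated from full-posterior draws:
    K = log E[1/f(y_i|theta)] + E[log f(y_i|theta)]."""
    ll = loglik_point(yi, draws)
    m = (-ll).max()  # log-mean-exp for numerical stability
    log_mean_inv = m + np.log(np.mean(np.exp(-ll - m)))
    return log_mean_inv + ll.mean()

K = np.array([kl_influence(yi, mu_draws) for yi in y])

# Illustrative monotone rescaling to (0,1); the paper's own
# normalization of the Bregman divergence differs from this choice.
K_norm = 1.0 - np.exp(-K)

print(int(np.argmax(K_norm)))  # the injected outlier should rank first
```

Because the divergence is evaluated directly on the stored posterior draws, no model refitting per deleted observation is needed, which is the computational advantage the abstract highlights.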

FAPESP's process: 16/21137-2 - Bayesian Computation through Geometric and Variance Reduction Methods.
Grantee: Ricardo Sandes Ehlers
Support Opportunities: Regular Research Grants