Discussion on article “Bayesian inference with misspecified models” by Stephen G. Walker

DE BLASI, Pierpaolo
2013-01-01

Abstract

The paper by Stephen Walker offers an interesting view of the rationale of Bayesian inference with misspecified models. The author resorts to de Finetti's representation theorem to justify a more flexible use of Bayes theorem, flexible in that it requires fewer assumptions on the data-generating process. Predictive densities are seen as guesses that obey some form of symmetry when learning from past observations. They can be chosen to define an exchangeable law for the observables which need not conform to the way the data are actually generated. Through the representation theorem, it is possible to separate the statistical model (that is, the likelihood) from the prior and to regard the former as a suitable approximation to the stochastic phenomena of interest. Posterior inference then has to be validated in terms of its asymptotic behavior with respect to the data-generating process. In this note we address two related issues. In Section 1 some results on the predictive construction of parametric models are reviewed; they help to gain insight into Walker's use of Bayes theorem in the case of misspecification. In Section 2 a parametric family of densities is considered. Given that the interest is in finding, through posterior inference, the parameter value that minimizes the divergence from the true density, it is worth considering estimation of the minimum divergence with Bayesian nonparametric methods.
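As a brief sketch of the setup the abstract refers to (the notation here is assumed for illustration, not taken from the record): writing f_0 for the data-generating density and {f_\theta : \theta \in \Theta} for the possibly misspecified parametric family, the predictive density induced by Bayes theorem and the pseudo-true parameter targeted by posterior inference can be written as

\[
p(x_{n+1} \mid x_1, \ldots, x_n) = \int_\Theta f_\theta(x_{n+1}) \, \pi(d\theta \mid x_1, \ldots, x_n),
\qquad
\theta^\ast = \arg\min_{\theta \in \Theta} \int f_0(x) \log \frac{f_0(x)}{f_\theta(x)} \, dx,
\]

where the second integral is the Kullback-Leibler divergence of f_\theta from f_0, i.e. the "divergence with the true density" whose minimum value Section 2 proposes to estimate by Bayesian nonparametric methods.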
Journal of Statistical Planning and Inference, 2013, Vol. 143, No. 10, pp. 1634-1637
http://www.sciencedirect.com/science/journal/03783758/143/10
Keywords: Bayesian asymptotics, consistency, misspecified model
De Blasi P
Files in this record:
deblasi-2013pre.pdf — open access, author's postprint (final version), Adobe PDF, 211.98 kB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/140475
Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science (ISI): 2