
Are Gibbs-type priors the most natural generalization of the Dirichlet process?

Pierpaolo De Blasi; Stefano Favaro; Igor Pruenster; Matteo Ruggiero
2015

Abstract

Discrete random probability measures and the exchangeable random partitions they induce are key tools for addressing a variety of estimation and prediction problems in Bayesian inference. Here we focus on the family of Gibbs–type priors, a recent and elegant generalization of the Dirichlet and the Pitman–Yor process priors. These priors share properties that are appealing from both a theoretical and an applied point of view: (i) they admit an intuitive predictive characterization justifying their use in terms of a precise assumption on the learning mechanism; (ii) they stand out in terms of mathematical tractability; (iii) they include several interesting special cases besides the Dirichlet and the Pitman–Yor processes. The goal of our paper is to provide a systematic and unified treatment of Gibbs–type priors and to highlight their implications for Bayesian nonparametric inference. We deal with their distributional properties, the resulting estimators, frequentist asymptotic validation and the construction of time–dependent versions. Applications, mainly concerning mixture models and species sampling, serve to convey the main ideas. The intuition inherent in this class of priors and the neat results they lead to make one wonder whether it actually represents the most natural generalization of the Dirichlet process.
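As a toy illustration of the predictive characterization mentioned in the abstract, the sketch below (an assumption of this record, not code from the paper) samples a random partition via the two-parameter Chinese restaurant process, the sequential predictive scheme underlying the Pitman–Yor process; setting sigma = 0 recovers the Dirichlet process case.

```python
import random

def pitman_yor_crp(n, sigma=0.5, theta=1.0, seed=42):
    """Sample a random partition of n items using the two-parameter
    (Pitman-Yor) Chinese restaurant process predictive rule:
      P(join existing cluster j) proportional to  n_j - sigma
      P(start a new cluster)     proportional to  theta + sigma * k
    where n_j is the size of cluster j and k the current number of
    clusters. Returns the list of cluster sizes."""
    rng = random.Random(seed)
    counts = []  # sizes of the clusters created so far
    for _ in range(n):
        k = len(counts)
        # unnormalized predictive weights: existing clusters, then a new one
        weights = [c - sigma for c in counts] + [theta + sigma * k]
        j = rng.choices(range(k + 1), weights=weights)[0]
        if j == k:
            counts.append(1)   # open a new cluster
        else:
            counts[j] += 1     # join an existing cluster
    return counts
```

Larger sigma makes new clusters more likely as k grows, which is the mechanism behind the heavier-tailed cluster-number growth of the Pitman–Yor process relative to the Dirichlet process.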
Volume: 37
First page: 212
Last page: 229
Journal URL: http://www.computer.org/portal/web/tpami
Bayesian nonparametrics; clustering; discrete random probability measure; consistency; Gibbs-type priors; Pitman-Yor process
P. De Blasi; S. Favaro; A. Lijoi; R.H. Mena; I. Pruenster; M. Ruggiero
Files in this product:

File: 2015-IEEE.pdf
Access: restricted
File type: publisher's PDF
Size: 1.02 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/2318/144723
Citations
  • PMC: 6
  • Scopus: 76
  • Web of Science (ISI): 72