
Bayesian nonparametric predictions for count time series

CANALE, Antonio
2012-01-01

Abstract

Notwithstanding the central role of the Box-Jenkins Gaussian autoregressive moving average models for continuous time series, there is no comparably dominant technique for count time series. In this paper we introduce a Bayesian nonparametric methodology for producing coherent predictions of a count time series $\{X_t\}$ based on the nonnegative INteger-valued AutoRegressive process of order 1 (INAR(1)) introduced by Al-Osh and Alzaid (1987) and McKenzie (1988). INAR models evolve as a birth-and-death process in which the value at time $t$ can be modeled as the sum of the survivors from time $t-1$ and the outcome of an innovation process with a certain discrete distribution. Neither of these components is observable. Our predictions are based on estimates of the $p$-step-ahead predictive mass functions, assuming a nonparametric prior distribution for the innovation process. Specifically, we model this distribution with a Dirichlet process mixture of rounded Gaussians (Canale and Dunson, 2011). This class of priors has large support on the space of probability mass functions and can generate almost any count distribution, including over- or under-dispersed and multimodal ones. An efficient Gibbs sampler is developed for posterior computation, and the methodology is used to analyze real data sets.
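The abstract describes the INAR(1) decomposition only in words; purely for orientation, a standard formulation of the binomial-thinning recursion underlying INAR(1) models (following Al-Osh and Alzaid, 1987) is sketched below. The symbols $\alpha$, $B_i$ and $\varepsilon_t$ are generic notation, not taken from the paper itself:

$$
X_t = \alpha \circ X_{t-1} + \varepsilon_t, \qquad \alpha \circ X_{t-1} = \sum_{i=1}^{X_{t-1}} B_i, \quad B_i \stackrel{iid}{\sim} \mathrm{Bernoulli}(\alpha),
$$

where $\alpha \circ X_{t-1}$ counts the survivors from time $t-1$ and $\{\varepsilon_t\}$ is an i.i.d. sequence of nonnegative integer-valued innovations, whose distribution is the object given a Dirichlet process mixture of rounded Gaussians prior in the methodology above. Under this recursion the one-step transition probabilities take the usual convolution form

$$
P(X_{t+1}=y \mid X_t=x) = \sum_{k=0}^{\min(x,y)} \binom{x}{k} \alpha^{k} (1-\alpha)^{x-k}\, p_{\varepsilon}(y-k),
$$

with $p_{\varepsilon}$ the innovation probability mass function; these transition probabilities are the building blocks of the $p$-step-ahead predictive mass functions mentioned in the abstract.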
2012
14
49
52
Keywords: INAR(1); Dirichlet process mixtures; Gibbs sampling algorithm
Authors: Luisa Bisaglia; Antonio Canale

Use this identifier to cite or link to this item: https://hdl.handle.net/2318/129526