Browsing by Subject "mathematics, interdisciplinary applications"
Now showing 1 - 13 of 13
Item: Bayesian Estimation of Intensity Surfaces on the Sphere via Needlet Shrinkage and Selection (2011)
Scott, James G.
This paper describes an approach to Bayesian modeling of spherical data sets. Our method is based upon a recent construction called the needlet, a particular form of spherical wavelet with many favorable statistical and computational properties. We perform shrinkage and selection of needlet coefficients, focusing on two main alternatives: empirical-Bayes thresholding and Bayesian local shrinkage rules. We study the performance of the proposed methodology both on simulated data and on two real data sets: one involving the cosmic microwave background radiation, and one involving the reconstruction of a global news intensity surface inferred from published Reuters articles in August 1996. The fully Bayesian approach based on robust, sparse shrinkage priors seems to outperform the other alternatives.

Item: Bayesian Estimation of the Discrepancy with Misspecified Parametric Models (2013)
De Blasi, Pierpaolo; Walker, Stephen G.
We study a Bayesian model where we have made specific requests about the parameter values to be estimated. The aim is to find the parameter of a parametric family which minimizes a distance to the data-generating density, and then to estimate the discrepancy using nonparametric methods. We illustrate how coherent updating can proceed given that the standard Bayesian posterior from an unidentifiable model is inappropriate. Our updating is performed using Markov chain Monte Carlo methods; in particular, a novel method for dealing with intractable normalizing constants is required. Illustrations using synthetic data are provided.

Item: Bayesian Forecasting of Prepayment Rates for Individual Pools of Mortgages (2008)
Popova, Ivillina; Popova, Elmira; George, Edward I.
This paper proposes a novel approach for modeling prepayment rates of individual pools of mortgages. Via Bayesian methodology, the model incorporates the empirical evidence that prepayment is path-dependent. Many factors influence prepayment behavior, and for many of them information is unavailable or impossible to gather. We address this by building a Bayesian mixture model and constructing a Markov chain Monte Carlo algorithm to estimate its parameters. We assess the model on a data set from the Bloomberg database. Our results show that the burnout effect is a significant variable for explaining normal prepayment activity; this result does not hold when prepayment is triggered by non-pool-dependent events. We show how to use the new model to compute prices for mortgage-backed securities. Monte Carlo simulation is the traditional method for obtaining such prices, and the proposed model can easily be incorporated within a simulation pricing framework. Prices for standard pass-throughs are obtained using simulation.

Item: Computational Methods for Parameter Estimation in Climate Models (2008)
Villagran, Ale; Huerta, Gabriel; Jackson, Charles S.; Sen, Mrinal K.
Intensive computational methods have been used by Earth scientists on a wide range of problems in data inversion and uncertainty quantification, such as earthquake epicenter location and climate projections. Quantifying the uncertainties resulting from a range of plausible model configurations requires estimating a multidimensional probability distribution. The computational cost of estimating these distributions for geoscience applications is impractical using traditional methods such as Metropolis/Gibbs algorithms, since simulation costs limit the number of experiments that can reasonably be run. Several alternative sampling strategies that could improve sampling efficiency have been proposed, including Multiple Very Fast Simulated Annealing (MVFSA) and Adaptive Metropolis algorithms. The performance of these sampling strategies is evaluated with a surrogate climate model that approximates the noise and response behavior of a realistic atmospheric general circulation model (AGCM). The surrogate model is fast enough that its evaluation can be embedded in these Monte Carlo algorithms. We show that adaptive methods can be superior to MVFSA in approximating the known posterior distribution with fewer forward evaluations. However, the adaptive methods can also be limited by inadequate sample mixing. The Single Component and Delayed Rejection Adaptive Metropolis algorithms were found to resolve these limitations, although challenges remain in approximating multi-modal distributions. The results show that these advanced methods of statistical inference can provide practical solutions to the climate-model calibration problem and to the challenge of quantifying climate projection uncertainties. The computational methods would also be useful for problems outside climate prediction, particularly those where sampling is limited by the availability of computational resources.
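The Adaptive Metropolis strategy named in the climate-model abstract above is easy to sketch. Below is a minimal, illustrative Python version of Haario-style adaptation, with a toy bivariate Gaussian standing in for the surrogate-model posterior; the function names, tuning constants, and target are assumptions for illustration, not the authors' code.

```python
# Minimal sketch of Adaptive Metropolis: the proposal covariance is
# tuned from the accumulated sample history (Haario et al., 2001).
# The Gaussian target below is a stand-in, not the surrogate AGCM posterior.
import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=5000, adapt_start=500, eps=1e-6):
    d = len(x0)
    sd = 2.4**2 / d                      # standard AM scaling factor
    chain = np.empty((n_iter, d))
    x, lp = np.asarray(x0, float), log_post(x0)
    cov = np.eye(d)                      # initial proposal covariance
    for t in range(n_iter):
        if t > adapt_start:              # adapt from the history so far
            cov = np.cov(chain[:t].T) + eps * np.eye(d)
        prop = np.random.multivariate_normal(x, sd * cov)
        lp_prop = log_post(prop)
        if np.log(np.random.rand()) < lp_prop - lp:   # MH accept step
            x, lp = prop, lp_prop
        chain[t] = x
    return chain

# Toy "posterior": a correlated bivariate Gaussian.
target_prec = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
log_post = lambda x: -0.5 * x @ target_prec @ x
samples = adaptive_metropolis(log_post, x0=np.zeros(2))
```

In the climate setting the expensive part is each call to `log_post`, which is why embedding a fast surrogate model inside the loop matters.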
Item: Dynamic Financial Index Models: Modeling Conditional Dependencies via Graphs (2011)
Wang, Hao; Reeson, Craig; Carvalho, Carlos M.
We discuss the development and application of dynamic graphical models for multivariate financial time series in the context of financial index models. The use of graphs generalizes the independent-residual-variation assumption of index models with a more complex yet still parsimonious alternative. Working within the dynamic matrix-variate graphical model (DGM) framework, we develop general time-varying index models that are analytically tractable. In terms of methodology, we carefully explore strategies for dealing with graph uncertainty and discuss the implementation of a novel computational tool for sequentially learning the conditional independence relationships defining the model. Additionally, motivated by our applied context, we extend the DGM framework to accommodate random regressors. Finally, in a case study involving 100 stocks, we show that the proposed methodology generates improvements in covariance forecasting and portfolio optimization.

Item: Modeling Space-Time Data Using Stochastic Differential Equations (2009)
Duan, Jason A.; Gelfand, Alan E.; Sirmans, C. F.
This paper demonstrates the use and value of stochastic differential equations for modeling space-time data in two common settings. The first consists of point-referenced or geostatistical data, where observations are collected at fixed locations and times. The second considers random point-pattern data, where the emergence of locations and times is random. For both cases, we employ stochastic differential equations to describe a latent process within a hierarchical model for the data. The intent is to view this latent process mechanistically and endow it with simple, appropriate features and interpretable parameters. A motivating problem for the second setting is to model urban development through observed locations and times of new home construction; this gives rise to a space-time point pattern. We show that a spatio-temporal Cox process whose intensity is driven by a stochastic logistic equation is a viable mechanistic model that affords meaningful interpretation of the results of statistical inference. Other applications of stochastic logistic differential equations with space-time-varying parameters include modeling population growth and product diffusion, which motivate our first, point-referenced data application. We propose a method to discretize both time and space in order to fit the model. We demonstrate inference for the geostatistical model on a simulated dataset, and then fit the Cox process model to a real dataset from the greater Dallas metropolitan area.
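The Cox-process construction in the space-time abstract above can be illustrated in one (temporal) dimension: a stochastic logistic equation drives the intensity, and event times are obtained by Lewis-Shedler thinning. This is a hedged toy sketch; the parameter values and the purely temporal setting are illustrative assumptions, not the paper's spatio-temporal model.

```python
# Simulate a stochastic logistic intensity by Euler-Maruyama, then draw
# event times from the Cox process by thinning a dominating Poisson process.
import numpy as np

rng = np.random.default_rng(0)

# Euler-Maruyama path of  dX_t = r X_t (1 - X_t/K) dt + sigma X_t dW_t
r, K, sigma, dt, T = 1.0, 100.0, 0.15, 0.01, 10.0
n = int(T / dt)
X = np.empty(n + 1)
X[0] = 5.0
for i in range(n):
    dW = rng.normal(0.0, np.sqrt(dt))
    X[i + 1] = max(X[i] + r * X[i] * (1 - X[i] / K) * dt + sigma * X[i] * dW, 1e-8)

# Lewis-Shedler thinning: simulate candidates at the path's maximum rate
# and keep each with probability lambda(t) / lam_max.
lam_max = X.max()
n_cand = rng.poisson(lam_max * T)
cand = np.sort(rng.uniform(0.0, T, n_cand))
keep = rng.uniform(size=n_cand) < X[(cand / dt).astype(int)] / lam_max
event_times = cand[keep]
```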
Item: Nonparametric Bayesian Bi-Clustering for Next-Generation Sequencing Count Data (2013)
Xu, Yanxun; Lee, Juhee; Yuan, Yuan; Mitra, Riten; Liang, Shoudan; Muller, Peter; Ji, Yi
Histone modifications (HMs) play important roles in transcription through post-translational modifications. Combinations of HMs, known as chromatin signatures, encode specific messages for gene regulation. We therefore expect that inference on possible clustering of HMs, and an annotation of genomic locations on the basis of such clustering, can contribute new insights into the functions of regulatory elements and their relationships to combinations of HMs. We propose a nonparametric Bayesian local clustering Poisson model (NoB-LCP) to facilitate posterior inference on two-dimensional clustering of HMs and genomic locations. The NoB-LCP clusters HMs into HM sets and lets each HM set define its own clustering of genomic locations. Furthermore, it probabilistically excludes HMs and genomic locations that are irrelevant to clustering. By doing so, the proposed model effectively identifies important sets of HMs and groups regulatory elements with similar functionality based on HM patterns.

Item: Objective Prior for the Number of Degrees of Freedom of a t Distribution (2014)
Villa, Cristiano; Walker, Stephen G.
In this paper, we construct an objective prior for the degrees of freedom of a t distribution when the parameter is taken to be discrete. This parameter is typically problematic to estimate, and a challenge for objective Bayesian inference, since improper priors lead to improper posteriors whilst proper priors may dominate the data likelihood. We find an objective criterion based on loss functions, instead of trying to define objective probabilities directly. Truncating the prior on the degrees of freedom is necessary, as above a certain number of degrees of freedom the t distribution becomes practically indistinguishable from the normal distribution. The prior is tested in simulation scenarios, including linear regression with t-distributed errors, and on real data: the daily returns of the closing Dow Jones index over a period of 98 days.

Item: On the Half-Cauchy Prior for a Global Scale Parameter (2012)
Polson, Nicholas G.; Scott, James G.
This paper argues that the half-Cauchy distribution should replace the inverse-gamma distribution as a default prior for a top-level scale parameter in Bayesian hierarchical models, at least for cases where a proper prior is necessary. Our arguments involve a blend of Bayesian and frequentist reasoning and are intended to complement the case made by Gelman (2006) in support of folded-t priors. First, we generalize the half-Cauchy prior to the wider class of hypergeometric inverted-beta priors. We derive expressions for posterior moments and marginal densities when these priors are used for a top-level normal variance in a Bayesian hierarchical model. We go on to prove a proposition that, together with the results for moments and marginals, allows us to characterize the frequentist risk of the Bayes estimators under all global-shrinkage priors in the class. These results, in turn, allow us to study the frequentist properties of the half-Cauchy prior versus a wide class of alternatives. The half-Cauchy occupies a sensible middle ground within this class: it performs well near the origin but does not lead to drastic compromises in other parts of the parameter space. This provides an alternative, classical justification for the routine use of this prior. We also consider situations where the underlying mean vector is sparse, where we argue that the usual conjugate choice of an inverse-gamma prior is particularly inappropriate and can severely distort inference. Finally, we summarize some open issues in the specification of default priors for scale terms in hierarchical models.
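The half-Cauchy prior discussed above is convenient in practice via its inverse-gamma mixture representation: tau ~ C+(0, A) exactly when tau^2 | a ~ IG(1/2, 1/a) with a ~ IG(1/2, 1/A^2). The Gibbs sampler below applies this to a toy normal-means model; the model and all settings are illustrative assumptions, not code from the paper.

```python
# Gibbs sampler for y_i ~ N(theta_i, 1), theta_i ~ N(0, tau^2),
# tau ~ C+(0, 1), using the inverse-gamma mixture representation.
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, size=50)        # toy data
p = len(y)
tau2, a = 1.0, 1.0
draws = []
for _ in range(2000):
    # theta_i | y, tau2 ~ N(y_i * tau2/(1+tau2), tau2/(1+tau2))
    shrink = tau2 / (1.0 + tau2)
    theta = rng.normal(shrink * y, np.sqrt(shrink))
    # tau2 | theta, a ~ InvGamma((p+1)/2, 1/a + sum(theta^2)/2)
    tau2 = (1.0 / a + 0.5 * theta @ theta) / rng.gamma((p + 1) / 2, 1.0)
    # a | tau2 ~ InvGamma(1, 1 + 1/tau2)   (A = 1 in C+(0, A))
    a = (1.0 + 1.0 / tau2) / rng.gamma(1.0, 1.0)
    draws.append(np.sqrt(tau2))
# `draws` approximates the posterior of the global scale tau
```

Note how every conditional is conjugate; this is the practical payoff of the mixture representation.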
Item: Particle Learning for General Mixtures (2010)
Carvalho, Carlos M.; Lopes, Hedibert F.; Polson, Nicholas G.; Taddy, Matt A.
This paper develops particle learning (PL) methods for the estimation of general mixture models. The approach is distinguished from alternative particle-filtering methods in two major ways. First, each iteration begins by resampling particles according to posterior predictive probability, leading to a more efficient set for propagation. Second, each particle tracks only the "essential state vector," thus leading to reduced-dimensional inference. In addition, we describe how the approach applies to more general mixture models of current interest in the literature; it is hoped that this will inspire more researchers to adopt sequential Monte Carlo methods for fitting their sophisticated mixture-based models. Finally, we show that PL leads to straightforward tools for marginal likelihood calculation and posterior cluster allocation.

Item: Rejoinder (2014-12)
Windle, Jesse; Carvalho, Carlos M.

Item: A Simple Class of Bayesian Nonparametric Autoregression Models (2013)
Di Lucca, Maria Anna; Guglielmi, Alessandra; Mueller, Peter; Quintana, Fernando A.
We introduce a model for a time series of continuous outcomes that can be expressed as fully nonparametric regression or density regression on lagged terms. The model is based on a dependent Dirichlet process prior on a family of random probability measures indexed by the lagged covariates. The approach is also extended to sequences of binary responses. We discuss implementation and applications of the models to a sequence of waiting times between eruptions of the Old Faithful Geyser, and to a dataset consisting of sequences of recurrence indicators for tumors in the bladders of several patients.
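Returning to the particle-learning entry above, the resample-propagate recipe and the "essential state vector" idea can be sketched for a deliberately simple case: a two-component Gaussian mixture with unit variances, known equal weights, and conjugate N(0, v0) priors on the component means, so each particle carries only component counts and sums. This is an illustrative toy under those assumptions, not the authors' general algorithm.

```python
# Particle learning sketch: resample by predictive probability, then
# propagate by drawing an allocation and updating sufficient statistics.
import numpy as np

rng = np.random.default_rng(2)
v0, n_part, K = 25.0, 500, 2
y = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])
rng.shuffle(y)

counts = np.zeros((n_part, K))           # n_k per particle
sums = np.zeros((n_part, K))             # running sum of y's per component

for yt in y:
    post_var = 1.0 / (1.0 / v0 + counts)             # posterior var of mu_k
    post_mean = post_var * sums                      # posterior mean of mu_k
    pred_var = 1.0 + post_var
    # predictive density of yt under each component, per particle
    dens = np.exp(-0.5 * (yt - post_mean) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
    weights = dens.mean(axis=1)                      # equal mixture weights
    # 1) resample particles by posterior predictive probability
    idx = rng.choice(n_part, n_part, p=weights / weights.sum())
    counts, sums, dens = counts[idx], sums[idx], dens[idx]
    # 2) propagate: sample an allocation, update the essential state vector
    probs = dens / dens.sum(axis=1, keepdims=True)
    k = (rng.uniform(size=n_part) > probs[:, 0]).astype(int)
    counts[np.arange(n_part), k] += 1
    sums[np.arange(n_part), k] += yt
```

Because each particle stores only `(counts, sums)` rather than a full allocation history, the state being filtered stays low-dimensional as the data grow.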
Item: A Tractable State-Space Model for Symmetric Positive-Definite Matrices (2014-12)
Windle, Jesse; Carvalho, Carlos M.
The Bayesian analysis of a state-space model includes computing the posterior distribution of the system's parameters as well as its latent states. When the latent states wander around R^n, there are several well-known modeling components and computational tools that may be profitably combined to achieve this task. When the latent states are constrained to a strict subset of R^n, these models and tools are either impaired or break down completely. State-space models whose latent states are covariance matrices arise in finance and exemplify the challenge of devising tractable models in the constrained setting. To that end, we present a state-space model whose observations and latent states take values on the manifold of symmetric positive-definite matrices, and for which one may easily compute the posterior distribution of the latent states and the system's parameters, as well as filtered distributions and one-step-ahead predictions. Employing the model within the context of finance, we show how one can use realized covariance matrices as data to predict latent time-varying covariance matrices. This approach outperforms factor stochastic volatility.
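One intuition for why constrained latent states are hard: naive Gaussian innovations immediately leave the SPD manifold. A common workaround, shown below purely as a toy illustration and not as the paper's model, is to take random-walk steps in the matrix-log domain, where the state is unconstrained, and map back with the matrix exponential.

```python
# Toy SPD random walk: perturb in the matrix-log domain so every step
# stays on the manifold of symmetric positive-definite matrices.
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(3)

def symmetrize(M):
    return 0.5 * (M + M.T)

def spd_random_walk_step(S, step=0.05):
    """Perturb an SPD matrix S; the result is SPD by construction."""
    L = symmetrize(logm(S))              # symmetric, unconstrained log-matrix
    noise = symmetrize(rng.normal(0.0, step, S.shape))
    return expm(L + noise)               # expm of a symmetric matrix is SPD

S = np.eye(3)
path = [S]
for _ in range(10):
    S = spd_random_walk_step(S)
    path.append(S)
# every matrix in `path` is symmetric positive-definite
```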