# Search Results

Showing 1 to 5 of 5 matching Articles

## Bayesian model comparison with un-normalised likelihoods

### Statistics and Computing (2017-03-01) 27: 403-422 , March 01, 2017

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of *biased* weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented; it offers some support for the use of biased estimates, but we advocate caution in their use.
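The random-weight idea in this abstract can be illustrated in miniature: replace each importance weight by an unbiased, simulation-based estimate, and the resulting evidence estimator remains unbiased. The sketch below is not the paper's algorithm; the Gaussian target, the proposal, and the mean-one gamma multiplier standing in for a simulated likelihood estimate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalised target: N(0, 1) without its constant, so the true
# normalizing constant is Z = sqrt(2 * pi).
def log_target_unnorm(x):
    return -0.5 * x**2

n = 100_000
x = rng.normal(0.0, 2.0, size=n)                  # proposal: N(0, 2^2)
log_q = -0.5 * (x / 2.0)**2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)

# Exact importance weights
w_exact = np.exp(log_target_unnorm(x) - log_q)

# Random-weight variant: each weight is replaced by an unbiased noisy
# estimate (a positive mean-one multiplier stands in for a
# simulation-based estimate of an intractable likelihood).
noise = rng.gamma(shape=4.0, scale=0.25, size=n)  # mean 1
w_noisy = w_exact * noise

Z_true = np.sqrt(2 * np.pi)
print(w_exact.mean(), w_noisy.mean())  # both close to Z_true (about 2.507)
```

The noisy weights inflate the variance of the estimator without biasing it, which is exactly the trade-off against deliberately *biased* but lower-variance weight estimates that the paper investigates.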

## Particle methods for maximum likelihood estimation in latent variable models

### Statistics and Computing (2008-03-01) 18: 47-57 , March 01, 2008

Standard methods for maximum likelihood parameter estimation in latent variable models rely on the Expectation-Maximization algorithm and its Monte Carlo variants. Our approach is different and motivated by considerations similar to those of simulated annealing; that is, we build a sequence of artificial distributions whose support concentrates on the set of maximum likelihood estimates. We sample from these distributions using a sequential Monte Carlo approach. We demonstrate state-of-the-art performance for several applications of the proposed approach.
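The annealing idea can be sketched as follows: target distributions proportional to $L(\theta)^{\gamma}$ for an increasing schedule of $\gamma$, so that the particles concentrate around the maximiser as $\gamma$ grows. This toy sketch uses a model with no latent variables and a hypothetical linear schedule; the paper's contribution is handling the latent-variable case, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: maximise the likelihood of a N(theta, 1) model,
# whose MLE is simply the sample mean of the data.
data = rng.normal(3.0, 1.0, size=50)

def log_lik(theta):
    return -0.5 * np.sum((data[None, :] - theta[:, None])**2, axis=1)

n = 2000
theta = rng.uniform(-10, 10, size=n)    # particles from a flat initial law
gammas = np.linspace(0.0, 5.0, 51)      # increasing annealing schedule

for g_prev, g_next in zip(gammas[:-1], gammas[1:]):
    # Reweight towards the next target, proportional to L(theta)^g_next
    logw = (g_next - g_prev) * log_lik(theta)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling
    theta = theta[rng.choice(n, size=n, p=w)]
    # One random-walk Metropolis move leaving L^g_next invariant
    prop = theta + rng.normal(0.0, 0.5, size=n)
    log_a = g_next * (log_lik(prop) - log_lik(theta))
    accept = np.log(rng.uniform(size=n)) < log_a
    theta[accept] = prop[accept]

print(theta.mean())   # concentrates near the MLE, i.e. data.mean()
```

Allowing $\gamma$ to grow past 1 is what distinguishes this from ordinary posterior sampling: the sequence of targets sharpens around the maximum likelihood estimate rather than stopping at the posterior.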

## Convergence of the SMC Implementation of the PHD Filter

### Methodology and Computing in Applied Probability (2006-06-01) 8: 265-291 , June 01, 2006

The probability hypothesis density (PHD) filter is a first moment approximation to the evolution of a dynamic point process which can be used to approximate the optimal filtering equations of the multiple-object tracking problem. We show that, under reasonable assumptions, a sequential Monte Carlo (SMC) approximation of the PHD filter converges in mean of order $p \geq 1$, and hence almost surely, to the true PHD filter. We also present a central limit theorem for the SMC approximation, show that the variance is finite under similar assumptions and establish a recursion for the asymptotic variance. This provides a theoretical justification for this implementation of a tractable multiple-object filtering methodology and generalises some results from sequential Monte Carlo theory.

## Parallel sequential Monte Carlo samplers and estimation of the number of states in a Hidden Markov Model

### Annals of the Institute of Statistical Mathematics (2014-06-01) 66: 553-575 , June 01, 2014

The majority of modelling and inference regarding Hidden Markov Models (HMMs) assumes that the number of underlying states is known a priori. However, this is often not the case, and determining the appropriate number of underlying states for an HMM is therefore of considerable interest. This paper proposes a parallel sequential Monte Carlo samplers framework to approximate the posterior distribution of the number of states. This requires no additional computational effort if approximating parameter posteriors conditioned on the number of states is also necessary. The proposed strategy is evaluated on a comprehensive set of simulated data and shown to outperform the state of the art in this area: although the approach is simple, it provides good performance by fully exploiting the particular structure of the problem. An application to business cycle analysis is also presented.
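Stripped of the HMM specifics, the framework amounts to running one SMC sampler per candidate model, reading off each sampler's normalizing-constant estimate, and normalising across models to obtain a posterior over the model index. A minimal sketch with two toy Gaussian models in place of HMMs with different numbers of states (the models, priors, and tuning constants below are all illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(1.0, 1.0, size=40)
n_data = len(data)

# Two candidate models for y_i ~ N(theta, 1):
#   M0: theta fixed at 0  (evidence = likelihood at theta = 0)
#   M1: theta ~ N(0, 1)   (evidence estimated by an annealed SMC sampler)

def log_lik(theta):
    theta = np.atleast_1d(theta)
    return (-0.5 * np.sum((data[None, :] - theta[:, None])**2, axis=1)
            - 0.5 * n_data * np.log(2 * np.pi))

def smc_log_evidence(n=5000, steps=30):
    """Anneal from the N(0,1) prior to the posterior, accumulating
    the log normalizing constant along the way."""
    theta = rng.normal(0.0, 1.0, size=n)
    logZ = 0.0
    gammas = np.linspace(0.0, 1.0, steps + 1)
    for g0, g1 in zip(gammas[:-1], gammas[1:]):
        logw = (g1 - g0) * log_lik(theta)
        m = logw.max()
        logZ += m + np.log(np.mean(np.exp(logw - m)))   # evidence increment
        w = np.exp(logw - m)
        theta = theta[rng.choice(n, size=n, p=w / w.sum())]
        # Random-walk MH move targeting prior(theta) * L(theta)^g1
        prop = theta + rng.normal(0.0, 0.3, size=n)
        log_a = (g1 * (log_lik(prop) - log_lik(theta))
                 + 0.5 * (theta**2 - prop**2))
        accept = np.log(rng.uniform(size=n)) < log_a
        theta[accept] = prop[accept]
    return logZ

logZ = np.array([log_lik(0.0)[0], smc_log_evidence()])
post = np.exp(logZ - logZ.max())
post /= post.sum()   # posterior over the model index (uniform model prior)
print(post)
```

The point made in the abstract is that these per-model samplers are the same ones needed for within-model parameter inference, so the posterior over the model index comes at no extra computational cost.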

## Static-parameter estimation in piecewise deterministic processes using particle Gibbs samplers

### Annals of the Institute of Statistical Mathematics (2014-06-01) 66: 577-609 , June 01, 2014

We develop particle Gibbs samplers for static-parameter estimation in discretely observed piecewise deterministic processes (PDPs). PDPs are stochastic processes that jump randomly at a countable number of stopping times but otherwise evolve deterministically in continuous time. A sequential Monte Carlo (SMC) sampler for filtering in PDPs has recently been proposed. We first provide new insight into the consequences of an approximation inherent within that algorithm. We then derive a new representation of the algorithm that simplifies ensuring that the importance weights exist and also allows the use of variance-reduction techniques known as backward and ancestor sampling. Finally, we propose a novel Gibbs step that improves mixing in particle Gibbs samplers whose SMC algorithms make use of large collections of auxiliary variables, such as many instances of SMC samplers. We provide a comparison between the two particle Gibbs samplers for PDPs developed in this paper. Simulation results indicate that they can outperform reversible-jump MCMC approaches.
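The conditional SMC kernel at the heart of any particle Gibbs sampler can be sketched for a simple linear-Gaussian state-space model, which here stands in for the much more involved PDP setting of the paper (the model, its parameters, and the absence of ancestor sampling are all simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 25, 100
phi, sx, sy = 0.9, 1.0, 0.5   # AR(1) latent states, noisy observations

# Simulate data from x_t = phi * x_{t-1} + v_t,  y_t = x_t + e_t
x_true = np.zeros(T)
x_true[0] = rng.normal(0.0, sx)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + rng.normal(0.0, sx)
y = x_true + rng.normal(0.0, sy, size=T)

def csmc(y, ref):
    """One conditional SMC sweep: particle 0 is clamped to the
    reference trajectory; returns a freshly sampled trajectory."""
    X = np.zeros((n, T))
    A = np.zeros((n, T), dtype=int)        # ancestor indices
    X[:, 0] = rng.normal(0.0, sx, size=n)
    X[0, 0] = ref[0]
    logw = -0.5 * ((y[0] - X[:, 0]) / sy)**2
    for t in range(1, T):
        w = np.exp(logw - logw.max())
        anc = rng.choice(n, size=n, p=w / w.sum())
        anc[0] = 0                         # keep the reference lineage
        X[:, t] = phi * X[anc, t - 1] + rng.normal(0.0, sx, size=n)
        X[0, t] = ref[t]                   # clamp the reference state
        A[:, t] = anc
        logw = -0.5 * ((y[t] - X[:, t]) / sy)**2
    # Draw one index at the final time and trace its ancestry back
    w = np.exp(logw - logw.max())
    k = rng.choice(n, p=w / w.sum())
    traj = np.zeros(T)
    for t in range(T - 1, -1, -1):
        traj[t] = X[k, t]
        k = A[k, t]
    return traj

ref = np.zeros(T)
for _ in range(20):
    ref = csmc(y, ref)   # each sweep leaves the smoothing posterior invariant
```

Ancestor sampling, which the paper's new representation makes possible for PDPs, would additionally resample the reference particle's ancestor `anc[0]` at each step to combat the path degeneracy visible in plain conditional SMC.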