## SEARCH

#### Country (top results of 261)

- United States: 3820
- Germany: 1564
- Canada: 1070
- United Kingdom: 1059

#### Institution (top results of 9889)

- The Institute of Statistical Mathematics: 212
- University of California: 206
- Indian Statistical Institute: 182
- McMaster University: 119
- University of Washington: 107

#### Author (top results of 32053)

- Balakrishnan, N.: 82
- Nadarajah, Saralees: 82
- Akaike, Hirotugu: 46
- Grams, Ralph R.: 46
- Heyer, H.: 44

#### Publication (top results of 51)

- Annals of the Institute of Statistical Mathematics: 2988
- Journal of Medical Systems: 2297
- Metrika: 2043
- Statistical Papers: 1702
- Statistics and Computing: 1244

#### Subject (top results of 53)

- Statistics (active filter): 21342
- Statistics, general: 14180
- Statistics for Business/Economics/Mathematical Finance/Insurance: 10940
- Probability Theory and Stochastic Processes: 5775
- Economic Theory: 5110

## CURRENTLY DISPLAYING:

Showing 1 to 10 of 21342 matching Articles

## Bayesian model comparison with un-normalised likelihoods

### Statistics and Computing (2017-03-01) 27: 403-422 , March 01, 2017

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of *biased* weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented; some support for the use of biased estimates emerges, but we advocate caution in their use.
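The evidence (marginal likelihood) that such Bayes-factor estimators target can be illustrated with plain importance sampling on a toy conjugate model where the answer is known in closed form. This is a minimal sketch of the general idea only, not the paper's random-weight SMC scheme, and all names in it are illustrative:

```python
import math
import random

random.seed(1)

def norm_pdf(x, mu, var):
    """Density of a normal distribution with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Toy model: theta ~ N(0, 1) prior, one observation y ~ N(theta, 1).
y = 1.3

# Importance sampling with the prior as proposal:
# evidence Z = E_prior[ p(y | theta) ], estimated by a Monte Carlo average.
n = 200_000
z_hat = sum(norm_pdf(y, random.gauss(0, 1), 1.0) for _ in range(n)) / n

# For this conjugate toy model the evidence is available analytically: N(y; 0, 2).
z_true = norm_pdf(y, 0.0, 2.0)
```

If the exact `norm_pdf` likelihood call is replaced by an unbiased non-negative estimate of it, the average remains unbiased for the evidence; that random-weight substitution is what the paper builds on for models whose likelihood cannot be evaluated.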

## Estimating conception statistics using gestational age information from NHS Numbers for Babies data

### Health Statistics Quarterly (2009-05-01) 41: 21-27 , May 01, 2009

Conception statistics routinely published for England and Wales include pregnancies that result in one or more live births or stillbirths (a maternity) or in an abortion. All live births are assumed to be of 38 weeks' gestation, as information on gestation is not collected at birth registration. For the first time, gestational age information from the National Health Service (NHS) Numbers for Babies (NN4B) data has been used to re-estimate conception statistics for 2005. This shows that 72 per cent of conceptions leading to a maternity in fact have a gestation period that differs from 38 weeks, and most of these fall at either 37 or 39 weeks. The age-specific conception rates produced using this revised method are not significantly different from those produced using the current method.
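At its core the revision is simple date arithmetic: back-calculate the conception date from the recorded gestational age rather than a fixed 38 weeks. A schematic sketch only; the real ONS methodology involves further adjustments, and the function name here is illustrative:

```python
from datetime import date, timedelta

def estimated_conception_date(birth_date, gestation_weeks=38):
    """Back-calculate an estimated conception date from a birth date.

    The current method fixes gestation_weeks at 38 for every live birth;
    with NN4B data the recorded gestational age can be used instead.
    """
    return birth_date - timedelta(weeks=gestation_weeks)

# Under the fixed assumption every birth maps back exactly 38 weeks;
# a recorded 37-week gestation implies a conception one week later.
old = estimated_conception_date(date(2005, 6, 15))
revised = estimated_conception_date(date(2005, 6, 15), gestation_weeks=37)
```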

## Sequential Monte Carlo for counting vertex covers in general graphs

### Statistics and Computing (2016-05-01) 26: 591-607 , May 01, 2016

In this paper we describe a sequential importance sampling (SIS) procedure for counting the number of vertex covers in general graphs. The optimal SIS proposal distribution is the uniform distribution over a suitably restricted set, but it is not implementable. We consider two proposal distributions as approximations to the optimal one, both based on randomization techniques. The first randomization is the classic probability model of random graphs, and the resulting SIS algorithm in fact shows polynomial complexity for random graphs. The second randomization introduces a probabilistic relaxation technique that uses dynamic programming. Numerical experiments show that the resulting SIS algorithm enjoys excellent practical performance in comparison with existing methods. In particular, the method is compared with *cachet*, an exact model counter, and with the state-of-the-art *SampleSearch*, which is based on belief networks and importance sampling.
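A bare-bones version of SIS counting, using the simplest possible proposal (uniform over the locally valid choices at each vertex), can be sketched as follows. This illustrates the general SIS counting idea only, not the paper's randomized proposals:

```python
import random

def sis_count_vertex_covers(n, edges, samples=20_000, seed=0):
    """SIS estimate of the number of vertex covers of a graph on n vertices.

    Vertices are assigned in or out of the cover in index order. Vertex i is
    forced into the cover whenever some earlier neighbour was left out
    (otherwise that edge could never be covered); when both choices remain
    valid, one is picked uniformly. The product of the number of valid
    choices along a sample path is an unbiased estimate of the count.
    """
    rng = random.Random(seed)
    earlier = [[] for _ in range(n)]          # neighbours with smaller index
    for u, v in edges:
        earlier[max(u, v)].append(min(u, v))
    total = 0.0
    for _ in range(samples):
        x, weight = [0] * n, 1.0
        for i in range(n):
            if any(x[j] == 0 for j in earlier[i]):
                x[i] = 1                      # forced: only one valid choice
            else:
                x[i] = rng.randint(0, 1)      # free: two valid choices
                weight *= 2.0
        total += weight
    return total / samples

# The triangle K3 has exactly 4 vertex covers: the three pairs and the full set.
est = sis_count_vertex_covers(3, [(0, 1), (0, 2), (1, 2)])
```

Every sample path produces a valid cover by construction, and the weight is the reciprocal of its proposal probability, so the average is unbiased for the exact count.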

## Editorial

### Statistical Methods and Applications (2007-06-01) 16: 5 , June 01, 2007

## The MAP test for multimodality

### Journal of Classification (1994-03-01) 11: 5-36 , March 01, 1994

We introduce a test for detecting multimodality in distributions based on minimal constrained spanning trees. We define a Minimal Ascending Path Spanning Tree (MAPST) on a set of points as a spanning tree that has the minimal possible sum of link lengths, with the constraint that, starting from any link, the lengths of the links are non-increasing towards a root node. We similarly define MAPSTs with more than one root. We present some algorithms for finding such trees. Based on these trees, we devise a test for multimodality, called the MAP Test (for Minimal Ascending Path). Using simulations, we estimate percentage points of the MAP statistic and assess the power of the test. Finally, we illustrate the use of MAPSTs for determining the number of modes in a distribution of positions of galaxies on photographic plates from a rich galaxy cluster.

## A Cautionary Note on Likelihood Ratio Tests in Mixture Models

### Annals of the Institute of Statistical Mathematics (2000-09-01) 52: 481-487 , September 01, 2000

We show that iterative methods for maximizing the likelihood in a mixture of exponentials model depend strongly on their particular implementation. Different starting strategies and stopping rules yield completely different estimators of the parameters. This is demonstrated for the likelihood ratio test of homogeneity against two-component exponential mixtures, when the test statistic is calculated by the EM algorithm.
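The implementation dependence described above is easy to probe with a textbook EM algorithm for a two-component exponential mixture: different starting values (and stopping rules) can lead the iteration to different fitted parameters. This sketch is illustrative of that setup, not the authors' exact experiment:

```python
import math
import random

def em_exp_mixture(x, p, r1, r2, iters=200):
    """EM for the mixture f(x) = p*r1*exp(-r1*x) + (1-p)*r2*exp(-r2*x)."""
    for _ in range(iters):
        # E-step: posterior probability each point came from component 1
        resp = []
        for xi in x:
            a = p * r1 * math.exp(-r1 * xi)
            b = (1 - p) * r2 * math.exp(-r2 * xi)
            resp.append(a / (a + b))
        # M-step: responsibility-weighted maximum likelihood updates
        s = sum(resp)
        p = s / len(x)
        r1 = s / sum(ri * xi for ri, xi in zip(resp, x))
        r2 = (len(x) - s) / sum((1 - ri) * xi for ri, xi in zip(resp, x))
    return p, r1, r2

def loglik(x, p, r1, r2):
    return sum(math.log(p * r1 * math.exp(-r1 * xi)
                        + (1 - p) * r2 * math.exp(-r2 * xi)) for xi in x)

rng = random.Random(42)
data = [rng.expovariate(1.0) if rng.random() < 0.5 else rng.expovariate(5.0)
        for _ in range(500)]

# Two different starting points; EM guarantees the log-likelihood never
# decreases, but the end point (and hence the LRT statistic) can depend
# on where the iteration starts and when it is stopped.
fit_a = em_exp_mixture(data, 0.5, 0.5, 2.0)
fit_b = em_exp_mixture(data, 0.3, 2.0, 0.5)
```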

## Local expectations of the population spectral distribution of a high-dimensional covariance matrix

### Statistical Papers (2014-05-01) 55: 563-573 , May 01, 2014

This paper discusses the relationship between the population spectral distribution and the limit of the empirical spectral distribution in high-dimensional situations. When the support of the limiting spectral distribution splits into several intervals, the population spectral distribution admits a corresponding meaningful division, and general functional expectations over each part of this division, referred to as local expectations, can be formulated as contour integrals around these intervals. Based on this, we present consistent estimators of the local expectations and prove a central limit theorem for them. The results are then used to analyze an estimator of the population spectral distribution from the recent literature.

## Computational testing algorithmic procedure of assessment for lifetime performance index of Pareto products under progressive type I interval censoring

### Computational Statistics (2017-03-09): 1-20 , March 09, 2017

Process capability indices have been widely used to evaluate process performance in the continuous improvement of quality and productivity. When the lifetime of products follows a one-parameter Pareto distribution, the larger-the-better lifetime performance index is considered. The maximum likelihood estimator is used to estimate the lifetime performance index based on a progressive type I interval censored sample, and the asymptotic distribution of this estimator is also investigated. We use this estimator to develop a new hypothesis-testing algorithmic procedure under the condition of a known lower specification limit. Finally, two practical examples are given to illustrate the use of this testing procedure to determine whether the process is capable.

## Statistics of random processes I: General theory

### Metrika (1983-12-01) 30: 100 , December 01, 1983

## Goodness-of-fit tests for semiparametric and parametric hypotheses based on the probability weighted empirical characteristic function

### Statistical Papers (2016-12-01) 57: 957-976 , December 01, 2016

We investigate the finite-sample properties of certain procedures which employ the novel notion of the probability weighted empirical characteristic function. The procedures considered are: (1) testing for symmetry in regression, (2) testing for multivariate normality with independent observations, and (3) testing for multivariate normality of random effects in mixed models. Along with the new tests, alternative methods based on the ordinary empirical characteristic function, as well as other well-known procedures, are implemented for the purpose of comparison.
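The ordinary empirical characteristic function that these new tests are benchmarked against is a one-liner; the probability-weighted variant modifies it with data-dependent weights, which we do not attempt to reproduce here. A minimal sketch:

```python
import cmath

def ecf(data, t):
    """Ordinary empirical characteristic function:
    phi_n(t) = (1/n) * sum_j exp(i * t * x_j)."""
    return sum(cmath.exp(1j * t * x) for x in data) / len(data)

# Two basic properties: phi_n(0) = 1 exactly, and |phi_n(t)| <= 1 for all t.
# For a sample that is exactly symmetric about zero, the imaginary part
# vanishes, which is the feature symmetry tests of this kind exploit.
```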