## SEARCH

#### Country

##### (see all 118)

- United States: 3881
- Germany: 1527
- United Kingdom: 1090
- Canada: 1075

#### Institution

##### (see all 9655)

- The Institute of Statistical Mathematics: 214
- University of California: 193
- Indian Statistical Institute: 183
- McMaster University: 124
- University of Connecticut: 106

#### Author

##### (see all 22504)

- Balakrishnan, N.: 86
- Nadarajah, Saralees: 84
- Akaike, Hirotugu: 46
- Heyer, H.: 44
- Krämer, Walter: 44

#### Publication

##### (see all 50)

- Annals of the Institute of Statistical Mathematics: 3001
- Metrika: 2254
- Statistical Papers: 1746
- Statistics and Computing: 1274
- Journal of Medical Systems: 1076

#### Subject

##### (see all 49)

- Statistics [selected]: 20924
- Statistics, general: 14709
- Statistics for Business/Economics/Mathematical Finance/Insurance: 9478
- Probability Theory and Stochastic Processes: 6254
- Economic Theory/Quantitative Economics/Mathematical Methods: 3559

## CURRENTLY DISPLAYING

Showing 1 to 10 of 20924 matching articles.

## Bayesian model comparison with un-normalised likelihoods

### Statistics and Computing (2017-03-01) 27: 403-422

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of *biased* weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented; it offers some support for the use of biased estimates, though we advocate caution.
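A generic sketch of the random-weight idea, not the authors' algorithm, using a toy model where the answer is known in closed form: if an intractable likelihood is replaced inside an importance-sampling evidence estimate by an unbiased simulation-based estimate of it, the evidence estimate remains unbiased.

```python
import math
import random

def evidence_estimate(n, rng):
    # Toy evidence Z = E_prior[L(theta)] with prior N(0,1) and
    # L(theta) = exp(-theta^2 / 2), so Z = 1/sqrt(2) exactly.
    # Pretend L is intractable and use an unbiased "random weight"
    # estimate L_hat = L * V with E[V] = 1 in its place.
    total = 0.0
    for _ in range(n):
        theta = rng.gauss(0.0, 1.0)           # draw from the prior
        v = rng.uniform(0.5, 1.5)             # multiplicative noise, mean 1
        total += math.exp(-theta**2 / 2) * v  # unbiased likelihood estimate
    return total / n

z_hat = evidence_estimate(50_000, random.Random(0))
# z_hat stays close to 1/sqrt(2) ~ 0.7071 despite the noisy weights
```

A ratio of two such evidence estimates gives a Bayes factor estimate; the variance added by the weight noise is the price paid for avoiding the exact likelihood.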

## Estimating conception statistics using gestational age information from NHS Numbers for Babies data

### Health Statistics Quarterly (2009-05-01) 41: 21-27

Conception statistics routinely published for England and Wales include pregnancies that result in one or more live- or stillbirths (a maternity) or an abortion. All live births are assumed to have a gestation of 38 weeks, as gestational age is not collected at birth registration. For the first time, gestational age information from the National Health Service (NHS) Numbers for Babies (NN4B) data has been used to re-estimate conception statistics for 2005. This shows that 72 per cent of conceptions leading to a maternity in fact have a gestation period that differs from 38 weeks, and most of these fall at either 37 or 39 weeks. The age-specific conception rates using this revised method are not significantly different from those produced using the current method.
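The revision rests on simple date arithmetic: the conception date is estimated by counting the gestation period back from the date of birth, so a recorded 37- or 39-week gestation shifts the estimate a week either side of the 38-week assumption. A minimal sketch (the function name is illustrative, not taken from the official methodology):

```python
from datetime import date, timedelta

def estimated_conception(birth_date, gestation_weeks):
    # Count the gestation period back from the date of birth.
    return birth_date - timedelta(weeks=gestation_weeks)

birth = date(2005, 9, 30)
estimated_conception(birth, 38)  # assumed 38 weeks  -> date(2005, 1, 7)
estimated_conception(birth, 37)  # recorded 37 weeks -> date(2005, 1, 14)
```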

## Sequential Monte Carlo for counting vertex covers in general graphs

### Statistics and Computing (2016-05-01) 26: 591-607

In this paper we describe a sequential importance sampling (SIS) procedure for counting the number of vertex covers in general graphs. The optimal SIS proposal distribution is uniform over a suitably restricted set, but it is not implementable. We consider two proposal distributions as approximations to the optimal one, both based on randomization techniques. The first randomization uses the classic probability model of random graphs, and the resulting SIS algorithm in fact shows polynomial complexity for random graphs. The second randomization introduces a probabilistic relaxation technique that uses dynamic programming. Numerical experiments show that the resulting SIS algorithm enjoys excellent practical performance in comparison with existing methods. In particular, the method is compared with *cachet*, an exact model counter, and the state-of-the-art *SampleSearch*, which is based on belief networks and importance sampling.
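As a toy illustration of the basic SIS counting idea (not the paper's randomized proposals): assign vertices one at a time, allow "exclude" only when no already-excluded neighbour would leave an edge uncovered, and weight each sample by the product of the number of valid choices at each step. The average weight is then an unbiased estimate of the number of vertex covers.

```python
import random

def count_covers_brute(n, edges):
    # Exhaustive count of vertex covers of a graph on vertices 0..n-1.
    return sum(
        all((mask >> u) & 1 or (mask >> v) & 1 for u, v in edges)
        for mask in range(1 << n)
    )

def sis_cover_estimate(n, edges, samples, rng):
    # Vertices are assigned in order; excluding v is valid only if no
    # already-excluded neighbour exists, so every completed assignment
    # is a vertex cover.  The product of the number of valid choices
    # per step is an unbiased estimate of the cover count.
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    total = 0.0
    for _ in range(samples):
        excluded, weight = set(), 1
        for v in range(n):
            choices = [True]  # including v in the cover is always valid
            if not (adj[v] & excluded):
                choices.append(False)
            weight *= len(choices)
            if not rng.choice(choices):
                excluded.add(v)
        total += weight
    return total / samples

edges = [(0, 1), (1, 2)]  # path on 3 vertices: 5 vertex covers
count_covers_brute(3, edges)                             # -> 5
sis_cover_estimate(3, edges, 20_000, random.Random(0))   # close to 5
```

Every cover is reachable with probability equal to the reciprocal of its weight, which is what makes the estimator unbiased; the practical question, which the paper's proposals address, is keeping the weight variance under control on large graphs.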

## Editorial

### Statistical Methods and Applications (2007-06-01) 16: 5

## The MAP test for multimodality

### Journal of Classification (1994-03-01) 11: 5-36

We introduce a test for detecting multimodality in distributions based on minimal constrained spanning trees. We define a Minimal Ascending Path Spanning Tree (MAPST) on a set of points as a spanning tree that has the minimal possible sum of link lengths, subject to the constraint that, starting from any link, link lengths are non-increasing towards a root node. We define MAPSTs with more than one root similarly. We present some algorithms for finding such trees. Based on these trees, we devise a test for multimodality, called the MAP test (for Minimal Ascending Path). Using simulations, we estimate percentage points of the MAP statistic and assess the power of the test. Finally, we illustrate the use of MAPSTs for determining the number of modes in a distribution of positions of galaxies on photographic plates from a rich galaxy cluster.

## A Cautionary Note on Likelihood Ratio Tests in Mixture Models

### Annals of the Institute of Statistical Mathematics (2000-09-01) 52: 481-487

We show that iterative methods for maximizing the likelihood in a mixture of exponentials model depend strongly on their particular implementation. Different starting strategies and stopping rules yield completely different estimators of the parameters. This is demonstrated for the likelihood ratio test of homogeneity against two-component exponential mixtures, when the test statistic is calculated by the EM algorithm.
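The sensitivity described above is easy to reproduce with a bare-bones EM for a two-component exponential mixture (a generic sketch, not the authors' exact experiment): each run increases the log-likelihood monotonically, yet different starting values can settle on different parameter estimates, and any likelihood ratio statistic built from the fits inherits that dependence.

```python
import math
import random

def em_exp_mixture(x, p, lam1, lam2, iters=100):
    # EM for f(x) = p*lam1*exp(-lam1*x) + (1-p)*lam2*exp(-lam2*x).
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation.
        r = []
        for xi in x:
            a = p * lam1 * math.exp(-lam1 * xi)
            b = (1 - p) * lam2 * math.exp(-lam2 * xi)
            r.append(a / (a + b))
        # M-step: closed-form updates for the mixture parameters.
        s = sum(r)
        p = s / len(x)
        lam1 = s / sum(ri * xi for ri, xi in zip(r, x))
        lam2 = (len(x) - s) / sum((1 - ri) * xi for ri, xi in zip(r, x))
    return p, lam1, lam2

def loglik(x, p, lam1, lam2):
    return sum(math.log(p * lam1 * math.exp(-lam1 * xi)
                        + (1 - p) * lam2 * math.exp(-lam2 * xi)) for xi in x)

rng = random.Random(1)
data = [rng.expovariate(1.0) for _ in range(200)]  # homogeneous (one-component) data
fit_a = em_exp_mixture(data, 0.5, 0.5, 2.0)
fit_b = em_exp_mixture(data, 0.9, 1.0, 5.0)  # a different starting point
# fit_a and fit_b need not agree, even though both runs raised the log-likelihood.
```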

## Local expectations of the population spectral distribution of a high-dimensional covariance matrix

### Statistical Papers (2014-05-01) 55: 563-573

This paper discusses the relationship between the population spectral distribution and the limit of the empirical spectral distribution in high-dimensional situations. When the support of the limiting spectral distribution splits into several intervals, the population spectral distribution inherits a corresponding division, and general functional expectations over each part of that division, referred to as local expectations, can be formulated as contour integrals around those intervals. Based on this, we present consistent estimators of the local expectations and prove a central limit theorem for them. The results are then used to analyze an estimator of the population spectral distribution from the recent literature.

## Statistics of random processes I: General theory

### Metrika (1983-12-01) 30: 100

## A semiparametric regression cure model for doubly censored data

### Lifetime Data Analysis (2017-09-01): 1-17

This paper discusses regression analysis of doubly censored failure time data when there may exist a cured subgroup. By doubly censored data, we mean that the failure time of interest denotes the elapsed time between two related events and the observations on both event times can suffer censoring (Sun in The statistical analysis of interval-censored failure time data. Springer, New York, 2006). One typical example of such data is given by an acquired immune deficiency syndrome cohort study. Although many methods have been developed for their analysis (De Gruttola and Lagakos in Biometrics 45:1–12, 1989; Sun et al. in Biometrics 55:909–914, 1999; 60:637–643, 2004; Pan in Biometrics 57:1245–1250, 2001), there does not seem to be an established method for the situation with a cured subgroup. This paper addresses this latter problem and presents a sieve approximation maximum likelihood approach. In addition, the asymptotic properties of the resulting estimators are established, and an extensive simulation study indicates that the method works well in practical situations. An application is also provided.

## Goodness-of-fit tests for semiparametric and parametric hypotheses based on the probability weighted empirical characteristic function

### Statistical Papers (2016-12-01) 57: 957-976

We investigate the finite-sample properties of certain procedures that employ the novel notion of the probability weighted empirical characteristic function. The procedures considered are: (1) testing for symmetry in regression, (2) testing for multivariate normality with independent observations, and (3) testing for multivariate normality of random effects in mixed models. Along with the new tests, alternative methods based on the ordinary empirical characteristic function, as well as other better-known procedures, are implemented for comparison.
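For orientation, the ordinary empirical characteristic function that the comparison methods build on is a one-liner; the probability-weighted variant studied in the paper adds a data-dependent weight and is not reproduced here.

```python
import cmath

def ecf(x, t):
    # Empirical characteristic function: (1/n) * sum_j exp(i * t * x_j).
    return sum(cmath.exp(1j * t * xj) for xj in x) / len(x)

# A sample symmetric about zero gives a real-valued ECF:
ecf([-1.0, 1.0], 1.0)  # (cos 1) + 0j
```

Symmetry tests of the kind listed above exploit exactly this property: under symmetry the imaginary part of the ECF should be close to zero across a range of arguments t.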