## Have Very Large Data Bases Methodological Relevance?

### Conceptual and Numerical Analysis of Data (1989-01-01): 311-326 , January 01, 1989

The Humanities, like all provinces of Academia, have been considerably influenced by the “micro-computer revolution” of the last few decades. While the Humanities, like other fields, have their fair share of people who are doubtful about the praiseworthiness of these developments on more general grounds, a specific argument has been raised in the Humanities which, in our opinion, is fairly unique to them.

## Sampling Forest Canopy Arthropods Available to Birds as Prey

### Estimation and Analysis of Insect Populations (1989-01-01) 55: 436-444 , January 01, 1989

A method of sampling canopy arthropods using direct observation is presented whereby the abundance of different types of arthropods (flying, foliage dwelling, etc.) can be directly compared because the same sampling unit (numbers/leaf area) is used. Total leaf surface area in patches of directly observable foliage was determined by using leaf length/surface area equations. Estimates of the relative abundance of different arthropod taxa were compared with the percent composition of those taxa in the diets of canopy-foraging bird species. A total of 6,802 arthropods was identified in 1,404 samples containing 197,964 leaves. Examples of arthropod abundance estimation and assessment of use versus availability of arthropod prey are provided from one month (May 1986) of the study.
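As a rough illustration of the sampling unit, total leaf area in an observed patch can be reconstructed from leaf lengths with an allometric length/surface-area equation and divided into the arthropod count. A minimal sketch; the coefficients `a` and `b` below are illustrative placeholders, not the equations fitted in the paper:

```python
def density_per_leaf_area(n_arthropods, leaf_lengths_cm, a=0.5, b=1.9):
    """Arthropods per unit leaf area. Total leaf surface area is
    reconstructed from leaf lengths via an allometric equation
    area = a * length**b (coefficients here are illustrative only)."""
    total_area = sum(a * length ** b for length in leaf_lengths_cm)  # cm^2
    return n_arthropods / total_area

# 100 observed leaves, 12 arthropods counted on them
d = density_per_leaf_area(12, [4.0, 5.5, 3.8, 6.1] * 25)
```

Because every taxon is expressed in the same unit (numbers per leaf area), such densities can be compared directly across flying and foliage-dwelling groups.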

## Gravitational Antenna Bandwidths and Cross Sections

### Gravitational Wave Data Analysis (1989-01-01) 253: 195-199 , January 01, 1989

Elastic solid antennas are considered. It is shown that many-body coherence effects lead to cross sections for pulses much larger than earlier estimates. For monochromatic continuous signals, the new approach gives the same cross sections as those derived in 1960.

A new method for data recording and analysis is proposed. This gives bandwidths several orders of magnitude larger than are available by recording the output of an optimal filter.

Means are suggested for cascading half-acoustic-wavelength antennas to give increased cross sections.

Very low temperatures offer methods for selecting antenna quantum states, which imply that the signal-to-noise ratio can be increased beyond any presently conceived limit.

## Small sample properties of asymptotically equivalent tests for autoregressive conditional heteroskedasticity

### Statistical Papers (1989-12-01) 30: 105-131 , December 01, 1989

Models that allow for autoregressive conditional heteroskedasticity (ARCH) in the error process have recently found widespread application. The purpose of this paper is to evaluate, through Monte Carlo analysis, the small sample properties of an exact Lagrange multiplier test for the presence of ARCH, and to compare the power of this test with that of an asymptotically equivalent *TR*² version. The comparison involves first- and higher-order variants of these processes. The results indicate substantial power differentials in favor of the exact LM test, by up to 15% for sample sizes smaller than 100.
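The asymptotically equivalent *TR*² test mentioned above is the standard auxiliary-regression form: regress the squared residuals on an intercept and *q* of their own lags, and refer *T·R*² to a χ²(*q*) distribution under the null of no ARCH. A minimal sketch, not the authors' code:

```python
import numpy as np

def arch_lm_stat(resid, q=1):
    """TR^2 Lagrange multiplier statistic for ARCH(q).

    Regress squared residuals on an intercept and q of their own lags;
    under the null of no ARCH, T * R^2 is asymptotically chi-squared(q).
    """
    e2 = np.asarray(resid, dtype=float) ** 2
    y = e2[q:]                          # dependent variable: e_t^2
    T = y.size
    X = np.column_stack([np.ones(T)] +
                        [e2[q - j:-j] for j in range(1, q + 1)])  # lagged e^2
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta                    # auxiliary-regression residuals
    r2 = 1.0 - (u @ u) / ((y - y.mean()) ** 2).sum()
    return T * r2

rng = np.random.default_rng(0)
e = rng.standard_normal(500)            # homoskedastic errors: no ARCH
stat = arch_lm_stat(e, q=1)
print(round(stat, 2))                   # compare with the chi2(1) 5% critical value 3.84
```

The exact LM test the paper favors requires the finite-sample distribution of the statistic; the sketch above only shows the asymptotic *TR*² competitor.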

## Introduction

### Nonparametric Estimation of Probability Densities and Regression Curves (1989-01-01) 20: 1-17 , January 01, 1989

The theory of statistical estimation is one of the basic branches of mathematical statistics. This theory is subdivided into parametric and nonparametric estimation. A nonparametric procedure is usually defined as a procedure which is valid independently of the distribution of the sampled observations. The problem of estimating the functional characteristics of a distribution law of observations belongs to problems of nonparametric estimation. In particular, in recent years there has been a growing interest in problems of estimating probability density and regression curves. This monograph is devoted to a study of these problems.
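A typical nonparametric density estimator of the kind such a monograph studies is the Rosenblatt-Parzen kernel estimate, f_hat(x) = (1/(n h)) * sum_i K((x - X_i)/h). A minimal sketch with a Gaussian kernel; the bandwidth is chosen arbitrarily for illustration:

```python
import numpy as np

def kde(x_grid, sample, h):
    """Rosenblatt-Parzen kernel density estimate with a Gaussian kernel:
    f_hat(x) = (1/(n*h)) * sum_i K((x - X_i)/h)."""
    sample = np.asarray(sample, dtype=float)
    u = (x_grid[:, None] - sample[None, :]) / h      # (grid, n) scaled differences
    K = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)   # Gaussian kernel values
    return K.mean(axis=1) / h

rng = np.random.default_rng(1)
sample = rng.standard_normal(1000)
grid = np.linspace(-3, 3, 61)
f_hat = kde(grid, sample, h=0.3)
# the estimate at 0 should be near the true N(0,1) density 1/sqrt(2*pi) ~ 0.399
print(round(f_hat[30], 3))
```

No parametric family is assumed: the estimate is valid whatever distribution generated the sample, which is exactly the sense of "nonparametric" defined in the passage.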

## Stopping rules, permutation invariance and sufficiency principle

### Annals of the Institute of Statistical Mathematics (1989-03-01) 41: 121-138 , March 01, 1989

In the context of sequential (point as well as interval) estimation, a general formulation of permutation-invariant stopping rules is considered. These stopping rules lead to savings in the ASN at the cost of some elevation of the associated risk—a phenomenon which may be attributed to the violation of the sufficiency principle. For the (point and interval) sequential estimation of the mean of a normal distribution, it is shown that such permutation-invariant stopping rules may lead to a substantial saving in the ASN with only a small increase in the associated risk.

## Intervention Analysis in Multivariate Time Series via the Kalman Filter

### Estimation and Analysis of Insect Populations (1989-01-01) 55: 389-403 , January 01, 1989

The Kalman filter has been found useful in fitting ARIMA models. This paper uses the multivariate extension of the state-space approach to fit multivariate time series which include covariates and periodic interventions. An example will be presented in which these techniques are applied to the study of the pink cotton boll worm moth, *Pectinophora gossypiella* (Saunders), (*Lepidoptera: Gelechiidae*). Moth counts were taken at various locations in a large cotton field over time. A special type of multivariate time series will be used, namely a spatial time series. The amount of irrigation water at each location will be included in the model as a covariate and periodic applications of insecticide will be included as interventions. Previously developed techniques for handling missing data, aggregate data, and nonlinear data transformation are also incorporated. Maximum likelihood estimates of the model parameters are obtained by embedding the filter in a quasi-Newton optimization routine.
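The filtering recursion at the core of such state-space fitting can be sketched for the simplest univariate local-level model, including the prediction-only step taken when an observation is missing. This is an illustrative sketch under arbitrary noise variances, not the multivariate filter of the paper:

```python
import numpy as np

def kalman_filter(y, q, r, m0=0.0, p0=1e6):
    """Univariate local-level Kalman filter: state x_t = x_{t-1} + w_t,
    observation y_t = x_t + v_t, with Var(w) = q and Var(v) = r.
    NaN observations are treated as missing (prediction step only)."""
    m, p = m0, p0                       # state mean and variance
    filtered = []
    for obs in y:
        p = p + q                       # time update (random-walk state)
        if not np.isnan(obs):           # measurement update, skipped if missing
            k = p / (p + r)             # Kalman gain
            m = m + k * (obs - m)
            p = (1 - k) * p
        filtered.append(m)
    return np.array(filtered)

y = np.array([1.0, 1.2, np.nan, 0.9, 1.1])   # one missing count
est = kalman_filter(y, q=0.01, r=0.1)
```

Maximum likelihood estimation then amounts to running such a filter inside an optimizer over (q, r), accumulating the prediction-error likelihood at each step.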

## Closer estimators of a common mean in the sense of Pitman

### Annals of the Institute of Statistical Mathematics (1989-09-01) 41: 477-484 , September 01, 1989

Consider the problem of estimating the common mean of two normal populations with different unknown variances. Suppose a random sample of size *m* is drawn from the first population and a random sample of size *n* is drawn from the second population. The paper gives a family of estimators closer than the sample mean of the first population in the sense of Pitman (1937, *Proc. Cambridge Phil. Soc.*, *33*, 212–222). In particular, the Graybill-Deal estimator (1959, *Biometrics*, *15*, 543–550) is shown to be closer than each of the sample means if *m* ≥ 5 and *n* ≥ 5.
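The Graybill-Deal estimator referred to above weights each sample mean by the reciprocal of the estimated variance of that mean, s_i^2 / n_i. A minimal sketch:

```python
import numpy as np

def graybill_deal(x, y):
    """Graybill-Deal common-mean estimator: weight each sample mean
    by the reciprocal of its estimated variance, s_i^2 / n_i."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    w1 = len(x) / x.var(ddof=1)         # 1 / (s1^2 / m)
    w2 = len(y) / y.var(ddof=1)         # 1 / (s2^2 / n)
    return (w1 * x.mean() + w2 * y.mean()) / (w1 + w2)

rng = np.random.default_rng(2)
x = rng.normal(5.0, 1.0, size=20)       # common mean, different variances
y = rng.normal(5.0, 3.0, size=30)
est = graybill_deal(x, y)
```

Because both weights are positive, the estimate always lies between the two sample means, giving most weight to the less variable sample.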

## Continuous Effluent Consents: Modelling and Compliance Testing

### Statistical Methods for the Assessment of Point Source Pollution (1989-01-01): 49-64 , January 01, 1989

The control of continuous polluting discharges in the U.K. was placed in a framework of statistically based legislation nearly 10 years ago. Both discharge standards and desired river quality class objectives are assessed within a probabilistic system of pollution control whereby a minimum level of 95 percent compliance with standards has to be achieved. The use of such a statistical framework permits occasional infringements of what would otherwise be fixed standards. This enables the Water Industry to manage river quality without undue risk of prosecution or unnecessary capital expenditure on effluent treatment.

The change to a statistically based system has been a slow process carried out in stages: from the introduction of a river quality classification; setting long term river quality objectives; then setting discharge consents to achieve these objectives; and finally monitoring compliance with the consents and objectives. Each of these stages has required the development of the necessary statistical tools for river quality planning and management. Due to the decentralised nature of the current catchment-based Water Authorities, several different statistical approaches have been adopted. However, if, as is planned for the near future, river quality management is carried out by a national regulatory body, then some rationalisation of current methodologies will have to be undertaken.

This paper introduces and examines current U.K. approaches to river quality management and pollution control. Particular emphasis is placed on the statistical modelling techniques used for consent setting and compliance testing. Some of the commonly used techniques are compared and evaluated. A description is presented of work that is underway to develop a framework for the establishment and assessment of intermittent pollution control criteria.
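A compliance test of the kind described, where at least 95 percent of samples must meet the standard, can be sketched as a one-sided binomial test on the number of exceedances. The significance level and sample figures below are illustrative, not the look-up tables used in UK consent practice:

```python
from math import comb

def exceedances_consistent(n, k, p_fail=0.05, alpha=0.05):
    """One-sided binomial test: with n samples and k exceedances, can the
    claim that the true failure rate is at most p_fail (5%) be rejected?
    Returns True if the discharge remains consistent with 95% compliance."""
    # upper tail P(X >= k) under Binomial(n, p_fail)
    tail = sum(comb(n, j) * p_fail ** j * (1 - p_fail) ** (n - j)
               for j in range(k, n + 1))
    return tail >= alpha

print(exceedances_consistent(50, 3))    # -> True: 3 failures in 50 is tolerable
print(exceedances_consistent(50, 9))    # -> False: 9 failures rejects 95% compliance
```

This is the sense in which a probabilistic consent permits occasional infringements: a small number of exceedances is statistically consistent with the 95 percent standard, while a large number is not.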