Showing 1 to 10 of 125 matching Articles
## Robust estimation of generalized partially linear model for longitudinal data with dropouts

### Annals of the Institute of Statistical Mathematics (2016-10-01) 68: 977–1000

In this paper, we study the robust estimation of generalized partially linear models (GPLMs) for longitudinal data with dropouts, aiming at robustness against outliers. To this end, a weighted likelihood method is first proposed to obtain robust estimates of the parameters in the dropout model describing the missing-data process. Then, a robust inverse probability-weighted generalized estimating equation is developed for robust estimation of the mean model. To approximate the nonparametric function in the GPLM, a regression spline smoothing method is adopted, which linearizes the nonparametric function so that statistical inference can be conducted operationally as if a generalized linear model were used. The asymptotic properties of the proposed estimator are established under some regularity conditions, and simulation studies show its robustness. Finally, the proposed method is applied to a real data set.
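The spline-linearization step can be sketched numerically: once the nonparametric component f(t) is expanded in a spline basis, the partially linear model is fitted like an ordinary linear model. The truncated-linear basis, knots, and toy data below are illustrative assumptions, not the paper's robust weighted procedure.

```python
# Sketch (hypothetical): regression-spline linearization of a partially
# linear model y = x*beta + f(t) + eps. Expanding f in a truncated-linear
# basis f(t) ~ b0 + b1*t + sum_k c_k*(t - knot_k)_+ turns the model into
# an ordinary linear model in the columns (x, basis(t)).
import random

def trunc_basis(t, knots):
    return [1.0, t] + [max(t - k, 0.0) for k in knots]

def solve(A, b):
    # Gaussian elimination with partial pivoting for the normal equations.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_ls(X, y):
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    return solve(XtX, Xty)

random.seed(0)
knots = [0.33, 0.66]  # assumed knot locations
data = []
for _ in range(200):
    x = random.gauss(0, 1)
    t = random.random()
    f_t = 1.5 * t if t < 0.5 else 0.75   # a kinked nonparametric component
    data.append((x, t, 2.0 * x + f_t + random.gauss(0, 0.1)))
X = [[x] + trunc_basis(t, knots) for x, t, _ in data]
y = [row[2] for row in data]
beta = fit_ls(X, y)
print(round(beta[0], 2))  # parametric slope on x, close to the true 2.0
```

Once the basis columns are built, any linear-model machinery (here plain least squares; the paper uses robust weighted estimating equations) applies unchanged.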

## The minimum S-divergence estimator under continuous models: the Basu–Lindsay approach

### Statistical Papers (2015-07-07): 1–32

Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical maximum likelihood based techniques. Recently, Ghosh et al. (A Generalized Divergence for Statistical Inference, 2013a) proposed a general class of divergence measures for robust statistical inference, named the *S*-divergence family. Ghosh (Sankhya A, doi:10.1007/s13171-014-0063-2, 2014) discussed its asymptotic properties for the discrete model of densities. In the present paper, we develop the asymptotic properties of the minimum *S*-divergence estimators under continuous models. Here we use the Basu–Lindsay approach (Ann Inst Stat Math 46:683–705, 1994) of smoothing the model densities which, unlike previous approaches, avoids much of the complication of kernel bandwidth selection. Illustrations are presented to support the performance of the resulting estimators, both in terms of efficiency and robustness, through extensive simulation studies and real data examples.
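The Basu–Lindsay idea can be illustrated with a toy calculation: both the kernel density estimate of the data and the model density are smoothed with the same kernel, so the bandwidth enters both sides of the comparison and its choice matters far less. The simple L2 distance below is only a stand-in for the *S*-divergence family, and all data and settings are assumed for illustration.

```python
# Sketch (hypothetical): Basu-Lindsay smoothing -- compare a kernel density
# estimate of the data against the *kernel-smoothed* model density, using
# the same normal kernel on both sides. An L2 distance stands in for the
# S-divergence here.
import math, random

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def kde(data, h, x):
    return sum(normal_pdf(x, d, h) for d in data) / len(data)

def smoothed_model(x, mu, sd, h):
    # An N(mu, sd^2) model convolved with an N(0, h^2) kernel stays normal.
    return normal_pdf(x, mu, math.sqrt(sd * sd + h * h))

def l2_divergence(data, mu, sd, h, grid):
    step = grid[1] - grid[0]
    return sum((kde(data, h, x) - smoothed_model(x, mu, sd, h)) ** 2
               for x in grid) * step

random.seed(3)
data = [random.gauss(0.0, 1.0) for _ in range(300)]
grid = [-5 + 0.05 * i for i in range(201)]
h = 0.5  # assumed bandwidth; it smooths data AND model alike
losses = {mu: l2_divergence(data, mu, 1.0, h, grid)
          for mu in [-1.0, -0.5, 0.0, 0.5, 1.0]}
print(min(losses, key=losses.get))  # minimized near the true location 0.0
```

Because the model is smoothed with the same kernel as the data, the minimizer targets the true parameter for any reasonable h, which is the point of the approach.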

## On influence diagnostic in univariate elliptical linear regression models

### Statistical Papers (2003-01-01) 44: 23–45

We discuss in this paper the assessment of local influence in univariate elliptical linear regression models. This class includes all symmetric continuous distributions, such as normal, Student-t, Pearson VII, exponential power and logistic, among others. We derive the appropriate matrices for assessing the local influence on the parameter estimates and on predictions by considering as influence measures the likelihood displacement and a distance based on the Pearson residual. Two examples with real data are given for illustration.

## Optimal designs with string property under asymmetric errors and SLS estimation

### Statistical Papers (2016-08-24): 1–14

We consider the optimal design problem when the design space consists of binary vectors with a string property, i.e., a single stretch of ones. This is done in the framework of second-order least squares estimation which is known to outperform ordinary least squares estimation when the error distribution is asymmetric. Analytical as well as computational results on optimal design measures, under the *D*- and *A*-criteria, are obtained. The issue of robustness to the unknown skewness parameter of the error distribution is also explored. Finally, we present several procedures which entail *N*-run designs that are highly efficient, if not optimal.
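The string property pins down a small, structured design space: for length-n binary vectors whose ones form a single contiguous run, there are only n(n+1)/2 candidate points. A minimal enumeration sketch (illustrative only, not the paper's design optimization):

```python
# Sketch: enumerate binary design points of length n with the "string
# property" -- all ones form one contiguous run (a single stretch).
def string_points(n):
    pts = []
    for start in range(n):
        for length in range(1, n - start + 1):
            v = [0] * n
            v[start:start + length] = [1] * length
            pts.append(tuple(v))
    return pts

pts = string_points(4)
print(len(pts))  # n*(n+1)/2 = 10 candidate runs for n = 4
```

A design measure then places weights on these few points, which is what makes analytical D- and A-optimality results tractable in this setting.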

## Robust canonical correlations: A comparative study

### Computational Statistics (2005-06-01) 20: 203–229

Several approaches for robust canonical correlation analysis will be presented and discussed. A first method is based on the definition of canonical correlation analysis as looking for linear combinations of two sets of variables having maximal (robust) correlation. A second method is based on alternating robust regressions. These methods are discussed in detail and compared with the more traditional approach to robust canonical correlation via covariance matrix estimates. A simulation study compares the performance of the different estimators under several kinds of sampling schemes. Robustness is studied as well by breakdown plots.
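The first approach can be sketched directly from its definition: search over projection directions for the pair maximizing a robust correlation between the two projected sets. Spearman's rho, the coarse angle grid, and the toy data below are illustrative assumptions, not the estimators compared in the paper.

```python
# Sketch (hypothetical): robust canonical correlation by projection
# pursuit -- find directions a, b maximizing a robust correlation
# (here Spearman's rho) between the projections Xa and Yb.
import math, random

def spearman(u, v):
    def rk(w):
        order = sorted(range(len(w)), key=lambda i: w[i])
        r = [0.0] * len(w)
        for pos, i in enumerate(order):
            r[i] = pos + 1.0
        return r
    ru, rv = rk(u), rk(v)
    m = (len(u) + 1) / 2.0  # mean rank
    num = sum((a - m) * (b - m) for a, b in zip(ru, rv))
    den = math.sqrt(sum((a - m) ** 2 for a in ru) *
                    sum((b - m) ** 2 for b in rv))
    return num / den

random.seed(2)
n = 400
z = [random.gauss(0, 1) for _ in range(n)]  # shared latent factor
X = [(z[i] + 0.3 * random.gauss(0, 1), random.gauss(0, 1)) for i in range(n)]
Y = [(z[i] + 0.3 * random.gauss(0, 1), random.gauss(0, 1)) for i in range(n)]

angles = [k * math.pi / 18 for k in range(18)]  # coarse direction grid
best = max(
    spearman([p[0] * math.cos(t) + p[1] * math.sin(t) for p in X],
             [q[0] * math.cos(s) + q[1] * math.sin(s) for q in Y])
    for t in angles for s in angles)
print(round(best, 2))  # close to the latent correlation, around 0.9
```

Replacing Spearman's rho with another robust correlation, or replacing the grid search with alternating robust regressions, gives the other variants the paper compares.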

## Bias-corrected and robust estimation of the bivariate stable tail dependence function

### TEST (2016-11-19): 1–24

The stable tail dependence function gives a full characterisation of the extremal dependence between two or more random variables. In this paper, we propose an estimator for this function which is robust against outliers in the sample. The estimator is derived from a bivariate second-order tail model together with a proper transformation of the bivariate observations, and its asymptotic properties are studied under suitable regularity conditions. Our estimation procedure depends on two parameters: $$\alpha$$, which controls the trade-off between efficiency and robustness of the estimator, and a second-order parameter $$\tau$$, which can be replaced by a fixed value or by an estimate. When $$\tau$$ is replaced by the true value or by an external consistent estimator, our robust estimator is asymptotically unbiased; when $$\tau$$ is mis-specified this property is lost, but the estimator still performs well in terms of bias. The finite-sample performance of our robust and bias-corrected estimator of the stable tail dependence function is examined in a simulation study involving uncontaminated and contaminated samples. In particular, its behavior is illustrated for different values of the pair $$(\alpha, \tau)$$ and compared with alternative estimators from the extreme value literature.
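For orientation, the classical empirical (rank-based) estimator of the stable tail dependence function, the non-robust baseline such proposals improve on, can be sketched in a few lines. The simulated data and the choice of k below are illustrative assumptions.

```python
# Sketch: the classical empirical (rank-based) estimator of the bivariate
# stable tail dependence function l(x, y): the fraction of points whose
# X-rank or Y-rank falls among the top-(k*x) or top-(k*y), scaled by 1/k.
# NOT the paper's robust bias-corrected estimator.
import random

def ranks(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1  # rank 1 = smallest
    return r

def stdf_hat(xs, ys, k, x, y):
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    hits = sum(1 for i in range(n)
               if rx[i] > n - k * x or ry[i] > n - k * y)
    return hits / k

random.seed(1)
n, k = 5000, 200
# Independent margins: the true STDF is l(x, y) = x + y.
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]
est = stdf_hat(xs, ys, k, 1.0, 1.0)
print(round(est, 2))  # close to x + y = 2 under independence
```

The other extreme, complete dependence (ys = xs), would give l(x, y) = max(x, y) instead; real data sit between these bounds.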

## Identifying Gene-Environment Interactions with a Least Relative Error Approach

### Statistical Applications from Clinical Trials and Personalized Medicine to Finance and Business Analytics (2016-01-01): 305–321

For complex diseases, interactions between genetic and environmental risk factors can have important implications beyond the main effects. Many existing interaction analyses are marginal and cannot accommodate the joint effects of multiple main effects and interactions. In this study, we conduct a joint analysis which can simultaneously accommodate a large number of effects. Significantly different from existing studies, we adopt loss functions based on relative errors, which offer a useful alternative to “classic” methods such as least squares and least absolute deviation. Further, to accommodate censoring in the response variable, we adopt a weighted approach. Penalization is used for identification and regularized estimation. Computationally, we develop an effective algorithm which combines majorize-minimization and coordinate descent. Simulation shows that the proposed approach has satisfactory performance. We also analyze lung cancer prognosis data with gene expression measurements.
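A relative-error loss can be shown in miniature. The criterion below, summing the two relative errors |(y - yhat)/y| and |(y - yhat)/yhat|, is one well-known member of this family (the least absolute relative error, LARE, criterion); the toy data and grid search are illustrative assumptions, not the paper's penalized, censoring-weighted procedure.

```python
# Sketch (hypothetical): a least-absolute-relative-error fit for a single
# coefficient, minimized by a crude grid search. Relative-error losses
# weight observations by their scale, unlike least squares.
def lare_loss(b, xs, ys):
    total = 0.0
    for x, y in zip(xs, ys):
        pred = x * b  # positive here, so both ratios are well defined
        total += abs((y - pred) / y) + abs((y - pred) / pred)
    return total

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]   # roughly y = 2x with multiplicative noise
grid = [1.0 + 0.01 * i for i in range(200)]
best = min(grid, key=lambda b: lare_loss(b, xs, ys))
print(round(best, 2))  # near the true slope 2.0
```

In the paper this loss is combined with penalization (for selecting interactions) and weights (for censoring), but the core relative-error idea is the loss above.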

## Partial least squares classification for high dimensional data using the PCOUT algorithm

### Computational Statistics (2013-04-01) 28: 771–788

Classification of samples into two or more classes is of interest to scientists in almost every field. Traditional statistical methodology for classification does not work well when there are more variables (*p*) than samples (*n*), and it is highly sensitive to outlying observations. In this study, a robust partial least squares based classification method is proposed to handle data containing outliers where $$n \ll p$$. The proposed method is applied to well-known benchmark datasets and its properties are explored by an extensive simulation study.

## Robust estimation of multivariate regression model

### Statistical Papers (2009-01-01) 50: 81–100

This paper studies robust estimation of multivariate regression model using kernel weighted local linear regression. A robust estimation procedure is proposed for estimating the regression function and its partial derivatives. The proposed estimators are jointly asymptotically normal and attain nonparametric optimal convergence rate. One-step approximations to the robust estimators are introduced to reduce computational burden. The one-step local M-estimators are shown to achieve the same efficiency as the fully iterative local M-estimators as long as the initial estimators are good enough. The proposed estimators inherit the excellent edge-effect behavior of the local polynomial methods in the univariate case and at the same time overcome the disadvantages of the local least-squares based smoothers. Simulations are conducted to demonstrate the performance of the proposed estimators. Real data sets are analyzed to illustrate the practical utility of the proposed methodology.
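The one-step idea can be illustrated in the simplest setting, a univariate Huber location estimate: start from a good robust initial value (the median) and take a single Newton-type step rather than iterating to convergence. This is only a univariate analogue of the paper's one-step local M-estimators, with all tuning constants assumed.

```python
# Sketch (hypothetical): a one-step Huber M-estimate of location --
# one Newton step from the median, with scale set by the MAD.
def huber_psi(r, c=1.345):
    return max(-c, min(c, r))  # clipped residual

def one_step_m(data, c=1.345):
    s = sorted(data)
    n = len(s)
    theta0 = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    # robust scale: median absolute deviation, normalized for normality
    mad = sorted(abs(x - theta0) for x in s)[n // 2] / 0.6745
    num = sum(huber_psi((x - theta0) / mad, c) for x in s)
    den = sum(1.0 for x in s if abs((x - theta0) / mad) < c)
    return theta0 + mad * num / den

data = [0.1, -0.2, 0.3, 0.05, -0.1, 0.2, 100.0]  # one gross outlier
est = one_step_m(data)
print(round(est, 2))  # stays near 0 despite the outlier
```

The paper's point is that, with a good initial estimator, this single step already attains the efficiency of the fully iterated M-estimator.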

## Testing for cointegration using induced-order statistics

### Computational Statistics (2008-01-01) 23: 131–151

In this paper we explore the usefulness of induced-order statistics in the characterization of integrated series and of cointegration relationships. We propose a non-parametric test statistic for testing the null hypothesis of two independent random walks against wide cointegrating alternatives, including monotonic nonlinearities and certain types of level shifts in the cointegration relationship. We call our testing device the induced-order Kolmogorov–Smirnov cointegration test (KS), since it is constructed from the induced-order statistics of the series, and we derive its limiting distribution. This non-parametric construction endows the test with a number of desirable properties: invariance to monotonic transformations of the series, and robustness to the presence of important parameter shifts. By Monte Carlo simulation we analyze the small-sample properties of the test. Our simulation results show the robustness of the induced-order cointegration test against departures from linear and constant-parameter models.
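Induced-order statistics (concomitants) are simple to compute: sort the sample by one series and carry the paired values of the other series along. A minimal sketch:

```python
# Sketch: induced-order statistics (concomitants). Sorting the sample by
# X and carrying the paired Y values along yields the sequence from which
# a KS-type cointegration statistic can be built.
def concomitants(xs, ys):
    return [y for _, y in sorted(zip(xs, ys))]

xs = [3.0, 1.0, 2.0]
ys = [30.0, 10.0, 20.0]
print(concomitants(xs, ys))  # [10.0, 20.0, 30.0]
```

Under cointegration the concomitant sequence tracks the sorted series monotonically, which is why the construction is invariant to monotonic transformations.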