• Title/Summary/Keyword: robust regression


Influence Assessment in Robust Regression

  • Sohn, Bang-Yong;Huh, Myung-Hoe
    • Communications for Statistical Applications and Methods / v.4 no.1 / pp.21-32 / 1997
  • Robust regression based on an M-estimator reduces or bounds the influence of outliers in the y-direction only. Therefore, when several influential observations exist, diagnostics are required in robust regression in order to detect them. In this paper, we propose influence diagnostics for robust regression based on the M-estimator and its one-step version. Noting that the M-estimator can be obtained by iteratively reweighted least squares regression using internal weights, we apply weighted least squares (WLS) regression diagnostics to robust regression.
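The iteratively reweighted least squares view of M-estimation described in this abstract can be sketched in Python. This is an illustrative sketch, not the authors' implementation; the Huber weight function, the MAD scale estimate, and all names here are assumptions of the sketch. The internal weights `w` returned at the end are what a WLS-based diagnostic would examine.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights: 1 for |r| <= c, c/|r| beyond, so large residuals are bounded."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def irls_m_estimate(X, y, c=1.345, tol=1e-8, max_iter=100):
    """M-estimate of regression coefficients via iteratively reweighted least squares."""
    Xd = np.column_stack([np.ones(len(y)), np.asarray(X)])  # add an intercept
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]            # OLS starting value
    w = np.ones(len(y))
    for _ in range(max_iter):
        r = y - Xd @ beta
        mad = np.median(np.abs(r - np.median(r)))
        scale = mad / 0.6745 if mad > 0 else 1.0            # robust residual scale
        w = huber_weights(r / scale, c)                     # internal weights
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(Xd * sw[:, None], y * sw, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, w
```

The returned weights can then be plugged into standard WLS diagnostics (leverages, studentized residuals) to flag influential observations.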


Self-tuning Robust Regression Estimation

  • Park, You-Sung;Lee, Dong-Hee
    • Proceedings of the Korean Statistical Society Conference / 2003.10a / pp.257-262 / 2003
  • We introduce a new robust regression estimator, the self-tuning regression estimator. A variety of robust estimators, with accompanying theory and applications, have been developed since Huber introduced the M-estimator in the 1960s. We first review these robust estimators and their properties, including their advantages and disadvantages, and then show that the new estimator overcomes drawbacks of existing robust regression estimators, such as the heavy computation required to preserve robustness properties.


ROBUST FUZZY LINEAR REGRESSION BASED ON M-ESTIMATORS

  • SOHN BANG-YONG
    • Journal of applied mathematics & informatics / v.18 no.1_2 / pp.591-601 / 2005
  • The results of fuzzy linear regression are very sensitive to irregular data. When such points exist in a data set, a fuzzy linear regression model can be interpreted incorrectly. The purpose of this paper is to detect irregular data and to propose a robust fuzzy linear regression based on M-estimators, with triangular fuzzy regression coefficients for crisp input-output data. A numerical example shows that irregular data can be detected using the residuals based on M-estimators, and that the proposed robust fuzzy linear regression is very resistant to such points.

Robust Cross Validation Score

  • Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods / v.12 no.2 / pp.413-423 / 2005
  • Consider the problem of estimating the underlying regression function from a set of noisy data contaminated by a long-tailed error distribution. Several robust smoothing techniques exist and have turned out to be very useful for reducing the influence of outlying observations. However, whatever robust smoother is used, the smoothing parameter must still be chosen, and relatively little attention has been paid to robust bandwidth selection. In this paper, we adopt ideas from robust location parameter estimation and propose robust cross validation score functions.
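One natural reading of a robust cross validation score is to replace the mean of squared leave-one-out errors by a robust location estimate of the absolute errors, such as their median. The sketch below illustrates that idea under my own assumptions (a Nadaraya-Watson smoother with a Gaussian kernel); it is not the score functions proposed in the paper.

```python
import numpy as np

def loo_nw_residuals(x, y, h):
    """Leave-one-out Nadaraya-Watson residuals with a Gaussian kernel."""
    d = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * d ** 2)
    np.fill_diagonal(K, 0.0)          # drop each point from its own fit
    return y - K @ y / K.sum(axis=1)

def robust_cv_score(x, y, h):
    """Median absolute leave-one-out residual instead of the mean of squares."""
    return np.median(np.abs(loo_nw_residuals(x, y, h)))
```

The bandwidth is then chosen by minimising `robust_cv_score` over a grid of candidate values of `h`; because the median ignores a few huge residuals, outliers do not dominate the choice.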

ROBUST CROSS VALIDATIONS IN RIDGE REGRESSION

  • Jung, Kang-Mo
    • Journal of applied mathematics & informatics / v.27 no.3_4 / pp.903-908 / 2009
  • The choice of the shrink parameter in ridge regression may be distorted by outlying points. We propose robust cross validation scores for ridge regression in place of classical cross validation, using robust location estimators such as the median, least trimmed squares, and the absolute mean. The robust scores have global robustness. Simulations are performed to show the effectiveness of the proposed estimators.
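A median-based cross validation score for choosing the ridge shrink parameter might look like the following sketch. The leave-one-out shortcut via the smoother matrix and all names here are my own assumptions, not the paper's exact construction.

```python
import numpy as np

def ridge_loo_residuals(X, y, lam):
    """Leave-one-out residuals for ridge regression via the smoother-matrix shortcut."""
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    return (y - H @ y) / (1.0 - np.diag(H))

def robust_ridge_cv(X, y, lams):
    """Choose the shrink parameter minimising the median absolute LOO residual."""
    scores = [np.median(np.abs(ridge_loo_residuals(X, y, lam))) for lam in lams]
    return lams[int(np.argmin(scores))], scores
```

Swapping `np.median` for `np.mean` of squared residuals recovers the classical score, which outlying points can inflate at every candidate value of the shrink parameter.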


Robust Nonparametric Regression Method using Rank Transformation

  • Park, Dongryeon
    • Communications for Statistical Applications and Methods / v.7 no.2 / pp.575-583 / 2000
  • Consider the problem of estimating a regression function from a set of data contaminated by a long-tailed error distribution. A linear smoother is a kind of local weighted average of the responses, so it is not robust against outliers. The kernel M-smoother and the lowess attain robustness by down-weighting outliers. However, both require iteration to compute the robustness weights, and as Wang and Scott (1994) pointed out, this requirement is not a desirable property. In this article, we propose a robust nonparametric regression method that does not require iteration. Robustness can be achieved not only by down-weighting outliers but also by transforming them. The rank transformation is a simple procedure in which the data are replaced by their corresponding ranks. Iman and Conover (1979) showed that the rank transformation is a robust and powerful procedure in linear regression. In this paper, we show that the rank transformation can also be used in nonparametric regression to achieve robustness.
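One way the rank-transformation idea could be carried over to kernel smoothing is sketched below: smooth the ranks of the responses, so an outlier contributes only its bounded rank, then map the fitted ranks back through the empirical quantiles of the responses. This is a speculative sketch of the general idea under my own assumptions, not the authors' estimator.

```python
import numpy as np

def rank_transform_smoother(x, y, h):
    """Kernel-smooth the ranks of y, then map fitted ranks back through the
    empirical quantiles of y. An outlier enters only through its bounded rank,
    so no iteration for robustness weights is needed."""
    n = len(y)
    ranks = np.argsort(np.argsort(y)) + 1.0      # ranks 1..n
    d = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * d ** 2)                    # Gaussian kernel weights
    rhat = K @ ranks / K.sum(axis=1)             # smoothed ranks, no iteration
    return np.interp(rhat, np.arange(1, n + 1.0), np.sort(y))
```

Because the rank of an outlier is at most n, its influence on the smoothed fit is bounded without any reweighting loop.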


ROBUST REGRESSION ESTIMATION BASED ON DATA PARTITIONING

  • Lee, Dong-Hee;Park, You-Sung
    • Journal of the Korean Statistical Society / v.36 no.2 / pp.299-320 / 2007
  • We introduce a high breakdown point estimator, referred to as the data partitioning robust regression estimator (DPR). Since the DPR is obtained by partitioning the observations into a finite number of subsets, it does not suffer from the computational problems of previous robust regression estimators. Empirical and extensive simulation studies show that the DPR is superior to previous robust estimators, especially in large samples.

ROBUST REGRESSION SMOOTHING FOR DEPENDENT OBSERVATIONS

  • Kim, Tae-Yoon;Song, Gyu-Moon;Kim, Jang-Han
    • Communications of the Korean Mathematical Society / v.19 no.2 / pp.345-354 / 2004
  • Boente and Fraiman [2] studied robust nonparametric estimators for regression and autoregression problems when the observations exhibit serial dependence. They established strong consistency of two families of M-type robust equivariant estimators for $\phi$-mixing processes. In this paper we extend their results to the weaker $\alpha$-mixing processes.

Penalized rank regression estimator with the smoothly clipped absolute deviation function

  • Park, Jong-Tae;Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / v.24 no.6 / pp.673-683 / 2017
  • The least absolute shrinkage and selection operator (LASSO) is a popular regression estimator with simultaneous variable selection. However, LASSO does not have the oracle property, and a robust version is needed in the case of heavy-tailed errors or serious outliers. We propose a robust penalized regression estimator that provides simultaneous variable selection and estimation. It is based on rank regression and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) function, which has the oracle property. The proposed method combines the robustness of rank regression with the oracle property of the SCAD penalty. We develop an efficient algorithm to compute the proposed estimator, which includes a SCAD estimate based on the local linear approximation and the tuning parameter of the penalty function. Our estimate can be obtained by the least absolute deviation method. We use an optimal tuning parameter based on the Bayesian information criterion and cross validation. Numerical simulations show that the proposed estimator is robust and effective for analyzing contaminated data.
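The SCAD penalty and its derivative, whose values serve as the per-coefficient weights in a local linear approximation, can be written down concretely. This is a minimal sketch with the conventional a = 3.7 of Fan and Li, not the paper's full rank-regression algorithm.

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty: linear near zero (like the lasso), then a quadratic
    transition, then constant, so large coefficients are not shrunk."""
    t = np.abs(t)
    p2 = (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    p3 = lam ** 2 * (a + 1) / 2
    return np.where(t <= lam, lam * t, np.where(t <= a * lam, p2, p3))

def scad_deriv(t, lam, a=3.7):
    """Derivative of SCAD; in a local linear approximation these values become
    per-coefficient L1 weights, turning each step into a weighted L1 problem."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))
```

Because the derivative vanishes for |t| > a*lam, large coefficients receive zero weight in the approximated problem, which is the source of the oracle property the abstract refers to.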