A model selection method for S-estimation

Bibliographic Details
Published in: The Econometrics Journal, 2007-01, Vol. 10 (2), p. 294-319
Main authors: Preminger, Arie; Sakata, Shinichi
Format: Article
Language: English
Online access: Full text
Description
Summary: Cleaning data or removing some data periods in least squares (LS) regression analysis is not unusual. This practice indicates that a researcher sometimes wishes to estimate a parameter value with which the regression function fits a large fraction of the individuals or events in the population behind the original data set, possibly at the cost of poor fits to some atypical individuals or events. S-estimators are a class of estimators consistent with the researcher's aim in such situations. In this paper, we propose a method of model selection suitable for S-estimation. The proposed method chooses the model that minimizes a criterion named the penalised S-scale criterion (PSC), which is decreasing in the sample S-scale of the fitted residuals and increasing in the number of parameters. We study the large-sample behavior of the PSC in nonlinear regression with dependent, heterogeneous data, and establish sets of conditions sufficient for the PSC to consistently select the best-fitting, most parsimonious model. Our analysis allows for partial unidentifiability, which is an important possibility when selecting among nonlinear regression models. We conduct Monte Carlo simulations to verify that a particular PSC called the PSC-S is at least as trustworthy as the Schwarz information criterion commonly used in LS regression.
ISSN: 1368-4221, 1368-423X
DOI: 10.1111/j.1368-423X.2007.00209.x
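
The abstract describes the penalised S-scale criterion only at a high level: it is decreasing in the sample S-scale of the fitted residuals and increasing in the number of parameters. The sketch below is a minimal Python illustration of that idea, not the authors' procedure. It computes a crude S-estimate for linear candidate models by minimising a Tukey-biweight M-scale of the residuals, then scores each model with an assumed Schwarz-like penalty, psc = n*log(s_hat) + k*log(n). The tuning constants, the Nelder-Mead optimiser started from the LS fit, and the penalty form are all illustrative assumptions; the paper's actual PSC-S and its asymptotic analysis cover nonlinear regression with dependent, heterogeneous data.

```python
# Illustrative sketch only: a crude S-estimator and an assumed Schwarz-like
# penalised S-scale criterion (PSC) for linear candidate models.
# The exact PSC-S of Preminger & Sakata (2007) is NOT reproduced here.
import numpy as np
from scipy.optimize import minimize

C, B = 1.547, 0.5  # Tukey-biweight tuning constant and target (50% breakdown)

def rho(u):
    """Bounded Tukey-biweight rho, normalised so that max(rho) = 1."""
    v = np.clip(u / C, -1.0, 1.0)
    return 1.0 - (1.0 - v ** 2) ** 3

def m_scale(r, tol=1e-8, max_iter=200):
    """Sample S-scale: solve (1/n) * sum(rho(r_i / s)) = B by fixed point."""
    s = np.median(np.abs(r)) / 0.6745 + 1e-12   # MAD-based starting value
    for _ in range(max_iter):
        s_new = s * np.sqrt(np.mean(rho(r / s)) / B)
        if abs(s_new - s) <= tol * s:
            break
        s = s_new
    return s_new

def s_fit(X, y):
    """Crude S-estimate: minimise the M-scale of residuals over beta.
    (A serious implementation would use a subsampling algorithm such as FAST-S.)"""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)        # LS starting point
    obj = lambda b: m_scale(y - X @ b)
    res = minimize(obj, beta0, method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-10, "maxiter": 5000})
    return res.x, obj(res.x)

def psc(X, y):
    """Assumed Schwarz-like PSC: decreasing in the S-scale, increasing in #parameters."""
    n, k = X.shape
    _, s_hat = s_fit(X, y)
    return n * np.log(s_hat) + k * np.log(n)

# Usage: compare a 2- and a 3-parameter model when 10% of y is contaminated.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
y[: n // 10] += 15.0                                     # gross outliers
X2 = np.column_stack([np.ones(n), x])                    # true model
X3 = np.column_stack([np.ones(n), x, x ** 2])            # over-parameterised model
print("PSC, 2 params:", round(psc(X2, y), 2))
print("PSC, 3 params:", round(psc(X3, y), 2))
```

In this toy comparison the contaminated observations inflate an LS residual scale but have limited influence on the S-scale, which is what makes a scale-based criterion attractive here; the penalty term then discourages over-parameterised candidates, mirroring the trade-off the abstract attributes to the PSC.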