Elemental Subsets: The Building Blocks of Regression
Published in: The American Statistician, 1997-05, Vol. 51 (2), pp. 122-129
Authors: Mayo, Matthew S.; Gray, J. Brian
Format: Article
Language: English
Online access: Full text
DOI: 10.1080/00031305.1997.10473944

Abstract: In a regression dataset, an elemental subset consists of the minimum number of cases required to estimate the unknown parameters of a regression model. The resulting elemental regression provides an exact fit to the cases in the elemental subset. Early methods of regression estimation were based on combining the results of elemental regressions, but this approach was abandoned because of its computational infeasibility in all but the smallest datasets and because of the advent of the least squares method. With the computing power available today, there has been renewed interest in using elemental regressions for model fitting and diagnostic purposes. In this paper we consider elemental subsets and their associated elemental regressions as useful "building blocks" for the estimation of regression models. Many existing estimators can be expressed in terms of the elemental regressions. We introduce a new classification of regression estimators that generalizes a characterization of ordinary least squares (OLS) based on elemental regressions. Estimators in this class are weighted averages of the elemental regressions, where the weights are determined by leverage and residual information associated with the elemental subsets. The new classification incorporates many existing estimators and provides a framework for developing new alternatives to least squares regression, including the trimmed elemental estimators (TEE) proposed in this paper.
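The OLS characterization referred to in the abstract can be made concrete with a small numerical sketch. A classical identity (stated here from general knowledge of elemental-regression results, not quoted from the paper) writes the least squares estimate as a weighted average of all elemental estimates, with each p-case subset J weighted in proportion to det(X_J)^2. The NumPy sketch below checks this identity on a simulated dataset; the variable names, simulated data, and tolerance are illustrative choices, and the trimmed elemental estimators (TEE) proposed in the paper would instead average over a suitably trimmed subset of these elemental fits rather than all of them.

```python
# Minimal sketch, assuming a linear model with p parameters and n > p cases.
# Each elemental subset J of p cases yields an exact-fit estimate b_J solving
# the p x p system X_J b = y_J; averaging all b_J with weights det(X_J)^2
# recovers the ordinary least squares estimate.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)

# Small simulated dataset: n = 8 cases, p = 2 parameters (intercept and slope).
n, p = 8, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

weights, estimates = [], []
for J in combinations(range(n), p):      # every elemental subset of p cases
    X_J, y_J = X[list(J)], y[list(J)]
    d_J = np.linalg.det(X_J)
    if abs(d_J) < 1e-12:                 # singular subset: no elemental regression exists
        continue
    b_J = np.linalg.solve(X_J, y_J)      # elemental regression: exact fit to the p cases
    weights.append(d_J ** 2)             # leverage-type weight det(X_J)^2
    estimates.append(b_J)

weights, estimates = np.array(weights), np.array(estimates)

# Weighted average of the elemental regressions ...
b_elemental = (weights[:, None] * estimates).sum(axis=0) / weights.sum()

# ... reproduces ordinary least squares (up to floating-point error).
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(b_elemental, b_ols))   # expected: True
```

A trimmed variant in the spirit of TEE would down-weight or drop elemental regressions flagged by residual information before averaging; the precise weighting scheme is developed in the paper itself.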
ISSN: 0003-1305
EISSN: 1537-2731
Source: Periodicals Index Online; JSTOR Mathematics & Statistics; JSTOR Complete Legacy
Subjects: Algorithms; Datasets; Elemental regression; Estimates; Estimation methods; Estimators; Exact sciences and technology; Least squares; Leverage; Linear inference, regression; Linear programming; Linear regression; Mathematical models; Mathematics; Preliminary estimates; Probability and statistics; Regression analysis; Residual; Robust regression; Sciences and techniques of general use; Statistics; Weighted averages; Weighting functions