The Epic Story of Maximum Likelihood

Detailed Description

Bibliographic Details
Published in: Statistical science, 2007-11, Vol. 22 (4), p. 598-620
Main Author: Stigler, Stephen M.
Format: Article
Language: English
Online Access: Full text
Description: At a superficial level, the idea of maximum likelihood must be prehistoric: early hunters and gatherers may not have used the words "method of maximum likelihood" to describe their choice of where and how to hunt and gather, but it is hard to believe they would have been surprised if their method had been described in those terms. It seems a simple, even unassailable idea: Who would rise to argue in favor of a method of minimum likelihood, or even mediocre likelihood? And yet the mathematical history of the topic shows this "simple idea" is really anything but simple. Joseph Louis Lagrange, Daniel Bernoulli, Leonard Euler, Pierre Simon Laplace and Carl Friedrich Gauss are only some of those who explored the topic, not always in ways we would sanction today. In this article, that history is reviewed from back well before Fisher to the time of Lucien Le Cam's dissertation. In the process Fisher's unpublished 1930 characterization of conditions for the consistency and efficiency of maximum likelihood estimates is presented, and the mathematical basis of his three proofs discussed. In particular, Fisher's derivation of the information inequality is seen to be derived from his work on the analysis of variance, and his later approach via estimating functions was derived from Euler's Relation for homogeneous functions. The reaction to Fisher's work is reviewed, and some lessons drawn.
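The "simple idea" the abstract describes can be illustrated with a minimal sketch (not taken from the article): choose the parameter value under which the observed data are most probable. For i.i.d. Bernoulli trials the maximum likelihood estimate of the success probability is the sample mean, which a plain grid search over the log-likelihood recovers.

```python
import math

def bernoulli_log_likelihood(p, data):
    # Log-likelihood of i.i.d. Bernoulli(p) observations.
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in data)

def mle_bernoulli(data, grid_size=10001):
    # Grid search over the open interval (0, 1); the closed-form
    # maximizer is the sample mean, so the search should land near it.
    candidates = (i / grid_size for i in range(1, grid_size))
    return max(candidates, key=lambda p: bernoulli_log_likelihood(p, data))

data = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]  # 7 successes in 10 trials
p_hat = mle_bernoulli(data)
print(round(p_hat, 3))  # close to the sample mean 0.7
```

This toy example only shows the estimation principle the article traces historically; the subtleties the article surveys (consistency, efficiency, the information inequality) concern when and why such estimates behave well.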
DOI: 10.1214/07-STS249
ISSN: 0883-4237
EISSN: 2168-8745
Sources: JSTOR Complete Legacy; Elektronische Zeitschriftenbibliothek (freely accessible e-journals); Project Euclid Complete; JSTOR Mathematics & Statistics
Subjects:
Abraham Wald
Consistent estimators
efficiency
Estimation methods
Harold Hotelling
history of statistics
Jerzy Neyman
Karl Pearson
Mathematical constants
Mathematical functions
maximum likelihood
Maximum likelihood estimation
Maximum likelihood estimators
Probabilities
R. A. Fisher
Statistical discrepancies
Statistics
sufficiency
superefficiency
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-09T09%3A49%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-jstor_proje&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=The%20Epic%20Story%20of%20Maximum%20Likelihood&rft.jtitle=Statistical%20science&rft.au=Stigler,%20Stephen%20M.&rft.date=2007-11-01&rft.volume=22&rft.issue=4&rft.spage=598&rft.epage=620&rft.pages=598-620&rft.issn=0883-4237&rft.eissn=2168-8745&rft_id=info:doi/10.1214/07-STS249&rft_dat=%3Cjstor_proje%3E27645865%3C/jstor_proje%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_jstor_id=27645865&rfr_iscdi=true