The MinMax information measure
The importance of finding minimum entropy probability distributions and the value of minimum entropy for a probabilistic system is discussed. A method to calculate these when there are both moment and inequality constraints on probabilities is given and illustrated with examples. It is shown that: information given by moments or inequalities on probabilities can be measured by the reduction in the uncertainty gap (S_max - S_min); and in certain circumstances the inequalities on probabilities can provide significant information about probabilistic systems.
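The uncertainty-gap idea in the abstract can be illustrated with a toy sketch (this is an illustration of the concept, not the paper's algorithm): for a three-outcome distribution, approximate S_max and S_min over the feasible set by brute-force grid search on the probability simplex, and measure the information carried by a moment constraint as the reduction in the gap S_max - S_min. The outcome values and the target mean below are assumptions chosen for illustration.

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a probability vector; 0*log 0 treated as 0."""
    return -sum(q * math.log(q) for q in p if q > 0.0)

def gap(feasible):
    """Uncertainty gap S_max - S_min over a set of feasible distributions."""
    ents = [entropy(p) for p in feasible]
    return max(ents) - min(ents)

x = (1.0, 2.0, 3.0)  # outcome values (assumed for illustration)
step = 0.01
k = int(round(1.0 / step))

# Enumerate a grid over the probability simplex {p >= 0, sum(p) = 1}.
simplex = []
for i in range(k + 1):
    for j in range(k + 1 - i):
        simplex.append((i * step, j * step, 1.0 - (i + j) * step))

# Unconstrained gap: S_max = log 3 (uniform), S_min = 0 (degenerate point).
g0 = gap(simplex)

# Add a moment constraint E[X] = 2.5 (with a tolerance matching the grid).
constrained = [p for p in simplex
               if abs(sum(pi * xi for pi, xi in zip(p, x)) - 2.5) < 5e-3]
g1 = gap(constrained)

print(g0)       # close to log 3
print(g1)       # strictly smaller: the constraint narrows the gap
print(g0 - g1)  # the reduction measures the information the constraint gives
```

Adding an inequality constraint on the probabilities (say p3 <= 0.7) would shrink the gap further, which is the sense in which the abstract says inequalities can carry significant information.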
Saved in:
Published in: | International journal of systems science 1995-01, Vol.26 (1), p.1-12 |
---|---|
Main authors: | KAPUR, J. N.; BACIU, G.; KESAVAN, H. K. |
Format: | Article |
Language: | English |
Subjects: | Applied sciences; Exact sciences and technology; Information theory; Information, signal and communications theory; Telecommunications and information theory |
Online access: | Full text |
creator | KAPUR, J. N.; BACIU, G.; KESAVAN, H. K. |
description | The importance of finding minimum entropy probability distributions and the value of minimum entropy for a probabilistic system is discussed. A method to calculate these when there are both moment and inequality constraints on probabilities is given and illustrated with examples. It is shown that: information given by moments or inequalities on probabilities can be measured by the reduction in the uncertainty gap (S_max - S_min); and in certain circumstances the inequalities on probabilities can provide significant information about probabilistic systems. |
doi_str_mv | 10.1080/00207729508929020 |
issn | 0020-7721 1464-5319 |
source | Access via Taylor & Francis |
subjects | Applied sciences; Exact sciences and technology; Information theory; Information, signal and communications theory; Telecommunications and information theory |