Quantitative Information Flow - Verification Hardness and Possibilities

Bibliographic Details
Main authors: Yasuoka, Hirotoshi; Terauchi, Tachio
Format: Conference Proceeding
Language: eng
Subjects: Entropy; Mutual information; Probability distribution; Random variables; Safety; Security; Uncertainty
container_end_page 27
container_issue
container_start_page 15
container_title
container_volume
creator Yasuoka, Hirotoshi
Terauchi, Tachio
description Researchers have proposed formal definitions of quantitative information flow based on information-theoretic notions such as Shannon entropy, min entropy, guessing entropy, and channel capacity. This paper investigates the hardness and possibilities of precisely checking and inferring quantitative information flow according to such definitions. We prove that, even for just comparing two programs to determine which has the larger flow, none of the definitions is a k-safety property for any k, and therefore none is amenable to the self-composition technique that has been successfully applied to precisely checking non-interference. We also show a complexity-theoretic gap with non-interference by proving that, for loop-free boolean programs, whose non-interference is coNP-complete, the comparison problem is #P-hard for all of the definitions. For positive results, we show that universally quantifying over the distribution in the comparison problem, that is, comparing two programs according to the entropy-based definitions to determine which has the larger flow for all distributions, is a 2-safety problem in general and is coNP-complete when restricted to loop-free boolean programs. We prove this by showing that the problem is equivalent to a simple relation naturally expressing the fact that one program is more secure than the other. We prove that the relation also refines the channel-capacity-based definition, and that it can be precisely checked via self-composition as well as the "interleaved" self-composition technique.
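The self-composition idea the abstract refers to — non-interference is a 2-safety property, checkable by comparing two runs of the same program — can be illustrated with a toy loop-free boolean program. This is a minimal sketch, not code from the paper; the names `program` and `non_interferent` are hypothetical:

```python
# Self-composition sketch: a program is non-interferent if, for every
# public (low) input, its observable output does not depend on the
# secret (high) input. Being a 2-safety property, this is checked over
# PAIRS of runs, i.e. over the program composed with a copy of itself.

from itertools import product

def program(high: bool, low: bool) -> bool:
    """A tiny loop-free boolean program; its return value is the observable."""
    return low and not high   # leaks `high` whenever low is True

def non_interferent(prog) -> bool:
    """Check the 2-safety condition: equal low inputs force equal outputs
    across every pair of runs, regardless of the high inputs."""
    return all(
        prog(h1, low) == prog(h2, low)
        for low in (False, True)
        for h1, h2 in product((False, True), repeat=2)
    )

print(non_interferent(program))            # leaks the secret -> False
print(non_interferent(lambda h, l: l))     # ignores the secret -> True
```

The paper's negative result is that the analogous check for *quantitative* flow comparisons is not k-safety for any k, so no fixed number of composed copies suffices; the positive result recovers a 2-safety check by quantifying over all input distributions.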
doi_str_mv 10.1109/CSF.2010.9
format Conference Proceeding
fullrecord publisher IEEE
date 2010-07
btitle 2010 23rd IEEE Computer Security Foundations Symposium
stitle CSF
pages 15-27 (13 pages)
isbn 9781424475100; 1424475104
eisbn 9781424475117; 1424475112
issn 1063-6900
eissn 2377-5459
lccn 2010930188
doi 10.1109/CSF.2010.9
ieee_id 5552655
recordid cdi_ieee_primary_5552655
fulltext fulltext_linktorsrc
identifier ISSN: 1063-6900
ispartof 2010 23rd IEEE Computer Security Foundations Symposium, 2010, p.15-27
issn 1063-6900
2377-5459
language eng
recordid cdi_ieee_primary_5552655
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Entropy
Mutual information
Probability distribution
Random variables
Safety
Security
Uncertainty
title Quantitative Information Flow - Verification Hardness and Possibilities
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-22T21%3A06%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Quantitative%20Information%20Flow%20-%20Verification%20Hardness%20and%20Possibilities&rft.btitle=2010%2023rd%20IEEE%20Computer%20Security%20Foundations%20Symposium&rft.au=Yasuoka,%20Hirotoshi&rft.date=2010-07&rft.spage=15&rft.epage=27&rft.pages=15-27&rft.issn=1063-6900&rft.eissn=2377-5459&rft.isbn=9781424475100&rft.isbn_list=1424475104&rft_id=info:doi/10.1109/CSF.2010.9&rft_dat=%3Cieee_6IE%3E5552655%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&rft.eisbn=9781424475117&rft.eisbn_list=1424475112&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=5552655&rfr_iscdi=true