Memory Deduplication: An Effective Approach to Improve the Memory System

Programs now place more aggressive demands on memory to hold their data than before. This paper analyzes the characteristics of memory data using seven real memory traces. It observes that the traces contain a large volume of memory pages with identical contents. Furthermore, the number of unique memory contents accessed is much smaller than the number of unique memory addresses accessed. This is caused by traditional address-based cache replacement algorithms, which replace memory pages by checking the addresses rather than the contents of those pages, resulting in many identical memory contents with different addresses being stored in memory. For example, in the same file system, opening two identical files stored in different directories, or opening two similar files that share a certain amount of content in the same directory, will place identical data blocks in the cache. Based on these observations, this paper evaluates memory compression and memory deduplication. As expected, memory deduplication greatly outperforms memory compression: the best deduplication ratio is 4.6 times higher than the best compression ratio, and the deduplication and restore times are 121 times and 427 times faster than the compression and decompression times, respectively. The experimental results should offer useful insights for designing systems that require abundant memory to improve system performance.
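The two metrics the abstract compares can be illustrated with a minimal Python sketch. This is not the authors' implementation: the hash function (SHA-256), the 4 KiB page size, the per-page zlib compression, and the synthetic page contents are all assumptions made here purely to show how a deduplication ratio and a compression ratio are computed over a set of memory pages.

```python
import hashlib
import zlib

def dedup_ratio(pages):
    """Deduplication ratio: total pages divided by unique page contents.

    Pages are considered identical when their content hashes match,
    regardless of the addresses they were mapped at.
    """
    unique = {hashlib.sha256(p).digest() for p in pages}
    return len(pages) / len(unique)

def compression_ratio(pages):
    """Compression ratio: raw size divided by compressed size,
    with each page compressed independently."""
    raw = sum(len(p) for p in pages)
    packed = sum(len(zlib.compress(p)) for p in pages)
    return raw / packed

# Synthetic 4 KiB pages: the same contents appearing at many "addresses",
# as the address-based replacement algorithms described above would produce.
page_a = b"A" * 4096
page_b = bytes(range(256)) * 16
pages = [page_a] * 6 + [page_b] * 2

print(dedup_ratio(pages))  # -> 4.0 (8 pages, 2 unique contents)
print(compression_ratio(pages))
```

Note the asymmetry the paper exploits: deduplication only hashes and compares pages, while compression must transform every byte, which is why dedup and restore can be orders of magnitude faster than compress and decompress.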

Detailed Description

Bibliographic Details
Published in: Journal of information science and engineering, 2018-07, Vol. 34 (4), p. 1103
Publisher: Institute of Information Science, Academia Sinica, Taipei
Main authors: Deng, Yuhui; Huang, Xinyu; Song, Liangshan; Zhou, Yongtao; Wang, Frank
Format: Article
Language: English
Online access: Full text
ISSN: 1016-2364
Source: Elektronische Zeitschriftenbibliothek (freely accessible e-journals)
Subjects: Algorithms; Compression ratio; Compression tests; Computer memory; Directories; Time compression; Uniqueness