Detecting Video Inter-Frame Forgeries Based on Convolutional Neural Network Model
Saved in:
Published in: | International journal of image, graphics and signal processing, 2020-06, Vol.12 (3), p.1-12 |
Main authors: | Hau Nguyen, Xuan; Hu, Yongjian; Ahmad Amin, Muhmmad; Gohar Hayat, Khan; Thinh Le, Van; Truong, Dinh-Tu |
Format: | Article |
Language: | eng |
Subjects: | Accuracy; Algorithms; Artificial neural networks; Forgery; Neural networks; Retraining |
Online access: | Full text |
container_end_page | 12 |
container_issue | 3 |
container_start_page | 1 |
container_title | International journal of image, graphics and signal processing |
container_volume | 12 |
creator | Hau Nguyen, Xuan; Hu, Yongjian; Ahmad Amin, Muhmmad; Gohar Hayat, Khan; Thinh Le, Van; Truong, Dinh-Tu |
description | In today's era of rapidly spreading information, videos are easily captured and can go viral in a short time, and video tampering has become easier thanks to editing software, so the authenticity of videos has become increasingly important. Video inter-frame forgeries are the most common type of video forgery and are difficult to detect with the naked eye. Several algorithms based on handcrafted features have been proposed for detecting inter-frame forgeries, but their accuracy and processing speed remain challenging. In this paper, we propose a method for detecting video inter-frame forgeries based on convolutional neural network (CNN) models, obtained by retraining CNN models pre-trained on the ImageNet dataset. The proposed method builds on state-of-the-art CNN models that are retrained to exploit spatial-temporal relationships within a video and detect inter-frame forgeries robustly; we also propose a confidence score, used in place of the raw network output score, to increase the accuracy of the method. In our experiments, the proposed method achieves a detection accuracy of 99.17%, showing significantly higher efficiency and accuracy than other recent methods. |
doi_str_mv | 10.5815/ijigsp.2020.03.01 |
format | Article |
publisher | Modern Education and Computer Science Press, Hong Kong |
author affiliation | School of Electronics and Information Engineering, South China University of Technology, Guangzhou 510640, P.R. China |
fulltext | fulltext |
identifier | ISSN: 2074-9074 |
ispartof | International journal of image, graphics and signal processing, 2020-06, Vol.12 (3), p.1-12 |
issn | 2074-9074 (ISSN); 2074-9082 (EISSN) |
language | eng |
recordid | cdi_proquest_journals_2419973881 |
source | EZB-FREE-00999 freely available EZB journals |
subjects | Accuracy; Algorithms; Artificial neural networks; Forgery; Neural networks; Retraining |
title | Detecting Video Inter-Frame Forgeries Based on Convolutional Neural Network Model |
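The abstract above describes the approach only at a high level: retraining ImageNet-pretrained CNN models to exploit spatial-temporal relationships in a video, and replacing the raw network output with a confidence score. The sketch below illustrates that general idea only; the backbone choice (ResNet-50), the use of absolute frame differences as the spatial-temporal input, and the averaging-based confidence score are assumptions made for illustration, not the authors' actual pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a CNN pre-trained on ImageNet and replace its classification
# head with a two-class output (authentic vs. forged frame transition).
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)

# Retrain only the new head here; how many layers to unfreeze is a tuning
# choice the record does not specify.
for p in model.parameters():
    p.requires_grad = False
for p in model.fc.parameters():
    p.requires_grad = True

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()


def frame_differences(frames: torch.Tensor) -> torch.Tensor:
    """Turn a (T, 3, H, W) frame sequence into (T-1, 3, H, W) absolute
    difference images -- an assumed stand-in for the 'spatial-temporal
    relationships' the abstract mentions."""
    return (frames[1:] - frames[:-1]).abs()


def video_confidence_score(frames: torch.Tensor) -> float:
    """Aggregate per-transition softmax outputs into one score per video.
    The abstract proposes a confidence score in place of the raw output;
    averaging the 'forged' probabilities is one assumed realisation."""
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(frame_differences(frames)), dim=1)
    return probs[:, 1].mean().item()


# One illustrative optimisation step on dummy data standing in for labelled
# difference images from authentic and tampered videos.
model.train()
batch = frame_differences(torch.rand(5, 3, 224, 224))   # 4 transitions
labels = torch.randint(0, 2, (batch.shape[0],))          # dummy 0/1 labels
loss = criterion(model(batch), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Toy usage: score a short clip of random frames.
print(f"forgery confidence: {video_confidence_score(torch.rand(8, 3, 224, 224)):.3f}")
```

Under these assumptions, the network is retrained on per-transition inputs, and the per-video decision is obtained by aggregating the frame-level outputs into a single confidence value rather than thresholding any single raw score.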