Towards Exploring Fairness in Visual Transformer based Natural and GAN Image Detection Systems

Image forensics research has recently seen considerable progress in developing computational models that accurately distinguish natural images captured by cameras from GAN-generated images. However, it is equally important to ensure that these computational models are fair and do not produce biased outcomes that could harm certain societal groups or pose serious security threats. Exploring fairness in image forensic algorithms is an initial step towards mitigating such biases. This study explores bias in visual transformer based image forensic algorithms that classify natural and GAN images, since visual transformers are now widely used in image classification tasks, including image forensics. The study procures bias evaluation corpora to analyze bias across gender, racial, affective, and intersectional domains using a wide set of individual and pairwise bias evaluation measures. Since robustness against image compression is an important factor in forensic tasks, the study also analyzes the impact of image compression on model bias, following a two-phase evaluation setting in which experiments are carried out on uncompressed and compressed images. The study identifies the existence of bias in the visual transformer based models distinguishing natural and GAN images, and observes that image compression affects model biases, predominantly amplifying bias in class GAN predictions.

Detailed description

Saved in:
Bibliographic details
Published in: arXiv.org 2024-11
Main authors: Gangan, Manjary P, Kadan, Anoop, Lajish, V L
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator Gangan, Manjary P
Kadan, Anoop
Lajish, V L
description Image forensics research has recently seen considerable progress in developing computational models that accurately distinguish natural images captured by cameras from GAN-generated images. However, it is equally important to ensure that these computational models are fair and do not produce biased outcomes that could harm certain societal groups or pose serious security threats. Exploring fairness in image forensic algorithms is an initial step towards mitigating such biases. This study explores bias in visual transformer based image forensic algorithms that classify natural and GAN images, since visual transformers are now widely used in image classification tasks, including image forensics. The study procures bias evaluation corpora to analyze bias across gender, racial, affective, and intersectional domains using a wide set of individual and pairwise bias evaluation measures. Since robustness against image compression is an important factor in forensic tasks, the study also analyzes the impact of image compression on model bias, following a two-phase evaluation setting in which experiments are carried out on uncompressed and compressed images. The study identifies the existence of bias in the visual transformer based models distinguishing natural and GAN images, and observes that image compression affects model biases, predominantly amplifying bias in class GAN predictions.
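The per-group and pairwise bias measures mentioned in the abstract can be sketched roughly as follows. This is an illustrative outline only, not the authors' implementation: the group labels, the toy predictions, and the use of an absolute accuracy gap as the pairwise measure are all assumptions for the example.

```python
from itertools import combinations

def group_accuracies(preds, labels, groups):
    """Per-group accuracy of a natural-vs-GAN classifier.

    preds/labels use 0 = natural, 1 = GAN; groups gives a demographic
    tag (e.g. gender or race category) for each evaluated image.
    """
    acc = {}
    for g in set(groups):
        idx = [i for i, gr in enumerate(groups) if gr == g]
        correct = sum(preds[i] == labels[i] for i in idx)
        acc[g] = correct / len(idx)
    return acc

def pairwise_gaps(acc):
    """One simple pairwise bias measure: the absolute accuracy gap
    for every pair of demographic groups (0 means no gap)."""
    return {(a, b): abs(acc[a] - acc[b]) for a, b in combinations(sorted(acc), 2)}

# Toy run: a classifier that is systematically worse on group "B".
preds  = [1, 1, 0, 0, 1, 0, 0, 1]
labels = [1, 1, 0, 0, 1, 1, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
acc = group_accuracies(preds, labels, groups)
gaps = pairwise_gaps(acc)
```

In the two-phase setting described above, the same measures would presumably be recomputed after JPEG-compressing the evaluation images (e.g. re-encoding with a chosen quality factor) and the uncompressed and compressed gaps compared.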
format Article
publisher Ithaca: Cornell University Library, arXiv.org
publication date 2024-11-16
rights 2024. This work is published under http://creativecommons.org/licenses/by/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
open access free_for_read
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-11
issn 2331-8422
language eng
recordid cdi_proquest_journals_2878917380
source Free E-Journals
subjects Algorithms
Bias
Evaluation
Image classification
Image compression
Image detection
Transformers
title Towards Exploring Fairness in Visual Transformer based Natural and GAN Image Detection Systems
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-22T02%3A20%3A28IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Towards%20Exploring%20Fairness%20in%20Visual%20Transformer%20based%20Natural%20and%20GAN%20Image%20Detection%20Systems&rft.jtitle=arXiv.org&rft.au=Gangan,%20Manjary%20P&rft.date=2024-11-16&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2878917380%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2878917380&rft_id=info:pmid/&rfr_iscdi=true