Towards Data- and Compute-Efficient Fake-News Detection: An Approach Combining Active Learning and Pre-Trained Language Models

Bibliographic details
Published in: SN Computer Science, 2024-06, Vol. 5 (5), p. 470, Article 470
Main authors: Folino, Francesco, Folino, Gianluigi, Guarascio, Massimo, Pontieri, Luigi, Zicari, Paolo
Format: Article
Language: English
Online access: Full text
Description
Abstract: In today’s digital era, dominated by social media platforms such as Twitter, Facebook, and Instagram, the swift dissemination of misinformation is a significant concern, impacting public sentiment and influencing pivotal global events. Promptly detecting such deceptive content with the help of Machine Learning models is crucial, yet it comes with the challenge of obtaining enough labelled examples to train these models. Impressive performance results were recently achieved by high-capacity pre-trained transformer-based models (e.g., BERT). Still, such models are too data- and compute-demanding for many critical application contexts where memory, time, and energy consumption must be limited. Here, we propose an innovative semi-supervised method for efficient and effective fake-news detection, using a content-oriented classifier based on a small-sized BERT embedder. After fine-tuning this model on the few labelled examples initially available, an iterative Active Learning (AL) process is carried out, in which limited expert feedback is used to acquire more labelled data and improve the model. The proposed method achieves good detection performance with few training samples, reasonably small human intervention, and limited compute/memory costs.
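The abstract describes a pool-based Active Learning loop: start from a handful of labelled examples, train the classifier, query an expert on the most uncertain unlabelled items, and retrain. The following is a minimal sketch of that loop, with several assumptions not taken from the paper: scikit-learn logistic regression over TF-IDF features stands in for the fine-tuned small BERT classifier, the `oracle` function stands in for the human expert, and uncertainty sampling (probability closest to 0.5) is used as the query strategy; the toy texts and labels are invented for illustration only.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus of "news" snippets (1 = fake, 0 = real); purely illustrative.
texts = [
    "shocking miracle cure doctors hate",            # fake
    "celebrity secretly an alien insider says",      # fake
    "government confirms budget for next year",      # real
    "local council approves new school funding",     # real
    "aliens built the pyramids experts claim",       # fake
    "university publishes annual research report",   # real
    "click here to win free money instantly",        # fake
    "city announces road maintenance schedule",      # real
]
true_labels = np.array([1, 1, 0, 0, 1, 0, 1, 0])

vec = TfidfVectorizer()
X = vec.fit_transform(texts)

# Only two examples are labelled at the start; the rest form the pool.
labelled = [0, 2]
pool = [i for i in range(len(texts)) if i not in labelled]

def oracle(i):
    """Stand-in for the human expert providing feedback on item i."""
    return true_labels[i]

clf = LogisticRegression()
for _ in range(3):  # a few AL rounds
    clf.fit(X[labelled], true_labels[labelled])
    # Uncertainty sampling: query the pool item whose predicted
    # fake-probability is closest to 0.5 (most ambiguous).
    probs = clf.predict_proba(X[pool])[:, 1]
    query = pool[int(np.argmin(np.abs(probs - 0.5)))]
    oracle(query)          # expert labels the queried item
    labelled.append(query)
    pool.remove(query)

accuracy = clf.score(X, true_labels)
```

In the paper's setting the classifier would be the small BERT-based model fine-tuned at each round, and the query budget per round would be tuned to keep expert effort low; the loop structure itself is unchanged.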
ISSN: 2661-8907, 2662-995X
DOI: 10.1007/s42979-024-02809-1