Evaluating the Effectiveness of GPT Large Language Model for News Classification in the IPTC News Ontology

Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main authors: Fatemi, Bahareh; Rabbi, Fazle; Opdahl, Andreas L.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: News classification plays a vital role in newsrooms: categorizing news articles is time-consuming and requires domain knowledge. Effective news classification is essential for organizing a constant flow of information and serves as the foundation for subsequent tasks such as news aggregation, monitoring, filtering, and organization. Automating this process can significantly benefit newsrooms by saving time and resources. In this study, we explore the potential of the GPT large language model in a zero-shot setting for multi-class classification of news articles within the widely accepted International Press Telecommunications Council (IPTC) news ontology. The IPTC news ontology provides a structured framework for categorizing news, facilitating the efficient organization and retrieval of news content. By investigating the effectiveness of the GPT language model in this classification task, we aim to understand its capabilities and potential applications in the news domain. This study was conducted as part of our ongoing research in the field of automated journalism.
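
Illustrative sketch (not from the paper): the zero-shot setup described in the abstract can be approximated with a single prompt that lists candidate IPTC categories and asks the model to return exactly one of them. The sketch below assumes the OpenAI Python client (openai>=1.0), the placeholder model name "gpt-4o-mini", and a hand-picked subset of IPTC Media Topic top-level labels; the authors' actual prompt, model version, and category set may differ.

# A minimal sketch of zero-shot news classification against IPTC categories
# with a GPT chat model. Assumptions (not from the paper): the OpenAI Python
# client (openai>=1.0), the placeholder model name "gpt-4o-mini", and an
# illustrative subset of IPTC Media Topic top-level labels.
from openai import OpenAI

# Illustrative subset of IPTC Media Topic top-level categories.
IPTC_TOP_LEVEL = [
    "arts, culture, entertainment and media",
    "crime, law and justice",
    "disaster, accident and emergency incident",
    "economy, business and finance",
    "politics",
    "science and technology",
    "sport",
]

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def classify_article(text: str) -> str:
    """Ask the model to assign exactly one category from IPTC_TOP_LEVEL."""
    prompt = (
        "Classify the following news article into exactly one of these "
        "IPTC categories:\n- " + "\n- ".join(IPTC_TOP_LEVEL)
        + "\n\nArticle:\n" + text
        + "\n\nAnswer with the category name only."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; substitute the model under evaluation
        messages=[{"role": "user", "content": prompt}],
        temperature=0,        # deterministic output simplifies evaluation
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(classify_article("The central bank raised its key interest rate by 0.5 percentage points."))

Setting the temperature to 0 keeps the output deterministic, which simplifies comparing predicted labels against gold IPTC annotations during evaluation.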
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3345414