A Data-Driven Approach for Finding Requirements Relevant Feedback from TikTok and YouTube
Saved in:
Main Authors: , , , , , , ,
Format: Article
Language: English
Keywords:
Online Access: Order full text
Summary: The increasing importance of videos as a medium for engagement, communication, and content creation makes them critical for organizations to consider as a source of user feedback. However, sifting through vast amounts of video content on social media platforms to extract requirements-relevant feedback is challenging. This study examines the potential of TikTok and YouTube, two widely used video-centric social media platforms, for identifying relevant user feedback that may be further refined into requirements through subsequent requirements generation steps. We evaluated the prospect of videos as a source of user feedback by analyzing audio transcripts, visual text, and metadata (i.e., title and description) from 6276 videos of 20 popular products across various industries. We employed state-of-the-art deep-learning transformer-based models and classified 3097 videos as containing requirements-relevant information. We then clustered the relevant videos and found multiple requirements-relevant feedback themes for each of the 20 products; this feedback can later be refined into requirements artifacts. We found that product ratings (feature, design, performance), bug reports, and usage tutorials are persistent themes across the videos. Video-based social media such as TikTok and YouTube can provide valuable user insights, making them a powerful and novel resource for companies to improve customer-centric development.
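The transformer-based classification step mentioned in the summary can be pictured with a short sketch. The snippet below is a minimal illustration, not the authors' reported pipeline: the checkpoint (bert-base-uncased), the binary label mapping (1 = requirements-relevant), and the assumption that the classifier head has already been fine-tuned on labeled video text are all illustrative assumptions, not details taken from the paper.

```python
# A hedged sketch of relevance classification over video-derived text.
# Assumes a fine-tuned binary classifier; bert-base-uncased and the
# label mapping (1 = requirements-relevant) are illustrative choices.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # relevant vs. not relevant
)
model.eval()

def is_relevant(video_text: str) -> bool:
    """Return True if the text is predicted requirements-relevant."""
    # Truncate to the model's 512-token limit; long transcripts would
    # need chunking in a real pipeline.
    inputs = tokenizer(
        video_text, truncation=True, max_length=512, return_tensors="pt"
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1)) == 1

# Example input combining transcript, on-screen text, and metadata,
# mirroring the audio/visual/metadata sources the study analyzed.
text = "Title: honest review. Transcript: the battery drains way too fast."
print(is_relevant(text))
```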
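The subsequent theme-discovery step can be sketched similarly. The snippet below is a hedged illustration assuming sentence embeddings and k-means clustering; the paper does not specify this exact configuration, and the model name, texts, and cluster count are toy values.

```python
# A hedged sketch of clustering relevant videos into feedback themes.
# all-MiniLM-L6-v2 and k=3 are assumptions, not the paper's setup.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

# Text from one product's requirements-relevant videos (toy examples,
# echoing the themes the study reports: ratings, bugs, tutorials).
texts = [
    "The app crashes every time I open the night-mode camera",
    "How to set up sleep tracking step by step",
    "Battery life is excellent and the design feels premium",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = embedder.encode(texts)

# In practice k would be tuned per product (e.g., via silhouette scores).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(embeddings)
for text, label in zip(texts, kmeans.labels_):
    print(f"theme {label}: {text}")
```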
DOI: 10.48550/arxiv.2305.01796