Privacy-Aware Activity Classification from First Person Office Videos
Saved in:
Main Authors: | , , , , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Summary: | With the advent of wearable body cameras, human activity classification from
First-Person Videos (FPV) has become a topic of increasing importance for
various applications, including life-logging, law enforcement, sports, the
workplace, and healthcare. One of the challenging aspects of FPV is its
exposure to potentially sensitive objects within the user's field of view. In
this work, we developed a privacy-aware activity classification system focusing
on office videos. We utilized a Mask R-CNN with an Inception-ResNet hybrid as a
feature extractor for detecting, and then blurring out, sensitive objects (e.g.,
digital screens, human faces, paper) in the videos. For activity
classification, we incorporated an ensemble of Recurrent Neural Networks (RNNs)
with ResNet, ResNeXt, and DenseNet based feature extractors. The proposed
system was trained and evaluated on the 18-class FPV office video dataset made
available through the IEEE Video and Image Processing (VIP) Cup 2019
competition. On the original unprotected FPVs, the proposed activity classifier
ensemble reached an accuracy of 85.078% with precision, recall, and F1 scores
of 0.88, 0.85, and 0.86, respectively. On the privacy-protected videos,
performance degraded slightly, with accuracy, precision, recall, and F1 scores
of 73.68%, 0.79, 0.75, and 0.74, respectively. The presented system won the
3rd prize in the IEEE VIP Cup 2019 competition. |
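The detect-then-blur privacy step described in the summary can be illustrated with a short sketch. The fragment below is not the authors' implementation: it substitutes a COCO-pretrained torchvision Mask R-CNN (ResNet-50 FPN backbone) for the paper's Inception-ResNet-based detector, and uses generic COCO categories (person, tv, laptop, book, cell phone) as rough proxies for the paper's sensitive classes (digital screens, human faces, paper). The score threshold and blur kernel size are arbitrary assumptions.

```python
# Minimal sketch of detect-and-blur on a single video frame, assuming a
# COCO-pretrained torchvision Mask R-CNN as a stand-in for the paper's
# Inception-ResNet-based detector.
import cv2
import numpy as np
import torch
from torchvision.models.detection import (
    maskrcnn_resnet50_fpn,
    MaskRCNN_ResNet50_FPN_Weights,
)

weights = MaskRCNN_ResNet50_FPN_Weights.DEFAULT
model = maskrcnn_resnet50_fpn(weights=weights).eval()
categories = weights.meta["categories"]

# COCO classes used here as rough proxies for the paper's sensitive objects.
SENSITIVE = {"person", "tv", "laptop", "cell phone", "book"}

def blur_sensitive_regions(frame_bgr: np.ndarray, score_thresh: float = 0.5) -> np.ndarray:
    """Detect assumed-sensitive objects in one frame and Gaussian-blur their pixels."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        pred = model([tensor])[0]

    blurred = cv2.GaussianBlur(frame_bgr, (51, 51), 0)  # heavy blur; kernel size is arbitrary
    out = frame_bgr.copy()
    for mask, label, score in zip(pred["masks"], pred["labels"], pred["scores"]):
        if score < score_thresh or categories[int(label)] not in SENSITIVE:
            continue
        region = mask[0].numpy() > 0.5   # binarise the soft instance mask
        out[region] = blurred[region]    # replace masked pixels with their blurred version
    return out
```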
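Similarly, one member of the activity-classification ensemble can be sketched as a per-frame CNN feature extractor followed by an RNN over the frame sequence. The summary does not specify the RNN variant or the fusion rule, so the GRU, the hidden size, and the averaging of softmax scores below are illustrative assumptions; ResNeXt- and DenseNet-based members would follow the same pattern with their own backbones.

```python
# Sketch of one backbone+RNN ensemble member and an assumed score-averaging fusion.
import torch
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights

class FrameSequenceClassifier(nn.Module):
    """CNN features extracted frame by frame, then an RNN over the sequence."""

    def __init__(self, num_classes: int = 18, hidden: int = 512):
        super().__init__()
        backbone = resnet50(weights=ResNet50_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop the FC head
        self.rnn = nn.GRU(input_size=2048, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, 3, H, W) -> 2048-d feature per frame
        b, t = clips.shape[:2]
        feats = self.features(clips.flatten(0, 1)).flatten(1).view(b, t, -1)
        _, h_n = self.rnn(feats)
        return self.head(h_n[-1])        # logits over the 18 activity classes

def ensemble_predict(models, clips):
    """Average softmax scores across backbone-specific members (assumed fusion rule)."""
    with torch.no_grad():
        probs = torch.stack([m(clips).softmax(dim=-1) for m in models]).mean(dim=0)
    return probs.argmax(dim=-1)
```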
DOI: | 10.48550/arxiv.2006.06246 |