Recursive Multi-Relational Graph Convolutional Network for Automatic Photo Selection
Published in: IEEE Transactions on Multimedia, 2023, Vol. 25, p. 3825-3840
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Automatic Photo Selection (APS) is a fundamental and important task for downstream photo cropping and photo enhancement. As the images in a photo series normally differ only subtly, it remains challenging to surface the best photos among highly similar ones. In this work, we propose a Recursive Multi-Relational Graph Convolutional Network (RMGCN) for APS. Specifically, we explore and devise inner-relation and inter-relation graphs to learn informative representations in a hierarchical manner. 1) The Patch-aware Intra Graph Module (PIGM) captures visual and spatial relations between different patches to characterize the representation of an image. 2) The Context-aware Inter Graph Module (CIGM) explicitly exploits the mutual comparative relation between different images in a photo series. These two graphs recursively refine each other by reasoning over the graph representations. Our model then aggregates the output of CIGM with multi-scale local features via the proposed Cross-domain Fusing Gate (CFG) to boost discriminative ability. In addition, we formulate four companion objectives as soft constraints to improve the convergence rate during training. Extensive experiments are conducted on the photo-triage dataset, and superior results are reported on different metrics compared with state-of-the-art methods. We also perform rigorous ablations and analysis to validate our approach.
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2022.3167309
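To make the abstract's core idea concrete, here is a minimal, dependency-free sketch of one multi-relational graph-convolution step, in the spirit of the intra/inter graph modules described above. All names, the toy graph, and the unweighted mean aggregation are illustrative assumptions, not the authors' actual RMGCN implementation.

```python
def rgcn_step(features, relations):
    """One propagation step of a multi-relational graph convolution:
    for each node, average neighbor features separately under every
    relation, then sum the per-relation results.

    features  : list of feature vectors (one list of floats per node)
    relations : dict mapping relation name -> adjacency list
                {node index: [neighbor indices]}
    """
    n, d = len(features), len(features[0])
    out = [[0.0] * d for _ in range(n)]
    for adj in relations.values():          # e.g. "visual", "spatial"
        for node, neighbors in adj.items():
            if not neighbors:
                continue
            for j in neighbors:             # mean over this relation's neighbors
                for k in range(d):
                    out[node][k] += features[j][k] / len(neighbors)
    return out


# Toy example: 3 "patches" linked by two relations (hypothetical data).
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
rels = {
    "visual":  {0: [1], 1: [0, 2], 2: [1]},
    "spatial": {0: [2], 1: [], 2: [0]},
}
updated = rgcn_step(feats, rels)
```

In the paper's design, such a step would run inside PIGM over patches of one image and inside CIGM over images of one series, with the two modules recursively refining each other's representations; a real implementation would also apply learned per-relation weight matrices and a nonlinearity.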