EDIR: Efficient Distributed Image Retrieval of Novel Objects in Mobile Networks
Saved in:
| Field | Value |
| --- | --- |
| Published in | IEEE Transactions on Mobile Computing, 2024-03, Vol. 23 (3), pp. 2337-2350 |
| Main authors | , , , |
| Format | Magazine article |
| Language | eng |
| Online access | Order full text |
| Abstract | Crowdsourcing data collection from a network of mobile devices is useful in various applications. Mobile devices store a large amount of visual data that can aid in different application scenarios. Trained Convolutional Neural Networks (CNNs) can be deployed on mobile devices to be used in searching for objects of interest. Querying for novel objects, for which models have not been trained yet, presents some unique challenges. When novel objects are queried, new models must be trained and distributed to all edge devices. In this article, we propose an efficient method and a system, called EDIR, which enables answering these queries while taking into account the bandwidth limitations encountered in wireless networks, as well as the limited energy and computational power on mobile devices. Through extensive experimentation, we show that using distance-based classifiers, specifically those relying on the cosine distance, leads to more efficient utilization of network resources by reducing the number of false positives. We perform an analysis that enables the requester to tune the parameters of interest before issuing the query, and validate our theoretical results. EDIR reduces the amount of transferred data by more than 45% compared to other approaches while simultaneously achieving a good F1 score. |
| ISSN | 1536-1233, 1558-0660 |
| DOI | 10.1109/TMC.2023.3257268 |
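
The abstract's central efficiency claim is that a cosine-distance classifier lets each device decide locally whether a candidate image matches the queried novel object, so false positives are filtered before anything is uploaded. The paper's actual pipeline is not reproduced here; as an illustration only, the following minimal NumPy sketch shows prototype-based cosine matching over CNN feature embeddings. The names `build_prototype` and `matches_query` and the threshold `tau` are hypothetical, and the random vectors stand in for embeddings a deployed CNN would produce.

```python
import numpy as np

def l2_normalize(v: np.ndarray) -> np.ndarray:
    """Project a vector onto the unit sphere so dot products equal cosine similarity."""
    return v / np.linalg.norm(v)

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine distance in [0, 2]; 0 means identical direction."""
    return 1.0 - float(np.dot(l2_normalize(a), l2_normalize(b)))

def build_prototype(support_embeddings: np.ndarray) -> np.ndarray:
    """Average the few available embeddings of the novel object into one class prototype."""
    return l2_normalize(support_embeddings.mean(axis=0))

def matches_query(embedding: np.ndarray, prototype: np.ndarray, tau: float = 0.3) -> bool:
    """Report an image to the requester only if its embedding lies close to the prototype.

    tau is an assumed threshold: tightening it trades recall for fewer false
    positives, and hence less data transmitted over the wireless network.
    """
    return cosine_distance(embedding, prototype) <= tau

# Toy usage: in EDIR's setting these embeddings would come from a CNN on the device.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 128))                    # few examples of the novel object
prototype = build_prototype(support)
near_duplicate = support[0] + 0.05 * rng.normal(size=128)
print(matches_query(near_duplicate, prototype))        # True: would be transmitted
print(matches_query(rng.normal(size=128), prototype))  # likely False: filtered locally
```

In this framing, the threshold plays the role of the tunable parameter the abstract mentions: the requester picks it before issuing the query to balance the F1 score against the volume of data pulled over the bandwidth-limited network.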