Linear-Complexity Relaxed Word Mover's Distance with GPU Acceleration
Main authors: , , , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: The amount of unstructured text-based data is growing every day. Querying, clustering, and classifying this big data requires similarity computations across large sets of documents. Whereas low-complexity similarity metrics are available, attention has been shifting towards more complex methods that achieve a higher accuracy. In particular, the Word Mover's Distance (WMD) method proposed by Kusner et al. is a promising new approach, but its time complexity grows cubically with the number of unique words in the documents. The Relaxed Word Mover's Distance (RWMD) method, again proposed by Kusner et al., reduces the time complexity from cubic to quadratic and results in a limited loss in accuracy compared with WMD. Our work contributes a low-complexity implementation of the RWMD that reduces the average time complexity to linear when operating on large sets of documents. Our linear-complexity RWMD implementation, henceforth referred to as LC-RWMD, maps well onto GPUs and can be efficiently distributed across a cluster of GPUs. Our experiments on real-life datasets demonstrate 1) a performance improvement of two orders of magnitude with respect to our GPU-based distributed implementation of the quadratic RWMD, and 2) a performance improvement of three to four orders of magnitude with respect to our distributed WMD implementation that uses GPU-based RWMD for pruning.
DOI: 10.48550/arxiv.1711.07227
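
For readers unfamiliar with the metric being accelerated, below is a minimal NumPy sketch of the quadratic RWMD lower bound of Kusner et al. as summarized in the abstract. The function name `rwmd`, the use of Euclidean distances between word embeddings, and the random example data are illustrative assumptions; this sketch does not reproduce the paper's LC-RWMD algorithm or its GPU-based distributed implementation.

```python
# Quadratic RWMD baseline sketch (not the paper's LC-RWMD or GPU code):
# each document is a bag of word-embedding vectors with normalized
# frequency weights.
import numpy as np

def rwmd(X, w, Y, v):
    """Relaxed Word Mover's Distance between two documents.

    X: (n, d) embeddings of the n unique words in document 1
    w: (n,)   normalized word frequencies of document 1 (sums to 1)
    Y: (m, d) embeddings of the m unique words in document 2
    v: (m,)   normalized word frequencies of document 2 (sums to 1)
    """
    # Pairwise Euclidean distances between all word pairs: O(n * m * d).
    diff = X[:, None, :] - Y[None, :, :]
    D = np.linalg.norm(diff, axis=-1)          # shape (n, m)

    # Relaxation 1: move each word of doc 1 to its nearest word in doc 2.
    lb1 = np.dot(w, D.min(axis=1))
    # Relaxation 2: move each word of doc 2 to its nearest word in doc 1.
    lb2 = np.dot(v, D.min(axis=0))

    # RWMD is the tighter (larger) of the two lower bounds on WMD.
    return max(lb1, lb2)

# Usage example with random 50-dimensional embeddings (placeholder data).
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(6, 50)), rng.normal(size=(4, 50))
w = np.full(6, 1 / 6)
v = np.full(4, 1 / 4)
print(rwmd(X, w, Y, v))
```

Computing this bound for one document pair already costs O(n·m) distance evaluations, which is the quadratic behaviour the paper's LC-RWMD avoids on average when one query is compared against a large database of documents.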