Towards Generalizable Semantic Product Search by Text Similarity Pre-training on Search Click Logs
Saved in:
Main authors: | , , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Tags: | |
Summary: | ECNLP 2022 Recently, semantic search has been successfully applied to e-commerce product
search, and the learned semantic space(s) for query and product encoding are
expected to generalize to unseen queries or products. Yet, whether
generalization can conveniently emerge has not been thoroughly studied in the
domain thus far. In this paper, we examine several general-domain and
domain-specific pre-trained RoBERTa variants and discover that general-domain
fine-tuning does not help generalization, which aligns with the findings of
prior art. Proper domain-specific fine-tuning with clickstream data can lead to
better model generalization, based on a bucketed analysis of a publicly
available manually annotated query-product pair dataset. |
---|---|
DOI: | 10.48550/arxiv.2204.05231 |