DeText: A Deep Text Ranking Framework with BERT
Main authors: | , , , , , , , , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Ranking is the most important component in a search system. Most search
systems deal with large amounts of natural language data, hence an effective
ranking system requires a deep understanding of text semantics. Recently, deep
learning based natural language processing (deep NLP) models have generated
promising results on ranking systems. BERT is one of the most successful models
that learn contextual embeddings, and it has been applied to capture complex
query-document relations for search ranking. However, this is generally done by
exhaustively interacting each query word with each document word, which is
inefficient for online serving in search product systems. In this paper, we
investigate how to build an efficient BERT-based ranking model for industry use
cases. The solution is further extended to a general ranking framework, DeText,
which is open sourced and can be applied to various ranking productions. Offline
and online experiments of DeText on three real-world search systems show
significant improvement over state-of-the-art approaches. |
---|---|
DOI: | 10.48550/arxiv.2008.02460 |
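
The efficiency issue the abstract raises, that exhaustively interacting every query word with every document word is too slow for online serving, is commonly avoided with a representation-based ("two-tower") ranker: documents are embedded offline once, and at query time only the query is encoded and scored against the cached vectors. The sketch below illustrates that serving pattern only; it is not the DeText implementation, and the hash-based `embed` function is a deterministic toy stand-in for a BERT-style encoder.

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    """Toy stand-in for a BERT-style encoder: a deterministic,
    hash-seeded unit vector (illustration only)."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "little")
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

# Offline: embed the document corpus once and cache the vectors.
docs = ["deep text ranking", "open source framework", "search systems"]
doc_matrix = np.stack([embed(d) for d in docs])  # shape: (num_docs, dim)

def rank(query: str) -> list[int]:
    """Online: encode only the query, then score all documents
    with a single matrix-vector product (no word-by-word interaction)."""
    q = embed(query)
    scores = doc_matrix @ q  # cosine similarity, since vectors are unit-norm
    return np.argsort(-scores).tolist()  # document indices, best first

print(rank("text ranking"))
```

Because document embeddings are precomputed, online cost is one query encoding plus a matrix-vector product, rather than a full cross-attention pass over every query-document pair.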