SimCPSR: Simple Contrastive Learning for Paper Submission Recommendation System
Saved in:
Main Authors: | , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Abstract: | The recommendation system plays a vital role in many areas,
especially academic fields, where it supports researchers in selecting a
conference or journal and increases the chance that their work is accepted.
This study proposes a transformer-based model that uses transfer learning as
an efficient approach to paper submission recommendation. By combining
essential information (such as the title, the abstract, and the list of
keywords) with the aims and scope of journals, the model can recommend the Top
K journals that maximize the paper's chance of acceptance. Our model was
developed in two stages: (i) fine-tuning the pre-trained language model (LM)
with a simple contrastive learning framework, using a supervised contrastive
objective to update all parameters and encourage the LM to learn effective
document representations; and (ii) training the fine-tuned LM on different
combinations of the features for the downstream task. This study offers a more
effective method for paper submission recommendation than previous approaches:
using the title, abstract, and keywords as input features, we achieve Top 1,
3, 5, and 10 accuracies of 0.5173, 0.8097, 0.8862, and 0.9496 on the test set.
Incorporating the journals' aims and scope further improves the results to
0.5194, 0.8112, 0.8866, and 0.9496, respectively. |
DOI: | 10.48550/arxiv.2205.05940 |
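The abstract describes the two stages only at a high level. The sketch below shows how they might look in code, assuming a Hugging Face encoder with mean pooling, a SupCon-style supervised contrastive objective (Khosla et al.), and cosine-similarity retrieval in place of the paper's downstream training step. The checkpoint name, temperature, and the helpers `embed`, `supcon_loss`, and `recommend_top_k` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Illustrative backbone; the paper fine-tunes a pre-trained LM, but the
# exact checkpoint is an assumption here.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
encoder = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(texts):
    """Mean-pool token states into one vector per document."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

def supcon_loss(z, labels, temperature=0.05):
    """Stage (i): supervised contrastive objective -- papers that share a
    journal label are pulled together, all other pairs pushed apart."""
    z = F.normalize(z, dim=1)
    sim = (z @ z.T) / temperature                          # (B, B)
    eye = torch.eye(len(z), dtype=torch.bool)
    sim = sim.masked_fill(eye, float("-inf"))              # drop self-pairs
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    # mean log-probability over each anchor's positives; anchors without
    # any positive in the batch are excluded from the loss
    per_anchor = -log_prob.masked_fill(~pos, 0.0).sum(1) / pos.sum(1).clamp(min=1)
    return per_anchor[pos.any(1)].mean()

def recommend_top_k(paper_text, journal_texts, k=10):
    """Stage (ii), simplified to retrieval: rank journals by cosine
    similarity between the paper's combined features and each journal's
    aims-and-scope embedding, returning the Top K indices."""
    with torch.no_grad():
        p = F.normalize(embed([paper_text]), dim=1)        # (1, H)
        j = F.normalize(embed(journal_texts), dim=1)       # (J, H)
    scores = (p @ j.T).squeeze(0)                          # (J,)
    return scores.topk(min(k, len(journal_texts))).indices.tolist()
```

In this reading, fine-tuning applies `supcon_loss` to batches of `embed(...)` outputs with journal indices as labels, and the abstract's feature combinations (title, abstract, keywords, and optionally the journals' aims and scope) correspond to concatenating those fields into the input text.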