SimCPSR: Simple Contrastive Learning for Paper Submission Recommendation System

by Duc H. Le, et al.

Recommendation systems play a vital role in many areas, including academia, where they help researchers select the conference or journal that maximizes the chance their work is accepted. This study proposes a transformer-based model using transfer learning as an efficient approach to paper submission recommendation. By combining essential paper information (such as the title, the abstract, and the list of keywords) with the aims and scopes of journals, the model recommends the Top K journals most likely to accept the paper. Our model was developed in two stages: (i) fine-tuning the pre-trained language model (LM) with a simple contrastive learning framework, using a supervised contrastive objective to update all parameters and encourage the LM to learn effective document representations; and (ii) training the fine-tuned LM on different combinations of the features for the downstream recommendation task. Compared with previous approaches, this method improves the efficiency of paper submission recommendation: using the title, abstract, and keywords as input features, we achieve Top-1, 3, 5, and 10 accuracies of 0.5173, 0.8097, 0.8862, and 0.9496 on the test set. Incorporating the journals' aims and scopes raises these to 0.5194, 0.8112, 0.8866, and 0.9496, respectively.
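The supervised contrastive objective in stage (i) can be sketched roughly as follows. This is a minimal NumPy illustration in the spirit of SupCon-style losses: embeddings of papers accepted by the same journal (same label) are pulled together, all others pushed apart. The abstract does not specify the exact loss, batching, or temperature, so the function name, the temperature value, and the toy data below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.07):
    """Supervised contrastive loss over a batch of document embeddings.
    `embeddings` is (N, d); `labels` is length-N (e.g. journal ids).
    Pairs sharing a label are positives; everything else is a negative."""
    # L2-normalize so dot products become cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature            # scaled pairwise similarities
    np.fill_diagonal(sim, -np.inf)         # exclude self-pairs
    # row-wise log-softmax: each anchor vs. all other samples
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    labels = np.asarray(labels)
    pos_mask = labels[:, None] == labels[None, :]
    np.fill_diagonal(pos_mask, False)
    pos_counts = pos_mask.sum(axis=1)
    # mean log-probability of the positives for each anchor
    per_anchor = np.where(pos_mask, log_prob, 0.0).sum(axis=1) / np.maximum(pos_counts, 1)
    # average over anchors that actually have at least one positive
    return float(-per_anchor[pos_counts > 0].mean())

# Toy check: the loss should be lower when labels match the geometry
emb = np.array([[1.0, 0.0], [0.99, 0.1], [0.0, 1.0], [0.1, 0.99]])
aligned = supervised_contrastive_loss(emb, [0, 0, 1, 1])   # positives are near
shuffled = supervised_contrastive_loss(emb, [0, 1, 0, 1])  # positives are far
assert aligned < shuffled
```

In the paper's pipeline this loss would be computed on LM-encoded titles/abstracts rather than raw 2-D vectors, with gradients flowing back through the encoder.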




