Fine-tuning Pretrained Multilingual BERT Model for Indonesian Aspect-based Sentiment Analysis

03/05/2021
by   Annisa Nurul Azhar, et al.

Although previous research on Aspect-based Sentiment Analysis (ABSA) for Indonesian reviews in the hotel domain has been conducted using CNN and XGBoost, the resulting model did not generalize well on test data, and a high number of OOV words contributed to misclassification cases. Nowadays, most state-of-the-art results for a wide array of NLP tasks are achieved by utilizing pretrained language representations. In this paper, we incorporate one of the foremost language representation models, BERT, to perform ABSA on an Indonesian review dataset. By combining multilingual BERT (m-BERT) with a task transformation method, we achieve a significant improvement of 8% compared to the result from our previous study.
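To illustrate the general approach, the sketch below shows how ABSA can be cast as sentence-pair classification and fine-tuned on top of m-BERT using the Hugging Face transformers library. This is a minimal illustration under assumed choices: the checkpoint name, label set, aspect prompt, and example review are hypothetical and not taken from the paper.

```python
# Minimal sketch: fine-tuning multilingual BERT for ABSA framed as
# sentence-pair classification. Model name, labels, and examples are
# illustrative assumptions, not the paper's exact setup.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"   # m-BERT checkpoint
LABELS = ["negative", "neutral", "positive"]  # hypothetical label set

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

# Task transformation: pair each review with an auxiliary sentence naming
# the aspect, so ABSA becomes a standard sentence-pair classification task.
review = "Kamar bersih tapi AC tidak dingin."  # "Room is clean but the AC is not cold."
aux_sentence = "aspek: fasilitas kamar"        # hypothetical aspect prompt

encoding = tokenizer(
    review, aux_sentence,
    truncation=True, padding="max_length", max_length=128,
    return_tensors="pt",
)
label = torch.tensor([0])  # e.g. "negative" sentiment toward this aspect

# Single fine-tuning step; in practice this runs inside a DataLoader loop.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**encoding, labels=label)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In this framing, the classifier head on top of the [CLS] token scores the sentiment of the review with respect to the aspect named in the auxiliary sentence, which is what allows a standard sequence-classification fine-tuning loop to cover aspect-level labels.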

