Exploring Transformers in Emotion Recognition: a comparison of BERT, DistilBERT, RoBERTa, XLNet and ELECTRA
This paper investigates how Natural Language Understanding (NLU) can be applied to Emotion Recognition, a specific task in affective computing. We fine-tuned several transformer language models (BERT, DistilBERT, RoBERTa, XLNet, and ELECTRA) on a fine-grained emotion dataset and evaluated them in terms of performance (F1-score) and time to complete training.
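The abstract does not include the authors' code; the following is a minimal sketch of the kind of fine-tuning and evaluation loop it describes, assuming the Hugging Face `transformers` and `datasets` libraries, the public `emotion` dataset as a stand-in for the paper's fine-grained emotion data, and macro F1 as the reported metric. Model names, hyperparameters, and the dataset choice are illustrative assumptions, not the paper's actual setup.

```python
# Sketch: fine-tune each transformer on an emotion-classification dataset and
# report macro F1. Hyperparameters and the dataset are illustrative assumptions.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAMES = [
    "bert-base-uncased",
    "distilbert-base-uncased",
    "roberta-base",
    "xlnet-base-cased",
    "google/electra-base-discriminator",
]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    if isinstance(logits, tuple):  # some models (e.g. XLNet) return extra outputs
        logits = logits[0]
    preds = np.argmax(logits, axis=-1)
    return {"f1_macro": f1_score(labels, preds, average="macro")}

def finetune(model_name, dataset, num_labels):
    tokenizer = AutoTokenizer.from_pretrained(model_name)

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=128)

    encoded = dataset.map(tokenize, batched=True)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=num_labels)
    args = TrainingArguments(
        output_dir=f"out/{model_name.replace('/', '_')}",
        num_train_epochs=3,
        per_device_train_batch_size=16,
        evaluation_strategy="epoch",
    )
    trainer = Trainer(model=model, args=args,
                      train_dataset=encoded["train"],
                      eval_dataset=encoded["validation"],
                      compute_metrics=compute_metrics)
    trainer.train()
    return trainer.evaluate()

if __name__ == "__main__":
    # "emotion" (6 classes) is a stand-in; the paper's dataset may differ.
    ds = load_dataset("emotion")
    num_labels = ds["train"].features["label"].num_classes
    for name in MODEL_NAMES:
        print(name, finetune(name, ds, num_labels))
```

Timing each call to `finetune` (e.g. with `time.perf_counter`) would reproduce the second axis of comparison mentioned in the abstract, the time each model takes to complete training.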