An emerging solution for explaining Transformer-based models is to use v...
In recent years, there has been significant progress in developing pre-t...
Multilingual pre-training significantly improves many multilingual NLP t...
Current pre-trained language models rely on large datasets for achieving...
It has been shown that NLI models are usually biased with respect to the...
There has been a growing interest in interpreting the underlying dynamic...
Human languages are full of metaphorical expressions. Metaphors help peo...
Community Question Answering (CQA) forums provide answers for many real-...
Duplicate question detection (DQD) is important to increase efficiency o...
Pretrained language models have achieved a new state of the art on many ...
We investigate whether example forgetting, a recently introduced measure...
Answering questions is a primary goal of many conversational systems or ...
Knowledge graphs (KGs) represent the world's facts in structured forms. KG c...
Word embeddings typically represent different meanings of a word in a si...
Knowledge bases (KBs) are paramount in NLP. We employ multiview learning...
Embedding models typically associate each word with a single real-valued...
Large-scale knowledge graphs (KGs) such as Freebase are generally incomp...
This paper addresses the problem of corpus-level entity typing, i.e., in...
Entities are essential elements of natural language. In this paper, we p...
In this paper, we address two different types of noise in information ex...
We introduce a new methodology for intrinsic evaluation of word represen...