Although pre-trained language models (PLMs) have recently advanced the r...
Large language models (LLMs), like ChatGPT, have shown some human-like c...
Chain-of-thought prompting (CoT) and tool augmentation have been validat...
Recently, much Chinese text error correction work has focused on Chinese...
In natural language processing, pre-trained language models have become ...
Understanding mathematical questions effectively is a crucial task, whic...
Pre-trained language models achieve superior performance, but they are c...
Pre-trained Language Models (PLMs) have become a representative foundation ...
In this paper, we present an overview of the CTC 2021, a Chinese text co...
This paper aims to advance the mathematical intelligence of machines by ...
Multilingual pre-trained language models (MPLMs) not only can handle tas...
Knowledge graph embedding (KGE) models learn the representation of entit...
Adversarial training (AT) as a regularization method has proved its effe...
Multilingual pre-trained models have achieved remarkable transfer perfor...
Achieving human-level performance on some of Machine Reading Comprehensi...
Retrieving information from correlative paragraphs or documents to answe...
Computerized Adaptive Testing (CAT) is emerging as a promising testing a...
With the proliferation of various Pre-trained Language Models (PLMs), Machine...
Most pre-trained language models (PLMs) construct word representations a...
Machine Reading Comprehension (MRC) is an important testbed for evaluati...
Bidirectional Encoder Representations from Transformers (BERT) has shown...
Human conversations contain many types of information, e.g., knowledge, ...
Owing to the continuous contributions by the Chinese NLP community, more...
Recently, many works have attempted to model texts as graph structures and intro...
In this paper, we introduce TextBrewer, an open-source knowledge distill...
We present a Chinese judicial reading comprehension (CJRC) dataset which...
Story Ending Prediction is a task that requires selecting an appropriate en...
Recurrent Neural Networks (RNNs) are known as powerful models for handlin...
We consider the importance of different utterances in the context for se...
Though the community has made great progress on Machine Reading Comprehe...
Understanding learning materials (e.g., test questions) is a crucial issu...
Adaptive learning, also known as adaptive teaching, relies on learning p...
Machine Reading Comprehension (MRC) with multiple-choice questions requi...
Machine Reading Comprehension (MRC) has become enormously popular recent...
This paper describes the system that achieved state-of-the-art results a...
Cloze-style queries are representative problems in reading comprehension...
Reading comprehension has seen a boom in recent NLP research. Sev...
Most existing approaches to zero pronoun resolution rely heavily on...
Artificial neural networks are powerful models, which have been widely a...