Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter

05/15/2021
by Wei Liu, et al.

Lexicon information and pre-trained models, such as BERT, have been combined to explore Chinese sequence labeling tasks, owing to their respective strengths. However, existing methods fuse lexicon features only via a shallow, randomly initialized sequence layer and do not integrate them into the bottom layers of BERT. In this paper, we propose Lexicon Enhanced BERT (LEBERT) for Chinese sequence labeling, which integrates external lexicon knowledge directly into BERT layers through a Lexicon Adapter layer. Compared with existing methods, our model facilitates deep lexicon knowledge fusion at the lower layers of BERT. Experiments on ten Chinese datasets spanning three tasks, Named Entity Recognition, Word Segmentation, and Part-of-Speech tagging, show that LEBERT achieves state-of-the-art results.
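As a rough illustration of the idea, the sketch below shows one way a Lexicon Adapter could fuse the embeddings of lexicon words matched at each character position into that character's BERT hidden state between two Transformer layers. The module layout, dimension names, and the bilinear attention form are assumptions for illustration, not the authors' released implementation.

```python
# Minimal PyTorch sketch of a Lexicon Adapter (assumed design, not the
# paper's reference code): project matched-word embeddings into the BERT
# hidden space, attend from each character to its matched words, and add
# the weighted word representation back into the character hidden state.
import torch
import torch.nn as nn

class LexiconAdapter(nn.Module):
    def __init__(self, hidden_dim: int, word_dim: int):
        super().__init__()
        # Non-linear projection of word embeddings into the BERT hidden space.
        self.proj = nn.Sequential(
            nn.Linear(word_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Bilinear char-to-word attention weight (hypothetical choice).
        self.attn = nn.Parameter(torch.empty(hidden_dim, hidden_dim))
        nn.init.xavier_uniform_(self.attn)
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, char_hidden, word_embs, word_mask):
        # char_hidden: (batch, seq, hidden)        BERT hidden states
        # word_embs:   (batch, seq, n_words, word) matched lexicon words
        # word_mask:   (batch, seq, n_words)       1 = real word, 0 = padding
        v = self.proj(word_embs)  # project words into the hidden space
        # Bilinear attention scores between each character and its words.
        scores = torch.einsum("bsh,hk,bsnk->bsn", char_hidden, self.attn, v)
        scores = scores.masked_fill(word_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # Positions with no matched word yield NaN after softmax; zero them.
        weights = torch.nan_to_num(weights)
        fused = torch.einsum("bsn,bsnh->bsh", weights, v)
        # Residual fusion followed by layer normalization.
        return self.norm(char_hidden + fused)
```

In use, such a module would sit between two chosen Transformer layers of BERT, taking that layer's hidden states plus precomputed char-word match features; applying it at a lower layer is what the abstract means by deep fusion, as opposed to stacking a lexicon-aware layer on top of BERT's output.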


Related research

06/29/2020
Improving Sequence Tagging for Vietnamese Text Using Transformer-based Neural Models
This paper describes our study on using multilingual BERT embeddings and ...

10/27/2022
Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling
Boundary information is critical for various Chinese language processing...

04/11/2020
End to End Chinese Lexical Fusion Recognition with Sememe Knowledge
In this paper, we present Chinese lexical fusion recognition, a new task...

09/18/2020
fastHan: A BERT-based Joint Many-Task Toolkit for Chinese NLP
We present fastHan, an open-source toolkit for four basic tasks in Chinese...

03/25/2021
Bertinho: Galician BERT Representations
This paper presents a monolingual BERT model for Galician. We follow the...

03/12/2022
MarkBERT: Marking Word Boundaries Improves Chinese BERT
We present a Chinese BERT model dubbed MarkBERT that uses word information...

01/02/2021
Lex-BERT: Enhancing BERT based NER with lexicons
In this work, we present Lex-BERT, which incorporates the lexicon information...
