Sublanguage: A Serious Issue Affects Pretrained Models in Legal Domain

04/15/2021
by Ha-Thanh Nguyen, et al.

Legal English is a sublanguage that matters to everyone, yet is not easy for everyone to understand. Pretrained models have become standard practice in current deep learning approaches to a wide range of problems. Applying these models in practice without accounting for the sublanguage of the law would be wasteful, or even dangerous. In this paper, we raise this issue and propose a simple solution by introducing BERTLaw, a pretrained model for the legal sublanguage. The paper's experiments demonstrate the superior effectiveness of the method compared to the baseline pretrained model.

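As a rough illustration of the sublanguage issue, the sketch below uses the Hugging Face transformers library to compare how a general-domain BERT checkpoint and a legal-domain checkpoint tokenize and score the same legal sentence. This is not the authors' code: the path ./bertlaw-checkpoint is a hypothetical placeholder for a model such as BERTLaw, the example sentence is made up, and the averaged token log-likelihood is only a crude proxy for masked-LM fit, not the evaluation reported in the paper.

```python
# Hedged sketch, not the authors' method: contrast a general-domain BERT with a
# domain-specific checkpoint on a legal sentence. "./bertlaw-checkpoint" is a
# hypothetical local path standing in for a legal sublanguage model.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

sentence = "The lessee shall indemnify the lessor against all claims arising hereunder."

for name in ["bert-base-uncased", "./bertlaw-checkpoint"]:  # second entry is a placeholder
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForMaskedLM.from_pretrained(name)
    model.eval()

    # A general-domain tokenizer tends to shred legal terms into many subwords,
    # which is one visible symptom of the sublanguage mismatch.
    print(name, "tokens:", tokenizer.tokenize(sentence))

    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    # Crude proxy for how well the model fits the sentence: average log-likelihood
    # of each observed token (a proper pseudo-perplexity would mask each token in turn).
    log_probs = torch.log_softmax(logits, dim=-1)
    token_ll = log_probs.gather(-1, inputs["input_ids"].unsqueeze(-1)).squeeze(-1)
    print(name, "avg token log-likelihood:", token_ll.mean().item())
```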