Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

08/03/2022
by Avi Shmidman, et al.

We present a new pre-trained language model (PLM) for Rabbinic Hebrew, termed BEREL (BERT Embeddings for Rabbinic-Encoded Language). While other PLMs exist for processing Hebrew texts (e.g., HeBERT, AlephBERT), they are all trained on Modern Hebrew, which diverges substantially from Rabbinic Hebrew in its lexicographical, morphological, syntactic, and orthographic norms. We demonstrate the superiority of BEREL on Rabbinic texts via a challenge set of Hebrew homographs. We release the new model and homograph challenge set for unrestricted use.
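Since BEREL follows the standard BERT architecture and is released for unrestricted use, it can presumably be queried like any other masked language model. Below is a minimal sketch using the Hugging Face transformers fill-mask pipeline; the model identifier dicta-il/BEREL and the example sentence are illustrative assumptions, not details taken from the paper.

# A minimal sketch of querying a BERT-style masked language model such
# as BEREL through the Hugging Face transformers fill-mask pipeline.
# Assumptions: the model is hosted on the Hugging Face Hub under the
# identifier "dicta-il/BEREL" (hypothetical here; substitute the name
# the authors actually publish under), and the input sentence is an
# invented illustration, not an item from the paper's challenge set.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dicta-il/BEREL")

# Mask one token and let the model rank candidate completions; scoring
# competing readings this way is one plausible use of a homograph
# challenge set with a masked language model.
predictions = fill_mask("בעניין זה [MASK] חכמים")
for p in predictions:
    print(f"{p['token_str']}\t{p['score']:.4f}")

Each prediction is a dict containing the candidate token string, its probability score, and the completed sequence, so competing homograph readings can be compared directly by score.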

Related research

11/28/2022
Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All
We present a new pre-trained language model (PLM) for modern Hebrew, ter...

04/15/2021
SINA-BERT: A pre-trained Language Model for Analysis of Medical Texts in Persian
We have released Sina-BERT, a language model pre-trained on BERT (Devlin...

08/31/2023
DictaBERT: A State-of-the-Art BERT Suite for Modern Hebrew
We present DictaBERT, a new state-of-the-art pre-trained BERT model for ...

03/29/2020
Abstractive Text Summarization based on Language Model Conditioning and Locality Modeling
We explore to what extent knowledge about the pre-trained language model...

11/10/2022
BERT in Plutarch's Shadows
The extensive surviving corpus of the ancient scholar Plutarch of Chaero...

07/28/2021
MWP-BERT: A Strong Baseline for Math Word Problems
Math word problem (MWP) solving is the task of transforming a sequence o...

05/29/2021
Constructing Flow Graphs from Procedural Cybersecurity Texts
Following procedural texts written in natural languages is challenging. ...
