Evaluating BERT-based Scientific Relation Classifiers for Scholarly Knowledge Graph Construction on Digital Library Collections

by Ming Jiang et al.

The rapid growth of research publications has placed great demands on digital libraries (DLs) for advanced information management technologies. To meet these demands, techniques relying on knowledge-graph structures are being advocated. In such graph-based pipelines, inferring semantic relations between related scientific concepts is a crucial step. Recently, BERT-based pre-trained models have been widely explored for automatic relation classification. Despite significant progress, most of these models were evaluated under different scenarios, which limits their comparability. Furthermore, existing methods are primarily evaluated on clean texts, ignoring the digitization context of early scholarly publications, which were machine-scanned and processed with optical character recognition (OCR). In such cases, the texts may contain OCR noise, creating uncertainty about existing classifiers' performance. To address these limitations, we first created OCR-noisy texts based on three clean corpora. Given these parallel corpora, we conducted a thorough empirical evaluation of eight BERT-based classification models, focusing on three factors: (1) BERT variants; (2) classification strategies; and (3) OCR noise impacts. Experiments on clean data show that the domain-specific pre-trained BERT is the best variant for identifying scientific relations. In general, the strategy of predicting a single relation at a time outperforms the one that identifies multiple relations simultaneously. The optimal classifier's performance can decline by around 10% on OCR-noisy texts. The evaluations discussed in this study can help DL stakeholders select techniques for building optimal knowledge-graph-based systems.
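The idea of deriving OCR-noisy counterparts from clean corpora can be sketched as simple character-level perturbation. The confusion table and noise rate below are illustrative assumptions for demonstration only, not the paper's actual noise-generation procedure:

```python
import random

# Common OCR confusion pairs (illustrative; real OCR error models are
# typically derived from aligning scanned pages with ground-truth text).
OCR_CONFUSIONS = {
    "l": "1", "1": "l", "O": "0", "0": "O",
    "m": "rn", "S": "5", "B": "8", "e": "c",
}

def add_ocr_noise(text, noise_rate=0.05, seed=42):
    """Replace a fraction of confusable characters with OCR-style misreadings."""
    rng = random.Random(seed)
    out = []
    for ch in text:
        if ch in OCR_CONFUSIONS and rng.random() < noise_rate:
            out.append(OCR_CONFUSIONS[ch])
        else:
            out.append(ch)
    return "".join(out)

clean = "The model classifies relations between scientific concepts."
noisy = add_ocr_noise(clean, noise_rate=0.3)
```

Applying such a function to each sentence of a clean corpus yields a parallel noisy corpus, so that classifier performance can be compared on identical content under clean and degraded conditions.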



