DSC IIT-ISM at SemEval-2020 Task 6: Boosting BERT with Dependencies for Definition Extraction

09/17/2020
by Aadarsh Singh, et al.

We explore the performance of Bidirectional Encoder Representations from Transformers (BERT) on definition extraction. We further propose a joint model of BERT and a Text Level Graph Convolutional Network to incorporate dependencies into the model. Our proposed model outperforms BERT and achieves results comparable to BERT with a fine-tuned language model on DeftEval (Task 6 of SemEval 2020), a shared task whose Subtask 1 asks systems to classify whether a sentence contains a definition or not.
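The abstract describes the architecture only at a high level. As a rough illustration of how BERT token representations might be combined with a graph convolution over a sentence's dependency parse for Subtask 1 style sentence classification, consider the PyTorch sketch below. The class name, the single graph-convolution layer, the pooling choices, and the precomputed adjacency input are all illustrative assumptions, not the authors' released implementation.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class BertWithDependencyGCN(nn.Module):
        """Minimal sketch: BERT token states mixed with one graph-convolution
        step over dependency edges, then pooled for sentence classification.
        This is an illustration of the general idea, not the paper's exact
        Text Level GCN."""

        def __init__(self, hidden=768, num_labels=2):
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            self.gcn = nn.Linear(hidden, hidden)          # one GCN layer weight
            self.classifier = nn.Linear(2 * hidden, num_labels)

        def forward(self, input_ids, attention_mask, adj):
            # adj: (batch, seq, seq) normalized adjacency built from each
            # sentence's dependency parse (assumed precomputed, e.g. with spaCy).
            states = self.bert(input_ids,
                               attention_mask=attention_mask).last_hidden_state
            # Propagate each token's state to its dependency neighbours.
            graph = torch.relu(self.gcn(torch.bmm(adj, states)))
            pooled_bert = states[:, 0]         # [CLS] sentence representation
            pooled_graph = graph.mean(dim=1)   # mean of graph-enriched tokens
            return self.classifier(torch.cat([pooled_bert, pooled_graph], dim=-1))

In a setup like this, the adjacency matrix would typically be symmetrized and row-normalized, with self-loops added, so that each token averages over itself and its parse neighbours; those preprocessing details are likewise assumptions here.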


Related research

05/05/2019  Investigating the Successes and Failures of BERT for Passage Re-Ranking
The bidirectional encoder representations from transformers (BERT) model...

03/31/2021  Defx at SemEval-2020 Task 6: Joint Extraction of Concepts and Relations for Definition Extraction
Definition Extraction systems are a valuable knowledge source for both h...

05/12/2021  Better than BERT but Worse than Baseline
This paper compares BERT-SQuAD and Ab3P on the Abbreviation Definition I...

11/11/2019  Understanding BERT performance in propaganda analysis
In this paper, we describe our system used in the shared task for fine-g...

01/31/2022  Learning affective meanings that derives the social behavior using Bidirectional Encoder Representations from Transformers
Predicting the outcome of a process requires modeling the system dynamic...

05/02/2023  Cancer Hallmark Classification Using Bidirectional Encoder Representations From Transformers
This paper presents a novel approach to accurately classify the hallmark...

08/21/2022  A Syntax Aware BERT for Identifying Well-Formed Queries in a Curriculum Framework
A well formed query is defined as a query which is formulated in the man...
