Ontological knowledge, which comprises classes and properties and their ...
Large language models (LLMs) have shown impressive ability for open-doma...
Neural QCFG is a grammar-based sequence-to-sequence (seq2seq) model with ...
The MultiCoNER 2 shared task aims to tackle multilingual named entity re...
Deep neural networks based on layer-stacking architectures have historic...
An open knowledge graph (KG) consists of (subject, relation, object) triple...
Ultra-fine entity typing (UFET) predicts extremely free-formed types (e....
Prior works on Information Extraction (IE) typically predict different t...
Multi-modal named entity recognition (NER) and relation extraction (RE) ...
Ultra-fine entity typing (UFET) aims to predict a wide range of type phr...
Hidden Markov Models (HMMs) and Probabilistic Context-Free Grammars (PCF...
Second-order semantic parsing with end-to-end mean-field inference has b...
Nested named entity recognition (NER) has been receiving increasing atte...
The MultiCoNER shared task aims at detecting semantically ambiguous and ...
Recently, Multi-modal Named Entity Recognition (MNER) has attracted a lo...
Constituency parsing and nested named entity recognition (NER) are typic...
Graph-based methods have been popular in dependency parsing for decades. Recen...
We propose a headed span-based method for projective dependency parsing....
This paper describes the system used in submission from SHANGHAITECH tea...
Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar ...
Recent advances in Named Entity Recognition (NER) show that document-lev...
Probabilistic context-free grammars (PCFGs) with neural parameterization...
This paper studies constrained text generation, i.e., generating sen...
The neural linear-chain CRF model is one of the most widely-used approac...
Sequence labeling is a fundamental problem in machine learning, natural ...
Most unsupervised dependency parsers are based on first-order pro...
Knowledge distillation is a critical technique to transfer knowledge bet...
Pretrained contextualized embeddings are powerful word representations f...
In this paper, we propose second-order graph-based neural dependency par...
Building an effective adversarial attacker and elaborating on countermea...
Syntactic dependency parsing is an important task in natural language pr...
Recent work proposes a family of contextual embeddings that significantl...
The linear-chain Conditional Random Field (CRF) model is one of the most...
This paper presents the system used in our submission to the IWPT 2020 S...
This paper presents the system used in our submission to the CoNLL 2019 ...
Multilingual sequence labeling is a task of predicting label sequences u...
Word embedding is an essential building block for deep learning methods ...
Neural models have been investigated for sentiment classification over c...
Semantic dependency parsing aims to identify semantic relationships betw...
Language style transfer is the problem of migrating the content of a sou...
We introduce Latent Vector Grammars (LVeGs), a new framework that extend...
Information Extraction (IE) refers to automatically extracting structure...
Sum-product networks (SPNs) are a class of probabilistic graphical model...
Visual attention, which assigns weights to image regions according to th...
Unsupervised dependency parsing, which tries to discover linguistic depe...
We study the impact of big models (in terms of the degree of lexicalizat...
Unsupervised dependency parsing aims to learn a dependency parser from u...
Probabilistic modeling is one of the foundations of modern machine learn...
Stochastic And-Or grammars (AOG) extend traditional stochastic grammars ...
In many statistical learning problems, the target functions to be optimi...