Abstract
Log-linear models and, more recently, neural network models used for supervised relation extraction require substantial amounts of training data and time, limiting their portability to new relations and domains. To address this, we propose a training representation based on the dependency paths between entities in a dependency tree, which we call lexicalized dependency paths (LDPs). We show that this representation is fast, efficient, and transparent. We further propose representations utilizing entity types and their subtypes to refine our model and alleviate the data sparsity problem. We apply lexicalized dependency paths to supervised learning on the ACE corpus and show that they achieve a performance level similar to other state-of-the-art methods and even surpass them in several categories.
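The core idea of a lexicalized dependency path can be sketched as follows: given a dependency parse, the path between two entity mentions is the sequence of words and dependency arcs connecting them through their lowest common ancestor. The sketch below is illustrative only; the toy parse, the arrow notation, and all function names are assumptions, not the authors' implementation.

```python
# Illustrative sketch of extracting a lexicalized dependency path between
# two tokens in a dependency tree. Names and notation are assumptions,
# not the paper's actual implementation.

def dep_path(tokens, heads, deps, i, j):
    """Lexicalized dependency path between token indices i and j.

    tokens: word forms; heads: head index per token (-1 for the root);
    deps: label of the arc from each token to its head.
    """
    def ancestors(k):
        # chain of indices from k up to the root, inclusive
        chain = [k]
        while heads[k] != -1:
            k = heads[k]
            chain.append(k)
        return chain

    up = ancestors(i)
    down = ancestors(j)
    common = set(up) & set(down)
    # lowest common ancestor = first shared node on i's ancestor chain
    lca = next(k for k in up if k in common)
    left = up[:up.index(lca) + 1]                    # i -> ... -> lca
    right = list(reversed(down[:down.index(lca)]))   # lca's child -> ... -> j

    path = []
    for k in left[:-1]:
        path += [tokens[k], "^" + deps[k]]           # upward arcs
    path.append(tokens[lca])
    for k in right:
        path += ["v" + deps[k], tokens[k]]           # downward arcs
    return " ".join(path)

# Toy parse of "Smith works for Acme":
# "works" is the root; Smith -nsubj-> works, for -prep-> works, Acme -pobj-> for
tokens = ["Smith", "works", "for", "Acme"]
heads = [1, -1, 1, 2]
deps = ["nsubj", "root", "prep", "pobj"]
print(dep_path(tokens, heads, deps, 0, 3))
# -> Smith ^nsubj works vprep for vpobj Acme
```

Replacing the entity words on such a path with their entity types (and subtypes), as the abstract describes, is one way to generalize these paths and reduce sparsity.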
Original language | English (US) |
---|---|
Pages (from-to) | 861-870 |
Number of pages | 10 |
Journal | Computer Systems Science and Engineering |
Volume | 43 |
Issue number | 3 |
DOIs | |
State | Published - 2022 |
Keywords
- Relation extraction
- dependency paths
- lexicalized dependency paths
- rule-based models
- supervised learning
ASJC Scopus subject areas
- Control and Systems Engineering
- Theoretical Computer Science
- General Computer Science