An Efficient Active Learning Framework for New Relation Types

Lisheng Fu, Ralph Grishman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Supervised training of models for semantic relation extraction has yielded good performance, but at substantial cost for the annotation of large training corpora. Active learning strategies can greatly reduce this annotation cost. We present an efficient active learning framework that starts from a better balance between positive and negative samples, and boosts training efficiency by interleaving self-training and co-testing. We also study the reduction of annotation cost achieved by enforcing argument type constraints. Experiments show a substantial speed-up by comparison to the previous state-of-the-art pure co-testing active learning framework. We obtain reasonable performance with only 150 labels for individual ACE 2004 relation types.
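The abstract describes co-testing (querying the annotator on examples where two classifier views disagree) interleaved with self-training (auto-labeling examples on which the views agree with high confidence). The sketch below is a minimal illustration of that loop, not the authors' implementation: the two scikit-learn classifiers standing in for the views, the oracle callback, the confidence threshold, and the pool handling are all assumptions, and the paper's feature views, sample balancing, and argument-type filtering are omitted.

# Minimal sketch (not the authors' code) of an active learning loop that
# interleaves self-training with co-testing. The two scikit-learn models,
# the oracle() callback, and the thresholds are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def interleaved_active_learning(X_lab, y_lab, X_pool, oracle,
                                rounds=10, batch=10, conf_threshold=0.95):
    view_a = LogisticRegression(max_iter=1000)
    view_b = SVC(probability=True)
    pool_idx = np.arange(len(X_pool))

    for _ in range(rounds):
        if len(pool_idx) == 0:
            break
        view_a.fit(X_lab, y_lab)
        view_b.fit(X_lab, y_lab)

        pred_a = view_a.predict(X_pool[pool_idx])
        pred_b = view_b.predict(X_pool[pool_idx])
        conf_a = view_a.predict_proba(X_pool[pool_idx]).max(axis=1)
        conf_b = view_b.predict_proba(X_pool[pool_idx]).max(axis=1)

        # Co-testing step: ask the human oracle about contention points,
        # i.e. pool samples on which the two views disagree.
        # (A real system would rank these; here we simply take the first batch.)
        disagree = pool_idx[pred_a != pred_b][:batch]
        for i in disagree:
            X_lab = np.vstack([X_lab, X_pool[i]])
            y_lab = np.append(y_lab, oracle(i))

        # Self-training step: auto-label samples on which both views agree
        # with high confidence, at no annotation cost.
        agree = (pred_a == pred_b) & (conf_a > conf_threshold) & (conf_b > conf_threshold)
        auto = pool_idx[agree]
        if len(auto):
            X_lab = np.vstack([X_lab, X_pool[auto]])
            y_lab = np.append(y_lab, pred_a[agree])

        # Remove queried and auto-labeled samples from the unlabeled pool.
        pool_idx = np.setdiff1d(pool_idx, np.concatenate([disagree, auto]))

    return view_a, view_b, X_lab, y_lab

In the paper's setting, the unlabeled pool would consist of candidate entity-mention pairs from ACE 2004, pre-filtered by argument type constraints so that fewer implausible negatives ever reach the annotator.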

Original language: English (US)
Title of host publication: 6th International Joint Conference on Natural Language Processing, IJCNLP 2013 - Proceedings of the Main Conference
Editors: Ruslan Mitkov, Jong C. Park
Publisher: Asian Federation of Natural Language Processing
Pages: 692-698
Number of pages: 7
ISBN (Electronic): 9784990734800
State: Published - 2013
Event: 6th International Joint Conference on Natural Language Processing, IJCNLP 2013 - Nagoya, Japan
Duration: Oct 14 2013 → …

Publication series

Name: 6th International Joint Conference on Natural Language Processing, IJCNLP 2013 - Proceedings of the Main Conference

Conference

Conference: 6th International Joint Conference on Natural Language Processing, IJCNLP 2013
Country/Territory: Japan
City: Nagoya
Period: 10/14/13 → …

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
