Insertion-based Decoding with Automatically Inferred Generation Order

Jiatao Gu, Qi Liu, Kyunghyun Cho

Research output: Contribution to journal › Article › peer-review

Abstract

Conventional neural autoregressive decoding commonly assumes a fixed left-to-right generation order, which may be sub-optimal. In this work, we propose a novel decoding algorithm, InDIGO, which supports flexible sequence generation in arbitrary orders through insertion operations. We extend Transformer, a state-of-the-art sequence generation model, to efficiently implement the proposed approach, enabling it to be trained with either a predefined generation order or adaptive orders obtained from beam search. Experiments on four real-world tasks, namely word order recovery, machine translation, image captioning, and code generation, demonstrate that our algorithm can generate sequences following arbitrary orders, while achieving competitive or even better performance compared with conventional left-to-right generation. The generated sequences show that InDIGO adopts adaptive generation orders based on input information.
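To illustrate the idea of insertion-based generation described in the abstract, the following is a minimal, hypothetical sketch in Python. It is not the authors' InDIGO implementation: the `score_fn` policy, the slot convention, and the toy example are assumptions made purely to show how a sequence can be grown by inserting tokens at arbitrary positions rather than only appending on the right.

```python
# Minimal sketch of insertion-based sequence generation (hypothetical; a real
# model such as InDIGO would learn to score (slot, token) pairs with a
# Transformer, whereas here a stub policy illustrates the control flow only).
from typing import Callable, List, Tuple

def insertion_decode(
    score_fn: Callable[[List[str]], Tuple[int, str, bool]],
    max_steps: int = 10,
) -> List[str]:
    """Grow a sequence by inserting one token per step at a chosen slot.

    score_fn takes the partial sequence and returns (slot, token, stop):
    slot i means "insert before position i" (slot == len(seq) appends).
    """
    seq: List[str] = []
    for _ in range(max_steps):
        slot, token, stop = score_fn(seq)
        if stop:
            break
        seq.insert(slot, token)
    return seq

# Toy policy: emit a fixed sentence in a non-left-to-right order
# (verb first, then its arguments), mimicking an adaptive generation order.
def toy_policy(seq: List[str]) -> Tuple[int, str, bool]:
    plan = [(0, "likes"), (0, "she"), (2, "tea"), (3, ".")]
    if len(seq) >= len(plan):
        return 0, "", True
    slot, token = plan[len(seq)]
    return slot, token, False

print(insertion_decode(toy_policy))  # ['she', 'likes', 'tea', '.']
```

In this sketch the generation order is fixed by the toy plan; in the paper's setting the order is either predefined or inferred adaptively per input, e.g. via beam search over insertion positions.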

Original language: English (US)
Pages (from-to): 661-676
Number of pages: 16
Journal: Transactions of the Association for Computational Linguistics
Volume: 7
DOIs
State: Published - 2019

ASJC Scopus subject areas

  • Communication
  • Human-Computer Interaction
  • Linguistics and Language
  • Computer Science Applications
  • Artificial Intelligence
