Abstract
Conversational machine comprehension requires understanding the conversation history, such as previous question/answer pairs, the document context, and the current question. To enable traditional, single-turn models to encode the history comprehensively, we introduce FLOW, a mechanism that incorporates intermediate representations generated during the process of answering previous questions, through an alternating parallel processing structure. Compared to approaches that concatenate previous questions/answers as input, FLOW integrates the latent semantics of the conversation history more deeply. Our model, FLOWQA, shows superior performance on two recently proposed conversational challenges (+7.2% F1 on CoQA and +4.0% on QuAC). The effectiveness of FLOW also extends to other tasks: by reducing sequential instruction understanding to conversational machine comprehension, FLOWQA outperforms the best models on all three domains of SCONE, with +1.8% to +4.4% improvements in accuracy.
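As a rough illustration of the mechanism the abstract describes, the sketch below runs a recurrence *across dialog turns* at each document position, so intermediate representations produced while answering earlier questions feed into the current turn. This is a minimal sketch, not the authors' implementation: the module name, tensor shapes, the choice of a GRU, and the concatenation-based fusion at the end are all our assumptions.

```python
import torch
import torch.nn as nn

class FlowLayer(nn.Module):
    """Illustrative FLOW-style layer (assumed design, not the paper's code).

    Given one contextual encoding of the document per dialog turn, apply a
    recurrent unit along the turn axis at every document position, letting
    hidden states from earlier answering steps flow into later turns.
    """
    def __init__(self, hidden_size: int):
        super().__init__()
        # Recurrence over turns; input is the per-token hidden state at each turn.
        self.turn_rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, ctx: torch.Tensor) -> torch.Tensor:
        # ctx: (num_turns, doc_len, hidden) -- one document encoding per turn.
        # "Alternating parallel processing": treat each document position as a
        # batch element and process all positions' turn-sequences in parallel.
        per_position = ctx.permute(1, 0, 2)       # (doc_len, num_turns, hidden)
        flow, _ = self.turn_rnn(per_position)     # recurrence along the turn axis
        return flow.permute(1, 0, 2)              # back to (num_turns, doc_len, hidden)

# Usage sketch: fuse the flow output with the original per-turn encodings
# before the usual answer-span prediction of a single-turn reader.
encodings = torch.randn(4, 300, 128)             # 4 turns, 300-token document
flow = FlowLayer(128)(encodings)
fused = torch.cat([encodings, flow], dim=-1)     # (4, 300, 256)
```

The point of batching over document positions is efficiency: the turn-axis recurrence runs in parallel across the whole document, alternating with the standard per-turn context encoding rather than re-reading the concatenated history for every question.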
Original language | English (US) |
---|---|
State | Published - 2019 |
Event | 7th International Conference on Learning Representations, ICLR 2019 - New Orleans, United States. Duration: May 6, 2019 → May 9, 2019 |
Conference
Conference | 7th International Conference on Learning Representations, ICLR 2019 |
---|---|
Country/Territory | United States |
City | New Orleans |
Period | 5/6/19 → 5/9/19 |
ASJC Scopus subject areas
- Education
- Computer Science Applications
- Linguistics and Language
- Language and Linguistics