Ruminating Reader: Reasoning with Gated Multi-Hop Attention

Yichen Gong, Samuel R. Bowman

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    To answer questions in a machine comprehension (MC) task, a model needs to establish interaction between the question and the context. To address the problem that a single-pass model cannot reflect on and correct its answer, we present Ruminating Reader. Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with the intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that Ruminating Reader outperforms the BiDAF baseline by 2.1 F1 and 2.7 EM. Our analysis shows that different hops of the attention have different responsibilities in selecting answers.
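
    The gated fusion the abstract describes can be sketched roughly as follows. This is an illustrative PyTorch sketch, not the paper's exact formulation: the class name, tensor shapes, and the single-linear-layer gate are assumptions made for the example.

    import torch
    import torch.nn as nn

    class GatedFusion(nn.Module):
        # Hypothetical gate: mixes a first-pass encoding c with a
        # second-pass, query-aware summary s of the same shape.
        def __init__(self, hidden_size: int):
            super().__init__()
            self.gate = nn.Linear(2 * hidden_size, hidden_size)

        def forward(self, c: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
            # z in (0, 1), per position and dimension, controls how much
            # of the original encoding survives versus the new summary.
            z = torch.sigmoid(self.gate(torch.cat([c, s], dim=-1)))
            return z * c + (1 - z) * s

    # Toy usage: batch of 2 contexts, 5 tokens each, hidden size 8.
    fusion = GatedFusion(8)
    c = torch.randn(2, 5, 8)
    s = torch.randn(2, 5, 8)
    fused = fusion(c, s)  # same shape as c; would feed a second attention pass

    The intuition behind a sigmoid gate of this kind is that the model can fall back to the first-pass encoding when the second hop of attention adds nothing useful, which is consistent with the abstract's claim that different hops take on different responsibilities.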

    Original language: English (US)
    Title of host publication: ACL 2018 - Machine Reading for Question Answering, Proceedings of the Workshop
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 1-11
    Number of pages: 11
    ISBN (Electronic): 9781948087391
    State: Published - 2018
    Event: ACL 2018 Workshop on Machine Reading for Question Answering, MRQA 2018 - Melbourne, Australia
    Duration: Jul 19 2018 → …

    Publication series

    Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
    ISSN (Print): 0736-587X

    Conference

    Conference: ACL 2018 Workshop on Machine Reading for Question Answering, MRQA 2018
    Country/Territory: Australia
    City: Melbourne
    Period: 7/19/18 → …

    ASJC Scopus subject areas

    • Computer Science Applications
    • Linguistics and Language
    • Language and Linguistics
