AraBART: a Pretrained Arabic Sequence-to-Sequence Model for Abstractive Summarization

Moussa Kamal Eddine, Nadi Tomeh, Nizar Habash, Joseph Le Roux, Michalis Vazirgiannis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
