TY - GEN
T1 - jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models
T2 - 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020
AU - Pruksachatkun, Yada
AU - Yeres, Phil
AU - Liu, Haokun
AU - Phang, Jason
AU - Htut, Phu Mon
AU - Wang, Alex
AU - Tenney, Ian
AU - Bowman, Samuel R.
N1 - Funding Information:
Subsequent development was possible in part by a donation to NYU from Eric and Wendy Schmidt made by recommendation of the Schmidt Futures program, by support from Intuit Inc., and by support from Samsung Research under the project Improving Deep Learning using Latent Structure. We gratefully acknowledge the support of NVIDIA Corporation with the donation of a Titan V GPU used at NYU in this work. Alex Wang’s work on the project is supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE 1342536. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. Yada Pruksachatkun’s work on the project is supported in part by the Moore-Sloan Data Science Environment as part of the NYU Data Science Services initiative. Sam Bowman’s work on jiant during Summer 2019 took place in his capacity as a visiting researcher at Google.
Publisher Copyright:
© 2020 Association for Computational Linguistics
PY - 2020
Y1 - 2020
AB - We introduce jiant, an open source toolkit for conducting multitask and transfer learning experiments on English NLU tasks. jiant enables modular and configuration-driven experimentation with state-of-the-art models and implements a broad set of tasks for probing, transfer learning, and multitask training experiments. jiant implements over 50 NLU tasks, including all GLUE and SuperGLUE benchmark tasks. We demonstrate that jiant reproduces published performance on a variety of tasks and models, including BERT and RoBERTa. jiant is available at https://jiant.info.
UR - http://www.scopus.com/inward/record.url?scp=85098438732&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098438732&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85098438732
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 109
EP - 117
BT - ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Proceedings of the System Demonstrations
PB - Association for Computational Linguistics (ACL)
Y2 - 5 July 2020 through 10 July 2020
ER -