Iulia-Raluca Turc
Unknown affiliation
Title
Cited by
Year
Well-read students learn better: On the importance of pre-training compact models
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Cited by 612, 2019
Well-read students learn better: The impact of student initialization on knowledge distillation
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962 13, 2019
Cited by 211, 2019
CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
JH Clark, D Garrette, I Turc, J Wieting
Transactions of the Association for Computational Linguistics 10, 73-91, 2022
Cited by 170, 2022
Pix2Struct: Screenshot parsing as pretraining for visual language understanding
K Lee, M Joshi, IR Turc, H Hu, F Liu, JM Eisenschlos, U Khandelwal, ...
International Conference on Machine Learning, 18893-18912, 2023
Cited by 120, 2023
Measuring attribution in natural language generation models
H Rashkin, V Nikolaev, M Lamm, L Aroyo, M Collins, D Das, S Petrov, ...
Computational Linguistics 49 (4), 777-840, 2023
Cited by 100, 2023
The MultiBERTs: BERT reproductions for robustness analysis
T Sellam, S Yadlowsky, J Wei, N Saphra, A D'Amour, T Linzen, J Bastings, ...
arXiv preprint arXiv:2106.16163, 2021
Cited by 77, 2021
Revisiting the primacy of English in zero-shot cross-lingual transfer
I Turc, K Lee, J Eisenstein, MW Chang, K Toutanova
arXiv preprint arXiv:2106.16171, 2021
Cited by 45, 2021
Well-read students learn better: The impact of student initialization on knowledge distillation. CoRR abs/1908.08962 (2019)
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Cited by 17, 2019
Well-read students learn better: On the importance of pre-training compact models. arXiv 2019
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Cited by 17, 2019
Well-read students learn better: On the importance of pre-training compact models. arXiv: Computation and Language
I Turc, MW Chang, K Lee, K Toutanova
Cited by 6, 2019
High performance natural language processing
G Ilharco, C Ilharco, I Turc, T Dettmers, F Ferreira, K Lee
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
Cited by 3, 2020
Learning task sampling policy for multitask learning
D Sundararaman, H Tsai, KH Lee, I Turc, L Carin
Findings of the Association for Computational Linguistics: EMNLP 2021, 4410-4415, 2021
Cited by 2, 2021
CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Understanding
JH Clark, D Garrette, I Turc, J Wieting
2022
Recurrent Neural Networks for Statistical Machine Translation
IR Turc
University of Oxford, 2014
2014