Tolúlọpẹ́ Ògúnrẹ̀mí
Verified email at stanford.edu
Title · Cited by · Year
Mini but mighty: Efficient multilingual pretraining with linguistically-informed data selection
T Ogunremi, D Jurafsky, CD Manning
Findings of the Association for Computational Linguistics: EACL 2023, 1251-1266, 2023
Cited by 5 · 2023
Decolonizing NLP for “Low-resource Languages”: Applying Abebe Birhane’s Relational Ethics
T Ògúnrẹ̀mí, WO Nekoto, S Samuel
GRACE: Global Review of AI Community Ethics 1 (1), 2023
Cited by 3 · 2023
Automated speech tools for helping communities process restricted-access corpora for language revival efforts
N San, M Bartelds, T Ogunremi, A Mount, R Thompson, M Higgins, ...
arXiv preprint arXiv:2204.07272, 2022
Cited by 2 · 2022
Multilingual self-supervised speech representations improve the speech recognition of low-resource African languages with codeswitching
T Ogunremi, CD Manning, D Jurafsky
arXiv preprint arXiv:2311.15077, 2023
Cited by 1 · 2023
Leveraging Bias in Pre-trained Word Embeddings for Unsupervised Microaggression Detection
T Ògúnrẹ̀mí, V Basile, T Caselli
IJCoL. Italian Journal of Computational Linguistics 8 (8-2), 2022
Cited by 1 · 2022
ÌròyìnSpeech: A multi-purpose Yorùbá Speech Corpus
T Ogunremi, K Tubosun, A Aremu, I Orife, DI Adelani
arXiv preprint arXiv:2307.16071, 2023
2023