Clara Na
Language Technologies Institute, Carnegie Mellon University
Verified email at cs.cmu.edu - Homepage
Title · Cited by · Year
Train Flat, Then Compress: Sharpness-Aware Minimization Learns More Compressible Models
C Na, SV Mehta, E Strubell
arXiv preprint arXiv:2205.12694, 2022
Cited by 16 · 2022
The Framework Tax: Disparities Between Inference Efficiency in NLP Research and Deployment
J Fernandez, J Kahn, C Na, Y Bisk, E Strubell
arXiv preprint arXiv:2302.06117, 2023
Cited by 2 · 2023
To Build Our Future, We Must Know Our Past: Contextualizing Paradigm Shifts in Natural Language Processing
S Gururaja, A Bertsch, C Na, DG Widder, E Strubell
arXiv preprint arXiv:2310.07715, 2023
Cited by 1 · 2023
Energy and Carbon Considerations of Fine-Tuning BERT
X Wang, C Na, E Strubell, S Friedler, S Luccioni
arXiv preprint arXiv:2311.10267, 2023
2023
Less Is More? In Patents, Design Transformations that Add Occur More Often Than Those that Subtract
K Stenger, C Na, L Klotz
Design Computing and Cognition '20, 283-295, 2022
2022