Noel Loo
PhD Student at MIT
Verified email at mit.edu

Title · Cited by · Year

Efficient dataset distillation using random feature approximation
N Loo, R Hasani, A Amini, D Rus
Advances in Neural Information Processing Systems 35, 13877-13891, 2022
50 · 2022

Generalized variational continual learning
N Loo, S Swaroop, RE Turner
arXiv preprint arXiv:2011.12328, 2020
49 · 2020

Dataset distillation with convexified implicit gradients
N Loo, R Hasani, M Lechner, D Rus
International Conference on Machine Learning, 22649-22674, 2023
24 · 2023

Evolution of neural tangent kernels under benign and adversarial training
N Loo, R Hasani, A Amini, D Rus
Advances in Neural Information Processing Systems 35, 11642-11657, 2022
7 · 2022

Dataset distillation fixes dataset reconstruction attacks
N Loo, R Hasani, M Lechner, D Rus
arXiv preprint arXiv:2302.01428, 2023
4 · 2023

A real-world evaluation of the implementation of NLP technology in abstract screening of a systematic review
S Perlman-Arrow, N Loo, N Bobrovitz, T Yan, RK Arora
Research Synthesis Methods 14 (4), 608-621, 2023
1 · 2023

Combining Variational Continual Learning with FiLM Layers
N Loo, S Swaroop, RE Turner
4th Lifelong Machine Learning Workshop at ICML 2020, 2020
1 · 2020

On the Size and Approximation Error of Distilled Datasets
A Maalouf, M Tukan, N Loo, R Hasani, M Lechner, D Rus
Advances in Neural Information Processing Systems 36, 2024
2024

Leveraging Low-Rank and Sparse Recurrent Connectivity for Robust Closed-Loop Control
N Tumma, M Lechner, N Loo, R Hasani, D Rus
Causal Representation Learning Workshop at NeurIPS 2023, 2023
2023

On the Size and Approximation Error of Distilled Sets
A Maalouf, M Tukan, N Loo, R Hasani, M Lechner, D Rus
arXiv preprint arXiv:2305.14113, 2023
2023

Understanding Reconstruction Attacks with the Neural Tangent Kernel and Dataset Distillation
N Loo, R Hasani, M Lechner, A Amini, D Rus
arXiv preprint arXiv:2302.01428, 2023
2023