Hayeon Lee
Postdoctoral Researcher, Meta AI (FAIR)
Verified email at meta.com - Homepage
Title / Cited by / Year
Learning to balance: Bayesian meta-learning for imbalanced and out-of-distribution tasks
HB Lee*, H Lee*, D Na*, S Kim, M Park, E Yang, SJ Hwang
ICLR 2020, 2019
Cited by 119 · 2019
Help: Hardware-adaptive efficient latency prediction for nas via meta-learning
H Lee, S Lee, S Chong, SJ Hwang
arXiv preprint arXiv:2106.08630, 2021
Cited by 42* · 2021
Rapid neural architecture search by learning to generate graphs from datasets
H Lee*, E Hyung*, SJ Hwang
ICLR 2021, 2021
Cited by 35 · 2021
A survey of supernet optimization and its applications: Spatial and temporal optimization for neural architecture search
S Cha, T Kim, H Lee, SY Yun
arXiv preprint arXiv:2204.03916, 2022
Cited by 13* · 2022
Task-Adaptive Neural Network Search with Meta-Contrastive Learning
W Jeong, H Lee, G Park, E Hyung, J Baek, SJ Hwang
Advances in Neural Information Processing Systems 34, 21310-21324, 2021
Cited by 6 · 2021
Online Hyperparameter Meta-Learning with Hypergradient Distillation
HB Lee, H Lee, J Shin, E Yang, T Hospedales, SJ Hwang
arXiv preprint arXiv:2110.02508, 2021
Cited by 5 · 2021
A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models
H Lee, R Hou, J Kim, D Liang, SJ Hwang, A Min
arXiv preprint arXiv:2305.18239, 2023
Cited by 2 · 2023
DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models
S An, H Lee, J Jo, S Lee, SJ Hwang
arXiv preprint arXiv:2305.16943, 2023
Cited by 2 · 2023
Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets
H Lee, S An, M Kim, SJ Hwang
The Eleventh International Conference on Learning Representations (ICLR), 2023
Cited by 2*
Lightweight Neural Architecture Search with Parameter Remapping and Knowledge Distillation
H Lee, S An, M Kim, SJ Hwang
First Conference on Automated Machine Learning (Late-Breaking Workshop), 2022
Cited by 1 · 2022
Co-training and Co-distillation for Quality Improvement and Compression of Language Models
H Lee, R Hou, J Kim, D Liang, H Zhang, SJ Hwang, A Min
arXiv preprint arXiv:2311.02849, 2023
2023
Articles 1–11