Hyeonmin Ha
Verified email at snu.ac.kr
Title | Cited by | Year
Parallax: Sparsity-aware data parallel training of deep neural networks
S Kim, GI Yu, H Park, S Cho, E Jeong, H Ha, S Lee, JS Jeong, BG Chun
Proceedings of the Fourteenth EuroSys Conference 2019, 1-15, 2019
Cited by 97 · 2019
Refurbish your training data: Reusing partially augmented samples for faster deep neural network training
G Lee, I Lee, H Ha, K Lee, H Hyun, A Shin, BG Chun
2021 USENIX Annual Technical Conference (USENIX ATC 21), 537-550, 2021
Cited by 19 · 2021
Breaking ad-hoc runtime integrity protection mechanisms in android financial apps
T Kim, H Ha, S Choi, J Jung, BG Chun
Proceedings of the 2017 ACM on Asia Conference on Computer and …, 2017
Cited by 14 · 2017
Sumnas: Supernet with unbiased meta-features for neural architecture search
H Ha, JH Kim, S Park, BG Chun
International Conference on Learning Representations, 2021
Cited by 4 · 2021
Two Examples are Better than One: Context Regularization for Gradient-based Prompt Tuning
H Ha, S Jung, J Park, M Seo, S Hwang, BG Chun
Findings of the Association for Computational Linguistics: ACL 2023, 3335-3350, 2023
Cited by 1 · 2023
Survey on Recent Continual Learning Studies in NLP
J Lee, H Ha, BG Chun
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), 999-1001, 2022
Cited by 1 · 2022
Meta-Learning of Prompt Generation for Lightweight Prompt Engineering on Language-Model-as-a-Service
H Ha, J Lee, W Han, BG Chun
The 2023 Conference on Empirical Methods in Natural Language Processing, 2023
2023
A Survey on Memory Optimization Techniques and Frameworks for Training Large Language Models
W Han, T Lee, H Ha, BG Chun
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), 1187-1189, 2022
2022
Question Answering Model for Programming Languages
M Kim, A Shin, G Lee, S Jung, J Lee, H Ha, BG Chun
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), 1395-1397, 2021
2021
A Survey on Pruning and Sharing DNN Parameters for Compression and Acceleration
H Ha, BG Chun
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), 631-633, 2019
2019
Adversarial Example Generation for DNNs with Non-differentiable Objective Functions
H Ha, BG Chun
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), 892-894, 2018
2018
Articles 1–11