Sanghyun Kim
Verified email at kaist.ac.kr
Title · Cited by · Year
Towards safe self-distillation of internet-scale text-to-image diffusion models
S Kim, S Jung, B Kim, M Choi, J Shin, J Lee
arXiv preprint arXiv:2307.05977, 2023
Cited by 9 · 2023
A simple yet powerful deep active learning with snapshots ensembles
S Jung, S Kim, J Lee
The Eleventh International Conference on Learning Representations, 2022
Cited by 3 · 2022
Slot-Mixup with Subsampling: A Simple Regularization for WSI Classification
S Keum, S Kim, S Lee, J Lee
arXiv preprint arXiv:2311.17466, 2023
2023
Modeling Uplift from Observational Time-Series in Continual Scenarios
S Kim, J Choi, NH Kim, J Ryu, J Lee
AAAI Bridge Program on Continual Causality, 75-84, 2023
2023