Online knowledge distillation for efficient pose estimation. Z Li, J Ye, M Song, Y Huang, Z Pan. Proceedings of the IEEE/CVF International Conference on Computer Vision …, 2021. Cited by 85.
Curriculum temperature for knowledge distillation. Z Li, X Li, L Yang, B Zhao, R Song, L Luo, J Li, J Yang. Proceedings of the AAAI Conference on Artificial Intelligence 37 (2), 1504-1512, 2023. Cited by 42.
Dream-Experiment: a MR user interface with natural multi-channel interaction for virtual experiments. T Luo, M Zhang, Z Pan, Z Li, N Cai, J Miao, Y Chen, M Xu. IEEE Transactions on Visualization and Computer Graphics 26 (12), 3524-3534, 2020. Cited by 26.
Online knowledge distillation via multi-branch diversity enhancement. Z Li, Y Huang, D Chen, T Luo, N Cai, Z Pan. Proceedings of the Asian Conference on Computer Vision, 2020. Cited by 22.
Is synthetic data from diffusion models ready for knowledge distillation? Z Li, Y Li, P Zhao, R Song, X Li, J Yang. arXiv preprint arXiv:2305.12954, 2023. Cited by 8.
Dual teachers for self-knowledge distillation. Z Li, X Li, L Yang, R Song, J Yang, Z Pan. Pattern Recognition, 110422, 2024. Cited by 2*.
PromptKD: unsupervised prompt distillation for vision-language models. Z Li, X Li, X Fu, X Zhang, W Wang, S Chen, J Yang. arXiv preprint arXiv:2403.02781, 2024. Cited by 1.