Seongjin Shin
NAVER CLOVA
Verified email at navercorp.com - Homepage
Title · Cited by · Year
On the effect of pretraining corpora on in-context learning by a large-scale language model
S Shin, SW Lee, H Ahn, S Kim, HS Kim, B Kim, K Cho, G Lee, W Park, ...
arXiv preprint arXiv:2204.13509, 2022
Cited by 58 · 2022
Prometheus: Inducing fine-grained evaluation capability in language models
S Kim, J Shin, Y Cho, J Jang, S Longpre, H Lee, S Yun, S Shin, S Kim, ...
arXiv preprint arXiv:2310.08491, 2023
Cited by 26 · 2023
Two-stage textual knowledge distillation for end-to-end spoken language understanding
S Kim, G Kim, S Shin, S Lee
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 24 · 2021
Prometheus: Inducing evaluation capability in language models
S Kim, J Shin, Y Cho, J Jang, S Longpre, H Lee, S Yun, S Shin, S Kim, ...
NeurIPS 2023 Workshop on Instruction Tuning and Instruction Following, 2023
Cited by 5 · 2023
HyperCLOVA X Technical Report
KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ...
arXiv preprint arXiv:2404.01954, 2024
2024