Jue Hou
Verified email at helsinki.fi
Title | Cited by | Year
Modeling language learning using specialized Elo rating
J Hou, K Maximilian, JMH Quecedo, N Stoyanova, R Yangarber
Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building …, 2019
Cited by 21 · 2019
Applying gamification incentives in the Revita language-learning system
J Hou, I Kylliäinen, A Katinskaia, G Furlan, R Yangarber
Proceedings of the 9th Workshop on Games and Natural Language Processing …, 2022
Cited by 5 · 2022
Semi-automatically annotated learner corpus for Russian
A Katinskaia, M Lebedeva, J Hou, R Yangarber
Proceedings of the Thirteenth Language Resources and Evaluation Conference …, 2022
Cited by 5 · 2022
Projecting named entity recognizers without annotated or parallel corpora
J Hou, M Koppatz, JMH Quecedo, R Yangarber
Nordic Conference on Computational Linguistics, 232-241, 2019
Cited by 4 · 2019
Integration of computer-aided language learning into formal university-level L2 instruction
N Stoyanova, J Hou, R Yangarber, M Kopotev
L'Analisi Linguistica e Letteraria 29 (3), 2021
Cited by 3 · 2021
What do Transformers Know about Government?
J Hou, A Katinskaia, L Kotilainen, S Trangcasanchai, AD Vu, R Yangarber
arXiv preprint arXiv:2404.14270, 2024
2024
Effects of sub-word segmentation on performance of transformer language models
J Hou, A Katinskaia, AD Vu, R Yangarber
arXiv preprint arXiv:2305.05480, 2023
2023
Linguistic Constructs Represent the Domain Model in Intelligent Language Tutoring
A Katinskaia, J Hou, AD Vu, R Yangarber
Proceedings of the 17th Conference of the European Chapter of the …, 2023
2023
Projecting named entity recognizers from resource-rich to resource-poor languages without annotated or parallel corpora
J Hou
University of Helsinki, 2020
2020