Transformers are sample-efficient world models. V Micheli, E Alonso, F Fleuret. International Conference on Learning Representations (ICLR), Notable top 5%, 2023. Cited by 88.
On the importance of pre-training data volume for compact language models. V Micheli, M d'Hoffschmidt, F Fleuret. Empirical Methods in Natural Language Processing (EMNLP), 2020. Cited by 42.
Language models are few-shot butlers. V Micheli, F Fleuret. Empirical Methods in Natural Language Processing (EMNLP), 2021. Cited by 24.
MineRL Diamond 2021 Competition: Overview, Results, and Lessons Learned. A Kanervisto, S Milani, K Ramanauskas, N Topin, Z Lin, J Li, J Shi, D Ye, ... Proceedings of Machine Learning Research (PMLR), NeurIPS 2021 Competitions …, 2022. Cited by 21.
Human control redressed: Comparing AI and human predictability in a real-effort task. S Kandul, V Micheli, J Beck, T Burri, F Fleuret, M Kneer, M Christen. Computers in Human Behavior Reports 10, 100290, 2023. Cited by 3.
Explainable AI: A Review of the Empirical Literature. S Kandul, V Micheli, J Beck, M Kneer, T Burri, F Fleuret, M Christen. 2023. Cited by 3.
Multi-task Reinforcement Learning with a Planning Quasi-Metric. V Micheli, K Sinnathamby, F Fleuret. Neural Information Processing Systems (NeurIPS), Deep Reinforcement Learning …, 2020. Cited by 1.
Towards Efficient World Models. E Alonso, V Micheli, F Fleuret. Workshop on Efficient Systems for Foundation Models @ ICML 2023, 2023.
Structural analysis of an all-purpose question answering model. V Micheli, Q Heinrich, F Fleuret, W Belblidia. arXiv preprint arXiv:2104.06045, 2021.