R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling. X Hu, H Mi, Z Wen, Y Wang, Y Su, J Zheng, G de Melo. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL 2021), 2021 | 19 | 2021 |
Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation. X Hu, H Mi, L Li, G de Melo. The 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), 2022 | 9 | 2022 |
A Multi-Grained Self-Interpretable Symbolic-Neural Model For Single/Multi-Labeled Text Classification. X Hu, X Kong, K Tu. The Eleventh International Conference on Learning Representations (ICLR 2023), 2023 | 4 | 2023 |
Interactive Question Clarification in Dialogue via Reinforcement Learning. X Hu, Z Wen, Y Wang, X Li, G de Melo. COLING 2020 Industry Track, 2020 | 4 | 2020 |
Augmenting Transformers with Recursively Composed Multi-grained Representations. X Hu, Q Zhu, K Tu, W Wu. The Twelfth International Conference on Learning Representations (ICLR 2024), 2024 | 1 | 2024 |
Unsupervised Morphological Tree Tokenizer. Q Zhu, X Hu, P Ji, W Wu, K Tu. arXiv preprint arXiv:2406.15245, 2024 | | 2024 |
Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale. X Hu, P Ji, Q Zhu, W Wu, K Tu. ACL 2024, 2024 | | 2024 |
Language processing using a neural network. X Hu, Z Wen. US Patent 11,210,474, 2021 | | 2021 |