Xiang Hu
Ant Research
Verified email at antgroup.com
Title · Cited by · Year
R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling
X Hu, H Mi, Z Wen, Y Wang, Y Su, J Zheng, G de Melo
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL 2021), 2021
Cited by 19 · 2021
Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation
X Hu, H Mi, L Li, G de Melo
The 2022 Conference on Empirical Methods in Natural Language Processing, 2022
Cited by 9 · 2022
A Multi-Grained Self-Interpretable Symbolic-Neural Model For Single/Multi-Labeled Text Classification
X Hu, X Kong, K Tu
The Eleventh International Conference on Learning Representations (ICLR 2023), 2023
Cited by 4 · 2023
Interactive Question Clarification in Dialogue via Reinforcement Learning
X Hu, Z Wen, Y Wang, X Li, G de Melo
COLING 2020 Industry Track, 2020
Cited by 4 · 2020
Augmenting Transformers with Recursively Composed Multi-grained Representations
X Hu, Q Zhu, K Tu, W Wu
The Twelfth International Conference on Learning Representations (ICLR 2024), 2023
Cited by 1 · 2023
Unsupervised Morphological Tree Tokenizer
Q Zhu, X Hu, P Ji, W Wu, K Tu
https://arxiv.org/html/2406.15245v1, 2024
2024
Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
X Hu, P Ji, Q Zhu, W Wu, K Tu
ACL 2024, 2024
2024
Language processing using a neural network
X Hu, Z Wen
US Patent 11,210,474, 2021
2021