Lin Zheng
Verified email at connect.hku.hk - Homepage
Title · Cited by · Year
Linear complexity randomized self-attention mechanism
L Zheng, C Wang, L Kong
International conference on machine learning, 27011-27041, 2022
Cited by 22 · 2022
A reparameterized discrete diffusion model for text generation
L Zheng, J Yuan, L Yu, L Kong
arXiv preprint arXiv:2302.05737, 2023
Cited by 19 · 2023
Efficient attention via control variates
L Zheng, J Yuan, C Wang, L Kong
arXiv preprint arXiv:2302.04542, 2023
Cited by 9 · 2023
Generative semantic hashing enhanced via Boltzmann machines
L Zheng, Q Su, D Shen, C Chen
arXiv preprint arXiv:2006.08858, 2020
Cited by 9 · 2020
CAB: comprehensive attention benchmarking on long sequence modeling
J Zhang, S Jiang, J Feng, L Zheng, L Kong
International Conference on Machine Learning, 41194-41218, 2023
Cited by 4 · 2023
Ripple attention for visual perception with sub-quadratic complexity
L Zheng, H Pan, L Kong
International Conference on Machine Learning, 26993-27010, 2022
Cited by 3 · 2022
Cascaded head-colliding attention
L Zheng, Z Wu, L Kong
arXiv preprint arXiv:2105.14850, 2021
Cited by 2 · 2021
Linear attention via orthogonal memory
J Zhang, S Jiang, J Feng, L Zheng, L Kong
arXiv preprint arXiv:2312.11135, 2023
Cited by 1 · 2023
Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models
J Ye, S Gong, L Chen, L Zheng, J Gao, H Shi, C Wu, Z Li, W Bi, L Kong
arXiv preprint arXiv:2402.07754, 2024
2024
Self-Infilling Code Generation
L Zheng, J Yuan, Z Zhang, H Yang, L Kong
arXiv preprint arXiv:2311.17972, 2023
2023
Attentive Multi-Layer Perceptron for Non-autoregressive Generation
S Jiang, J Zhang, J Feng, L Zheng, L Kong
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023
2023
Attentive MLP for Non-Autoregressive Generation
S Jiang, J Zhang, J Feng, L Zheng, L Kong
2022