Qingjun Cui
Senior Applied Science Manager at Amazon
Verified email at amazon.com
Title · Cited by · Year
CTR-BERT: Cost-effective knowledge distillation for billion-parameter teacher models
A Muhamed, I Keivanloo, S Perera, J Mracek, Y Xu, Q Cui, S Rajagopalan, ...
NeurIPS Efficient Natural Language and Speech Processing Workshop, 2021
Cited by 32 · 2021
Behavior-driven query similarity prediction based on pre-trained language models for e-commerce search
Y Huang, J Gesi, X Hong, H Cheng, K Zhong, V Mittal, Q Cui, V Salaka
Cited by 15 · 2023
Web-scale semantic product search with large language models
A Muhamed, S Srinivasan, CH Teo, Q Cui, B Zeng, T Chilimbi, ...
Pacific-Asia Conference on Knowledge Discovery and Data Mining, 73-85, 2023
Cited by 4 · 2023
ReAugKD: Retrieval-augmented knowledge distillation for pre-trained language models
J Zhang, A Muhamed, A Anantharaman, G Wang, C Chen, K Zhong, ...
Proceedings of the 61st Annual Meeting of the Association for Computational …, 2023
Cited by 2 · 2023
OssCSE: Overcoming Surface Structure Bias in Contrastive Learning for Unsupervised Sentence Embedding
Z Shi, G Wang, K Bai, J Li, X Li, Q Cui, B Zeng, T Chilimbi, X Zhu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
Cited by 1 · 2023
DCAF-BERT: A Distilled Cachable Adaptable Factorized Model For Improved Ads CTR Prediction
A Muhamed, J Singh, S Zheng, I Keivanloo, S Perera, J Mracek, Y Xu, ...
Companion Proceedings of the Web Conference 2022, 110-115, 2022
Cited by 1 · 2022
Grouped-Attention for Content-Selection and Content-Plan Generation
B Distiawan, X Wang, J Qi, R Zhang, Q Cui
Findings of the Association for Computational Linguistics: EMNLP 2021, 1935-1944, 2021
2021