Ankur Sikarwar
Grad Student, Mila and Université de Montréal
Verified email at mila.quebec - Homepage
Title · Cited by · Year
When Can Transformers Ground and Compose: Insights from Compositional Generalization Benchmarks
A Sikarwar, A Patel, N Goyal
EMNLP 2022, 2022
Cited by 10 · 2022
On the Efficacy of Co-Attention Transformer Layers in Visual Question Answering
A Sikarwar, G Kreiman
arXiv preprint arXiv:2201.03965, 2022
Cited by 4 · 2022
Reason from context with self-supervised learning
X Liu, A Sikarwar, G Kreiman, Z Shi, M Zhang
arXiv preprint arXiv:2211.12817, 2023
Cited by 2 · 2023
Learning to Learn: How to Continuously Teach Humans and Machines
P Singh, Y Li, A Sikarwar, W Lei, D Gao, MB Talbot, Y Sun, MZ Shou, ...
ICCV 2023, 2023
Cited by 2 · 2023
Human or Machine? Turing Tests for Vision and Language
M Zhang, G Dellaferrera, A Sikarwar, M Armendariz, N Mudrik, P Agrawal, ...
arXiv preprint arXiv:2211.13087, 2022
Cited by 1 · 2022
Decoding the Enigma: Benchmarking Humans and AIs on the Many Facets of Working Memory
A Sikarwar, M Zhang
NeurIPS 2023, Datasets and Benchmarks Track, 2023
2023
Supplementary Material Learning to Learn: How to Continuously Teach Humans and Machines
P Singh, L You, A Sikarwar, W Lei, D Gao, MB Talbot, Y Sun, MZ Shou, ...