Adding Gradient Noise Improves Learning for Very Deep Networks
Authors
Arvind Neelakantan, Luke Vilnis, Quoc V Le, Ilya Sutskever, Lukasz Kaiser, Karol Kurach, James Martens
Publication date
2015/11/21
Journal
arXiv preprint arXiv:1511.06807
Description
Abstract: Deep feedforward and recurrent networks have achieved impressive results in
many perception and language processing applications. This success is partially attributed
to architectural innovations such as convolutional and long short-term memory networks.
The main motivation for these architectural innovations is that they capture better domain
knowledge, and importantly are easier to optimize than more basic architectures. Recently,
more complex architectures such as Neural Turing Machines and Memory Networks have ...
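The abstract is truncated above; the technique the paper goes on to introduce (per arXiv:1511.06807) is adding time-annealed Gaussian noise to the gradient at every training step, with variance decaying as sigma_t^2 = eta / (1 + t)^gamma. Below is a minimal NumPy sketch of that schedule; the function name is chosen here for illustration, while the defaults gamma = 0.55 and eta drawn from {0.01, 0.3, 1.0} are the values reported in the paper.

    import numpy as np

    def add_gradient_noise(grad, t, eta=0.3, gamma=0.55, rng=None):
        """Add annealed Gaussian noise to a gradient (arXiv:1511.06807).

        Noise is drawn from N(0, sigma_t^2) with sigma_t^2 = eta / (1 + t)^gamma,
        so the perturbation shrinks as the training step t grows.
        """
        rng = rng or np.random.default_rng()
        sigma = np.sqrt(eta / (1.0 + t) ** gamma)
        return grad + rng.normal(0.0, sigma, size=np.shape(grad))

In the paper, this noise is applied to the gradient before the optimizer update at each step, which the authors find helps when training very deep or complex architectures.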
Scholar articles
A Neelakantan, L Vilnis, QV Le, I Sutskever, L Kaiser… - arXiv preprint arXiv:1511.06807, 2015