Authors
Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals
Publication date
2014/9/8
Journal
arXiv preprint arXiv:1409.2329
Description
Abstract: We present a simple regularization technique for Recurrent Neural Networks
(RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique
for regularizing neural networks, does not work well with RNNs and LSTMs. In this paper, we
show how to correctly apply dropout to LSTMs, and show that it substantially reduces
overfitting on a variety of tasks. These tasks include language modeling, speech recognition,
image caption generation, and machine translation.
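
The abstract does not spell out the mechanism, but a common reading of the technique is that dropout is applied only to the non-recurrent connections of the LSTM (the input arriving from the layer below), while the recurrent hidden and cell states flow through the time steps untouched. The sketch below illustrates that idea; it is not the authors' code, and all names, shapes, and the dropout rate are hypothetical.

```python
# Minimal sketch: dropout on the non-recurrent input of an LSTM step only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dropout(x, rate, rng):
    # Inverted dropout: zero units with probability `rate`, rescale the rest.
    mask = (rng.random(x.shape) >= rate) / (1.0 - rate)
    return x * mask

def lstm_step(x_below, h_prev, c_prev, W, b, drop_rate, rng):
    # Dropout only on the connection coming from the layer below (non-recurrent).
    x = dropout(x_below, drop_rate, rng)
    z = np.concatenate([x, h_prev]) @ W + b             # gates from [x; h_{t-1}]
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)   # cell state: no dropout
    h = sigmoid(o) * np.tanh(c)                          # recurrent state: no dropout
    return h, c

# Hypothetical usage over a short unrolled sequence.
rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W = rng.standard_normal((n_in + n_hid, 4 * n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for t in range(5):
    x_t = rng.standard_normal(n_in)
    h, c = lstm_step(x_t, h, c, W, b, drop_rate=0.5, rng=rng)
```

Because the recurrent state is never masked, the memory the LSTM carries across time steps is not corrupted, which is why dropout placed on the non-recurrent connections regularizes without hurting the network's ability to remember.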
Scholar articles
W Zaremba, I Sutskever, O Vinyals - arXiv preprint arXiv:1409.2329, 2014