Authors
Andrew M Dai, Quoc V Le
Publication date
2015
Conference
Advances in Neural Information Processing Systems
Pages
3079-3087
Description
Abstract
We present two approaches that use unlabeled data to improve sequence
learning with recurrent networks. The first approach is to predict what comes next in
a sequence, which is a conventional language model in NLP. The second approach is to use a sequence
autoencoder, which reads the input sequence into a vector and predicts the input sequence
again. These two algorithms can be used as a "pretraining" algorithm for a later supervised
sequence learning algorithm. In other words, the parameters obtained from the pretraining ...
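The sequence-autoencoder idea described in the abstract can be sketched as follows. This is a minimal illustration assuming a vanilla RNN with toy random weights (the paper itself uses LSTMs, and this sketch omits training entirely): an encoder reads the input tokens into a single fixed-size vector, and a decoder then predicts the sequence back out, token by token.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 10, 8          # hypothetical vocabulary size and hidden size

# Toy weights for illustration only; in the paper these are learned.
Wxh = rng.normal(scale=0.1, size=(H, V))   # input-to-hidden
Whh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden
Why = rng.normal(scale=0.1, size=(V, H))   # hidden-to-output

def one_hot(token):
    v = np.zeros(V)
    v[token] = 1.0
    return v

def encode(tokens):
    """Read the input sequence into a single vector (the final hidden state)."""
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(Wxh @ one_hot(t) + Whh @ h)
    return h

def decode(h, length):
    """Predict the input sequence again, one token per step, starting from h."""
    out = []
    x = np.zeros(V)                        # start-of-sequence placeholder
    for _ in range(length):
        h = np.tanh(Wxh @ x + Whh @ h)
        tok = int(np.argmax(Why @ h))      # greedy readout
        out.append(tok)
        x = one_hot(tok)                   # feed the prediction back in
    return out

seq = [3, 1, 4, 1, 5]
code = encode(seq)            # fixed-size summary of the whole sequence
recon = decode(code, len(seq))  # untrained, so the reconstruction is poor
```

After training on unlabeled sequences, the encoder weights would serve as the "pretrained" initialization for a supervised sequence model, which is the paper's central point.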