Latent Dirichlet allocation
Abstract We describe latent Dirichlet allocation (LDA), a generative probabilistic model for
collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian
model, in which each item of a collection is modeled as a finite mixture over an underlying
Cited by 17215 Related articles All 123 versions
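The LDA abstract above describes documents as finite mixtures over latent topics. A minimal sketch of that generative process, using only the standard library (the `alpha`/`beta` values below are made-up toy parameters, not from the paper):

```python
import random

def sample_dirichlet(alpha):
    # Draw from a Dirichlet by normalizing independent Gamma samples.
    draws = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(draws)
    return [d / total for d in draws]

def generate_document(n_words, alpha, beta):
    """Generate one document under LDA's generative story.

    alpha: Dirichlet parameter over K topics.
    beta:  K word distributions over the vocabulary.
    """
    theta = sample_dirichlet(alpha)  # per-document topic mixture
    doc = []
    for _ in range(n_words):
        # Pick a topic z ~ Multinomial(theta), then a word w ~ Multinomial(beta[z]).
        z = random.choices(range(len(theta)), weights=theta)[0]
        w = random.choices(range(len(beta[z])), weights=beta[z])[0]
        doc.append(w)
    return theta, doc

random.seed(0)
alpha = [0.5, 0.5, 0.5]
beta = [[0.7, 0.1, 0.1, 0.1],   # topic 0 favors word 0
        [0.1, 0.7, 0.1, 0.1],   # topic 1 favors word 1
        [0.1, 0.1, 0.4, 0.4]]   # topic 2 splits words 2 and 3
theta, doc = generate_document(50, alpha, beta)
```

Inference (recovering `theta` and `beta` from documents) is the hard part the paper addresses; this only shows the forward model.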
[PDF] On spectral clustering: Analysis and an algorithm
Despite many empirical successes of spectral clustering methods (algorithms that cluster
points using eigenvectors of matrices derived from the data) there are several unresolved
issues. First, there are a wide variety of algorithms that use the eigenvectors in slightly
Cited by 5409 Related articles All 62 versions
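The pipeline this abstract refers to (affinity matrix, normalized eigenvectors, then k-means on the embedded rows) can be sketched as follows; the data, `sigma`, and the farthest-point initialization are illustrative choices, not the paper's experimental setup:

```python
import numpy as np

def spectral_cluster(X, k, sigma=1.0, iters=20):
    """Spectral clustering sketch in the Ng-Jordan-Weiss style."""
    # 1. Gaussian affinity matrix with zeroed diagonal.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    # 2. Symmetric normalization L = D^{-1/2} A D^{-1/2}.
    dinv = 1.0 / np.sqrt(A.sum(axis=1))
    L = A * dinv[:, None] * dinv[None, :]
    # 3. Embed each point as a row of the top-k eigenvectors, renormalized.
    _, vecs = np.linalg.eigh(L)          # eigenvalues ascending
    Y = vecs[:, -k:]
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    # 4. Cluster the rows with a few Lloyd (k-means) iterations,
    #    seeded deterministically by farthest-point initialization.
    centers = [Y[0]]
    for _ in range(1, k):
        dists = np.min([np.linalg.norm(Y - c, axis=1) for c in centers], axis=0)
        centers.append(Y[dists.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        labels = ((Y[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = Y[labels == j].mean(axis=0)
    return labels

# Two well-separated blobs should land in different clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (10, 2)), rng.normal(5, 0.1, (10, 2))])
labels = spectral_cluster(X, k=2)
```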
[PDF] ROS: an open-source Robot Operating System
…, T Foote, J Leibs, R Wheeler, AY Ng - ICRA workshop on …, 2009 - willowgarage.com
Abstract—This paper gives an overview of ROS, an open-source robot operating system.
ROS is not an operating system in the traditional sense of process management and
scheduling; rather, it provides a structured communications layer above the host operating
Cited by 3104 Related articles All 18 versions
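The "structured communications layer" idea is topic-based, anonymous publish/subscribe: nodes exchange messages through named topics rather than addressing each other. An in-process toy sketch (real ROS runs nodes as separate processes with a master for name resolution; `TopicBus`, `/scan`, and the message shape here are all invented for illustration):

```python
from collections import defaultdict

class TopicBus:
    """Minimal anonymous publish/subscribe bus, in the spirit of ROS topics."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A node registers interest in a topic; it never names the publisher.
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of this topic.
        for callback in self.subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)        # e.g. a mapping node
bus.subscribe("/scan", lambda m: None)         # a second, independent listener
bus.publish("/scan", {"ranges": [1.0, 2.0]})   # e.g. a laser driver node
```

The decoupling is the point: the publisher runs unchanged whether zero or many nodes are listening.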
[PDF] Distance metric learning with application to clustering with side-information
Abstract Many algorithms rely critically on being given a good metric over their inputs. For
instance, data can often be clustered in many “plausible” ways, and if a clustering algorithm
such as K-means initially fails to find one that is meaningful to a user, the only recourse may
Cited by 2216 Related articles All 35 versions
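The abstract's premise, that the choice of metric determines which clustering is found, is easy to see with a weighted (diagonal Mahalanobis-style) distance. The points and weights below are made-up; the paper learns a full matrix from "similar pair" side-information rather than hand-picking weights:

```python
import math

def dist(x, y, w):
    """Distance under a diagonal metric: sqrt(sum_i w_i * (x_i - y_i)^2).

    w = (1, 1) recovers plain Euclidean distance; reweighting features
    changes which points count as neighbors.
    """
    return math.sqrt(sum(wi * (xi - yi) ** 2 for wi, xi, yi in zip(w, x, y)))

a, b, c = (0.0, 0.0), (1.0, 0.1), (0.2, 2.0)
euclid = (1.0, 1.0)     # b is a's nearest neighbor under this metric
learned = (10.0, 0.1)   # hypothetical learned weights: c becomes nearest
```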
[PDF] Efficient sparse coding algorithms
Abstract Sparse coding provides a class of algorithms for finding succinct representations of
stimuli; given only unlabeled input data, it discovers basis functions that capture higher-level
features in the data. However, finding sparse codes remains a very difficult computational
Cited by 1938 Related articles All 27 versions
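A minimal instance of the sparse-coding objective: for an orthonormal basis, the L1-penalized code has a closed form (soft-thresholding the basis projections). The general overcomplete case the paper targets needs iterative solvers; this sketch only shows why the L1 penalty produces exact zeros:

```python
def sparse_code(x, basis, lam):
    """L1-penalized coding against an orthonormal basis.

    For orthonormal bases,  min_s 0.5*||x - B s||^2 + lam*||s||_1
    is solved coordinate-wise by soft-thresholding each projection <x, b_j>.
    """
    def soft(z):
        # Shrink toward zero by lam; small coefficients become exactly 0.
        if z > lam:
            return z - lam
        if z < -lam:
            return z + lam
        return 0.0

    projections = [sum(xi * bi for xi, bi in zip(x, b)) for b in basis]
    return [soft(p) for p in projections]

basis = [(1.0, 0.0), (0.0, 1.0)]            # trivial orthonormal basis
code = sparse_code((0.9, 0.05), basis, lam=0.1)
```

The weak second coordinate is zeroed out, which is the "succinct representation" the abstract refers to.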
[PDF] On discriminative vs. generative classifiers: A comparison of logistic regression and naive Bayes
AY Ng, MI Jordan - Advances in neural information processing …, 2002 - papers.nips.cc
Abstract We compare discriminative and generative learning as typified by logistic
regression and naive Bayes. We show, contrary to a widely held belief that discriminative
classifiers are almost always to be preferred, that there can often be two distinct regimes of
Cited by 1488 Related articles All 29 versions
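The generative side of the comparison can be made concrete: Bernoulli naive Bayes fits p(y) and p(x_j | y) and classifies via Bayes' rule, whereas logistic regression fits p(y | x) directly. A self-contained sketch on a made-up toy dataset (the "spam" features are invented for illustration):

```python
import math

def train_bernoulli_nb(X, y, alpha=1.0):
    """Bernoulli naive Bayes with Laplace smoothing (the generative model)."""
    classes = sorted(set(y))
    n_features = len(X[0])
    model = {}
    for c in classes:
        Xc = [x for x, label in zip(X, y) if label == c]
        log_prior = math.log(len(Xc) / len(X))
        # Smoothed per-feature probability of x_j = 1 given class c.
        probs = [(sum(x[j] for x in Xc) + alpha) / (len(Xc) + 2 * alpha)
                 for j in range(n_features)]
        model[c] = (log_prior, probs)
    return model

def predict(model, x):
    # Classify by the largest log joint probability log p(x, y).
    def log_joint(c):
        log_prior, probs = model[c]
        return log_prior + sum(
            math.log(p) if xj else math.log(1 - p) for xj, p in zip(x, probs))
    return max(model, key=log_joint)

# Toy data: feature 0 = contains "offer", feature 1 = contains "meeting".
X = [(1, 0), (1, 0), (1, 1), (0, 1), (0, 1), (0, 0)]
y = [1, 1, 1, 0, 0, 0]
model = train_bernoulli_nb(X, y)
```

The paper's finding concerns sample efficiency: this generative model tends to reach its (higher) asymptotic error faster, while logistic regression wins with enough data.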
Cheap and fast---but is it good?: evaluating non-expert annotations for natural language tasks
Abstract Human linguistic annotation is crucial for many natural language processing tasks
but can be expensive and time-consuming. We explore the use of Amazon's Mechanical
Turk system, a significantly cheaper and faster method for collecting annotations from a
Cited by 1376 Related articles All 38 versions
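A simple instance of the paper's theme, combining several cheap, noisy labels per item into one aggregate label, is plain majority voting (the example labels below are invented; the paper also studies bias-corrected and weighted aggregation):

```python
from collections import Counter

def majority_vote(annotations):
    """Collapse several non-expert labels per item into one label.

    Ties go to the label seen first (Counter preserves insertion order).
    """
    return [Counter(labels).most_common(1)[0][0] for labels in annotations]

# Three Turkers label each of four items; the third annotator is noisy.
annotations = [
    ["pos", "pos", "neg"],
    ["neg", "neg", "neg"],
    ["pos", "pos", "pos"],
    ["neg", "neg", "pos"],
]
labels = majority_vote(annotations)
```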
Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations
Abstract There has been much interest in unsupervised learning of hierarchical generative
models such as deep belief networks. Scaling such models to full-sized, high-dimensional
images remains a difficult problem. To address this problem, we present the convolutional
Cited by 1278 Related articles All 31 versions
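The two structural ingredients that make the model scale to full-sized images, weight-sharing convolution and pooling, can be shown deterministically (the paper's probabilistic max-pooling is a stochastic relative of the plain max pooling below; the image and filter are toy values):

```python
def convolve_valid(image, kernel):
    """'Valid' 2-D convolution (cross-correlation) of one shared filter."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(iw - kw + 1)]
            for i in range(ih - kh + 1)]

def max_pool(fmap, size=2):
    """Non-overlapping max pooling, shrinking the map by `size` per axis."""
    return [[max(fmap[i + a][j + b] for a in range(size) for b in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge = [[-1, 1]]                      # 1x2 vertical-edge detector
fmap = convolve_valid(image, edge)    # responds at the 0-to-1 boundary
pooled = max_pool(fmap, size=2)
```

Sharing one small filter across all positions, then pooling, is what keeps the parameter count independent of image size.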
[PDF] Map-reduce for machine learning on multicore
Abstract We are at the beginning of the multicore era. Computers will have increasingly
many cores (processors), but there is still no good programming framework for these
architectures, and thus no simple and unified way for machine learning to take advantage of
Cited by 1171 Related articles All 38 versions
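The paper's key observation is that many learners reduce to sums of per-example statistics, so each core can compute partial sums (map) and a single pass adds them (reduce). A sketch for 1-D least squares, with shards standing in for cores (data and shard split are illustrative):

```python
def map_stats(shard):
    # Map: each core computes local sufficient statistics,
    # i.e. all the sums the closed-form solution needs.
    n = len(shard)
    sx = sum(x for x, _ in shard)
    sy = sum(y for _, y in shard)
    sxx = sum(x * x for x, _ in shard)
    sxy = sum(x * y for x, y in shard)
    return (n, sx, sy, sxx, sxy)

def reduce_fit(stats):
    # Reduce: add the per-shard statistics, then solve for the line.
    n, sx, sy, sxx, sxy = (sum(t) for t in zip(*stats))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

data = [(x, 2.0 * x + 1.0) for x in range(10)]   # exact line y = 2x + 1
shards = [data[:5], data[5:]]                    # pretend: one shard per core
slope, intercept = reduce_fit([map_stats(s) for s in shards])
```

Because addition is associative, the result is identical to the single-machine fit regardless of how the data is sharded.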
[PDF] Recursive deep models for semantic compositionality over a sentiment treebank
Abstract Semantic word spaces have been very useful but cannot express the meaning of
longer phrases in a principled way. Further progress towards understanding
compositionality in tasks such as sentiment detection requires richer supervised training and
Cited by 1037 Related articles All 20 versions
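The structural idea, building a phrase vector bottom-up over a binary parse tree, can be sketched with a toy composition function (real recursive models learn a matrix or tensor here; the `lexicon` vectors and the elementwise-tanh combiner are invented for illustration):

```python
import math

def compose(left, right, weight=0.5):
    # Toy composition: elementwise tanh of a weighted sum of the children.
    return [math.tanh(weight * (a + b)) for a, b in zip(left, right)]

def embed(tree, lexicon):
    """Bottom-up pass over a binary parse tree given as nested tuples;
    leaves are words looked up in a small hypothetical lexicon."""
    if isinstance(tree, str):
        return lexicon[tree]
    left, right = tree
    return compose(embed(left, lexicon), embed(right, lexicon))

lexicon = {"not": [-1.0, 0.5], "very": [0.2, 0.9], "good": [1.0, 0.3]}
vec = embed(("not", ("very", "good")), lexicon)   # tree: (not (very good))
```

The recursion is what lets one model assign a vector (and, in the paper, a sentiment label) to every phrase in the treebank, not just to words.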
Related searches
- andrew ng deep learning
- andrew ng machine learning
- andrew ng stanford
- andrew ng reinforcement learning
- andrew ng helicopter
- andrew ng learning gpu
- andrew ng sparse autoencoder
- andrew ng logistic regression
- andrew ng spectral clustering
- andrew ng latent dirichlet allocation
- andrew ng convolutional neural network
- andrew ng lecture notes
- andrew ng unsupervised learning
- andrew ng sparse coding
- andrew ng coates
- andrew ng mooc