Latent dirichlet allocation

DM Blei, AY Ng, MI Jordan - Journal of Machine Learning Research, 2003 - jmlr.org
Abstract We describe latent Dirichlet allocation (LDA), a generative probabilistic model for
collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian
model, in which each item of a collection is modeled as a finite mixture over an underlying
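As a rough illustration of the three-level generative process this entry describes, the sketch below samples toy documents from an LDA-style model: per-document topic proportions from a Dirichlet, a topic per word, a word per topic. All sizes and hyperparameter values here are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_topics, vocab_size, n_docs, doc_len = 3, 20, 5, 50
alpha = np.full(n_topics, 0.5)                                 # Dirichlet prior on topic mixtures
beta = rng.dirichlet(np.full(vocab_size, 0.1), size=n_topics)  # topic-word distributions

docs = []
for _ in range(n_docs):
    theta = rng.dirichlet(alpha)               # per-document mixture over topics
    words = []
    for _ in range(doc_len):
        z = rng.choice(n_topics, p=theta)      # topic assignment for this word
        w = rng.choice(vocab_size, p=beta[z])  # word drawn from that topic's distribution
        words.append(int(w))
    docs.append(words)
```

Fitting LDA means inverting this process: inferring the topic-word distributions and per-document mixtures from the observed words alone.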

On spectral clustering: Analysis and an algorithm

AY Ng, MI Jordan, Y Weiss - NIPS, 2001 - papers.nips.cc
Despite many empirical successes of spectral clustering methods (algorithms that cluster
points using eigenvectors of matrices derived from the data), there are several unresolved
issues. First, there are a wide variety of algorithms that use the eigenvectors in slightly
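The family of algorithms the snippet refers to can be sketched roughly as: build an affinity matrix, take the top eigenvectors of its normalized form, and cluster the rows of that embedding. Below is a minimal toy version in the spirit of the NJW algorithm, not the paper's exact procedure; the scale parameter, the two-blob data, and the tiny k-means loop are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated 2-D blobs of 10 points each
X = np.vstack([rng.normal(0.0, 0.1, (10, 2)), rng.normal(5.0, 0.1, (10, 2))])

sigma = 1.0
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-sq / (2 * sigma ** 2))             # Gaussian affinity matrix
np.fill_diagonal(A, 0.0)                       # no self-affinity
d = A.sum(1)
L = A / np.sqrt(np.outer(d, d))                # normalized affinity D^-1/2 A D^-1/2
vals, vecs = np.linalg.eigh(L)
Y = vecs[:, -2:]                               # top-2 eigenvectors as embedding
Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # row-normalize onto the unit circle

# Tiny 2-means on the embedded rows, seeded with one point from each blob
centers = Y[[0, -1]]
for _ in range(10):
    labels = np.argmin(((Y[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([Y[labels == k].mean(0) for k in (0, 1)])
```

In the embedding, points from the same blob collapse to nearly the same spot, so even a crude k-means separates them cleanly.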

ROS: an open-source Robot Operating System

…, T Foote, J Leibs, R Wheeler, AY Ng - ICRA workshop on …, 2009 - willowgarage.com
Abstract This paper gives an overview of ROS, an open-source robot operating system.
ROS is not an operating system in the traditional sense of process management and
scheduling; rather, it provides a structured communications layer above the host operating

Distance metric learning with application to clustering with side-information

EP Xing, AY Ng, MI Jordan, S Russell - NIPS, 2002 - papers.nips.cc
Abstract Many algorithms rely critically on being given a good metric over their inputs. For
instance, data can often be clustered in many “plausible” ways, and if a clustering algorithm
such as K-means initially fails to find one that is meaningful to a user, the only recourse may
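The object learned in this line of work is a Mahalanobis distance d_A(x, y) = sqrt((x - y)^T A (x - y)), parameterized by a positive semidefinite matrix A. The hand-picked diagonal A below stands in for a learned one, just to show how reweighting features changes which points count as near (all vectors and weights are made-up toy values):

```python
import numpy as np

def mahalanobis(x, y, A):
    """Distance d_A(x, y) = sqrt((x - y)^T A (x - y)); A must be PSD."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.sqrt(d @ A @ d))

# Feature 0 is informative for the task at hand; feature 1 is noise.
x, y, z = np.array([0.0, 0.0]), np.array([0.2, 3.0]), np.array([2.0, 0.0])

I = np.eye(2)             # Euclidean metric: y looks far from x
A = np.diag([1.0, 0.01])  # learned-style metric: discounts the noise feature

eu_xy, eu_xz = mahalanobis(x, y, I), mahalanobis(x, z, I)
m_xy, m_xz = mahalanobis(x, y, A), mahalanobis(x, z, A)
```

Under the Euclidean metric x is closer to z; under A it is closer to y, which is the kind of reversal side-information about "similar" pairs is meant to induce.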

Efficient sparse coding algorithms

H Lee, A Battle, R Raina, AY Ng - Advances in neural information …, 2007 - papers.nips.cc
Abstract Sparse coding provides a class of algorithms for finding succinct representations of
stimuli; given only unlabeled input data, it discovers basis functions that capture higher-level
features in the data. However, finding sparse codes remains a very difficult computational
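For context on the optimization problem involved: sparse coding seeks a code s minimizing 0.5·||x - Bs||² + λ||s||₁ over a basis (dictionary) matrix B. The sketch below uses ISTA, plain iterative soft-thresholding, as a generic baseline solver; it is not the paper's feature-sign search algorithm, and all sizes and the λ value are toy choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_atoms = 8, 16
B = rng.normal(size=(n_features, n_atoms))
B /= np.linalg.norm(B, axis=0)          # unit-norm basis columns

true_s = np.zeros(n_atoms)
true_s[[2, 11]] = [1.5, -2.0]           # sparse ground-truth code
x = B @ true_s                          # observed signal

# ISTA for min_s 0.5*||x - B s||^2 + lam*||s||_1
lam = 0.05
L = np.linalg.norm(B, 2) ** 2           # Lipschitz constant of the smooth part
s = np.zeros(n_atoms)
for _ in range(500):
    g = B.T @ (B @ s - x)               # gradient of the quadratic term
    u = s - g / L                       # gradient step
    s = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)  # soft-threshold
```

The soft-threshold step is what produces exact zeros in the code, i.e. the succinct representation the snippet mentions.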

On discriminative vs. generative classifiers: A comparison of logistic regression and naive Bayes

AY Ng, MI Jordan - Advances in neural information processing …, 2002 - papers.nips.cc
Abstract We compare discriminative and generative learning as typified by logistic
regression and naive Bayes. We show, contrary to a widely held belief that discriminative
classifiers are almost always to be preferred, that there can often be two distinct regimes of
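The two model families being compared can be sketched side by side on toy Gaussian data. Both implementations below are minimal illustrations (train-set accuracy only), not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)),   # class 0
               rng.normal(+1.0, 1.0, (n, 2))])  # class 1
y = np.array([0] * n + [1] * n)

def nb_predict(X, Xtr, ytr):
    """Generative: Gaussian naive Bayes with per-class, per-feature mean/variance."""
    scores = []
    for c in (0, 1):
        Z = Xtr[ytr == c]
        mu, var = Z.mean(0), Z.var(0) + 1e-6
        ll = -0.5 * (((X - mu) ** 2) / var + np.log(2 * np.pi * var)).sum(1)
        scores.append(ll + np.log(len(Z) / len(Xtr)))
    return np.argmax(np.stack(scores, 1), 1)

def lr_fit_predict(X, Xtr, ytr, iters=2000, eta=0.1):
    """Discriminative: logistic regression fit by batch gradient descent."""
    Phi = np.c_[Xtr, np.ones(len(Xtr))]          # append a bias column
    w = np.zeros(Phi.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Phi @ w))
        w -= eta * Phi.T @ (p - ytr) / len(ytr)
    return (np.c_[X, np.ones(len(X))] @ w > 0).astype(int)

nb_acc = (nb_predict(X, X, y) == y).mean()
lr_acc = (lr_fit_predict(X, X, y) == y).mean()
```

The paper's point concerns how these two accuracies trade off as the training-set size varies, which a single fixed-size run like this cannot show on its own.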

Cheap and fast -- but is it good? Evaluating non-expert annotations for natural language tasks

R Snow, B O'Connor, D Jurafsky, AY Ng - Proceedings of the conference …, 2008 - dl.acm.org
Abstract Human linguistic annotation is crucial for many natural language processing tasks
but can be expensive and time-consuming. We explore the use of Amazon's Mechanical
Turk system, a significantly cheaper and faster method for collecting annotations from a
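The simplest way to aggregate several cheap non-expert labels per item, and a natural baseline for the kind of evaluation this paper performs, is majority voting. The worker labels below are made-up examples:

```python
from collections import Counter

def majority_vote(annotations):
    """Return the most frequent label among one item's annotations."""
    label, _count = Counter(annotations).most_common(1)[0]
    return label

# Each row: labels from 5 hypothetical Mechanical Turk workers for one item
items = [
    ["pos", "pos", "neg", "pos", "pos"],
    ["neg", "neg", "neg", "pos", "neg"],
]
aggregated = [majority_vote(a) for a in items]
```

Combining redundant noisy labels this way is what lets many non-expert annotations approach expert quality.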

Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations

H Lee, R Grosse, R Ranganath, AY Ng - Proceedings of the 26th annual …, 2009 - dl.acm.org
Abstract There has been much interest in unsupervised learning of hierarchical generative
models such as deep belief networks. Scaling such models to full-sized, high-dimensional
images remains a difficult problem. To address this problem, we present the convolutional

Map-reduce for machine learning on multicore

CT Chu, SK Kim, YA Lin, YY Yu, G Bradski, AY Ng… - NIPS, 2006 - papers.nips.cc
Abstract We are at the beginning of the multicore era. Computers will have increasingly
many cores (processors), but there is still no good programming framework for these
architectures, and thus no simple and unified way for machine learning to take advantage of
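The key observation exploited in this line of work is that many learners can be written in a "summation form": each core computes partial sufficient statistics over its shard of the data (map), and the partial sums are added before a final solve (reduce). A sketch for least-squares regression, with the four "cores" simulated sequentially and all data synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.01, size=1000)

def map_stats(Xs, ys):
    """Map step: partial sufficient statistics X^T X and X^T y for one shard."""
    return Xs.T @ Xs, Xs.T @ ys

shards = np.array_split(np.arange(1000), 4)      # pretend these run on 4 cores
partials = [map_stats(X[idx], y[idx]) for idx in shards]

# Reduce step: add the partial sums, then solve the normal equations once
A = sum(p[0] for p in partials)
b = sum(p[1] for p in partials)
w_hat = np.linalg.solve(A, b)
```

Because X^T X and X^T y are plain sums over examples, the sharded result is bit-for-bit the single-machine normal-equations solution.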

Recursive deep models for semantic compositionality over a sentiment treebank

…, JY Wu, J Chuang, CD Manning, AY Ng… - Proceedings of the …, 2013 - Citeseer
Abstract Semantic word spaces have been very useful but cannot express the meaning of
longer phrases in a principled way. Further progress towards understanding
compositionality in tasks such as sentiment detection requires richer supervised training and
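The compositionality the snippet alludes to works bottom-up over a parse tree: each parent node's vector is computed from its children's vectors. The sketch below uses the basic recursive-NN rule p = tanh(W[c1; c2] + b) with random untrained parameters and made-up word vectors; it is not the paper's tensor-based composition or its sentiment classifier:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                          # word-vector dimensionality
W = rng.normal(scale=0.1, size=(d, 2 * d))     # untrained composition weights
b = np.zeros(d)

def compose(c1, c2):
    """Parent vector from two children: p = tanh(W [c1; c2] + b)."""
    return np.tanh(W @ np.concatenate([c1, c2]) + b)

# Toy embeddings for three words (values are arbitrary)
vecs = {w: rng.normal(scale=0.5, size=d) for w in ["not", "very", "good"]}

# Compose bottom-up along the parse (not (very good))
phrase = compose(vecs["very"], vecs["good"])
root = compose(vecs["not"], phrase)
```

Because every node, word or phrase, lives in the same d-dimensional space, a single classifier can score sentiment at any level of the tree.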
