1 code implementation • NeurIPS 2015 • Jacob Steinhardt, Percy S. Liang
For weakly supervised problems with deterministic constraints between the latent variables and the observed output, learning requires inference over the latent variables conditioned on the output, which can be intractable even for very simple model families.
1 code implementation • NeurIPS 2015 • Volodymyr Kuleshov, Percy S. Liang
In user-facing applications, displaying calibrated confidence measures (probabilities that correspond to true frequency) can be as important as obtaining high accuracy.
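To make the notion of calibration concrete, here is a minimal sketch of one common way to quantify it: bin predictions by confidence and compare each bin's average confidence to its accuracy (expected calibration error). The function and the toy data are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch: a predictor is calibrated when, among examples it
# assigns probability p, a fraction p are actually positive. We measure the
# gap with equal-width bins over [0, 1].

def expected_calibration_error(probs, labels, n_bins=5):
    """Weighted average of |mean confidence - accuracy| over probability bins."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p = 1.0 into last bin
        bins[idx].append((p, y))
    ece = 0.0
    for b in bins:
        if not b:
            continue
        conf = sum(p for p, _ in b) / len(b)   # mean predicted probability
        acc = sum(y for _, y in b) / len(b)    # empirical positive rate
        ece += len(b) / len(probs) * abs(conf - acc)
    return ece

# Toy data (made up): three confident positives, two low-confidence negatives.
probs = [0.9, 0.8, 0.7, 0.3, 0.2]
labels = [1, 1, 0, 0, 0]
print(expected_calibration_error(probs, labels))  # → ~0.3
```

A perfectly calibrated predictor would score 0; here the 0.7 prediction on a negative example dominates the error.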
1 code implementation • NeurIPS 2014 • Roy Frostig, Sida Wang, Percy S. Liang, Christopher D. Manning
We focus on the problem of maximum a posteriori (MAP) inference in Markov random fields with binary variables and pairwise interactions.
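To illustrate the problem setting (not the paper's algorithm), here is a brute-force MAP solver for a tiny binary pairwise MRF: it maximizes a score of unary terms plus pairwise interaction terms over all assignments in {0,1}^n. The energy parameterization and parameters below are assumptions for illustration; exhaustive search is only feasible for a handful of variables, which is why exact MAP in general graphs is hard and motivates approximate methods.

```python
from itertools import product

def map_brute_force(unary, pairwise):
    """Maximize sum_i unary[i]*x_i + sum_{(i,j)} pairwise[i,j]*x_i*x_j
    over binary assignments x, by exhaustive enumeration."""
    n = len(unary)
    best_x, best_score = None, float("-inf")
    for x in product([0, 1], repeat=n):
        score = sum(unary[i] * x[i] for i in range(n))
        score += sum(w * x[i] * x[j] for (i, j), w in pairwise.items())
        if score > best_score:
            best_x, best_score = x, score
    return best_x, best_score

# Made-up instance: variable 2's unary reward is outweighed by its
# negative interaction with variable 1, so the MAP assignment drops it.
unary = [1.0, -0.5, 0.2]
pairwise = {(0, 1): 2.0, (1, 2): -1.0}
print(map_brute_force(unary, pairwise))  # → ((1, 1, 0), 2.5)
```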
no code implementations • NeurIPS 2012 • Daniel J. Hsu, Sham M. Kakade, Percy S. Liang
This paper explores unsupervised learning of parsing models along two directions.
no code implementations • NeurIPS 2009 • Percy S. Liang, Guillaume Bouchard, Francis R. Bach, Michael I. Jordan
Many types of regularization schemes have been employed in statistical learning, each one motivated by some assumption about the problem domain.
no code implementations • NeurIPS 2007 • Alexandre Bouchard-Côté, Percy S. Liang, Dan Klein, Thomas L. Griffiths
We present a probabilistic approach to language change in which word forms are represented by phoneme sequences that undergo stochastic edits along the branches of a phylogenetic tree.