TY - JOUR
TI - Probabilistic machine learning and artificial intelligence
AU - Ghahramani, Zoubin
T2 - Nature
DA - 2015/05//
PY - 2015
DO - 10.1038/nature14541
DP - Crossref
VL - 521
IS - 7553
SP - 452
EP - 459
LA - en
SN - 0028-0836, 1476-4687
UR - http://www.nature.com/articles/nature14541
Y2 - 2019/11/28/12:16:49
KW - Bayesian inference
KW - Classical ML
KW - Machine learning
KW - Probabilistic programming
ER -
TY - JOUR
TI - A Predicate/State Transformer Semantics for Bayesian Learning
AU - Jacobs, Bart
AU - Zanasi, Fabio
T2 - Electronic Notes in Theoretical Computer Science
T3 - The Thirty-second Conference on the Mathematical Foundations of Programming Semantics (MFPS XXXII)
AB - This paper establishes a link between Bayesian inference (learning) and predicate and state transformer operations from programming semantics and logic. Specifically, a very general definition of backward inference is given via first applying a predicate transformer and then conditioning. Analogously, forward inference involves first conditioning and then applying a state transformer. These definitions are illustrated in many examples in discrete and continuous probability theory and also in quantum theory.
DA - 2016/10/05/
PY - 2016
DO - 10/ggdgbb
DP - ScienceDirect
VL - 325
SP - 185
EP - 200
J2 - Electronic Notes in Theoretical Computer Science
LA - en
SN - 1571-0661
UR - http://www.sciencedirect.com/science/article/pii/S1571066116300883
Y2 - 2019/11/24/12:04:12
KW - Bayesianism
KW - Categorical ML
KW - Categorical probability theory
KW - Effectus theory
KW - Programming language theory
KW - Semantics
ER -
TY - JOUR
TI - Deep Probabilistic Programming
AU - Tran, Dustin
AU - Hoffman, Matthew D.
AU - Saurous, Rif A.
AU - Brevdo, Eugene
AU - Murphy, Kevin
AU - Blei, David M.
T2 - arXiv:1701.03757 [cs, stat]
AB - We propose Edward, a Turing-complete probabilistic programming language. Edward defines two compositional representations: random variables and inference. By treating inference as a first class citizen, on a par with modeling, we show that probabilistic programming can be as flexible and computationally efficient as traditional deep learning. For flexibility, Edward makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation to variational inference to MCMC. In addition, Edward can reuse the modeling representation as part of inference, facilitating the design of rich variational models and generative adversarial networks. For efficiency, Edward is integrated into TensorFlow, providing significant speedups over existing probabilistic systems. For example, we show on a benchmark logistic regression task that Edward is at least 35x faster than Stan and 6x faster than PyMC3. Further, Edward incurs no runtime overhead: it is as fast as handwritten TensorFlow.
DA - 2017/03/07/
PY - 2017
DP - arXiv.org
UR - http://arxiv.org/abs/1701.03757
Y2 - 2019/11/27/23:15:14
KW - Bayesian inference
KW - Implementation
KW - Machine learning
KW - Probabilistic programming
ER -