TY - JOUR
TI - A Predicate/State Transformer Semantics for Bayesian Learning
AU - Jacobs, Bart
AU - Zanasi, Fabio
T2 - Electronic Notes in Theoretical Computer Science
T3 - The Thirty-second Conference on the Mathematical Foundations of Programming Semantics (MFPS XXXII)
AB - This paper establishes a link between Bayesian inference (learning) and predicate and state transformer operations from programming semantics and logic. Specifically, a very general definition of backward inference is given via first applying a predicate transformer and then conditioning. Analogously, forward inference involves first conditioning and then applying a state transformer. These definitions are illustrated in many examples in discrete and continuous probability theory and also in quantum theory.
DA - 2016/10/05/
PY - 2016
DO - 10/ggdgbb
DP - ScienceDirect
VL - 325
SP - 185
EP - 200
J2 - Electronic Notes in Theoretical Computer Science
LA - en
SN - 1571-0661
UR - http://www.sciencedirect.com/science/article/pii/S1571066116300883
Y2 - 2019/11/24/12:04:12
KW - Bayesianism
KW - Categorical ML
KW - Categorical probability theory
KW - Effectus theory
KW - Programming language theory
KW - Semantics
ER -
TY - JOUR
TI - The Logical Essentials of Bayesian Reasoning
AU - Jacobs, Bart
AU - Zanasi, Fabio
T2 - arXiv:1804.01193 [cs]
AB - This chapter offers an accessible introduction to the channel-based approach to Bayesian probability theory. This framework rests on algebraic and logical foundations, inspired by the methodologies of programming language semantics. It offers a uniform, structured and expressive language for describing Bayesian phenomena in terms of familiar programming concepts, like channel, predicate transformation and state transformation. The introduction also covers inference in Bayesian networks, which will be modelled by a suitable calculus of string diagrams.
DA - 2018/04/27/
PY - 2018
DP - arXiv.org
UR - http://arxiv.org/abs/1804.01193
Y2 - 2019/11/21/20:39:51
KW - Bayesianism
KW - Categorical probability theory
ER -
TY - JOUR
TI - A Formal Semantics of Influence in Bayesian Reasoning
AU - Jacobs, Bart
AU - Zanasi, Fabio
PB - Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik GmbH, Wadern/Saarbruecken, Germany
AB - This paper proposes a formal definition of influence in Bayesian reasoning, based on the notions of state (as probability distribution), predicate, validity and conditioning. Our approach highlights how conditioning a joint entwined/entangled state with a predicate on one of its components has 'crossover' influence on the other components. We use the total variation metric on probability distributions to quantitatively measure such influence. These insights are applied to give a rigorous explanation of the fundamental concept of d-separation in Bayesian networks.
DA - 2017///
PY - 2017
DO - 10/ggdgbc
DP - DataCite
LA - en
UR - http://drops.dagstuhl.de/opus/volltexte/2017/8089/
Y2 - 2019/11/24/12:11:15
KW - Bayesianism
KW - Categorical probability theory
KW - Programming language theory
KW - Semantics
ER -
TY - JOUR
TI - Causal Inference by String Diagram Surgery
AU - Jacobs, Bart
AU - Kissinger, Aleks
AU - Zanasi, Fabio
T2 - arXiv:1811.08338 [cs, math]
AB - Extracting causal relationships from observed correlations is a growing area in probabilistic reasoning, originating with the seminal work of Pearl and others from the early 1990s. This paper develops a new, categorically oriented view based on a clear distinction between syntax (string diagrams) and semantics (stochastic matrices), connected via interpretations as structure-preserving functors. A key notion in the identification of causal effects is that of an intervention, whereby a variable is forcefully set to a particular value independent of any prior propensities. We represent the effect of such an intervention as an endofunctor which performs 'string diagram surgery' within the syntactic category of string diagrams. This diagram surgery in turn yields a new, interventional distribution via the interpretation functor. While in general there is no way to compute interventional distributions purely from observed data, we show that this is possible in certain special cases using a calculational tool called comb disintegration. We demonstrate the use of this technique on a well-known toy example, where we predict the causal effect of smoking on cancer in the presence of a confounding common cause. After developing this specific example, we show this technique provides simple sufficient conditions for computing interventions which apply to a wide variety of situations considered in the causal inference literature.
DA - 2019/07/28/
PY - 2019
DP - arXiv.org
UR - http://arxiv.org/abs/1811.08338
Y2 - 2019/11/21/20:42:12
KW - Bayesianism
KW - Categorical probability theory
KW - Implementation
ER -