TY - CHAP
TI - Graphical Models: Overview
AU - Wermuth, N.
AU - Cox, D. R.
T2 - International Encyclopedia of the Social & Behavioral Sciences
A2 - Smelser, Neil J.
A2 - Baltes, Paul B.
AB - Graphical Markov models provide a method of representing possibly complicated multivariate dependencies in such a way that the general qualitative features can be understood, that statistical independencies are highlighted, and that some properties can be derived directly. Variables are represented by the nodes of a graph. Pairs of nodes may be joined by an edge. Edges are directed if one variable is a response to the other variable considered as explanatory, but are undirected if the variables are on an equal footing. Absence of an edge typically implies statistical independence, conditional or marginal depending on the kind of graph. The need for a number of types of graph arises because it is helpful to represent a number of different kinds of dependence structures. Of special importance are chain graphs in which variables are arranged in a sequence or chain of blocks, the variables in any one block being on an equal footing, some being possibly joint responses to variables in the past and some being jointly explanatory to variables in the future of the block considered. Some main properties of such systems are outlined, and recent research results are sketched. Suggestions for further reading are given. As an illustrative example, some analysis of data on the treatment of chronic pain is presented.
CY - Oxford
DA - 2001/01/01/
PY - 2001
DP - ScienceDirect
SP - 6379
EP - 6386
LA - en
PB - Pergamon
SN - 978-0-08-043076-8
ST - Graphical Models
UR - http://www.sciencedirect.com/science/article/pii/B008043076700440X
Y2 - 2019/11/22/19:12:23
KW - Bayesianism
KW - Classical ML
KW - Machine learning
ER -
TY - CHAP
TI - Commutative Semantics for Probabilistic Programming
AU - Staton, Sam
T2 - Programming Languages and Systems
A2 - Yang, Hongseok
AB - We show that a measure-based denotational semantics for probabilistic programming is commutative. The idea underlying probabilistic programming languages (Anglican, Church, Hakaru, ...) is that programs express statistical models as a combination of prior distributions and likelihood of observations. The product of prior and likelihood is an unnormalized posterior distribution, and the inference problem is to find the normalizing constant. One common semantic perspective is thus that a probabilistic program is understood as an unnormalized posterior measure, in the sense of measure theory, and the normalizing constant is the measure of the entire semantic domain.
CY - Berlin, Heidelberg
DA - 2017///
PY - 2017
DP - Crossref
VL - 10201
SP - 855
EP - 879
LA - en
PB - Springer Berlin Heidelberg
SN - 978-3-662-54433-4 978-3-662-54434-1
UR - http://link.springer.com/10.1007/978-3-662-54434-1_32
Y2 - 2019/11/23/16:35:50
KW - Bayesianism
KW - Probabilistic programming
KW - Programming language theory
KW - Semantics
ER -
TY - CHAP
TI - Probabilistic Automata: System Types, Parallel Composition and Comparison
AU - Sokolova, Ana
AU - de Vink, Erik P.
T2 - Validation of Stochastic Systems
A2 - Baier, Christel
A2 - Haverkort, Boudewijn R.
A2 - Hermanns, Holger
A2 - Katoen, Joost-Pieter
A2 - Siegle, Markus
A3 - Goos, Gerhard
A3 - Hartmanis, Juris
A3 - van Leeuwen, Jan
AB - We survey various notions of probabilistic automata and probabilistic bisimulation, accumulating in an expressiveness hierarchy of probabilistic system types. The aim of this paper is twofold: On the one hand it provides an overview of existing types of probabilistic systems and, on the other hand, it explains the relationship between these models. We overview probabilistic systems with discrete probabilities only. The expressiveness order used to build the hierarchy is defined via the existence of mappings between the corresponding system types that preserve and reflect bisimilarity. Additionally, we discuss parallel composition for the presented types of systems, augmenting the map of probabilistic automata with closedness under this compositional operator.
CY - Berlin, Heidelberg
DA - 2004///
PY - 2004
DP - Crossref
VL - 2925
SP - 1
EP - 43
LA - en
PB - Springer Berlin Heidelberg
SN - 978-3-540-22265-1 978-3-540-24611-4
ST - Probabilistic Automata
UR - http://link.springer.com/10.1007/978-3-540-24611-4_1
Y2 - 2019/11/28/16:11:25
KW - Coalgebras
KW - Probabilistic transition systems
KW - Transition systems
ER -
TY - CHAP
TI - Tomaso A. Poggio autobiography
AU - Poggio, Tomaso
DA - 2013///
PY - 2013
SP - 54
UR - http://poggio-lab.mit.edu/sites/default/files/cv/tomasopoggio.pdf
KW - Classical ML
KW - Compendium
KW - Machine learning
ER -
TY - CHAP
TI - Tools for the Advancement of Objective Logic: Closed Categories and Toposes
AU - Lawvere, F. William
T2 - The Logical Foundations of Cognition
A2 - Macnamara, John
A2 - Reyes, Gonzalo E.
DA - 1994///
PY - 1994
DP - PhilPapers
SP - 43
EP - 56
PB - Oxford University Press USA
ST - Tools for the Advancement of Objective Logic
KW - Compendium
KW - Emergence
KW - Psychology
KW - Sketchy
ER -
TY - CHAP
TI - Neural Algebra and Consciousness: A Theory of Structural Functionality in Neural Nets
AU - Engeler, Erwin
T2 - Algebraic Biology
A2 - Horimoto, Katsuhisa
A2 - Regensburger, Georg
A2 - Rosenkranz, Markus
A2 - Yoshida, Hiroshi
AB - Thoughts are spatio-temporal patterns of coalitions of firing neurons and their interconnections. Neural algebras represent these patterns as formal algebraic objects, and a suitable composition operation reflects their interaction. Thus, a neural algebra is associated with any neural net. The present paper presents this formalization and develops the basic algebraic tools for formulating and solving the problem of finding the neural correlates of concepts such as reflection, association, coordination, etc. The main application is to the notion of consciousness, whose structural and functional basis is made explicit as the emergence of a set of solutions to a fixpoint equation.
CY - Berlin, Heidelberg
DA - 2008///
PY - 2008
DP - Crossref
VL - 5147
SP - 96
EP - 109
LA - en
PB - Springer Berlin Heidelberg
SN - 978-3-540-85100-4 978-3-540-85101-1
ST - Neural Algebra and Consciousness
UR - http://link.springer.com/10.1007/978-3-540-85101-1_8
Y2 - 2019/11/22/18:24:23
KW - Emergence
KW - Neuroscience
KW - Sketchy
ER -