TY - BOOK
TI - Algebraic Geometry and Statistical Learning Theory
AU - Watanabe, Sumio
PB - Cambridge University Press
DA - 2009/08//
PY - 2009
LA - en
UR - https://www.cambridge.org/core/books/algebraic-geometry-and-statistical-learning-theory/9C8FD1BDC817E2FC79117C7F41544A3A
Y2 - 2019/11/22/18:05:57
KW - Algebra
KW - Bayesianism
KW - Purely theoretical
KW - Statistical learning theory
ER -
TY - JOUR
TI - What is a statistical model?
AU - McCullagh, Peter
T2 - The Annals of Statistics
DA - 2002/10//
PY - 2002
DO - 10.1214/aos/1035844977
DP - Crossref
VL - 30
IS - 5
SP - 1225
EP - 1310
LA - en
UR - http://projecteuclid.org/euclid.aos/1035844977
Y2 - 2019/11/22/17:39:10
KW - Bayesianism
KW - Categorical ML
KW - Categorical probability theory
KW - Compendium
KW - Purely theoretical
KW - Statistical learning theory
ER -
TY - JOUR
TI - Backprop as Functor: A compositional perspective on supervised learning
AU - Fong, Brendan
AU - Spivak, David I.
AU - Tuyéras, Rémy
T2 - arXiv:1711.10455 [cs, math]
AB - A supervised learning algorithm searches over a set of functions $A \to B$ parametrised by a space $P$ to find the best approximation to some ideal function $f\colon A \to B$. It does this by taking examples $(a,f(a)) \in A\times B$, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent---with respect to a fixed step size and an error function satisfying a certain property---defines a monoidal functor from a category of parametrised functions to this category of update rules. This provides a structural perspective on backpropagation, as well as a broad generalisation of neural networks.
DA - 2019/05/01/
PY - 2019
DP - arXiv.org
ST - Backprop as Functor
UR - http://arxiv.org/abs/1711.10455
Y2 - 2019/11/23/14:42:07
KW - Categorical ML
KW - Machine learning
KW - Purely theoretical
ER -
TY - BOOK
TI - The Combinatory Programme
AU - Engeler, Erwin
T2 - Progress in Theoretical Computer Science
AB - Combinatory logic started as a programme in the foundation of mathematics and in an historical context at a time when such endeavours attracted the most gifted among the mathematicians. This small volume arose under quite different circumstances, namely within the context of reworking the mathematical foundations of computer science. I have been very lucky in finding gifted students who agreed to work with me and chose, for their Ph.D. theses, subjects that arose from my own attempts to create a coherent mathematical view of these foundations. The result of this collaborative work is presented here in the hope that it does justice to the individual contributor and that the reader has a chance of judging the work as a whole. E. Engeler, ETH Zurich, April 1994. In the fall of 1928 a young American turned up at the Mathematical Institute of Göttingen, a mecca of mathematicians at the time; he was a young man with a dream and his name was H. B. Curry. He felt that he had the tools in hand with which to solve the problem of foundations of mathematics once and for all. His was an approach that came to be called "formalist" and embodied what later became known as Combinatory Logic.
DA - 1995///
PY - 1995
DP - www.springer.com
LA - en
PB - Birkhäuser Basel
SN - 978-0-8176-3801-6
UR - https://www.springer.com/gb/book/9780817638016
Y2 - 2019/11/26/14:23:14
KW - Algebra
KW - Programming language theory
KW - Purely theoretical
ER -
TY - JOUR
TI - Bayesian machine learning via category theory
AU - Culbertson, Jared
AU - Sturtz, Kirk
T2 - arXiv:1312.1445 [math]
AB - From the Bayesian perspective, the category of conditional probabilities (a variant of the Kleisli category of the Giry monad, whose objects are measurable spaces and arrows are Markov kernels) gives a nice framework for conceptualization and analysis of many aspects of machine learning. Using categorical methods, we construct models for parametric and nonparametric Bayesian reasoning on function spaces, thus providing a basis for the supervised learning problem. In particular, stochastic processes are arrows to these function spaces which serve as prior probabilities. The resulting inference maps can often be analytically constructed in this symmetric monoidal weakly closed category. We also show how to view general stochastic processes using functor categories and demonstrate the Kalman filter as an archetype for the hidden Markov model.
DA - 2013/12/05/
PY - 2013
DP - arXiv.org
UR - http://arxiv.org/abs/1312.1445
Y2 - 2019/11/22/17:32:35
KW - Bayesianism
KW - Categorical ML
KW - Categorical probability theory
KW - Purely theoretical
ER -