TY - JOUR
TI - Probabilistic machine learning and artificial intelligence
AU - Ghahramani, Zoubin
T2 - Nature
DA - 2015/05//
PY - 2015
DO - 10.1038/nature14541
DP - Crossref
VL - 521
IS - 7553
SP - 452
EP - 459
LA - en
SN - 0028-0836, 1476-4687
UR - http://www.nature.com/articles/nature14541
Y2 - 2019/11/28/12:16:49
KW - Bayesian inference
KW - Classical ML
KW - Machine learning
KW - Probabilistic programming
ER -
TY - JOUR
TI - A Tutorial on Learning With Bayesian Networks
AU - Heckerman, David
AB - A Bayesian network is a graphical model that encodes probabilistic relationships among variables of interest. When used in conjunction with statistical techniques, the graphical model has several advantages for data analysis. One, because the model encodes dependencies among all variables, it readily handles situations where some data entries are missing. Two, a Bayesian network can …
DA - 1995/03/01/
PY - 1995
DP - www.microsoft.com
LA - en-US
UR - https://www.microsoft.com/en-us/research/publication/a-tutorial-on-learning-with-bayesian-networks/
Y2 - 2019/11/22/19:09:15
KW - Bayesianism
KW - Classical ML
KW - Machine learning
ER -
TY - CHAP
TI - Graphical Models: Overview
AU - Wermuth, N.
AU - Cox, D. R.
T2 - International Encyclopedia of the Social & Behavioral Sciences
A2 - Smelser, Neil J.
A2 - Baltes, Paul B.
AB - Graphical Markov models provide a method of representing possibly complicated multivariate dependencies in such a way that the general qualitative features can be understood, that statistical independencies are highlighted, and that some properties can be derived directly. Variables are represented by the nodes of a graph. Pairs of nodes may be joined by an edge. Edges are directed if one variable is a response to the other variable considered as explanatory, but are undirected if the variables are on an equal footing. Absence of an edge typically implies statistical independence, conditional, or marginal depending on the kind of graph. The need for a number of types of graph arises because it is helpful to represent a number of different kinds of dependence structures. Of special importance are chain graphs in which variables are arranged in a sequence or chain of blocks, the variables in any one block being on an equal footing, some being possibly joint responses to variables in the past and some being jointly explanatory to variables in the future of the block considered. Some main properties of such systems are outlined, and recent research results are sketched. Suggestions for further reading are given. As an illustrative example, some analysis of data on the treatment of chronic pain is presented.
CY - Oxford
DA - 2001/01/01/
PY - 2001
DP - ScienceDirect
SP - 6379
EP - 6386
LA - en
PB - Pergamon
SN - 978-0-08-043076-8
ST - Graphical Models
UR - http://www.sciencedirect.com/science/article/pii/B008043076700440X
Y2 - 2019/11/22/19:12:23
KW - Bayesianism
KW - Classical ML
KW - Machine learning
ER -
TY - BOOK
TI - Model-Based Machine Learning
AU - Winn, John Michael
AB - This book is unusual for a machine learning text book in that the authors do not review dozens of different algorithms. Instead they introduce all of the key ideas through a series of case studies involving real-world applications. Case studies play a central role because it is only in the context of applications that it makes sense to discuss modelling assumptions. Each chapter therefore introduces one case study which is drawn from a real-world application that has been solved using a model-based approach.
DA - 2019/06//
PY - 2019
DP - Google Books
SP - 400
LA - en
PB - Taylor & Francis Incorporated
SN - 978-1-4987-5681-5
KW - Bayesian inference
KW - Classical ML
KW - Implementation
ER -