TY - JOUR
TI - Neural Logic Machines
AU - Dong, Honghua
AU - Mao, Jiayuan
AU - Lin, Tian
AU - Wang, Chong
AU - Li, Lihong
AU - Zhou, Denny
T2 - arXiv:1904.11694 [cs, stat]
AB - We propose the Neural Logic Machine (NLM), a neural-symbolic architecture for both inductive learning and logic reasoning. NLMs exploit the power of both neural networks---as function approximators, and logic programming---as a symbolic processor for objects with properties, relations, logic connectives, and quantifiers. After being trained on small-scale tasks (such as sorting short arrays), NLMs can recover lifted rules, and generalize to large-scale tasks (such as sorting longer arrays). In our experiments, NLMs achieve perfect generalization in a number of tasks, from relational reasoning tasks on the family tree and general graphs, to decision making tasks including sorting arrays, finding shortest paths, and playing the blocks world. Most of these tasks are hard to accomplish for neural networks or inductive logic programming alone.
DA - 2019/04/26/
PY - 2019
DP - arXiv.org
UR - http://arxiv.org/abs/1904.11694
Y2 - 2019/11/24/16:33:13
KW - Abstract machines
KW - Machine learning
KW - Symbolic logic
ER -
TY - JOUR
TI - Neural Turing Machines
AU - Graves, Alex
AU - Wayne, Greg
AU - Danihelka, Ivo
T2 - arXiv:1410.5401 [cs]
AB - We extend the capabilities of neural networks by coupling them to external memory resources, which they can interact with by attentional processes. The combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end, allowing it to be efficiently trained with gradient descent. Preliminary results demonstrate that Neural Turing Machines can infer simple algorithms such as copying, sorting, and associative recall from input and output examples.
DA - 2014/12/10/
PY - 2014
DP - arXiv.org
UR - http://arxiv.org/abs/1410.5401
Y2 - 2019/11/21/21:09:35
KW - Abstract machines
KW - Classical ML
KW - Machine learning
ER -
TY - JOUR
TI - Derivatives of Turing machines in Linear Logic
AU - Clift, James
AU - Murfet, Daniel
T2 - arXiv:1805.11813 [math]
AB - We calculate denotations under the Sweedler semantics of the Ehrhard-Regnier derivatives of various encodings of Turing machines into linear logic. We show that these derivatives calculate the rate of change of probabilities naturally arising in the Sweedler semantics of linear logic proofs. The resulting theory is applied to the problem of synthesising Turing machines by gradient descent.
DA - 2019/01/28/
PY - 2019
DP - arXiv.org
UR - http://arxiv.org/abs/1805.11813
Y2 - 2019/11/21/20:33:27
KW - Abstract machines
KW - Categorical ML
KW - Differentiation
KW - Linear logic
KW - Machine learning
ER -
TY - JOUR
TI - Logic and the $2$-Simplicial Transformer
AU - Clift, James
AU - Doryn, Dmitry
AU - Murfet, Daniel
AU - Wallbridge, James
T2 - arXiv:1909.00668 [cs, stat]
AB - We introduce the $2$-simplicial Transformer, an extension of the Transformer which includes a form of higher-dimensional attention generalising the dot-product attention, and uses this attention to update entity representations with tensor products of value vectors. We show that this architecture is a useful inductive bias for logical reasoning in the context of deep reinforcement learning.
DA - 2019/09/02/
PY - 2019
DP - arXiv.org
UR - http://arxiv.org/abs/1909.00668
Y2 - 2019/11/21/20:31:14
KW - Abstract machines
KW - Algebra
KW - Machine learning
KW - Semantics
ER -
TY - JOUR
TI - Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge
AU - Serafini, Luciano
AU - Garcez, Artur d'Avila
T2 - arXiv:1606.04422 [cs]
AB - We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning. A logic formalism called Real Logic is defined on a first-order language whereby formulas have truth-value in the interval [0,1] and semantics defined concretely on the domain of real numbers. Logical constants are interpreted as feature vectors of real numbers. Real Logic promotes a well-founded integration of deductive reasoning on a knowledge-base and efficient data-driven relational machine learning. We show how Real Logic can be implemented in deep Tensor Neural Networks with the use of Google's tensorflow primitives. The paper concludes with experiments applying Logic Tensor Networks on a simple but representative example of knowledge completion.
DA - 2016/07/07/
PY - 2016
DP - arXiv.org
ST - Logic Tensor Networks
UR - http://arxiv.org/abs/1606.04422
Y2 - 2019/11/24/16:33:44
KW - Abstract machines
KW - Machine learning
KW - Symbolic logic
ER -