CATEGORICAL LOGIC
MACHINE LEARNING

Neural Nets via Forward State Transformation and Backward Loss Transformation

Resource type
Preprint
Authors/contributors
Bart Jacobs, David Sprunger
Title
Neural Nets via Forward State Transformation and Backward Loss Transformation
Abstract
This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved --- both forward and backward --- in order to develop a semantical/logical perspective that is in line with standard program semantics. The common two-pass neural network training algorithms make this viewpoint particularly fitting. In the forward direction, neural networks act as state transformers. In the reverse direction, however, they transform losses of outputs into losses of inputs, thereby acting like (real-valued) predicate transformers. In this way, backpropagation is functorial by construction, as shown in other recent work. We illustrate this perspective by training a simple instance of a neural network.
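
The abstract reads training as two composable passes: forward, the network transforms states; backward, it transforms losses. As a rough illustration of that reading (a plain NumPy sketch under my own assumptions, not the paper's categorical formalism; the layer sizes, learning rate, and XOR task are invented for the example), the snippet below runs a small multilayer perceptron forward as a state transformer and then pulls the output loss gradient back through the layers in reverse order.

import numpy as np

# Minimal sketch of the two-pass view: forward, each layer is a state
# transformer mapping an input vector to an output vector; backward, each
# layer transforms a loss gradient on its output into a loss gradient on
# its input, so the backward pass composes the layers in reverse.
# Architecture, learning rate, and data are arbitrary choices for this example.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Layer:
    """Affine map plus sigmoid, with a matching backward loss transformer."""

    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=1.0, size=(n_out, n_in))
        self.b = np.zeros(n_out)

    def forward(self, x):
        # Forward state transformation: state x -> state y.
        self.x = x
        self.y = sigmoid(self.W @ x + self.b)
        return self.y

    def backward(self, dL_dy, lr=0.5):
        # Backward loss transformation: pull a gradient on the output back
        # to a gradient on the input (a vector-Jacobian product), updating
        # the parameters along the way.
        dL_dz = dL_dy * self.y * (1.0 - self.y)   # through the sigmoid
        dL_dx = self.W.T @ dL_dz                  # loss on the layer's input
        self.W -= lr * np.outer(dL_dz, self.x)
        self.b -= lr * dL_dz
        return dL_dx

layers = [Layer(2, 4), Layer(4, 1)]
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

for _ in range(10000):
    for x, t in data:
        state = np.array(x, dtype=float)
        for layer in layers:                      # forward pass
            state = layer.forward(state)
        loss_grad = state - np.array([float(t)])  # gradient of 0.5*(y - t)^2
        for layer in reversed(layers):            # backward pass
            loss_grad = layer.backward(loss_grad)

for x, t in data:
    state = np.array(x, dtype=float)
    for layer in layers:
        state = layer.forward(state)
    print(x, t, round(float(state[0]), 3))
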
Publication
arXiv:1803.09356 [cs]
Date
2018-03-25
Accessed
2019-11-21T20:40:18Z
Citation
Jacobs, B., & Sprunger, D. (2018). Neural Nets via Forward State Transformation and Backward Loss Transformation. arXiv:1803.09356 [cs]. Retrieved from http://arxiv.org/abs/1803.09356

Graph of references

(from Zotero to Gephi via Zotnet with this script)
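
The caption above refers to a Zotero-to-Gephi export via Zotnet; the linked script is not reproduced here. As a rough, hypothetical sketch of that kind of pipeline step (the paper names, edge list, and output filename are placeholders), one can assemble a directed citation graph with networkx and write it to GEXF, a format Gephi opens directly.

import networkx as nx

# Hypothetical sketch of a Zotero-to-Gephi export step (not the Zotnet
# script itself): build a directed "A cites B" graph and write it to GEXF,
# which Gephi can open directly.  The entries below are placeholders.
citations = [
    ("Paper A", "Paper B"),  # Paper A cites Paper B
    ("Paper A", "Paper C"),
    ("Paper B", "Paper C"),
]

G = nx.DiGraph()
G.add_edges_from(citations)

nx.write_gexf(G, "graph_of_references.gexf")
print(f"Wrote {G.number_of_nodes()} nodes and {G.number_of_edges()} edges")
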