A Theory of Neural Computation with Clifford Algebras
- Also published as: Kiel, doctoral dissertation, 2005.
The present thesis introduces Clifford algebra as a framework for neural computation. Clifford algebras subsume the reals, complex numbers, and quaternions. Neural computation with Clifford algebras is model-based. This principle is established by constructing Clifford algebras from quadratic spaces. Then the subspace grading inherent to any Clifford algebra is introduced, which allows the representation of different geometric entities such as points and lines.

These features of Clifford algebras motivate the introduction of the Basic Clifford Neuron (BCN), which is based solely on the geometric product of the underlying Clifford algebra. Using BCNs, the Linear Associator is generalized to the Clifford associator. As a second type of Clifford neuron, the Spinor Clifford Neuron (SCN) is presented. The propagation function of an SCN is an orthogonal transformation. Examples of how Clifford neurons can be used advantageously are given, including the linear computation of Möbius transformations by an SCN. A systematic basis for Clifford neural computation is provided by the important notions of isomorphic Clifford neurons and isomorphic representations.

Once the neuron level is established, the discussion proceeds to (Spinor) Clifford Multilayer Perceptrons. The treatment is divided into two parts according to the type of activation function used. First, (Spinor) Clifford Multilayer Perceptrons with real-valued activation functions ((S)CMLPs) are studied. A generic backpropagation algorithm for CMLPs is derived, and universal approximation theorems for (S)CMLPs are presented. The efficiency of (S)CMLPs is demonstrated in several simulations. Finally, the class of Clifford Multilayer Perceptrons with Clifford-valued activation functions is studied.
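To make the model-based principle concrete, the following minimal sketch (not code from the thesis; the names `gp` and `bcn` are illustrative) implements a Basic Clifford Neuron over the algebra Cl(0,1), which is isomorphic to the complex numbers. The propagation function is the geometric product with a single multivector weight, so the neuron learns a rotation-and-scaling of the plane with two parameters, where a general real linear neuron would need a 2x2 weight matrix.

```python
import math
import random

def gp(a, b):
    """Geometric product in Cl(0,1), i.e. complex multiplication.
    Elements are pairs (scalar part, e1 part) with e1 * e1 = -1."""
    return (a[0] * b[0] - a[1] * b[1], a[0] * b[1] + a[1] * b[0])

def bcn(w, x):
    """Basic Clifford Neuron (bias omitted): y = w (geometric product) x."""
    return gp(w, x)

# Train the single weight to realise a 90-degree rotation of the plane,
# i.e. left-multiplication by e1.  Two parameters suffice.
random.seed(0)
w = (random.random(), random.random())
lr = 0.1
for _ in range(200):
    phi = random.uniform(0.0, 2.0 * math.pi)
    x = (math.cos(phi), math.sin(phi))    # training input on the unit circle
    t = gp((0.0, 1.0), x)                 # target: x rotated by 90 degrees
    y = bcn(w, x)
    e = (y[0] - t[0], y[1] - t[1])
    # LMS update: w <- w - lr * e * conj(x); conjugation flips the e1 part
    upd = gp(e, (x[0], -x[1]))
    w = (w[0] - lr * upd[0], w[1] - lr * upd[1])

print(w)  # converges to roughly (0.0, 1.0), i.e. the multivector e1
```

Because the inputs lie on the unit circle, the error contracts by the factor (1 - lr) at every step, so the weight converges to e1 regardless of the random initialization.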
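The propagation function of the Spinor Clifford Neuron can likewise be sketched in Cl(0,2), which is isomorphic to the quaternions; the helper names `qmul`, `qconj`, and `scn` below are illustrative, not from the thesis. For a unit spinor w, the sandwich product y = w x w~ acts as an orthogonal transformation, here a 3D rotation of a pure quaternion.

```python
import math

def qmul(a, b):
    """Geometric product in Cl(0,2) ~ quaternions; a, b = (1, i, j, k) parts."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (a0*b0 - a1*b1 - a2*b2 - a3*b3,
            a0*b1 + a1*b0 + a2*b3 - a3*b2,
            a0*b2 - a1*b3 + a2*b0 + a3*b1,
            a0*b3 + a1*b2 - a2*b1 + a3*b0)

def qconj(q):
    """Quaternion conjugation: negate the imaginary (vector) part."""
    return (q[0], -q[1], -q[2], -q[3])

def scn(w, x):
    """Spinor Clifford Neuron: sandwich propagation y = w x w~.
    For unit w this is an orthogonal transformation of pure quaternions."""
    return qmul(qmul(w, x), qconj(w))

# Rotating the vector (1, 0, 0) by 90 degrees about the z-axis:
# the unit spinor encodes half the rotation angle.
c = math.cos(math.pi / 4)
s = math.sin(math.pi / 4)
w = (c, 0.0, 0.0, s)            # unit spinor for a 90-degree z-rotation
x = (0.0, 1.0, 0.0, 0.0)        # the vector (1, 0, 0) as a pure quaternion
y = scn(w, x)
print(y)  # approximately (0.0, 0.0, 1.0, 0.0), i.e. the vector (0, 1, 0)
```

Note that the output is again a pure quaternion: the sandwich product preserves the vector subspace, which is exactly why a single SCN can represent a rotation linearly in its parameters.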