Information, 2012

Quaternionic Multilayer Perceptron with Local Analyticity

DOI: 10.3390/info3040756

Keywords: quaternion, local analyticity, Wirtinger calculus, multilayer perceptron, error back-propagation


Abstract:

A multilayer perceptron type neural network is presented and analyzed in this paper. All neuronal parameters, such as inputs, outputs, action potentials, and connection weights, are encoded by quaternions, a class of hypercomplex number systems. A local analyticity condition is imposed on the activation function used to update the neurons' states, in order to construct a learning algorithm for this network. An error back-propagation algorithm is introduced for modifying the connection weights of the network.
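To make the abstract concrete, the sketch below illustrates the kind of arithmetic such a network builds on: a small quaternion class with the noncommutative Hamilton product, and a "locally analytic" activation that treats a quaternion q = a + b·μ (with μ the unit pure-imaginary axis of q) as the complex number a + ib and applies a complex-analytic function to it. The class names, the tanh choice, and the left-multiplication convention are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: quaternion arithmetic and a locally analytic
# activation, in the spirit of the paper's setup (not its actual algorithm).
import cmath
import math


class Quaternion:
    def __init__(self, w, x, y, z):
        self.w, self.x, self.y, self.z = w, x, y, z

    def __add__(self, o):
        return Quaternion(self.w + o.w, self.x + o.x,
                          self.y + o.y, self.z + o.z)

    def __mul__(self, o):
        # Hamilton product; note q1*q2 != q2*q1 in general.
        return Quaternion(
            self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
            self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
            self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
            self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w,
        )


def local_tanh(q):
    """Apply complex tanh along the local imaginary axis of q.

    Writes q = a + b*mu with b >= 0 and mu a unit pure quaternion,
    evaluates tanh(a + ib) as an ordinary complex function, and maps
    the result back onto the same axis mu (the local-analyticity idea).
    """
    a = q.w
    b = math.sqrt(q.x**2 + q.y**2 + q.z**2)
    if b == 0.0:
        return Quaternion(math.tanh(a), 0.0, 0.0, 0.0)
    c = cmath.tanh(complex(a, b))   # analytic on the local complex plane
    s = c.imag / b                  # rescale the imaginary part along mu
    return Quaternion(c.real, s*q.x, s*q.y, s*q.z)


def neuron(weights, inputs, bias):
    """One quaternionic neuron: weighted sum, then activation."""
    s = bias
    for w, x in zip(weights, inputs):
        s = s + w * x               # left-multiplication convention
    return local_tanh(s)
```

Because the Hamilton product is noncommutative, even the choice of w*x rather than x*w in the weighted sum is a modeling decision, and the derivative bookkeeping needed for back-propagation (via Wirtinger-style calculus, per the paper's keywords) depends on it.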

