Stability of Neural Ordinary Differential Equations with Power Nonlinearities
Abstract
ReLU. Note that the purpose of introducing power activation functions is that they allow one to obtain verifiable Lyapunov stability conditions for the solutions of the system of differential equations simulating the corresponding dynamic processes. In turn, Lyapunov stability is one of the guarantees of the adequacy of the neural network model for the process under study. In addition, the global stability (or at least boundedness) of the solutions of the continuous analog implies that the learning process of the corresponding neural network will not diverge for any training sample.
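The link between Lyapunov stability and a power nonlinearity can be illustrated on a toy scalar neural ODE. The equation, parameters, and Lyapunov candidate below are illustrative assumptions for a sketch, not the model studied in the paper: we take dx/dt = -a·x + w·σ(x) with a sign-preserving power activation σ(x) = sign(x)·|x|^p, integrate it by forward Euler, and check numerically that V(x) = x² decreases along the trajectory.

```python
import math

def power_sigma(x, p=3.0):
    """Sign-preserving power activation |x|^p (an illustrative choice)."""
    return math.copysign(abs(x) ** p, x)

def simulate(x0, a=2.0, w=1.0, p=3.0, dt=1e-3, steps=5000):
    """Forward-Euler integration of dx/dt = -a*x + w*sigma(x)."""
    x = x0
    traj = [x]
    for _ in range(steps):
        x = x + dt * (-a * x + w * power_sigma(x, p))
        traj.append(x)
    return traj

traj = simulate(x0=0.5)
# Lyapunov candidate V(x) = x^2: for |x0| small enough the linear
# damping -a*x dominates the power term, so V decreases monotonically
# and the trajectory stays bounded (here it converges to 0).
V = [x * x for x in traj]
print(all(V[i + 1] <= V[i] + 1e-12 for i in range(len(V) - 1)))
```

For a = 2, w = 1, p = 3 the origin is locally asymptotically stable for |x₀| < √2, which is exactly the kind of verifiable condition a Lyapunov analysis of such systems yields; outside that basin the cubic term dominates and solutions blow up.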
Keywords
Full text:
PDF (English)
References
V. Ye. Belozyorov, Ye. M. Kosariev, M. M. Pulin, V. G. Sychenko, V. G. Zaytsev. A new mathematical model of dynamic process in direct current traction power supply system, Journal of Optimization, Differential Equations and Their Applications (JODEA), 27(1) (2019), 21 – 55.
V. Ye. Belozyorov, D. V. Dantsev, S. A. Volkova. On Equivalence of Linear Control Systems and Its Usage to Verification of the Adequacy of Different Models for A Real Dynamic Process, Journal of Optimization, Differential Equations and Their Applications (JODEA), 28(1)(2020), 43 – 97.
J. Jia, A. R. Benson, Neural jump stochastic differential equations, arXiv preprint arXiv:1905.10403v3 [cs.LG], (2020), 1 – 14.
B. Chang, M. Chen, E. Haber, E. H. Chi, AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks, In conference ICLR, May 6 – May 9, New Orleans, Louisiana, USA, (2019), 1 – 15.
R. T. Q. Chen, Y. Rubanova, J. Bettencourt, D. Duvenaud, Neural ordinary differential equations, arXiv preprint arXiv:1806.07366v5 [cs.LG], (2019), 1 – 18.
E. Haber, L. Ruthotto, Stable Architectures for Deep Neural Networks, arXiv preprint arXiv:1705.03341v1 [cs.LG], (2019), 1 – 23.
S. Haykin, Neural Networks. A Comprehensive Foundation, Second Edition, Pearson Education, Prentice Hall, 2005.
E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, The MIT Press Cambridge, Massachusetts, London, England, 2007.
H. K. Khalil, Nonlinear Systems, 2nd Edition, Prentice Hall, New Jersey, 1996.
J. Liang, W. Song, M. Wang, Stock Price Prediction Based on Procedural Neural Networks, Advances in Artificial Neural Systems, 2011, 2011, Article ID 814769, 11 pages.
S. T. Piantadosi, Zipf's word frequency law in natural language: a critical review and future directions, Psychonomic Bulletin and Review, 21(5), (2014), 1112 – 1130.
J. A. Sjogren. Homogeneous Systems and Euclidean Topology, In: SIAM Conference on Applied Algebraic Geometry (AG17), July 31 – August 4, Georgia Tech, Atlanta GA, USA, (2017), 1 – 20.
S. Sonoda, N. Murata, Neural network with unbounded activation functions is universal approximator, Applied and Computational Harmonic Analysis, 43, 2017, pp. 233 – 268.
N.-E. Tatar, Hopfield neural networks with unbounded monotone activation functions, Hindawi Publishing Corporation. Advances in Artificial Neural Systems, 2012, 2012, 571358-1 – 5.
Z. Wang, J. Liang, Y. Liu, Mathematical Problems for Complex Networks, Mathematical Problems in Engineering, 2012, 2012, Article ID 934680, 5 pages.
H. Zhang, X. Gao, J. Unterman, T. Arodz, Approximation capabilities of neural ordinary differential equations, arXiv preprint arXiv:1907.12998v1 [cs.LG], (2019), 1 – 11.
DOI: http://dx.doi.org/10.15421/142005
Editorial board address: 49050, Ukraine, Oles Honchar Dnipro National University, 18 Kozakova St., Building 14, Faculty of Mechanics and Mathematics, Dr. Phys.-Math. Sci., Prof. P. I. Kogut
email: p.kogut@i.ua
This work is licensed under a Creative Commons Attribution 4.0 International License.