WAVELET-LIKE ARCHITECTURE OF COMPLEX-VALUED CONVOLUTIONAL NEURAL NETWORK FOR COMPLEX SIGNAL SYNTHESIS
https://doi.org/10.34822/1999-7604-2020-2-20-31
Abstract
The paper proposes an architecture for a complex-valued convolutional neural network built upon the structure of the discrete wavelet transform. This architecture performs a multiscale decomposition of a complex signal, forming a set of features that can be used for signal synthesis and classification tasks. The article presents the results of predicting the values of a chaotic complex signal with a neural network based on the proposed architecture.
The obtained results are compared with those achieved by real-valued neural networks based on alternative modern approaches. An analysis of the proposed complex-valued convolutional network in the frequency domain is also carried out, made possible by its relatively small number of adaptive parameters.
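The multiscale decomposition described above can be illustrated with a minimal sketch: a stack of complex-valued causal convolutions whose dilation doubles at each level, mirroring the dyadic scales of a discrete wavelet transform. The function `complex_dilated_conv`, the random kernels, and the four-level depth are hypothetical illustrations under these assumptions, not the paper's actual layers:

```python
import numpy as np

def complex_dilated_conv(x, w, dilation):
    """Causal dilated convolution of a complex signal x with a complex kernel w."""
    k = len(w)
    pad = (k - 1) * dilation                      # left-pad so output stays causal
    xp = np.concatenate([np.zeros(pad, dtype=complex), x])
    y = np.zeros(len(x), dtype=complex)
    for i in range(len(x)):
        # taps at dilated positions: current sample and samples `dilation` apart
        taps = xp[i + pad - np.arange(k) * dilation]
        y[i] = np.dot(w, taps)
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)  # toy complex signal

# Wavelet-like stack: dilation doubles per level, so each level's receptive
# field covers a coarser scale, as in a dyadic wavelet decomposition.
features = []
h = x
for level in range(4):
    w = rng.standard_normal(2) + 1j * rng.standard_normal(2)  # hypothetical kernel
    h = complex_dilated_conv(h, w, dilation=2 ** level)
    features.append(h)  # one complex feature map per scale
```

Each entry of `features` is a feature map at one scale; in the paper's setting such maps would feed the synthesis or classification head.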
About the Author
D. A. Karavaev (Russian Federation)
References
1. LeCun Y., Bengio Y., Hinton G. E. Deep Learning // Nature. 2015. Vol. 521, No. 7553. P. 436–444. DOI: https://doi.org/10.1038/nature14539.
2. Dzhigan V. I. Adaptive Filters and Their Applications in Radio Engineering and Communications. Part 3 // Sovremennaya Elektronika. 2010. Vol. 2. P. 70–77. (In Russ.)
3. Hirose A. Complex-Valued Neural Networks. Studies in Computational Intelligence. Springer-Verlag, 2006. URL: https://books.google.ru/books?id=d2INKuuzQjEC (accessed: 30.04.2020).
4. Trabelsi C., Bilaniuk O., Serdyuk D., et al. Deep Complex Networks // CoRR. 2017. URL: http://arxiv.org/abs/1705.09792 (accessed: 30.04.2020).
5. Ioffe S., Szegedy C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift // CoRR. 2015. URL: http://arxiv.org/abs/1502.03167 (accessed: 30.04.2020).
6. Bruna J., Chintala S., LeCun Y., et al. A Theoretical Argument for Complex-Valued Convolutional Networks // CoRR. 2015. URL: http://arxiv.org/abs/1503.03438 (accessed: 30.04.2020).
7. Theano Development Team. Theano: A Python Framework for Fast Computation of Mathematical Expressions // arXiv e-prints. 2016. URL: http://arxiv.org/abs/1605.02688 (accessed: 30.04.2020).
8. Keras: the Python deep learning API. URL: https://keras.io (accessed: 30.04.2020).
9. Hunter J. D. Matplotlib: A 2D Graphics Environment // Computing in Science & Engineering. 2007. Vol. 9, No. 3. P. 90–95.
10. Yu F., Koltun V. Multi-Scale Context Aggregation by Dilated Convolutions // CoRR. 2015. URL: http://arxiv.org/abs/1511.07122 (accessed: 30.04.2020).
11. Santurkar S., Tsipras D., Ilyas A., et al. How Does Batch Normalization Help Optimization? // Advances in Neural Information Processing Systems. 2018. Vol. 31. P. 2483–2493.
12. Clevert D. A., Unterthiner T., Hochreiter S. Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) // CoRR. 2015. URL: http://arxiv.org/abs/1511.07289 (accessed: 30.04.2020).
13. Arjovsky M., Shah A., Bengio Y. Unitary Evolution Recurrent Neural Networks // CoRR. 2015. URL: http://arxiv.org/abs/1511.06464 (accessed: 30.04.2020).
14. Mallat S. A Wavelet Tour of Signal Processing, Third Edition: The Sparse Way. Orlando, FL, USA: Academic Press, Inc., 2008.
15. Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016.
16. He K., Zhang X., Ren S., et al. Deep Residual Learning for Image Recognition // CoRR. 2015. URL: http://arxiv.org/abs/1512.03385 (accessed: 30.04.2020).
17. Van den Oord A., Dieleman S., Zen H., et al. WaveNet: A Generative Model for Raw Audio // CoRR. 2016. URL: http://arxiv.org/abs/1609.03499 (accessed: 30.04.2020).
18. Borovykh A., Bohte S. M., Oosterlee C. W. Conditional Time Series Forecasting with Convolutional Neural Networks // CoRR. 2017. URL: http://arxiv.org/abs/1703.04691 (accessed: 30.04.2020).
19. Dozat T. Incorporating Nesterov Momentum into Adam // Workshop track – ICLR. 2016. URL: https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ (accessed: 30.04.2020).
20. Glorot X., Bengio Y. Understanding the Difficulty of Training Deep Feedforward Neural Networks // Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS'10): Society for Artificial Intelligence and Statistics. 2010.
21. Ikeda K. Multiple-Valued Stationary State and its Instability of the Transmitted Light by a Ring Cavity System // Optics Communications. 1979. Vol. 30, No. 2. P. 257–261.
22. Mandic D. P., Still S., Douglas S. C. Duality between Widely Linear and Dual Channel Adaptive Filtering // 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. 2009. P. 1729–1732.
23. Hochreiter S., Schmidhuber J. Long Short-Term Memory // Neural Computation. 1997. Vol. 9, No. 8. P. 1735–1780.
24. Rusch L. A., Poor H. V. Narrowband Interference Suppression in CDMA Spread Spectrum Communications // IEEE Transactions on Communications. 1994. Vol. 42, No. 234. P. 1969–1979.
For citations:
Karavaev D.A. WAVELET-LIKE ARCHITECTURE OF COMPLEX-VALUED CONVOLUTIONAL NEURAL NETWORK FOR COMPLEX SIGNAL SYNTHESIS. Proceedings in Cybernetics. 2020;(2 (38)):20-31. (In Russ.) https://doi.org/10.34822/1999-7604-2020-2-20-31