Autonomous monitoring and optimization system for IT infrastructure using transformers

Authors

Oleksii Liashenko, Ihor Mykhailichenko

DOI:

https://doi.org/10.30837/2522-9818.2025.1.073

Keywords:

autonomous system; transformers; multidimensional time series; IT infrastructure; anomaly detection; forecasting.

Abstract

The subject matter of the article is an autonomous IT infrastructure monitoring and optimization system that uses transformers to analyze multidimensional time series and detect anomalies in real time. The study reviews current approaches to IT infrastructure monitoring, including machine learning and traditional statistical methods. The literature review shows that existing methods often lose efficiency when system parameters change dynamically. The goal of this research is to develop an autonomous system capable of performing real-time multifactor analysis and responding autonomously to detected threats. A transformer-based model is proposed that enables the detection of complex anomalies and the prediction of failures. The following tasks were solved in the article: formulation of a model for multidimensional time series analysis; development of an algorithm for anomaly detection and problem forecasting; implementation of autonomous adjustment mechanisms for stabilizing the IT infrastructure. The following methods were used: mathematical modeling, machine learning (transformers), statistical analysis (cross-validation), and time series forecasting algorithms. The following results were obtained: the model achieved a mean absolute error (MAE) of 4.3% on synthetic data, confirming its ability to detect anomalies accurately; cross-validation confirmed the stability of training without overfitting, and a residual histogram showed a symmetric error distribution; correlation heatmaps additionally highlighted interdependencies between key IT infrastructure parameters. Conclusions: the proposed system effectively detects and predicts IT infrastructure failures and autonomously adjusts parameters to maintain stability. The developed approach can be integrated into modern IT infrastructure management systems to enhance operational efficiency.
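The article does not include source code, so the following is a minimal illustrative sketch in Python (PyTorch) of the kind of pipeline the abstract describes: a transformer encoder forecasts the next values of several infrastructure metrics, and samples whose mean absolute forecast error exceeds a threshold are flagged as anomalies. All class, function, and metric names here are hypothetical and are not taken from the authors' implementation; the reported MAE of 4.3% refers to the authors' model on their synthetic data, not to this sketch.

import torch
import torch.nn as nn

class MetricForecaster(nn.Module):
    """Transformer encoder that predicts the next step of multivariate IT metrics."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_features)

    def forward(self, window):
        # window: (batch, seq_len, n_features); positional encoding omitted for brevity
        hidden = self.encoder(self.input_proj(window))
        return self.head(hidden[:, -1, :])  # forecast for the next time step

def detect_anomalies(model, windows, targets, threshold=0.1):
    """Flag samples whose mean absolute forecast error exceeds the threshold."""
    model.eval()
    with torch.no_grad():
        preds = model(windows)
        mae_per_sample = (preds - targets).abs().mean(dim=1)
    return mae_per_sample > threshold, mae_per_sample

# Hypothetical usage: 4 normalized metrics (e.g. CPU load, memory, latency, error rate),
# sliding windows of 32 time steps; random tensors stand in for real monitoring data.
model = MetricForecaster(n_features=4)
windows = torch.randn(8, 32, 4)
targets = torch.randn(8, 4)
flags, errors = detect_anomalies(model, windows, targets)
print(flags.tolist(), errors.mean().item())  # per-sample anomaly flags and overall MAE

In a deployed system the anomaly threshold would typically be calibrated on validation residuals rather than fixed, and the autonomous adjustment logic described in the abstract would consume the resulting flags.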

Author Biographies

Oleksii Liashenko, Kharkiv National University of Radio Electronics

PhD (Computer Engineering Science), Associate Professor, Dean at the Faculty of Computer Engineering and Management

Ihor Mykhailichenko, Kharkiv National University of Radio Electronics

Assistant at the Department of Electronic Computing Machines

References

Mykhailichenko, I., Ivashchenko, H., Barkovska, O., Liashenko, O. (2022), "Application of Deep Neural Network for Real-Time Voice Command Recognition", 2022 IEEE 3rd KhPI Week on Advanced Technology (KhPIWeek), Kharkiv, Ukraine, P. 1–4. DOI: 10.1109/KhPIWeek57572.2022.9916473

Barkovska, O., Pyvovarova, D., Kholiiv, V., Ivashchenko, H., Rosinskyi, D. (2021), "Model for Preserving Information Objects with Accelerated Text Processing Methods" ["Model zberezhennia informatsiinykh obiektiv iz pryskorenymy metodamy obrobky tekstiv"], CEUR Workshop Proceedings, No. 2870, P. 286–299. DOI: 10.20944/preprints202412.2147.v1

Hunko, M., Tkachov, V., Liashenko, O., Rabchan, Y. (2022), "Application Architecture for Retrieving Data from Scientometric Databases" ["Arkhitektura zastosunku dlia otrymannia danykh iz naukometrychnykh baz danykh"], IEEE 3rd KhPI Week on Advanced Technology, KhPI Week 2022. DOI: 10.1109/KhPIWeek57572.2022.9916473

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., Polosukhin, I. (2017), "Attention is All You Need", Advances in Neural Information Processing Systems, Vol. 30, P. 5998–6008. DOI: 10.48550/arXiv.1706.03762

Wu, H., Xu, J., Wang, L., Chen, K. (2021), "Long-Term Time-Series Forecasting with Triangular Matrix-Based Attention", Proceedings of the 9th International Conference on Learning Representations (ICLR), 15 p. DOI: 10.48550/arXiv.2106.13008

Kravchenko, L. H., et al. (2018), "Machine Learning in Anomaly Detection" ["Mashynne navchannia u vyiavlenni anomalii"], System Analysis and Applied Informatics, No. 5, P. 90–102.

Melnyk, V. O., et al. (2019), "Informer: A New Approach to Forecasting" ["Informer: novyi pidkhid do prohnozuvannia"], Journal of Artificial Intelligence, No. 4, P. 12–25.

Vasylenko, V. O., Petrov, I. M. (2019), "Methods for Anomaly Detection in Multidimensional Time Series" ["Metody vyiavlennia anomalii u bahatovymirnykh chasovykh riadakh"], Kyiv, 256 p.

Lin, T., Guo, T., Wang, K., Xu, J. (2021), "A Survey on Transformer Architectures in Time Series Applications", arXiv, 29 p. DOI: 10.48550/arXiv.2106.13008

Li, S., Jin, X., Xuan, Y., Zhou, X., Chen, W., Wang, Y.-X., Yan, X. (2019), "Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting", Advances in Neural Information Processing Systems, Vol. 32, P. 5243–5253. DOI: 10.48550/arXiv.1907.00235

Aggarwal, C. C. (2017), "Outlier Analysis", 2nd ed., Springer, Cham, 466 p. DOI: 10.1007/978-3-319-47578-3

Ivanchenko, M. S., Hrytsenko, O. V. (2021), "Application of Transformers in Time Series Analysis" ["Zastosuvannia transformeriv u analizi chasovykh riadiv"], Lviv, 220 p.

Goodfellow, I., Bengio, Y., Courville, A. (2016), "Deep Learning", MIT Press, Cambridge, MA, 775 p.

Zhou, T., et al. (2022), "Fedformer: Frequency Enhanced Decomposed Transformer for Long-Term Series Forecasting", International Conference on Machine Learning, P. 27268–27286. available at: https://proceedings.mlr.press/v162/zhou22g.html

Zhou, H., et al. (2021), "Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting", Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35, No. 12, P. 11106–11115. DOI: 10.1609/aaai.v35i12.17325

Published

2025-03-31

How to Cite

Liashenko, O., & Mykhailichenko, I. (2025). Autonomous monitoring and optimization system for IT infrastructure using transformers. INNOVATIVE TECHNOLOGIES AND SCIENTIFIC SOLUTIONS FOR INDUSTRIES, (1(31)), 73–82. https://doi.org/10.30837/2522-9818.2025.1.073