A weighted sensitivity metric for predicting latency in a Kafka cluster
DOI: https://doi.org/10.30837/2522-9818.2025.3.152

Keywords: Morris method; Sobol index; Euclidean distance; Hellinger distance; Jensen divergence; sensitivity analysis.

Abstract
The subject of this article is sensitivity analysis methods used to assess how variations in input parameters affect the output of a model or system. The aim of the study is to develop a new approach to sensitivity analysis that combines classical methods of parameter impact assessment (the Morris and Sobol methods) with metrics that capture structural changes in the distribution of output data (Euclidean distance, Hellinger distance, Jensen divergence). This approach makes it possible to evaluate the influence of a parameter not only by the amplitude of its effect, but also by the changes it induces in the shape and structure of the probability distribution of the results. To achieve this objective, the article addresses the following tasks: formally defining the new sensitivity analysis approach; developing a Bayesian network for modeling end-to-end latency in a Kafka cluster; performing sensitivity analysis using the proposed approach; and conducting an experimental study in which the calculated parameter influence weights initialize the weight matrix of a neural network that predicts Kafka cluster latency from selected configuration parameters. To accomplish these tasks, the study applied methods from the theory of experiments, Euclidean geometry, statistical distribution theory, information theory, machine learning, Bayesian statistics, and graph theory. Results: To evaluate the effectiveness of the proposed approach, a neural network was trained comparatively under several weight initialization strategies. Analysis of the loss function, constructed using the mean squared error minimization criterion, showed that the lowest values were achieved by the model initialized with weights obtained from the proposed parameter influence estimation approach. Conclusions: The study proposes a novel approach to sensitivity analysis.
The innovation lies in integrating the strengths of both causal-oriented and variance-based methods within a unified weighted sensitivity metric. The practical value of this approach is that its application in sensitivity analysis or neural network weight initialization improves the accuracy of parameter impact assessment, enhances model convergence, and reduces training time.
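The abstract does not spell out the exact form of the weighted metric or the initialization procedure, so the following Python sketch is only an illustration under stated assumptions: the combination coefficients (0.4/0.3/0.3), the Kafka parameter names, the latency histograms, and the Glorot-uniform base initialization are all assumed, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def hellinger(p, q):
    """Hellinger distance between two discrete distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def jensen_shannon(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2, bounded in [0, 1])."""
    p, q = np.asarray(p, float) + eps, np.asarray(q, float) + eps
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Baseline end-to-end latency histogram vs. histograms observed after
# perturbing each of three illustrative Kafka producer parameters.
baseline = np.array([0.50, 0.30, 0.15, 0.05])
perturbed = {
    "batch.size": np.array([0.10, 0.20, 0.40, 0.30]),
    "linger.ms":  np.array([0.40, 0.35, 0.15, 0.10]),
    "acks":       np.array([0.48, 0.30, 0.16, 0.06]),
}

def weighted_score(p, q, w=(0.4, 0.3, 0.3)):
    """Amplitude term (Euclidean) combined with two shape-of-distribution
    terms (Hellinger, Jensen-Shannon); coefficients w are assumed."""
    return (w[0] * np.linalg.norm(p - q)
            + w[1] * hellinger(p, q)
            + w[2] * jensen_shannon(p, q))

scores = np.array([weighted_score(baseline, q) for q in perturbed.values()])
sens = scores / scores.sum()   # normalized parameter influence weights

# Scale a Glorot-uniform input weight matrix row-wise by the influence
# weights, so more influential parameters start with larger connections.
n_in, n_hidden = sens.size, 8
limit = np.sqrt(6.0 / (n_in + n_hidden))
W0 = rng.uniform(-limit, limit, size=(n_in, n_hidden)) * sens[:, None]
```

In this sketch a parameter that visibly reshapes the latency histogram (here `batch.size`) receives a larger influence weight than one that barely moves it (here `acks`), and therefore a larger initial footprint in the network's first layer.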
References
Solovei, O., Honcharenko, T., Fesan, A. (2024), "Technologies to manager big data of urban building projects", Management of Development of Complex Systems, No. 60, P. 121–128. DOI: 10.32347/2412-9933.2024.60.121-128
Narkhede, M. V., Bartakke, P. P., Sutaone, M. S. (2022), "A review on weight initialization strategies for neural networks", Artificial Intelligence Review, No. 55(1), P. 291–322. DOI: 10.1007/s10462-021-10033-z
Wong, K., Dornberger, R., Hanne, T. (2024), "An analysis of weight initialization methods in connection with different activation functions for feedforward neural networks", Evolutionary Intelligence, No. 17(3), P. 2081–2089. DOI: 10.1007/s12065-022-00795-y
Brand, J. E., Zhou, X., Xie, Y. (2023), "Recent developments in causal inference and machine learning", Annual Review of Sociology, No. 49(1), P. 81–110. DOI: 10.1146/annurev-soc-030420-015345
Chumachenko, K., Iosifidis, A., Gabbouj, M. (2022), "Feedforward neural networks initialization based on discriminant learning", Neural Networks, No. 146, P. 220–229. DOI: 10.1016/j.neunet.2021.11.020
Zhao, J., Schäfer, F., Anandkumar, A. (2021), "Zero initialization: Initializing neural networks with only zeros and ones", Transactions on Machine Learning Research, No. 11. Available at: arXiv preprint arXiv:2110.12661
Pan, Y., Wang, C., Wu, Z., Wang, Q., Zhang, M., Xu, Z. (2025), "IDInit: A Universal and Stable Initialization Method for Neural Network Training". Available at: arXiv preprint arXiv:2503.04626
Zhu, C., Ni, R., Xu, Z., Kong, K., Huang, W. R., Goldstein, T. (2021), "GradInit: Learning to initialize neural networks for stable and efficient training", Advances in Neural Information Processing Systems, No. 34, P. 16410–16422.
Nouri, A., van Treeck, C., Frisch, J. (2024), "Sensitivity Assessment of Building Energy Performance Simulations Using MARS Meta-Modeling in Combination with Sobol' Method", Energies, No. 17(3), 695. DOI: 10.3390/en17030695
Sadeghi, Z., Matwin, S. (2024), "A Review of Global Sensitivity Analysis Methods and a comparative case study on Digit Classification". Available at: arXiv preprint arXiv:2406.16975
Mazo, G., Tournier, L. (2025), "An inference method for global sensitivity analysis", Technometrics, No. 67(2), P. 270–282.
Kozniewski, M., Kolendo, Ł., Chmur, S., Ksepko, M. (2025), "Impact of Parameters and Tree Stand Features on Accuracy of Watershed-Based Individual Tree Crown Detection Method Using ALS Data in Coniferous Forests from North-Eastern Poland", Remote Sensing, No. 17(4), 575. DOI: 10.3390/rs17040575
Kaddoura, M., Majeau-Bettez, G., Amor, B., Margni, M. (2025), "Global sensitivity analysis reduces data collection efforts in LCA: A comparison between two additive manufacturing technologies", Science of the Total Environment, No. 975, 179269. DOI: 10.1016/j.scitotenv.2025.179269
Raptis, T. P., Passarella, A. (2023), "A survey on networked data streaming with apache kafka", IEEE Access, No. 11, P. 85333–85350. DOI: 10.1109/ACCESS.2023.3303810
"Kafka Producer Configuration Reference for Confluent Platform". Available at: https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html
Wang, J., Chen, Z., Song, Y., Liu, Y., He, J., Ma, S. (2024), "Data-Driven Dynamic Bayesian Network Model for Safety Resilience Evaluation of Prefabricated Building Construction", Buildings, No. 14, 570. DOI: 10.3390/buildings14030570
Echabarri, S., Do, P., Vu, H., Bornand, B. (2024), "Machine learning and Bayesian optimization for performance prediction of proton-exchange membrane fuel cells", Energy and AI, No. 17, 100380. DOI: 10.1016/j.egyai.2024.100380
License

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Our journal abides by the Creative Commons copyright rights and permissions for open access journals.
Authors who publish with this journal agree to the following terms:
Authors hold the copyright without restrictions and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
Authors are able to enter into separate, additional contractual arrangements for the non-commercial and non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
Authors are permitted and encouraged to post their published work online (e.g., in institutional repositories or on their website) as it can lead to productive exchanges, as well as earlier and greater citation of published work.