Robust identification of non-stationary objects with non-Gaussian interference

Authors

Oleg Rudenko, Oleksandr Bezsonov, Oleh Lebediev, Nataliia Serdiuk

DOI:

https://doi.org/10.15587/1729-4061.2019.181256

Keywords:

Markov model, gradient algorithm, mixing parameter, recurrent procedure, asymptotic estimate, identification accuracy

Abstract

The problem of identifying the non-stationary parameters of a linear object described by a first-order Markov model under non-Gaussian interference is considered. The identification algorithm is a gradient procedure minimizing a combined functional, which consists of a quadratic and a modular functional whose weights are set by a mixing parameter. Such a combination of functionals makes it possible to obtain estimates with robust properties. The identification algorithm does not require knowledge of the degree of non-stationarity of the investigated object and is the simplest possible, since only the information from a single measurement cycle (step) is used in constructing the model. The use of the Markov model is quite effective, as it allows analytical estimates of the properties of the algorithm to be obtained.
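
For illustration only, the following Python sketch (an assumption, not the authors' exact formulation) shows an identification scheme of the kind described above: the object parameters drift according to a first-order Markov model, the measurement interference is impulsive (non-Gaussian), and the estimate is updated by a gradient step on a combined quadratic/modular criterion weighted by a mixing parameter. All names and numerical values (alpha, mu, lam, the noise model) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4            # number of object parameters
steps = 5000
alpha = 0.999    # first-order Markov (drift) coefficient
mu = 0.01        # gradient step size
lam = 0.5        # mixing parameter: 0 -> purely quadratic, 1 -> purely modular

theta = rng.normal(size=n)   # true (non-stationary) parameters
w = np.zeros(n)              # estimated parameters

for k in range(steps):
    # First-order Markov model of parameter drift
    theta = alpha * theta + 0.01 * rng.normal(size=n)

    x = rng.normal(size=n)   # input (regressor) vector
    # Non-Gaussian (impulsive) measurement interference: occasional outliers
    noise = rng.normal(scale=0.1) + (rng.random() < 0.02) * rng.normal(scale=5.0)
    y = x @ theta + noise

    e = y - x @ w            # identification error at step k
    # Gradient step on the combined functional (1 - lam) * e**2 / 2 + lam * |e|
    w = w + mu * ((1.0 - lam) * e + lam * np.sign(e)) * x

print("final parameter error:", np.linalg.norm(theta - w))
```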

Conditions for the convergence of the gradient algorithm in the mean and in the mean square are determined for the estimation of non-stationary parameters under non-Gaussian measurement interference.

The obtained estimates are quite general and depend both on the degree of object non-stationarity and on the statistical characteristics of the useful signals and interference. In addition, expressions are derived for the asymptotic value of the parameter estimation error and the asymptotic identification accuracy. Since these expressions contain a number of unknown quantities (the variances of the signal and interference, and the variance characterizing the non-stationarity), estimates of these quantities must be used in practical applications. For this purpose, any recurrent procedure for estimating the unknown quantities can be applied, and the resulting estimates used to refine the parameters entering the algorithm. The asymptotic values of the estimation error and identification accuracy also depend on the choice of the mixing parameter.
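
As a minimal sketch of the kind of recurrent procedure mentioned above, an exponentially weighted recursion can track the error variance online; the specific recursion, the name update_variance, and the smoothing factor beta are illustrative assumptions, not the paper's procedure.

```python
def update_variance(var_prev: float, e: float, beta: float = 0.99) -> float:
    """One step of a recursive error-variance estimate:
    var_k = beta * var_{k-1} + (1 - beta) * e_k**2."""
    return beta * var_prev + (1.0 - beta) * e * e

# Usage: inside the identification loop, the running estimate would stand in
# for the unknown interference variance in the asymptotic expressions and
# could be used to re-tune the mixing parameter.
var_est = 1.0
for e in (0.3, -2.1, 0.05, 0.4):   # example error sequence
    var_est = update_variance(var_est, e)
print("variance estimate:", var_est)
```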

Author Biographies

Oleg Rudenko, Kharkiv National University of Radio Electronics Nauky ave., 14, Kharkiv, Ukraine, 61166

Doctor of Technical Sciences, Professor, Head of Department

Department of Computer Intelligent Technologies and Systems

Oleksandr Bezsonov, Kharkiv National University of Radio Electronics Nauky ave., 14, Kharkiv, Ukraine, 61166

Doctor of Technical Sciences, Associate Professor

Department of Computer Intelligent Technologies and Systems

Oleh Lebediev, Kharkiv National University of Radio Electronics Nauky ave., 14, Kharkiv, Ukraine, 61166

PhD, Associate Professor

Department of Electronic Computers

Nataliia Serdiuk, Kharkiv National University of Radio Electronics Nauky ave., 14, Kharkiv, Ukraine, 61166

PhD, Associate Professor

Department of Computer Intelligent Technologies and Systems

Published

2019-10-21

How to Cite

Rudenko, O., Bezsonov, O., Lebediev, O., & Serdiuk, N. (2019). Robust identification of non-stationary objects with non-Gaussian interference. Eastern-European Journal of Enterprise Technologies, 5 (4 (101)), 44–52. https://doi.org/10.15587/1729-4061.2019.181256

Issue

Vol. 5 No. 4 (101) (2019)

Section

Mathematics and Cybernetics - applied aspects