DEVELOPMENT OF A MODIFIED UMAC ALGORITHM BASED ON CRYPTO-CODE CONSTRUCTIONS

UDC 681.3.06 (0.43)
DOI: 10.15587/1729-4061.2020.210683

Abstract
The development of computer technology has determined the vector for the expansion of services based on the Internet and "G" technologies. The main requirements for modern services in the banking sector are security and reliability. At the same time, security is considered not only as ensuring the confidentiality and integrity of transactions, but also their authenticity. However, in the post-quantum period, US NIST specialists question the durability of modern means of providing basic security services based on symmetric and asymmetric cryptography algorithms. The increase in computing resources allows attackers to use modern threats in combination. Thus, there is a need to search for new algorithms and/or to modify known algorithms for generating message authentication codes (MACs). In addition, the growth of services increases the amount of information that needs to be authenticated. Among the well-known hash algorithms, universal hash functions stand out, as they allow the number of collisions and their uniform distribution over the entire set of hash codes to be determined in advance. The paper considers possibilities of modifying the cascade hashing algorithm UMAC (universal MAC, a message authentication code based on universal hashing) through the use of the McEliece crypto-code construction on algebraic geometric codes (elliptic codes (EC), modified elliptic codes (MEC)) and damaged codes (DC). This approach preserves the universality property, in contrast to the classical UMAC scheme based on a block symmetric cipher (AES). The presented algorithms for evaluating the properties of universality and strict universality of hash codes make it possible to evaluate the security of the proposed hashing constructions based on universal hash functions, taking into account the preservation of the universality property.
Keywords: authenticity, hashing algorithm, crypto-code constructions, elliptic codes, modified elliptic codes, damaged codes, UMAC algorithm, MV2 algorithm (universal damage mechanism), post-quantum cryptography

Introduction
The development of the banking sector in the last decade has made it possible to significantly expand the range of its services based on the use of computing resources of Internet technologies and "G"-LTE (Long-Term Evolution) technologies. These changes contribute to the development of the digital economy, and in particular, electronic banking [1, 2]. However, this is accompanied by an increase in the number and diversity of cyber threats. In [3][4][5], the results of the analysis of cyber threats to automated banking systems (ABS) of banking sector organizations (BSO) over the past three years are presented (Fig. 1).
The graph shows that in 2017-2019, the extent of damage from attacks in the financial sector was significantly influenced by phishing (16 % in 2017, 49 % in 2018, 80 % in 2019) and malicious software (48 % in 2017, 58 % in 2018, 85 % in 2019). From the point of view of the main vectors of modern attacks on ABS, the analysis indicates their integration with social engineering methods. This leads to the appearance of hybridity and synergism in already known threats [3][4][5].
The presented statistics show that the number of threats associated with the authenticity service is growing steadily. It is therefore important to pursue scientific research focused on new approaches that provide the authenticity service in the face of the rapid growth of computing resources and modern threats based on a full-scale quantum computer, in particular the development of a modified UMAC algorithm with McEliece crypto-code constructions (as a pad).

Literature review and problem statement
With the growth of computing resources and modern technologies increasing data volumes, an integrated task arises: to ensure not only security and efficiency, but also authenticity. For its implementation, it is proposed to use the UMAC cascade hashing algorithm (a message authentication code based on universal hashing), which allows providing the required level of security and efficiency through the use of universal hashing functions. However, the classical scheme uses the Advanced Encryption Standard (AES) block symmetric cipher to ensure the strength of the hash code, which ultimately does not preserve universality. The works [3][4][5] present the results of studies of cyber threat vectors. The analysis shows that threats aimed at cracking authentication mechanisms allow remote access to confidential information and/or obtaining "privileges" that make it possible to "hack" an integrated information security system (IISS). In [6][7][8], the possibilities of using McEliece and Niederreiter crypto-code constructions, which provide the security services of confidentiality and integrity, are considered. In addition, the efficiency of crypto-transformations and the reliability of data transmission are ensured through the use of algebraic geometric (cyclic) codes. A significant disadvantage of their practical use is the possibility of finding the elements of the masking matrices (the private key of each user of the system) on the basis of Sidelnikov's attack [9], as well as the significant energy costs of a practical implementation over a finite (computational) field of dimension from GF(2^10) to GF(2^13) (Galois field). As is known, a finite field is a finite set on which the operations called addition, multiplication, subtraction and division are defined.
In [10][11][12][13][14][15], the capabilities of full-scale quantum computers are considered, which make it possible to break symmetric and asymmetric cryptography algorithms in polynomial time. Thus, the feasibility of an attack on a quantum computer casts doubt on the stability of hashing algorithms based on block symmetric ciphers in the CBC (Cipher Block Chaining) and CFB (Cipher Feedback) modes.
The analysis of the possibilities of cryptanalysis in [16] confirms that based on the Shor and Grover quantum algorithms and a full-scale quantum computer, symmetric and asymmetric cryptography algorithms are susceptible to breaking in polynomial time.
The works [17,18] consider the possibility of constructing a cascade UMAC algorithm based on the keyless MASH-1 and MASH-2 algorithms (MASH, Modular Arithmetic Secure Hash) for forming the pseudo-random pad on the third layer. However, the results presented by the authors indicate that the MASH-1 algorithm does not provide the required stability and universality parameters and cannot be used in a modified (improved) algorithm. The MASH-2 algorithm provides the required level of cryptographic strength and retains the universality property of the generated hash code in the improved UMAC, but its use in online mode is precluded by its significant computational requirements. In [19], the possibility of forming hash functions based on cyclic algebraic geometric noise-resistant codes is considered. However, their use in the UMAC algorithm has not been considered, and research on their practical implementation has not been carried out. This approach provides universality and allows the use of crypto-code constructions as a pseudo-random pad in a cascade hashing algorithm. In [7,10,20], hybrid crypto-code constructions (HCC) based on the synthesis of the classical McEliece and Niederreiter schemes are considered. However, the use of two schemes not only increases the capacitive costs of a practical implementation, but also reduces the efficiency of crypto-transformations, which is essential for the formation of the MAC code. In [21], the authors consider the use of Reed-Solomon cyclic codes, but they do not study the resistance of such a crypto-code construction to Sidelnikov's attack, which does not allow using the CCC as a "guarantor" of hash code strength.
In [22], the possibility of using the Niederreiter crypto-code construction in post-quantum cryptography is considered; however, this scheme uses two algorithms (equilibrium coding and the McEliece scheme), which complicates the formation of the pseudo-random pad and the practical implementation. In [23,24], the security of universal hash functions is considered. The authors confirm that universal hashing alone does not allow the formation of hack-resistant hash codes, and they suggest using additional encryption to ensure the strength of the hash code. This approach increases the computational costs of an online implementation. The work [25] proposes a new multicast authentication scheme based on a symmetric algorithm. However, in the post-quantum period this algorithm can be cracked, whereas the use of a modified cascade hashing algorithm will provide the required level of security and efficiency. The approaches to forming a hash code based on the UMAC algorithm proposed in [7,10,[17][18][19][20][21][22][23][24][25] provide the universality property; however, they require significant energy costs for practical implementation and do not allow the algorithm to be used in the online mode of digest generation. The formation of a modified UMAC algorithm based on crypto-code constructions will solve the authentication problem in post-quantum cryptography, provide the required level of efficiency (online hash code generation) with the further growth of information data arrays, and reduce energy consumption in practical implementation.

The aim and objectives of the study
The aim of the study is to develop a modified UMAC algorithm based on crypto-code constructions and algorithms for assessing the strength of the hash code.
To achieve the aim, the following objectives are set:
- to consider the requirements for algorithms of universal and strictly universal classes of hash functions;
- to analyze the construction of the cascade UMAC algorithm, taking into account the provision of universality;
- to develop algorithms for the modified UMAC algorithm based on the McEliece crypto-code construction on algebraic geometric and damaged codes;
- to develop algorithms for assessing the strength of hash codes of the modified UMAC algorithm based on evaluating the criteria of universality and strict universality of the classes of hash functions.

Basic requirements for algorithms of universal and strictly universal classes of hash functions
Universal classes of hash functions were first proposed in [26]. The basic properties of universality and strict universality of classes of hash functions were studied in [27][28][29].
The idea of universal hashing is to define a finite set H of hash functions h: A → B (A is the set of outgoing messages; B is the set of MAC codes; |A| is the number of outgoing messages; |B| is the number of possible states of MAC codes; a is an outgoing message; b is a state of a MAC code), such that a random selection of a function h ∈ H provides a low probability of collision, i.e. for any distinct inputs x1 and x2 the probability that h(x1) = h(x2) (the probability of collision) cannot exceed some predetermined value (fixed precision) ε, wherein the collision probability can be calculated as
P_col = P{h(x1) = h(x2)} ≤ ε, x1, x2 ∈ A, x1 ≠ x2.
The works [18,30] give the following definition of universal hashing. Assume that 0 < ε < 1. H is an ε-universal hash class (abbreviated ε-U(H, A, B)) if for any two different elements x1, x2 ∈ A there are no more than |H| × ε functions h ∈ H such that h(x1) = h(x2). The definition of a universal class of hash functions is equivalent to the definition of a MAC algorithm in which the number of different rules for generating a hash code (the number of keys) under which there is a collision for two arbitrary input sequences is limited. The number of such keys cannot exceed the value P_col × |H|, where P_col is the collision probability and |H| is the number of all rules (keys).
The universal class of hash functions satisfies such security conditions as resistance to preimage, second preimage and collision attacks [29].
Authentication schemes that are equivalent to the universal classes of hash functions have not been put into practice due to their low resistance to intrusion threats. Consequently, in the definition of a universal class of hash functions, there is no condition that determines its resistance to differential analysis.
The ideas of universal authentication were developed in the theory of unconditional authentication using strictly universal hashing [18,31]. So, in [19], a definition of a strictly universal class of hash functions is given, which is equivalent to the definition of such an algorithm for generating data integrity and authenticity control codes, under which the following rules will be fulfilled.
1. The number of MAC generation rules (the number of keys) under which the value of the data integrity and authenticity control code does not change for an arbitrary input sequence is limited. The number of such keys cannot exceed the value |H|/|B|.
2. The number of rules for generating the data integrity and authenticity control code (the number of keys) under which the corresponding MAC values do not change for two arbitrary input sequences is limited. The number of such keys cannot exceed the value P_col × |H|.
The probability of collision of data integrity and authenticity control codes in a scheme with strictly universal hashing is defined as P_col ≤ ε.
The conditions of strictly universal classes of hash functions are more "stringent" than the requirements of universal classes. Authentication schemes equivalent to strictly universal classes have a number of advantages, determined by the exact value of the collision probability, resistance to frequency and differential analysis, etc. However, the construction of such schemes is very problematic. The known practical schemes equivalent to strictly universal classes have a significant drawback: an amount of key data that exceeds the amount of information.
Thus, to assess the strength of hash codes obtained on the basis of universal classes of hash functions, it is practically enough to evaluate the fulfillment of the criteria of universality and strict universality. In addition, knowing the number of collisions and even distribution of hash codes over the entire set allows using collision data as identifiers for large amounts of data and reducing the time to find the information needed.
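As a concrete illustration of the universality criterion discussed above, the sketch below (an illustrative assumption, not part of the proposed UMAC construction) enumerates every key of the classic Carter-Wegman family h_{a,b}(x) = ((a*x + b) mod p) mod m and checks that the fraction of keys causing a collision for any fixed pair x1 ≠ x2 never exceeds ε = 1/m:

```python
from itertools import product

def collision_ratio(x1: int, x2: int, p: int, m: int) -> float:
    """Fraction of keys (a, b) for which h_{a,b}(x1) == h_{a,b}(x2)."""
    keys = list(product(range(1, p), range(p)))  # a in [1, p-1], b in [0, p-1]
    collisions = sum(
        1 for a, b in keys
        if ((a * x1 + b) % p) % m == ((a * x2 + b) % p) % m
    )
    return collisions / len(keys)

p, m = 13, 4  # toy parameters: prime p, hash-code range m
worst = max(
    collision_ratio(x1, x2, p, m)
    for x1 in range(p) for x2 in range(p) if x1 != x2
)
# For an eps-universal class, the worst-case ratio must not exceed eps = 1/m.
assert worst <= 1 / m
```

Counting the keys for which a fixed pair collides is exactly the empirical check that the number of such keys does not exceed P_col × |H|.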

Analysis of the construction of the cascade UMAC algorithm, taking into account the universality
For remote electronic interaction, it is important to ensure the process of object recognition by the presented parameters (indicators) and the associated authentication process. This problem is most acute in access control systems and authentication and verification systems for message transmission [32][33][34][35][36][37][38][39][40].
The UMAC message authentication code was proposed in [35]. The algorithm is based on families of universal hash functions and provides provable security of the generated MAC [33][34][35][36][37][38][39][40][41][42]. At the same time, the security of the algorithm rests on the strength of the AES block symmetric cipher (BSC) used in the UMAC scheme in the CBC (Cipher Block Chaining) mode when forming the pad for the third layer of the UHASH-16 or UHASH-32 function (UHASH is a universal hash function with a fixed hash code of 16 and 32 bits, respectively).
In [36], the construction of multilayer hashing functions by the example of the UMAC algorithm is presented as a symbiosis of multi-stage key universal hashing and the use of a symmetric block cipher to form the so-called pseudo-random pad. The use of universal hashing in the multilayer construction of UMAC allows ensuring the equiprobability of the formation of hash images for the entire set of key data used, on which the proof of the security of the algorithm is based [17,18,25,37,[41][42][43][44]. The use of the AES encryption algorithm provides high cryptographic strength of the UMAC scheme [33][34][35][36][37]45].
The UMAC message authentication code generation scheme uses a multi-level universal hashing construction Hash(K, M, Taglen) and a pseudo-random pad generation procedure Pad. The use of universal hashing makes it possible to ensure the equiprobability of the formation of hash images over the entire set of key data used, on which the proof of the algorithm's security is based [45]. The UHash function compresses a message in three different layers:
- compression: the first layer uses the fast hash family NH to compress the message by a specified ratio;
- fixed-length hash: the second layer uses the RP hash family, not as fast as NH, but generating a fixed-length output using a fixed-length key;
- strengthen and fold: the third layer uses the IP hash family, which reduces the length of its input to a more appropriate size.
Fig. 2 shows the general principle of the algorithm for generating the MAC code using the class of universal hash functions UMAC-16 and UMAC-32.
On the first layer L1 of the Hash function, the NH hash function is used: the message is divided into blocks of 1,024 bytes, and the output is 128 times shorter than the input (Fig. 3).
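The NH compression idea can be sketched as follows: message words are paired with key words, added modulo 2^32, multiplied, and the products are accumulated modulo 2^64. The word sizes and pairing here are simplifying assumptions for illustration, not the exact UMAC layout:

```python
MASK32 = 0xFFFFFFFF
MASK64 = 0xFFFFFFFFFFFFFFFF

def nh(msg_words, key_words):
    """Toy NH: sum over pairs of (m_i + k_i mod 2^32) * (m_i+1 + k_i+1 mod 2^32), mod 2^64."""
    assert len(msg_words) == len(key_words) and len(msg_words) % 2 == 0
    acc = 0
    for i in range(0, len(msg_words), 2):
        a = (msg_words[i] + key_words[i]) & MASK32
        b = (msg_words[i + 1] + key_words[i + 1]) & MASK32
        acc = (acc + a * b) & MASK64
    return acc

# With an all-zero key the hash degenerates to the sum of word-pair products:
assert nh([1, 2, 3, 4], [0, 0, 0, 0]) == 1 * 2 + 3 * 4  # = 14
```

The speed of the layer comes from using only additions and multiplications on machine words, with the modular reductions being free bit-masks.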
After block A is formed, the received data are transferred to the second layer L2 of the Hash function (Fig. 4), where: A - input message, L2-Key - key, ADD - addition mod 2^32, MOD p64 - operation of calculating the remainder of division by a 64-bit prime number, MULT - multiplier, FF - data, Reset - reset, SEL - select operation, ZERO PAD - zero vector, B - output block, Compare 2^64 - 2^32 - comparison operation, K^2 mod p, K mod p - operations with keys.
This layer uses the RP family. The universal RP family of hash functions is based on polynomial computation: a string of n words can be viewed as the coefficients of a polynomial, which is evaluated at a point determined by the key, modulo a prime. The purpose of this layer is to transform the result obtained on the first layer using the POLY polynomial function. If the length of the input vector exceeds 1,024 bits, the polynomial function uses additional parameters to form the remainder.
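The polynomial evaluation at the heart of this layer can be sketched with Horner's rule; the 61-bit Mersenne prime below is an illustrative choice, not the exact UMAC parameter:

```python
P = 2**61 - 1  # a Mersenne prime, chosen here only for illustration

def poly_hash(blocks, key):
    """Treat the blocks as polynomial coefficients and evaluate at `key` mod P (Horner's rule)."""
    y = 0
    for b in blocks:
        y = (y * key + b) % P
    return y

# key = 1 degenerates to a plain sum of the blocks mod P:
assert poly_hash([1, 2, 3], 1) == 6
assert poly_hash([1, 2, 3], 2) == 11  # ((0*2 + 1)*2 + 2)*2 + 3
```

Because two distinct degree-(n-1) polynomials agree on at most n-1 points of the field, the collision probability of such a family is bounded by (n-1)/P for a uniformly chosen key.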
The third layer L3 of the Hash function uses the IP function (Fig. 5).
The layer with the universal IP hash family reduces the length of its input, since the RP hash layer generates outputs that are much larger than needed for the expected error probability. It is based on processing the internal data within the range of values (multiplying the input words by key words and combining the results). The error probability from the previous layer carries over to this one and remains.
The purpose of the third layer is to transform the input vector B with a length of 16 bytes into a string equal to 4 bytes.
The formation of the pseudo-random pad with a cryptographically strong algorithm of the AES symmetric block cipher ensures the cryptographic strength of the UMAC algorithm at the level of the applied cryptoalgorithm [14]. This UMAC generation scheme has potentially high efficiency rates, which is ensured by overlaying pseudo-random pads Pad=Hash(K, Nonce, Taglen) on the generated hash codes Y=Hash(K, M, Taglen). The overlay procedure is equivalent to the bitwise addition operation. This approach is the classic implementation method for the UMAC algorithm.
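The overlay of the pseudo-random pad on the generated hash code, described above as equivalent to bitwise addition, is a simple XOR. The sketch assumes hash code and pad are byte strings of equal (Taglen) length:

```python
def overlay(hash_code: bytes, pad: bytes) -> bytes:
    """Tag = Hash XOR Pad, applied byte by byte."""
    assert len(hash_code) == len(pad)
    return bytes(h ^ p for h, p in zip(hash_code, pad))

y = bytes([0x12, 0x34, 0x56, 0x78])    # Y = Hash(K, M, Taglen)
pad = bytes([0xAA, 0xBB, 0xCC, 0xDD])  # Pad = Hash(K, Nonce, Taglen)
tag = overlay(y, pad)
# XOR-ing with the same pad again restores the hash code:
assert overlay(tag, pad) == y
```

The involution property shown in the last assertion is what makes verification on the receiving side symmetric to generation.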
The considered multi-layer construction for forming UMAC has potentially high efficiency rates. But after overlaying the pseudo-random pads on the last layer, the UMAC formation algorithm loses the "universality" property of hashing, and its collision properties deteriorate significantly. This is due to the use of block symmetric encryption, which does not guarantee the preservation of the universality property of the resulting MAC code [18,19,32]. To increase not only the cryptographic strength of message encryption algorithms for transmission over communication channels, but also to preserve universality, it is proposed in [19,32] to use universal hashing based on modular transformations. The high cryptographic stability of transmitted messages is achieved due to the impossibility of decrypting these messages in feasible computing time. But this method is stable only against existing computing power, and with the advent of high-performance quantum computers the risk of breaking it will increase.
In [19], the application of universal hashing based on modular transformations using the RSA algorithm (Rivest, Shamir, Adleman), which relies on the computational complexity of the large-number factorization problem, is considered. At the final stage of hashing, this approach provides processing by a cryptographically strong, strictly universal hashing function based on modular transformations using loop functions. In this case, the bulk of the information data is processed by the first layers of universal hashing, and the pseudo-random pad is processed by a cryptographically strong, strictly universal hashing function based on modular transformations using the RSA algorithm. However, this approach does not ensure the efficiency of crypto-transformations, and its strength in post-quantum cryptography does not meet the requirements.
In [32], an improved method for generating data integrity and authenticity control codes is proposed based on the use of the UMAC algorithm. The first two layers are high-speed, but cryptographically weak universal hashing schemes, the last layer is proposed to be implemented using the developed secure (cryptographically strong) strictly universal hashing scheme based on modular transformations.
Generation of message authentication codes using key hashing, built on the basis of the keyless MASH-2 algorithm with variable initialization vectors, in certain cases allows building universal and strictly universal classes of hash functions. However, this condition is not satisfied for all values of the initial parameters (primes p and q).
Formally, the proposed scheme for the cascade generation of data integrity and authenticity control codes using modular transformations is shown in Fig. 6. This approach makes it possible to ensure the universality and stability of hash codes, but is practically not applicable in the online mode of hash code generation due to the low speed of forming the pseudo-random pad based on the keyless MASH-2 algorithm.
Thus, to eliminate the revealed drawback, it is proposed to use crypto-code constructions on algebraic geometric and damaged codes as a mechanism for forming the pseudo pad for the third layer of the UMAC cascade hashing algorithm.
The main approaches to reducing the energy costs of the practical implementation of CCC on EC, MEC and DC are considered in [14,46,47]. This approach allows forming a modified UMAC algorithm depending on the requirements and computational capabilities.

Development of algorithms for modified UMAC based on McEliece CCC on EC, MEC and DC
In [14,[47][48][49][50], algorithms for constructing McEliece CCC on EC (MEC) and DC, which can be used to form the pseudo-random pad, are considered.
So, to build the pad on McEliece CCC on EC, the following algorithm is used:
- Step 1. Entering the information to be encoded (the plain text of the message is used). Entering the public key G_X^EC.
- Step 2. Encoding the information with an elliptic code: formation of a code word c_X of the elliptic code given by the matrix G_X^EC (forming the first part of the pad).
- Step 3. Formation of the error vector e, whose weight does not exceed t, the error-correcting ability of the elliptic code (forming the second part of the pad).
- Step 4. Formation of the codogram (pad) c_X* = c_X + e.
Thus, the codogram generated on EC is the pad used on the third layer of the modified UMAC algorithm on McEliece CCC. This approach ensures the preservation of the universality property of the obtained hash code and the required level of efficiency. However, it requires significant energy costs to ensure the required level of durability (it is necessary to build the CCC over fields from GF(2^10) to GF(2^13)). To reduce energy costs, it is proposed to use McEliece CCC on MEC, which reduces energy costs (building the CCC over fields from GF(2^6) to GF(2^8)) while providing the required level of durability.
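The pad-formation steps can be sketched with a toy binary linear code standing in for the elliptic code. The [7,4] Hamming generator matrix and the fixed error vector below are illustrative assumptions; a real CCC uses a masked algebraic geometric code over a much larger field:

```python
G = [  # generator matrix of the [7,4] Hamming code, a toy stand-in for G_X^EC
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    """Step 2: code word c_X = msg * G over GF(2)."""
    n = len(G[0])
    return [sum(m * G[i][j] for i, m in enumerate(msg)) % 2 for j in range(n)]

def make_pad(msg, error):
    """Steps 2-4: codogram (pad) c_X* = c_X + e over GF(2)."""
    c = encode(msg)
    assert sum(error) <= 1  # Step 3: weight of e must not exceed t (t = 1 for this code)
    return [(ci + ei) % 2 for ci, ei in zip(c, error)]

msg = [1, 0, 1, 1]
e = [0, 0, 0, 0, 0, 1, 0]  # fixed error vector of weight 1
assert encode(msg) == [1, 0, 1, 1, 0, 1, 0]
assert make_pad(msg, e) == [1, 0, 1, 1, 0, 0, 0]
```

Since the error weight stays within the correcting ability t, the legitimate receiver can decode c_X* back to c_X, while the pad itself looks pseudo-random to an outsider without the code structure.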
The corresponding constructions are presented in [51] (Fig. 7) and in [14] (Fig. 8). The results of the study showed that the application of this approach in practice both increases the cryptographic strength of authentication code generation and raises the issue of the speed of the operations involved in this process.
To construct crypto-code constructions based on modified (shortened or extended) cyclic codes on elliptic curves, the parameters presented in Table 1 are used. Table 2 shows the parameters of asymmetric cryptosystems on the corresponding CCC. In [14,[49][50][51], practical algorithms for constructing a modified UMAC algorithm on CCC with MEC are considered.
The proposed approach will make it possible to form the pseudo-random pad with a further reduction in computing resources by reducing the computational field of elliptic curves based on damage (reduction or lengthening of the code sequence).
The process of confirming the integrity of information during transmission from the sending to the receiving side on the basis of checking the verification of codograms and hash codes using the McEliece CCC on MEC is schematically shown in Fig. 9.
The use of the UMAC algorithm on the proposed crypto-code constructions will make it possible to detect modifications of the plain text when transmitted through an open channel. When developing a mathematical model for the formation of a hash code in the UMAC algorithm, a pseudo-random sequence is used, which ensures the cryptographic strength of this hash code. The algorithm of forming the pad is the McEliece crypto-code construction on MEC.
Application of modification changes to elliptic codes reduces the load on computing resources and increases the efficiency of generating MAC codes in real time. In [50], it was proposed to use the McEliece crypto-code construction on damaged codes as a mechanism for forming the pseudo-random pad of the third UMAC layer. The main ideas of damaged cryptography are proposed in [52,53]. This approach allows the use of a hybrid CCC (HCCC) construction with multichannel cryptography and provides the maximum pad formation rate with the required level of security. In addition, it reduces energy costs (construction of the HCCC over fields from GF(2^4) to GF(2^6)) while maintaining the required level of hash code strength. The corresponding construction is shown in Fig. 10.

Development of algorithms for assessing the strength of hash codes of the modified UMAC algorithm based on evaluating criteria of universality and strict universality of hash function classes
The proposed algorithms for assessing the strength of hash codes of the modified UMAC algorithm (a statistical study of the collision properties of the generated elements h(x)) are based on an empirical assessment of the maximum number of keys (hashing rules) for which:
1) for arbitrary x1, x2 ∈ A, x1 ≠ x2, the equality h(x1) = h(x2) holds; (1)
2) for arbitrary x1 ∈ A and y1 ∈ B, the equality h(x1) = y1 holds; (2)
3) for arbitrary x1, x2 ∈ A, x1 ≠ x2 and y1, y2 ∈ B, the equalities h(x1) = y1, h(x2) = y2 hold. (3)
The assessment according to the first criterion corresponds to checking the fulfillment of the condition for the universal class of hash functions; the assessments according to the second and third criteria correspond to the conditions for the strictly universal class of hash functions.
To estimate the above equalities, the following notation is introduced [54]: n1(x1, x2), n2(x1, y1), n3(x1, x2, y1, y2). The first indicator n1(x1, x2) characterizes the number of hashing rules (MAC formation rules) under which, for the given x1, x2 ∈ A, x1 ≠ x2, equality (1) holds, i.e. the number of keys for which there is a collision (MAC match) for the two input sequences x1 and x2. The second indicator n2(x1, y1) characterizes the number of hashing rules (MAC formation rules) under which, for the given x1 ∈ A, y1 ∈ B, equality (2) holds, i.e. the number of keys for which the hash code value (MAC) y1 for the input sequence x1 does not change.
The third indicator n3(x1, x2, y1, y2) characterizes the number of hashing rules (MAC formation rules) under which, for the given x1, x2 ∈ A and y1, y2 ∈ B, equality (3) holds, i.e. the number of keys for which, for the two input sequences x1 and x2, the corresponding hash values (MAC) y1 and y2 do not change.
Since the number of keys must not exceed the corresponding bounds (P_col × |H| and |H|/|B|), the maximum number of such keys for each of the considered sets of elements is of interest.
To conduct the research, it is necessary to determine the maxima of these values and then compare the results with the corresponding bounds. The assessment of collision properties according to the given criteria of universality and strict universality is carried out in the average statistical sense. So, when setting up an experiment, a limited set of elements x1, x2 ∈ A, x1 ≠ x2 and their corresponding hash images (MAC) y1, y2 ∈ B is used, and the results are considered as a sample from the general population.
The natural estimate m̂ for the mathematical expectation m of a random variable X is the arithmetic mean of its observed values X_i (the statistical mean) [19]:
m̂ = (1/N) * sum_{i=1}^{N} X_i,
where N is the number of realizations of the random variable X.
The estimate D̂ of the variance of a random variable X is determined by the expression:
D̂ = (1/(N-1)) * sum_{i=1}^{N} (X_i - m̂)^2.
By virtue of the central limit theorem of probability theory, for large values of the number of realizations N the arithmetic mean will have a distribution close to normal, with mathematical expectation m [19]. Moreover, the probability that the estimate m̂ deviates from its mathematical expectation by less than ε (the confidence level) is equal to [19]:
P{|m̂ - m| < ε} = 2Φ(ε√N / σ), (4)
where Φ(x) is the Laplace function, defined by the expression:
Φ(x) = (1/√(2π)) * ∫_0^x exp(-t^2/2) dt.
When conducting experimental studies of collision properties, it was proposed to use the methods of statistical hypothesis testing and mathematical statistics [18,19]:
1. From the general population of a random variable X, a sample is formed as follows:
- for the average estimate of the mathematical expectation m(n1) and variance D(n1), the maximum n1(x1, x2) for which the equality h(x1) = h(x2) holds is used as the random variable; hence the sample of size N: X1, X2, ..., XN is formed by selecting N sets, each of which contains M pairs of elements x1, x2 ∈ A, x1 ≠ x2, estimated as n1(x1, x2), i.e. the total volume of formed pairs of elements x1, x2 ∈ A, x1 ≠ x2 is NM;
- for the average estimate of m(n2) and D(n2), the maximum n2(x1, y1) for which the equality y1 = h(x1) holds is used as the random variable; hence the sample of size N: X1, X2, ..., XN is formed by selecting N sets, each of which contains M pairs of elements x1 ∈ A, y1 ∈ B, estimated as n2(x1, y1).
The total volume of formed pairs of elements x1 ∈ A, y1 ∈ B is NM;
- for the average estimate of m(n3) and D(n3), the maximum n3(x1, x2, y1, y2) for which the equalities y1 = h(x1) and y2 = h(x2) hold is used as the random variable; hence the sample of size N: X1, X2, ..., XN is formed by selecting N sets, each of which contains M quadruples of elements x1, x2 ∈ A, x1 ≠ x2, y1, y2 ∈ B, estimated as n3(x1, x2, y1, y2); the total volume of formed quadruples is NM.
2. In experimental studies of the collision properties of hashing, the arithmetic mean m̃(n_i) of the observed maximum values n_i and the variance D̃(n_i), i = 1, 2, 3, are estimated.
3. The reliability of the obtained average estimates is substantiated as follows. The accuracy ε is fixed and the values of the Laplace function are calculated, which give the corresponding confidence probabilities. In the inverse statement of the problem, i.e., for a fixed confidence probability P_σ and a sample of size N, the confidence interval is determined as ε = t_ρ·√(D̃/N), where t_ρ is the root of the equation 2Φ(t_ρ) = P_σ.
The algorithm for checking hash codes for compliance with the rules of the universal class of hash functions is shown in Fig. 12. The implementation of the algorithm can be described by the following steps.
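The confidence-interval computation described in item 3 can be sketched as follows (a minimal implementation under the CLT assumption; the bisection inversion of Φ is an implementation choice, not prescribed by the text):

```python
import math
import statistics

def laplace_phi(x):
    """Laplace function: Phi(x) = (1/sqrt(2*pi)) * integral_0^x exp(-t^2/2) dt,
    which equals erf(x / sqrt(2)) / 2."""
    return 0.5 * math.erf(x / math.sqrt(2))

def confidence_interval(sample, p_conf=0.95):
    """Return (mean, eps) such that the true expectation lies in
    mean +/- eps with confidence p_conf, per the CLT-based estimate."""
    n = len(sample)
    m = statistics.mean(sample)
    d = statistics.variance(sample)  # unbiased variance estimate D
    # t_rho solves 2*Phi(t) = p_conf; invert by bisection (Phi is monotone).
    lo, hi = 0.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if 2 * laplace_phi(mid) < p_conf:
            lo = mid
        else:
            hi = mid
    t_rho = (lo + hi) / 2
    eps = t_rho * math.sqrt(d / n)
    return m, eps
```

For P_σ = 0.95 the computed t_ρ is close to the familiar value 1.96.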
Step 3. Forming hash codes H_ij for each input message I_i using keys K_j.
Step 4. Performing a sequential comparison of the obtained hash codes H_ij under the same key K_j over all incoming messages among themselves, based on the following condition: if the hash values match (H_i,j = H_i+1,j), which indicates the occurrence of a collision, then 1 is added to the collision counter: L_j = L_j + 1.
The algorithm for checking hash codes for compliance with the requirements of a strictly universal class of hash functions according to the first criterion is shown in Fig. 13. The implementation of the algorithm can be described by the following steps:
Step 1. Forming one random incoming message I_rand.
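The per-key collision counting of the Fig. 12 check can be sketched in Python. Here `hash_fn` is a generic stand-in for the hash construction under test (an assumption), and all pairs of distinct messages are compared under each key:

```python
def count_collisions(hash_fn, keys, messages):
    """For each key K_j, hash every message and count the collision
    counter L_j: the number of message pairs whose hashes coincide
    under that key. Returns the list of L_j values, one per key."""
    counters = []
    for k in keys:
        digests = [hash_fn(k, m) for m in messages]
        l_j = sum(1 for i in range(len(digests))
                    for t in range(i + 1, len(digests))
                    if digests[i] == digests[t])
        counters.append(l_j)
    return counters
```

For a universal family the counters L_j should be small and spread evenly over the keys rather than concentrated on a few of them.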
Step 2. Forming the hash code H_rand of the random message I_rand.
Step 5. Forming hash codes H_ij for each incoming message I_i using keys K_j.
Step 6. Performing a sequential comparison of the obtained hash codes H_ij under the same key K_j, for all incoming messages, with the hash code H_rand of the random message I_rand, based on the following condition: if the hash values match (H_ij = H_rand), which indicates the occurrence of a collision, then 1 is added to the collision counter: L_j = L_j + 1.
The algorithm for checking hash codes for compliance with the requirements of a strictly universal class of hash functions according to the second criterion is shown in Fig. 14.
The implementation of the algorithm can be described by the following steps:
Step 1. Forming two different random incoming messages I_rand1 and I_rand2.
Step 2. Forming hash codes H_rand1 and H_rand2 for each of the messages I_rand1 and I_rand2.
Step 5. Forming hash codes H_ij for each incoming message I_i using keys K_j.
Step 6. Performing a sequential comparison of the obtained hash codes H_ij under the same key K_j, for all incoming messages, with the hash codes H_rand1 and H_rand2 of the two random messages, based on the following condition: if the hash values match (H_ij = H_rand1 or H_ij = H_rand2), which indicates the occurrence of a collision, then 1 is added to the collision counter: L_j = L_j + 1.
Thus, the proposed algorithms allow evaluating not only the fulfillment of the criteria of universality and strict universality of the obtained MAC code, but also its level of resilience.
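Both strict-universality checks (Fig. 13 and Fig. 14) follow the same pattern and can be sketched together; as before, `hash_fn` is a generic stand-in for the construction under test:

```python
def strict_universality_counts(hash_fn, keys, messages, i_rand1, i_rand2):
    """Per-key collision counters for the strict-universality checks:
    criterion 1 counts message hashes matching the hash of one random
    message (I_rand1); criterion 2 counts matches with either of the
    hashes of two different random messages (I_rand1, I_rand2)."""
    c1, c2 = [], []
    for k in keys:
        h1 = hash_fn(k, i_rand1)
        h2 = hash_fn(k, i_rand2)
        digests = [hash_fn(k, m) for m in messages]
        c1.append(sum(1 for d in digests if d == h1))
        c2.append(sum(1 for d in digests if d == h1 or d == h2))
    return c1, c2
```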

Discussion of the results of the practical implementation of the modified UMAC on CCC
To ensure the validity of the proposed approach, consider a practical example of the implementation of the modified UMAC cascade algorithm on CCC with EC, MEC and DC.
To model the implementation of the modified UMAC algorithm on algebrogeometric (EC, MEC) and damaged codes based on the McEliece CCC (HCCC), the following input data are used:
- data block length - 32 bytes;
- secret key length - 32 bytes;
- transmitted plaintext array I length - 3 bytes;
- pseudo-random key sequence length (number of subkeys) - 1,027;
- transmitted plaintext (k-bit information vector over GF(q)) - 11;
- secret error vector e = 00000200 (session key);
- masking matrices (user private key KR_i): non-degenerate k×k matrix X, n×n permutation matrix P, diagonal matrix D;
- generating matrix G.
Step 1. Forming the first layer (the results are given in Table 3).
Step 2. Forming the second layer. Since the length of M is less than 1024 bytes, this hashing level is not performed, and the calculations proceed with the third-layer hash code.
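The conditional skip of the second layer for short inputs can be sketched as a cascade skeleton. The callables `layer1`/`layer2`/`layer3` are placeholders (assumptions), not the paper's CCC-based primitives; only the layer-skipping control flow follows the example:

```python
def umac_cascade(message: bytes, layer1, layer2, layer3):
    """Skeleton of the three-layer UMAC cascade: layer 1 compresses the
    message; layer 2 is applied only when the intermediate data is at
    least 1024 bytes long (short inputs skip it, as in Step 2 above);
    layer 3 produces the final short hash code."""
    y = layer1(message)
    if len(y) >= 1024:  # second layer only for long intermediate data
        y = layer2(y)
    return layer3(y)
```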
Step 3. Forming the third layer.
The value of the hash function of the third layer is then calculated.
Step 3. Forming the initialization vector IV = 00100000, which determines the location of the code sequence shortening/extension symbols:
- for the shortened MEC: C*_x = 2323322;
- for the elongated MEC: C*_x = 23123322.
When using the modified UMAC algorithm with the McEliece CCC (HCCC) in the DSA standard, the sender uses the private key KR_i, and the recipient uses the public key KU_i to verify the digital signature.
The proposed approach provides the required speed of hash code generation (online), taking into account the growth of computing resources and of the amount of data transmitted. Table 4 shows the results of studies of the energy costs of the practical implementation of the proposed pad-forming algorithms based on the McEliece CCC (HCCC) on algebrogeometric (EC, MEC) and damaged codes. Table 4 confirms the theoretical estimates of the reduction in the cost of the practical implementation of the McEliece CCC (HCCC) on EC (MEC) and DC while maintaining the required hash code strength in the modified UMAC algorithm. Table 5 presents the results of studies of the statistical properties of the proposed methods based on the NIST STS (SP 800-22) package, which confirm the strength of the proposed McEliece CCC (HCCC) on EC (MEC) and DC [55]. Thus, the presented results confirm the feasibility of this approach to forming a modified cascade hashing algorithm using crypto-code constructions as a mechanism for ensuring the required level of security in the post-quantum period. The choice of a particular construction depends on the computing power and/or the platform on which the software is developed.
A further area of research is the practical confirmation of statistical estimates of the properties of universality and strict universality of the modified UMAC algorithm with the formation of the pad on the McEliece CCC (HCCC).

Conclusions
1. In the context of increasing computing resources and the expanding scope of the digital economy and e-banking services, one of the conditions for providing security services is the search for new methods and/or the modification of known ones. The growth and integration of modern threats, their hybridity and synergy, require the introduction of strict criteria for the special mechanisms that ensure authenticity. Among the known algorithms for generating MAC codes, universal hash functions occupy a special place. However, their use without additional encryption of the hash code does not provide the required level of strength.
2. Analysis of hash code formation based on the cascading application of universal hash functions showed that using the AES block symmetric cipher as the pseudo-random pad mechanism in the UMAC algorithm does not allow "preserving" the universality of the hash code, while the improved mechanism based on modular arithmetic (the MASH-2 algorithm) does not meet the performance requirements of the transformations. It is therefore proposed to use one of the promising directions: crypto-code constructions based on algebrogeometric and damaged codes.
3. Modifications of the construction of the cascade hashing algorithm based on CCC on EC, MEC and DC are proposed. This approach allows "preserving" the universality and all the advantages of this class of hash functions, and also provides the required efficiency and strength parameters under the conditions of post-quantum cryptography and the emergence of a full-scale quantum computer.
4. The proposed algorithms for assessing the resilience of the hash codes of the modified UMAC algorithm on McEliece crypto-code constructions with EC, MEC and DC allow not only assessing the fulfillment of the universality and strict universality criteria, but also provide estimates of the hash code's resilience to modern threats. The proposed algorithms for the statistical study of the collision properties of the generated MAC codes are based on an empirical estimate of the maximum number of keys (hashing rules) producing a collision.