1. Introduction
Global optimization aims to discover the global minimum of an optimization problem by exploring the entire search space. Typically, a global optimization method seeks the global minimum of a continuous function $f: S \to R$, $S \subset R^n$, and hence the global optimization problem is formulated as:

$$x^* = \arg\min_{x \in S} f(x)$$

The set $S$ is defined as:

$$S = \left[a_1, b_1\right] \otimes \left[a_2, b_2\right] \otimes \cdots \otimes \left[a_n, b_n\right]$$

The vectors $a$ and $b$ stand for the left and right bounds, respectively, of the point $x$. A systematic review of the optimization procedure can be found in the work of Rothlauf [1]. Global optimization refers to techniques that seek the optimal solution to a problem, mainly using traditional mathematical methods, for example methods that try to locate either maxima or minima [2,3,4]. Each optimization problem consists of the decision variables, the problem constraints and the objective function [5]. The main objective in optimization is to assign appropriate values to the decision variables so that the objective function is optimized. Problem-solving techniques in optimization are divided into deterministic and stochastic approaches [6]. The most common techniques in the first category are interval methods [7,8]. In interval techniques the set $S$ is divided through a number of iterations into subregions that may contain the global minimum, according to some criteria. Nevertheless, stochastic optimization methods are used in the majority of cases, because they can be programmed more easily and they do not require any a priori information about the objective function. Such techniques include Controlled Random Search methods [9,10,11], Simulated Annealing methods [12,13], Clustering methods [14,15,16], etc. Systematic reviews of stochastic methods can be found in the work of Pardalos et al. [17] or in the work of Fouskakis et al. [18]. Furthermore, due to the widespread use of parallel computing techniques in recent years, a number of methods have been developed that exploit such architectures [19,20].
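To fix notation for what follows, a bound-constrained problem of this form can be represented directly in code. The sketch below is only an illustration of the definition above; the struct layout, field names and the example objective are assumptions for demonstration purposes, not part of any particular library.

```cpp
#include <vector>
#include <functional>

// A bound-constrained global optimization problem: minimize f over
// S = [a1,b1] x [a2,b2] x ... x [an,bn].
struct Problem {
    std::function<double(const std::vector<double>&)> f;  // objective function
    std::vector<double> a;                                 // left bounds
    std::vector<double> b;                                 // right bounds
};

// Example instance: the two-dimensional function f(x) = x1^2 + x2^2 on [-5,5]^2.
Problem makeSphere2D() {
    Problem p;
    p.f = [](const std::vector<double> &x) { return x[0]*x[0] + x[1]*x[1]; };
    p.a = {-5.0, -5.0};
    p.b = { 5.0,  5.0};
    return p;
}
```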
A group of stochastic techniques that have been developed to handle optimization problems are evolutionary techniques. These techniques are biologically inspired, heuristic and population-based [21,22]. Representative members of this family are Ant Colony Optimization methods [23,24], Genetic Algorithms [25,26,27], Particle Swarm Optimization (PSO) methods [28,29], Differential Evolution techniques [30,31], evolutionary strategies [32,33], evolutionary programming [34], genetic programming [35], etc. These methods have been applied with success to a series of practical problems from many fields, for example biology [36,37], physics [38,39], chemistry [40,41], agriculture [42,43], and economics [44,45].
Recently, Alsayyed et al. [46] introduced a new bio-inspired metaheuristic algorithm called Giant Armadillo Optimization (GAO). This algorithm aims to replicate the behavior of giant armadillos in the real world [47]. The new algorithm is based on the giant armadillo's hunting strategy of heading towards prey and digging into termite mounds. Owaid et al. [48] present a method concerning the decision-making process in organizational and technical systems management problems, which also uses giant armadillo agents. Their article presents a method for maximizing decision-making capacity in organizational and technical systems using artificial intelligence. The research is based on giant armadillo agents that are trained with the help of artificial neural networks [49,50], and in addition a genetic algorithm is used to select the best one.
This article focuses on enhancing the effectiveness and the speed of the GAO algorithm by proposing the following modifications:
- Application of termination rules that are based on asymptotic considerations and are defined in the recent literature. This addition achieves early termination of the method and avoids wasting computational time on iterations that do not yield a better estimate of the global minimum of the objective function. A small sketch of such a rule is given after this list.
- Periodic application of a local search procedure. By using local optimization, the local minima of the objective function are found more efficiently, which also leads to a faster discovery of the global minimum.
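To make the first modification concrete, the following is a minimal sketch of a stall-based stopping rule in the spirit of the Similarity criterion used later in the experiments: the search stops when the best located value has not improved for a prescribed number of consecutive generations. The class name, the tolerance eps and the window length maxStall are illustrative assumptions, not the paper's actual implementation.

```cpp
#include <cmath>

// Hypothetical stall-based termination rule: stop when the best value
// has not improved by more than eps for maxStall consecutive generations.
class SimilarityStop {
public:
    SimilarityStop(double eps = 1e-8, int maxStall = 15)
        : eps_(eps), maxStall_(maxStall), best_(1e308), stall_(0) {}

    // Call once per generation with the best objective value found so far.
    // Returns true when the algorithm should terminate.
    bool shouldStop(double bestValue) {
        if (best_ - bestValue > eps_) {   // a real improvement was made
            best_  = bestValue;
            stall_ = 0;
        } else {
            ++stall_;                     // best value is "similar" to before
        }
        return stall_ >= maxStall_;
    }

private:
    double eps_;
    int    maxStall_;
    double best_;
    int    stall_;
};
```

A variance-based rule such as DoubleBox could be swapped in behind the same shouldStop() interface, which is how the two stopping criteria are compared in the experimental section.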
The new method was tested on a series of objective problems found in the relevant literature, and it was compared against an implemented Genetic Algorithm and a variant of the PSO technique. The rest of this paper has the following structure: in Section 2 the steps of the proposed method are described in detail, in Section 3 the benchmark functions are listed along with the experimental results, and finally in Section 4 some conclusions and guidelines for future work are provided.
2. The proposed method
The GAO algorithm mimics the process of natural evolution: it initially generates a population of candidate solutions, i.e., possible solutions of the objective problem, and then evolves this population through iterative steps. Each iteration is divided into two stochastic phases: the exploration phase, where the candidate solutions are updated by a process that mimics the attack of armadillos on termite mounds, and the exploitation phase, where the solutions are updated in a way that resembles digging inside termite mounds. The basic steps of the GAO algorithm, together with the proposed modifications, are presented below, followed by a compact code sketch of the main loop.
- 1. Initialization step.
    - Set $N$, the number of armadillos in the population.
    - Set $\mathrm{iter}_{\max}$, the maximum number of allowed generations.
    - Initialize randomly the armadillos $x_i,\ i = 1, \ldots, N$, in $S$.
    - Set $\mathrm{iter} = 0$.
    - Set the local search rate $p_l \in [0, 1]$.
- 2. Evaluation step. Calculate the objective value $f\left(x_i\right)$ of every armadillo $x_i$ in the population.
- 3. Computation step.
    - For $i = 1, \ldots, N$ do
        - (a) Phase 1: Attack on termite mounds.
            - Construct the termite mounds set $TM_i = \left\{ x_k : f\left(x_k\right) < f\left(x_i\right),\ k \neq i \right\}$, i.e., the positions of population members with a better objective value than armadillo $i$.
            - Select at random a termite mound $T_i$ for armadillo $i$ from the set $TM_i$.
            - Create a new position for the armadillo according to the formula
              $$x_{i,j}^{P1} = x_{i,j} + r_{i,j}\left(T_{i,j} - I_{i,j}\, x_{i,j}\right),\quad j = 1, \ldots, n$$
              where $r_{i,j}$ are random numbers in $[0, 1]$ and $I_{i,j}$ are random integers in $\{1, 2\}$.
            - Update the position of the armadillo $i$ according to:
              $$x_i = \begin{cases} x_i^{P1}, & f\left(x_i^{P1}\right) \le f\left(x_i\right) \\ x_i, & \text{otherwise} \end{cases}$$
        - (b) Phase 2: Digging in termite mounds.
            - Calculate a new trial position
              $$x_{i,j}^{P2} = x_{i,j} + \left(1 - 2 r_{i,j}\right)\frac{b_j - a_j}{t},\quad j = 1, \ldots, n$$
              where $r_{i,j}$ are random numbers in $[0, 1]$, $a_j, b_j$ are the bounds of dimension $j$ and $t = \mathrm{iter} + 1$ is the current generation.
            - Update the position of the armadillo $i$ according to:
              $$x_i = \begin{cases} x_i^{P2}, & f\left(x_i^{P2}\right) \le f\left(x_i\right) \\ x_i, & \text{otherwise} \end{cases}$$
        - (c) Local search. Draw a random number $r \in [0, 1]$. If $r \le p_l$, then a local optimization algorithm is applied to $x_i$. Some local search procedures found in the optimization literature are the BFGS method [51], the Steepest Descent method [52], the L-BFGS method [53] for large-scale optimization, etc. A BFGS variant of Powell [54] was used in the current work as the local search optimizer.
    - endfor
- 4. Termination Check Step. Set $\mathrm{iter} = \mathrm{iter} + 1$. If $\mathrm{iter} \ge \mathrm{iter}_{\max}$ or the employed stopping rule is satisfied, terminate and report the best located armadillo; otherwise, go to step 2.
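The following is a minimal, self-contained sketch of the main loop described above, written in C++ in the spirit of the paper's implementation but not taken from it. The population size, bounds, random-number helpers and the commented-out localSearch() call are illustrative assumptions; in the actual software the local search is Powell's BFGS variant [54] and termination is governed by one of the stopping rules discussed in Section 3.

```cpp
#include <cstdio>
#include <cmath>
#include <vector>
#include <random>
#include <algorithm>

// Illustrative objective: the two-dimensional Rastrigin variant used in Section 3.
static double objective(const std::vector<double> &x) {
    return x[0]*x[0] + x[1]*x[1] - std::cos(18.0*x[0]) - std::cos(18.0*x[1]);
}

int main() {
    const int    N = 50, iterMax = 200, n = 2;   // armadillos, generations, dimension
    const double a = -1.0, b = 1.0, pl = 0.05;   // bounds and local search rate (assumed)
    std::mt19937 gen(42);
    std::uniform_real_distribution<double> U(0.0, 1.0);

    // Initialization: random armadillos in S and their objective values.
    std::vector<std::vector<double>> x(N, std::vector<double>(n));
    std::vector<double> f(N);
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < n; j++) x[i][j] = a + (b - a) * U(gen);
        f[i] = objective(x[i]);
    }

    for (int iter = 1; iter <= iterMax; iter++) {
        for (int i = 0; i < N; i++) {
            // Phase 1: attack on termite mounds (exploration).
            // Termite mounds are population members with a better objective value.
            std::vector<int> tm;
            for (int k = 0; k < N; k++) if (f[k] < f[i]) tm.push_back(k);
            std::vector<double> trial = x[i];
            if (!tm.empty()) {
                int sel = tm[(int)(U(gen) * tm.size()) % tm.size()];
                for (int j = 0; j < n; j++) {
                    int I = 1 + (int)(U(gen) * 2.0) % 2;          // random integer in {1,2}
                    trial[j] = x[i][j] + U(gen) * (x[sel][j] - I * x[i][j]);
                    trial[j] = std::clamp(trial[j], a, b);
                }
                double ft = objective(trial);
                if (ft <= f[i]) { x[i] = trial; f[i] = ft; }      // greedy update
            }

            // Phase 2: digging in termite mounds (exploitation).
            trial = x[i];
            for (int j = 0; j < n; j++) {
                trial[j] = x[i][j] + (1.0 - 2.0 * U(gen)) * (b - a) / iter;
                trial[j] = std::clamp(trial[j], a, b);
            }
            double ft = objective(trial);
            if (ft <= f[i]) { x[i] = trial; f[i] = ft; }

            // Periodic local search (second proposed modification).
            // A gradient-based optimizer such as BFGS would be called here.
            if (U(gen) <= pl) { /* localSearch(x[i], f[i]); */ }
        }
        // Termination check: a stopping rule such as the one sketched in the
        // Introduction would be consulted here, in addition to the generation limit.
    }

    int best = (int)(std::min_element(f.begin(), f.end()) - f.begin());
    std::printf("best value %.6f\n", f[best]);
    return 0;
}
```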
3. Experiments
This section begins by detailing the functions that are used in the experiments. These functions are widespread in the modern global optimization literature and have been used in many research works. Next, the experiments performed with the current method are presented, together with a comparison against two commonly used techniques in the field of global optimization, namely genetic algorithms and particle swarm optimization.
3.1. Experimental Functions
The proposed method was tested on a series of benchmark functions available from the related literature [57,58]. The definitions of these functions are listed below.
Bf1 function. The Bohachevsky 1 function is defined as:
$$f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right) - \frac{4}{10}\cos\left(4\pi x_2\right) + \frac{7}{10}$$
with $x \in [-100, 100]^2$.
Bf2 function. The Bohachevsky 2 function is defined as:
$$f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right)\cos\left(4\pi x_2\right) + \frac{3}{10}$$
with $x \in [-50, 50]^2$.
Branin function. The function is defined as:
$$f(x) = \left(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos\left(x_1\right) + 10$$
with $-5 \le x_1 \le 10$ and $0 \le x_2 \le 15$.
Camel function. The six-hump camel function is defined as:
$$f(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$$
with $x \in [-5, 5]^2$.
Easom function. The function is defined as:
$$f(x) = -\cos\left(x_1\right)\cos\left(x_2\right)\exp\left(-\left(x_1 - \pi\right)^2 - \left(x_2 - \pi\right)^2\right)$$
with $x \in [-100, 100]^2$.
Exponential function. The function is defined as:
$$f(x) = -\exp\left(-\frac{1}{2}\sum_{i=1}^{n} x_i^2\right), \quad -1 \le x_i \le 1$$
The global minimum is located at $x^* = (0, 0, \ldots, 0)$ with value $-1$. Several values of the dimension $n$ were used in the conducted experiments.
Gkls function. Gkls($x, n, w$) is a function with $w$ local minima and dimension $n$. This function is provided by the generator of [59], with $x \in [-1, 1]^n$. Several combinations of $n$ and $w$ were used in the conducted experiments.
Goldstein and Price function. The function is defined as:
$$f(x) = \left[1 + \left(x_1 + x_2 + 1\right)^2\left(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2\right)\right] \times \left[30 + \left(2x_1 - 3x_2\right)^2\left(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2\right)\right]$$
with $x \in [-2, 2]^2$.
Griewank2 function. The function is given by:
$$f(x) = 1 + \frac{1}{200}\sum_{i=1}^{2} x_i^2 - \prod_{i=1}^{2}\frac{\cos\left(x_i\right)}{\sqrt{i}}, \quad x \in [-100, 100]^2$$
The global minimum is located at $x^* = (0, 0)$ with value 0.
Griewank10 function. The ten-dimensional version of the Griewank function, defined as:
$$f(x) = 1 + \frac{1}{200}\sum_{i=1}^{10} x_i^2 - \prod_{i=1}^{10}\frac{\cos\left(x_i\right)}{\sqrt{i}}$$
with $x \in [-100, 100]^{10}$.
Hansen function.
$$f(x) = \sum_{i=1}^{5} i\cos\left[\left(i-1\right)x_1 + i\right]\sum_{j=1}^{5} j\cos\left[\left(j+1\right)x_2 + j\right], \quad x \in [-10, 10]^2$$
Hartman 3 function. The function is defined as:
$$f(x) = -\sum_{i=1}^{4} c_i\exp\left(-\sum_{j=1}^{3} a_{ij}\left(x_j - p_{ij}\right)^2\right)$$
with $x \in [0, 1]^3$ and
$$a = \begin{pmatrix} 3 & 10 & 30 \\ 0.1 & 10 & 35 \\ 3 & 10 & 30 \\ 0.1 & 10 & 35 \end{pmatrix}, \quad c = \begin{pmatrix} 1 \\ 1.2 \\ 3 \\ 3.2 \end{pmatrix}, \quad p = \begin{pmatrix} 0.3689 & 0.117 & 0.2673 \\ 0.4699 & 0.4387 & 0.747 \\ 0.1091 & 0.8732 & 0.5547 \\ 0.03815 & 0.5743 & 0.8828 \end{pmatrix}$$
Hartman 6 function. The function is given by:
$$f(x) = -\sum_{i=1}^{4} c_i\exp\left(-\sum_{j=1}^{6} a_{ij}\left(x_j - p_{ij}\right)^2\right)$$
with $x \in [0, 1]^6$ and
$$a = \begin{pmatrix} 10 & 3 & 17 & 3.5 & 1.7 & 8 \\ 0.05 & 10 & 17 & 0.1 & 8 & 14 \\ 3 & 3.5 & 1.7 & 10 & 17 & 8 \\ 17 & 8 & 0.05 & 10 & 0.1 & 14 \end{pmatrix}, \quad c = \begin{pmatrix} 1 \\ 1.2 \\ 3 \\ 3.2 \end{pmatrix},$$
$$p = \begin{pmatrix} 0.1312 & 0.1696 & 0.5569 & 0.0124 & 0.8283 & 0.5886 \\ 0.2329 & 0.4135 & 0.8307 & 0.3736 & 0.1004 & 0.9991 \\ 0.2348 & 0.1451 & 0.3522 & 0.2883 & 0.3047 & 0.665 \\ 0.4047 & 0.8828 & 0.8732 & 0.5743 & 0.1091 & 0.0381 \end{pmatrix}$$
Potential function. The well-known Lennard-Jones potential [60] is used as a test function here; for a cluster of $N$ atoms the pairwise interaction is defined as:
$$V_{LJ}(r) = 4\epsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right]$$
and the objective is the total energy, i.e., the sum over all pairs of atoms, where $r$ denotes the distance between two atoms. Several cluster sizes $N$ were adopted in the conducted experiments.
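As an example of how such a benchmark is evaluated as a box-constrained objective, the following is a small sketch of the Lennard-Jones cluster energy as a function of the $3N$ atomic coordinates; the unit parameters $\epsilon = \sigma = 1$ are an illustrative assumption.

```cpp
#include <cmath>
#include <vector>

// Lennard-Jones cluster energy for N atoms (eps = sigma = 1, assumed units).
// The argument packs the coordinates as x = (x1,y1,z1, x2,y2,z2, ...).
double lennardJones(const std::vector<double> &x) {
    const int N = (int)x.size() / 3;
    double energy = 0.0;
    for (int i = 0; i < N; i++) {
        for (int j = i + 1; j < N; j++) {
            double dx = x[3*i]   - x[3*j];
            double dy = x[3*i+1] - x[3*j+1];
            double dz = x[3*i+2] - x[3*j+2];
            double r2 = dx*dx + dy*dy + dz*dz;
            double inv6 = 1.0 / (r2 * r2 * r2);      // (1/r)^6
            energy += 4.0 * (inv6 * inv6 - inv6);    // 4[(1/r)^12 - (1/r)^6]
        }
    }
    return energy;
}
```

For $N = 3$ atoms this is already a nine-dimensional objective, which makes this benchmark considerably harder than the analytic two-dimensional test functions.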
Rastrigin function. The function is defined as:
$$f(x) = x_1^2 + x_2^2 - \cos\left(18 x_1\right) - \cos\left(18 x_2\right), \quad x \in [-1, 1]^2$$
Rosenbrock function. The function is defined as:
$$f(x) = \sum_{i=1}^{n-1}\left(100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2\right), \quad -30 \le x_i \le 30$$
Several values of the dimension $n$ were used in the provided experiments.
Shekel 7 function.
$$f(x) = -\sum_{i=1}^{7}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}, \quad x \in [0, 10]^4$$
with
$$a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \\ 2 & 9 & 2 & 9 \\ 5 & 5 & 3 & 3 \end{pmatrix}, \quad c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \\ 0.6 \\ 0.3 \end{pmatrix}$$
Shekel 5 function.
$$f(x) = -\sum_{i=1}^{5}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}, \quad x \in [0, 10]^4$$
with
$$a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \end{pmatrix}, \quad c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \end{pmatrix}$$
Shekel 10 function.
$$f(x) = -\sum_{i=1}^{10}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}, \quad x \in [0, 10]^4$$
with
$$a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \\ 2 & 9 & 2 & 9 \\ 5 & 5 & 3 & 3 \\ 8 & 1 & 8 & 1 \\ 6 & 2 & 6 & 2 \\ 7 & 3.6 & 7 & 3.6 \end{pmatrix}, \quad c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \\ 0.6 \\ 0.3 \\ 0.7 \\ 0.5 \\ 0.5 \end{pmatrix}$$
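Since the three Shekel variants share the same structure and differ only in the number of rows of $a$ and $c$, a single routine can evaluate all of them. The sketch below is an illustrative implementation of that shared form, shown here with the Shekel 5 data given above.

```cpp
#include <vector>

// Generic Shekel function: f(x) = -sum_i 1 / ((x-a_i)·(x-a_i) + c_i).
// Passing 5, 7 or 10 rows in a and c gives Shekel 5, 7 and 10 respectively.
double shekel(const std::vector<double> &x,
              const std::vector<std::vector<double>> &a,
              const std::vector<double> &c) {
    double sum = 0.0;
    for (size_t i = 0; i < c.size(); i++) {
        double d = 0.0;
        for (size_t j = 0; j < x.size(); j++) {
            double t = x[j] - a[i][j];
            d += t * t;
        }
        sum -= 1.0 / (d + c[i]);
    }
    return sum;
}

// Example: Shekel 5 data from the definition above.
double shekel5(const std::vector<double> &x) {
    static const std::vector<std::vector<double>> a = {
        {4, 4, 4, 4}, {1, 1, 1, 1}, {8, 8, 8, 8}, {6, 6, 6, 6}, {3, 7, 3, 7}};
    static const std::vector<double> c = {0.1, 0.2, 0.2, 0.4, 0.4};
    return shekel(x, a, c);
}
```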
Sinusoidal function. The function is defined as:
$$f(x) = -\left(2.5\prod_{i=1}^{n}\sin\left(x_i - z\right) + \prod_{i=1}^{n}\sin\left(5\left(x_i - z\right)\right)\right), \quad 0 \le x_i \le \pi$$
Several values of the dimension $n$ and of the shift parameter $z$ were examined in the conducted experiments.
Test2N function. The function is defined as:
$$f(x) = \frac{1}{2}\sum_{i=1}^{n}\left(x_i^4 - 16 x_i^2 + 5 x_i\right), \quad x_i \in [-5, 5]$$
The function has $2^n$ local minima in the specified range, and several values of the dimension $n$ were used in the conducted experiments.
Test30N function. The function is defined as:
$$f(x) = \frac{1}{10}\sin^2\left(3\pi x_1\right)\sum_{i=2}^{n-1}\left(\left(x_i - 1\right)^2\left(1 + \sin^2\left(3\pi x_{i+1}\right)\right)\right) + \left(x_n - 1\right)^2\left(1 + \sin^2\left(2\pi x_n\right)\right)$$
with $x \in [-10, 10]^n$. The function has $30^n$ local minima in the specified range, and several values of the dimension $n$ were used in the conducted experiments.
3.2. Experimental Results
The software used was coded in ANSI C++ with the assistance of the freely available Optimus optimization environment, which can be downloaded from https://github.com/itsoulos/GlobalOptimus/ (accessed on 14 April 2024). The experiments were conducted on an AMD Ryzen 5950X with 128 GB of RAM, running Debian Linux. In all experimental tables, the numbers in the cells denote the average number of function calls over 30 independent runs; in each run a different seed for the random number generator was used. The decimal numbers in parentheses denote the success rate of the method in finding the global minimum of the objective function; if no such number is present, then the method succeeded in finding the global minimum in all 30 runs. The simulation parameters for the optimization techniques used are listed in Table 1.
The experimental results for the comparison of the proposed method against other methods found in the literature are outlined in Table 2. The following applies to this table:
- 1. The column PROBLEM denotes the objective problem.
- 2. The column GENETIC denotes the average number of function calls for the Genetic Algorithm. The same number of armadillos, chromosomes and particles was used in the conducted experiments in order to ensure a fair comparison between the algorithms. Also, the same maximum number of generations and the same stopping criteria were utilized among the different optimization methods.
- 3. The column PSO stands for the application of a Particle Swarm Optimization method to the objective problem. The number of particles and the stopping rule in the PSO method are the same as in the proposed method.
- 4. The column PROPOSED represents the experimental results for the GAO method with the suggested modifications.
- 5. The final row, denoted as AVERAGE, stands for the average results over all the objective functions used.
The statistical comparison for the previous experimental results is depicted in Figure 1. These experiments and their subsequent statistical processing demonstrate that the proposed method significantly outperforms Particle Swarm Optimization in terms of the number of function calls, since it requires on average 20% fewer function calls to locate the global minimum. In addition, the proposed method appears to have similar efficiency, in terms of required function calls, to that of the Genetic Algorithm.
The reliability of the termination techniques was tested with one more experiment, in which both proposed termination rules were used; the experimental results for the test benchmark are presented in Table 3, and the statistical comparison for this experiment is shown graphically in Figure 2. From the statistical processing of the experimental results, one finds that the termination method using the Similarity criterion requires a lower number of function calls than the DoubleBox stopping rule to achieve the goal, which is to effectively locate the global minimum. Furthermore, there is no significant difference between the two termination techniques in terms of reliability, as reflected in the success rate in finding the global minimum, which remains high for both techniques (around 97%).
Moreover, the effect of the periodic application of the local search technique is explored in the experiments shown in Table 4, where the local search rate increases from 0.5% to 5%. As expected, the success rate in finding the global minimum increases as the rate of application of the local minimization technique increases; for the current method this rate rises from 92% to 97% in the experimental results. This finding indicates that, when the method is combined with effective local minimization techniques, it can locate the global minimum of the objective function more reliably.
4. Conclusions
Two modifications of the Giant Armadillo Optimization method were suggested in this article. These modifications aimed to improve the efficiency and the speed of the underlying global optimization algorithm. The first modification introduced the periodic application of a local optimization procedure to randomly selected armadillos of the current population. The second modification utilized stopping rules from the recent literature in order to prevent the method from performing unnecessary iterations once the global minimum has already been discovered. The modified global optimization method was tested against two other global optimization methods from the relevant literature, more specifically an implementation of the Genetic Algorithm and a Particle Swarm Optimization variant, on a series of well-known test functions. In order to have a fair comparison between these methods, the same number of test solutions (armadillos, chromosomes or particles) as well as the same termination rule were used. The comparison of the experimental results shows that the present technique clearly outperforms Particle Swarm Optimization and behaves similarly to the Genetic Algorithm. Also, a series of experiments showed that the Similarity termination rule outperforms the DoubleBox termination rule in terms of function calls, without reducing the effectiveness of the proposed method in the task of locating the global minimum.
Since the experimental results appear extremely promising, further efforts can be made to develop the technique in various directions. For example, one extension could be a termination rule that exploits the particularities of this specific global optimization technique. Future extensions may also include the use of parallel computing techniques to speed up the optimization process, such as the incorporation of MPI [61] or the OpenMP library [62]. For example, in this direction the technique could be parallelized in a similar way to genetic algorithms, using the island model [63,64].
Author Contributions
G.K., V.C. and I.G.T. conceived of the idea and the methodology and G.K. and V.C. implemented the corresponding software. G.K. conducted the experiments, employing objective functions as test cases, and provided the comparative experiments. I.G.T. performed the necessary statistical tests. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
References
- Rothlauf, F. Optimization problems. In Design of Modern Heuristics: Principles and Application; Springer, 2011; pp. 7–44.
- Horst, R.; Pardalos, P.M.; Van Thoai, N. Introduction to global optimization; Springer Science & Business Media, 2000.
- Weise, T. Global optimization algorithms-theory and application. Self-Published Thomas Weise 2009, 361, 153. [Google Scholar]
- Ovelade, O.N.; Ezugwu, A.E. Ebola Optimization Search Algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. In Proceedings of the 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET). IEEE; 2021; pp. 1–10. [Google Scholar]
- Deb, K.; Sindhya, K.; Hakanen, J. Multi-objective optimization. In Decision sciences; CRC Press, 2016; pp. 161–200.
- Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. International Transactions in Operational Research 2005, 12, 263–285. [Google Scholar] [CrossRef]
- Casado, L.G.; García, I.; Csendes, T. A new multisection technique in interval methods for global optimization. Computing 2000, 65, 263–269. [Google Scholar] [CrossRef]
- Zhang, X.; Liu, S. Interval algorithm for global numerical optimization. Engineering Optimization 2008, 40, 849–868. [Google Scholar] [CrossRef]
- Price, W. Global optimization by controlled random search. Journal of optimization theory and applications 1983, 40, 333–348. [Google Scholar] [CrossRef]
- Křivỳ, I.; Tvrdík, J. The controlled random search algorithm in optimizing regression models. Computational statistics & data analysis 1995, 20, 229–234. [Google Scholar]
- Ali, M.M.; Törn, A.; Viitanen, S. A numerical comparison of some modified controlled random search algorithms. Journal of Global Optimization 1997, 11, 377–385. [Google Scholar] [CrossRef]
- Aarts, E.; Korst, J.; Michiels, W. Simulated annealing. In Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques; Springer, 2005; pp. 187–210.
- Nikolaev, A.G.; Jacobson, S.H. Simulated annealing. In Handbook of Metaheuristics; Springer, 2010; pp. 1–39.
- Rinnooy Kan, A.; Timmer, G. Stochastic global optimization methods part II: Multi level methods. Mathematical Programming 1987, 39, 57–78. [Google Scholar] [CrossRef]
- Ali, M.M.; Storey, C. Topographical multilevel single linkage. Journal of Global Optimization 1994, 5, 349–358. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Lagaris, I.E. MinFinder: Locating all the local minima of a function. Computer Physics Communications 2006, 174, 166–179. [Google Scholar] [CrossRef]
- Pardalos, P.M.; Romeijn, H.E.; Tuy, H. Recent developments and trends in global optimization. Journal of computational and Applied Mathematics 2000, 124, 209–228. [Google Scholar] [CrossRef]
- Fouskakis, D.; Draper, D. Stochastic optimization: a review. International Statistical Review 2002, 70, 315–349. [Google Scholar] [CrossRef]
- Rocki, K.; Suda, R. An efficient GPU implementation of a multi-start TSP solver for large problem instances. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation, 2012; pp. 1441–1442.
- Van Luong, T.; Melab, N.; Talbi, E.G. GPU-based multi-start local search algorithms. In Proceedings of the Learning and Intelligent Optimization: 5th International Conference, LION 5, Rome, Italy, 17-21 January 2011; Selected Papers 5. Springer, 2011. pp. 321–335. [Google Scholar]
- Bartz-Beielstein, T.; Branke, J.; Mehnen, J.; Mersmann, O. Evolutionary algorithms. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 2014, 4, 178–195. [Google Scholar] [CrossRef]
- Simon, D. Evolutionary optimization algorithms; John Wiley & Sons, 2013.
- Blum, C. Ant colony optimization: Introduction and recent trends. Physics of Life reviews 2005, 2, 353–373. [Google Scholar] [CrossRef]
- Dorigo, M.; Blum, C. Ant colony optimization theory: A survey. Theoretical computer science 2005, 344, 243–278. [Google Scholar] [CrossRef]
- Haldurai, L.; Madhubala, T.; Rajalakshmi, R. A study on genetic algorithm and its applications. International Journal of computer sciences and Engineering 2016, 4, 139. [Google Scholar]
- Jamwal, P.K.; Abdikenov, B.; Hussain, S. Evolutionary optimization using equitable fuzzy sorting genetic algorithm (EFSGA). IEEE Access 2019, 7, 8111–8126. [Google Scholar] [CrossRef]
- Wang, Z.; Sobey, A. A comparative review between Genetic Algorithm use in composite optimisation and the state-of-the-art in evolutionary computation. Composite Structures 2020, 233, 111739. [Google Scholar] [CrossRef]
- Eberhart, R.; Kennedy, J. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, 1995; Vol. 4, pp. 1942–1948.
- Wang, D.; Tan, D.; Liu, L. Particle swarm optimization algorithm: an overview. Soft computing 2018, 22, 387–408. [Google Scholar] [CrossRef]
- Price, K.V. Differential evolution. In Handbook of optimization: From classical to modern approach; Springer, 2013; pp. 187–214.
- Pant, M.; Zaheer, H.; Garcia-Hernandez, L.; Abraham, A.; et al. Differential Evolution: A review of more than two decades of research. Engineering Applications of Artificial Intelligence 2020, 90, 103479. [Google Scholar]
- Asselmeyer, T.; Ebeling, W.; Rosé, H. Evolutionary strategies of optimization. Physical Review E 1997, 56, 1171. [Google Scholar] [CrossRef]
- Arnold, D.V. Noisy optimization with evolution strategies; Vol. 8, Springer Science & Business Media, 2002.
- Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Transactions on Evolutionary computation 1999, 3, 82–102. [Google Scholar]
- Stephenson, M.; O’Reilly, U.M.; Martin, M.C.; Amarasinghe, S. Genetic programming applied to compiler heuristic optimization. In Proceedings of the European conference on genetic programming. Springer; 2003; pp. 238–253. [Google Scholar]
- Banga, J.R. Optimization in computational systems biology. BMC systems biology 2008, 2, 1–7. [Google Scholar] [CrossRef] [PubMed]
- Beites, T.; Mendes, M.V. Chassis optimization as a cornerstone for the application of synthetic biology based strategies in microbial secondary metabolism. Frontiers in microbiology 2015, 6, 159095. [Google Scholar] [CrossRef] [PubMed]
- Hartmann, A.K.; Rieger, H. Optimization algorithms in physics; Citeseer, 2002.
- Hanuka, A.; Huang, X.; Shtalenkova, J.; Kennedy, D.; Edelen, A.; Zhang, Z.; Lalchand, V.; Ratner, D.; Duris, J. Physics model-informed Gaussian process for online optimization of particle accelerators. Physical Review Accelerators and Beams 2021, 24, 072802. [Google Scholar] [CrossRef]
- Ferreira, S.L.; Lemos, V.A.; de Carvalho, V.S.; da Silva, E.G.; Queiroz, A.F.; Felix, C.S.; da Silva, D.L.; Dourado, G.B.; Oliveira, R.V. Multivariate optimization techniques in analytical chemistry-an overview. Microchemical Journal 2018, 140, 176–182. [Google Scholar] [CrossRef]
- Bechikh, S.; Chaabani, A.; Said, L.B. An efficient chemical reaction optimization algorithm for multiobjective optimization. IEEE transactions on cybernetics 2014, 45, 2051–2064. [Google Scholar] [CrossRef] [PubMed]
- Filip, M.; Zoubek, T.; Bumbalek, R.; Cerny, P.; Batista, C.E.; Olsan, P.; Bartos, P.; Kriz, P.; Xiao, M.; Dolan, A.; et al. Advanced computational methods for agriculture machinery movement optimization with applications in sugarcane production. Agriculture 2020, 10, 434. [Google Scholar] [CrossRef]
- Zhang, D.; Guo, P. Integrated agriculture water management optimization model for water saving potential analysis. Agricultural Water Management 2016, 170, 5–19. [Google Scholar] [CrossRef]
- Intriligator, M.D. Mathematical optimization and economic theory; SIAM, 2002.
- Dixit, A.K. Optimization in economic theory; Oxford University Press, USA, 1990.
- Alsayyed, O.; Hamadneh, T.; Al-Tarawneh, H.; Alqudah, M.; Gochhait, S.; Leonova, I.; Malik, O.P.; Dehghani, M. Giant Armadillo Optimization: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Biomimetics 2023, 8, 619. [Google Scholar] [CrossRef]
- Desbiez, A.; Kluyber, D.; Massocato, G.; Attias, N. Methods for the characterization of activity patterns in elusive species: the giant armadillo in the Brazilian Pantanal. Journal of Zoology 2021, 315, 301–312. [Google Scholar] [CrossRef]
- Owaid, S.R.; Zhuravskyi, Y.; Lytvynenko, O.; Veretnov, A.; Sokolovskyi, D.; Plekhova, G.; Hrinkov, V.; Pluhina, T.; Neronov, S.; Dovbenko, O. Development of a method of increasing the efficiency of decision-making in organizational and technical systems. Eastern-European Journal of Enterprise Technologies 2024. [Google Scholar]
- Basheer, I.A.; Hajmeer, M. Artificial neural networks: fundamentals, computing, design, and application. Journal of microbiological methods 2000, 43, 3–31. [Google Scholar] [CrossRef] [PubMed]
- Zou, J.; Han, Y.; So, S.S. Overview of artificial neural networks. In Artificial Neural Networks: Methods and Applications; Springer, 2009; pp. 14–22.
- Fletcher, R. A new approach to variable metric algorithms. The computer journal 1970, 13, 317–322. [Google Scholar] [CrossRef]
- Yuan, Y.X. A new stepsize for the steepest descent method. Journal of Computational Mathematics 2006, 149–156. [Google Scholar]
- Zhu, C.; Byrd, R.H.; Lu, P.; Nocedal, J. Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization. ACM Transactions on mathematical software (TOMS) 1997, 23, 550–560. [Google Scholar] [CrossRef]
- Powell, M. A tolerant algorithm for linearly constrained optimization calculations. Mathematical Programming 1989, 45, 547–566. [Google Scholar] [CrossRef]
- Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Applied Mathematics and Computation 2008, 203, 598–607. [Google Scholar] [CrossRef]
- Charilogis, V.; Tsoulos, I.G. Toward an Ideal Particle Swarm Optimizer for Multidimensional Functions. Information 2022, 13. [Google Scholar] [CrossRef]
- Ali, M.M.; Khompatraporn, C.; Zabinsky, Z.B. A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. Journal of global optimization 2005, 31, 635–672. [Google Scholar] [CrossRef]
- Floudas, C.A.; Pardalos, P.M.; Adjiman, C.; Esposito, W.R.; Gümüs, Z.H.; Harding, S.T.; Klepeis, J.L.; Meyer, C.A.; Schweiger, C.A. Handbook of test problems in local and global optimization; Vol. 33, Springer Science & Business Media, 2013.
- Gaviano, M.; Kvasov, D.E.; Lera, D.; Sergeyev, Y.D. Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization. ACM Transactions on Mathematical Software (TOMS) 2003, 29, 469–480. [Google Scholar] [CrossRef]
- Jones, J.E. On the determination of molecular fields.—II. From the equation of state of a gas. Proceedings of the Royal Society of London. Series A, Containing Papers of a Mathematical and Physical Character 1924, 106, 463–477. [Google Scholar]
- Gropp, W.; Lusk, E.; Doss, N.; Skjellum, A. A high-performance, portable implementation of the MPI message passing interface standard. Parallel computing 1996, 22, 789–828. [Google Scholar] [CrossRef]
- Chandra, R. Parallel programming in OpenMP; Morgan kaufmann, 2001.
- Li, C.C.; Lin, C.H.; Liu, J.C. Parallel genetic algorithms on the graphics processing units using island model and simulated annealing. Advances in Mechanical Engineering 2017, 9, 1687814017707413. [Google Scholar] [CrossRef]
- da Silveira, L.A.; Soncco-Álvarez, J.L.; de Lima, T.A.; Ayala-Rincón, M. Parallel island model genetic algorithms applied in NP-hard problems. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC). IEEE; 2019; pp. 3262–3269. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).