Lyudmila D. Egorova, Lev A. Kazakovtsev, Vladimir N. Krutikov, Elena M. Tovbis, Alexandra V. Fedorova



Solving high-dimensional, multimodal, non-smooth global optimization problems faces challenges concerning the quality of the solution, computational costs, or even the impossibility of solving the problem. Evolutionary algorithms, in particular the differential evolution (DE) algorithm, have proven to be good methods for global optimization. On the other hand, approaches based on subgradient methods are well suited to optimizing non-smooth functions. Combining these two approaches makes it possible to improve the quality of the algorithm by using the best features of both methods. In this paper, a new hybrid evolutionary approach based on differential evolution with a subgradient algorithm as the local search procedure is proposed. The behavior of the proposed SSGDE algorithm was studied in a numerical experiment on three groups of generated tests. A comparison of the new hybrid algorithm with the pure DE approach showed the advantage of SSGDE: it was experimentally established that the proposed method finds the global minimum better than the differential evolution method for all considered problem dimensions. The SSGDE algorithm showed the best results as the number of functions increased significantly.
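The hybrid scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's SSGDE: it combines a standard DE/rand/1/bin loop with a plain diminishing-step subgradient descent as the local search (the paper uses a relaxation subgradient method), and all parameter names and values (`pop_size`, `F`, `CR`, the step schedule) are assumptions chosen for demonstration on a simple non-smooth test function.

```python
# Hybrid differential evolution + subgradient local search (illustrative sketch).
import random

def subgradient_descent(x, subgrad, steps=50, h0=0.5):
    """Polish a point with classic diminishing-step subgradient descent."""
    for k in range(1, steps + 1):
        g = subgrad(x)
        h = h0 / k  # diminishing step size h_k = h0 / k
        x = [xi - h * gi for xi, gi in zip(x, g)]
    return x

def hybrid_de(f, subgrad, dim, bounds, pop_size=20, F=0.8, CR=0.9, gens=100, seed=0):
    """DE/rand/1/bin; each generation, the best individual is polished locally."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [a[j] + F * (b[j] - c[j])
                     if (rng.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            if f(trial) < f(pop[i]):  # greedy selection
                pop[i] = trial
        # local search step: polish the current best, keep it only if improved
        best = min(pop, key=f)
        polished = subgradient_descent(best, subgrad)
        if f(polished) < f(best):
            pop[pop.index(best)] = polished
    return min(pop, key=f)

# Non-smooth test function f(x) = sum |x_i|, with subgradient sign(x_i).
f = lambda x: sum(abs(v) for v in x)
sg = lambda x: [1.0 if v > 0 else -1.0 if v < 0 else 0.0 for v in x]
x_star = hybrid_de(f, sg, dim=5, bounds=(-10, 10))
print(f(x_star))  # small residual near the global minimum 0
```

The local step is accepted only when it improves on the population's best member, so the hybrid can never do worse than the pure DE iteration it wraps.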


global non-smooth optimization, hybrid method, differential evolution, subgradient method.





DOI: https://doi.org/10.22190/FUMI230802054E



© University of Niš | Created on November, 2013
ISSN 0352-9665 (Print)
ISSN 2406-047X (Online)