
Gradient descent method for group particles based on Improved Genetic Algorithm
Yulong Shan, Shijun Zhao, Qiuhan Li, Ren Zhang
National University of Defense Technology

Corresponding author: Shijun Zhao ([email protected])

Abstract

Artificial intelligence (AI) has been a prominent research topic in recent years, and algorithms are its technical core. Many new algorithms have been proposed to solve complex nonlinear optimization problems, each with its own advantages and disadvantages. This study improves the selection operator of the Genetic Algorithm (GA) by combining artificial selection with probabilistic selection, which increases the convergence speed and the success rate of finding the global optimal solution. Furthermore, since the Gradient Descent (GD) method converges quickly and the GA has strong global search ability, we propose the Gradient Descent method for group particles (GDFGP), which combines GD with the improved GA. Two experiments are carried out to verify the effectiveness of the new algorithm in terms of convergence rate and global search ability. The results show that the improved GA converges faster than the traditional GA, and that the success rate of GDFGP in obtaining the global optimal solution is higher than that of both the traditional and improved GA, especially on more complex nonlinear optimization problems.
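To make the hybrid idea concrete, the sketch below shows one way a GA-plus-gradient-descent optimizer of this kind could be organized: each generation applies GA operators (selection, crossover, mutation) for global search, then refines every individual with a gradient-descent step for fast local convergence. This is a minimal illustration under assumed choices, not the authors' GDFGP implementation: the Rastrigin objective, the simple truncation selection (standing in for the paper's combined artificial/probabilistic selection), and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)


def rastrigin(x):
    """A standard multimodal test function, assumed here as the objective."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))


def numerical_gradient(f, x, h=1e-5):
    """Central-difference estimate of the gradient of f at x."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad


def hybrid_ga_gd(f, dim=2, pop_size=30, generations=100,
                 lr=0.01, mutation_rate=0.1, bounds=(-5.12, 5.12)):
    """Hypothetical GA + gradient-descent hybrid in the spirit of GDFGP."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))

    for _ in range(generations):
        fitness = np.array([f(ind) for ind in pop])

        # Selection: keep the best half of the population
        # (a stand-in for the improved selection operator).
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]

        # Crossover: arithmetic blend of randomly paired parents.
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random()
            children.append(w * a + (1 - w) * b)
        pop = np.vstack([parents, children])

        # Mutation: small Gaussian perturbations on a random subset of genes.
        mask = rng.random(pop.shape) < mutation_rate
        pop = pop + mask * rng.normal(0.0, 0.3, size=pop.shape)

        # Gradient-descent refinement of every individual (the "GD" half).
        pop = np.array([ind - lr * numerical_gradient(f, ind) for ind in pop])
        pop = np.clip(pop, lo, hi)

    best = min(pop, key=f)
    return best, f(best)


if __name__ == "__main__":
    x_best, f_best = hybrid_ga_gd(rastrigin)
    print("best point:", x_best, "objective:", f_best)
```

The design point this sketch illustrates is the division of labor described in the abstract: the GA operators keep the population spread over the search space so it can escape local minima, while the per-individual gradient step pulls each candidate quickly toward the nearest basin, improving convergence speed.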